
How to query HDFS from a Spark cluster (2.1) running on Kubernetes?

I am trying to access HDFS files from a Spark cluster running inside Kubernetes containers.

However, I keep getting this error: AnalysisException: 'The ORC data source must be used with Hive support enabled;'
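A minimal sketch of the kind of read that triggers this on Spark 2.1; the namenode address and HDFS path below are placeholders:

    from pyspark.sql import SparkSession

    # A plain session, built without Hive support
    spark = SparkSession.builder \
        .appName("hdfs-orc-read") \
        .getOrCreate()

    # On Spark 2.1 the ORC data source lives in the Hive module,
    # so this read fails with the AnalysisException above
    df = spark.read.orc("hdfs://namenode:8020/path/to/data")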

What am I missing here?

Have you created your SparkSession with enableHiveSupport()?
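For example (a minimal sketch; the app name, namenode address, and path are placeholders, and it assumes a Spark build that includes the Hive classes):

    from pyspark.sql import SparkSession

    # Build the session with Hive support so the ORC data source is available
    spark = SparkSession.builder \
        .appName("hdfs-orc-read") \
        .enableHiveSupport() \
        .getOrCreate()

    # The same read now goes through the Hive-backed ORC reader
    df = spark.read.orc("hdfs://namenode:8020/path/to/data")
    df.show()

As the error message itself says, on Spark 2.1 the ORC data source only works with Hive support enabled, so the session must be created this way before any ORC reads.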

Similar issue: Spark can access Hive table from pyspark but not from spark-submit
