How to query HDFS from a Spark cluster (2.1) running on Kubernetes?
I am trying to access HDFS files from a Spark cluster running inside Kubernetes containers.
However, I keep getting the error: AnalysisException: 'The ORC data source must be used with Hive support enabled;'
What am I missing here?
Do you have a SparkSession created with enableHiveSupport()?
Similar issue: Spark can access Hive table from pyspark but not from spark-submit