Spark installation issue: Error while instantiating org.apache.spark.sql.hive.HiveSessionStateBuilder
I followed all of the environment variable and installation instructions for Spark. Now when I run pyspark, I get the following error:
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':"
I have already added PATH, HADOOP_HOME, and SPARK_HOME, along with the winutils.exe file. I also tried one of the solutions posted on the web for this error, which says to change permissions like this:
C:\winutils\bin>winutils.exe chmod 777 \tmp\hive
Nothing worked.
As you can see above, Spark does start, but nothing else works. See below what happens when I enter the following command:
What am I missing here?
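Before touching permissions, it can help to confirm that the variables mentioned above are actually set to non-empty values. A minimal sketch (the helper name and sample paths are illustrative, not part of Spark; the variable names come from the question):

```python
# Hypothetical helper: report which Spark-related environment variables
# are absent or empty in a given environment mapping.
REQUIRED_VARS = ["HADOOP_HOME", "SPARK_HOME"]

def missing_spark_vars(env: dict) -> list:
    """Return the required variable names that are missing or empty in `env`."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example with an illustrative environment: SPARK_HOME is empty, so it is flagged.
sample_env = {"HADOOP_HOME": r"C:\winutils", "SPARK_HOME": ""}
print(missing_spark_vars(sample_env))  # ['SPARK_HOME']
```

In practice you would pass `os.environ` instead of `sample_env`; an empty result means pyspark at least sees the variables it needs.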
(Assuming a Windows environment) Check and set the permissions as given below.
C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
drwx------ 1 BUILTIN\Administrators CORP\Domain Users 0 Oct 13 2017 \tmp\hive
C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe chmod 777 \tmp\hive
C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
drwxrwxrwx 1 BUILTIN\Administrators CORP\Domain Users 0 Oct 13 2017 \tmp\hive
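For reference, the change from `drwx------` to `drwxrwxrwx` in the listings above is just the octal mode 777 rendered digit by digit (owner, group, other). A small sketch of that mapping (function name is illustrative):

```python
def mode_to_string(octal_mode: str) -> str:
    """Render an octal chmod mode (e.g. '777') as the rwx string shown by `ls`."""
    bits = "rwx"
    out = []
    for digit in octal_mode:
        n = int(digit, 8)
        # Check the read (4), write (2), and execute (1) bits of each digit.
        out.append("".join(bits[i] if n & (4 >> i) else "-" for i in range(3)))
    return "".join(out)

print(mode_to_string("777"))  # rwxrwxrwx  (what winutils shows after the chmod)
print(mode_to_string("700"))  # rwx------  (the original, owner-only permissions)
```

So `chmod 777 \tmp\hive` grants read, write, and execute to everyone, which is what the Hive scratch directory needs for the session state builder to start.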