
Spark installation issue: Error while instantiating org.apache.spark.sql.hive.HiveSessionStateBuilder

I followed all of the environment-variable and installation instructions for Spark. Now when I run pyspark, I get the following error:

pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':"
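For context, the session state named in the message is created lazily, so the exception tends to appear on the first SQL or DataFrame call rather than at shell startup. A minimal sketch (assuming Spark 2.x built with Hive support; the app name is arbitrary) that triggers the same code path:

from pyspark.sql import SparkSession

# enableHiveSupport() selects the Hive catalog, which is what pulls in
# org.apache.spark.sql.hive.HiveSessionStateBuilder under the hood.
spark = (SparkSession.builder
         .appName("hive-state-check")
         .enableHiveSupport()
         .getOrCreate())

# Session state is instantiated lazily; this first action raises the
# IllegalArgumentException if \tmp\hive is not writable.
spark.sql("SHOW DATABASES").show()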

[screenshot of the pyspark startup error]

I have already added PATH, HADOOP_HOME, and SPARK_HOME, along with the winutils.exe file. I also tried one of the solutions posted on the web for this error, which says to change the permissions like this:

C:\winutils\bin>winutils.exe chmod 777 \tmp\hive

Nothing worked.
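A minimal sanity check of those variables from Python, in case one of them silently points at the wrong place (a sketch; run it in the same shell session where pyspark fails):

import os

# Verify the environment variables Spark and winutils.exe rely on.
for var in ("HADOOP_HOME", "SPARK_HOME"):
    path = os.environ.get(var)
    print(var, "=", path, "| exists:", bool(path) and os.path.isdir(path))

# winutils.exe is expected under %HADOOP_HOME%\bin for the chmod above to work.
winutils = os.path.join(os.environ.get("HADOOP_HOME", ""), "bin", "winutils.exe")
print("winutils.exe found:", os.path.isfile(winutils))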

As you can see above, Spark does start, but nothing else works. See below what happens when I enter the following command:

[screenshot of the failing command]

What am I missing here?

(Assuming a Windows environment) check and set the permissions as shown below.

C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
drwx------ 1 BUILTIN\Administrators CORP\Domain Users 0 Oct 13 2017 \tmp\hive

C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe chmod 777 \tmp\hive

C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
drwxrwxrwx 1 BUILTIN\Administrators CORP\Domain Users 0 Oct 13 2017 \tmp\hive
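If you want to script the same check and fix, here is a hedged sketch (it assumes HADOOP_HOME is set, winutils.exe sits under its bin\ directory, and Python 3.7+ for capture_output):

import os
import subprocess

winutils = os.path.join(os.environ["HADOOP_HOME"], "bin", "winutils.exe")

def show_perms():
    # Same as the manual 'winutils.exe ls \tmp\hive' above.
    out = subprocess.run([winutils, "ls", r"\tmp\hive"],
                         capture_output=True, text=True)
    print(out.stdout.strip())

show_perms()
subprocess.run([winutils, "chmod", "777", r"\tmp\hive"], check=True)
show_perms()  # should now report drwxrwxrwx

After the permissions change, start pyspark again so the session state is rebuilt.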
