
Spark installation issue: Error while instantiating org.apache.spark.sql.hive.HiveSessionStateBuilder

I followed all of the environment-variable and installation instructions for Spark. Now when I run pyspark, I get the following error:

pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':"

[screenshot]

I have already set PATH, HADOOP_HOME, and SPARK_HOME, and placed the winutils.exe file. I also tried one of the solutions posted on the web for this error, which says to change the permissions like this:

C:\winutils\bin>winutils.exe chmod 777 \tmp\hive

Nothing worked.
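(Editor's note: not part of the original post.) Before digging further, it can help to confirm from Python that the variables the post mentions are actually visible to the process that launches pyspark. A minimal sketch, assuming the standard HADOOP_HOME/SPARK_HOME setup named above:

```python
import os

# Variables the Spark-on-Windows setup in this question relies on
# (names taken from the post: HADOOP_HOME, SPARK_HOME).
REQUIRED = ["HADOOP_HOME", "SPARK_HOME"]

def missing_vars(env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]

# Hypothetical example: SPARK_HOME set, HADOOP_HOME forgotten.
print(missing_vars({"SPARK_HOME": r"C:\spark\spark-2.2.0-bin-hadoop2.7"}))
# -> ['HADOOP_HOME']
```

An empty list means both variables are set for the current process; note that a shell opened before editing the system environment will not see new values until it is restarted.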

As you can see above, Spark does start, but nothing else works. See below what happens when I enter the following command:

[screenshot]

What am I missing here?

(Assuming a Windows environment) check and set the permissions as shown below.

C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
drwx------ 1 BUILTIN\Administrators CORP\Domain Users 0 Oct 13 2017 \tmp\hive

C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe chmod 777 \tmp\hive

C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
drwxrwxrwx 1 BUILTIN\Administrators CORP\Domain Users 0 Oct 13 2017 \tmp\hive
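(Editor's note: not part of the original answer.) The listing above shows the mode on \tmp\hive changing from drwx------ to drwxrwxrwx, i.e. from octal 700 (owner-only) to 777 (read/write/execute for everyone), which is what the Hive scratch directory needs here. A small sketch of that symbolic-to-octal mapping:

```python
def symbolic_to_octal(sym):
    """Convert a 9-character ls-style permission string (e.g. 'rwxrwxrwx') to octal."""
    bits = "".join("0" if c == "-" else "1" for c in sym)
    return format(int(bits, 2), "o")

print(symbolic_to_octal("rwx------"))  # 700: only the owner has access
print(symbolic_to_octal("rwxrwxrwx"))  # 777: everyone has full access
```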

