
ImportError: cannot import name 'HiveContext' from 'pyspark.sql'

I am running PySpark on my PC (Windows 10), but I cannot import HiveContext:

from pyspark.sql import HiveContext
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-25-e3ae767de910> in <module>
----> 1 from pyspark.sql import HiveContext

ImportError: cannot import name 'HiveContext' from 'pyspark.sql' (C:\spark\spark-3.0.0-preview-bin-hadoop2.7\python\pyspark\sql\__init__.py)

How should I proceed to resolve this?

You're using the preview release of Spark 3.0. According to the release notes, you should use SparkSession.builder.enableHiveSupport().

In Spark 3.0, the deprecated HiveContext class has been removed. Use SparkSession.builder.enableHiveSupport() instead.

