Unable to import org module to PySpark cluster
I am trying to import FPGrowth from the org module, but installing the org module throws an error. I also tried replacing org.apache.spark with pyspark, but that still doesn't work.
!pip install org
import org.apache.spark.ml.fpm.FPGrowth
Below is the error:
ERROR: Could not find a version that satisfies the requirement org (from versions: none)
ERROR: No matching distribution found for org
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-12-c730562e7076> in <module>
1 get_ipython().system('pip install org')
----> 2 import org.apache.spark.ml.fpm.FPGrowth
ModuleNotFoundError: No module named 'org'
org.apache.spark.ml.fpm.FPGrowth is the Scala/Java package path, not a Python package, which is why pip cannot find anything named org. To import FPGrowth in PySpark you need to write:
from pyspark.ml.fpm import FPGrowth
You can find additional instructions on how to use FPGrowth in the Spark documentation.