
Apache-Spark error on python: java.lang.reflect.InaccessibleObjectException

It's my first time using Apache-Spark with Python (pyspark), and I was trying to run the Quick Start Examples, but when I run the line:

>>> textFile = spark.read.text("README.md")

it gives me the following error (I'm pasting just the first part because I think it's the most important):

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/daniele/Scaricati/spark/python/pyspark/sql/readwriter.py", line 311, in text
    return self._df(self._jreader.text(self._spark._sc._jvm.PythonUtils.toSeq(paths)))
  File "/home/daniele/Scaricati/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
  File "/home/daniele/Scaricati/spark/python/pyspark/sql/utils.py", line 63, in deco
return f(*a, **kw)
  File "/home/daniele/Scaricati/spark/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o22.text.
: java.lang.reflect.InaccessibleObjectException: Unable to make field private transient java.lang.String java.net.URI.scheme accessible: module java.base does not "opens java.net" to unnamed module @779d0812
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:335)

Can someone help me solve this? Sorry if my post is not that clear, but it's my first one on this forum. Thanks to everyone who tries to help, Daniele.

The issue is that your Spark version and your Java version are incompatible. Spark releases that target Java 8 rely on reflective access to JDK internals that the module system introduced in Java 9 forbids, which is exactly what the InaccessibleObjectException reports. To resolve this, do the following:

  1. Check your PySpark version (see the version-check sketch after this list):

    pyspark   # the startup banner shows the Spark/PySpark version

  2. Check which Java version is required for your PySpark version (e.g. for PySpark 2.4.6 we need Java 8 - https://spark.apache.org/docs/2.4.6/ ).

  3. Check which Java versions you have installed:

    /usr/libexec/java_home -V   # macOS: lists every installed JDK

  4. If the required Java version is not installed, install it (e.g. brew install adoptopenjdk8 ).

  5. Change your JAVA_HOME to point to the correct version. Example:

    export JAVA_HOME="/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home"

  6. Confirm the version with java -version .
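
If you want to confirm both versions from Python before launching anything, the check can be scripted. This is a minimal sketch, assuming only that pyspark is importable and that java is on your PATH:

    import subprocess

    import pyspark

    # Step 1: the installed PySpark version.
    print("PySpark version:", pyspark.__version__)

    # Step 6: the JVM that will actually be picked up.
    # Note that `java -version` writes to stderr rather than stdout.
    subprocess.run(["java", "-version"])

The two should line up with the compatibility notes in the Spark docs for your release (step 2).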

After this you should be able to run the Quick Start example as expected:

textFile = spark.read.text("README.md")
textFile.show()
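
Note that the spark object above is predefined only inside the interactive pyspark shell. If you run the same code as a standalone script, you create the session yourself first; a minimal sketch, assuming a local install, a README.md in the working directory, and a hypothetical app name of your choosing:

from pyspark.sql import SparkSession

# Outside the pyspark shell there is no predefined `spark`,
# so build the session explicitly.
spark = SparkSession.builder.appName("QuickStart").getOrCreate()

textFile = spark.read.text("README.md")
textFile.show()

spark.stop()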
