
Read from Neo4j to Spark. Py4JJavaError: An error occurred while calling o104.load

I am new to Spark. I am trying to read data from Neo4j into Spark. Spark is installed locally and launched via start_pyspark.sh:

#!/bin/bash 

export PYSPARK_PYTHON=/opt/anaconda3/bin/python
export PYSPARK_DRIVER_PYTHON=/opt/anaconda3/bin/jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook --NotebookApp.open_browser=False    --NotebookApp.ip='*' --NotebookApp.port=8880 --notebook-dir=~/"

pyspark "$@" 

Before starting PySpark, I downloaded the Neo4j Connector for Apache Spark and placed it in /opt/spark/jars.

The Neo4j database is running at neo4j://localhost:7687.

I tried to follow the vendor's instructions ( https://neo4j.com/docs/spark/current/python/ ), but it failed.

spark.version: 3.3.0

My Python code:

import pyspark

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.read.format("org.neo4j.spark.DataSource") \
 .option("url", "neo4j://localhost:7687") \
 .option("labels", "Department") \
 .option("authentication.type", "basic")\
 .option("authentication.basic.username", "*****")\
 .option("authentication.basic.password", "******")\
 .load()

After running it, I get the following error:

Py4JJavaError  Traceback (most recent call last)
Input In [2], in <cell line: 1>()
----> 1 df = spark.read.format("org.neo4j.spark.DataSource") \
  2   .option("url", "neo4j://localhost:7687") \
  3   .option("labels", "Department") \
  4   .option("authentication.type", "basic")\
  5   .option("authentication.basic.username", "*****")\
  6   .option("authentication.basic.password", "******")\
  7   .load()

File /opt/spark/python/pyspark/sql/readwriter.py:184, in DataFrameReader.load(self, path, format, schema, **options)
182     return     self._df(self._jreader.load(self._spark._sc._jvm.PythonUtils.toSeq(path)))
183 else:
--> 184     return self._df(self._jreader.load())

File /opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py:1321, in    JavaMember.__call__(self, *args)
1315 command = proto.CALL_COMMAND_NAME +\
1316     self.command_header +\
1317     args_command +\
1318     proto.END_COMMAND_PART
1320 answer = self.gateway_client.send_command(command)
-> 1321 return_value = get_return_value(
1322     answer, self.gateway_client, self.target_id, self.name)
1324 for temp_arg in temp_args:
1325     temp_arg._detach()


File /opt/spark/python/pyspark/sql/utils.py:190, in capture_sql_exception.<locals>.deco(*a, **kw)
188 def deco(*a: Any, **kw: Any) -> Any:
189     try:
--> 190         return f(*a, **kw)
191     except Py4JJavaError as e:
192         converted = convert_exception(e.java_exception)


File /opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
325 if answer[1] == REFERENCE_TYPE:
--> 326     raise Py4JJavaError(
327         "An error occurred while calling {0}{1}{2}.\n".
328         format(target_id, ".", name), value)
329 else:
330     raise Py4JError(
331         "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
332         format(target_id, ".", name, value))


Py4JJavaError: An error occurred while calling o51.load. 
: java.lang.NoSuchMethodError: 'scala.collection.immutable.ArraySeq scala.runtime.ScalaRunTime$.wrapRefArray(java.lang.Object[])'
at org.neo4j.spark.DataSource.<init>(DataSource.scala:15)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at  java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at java.base/java.lang.Class.newInstance(Class.java:584)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSourceV2(DataSource.scala:726)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:207)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:171)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
at java.base/java.lang.Thread.run(Thread.java:829)

Has anyone run into this problem? Thanks in advance.

Hi, I just tried the latest version (4.1.5) myself, and the connector works with Spark 3.3.0. You can try it yourself with this notebook, which I used during a Graph Connect 2022 workshop:

https://github.com/conker84/gc-2k22-spark

I think the problem is that the connector jar you are using was compiled for a different Scala version than your Spark build.
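For context: a `NoSuchMethodError` on `scala.runtime.ScalaRunTime$.wrapRefArray` returning `scala.collection.immutable.ArraySeq` is what you typically see when a Scala 2.13 artifact is loaded into a Scala 2.12 runtime (prebuilt Spark 3.3.0 ships with Scala 2.12 by default). As a quick sanity check, the Scala version a connector jar targets is encoded in its file name. A minimal sketch (the helper and example jar names below are illustrative, not taken from the question):

```python
# The connector's Maven artifacts follow the naming pattern
# neo4j-connector-apache-spark_<scala-version>-<connector-version>.jar,
# so the Scala binary version can be read straight off the file name.
def scala_version_from_jar(jar_name: str) -> str:
    """Return the Scala binary version suffix (e.g. '2.12') encoded in the jar name."""
    suffix = jar_name.split("-apache-spark_", 1)[1]
    return suffix.split("-", 1)[0]

# A _2.13 jar on a default Spark 3.3.0 install (Scala 2.12) reproduces this error.
print(scala_version_from_jar("neo4j-connector-apache-spark_2.13-4.1.5_for_spark_3.jar"))  # prints 2.13
print(scala_version_from_jar("neo4j-connector-apache-spark_2.12-4.1.5_for_spark_3.jar"))  # prints 2.12
```

Instead of copying the jar into /opt/spark/jars by hand, you can also let Spark resolve the matching artifact at launch, e.g. `pyspark --packages org.neo4j:neo4j-connector-apache-spark_2.12:4.1.5_for_spark_3` (use the `_2.12` build for a default Spark 3.3.0 install, `_2.13` if your Spark was built against Scala 2.13).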
