
How to add the Hadoop AWS jar to Spark 2.4.5 with JDK 1.8?


I was facing an error: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.S3AFileSystem not found and stumbled upon the solution here, which works. However, in a note given after the answer, the author points out the following:

com.amazonaws:aws-java-sdk-pom:1.11.760: depends on JDK version
hadoop:hadoop-aws:2.7.0: depends on your Hadoop version
s3.us-west-2.amazonaws.com: depends on your S3 location

So when I ran the following command:

pyspark --packages com.amazonaws:aws-java-sdk-pom:1.8.0_242,org.apache.hadoop:hadoop-aws:2.8.5

I got the following error:

Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.amazonaws#aws-java-sdk-pom;1.8.0_242: not found]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1302)
    at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:54)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:304)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:774)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Traceback (most recent call last):
  File "/opt/app-root/lib/python3.6/site-packages/pyspark/python/pyspark/shell.py", line 38, in <module>
    SparkContext._ensure_initialized()
  File "/opt/app-root/lib/python3.6/site-packages/pyspark/context.py", line 316, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/opt/app-root/lib/python3.6/site-packages/pyspark/java_gateway.py", line 46, in launch_gateway
    return _launch_gateway(conf)
  File "/opt/app-root/lib/python3.6/site-packages/pyspark/java_gateway.py", line 108, in _launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number

The reasons I changed the command are as follows:

  1. JDK version:
(app-root) java -version
openjdk version "1.8.0_242"
OpenJDK Runtime Environment (build 1.8.0_242-b08)
OpenJDK 64-Bit Server VM (build 25.242-b08, mixed mode)
  2. PySpark version: 2.4.5
  3. Hadoop version: 2.8.5

How do I resolve this error and launch the pyspark shell with the correct dependencies so that I can read files from S3?

A Spark distribution pre-built with bundled Hadoop causes problems if your Hadoop is any other version, and the bundled one is fairly old. I strongly recommend using the Hadoop Free build: https://spark.apache.org/docs/2.4.5/hadoop-provided.html
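For what it's worth, here is a minimal sketch of wiring a Hadoop Free build to an existing Hadoop installation from Python, assuming the hadoop CLI is on the PATH (the linked page shows the equivalent spark-env.sh setting; the app name is a placeholder):

import os
import subprocess

from pyspark.sql import SparkSession

# Hadoop Free builds locate Hadoop jars through SPARK_DIST_CLASSPATH,
# normally exported in conf/spark-env.sh; setting it before the first
# session is created starts the JVM with the right classpath.
os.environ["SPARK_DIST_CLASSPATH"] = (
    subprocess.check_output(["hadoop", "classpath"]).decode().strip()
)

spark = SparkSession.builder.appName("hadoop-free-demo").getOrCreate()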

The following works for Spark 2.4.5 with Scala 2.11/2.12:

org.apache.hadoop:hadoop-aws:2.8.5
com.amazonaws:aws-java-sdk:1.11.659
org.apache.hadoop:hadoop-common:2.8.5
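A minimal sketch of pulling those exact coordinates from a plain Python script instead of the pyspark shell; spark.jars.packages only takes effect if it is set before the first SparkSession starts, and the app name is a placeholder:

from pyspark.sql import SparkSession

# Resolve the three artifacts listed above when the JVM starts.
spark = (
    SparkSession.builder
    .appName("s3a-demo")  # placeholder
    .config(
        "spark.jars.packages",
        "org.apache.hadoop:hadoop-aws:2.8.5,"
        "com.amazonaws:aws-java-sdk:1.11.659,"
        "org.apache.hadoop:hadoop-common:2.8.5",
    )
    .getOrCreate()
)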


This worked for me with spark:2.4.4-hadoop2.7:

    --conf spark.executor.extraClassPath=/hadoop-aws-2.7.3.jar:/aws-java-sdk-1.7.4.jar --driver-class-path /hadoop-aws-2.7.3.jar:/aws-java-sdk-1.7.4.jar
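The same flags can be set programmatically; a sketch assuming the two jars really do sit at those root paths:

from pyspark.sql import SparkSession

jars = "/hadoop-aws-2.7.3.jar:/aws-java-sdk-1.7.4.jar"

# Mirrors --conf spark.executor.extraClassPath and --driver-class-path above;
# the driver setting only works because no JVM is running yet.
spark = (
    SparkSession.builder
    .config("spark.executor.extraClassPath", jars)
    .config("spark.driver.extraClassPath", jars)
    .getOrCreate()
)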

Replace

pyspark --packages com.amazonaws:aws-java-sdk-pom:1.8.0_242,org.apache.hadoop:hadoop-aws:2.8.5

with

pyspark --packages com.amazonaws:aws-java-sdk-pom:1.11.828,org.apache.hadoop:hadoop-aws:2.8.5

1.11.828 is the version of the aws-java-sdk package, not of the JDK itself.
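Once the shell starts cleanly with those packages, reading a file from S3 looks like this; the bucket, key, and credentials below are placeholders (skip the credential lines if they come from the environment or an instance profile):

# Inside the pyspark shell, `spark` already exists.
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()
hadoop_conf.set("fs.s3a.access.key", "YOUR_ACCESS_KEY")  # placeholder
hadoop_conf.set("fs.s3a.secret.key", "YOUR_SECRET_KEY")  # placeholder

df = spark.read.csv("s3a://your-bucket/path/to/file.csv")  # placeholder URI
df.show()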

