Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword(Ljava/lang/String;)[C
Problem statement: While creating a Dataset from a file located in a GCP bucket, using local Spark Java code with the following versions of jars/libs, I get an exception.

Exception 1: "Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword(Ljava/lang/String;)[C"

I get the above exception when using the following set of jars/libs:
Spark - org.apache.spark
  spark-core_2.11
  spark-sql_2.11
Hadoop - org.apache.hadoop
  hadoop-auth 3.3.1
  hadoop-hdfs 3.3.1
  hadoop-common 3.3.1
  hadoop-mapreduce-client-core 3.3.1
  hadoop-mapreduce-client-jobclient 3.3.1
  hadoop-nfs 3.3.1
  hadoop-client 3.3.1
GCS connector - com.google.cloud.bigdataoss
  gcs-connector-hadoop2-latest
The following is the local Java source code:
import org.apache.spark.SparkConf;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

SparkConf objSparkConf = new SparkConf();
objSparkConf.setAppName("Spark");
objSparkConf.setMaster("local[*]");
objSparkConf.set("fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem");
objSparkConf.set("fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS");
objSparkConf.set("google.cloud.auth.service.account.enable", "true");
objSparkConf.set("google.cloud.auth.service.account.json.keyfile", "D:\\GCP\\test-environment-json-keyfile.json");
objSparkConf.set("fs.defaultFS", "gs://hudi-bucket");

SparkSession sparkSession = SparkSession.builder().appName("Test_Spark").config(objSparkConf).getOrCreate();
String sFileUrl = "gs://test/2/4/CRUNCH_JOB.3d997666-7d58-4ee8-bf42-a30438983ccb.20211025_072502";
Dataset<Row> dataSet1 = sparkSession.read().format("csv").option("header", "true").load(sFileUrl);
In my case this error was caused by an older version of hadoop-core (1.2.1 instead of 2.6.5) that was pulled in as a transitive Maven dependency. You can also set a breakpoint and check exactly which jar and version your class org.apache.hadoop.conf.Configuration comes from.
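If you would rather not attach a debugger, the same check can be done programmatically via the class's CodeSource. Below is a minimal sketch; the class name `ClassOriginCheck` is my own, and in the real project you would pass the suspect class (e.g. `Class.forName("org.apache.hadoop.conf.Configuration")`) instead of the demo class used here:

```java
public class ClassOriginCheck {

    // Returns the jar file or directory a class was loaded from.
    static String originOf(Class<?> cls) {
        java.security.CodeSource src = cls.getProtectionDomain().getCodeSource();
        // Classes loaded by the bootstrap class loader (e.g. java.lang.String)
        // have no CodeSource, so guard against null.
        return src == null ? "bootstrap classpath" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // In the Spark project, substitute the Hadoop class you suspect:
        //   System.out.println(originOf(Class.forName("org.apache.hadoop.conf.Configuration")));
        System.out.println(originOf(ClassOriginCheck.class));
    }
}
```

If the printed path points at an old hadoop-core 1.x jar rather than your intended Hadoop 3.3.1 artifacts, that transitive dependency is the one to exclude in your pom.xml.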
Statement: The technical posts on this site are licensed under CC BY-SA 4.0. If you repost, please credit this site's URL or the original source. For any questions, contact: yoyou2525@163.com.