Spark : java.lang.NoClassDefFoundError: com/mongodb/hadoop/MongoInputFormat
I'm trying to read data from MongoDB with Spark using the mongo-hadoop connector.
I tried different versions of the mongo-hadoop connector jar, but I still get this error.
There is no error at compile time. What can I do to resolve this? Thanks in advance.
Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/hadoop/MongoInputFormat
at com.geekcap.javaworld.wordcount.Mongo.main(Mongo.java:47)
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.MongoInputFormat
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
My code:
import com.mongodb.hadoop.BSONFileOutputFormat;
import com.mongodb.hadoop.MongoInputFormat;
import com.mongodb.hadoop.MongoOutputFormat;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedList;
import java.util.Queue;
import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.bson.BSONObject;
public class MongoTest {
    // Set configuration options for the MongoDB Hadoop Connector.
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local").setAppName("App1");
        JavaSparkContext sc = new JavaSparkContext(conf);

        Configuration mongodbConfig = new Configuration();
        mongodbConfig.set("mongo.job.input.format", "com.mongodb.hadoop.MongoInputFormat");
mongodbConfig.set("mongo.input.uri","mongodb://localhost:27017/MyCollectionName.collection");
JavaPairRDD<Object, BSONObject> documents = sc.newAPIHadoopRDD(
mongodbConfig, // Configuration
MongoInputFormat.class, // InputFormat: read from a live cluster.
Object.class, // Key class
BSONObject.class // Value class
);
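        // Note: saveAsTextFile writes a directory named "b.txt" containing part files, not a single file.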
        documents.saveAsTextFile("b.txt");
    }
}
pom.xml dependencies:
<dependencies>
    <!-- Import Spark -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.11</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>mongodb-driver</artifactId>
        <version>3.0.4</version>
    </dependency>
    <dependency>
        <groupId>hadoopCom</groupId>
        <artifactId>com.sample</artifactId>
        <version>1.0</version>
        <scope>system</scope>
        <systemPath>/home/sys6002/NetBeansProjects/WordCount/lib/hadoop-common-2.7.1.jar</systemPath>
    </dependency>
    <dependency>
        <groupId>hadoopCon1</groupId>
        <artifactId>com.sample1</artifactId>
        <version>1.0</version>
        <scope>system</scope>
        <systemPath>/home/sys6002/Downloads/mongo-hadoop-core-1.3.0.jar</systemPath>
    </dependency>
</dependencies>
After several trials and changes, I got this to work. Replacing the system-scoped jars with a regular mongo-hadoop-core dependency appears to be what put the connector on the runtime classpath rather than only the compile classpath:
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>1.5.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>1.5.1</version>
    </dependency>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.14</version>
    </dependency>
    <dependency>
        <groupId>org.mongodb.mongo-hadoop</groupId>
        <artifactId>mongo-hadoop-core</artifactId>
        <version>1.4.1</version>
    </dependency>
</dependencies>
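These dependencies cover runs launched from the IDE, where Maven puts everything on the classpath. If the job is instead packaged and launched with spark-submit, the connector classes still have to travel inside the application jar. A minimal sketch using the maven-shade-plugin (my own addition, not part of the original answer) would be:

<build>
    <plugins>
        <!-- Bundle all dependencies, including mongo-hadoop-core, into one
             "uber" jar so the classes are on the runtime classpath wherever
             the jar is executed. -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.1</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>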
Java code:
Configuration conf = new Configuration();
conf.set("mongo.job.input.format", "com.mongodb.hadoop.MongoInputFormat");
conf.set("mongo.input.uri", "mongodb://localhost:27017/databasename.collectionname");

SparkConf sconf = new SparkConf().setMaster("local").setAppName("Spark UM Jar");
JavaSparkContext sc = new JavaSparkContext(sconf);

JavaRDD<User> UserMaster = sc.newAPIHadoopRDD(conf, MongoInputFormat.class, Object.class, BSONObject.class)
        .map(new Function<Tuple2<Object, BSONObject>, User>() {
            @Override
            public User call(Tuple2<Object, BSONObject> v1) throws Exception {
                // build and return a User from the document in v1._2()
            }
        });
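For completeness, here is one possible shape for the elided mapper body. The User class and the _id/name field names are assumptions for illustration only; the original answer does not show them:

import java.io.Serializable;

import org.apache.spark.api.java.function.Function;
import org.bson.BSONObject;
import scala.Tuple2;

// Hypothetical POJO; the original answer never shows the User class.
class User implements Serializable {
    String id;
    String name;
}

// One possible mapper: pull fields out of the BSONObject half of the pair.
class BsonToUser implements Function<Tuple2<Object, BSONObject>, User> {
    @Override
    public User call(Tuple2<Object, BSONObject> v1) throws Exception {
        BSONObject doc = v1._2();               // the MongoDB document
        User u = new User();
        u.id = String.valueOf(doc.get("_id"));  // assumes an _id field
        u.name = (String) doc.get("name");      // assumes a "name" string field
        return u;
    }
}

With such a class in place, the RDD above could be built with .map(new BsonToUser()) instead of the anonymous inner class.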