hadoop jar error while copying data from MongoDB to HDFS
I am trying to copy a collection from MongoDB to Hadoop using the mongo-hadoop connector, with the code below:

package hdfs;
import java.io.*;
import org.apache.commons.logging.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.lib.output.*;
import org.apache.hadoop.mapreduce.*;
import org.bson.*;
import com.mongodb.hadoop.*;
import com.mongodb.hadoop.util.*;
public class ImportWeblogsFromMongo {

    private static final Log log = LogFactory.getLog(ImportWeblogsFromMongo.class);

    // Map-only job: each BSON document read from the collection is emitted as
    // md5 -> url, date, time, ip (tab-separated).
    public static class ReadWeblogsFromMongo extends Mapper<Object, BSONObject, Text, Text> {

        public void map(Object key, BSONObject value, Context context) throws IOException, InterruptedException {
            System.out.println("Key: " + key);
            System.out.println("Value: " + value);
            String md5 = value.get("md5").toString();
            String url = value.get("url").toString();
            String date = value.get("date").toString();
            String time = value.get("time").toString();
            String ip = value.get("ip").toString();
            // TextOutputFormat already separates key and value with a tab,
            // so the value should not start with one of its own.
            String output = url + "\t" + date + "\t" + time + "\t" + ip;
            context.write(new Text(md5), new Text(output));
        }
    }

    public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/clusterdb.fish");
        MongoConfigUtil.setCreateInputSplits(conf, false);
        System.out.println("Configuration: " + conf);

        @SuppressWarnings("deprecation")
        final Job job = new Job(conf, "Mongo Import");
        Path out = new Path("/home/mongo_import");
        FileOutputFormat.setOutputPath(job, out);
        job.setJarByClass(ImportWeblogsFromMongo.class);
        job.setMapperClass(ReadWeblogsFromMongo.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(MongoInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setNumReduceTasks(0); // map-only job, no reducers
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
1. After exporting the jar file named importmongo.jar,
2. I tried to execute the command: hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo

but I got the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/hadoop/util/MongoConfigUtil
at hdfs.ImportWeblogsFromMongo.main(ImportWeblogsFromMongo.java:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.util.MongoConfigUtil
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
NB: clusterdb is the database name, fish is its collection, and hdfs.ImportWeblogsFromMongo is the package.class.
Any suggestions, please?
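The NoClassDefFoundError here generally means that the mongo-hadoop connector jar (and the MongoDB Java driver it depends on) is not on the classpath of the JVM that hadoop jar launches, because importmongo.jar contains only the application class. A minimal sketch of one common fix, assuming both jars were downloaded to /home/yass/libs (paths and version numbers are illustrative):

# assumption: connector and driver jars live in /home/yass/libs
export HADOOP_CLASSPATH=/home/yass/libs/mongo-hadoop-core-2.0.2.jar:/home/yass/libs/mongo-java-driver-3.2.2.jar
hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo

Note that HADOOP_CLASSPATH only covers the client JVM where main() runs, which is where the error above is thrown; for the map tasks themselves the two jars still need to reach the task JVMs, for example by bundling them into a fat jar together with the application class.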
I didn't resolve the issue this way, but I found a workaround using mongodump and copying the resulting file to HDFS. The lines below may help someone get the job done:
mongodump --db clusterdb --collection CollectionName
bsondump file.bson > file.json
hdfs dfs -copyFromLocal /path/to/file/fish.json mongo
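For anyone following the same route, here is a fuller sketch of that workaround, assuming the clusterdb.fish collection from the question and mongodump's default output directory (./dump/<db>/<collection>.bson); the target directory name mongo is taken from the command above:

mongodump --db clusterdb --collection fish
# mongodump writes to ./dump/clusterdb/fish.bson by default
bsondump dump/clusterdb/fish.bson > fish.json
# create the target directory in HDFS, then copy the JSON file in
hdfs dfs -mkdir -p mongo
hdfs dfs -copyFromLocal fish.json mongo

bsondump emits one JSON document per line, which is convenient if the file is later consumed by line-oriented MapReduce input formats.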