
hadoop jar error while copying data from MongoDB to HDFS

I am trying to copy a collection from MongoDB to Hadoop using the MongoDB Hadoop connector, with the code below:

package hdfs;

import java.io.*;
import org.apache.commons.logging.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.lib.output.*;
import org.apache.hadoop.mapreduce.*;
import org.bson.*;
import com.mongodb.hadoop.*;
import com.mongodb.hadoop.util.*;

public class ImportWeblogsFromMongo {
    private static final Log log = LogFactory.getLog(ImportWeblogsFromMongo.class);

    public static class ReadWeblogsFromMongo extends Mapper<Object, BSONObject, Text, Text> {
        // One input record per MongoDB document: the document _id as the key, the BSON document as the value
        @Override
        public void map(Object key, BSONObject value, Context context) throws IOException, InterruptedException {
            System.out.println("Key: " + key);
            System.out.println("Value: " + value);
            String md5 = value.get("md5").toString();
            String url = value.get("url").toString();
            String date = value.get("date").toString();
            String time = value.get("time").toString();
            String ip = value.get("ip").toString();
            String output = "\t" + url + "\t" + date + "\t" + time + "\t" + ip;
            context.write(new Text(md5), new Text(output));
        }
    }

    public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        // Read input directly from the fish collection of the clusterdb database, as a single input split
        MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/clusterdb.fish");
        MongoConfigUtil.setCreateInputSplits(conf, false);
        System.out.println("Configuration: " + conf);
        @SuppressWarnings("deprecation")
        final Job job = new Job(conf, "Mongo Import");
        // HDFS output directory for the exported records
        Path out = new Path("/home/mongo_import");
        FileOutputFormat.setOutputPath(job, out);
        job.setJarByClass(ImportWeblogsFromMongo.class);
        job.setMapperClass(ReadWeblogsFromMongo.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(MongoInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setNumReduceTasks(0);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

1. I exported the jar file as importmongo.jar.
2. I tried to execute the command hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo, but I got the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/hadoop/util/MongoConfigUtil
    at hdfs.ImportWeblogsFromMongo.main(ImportWeblogsFromMongo.java:33)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.util.MongoConfigUtil
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more

NB: clusterdb is the database name, fish is its collection, and hdfs.ImportWeblogsFromMongo is the package.class.

Any suggestions, please?
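
The stack trace shows that com.mongodb.hadoop.util.MongoConfigUtil cannot be loaded when hadoop jar launches the main class, which usually means the mongo-hadoop connector and the MongoDB Java driver jars are neither bundled into importmongo.jar nor visible on Hadoop's classpath. A minimal sketch of one way to make them visible at submit time (the jar paths are placeholders that would need to be adapted):

   export HADOOP_CLASSPATH=/path/to/mongo-hadoop-core.jar:/path/to/mongo-java-driver.jar
   hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo

The map tasks need these classes as well, so the jars would additionally have to be shipped to the cluster, for example by building a fat jar or by passing -libjars through ToolRunner.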

I didn't resolve the issue that way, but I found a solution using mongodump and then copying the file to HDFS. The lines below may help someone get the job done:

   mongodump  --db clusterdb --collection CollectionName

   bsondump file.bson > file.json

   hadoop dfs -copyFromLocal /path/to/file/fish.json mongo
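
By default mongodump writes its output to dump/&lt;db&gt;/&lt;collection&gt;.bson, so an end-to-end run with consistent file names would look roughly like this sketch (the HDFS target directory mongo is just an example):

   mongodump --db clusterdb --collection fish
   bsondump dump/clusterdb/fish.bson > fish.json
   hdfs dfs -mkdir -p mongo
   hdfs dfs -copyFromLocal fish.json mongo

hdfs dfs is the current form of the deprecated hadoop dfs command; both work for the copy.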
