
Load Native Shared Libraries in an HBase MapReduce Task

Recently I have been implementing my algorithm in JNI code (using C++). I did that and generated a shared library. Here is my JNI class:

    public class VideoFeature {
        // JNI code begin
        public static native float Match(byte[] testFileBytes, byte[] tempFileBytes);
        static {
            System.loadLibrary("JVideoFeatureMatch");
        }
        // JNI code end
    }
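For background, `System.loadLibrary` and `System.load` expect different things: the former takes a platform-independent name and searches the directories in `java.library.path`, while the latter takes an absolute path to the library file itself. A minimal sketch of probing whether a library is visible by name (the name `NoSuchJniLibrary` is hypothetical and assumed absent from `java.library.path`):

```java
// Sketch: probe whether a native library can be located by name.
public class LoadProbe {
    // Returns true only if System.loadLibrary can find and load the library.
    static boolean tryLoadLibrary(String name) {
        try {
            System.loadLibrary(name);  // searches the directories in java.library.path
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;  // name not found on java.library.path
        }
    }

    public static void main(String[] args) {
        // "NoSuchJniLibrary" is a hypothetical name that should not exist.
        System.out.println(tryLoadLibrary("NoSuchJniLibrary"));
    }
}
```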

In the main function, I write:

    //  MapReduce
        Configuration conf = HBaseConfiguration.create();

    //  DistributedCache shared library
        DistributedCache.createSymlink(conf);
    //  Both of the following ways seem to work.
    //  DistributedCache.addCacheFile(new URI("/home/danayan/Desktop/libJVideoFeatureMatch.so#JVideoFeatureMatch"), conf);
        DistributedCache.addCacheFile(new URI("hdfs://danayan-pc:9000/lib/libJVideoFeatureMatch.so#libJVideoFeatureMatch.so"), conf);

In the map method, the following code works:

    public static class MatchMapper extends TableMapper<Text, IntWritable> {

        @Override
        public void map(ImmutableBytesWritable key, Result values, Context context) throws IOException, InterruptedException {
            // Other codes
            Path[] localFiles = DistributedCache.getLocalCacheFiles(context.getConfiguration());
            for (Path temp : localFiles) {
                String path = temp.toString();
                if (path.contains("JVideoFeatureMatch")) {
                    System.out.println("JVideoFeatureMatch found!");
                }
            }
        }
    }

In other words, it seems that I have distributed my shared library through the DistributedCache successfully, but I can't load it in the map function.

    public static class MatchMapper extends TableMapper<Text, IntWritable> {

        @Override
        public void map(ImmutableBytesWritable key, Result values, Context context) throws IOException, InterruptedException {
            // Other codes
            int score = (int) VideoFeature.Match(testBytes, tempBytes);
        }
    }

When I try to call the static function in the JNI class, a 'java.lang.UnsatisfiedLinkError' is thrown:

    java.lang.UnsatisfiedLinkError: no libJVideoFeatureMatch in java.library.path

I have also tried 'System.load()', and I have taken into account the 'lib' prefix and '.so' suffix used on Linux systems.
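Incidentally, the JVM's name mapping can be checked directly: `System.mapLibraryName` returns the platform-specific file name that `System.loadLibrary` will search for (on Linux, the `lib` prefix and `.so` suffix are added automatically). A small sketch:

```java
// Sketch: show the platform-specific file name loadLibrary searches for.
public class MapNameDemo {
    static String mapped(String baseName) {
        // On Linux this returns "lib" + baseName + ".so".
        return System.mapLibraryName(baseName);
    }

    public static void main(String[] args) {
        System.out.println(mapped("JVideoFeatureMatch"));
    }
}
```

This is why `System.loadLibrary("JVideoFeatureMatch")` expects a file named `libJVideoFeatureMatch.so` on the search path, not `JVideoFeatureMatch.so`.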

What's more, I set a JVM argument (removing it makes no difference):

  -Djava.library.path=/usr/local/hadoop/lib/native/Linux-amd64-64

On my local machine, I have loaded the shared library successfully by moving it to the 'java.library.path' set above.

I have browsed some of the web pages below:

- Issue loading a native library through the DistributedCache

- Native Libraries Guide

- loading native libraries in hadoop reducer?

I don't know if I have explained this clearly. If not, please let me know.

  1. First copy the library to HDFS:

     bin/hadoop fs -copyFromLocal mylib.so.1 /libraries/mylib.so.1 
  2. The job-launching program should contain the following (note that 'DistributedCache.addCacheFile()' takes a 'URI', not a 'String'):

     DistributedCache.createSymlink(conf);
     DistributedCache.addCacheFile(new URI("hdfs://host:port/libraries/mylib.so.1#mylib.so"), conf);
  3. The MapReduce task can contain:

     System.load((new File("mylib.so")).getAbsolutePath()); 

The third point differs from the official documentation.

Official documentation: Native Shared Libraries
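Step 3 works because the '#mylib.so' fragment makes the framework create a symlink with that name in the task's current working directory, so the relative name can be resolved to an absolute path before calling 'System.load(...)'. A minimal sketch of that resolution (run outside Hadoop, it simply resolves against the JVM's current working directory):

```java
import java.io.File;

// Sketch: resolve a relative symlink name to an absolute path, as done
// before calling System.load(...) inside the task. Outside a MapReduce
// task this resolves against the JVM's working directory instead.
public class SymlinkPathDemo {
    static String resolve(String symlinkName) {
        return new File(symlinkName).getAbsolutePath();
    }

    public static void main(String[] args) {
        // In a task, System.load(resolve("mylib.so")) would load the
        // symlinked library; here we only print the resolved path.
        System.out.println(resolve("mylib.so"));
    }
}
```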
