
UnsatisfiedLinkError in Spark EMR job with native library

I'm trying to run a Spark job that uses a native shared library (.so). I'm using --jars to copy the .so to all executors (and the file seems to be there, alongside the Spark app .jar), but somehow I'm failing to set up the environment so it can find and use the .so. I tried --conf spark.executor.extraLibraryPath and -Djava.library.path, but I'm not really sure what paths to use. Is there an easy way to make this work? (Using AWS EMR 4.5.0, Spark 1.6.x.)
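For context, an UnsatisfiedLinkError here is usually thrown by System.loadLibrary on the executor when the JVM cannot find the .so on java.library.path. A minimal sketch of what the wrapper class presumably looks like (the class and method names are illustrative, not from the original post):

public class SplineFitWrapper {
    static {
        // On Linux the JVM maps "SplineFitWrapperJava" to
        // libSplineFitWrapperJava.so and searches the directories on
        // java.library.path for it; an UnsatisfiedLinkError means
        // that search (or the subsequent symbol lookup) failed.
        System.loadLibrary("SplineFitWrapperJava");
    }

    // Hypothetical native method implemented by the .so.
    public static native double[] fit(double[] points);
}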

My spark-submit:

spark-submit \
--deploy-mode cluster \
--jars s3://at/emr-test/asb_UT/libSplineFitWrapperJava.so \
--class com.SplineFittingDummy \
s3://at/emr-test/asb_UT/asb-0.0.1-SNAPSHOT-jar-with-dependencies.jar \
s3://at/emr-test/asb_UT/testPoints01.xml \
s3://at/emr-test/asb_UT/output
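For reference, --jars is meant for JVM classpath entries, not native libraries. On YARN, a pattern that often works is to ship the .so with --files (which places it in each container's working directory) and point the library path settings at that directory. A sketch, not verified on EMR 4.5.0:

spark-submit \
--deploy-mode cluster \
--files s3://at/emr-test/asb_UT/libSplineFitWrapperJava.so \
--conf spark.executor.extraLibraryPath=. \
--conf spark.driver.extraLibraryPath=. \
--class com.SplineFittingDummy \
s3://at/emr-test/asb_UT/asb-0.0.1-SNAPSHOT-jar-with-dependencies.jar \
s3://at/emr-test/asb_UT/testPoints01.xml \
s3://at/emr-test/asb_UT/output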

The problem was with the way the .so was built. After trying different settings and available setups (Solaris & SFW, Debian & g++ 4.6, ...) that all failed, I compiled the .so directly on EMR and now everything works. It would be helpful if Amazon provided a Docker image with their setup, so we could compile without actually copying all the source code to EMR.
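For anyone following the same route, a minimal sketch of compiling the wrapper directly on the EMR master node, assuming a single C++ source file (SplineFitWrapper.cpp is an illustrative name) and the JNI headers from the cluster's JDK:

g++ -shared -fPIC \
-I"$JAVA_HOME/include" \
-I"$JAVA_HOME/include/linux" \
-o libSplineFitWrapperJava.so \
SplineFitWrapper.cpp

Building against the cluster's own toolchain and glibc avoids the version mismatches that can make a .so compiled elsewhere fail to load at runtime.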
