
Spark with Mesos : java.lang.UnsatisfiedLinkError: libsvn_delta-1.so.0: cannot open shared object file: No such file or directory

I set up a Mesos cluster with 1 master (192.168.2.1) and 2 slaves (192.168.2.2, 192.168.2.3).

I can access the Mesos UI at http://192.168.2.1:5050 and see both slaves registered as agents. I then set up Spark on these 3 nodes.

I then copied libmesos.so (libmesos-1.8.1.so) from /usr/local/lib on the cluster to /mylocallibs on my local dev machine and set

export MESOS_NATIVE_JAVA_LIBRARY=/mylocallibs/libmesos.so

When I try to connect to the master using this SparkConf

SparkConf sparkConf = new SparkConf()
      .setMaster("mesos://192.168.2.1:5050")
      .setAppName("My app")
      .set("spark.executor.uri", <http url to spark tgz>)
      .set("spark.submit.deployMode", "cluster");

I get the following error

java.lang.UnsatisfiedLinkError: libsvn_delta-1.so.0: cannot open shared object file: No such file or directory
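The error does not mean libmesos.so itself is missing: the JVM found libmesos.so but the dynamic linker could not resolve one of its own dependencies, libsvn_delta-1.so.0 (part of the Subversion libraries Mesos links against). A quick way to see every unresolved dependency at once is ldd; a minimal diagnostic sketch, assuming the /mylocallibs/libmesos.so path from the question:

```shell
# Sketch: list libmesos.so's shared-library dependencies and flag any
# the dynamic linker cannot resolve ("not found").
# LIB is the path used in the question; adjust for your machine.
LIB=/mylocallibs/libmesos.so
if [ -f "$LIB" ]; then
  # grep exits non-zero when nothing is missing, hence the fallback echo
  ldd "$LIB" | grep "not found" || echo "all dependencies resolved"
else
  echo "missing: $LIB"
fi
```

Every "not found" line has to be resolved (by installing the package that provides the library, or by extending the search path) before the JVM can load libmesos.so.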

The way I set up Spark on these 3 nodes is as follows:

# Download spark executable
wget http://www-us.apache.org/dist/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz -O /opt/spark-2.4.3-bin-hadoop2.7.tgz

# Extract
cd /opt; tar xzf /opt/spark-2.4.3-bin-hadoop2.7.tgz

# Setup link for upgrades
ln -s /opt/spark-2.4.3-bin-hadoop2.7 /opt/spark

# Set spark_home
export SPARK_HOME=/opt/spark

cp $SPARK_HOME/conf/spark-env.sh.template $SPARK_HOME/conf/spark-env.sh

# Edit spark-env.sh and set variables
vi $SPARK_HOME/conf/spark-env.sh

export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so

export SPARK_EXECUTOR_URI=<http url to spark tgz>

cp $SPARK_HOME/conf/spark-defaults.conf.template $SPARK_HOME/conf/spark-defaults.conf

# Edit spark defaults and set variables
vi $SPARK_HOME/conf/spark-defaults.conf

export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so

export SPARK_EXECUTOR_URI=<http url to spark tgz>
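One thing worth double-checking in the steps above: spark-defaults.conf is a Java-properties-style file, not a shell script, so export lines placed in it are ignored. The equivalent settings there would look roughly like this sketch (the executor URI placeholder is kept from the question):

```
# spark-defaults.conf takes "key value" pairs, not shell exports
spark.master         mesos://192.168.2.1:5050
spark.executor.uri   <http url to spark tgz>
```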

I tried setting LD_LIBRARY_PATH on my local dev machine

export LD_LIBRARY_PATH=/mylocallibs/
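A detail that often trips this up: LD_LIBRARY_PATH only affects processes started from the shell where it was exported, and assigning it outright discards any entries already there. A safer sketch is to prepend:

```shell
# Prepend /mylocallibs instead of overwriting the whole search path;
# the ${VAR:+...} expansion only adds ":" when the variable is already set.
export LD_LIBRARY_PATH=/mylocallibs${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
echo "$LD_LIBRARY_PATH"
```

The JVM must then be launched from this same shell; an IDE started from the desktop will not see the variable.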

Since I was stuck for so long, I downloaded a later version of the library (libsvn_delta-1.so.1), renamed it to libsvn_delta-1.so.0, and placed it in /mylocallibs just to get going.

That only started a chain of other unsatisfied library errors.

Have I missed anything obvious here?

If a library is already loaded by your application and the application tries to load it again, the JVM throws an UnsatisfiedLinkError.

Since you are using cluster mode, you may need to ship the library files with the --files option before accessing them, or copy the lib folder to HDFS and access it from there; in cluster mode Spark can read from HDFS.
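The --files suggestion would look roughly like the sketch below with spark-submit (MyApp and my-app.jar are hypothetical names, not from the question); --files copies the listed files into each executor's working directory:

```
spark-submit \
  --master mesos://192.168.2.1:5050 \
  --deploy-mode cluster \
  --files /mylocallibs/libmesos.so \
  --class MyApp \
  my-app.jar
```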

In cluster mode the driver runs on one of the executor nodes, which cannot resolve paths that exist only on your local machine.

To confirm this, change the deploy mode to client and see whether it works.

A similar issue was fixed elsewhere; have a look at that.

Installing libcurl4-nss-dev fixed the problem.
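On a Debian/Ubuntu host (an assumption; the question does not name the distribution), missing native dependencies are closed by installing the packages that provide them; for instance, libsvn1 is the Debian package that ships libsvn_delta-1.so.0. A sketch for the fix mentioned above:

```shell
# Assumption: Debian/Ubuntu host. libcurl4-nss-dev provides the NSS
# flavour of libcurl that libmesos links against.
PKG=libcurl4-nss-dev
if dpkg -s "$PKG" >/dev/null 2>&1; then
  echo "$PKG already installed"
else
  # Not executed here; installing requires root:
  echo "run: sudo apt-get install -y $PKG"
fi
```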
