Linking native .so in Google Dataproc Spark

I have a native library (Freeling) that I compile and build locally with cmake and install via a cluster initialization action (so it should be present on the master and on every worker).

Even so, when I call System.loadLibrary I get this error:

Exception in thread "main" java.lang.UnsatisfiedLinkError: no Jfreeling in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
    at java.lang.Runtime.loadLibrary0(Runtime.java:870)
    at java.lang.System.loadLibrary(System.java:1122)

I tried to make the program find the library (the call is made in a static code block) using the following properties:

      "properties": {
        "spark.driver.extraClassPath": "/usr/local/share/freeling/APIs/java/Jfreeling.jar:/usr/local/lib/libfreeling.so",
        "spark.executor.extraClassPath": "/usr/local/share/freeling/APIs/java/Jfreeling.jar:/usr/local/lib/libfreeling.so",
        "spark.executor.extraLibraryPath": "/usr/local/lib/libfreeling.so",
        "spark.driver.extraLibraryPath": "/usr/local/lib/libfreeling.so",
        "spark.executorEnv.LD_PRELOAD": "/usr/local/lib/libfreeling.so",
        "spark.yarn.dist.files": "/usr/local/lib/libfreeling.so",
        "spark.yarn.appMasterEnv.LD_PRELOAD": "libfreeling.so",
        "spark.files": "/usr/local/lib/libfreeling.so",
        "spark.executorEnv.LD_LIBRARY_PATH": "libfreeling.so"
      },
      "jarFileUris": [
        "file:///usr/local/share/freeling/APIs/java/Jfreeling.jar",
        "file:///usr/local/lib/libfreeling.so"
      ],
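For context, the static block mentioned above amounts to something like this (a minimal sketch; FreelingWrapper is a hypothetical class name, not the actual Freeling API code):

public class FreelingWrapper {
    static {
        // The JVM searches the directories listed in java.library.path for
        // libJfreeling.so; if none of them contains it, this line throws the
        // UnsatisfiedLinkError shown above.
        System.loadLibrary("Jfreeling");
    }
}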

Can you try putting your library in /usr/lib/hadoop/lib/native/? Check /etc/spark/conf/spark-env.sh, which has:

# Spark got rid of SPARK_LIBRARY_PATH in 1.0
# It has properties for extraLibraryPaths, but this is more extensible
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${HADOOP_HOME}/lib/native
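A sketch of what that could look like as an extra step in the cluster initialization action (the source paths are taken from the question and the answer below; this is an assumption, not tested on your cluster):

#!/bin/bash
# Copy the JNI wrapper and the Freeling core library into the directory
# that spark-env.sh already appends to LD_LIBRARY_PATH on every node.
cp /usr/local/share/freeling/APIs/java/libJfreeling.so /usr/lib/hadoop/lib/native/
cp /usr/local/lib/libfreeling.so /usr/lib/hadoop/lib/native/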

You should add /usr/local/share/freeling/APIs/java/Jfreeling.jar to your CLASSPATH, and /usr/local/share/freeling/APIs/java (the directory containing libJfreeling.so) to your LD_LIBRARY_PATH.
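For example (a sketch using the paths above; note that LD_LIBRARY_PATH takes directories, not individual .so files, and for the executors the equivalent settings are spark.executor.extraClassPath and spark.executor.extraLibraryPath):

export CLASSPATH=$CLASSPATH:/usr/local/share/freeling/APIs/java/Jfreeling.jar
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/share/freeling/APIs/java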
