
Running Google Dataflow job on startup

Our Google Cloud Dataflow pipeline program calls a library that dynamically links to *.so files, so to run it I need to set the Linux environment variable LD_LIBRARY_PATH. There is a hack to do that: https://groups.google.com/forum/#!topic/comp.lang.java.programmer/LOu18-OWAVM , but I wonder whether there is a way to do it with some job that runs a shell script before executing the pipeline?

Are you using JNI for this? Can you set the environment variable in the Java code before you set up your JNI code?
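One caveat worth noting: a running JVM cannot modify its own process environment (System.getenv is effectively read-only), so setting LD_LIBRARY_PATH from Java only affects child processes. A common substitute is to load the library explicitly by absolute path before the first JNI call, which sidesteps the environment variable entirely. A minimal sketch, where the class name, library path, and native method are all hypothetical:

```java
// Hypothetical JNI wrapper; the library path is an assumption about what is
// present on the worker, not something the Dataflow service provides.
public final class MyNativeLib {

    static {
        // System.load takes an absolute path and bypasses LD_LIBRARY_PATH
        // and java.library.path entirely, so no environment variable is needed.
        // This runs in the worker JVM the first time the class is touched,
        // i.e. before any call to the native method below.
        System.load("/usr/local/lib/libmylib.so");
    }

    private MyNativeLib() {}

    // Hypothetical native entry point implemented in libmylib.so.
    public static native String nativeCall(String input);
}
```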

You may also want to just load the .so file from the classpath and pass it in as a stream. Is it possible to link in the .so file that way?
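Concretely, JNI cannot load a library directly from a stream, so the usual version of this idea is to read the .so off the classpath, copy it to a temporary file, and hand that file to System.load. A sketch under that assumption; the resource path is a placeholder:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public final class NativeLoader {

    /**
     * Copies a .so bundled on the classpath (e.g. inside the job jar) to a
     * temporary file and loads it. resourcePath is something like
     * "/native/libmylib.so" (hypothetical).
     */
    public static void loadFromClasspath(String resourcePath) throws IOException {
        try (InputStream in = NativeLoader.class.getResourceAsStream(resourcePath)) {
            if (in == null) {
                throw new IOException("Resource not found: " + resourcePath);
            }
            // JNI can only load from a real file, so stage the stream to disk first.
            Path tmp = Files.createTempFile("native-", ".so");
            tmp.toFile().deleteOnExit();
            Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
            System.load(tmp.toAbsolutePath().toString());
        }
    }

    private NativeLoader() {}
}
```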

See filesToStage here for how to look up the file: https://cloud.google.com/dataflow/pipelines/specifying-exec-params
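If the .so is shipped as a staged file rather than a classpath resource, it can be added to filesToStage programmatically (or via the --filesToStage command-line flag). A sketch, assuming the Dataflow runner's pipeline options and a hypothetical local path:

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class StageNativeLib {
    public static void main(String[] args) {
        DataflowPipelineOptions options =
                PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);

        // filesToStage normally defaults to the detected classpath, so extend
        // it rather than replacing it outright.
        List<String> toStage = options.getFilesToStage() == null
                ? new ArrayList<>()
                : new ArrayList<>(options.getFilesToStage());
        toStage.add("/local/path/libmylib.so"); // hypothetical path to the .so
        options.setFilesToStage(toStage);

        // ... construct and run the pipeline with these options ...
    }
}
```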

Perhaps the suggestions in these links could work as well, and you could package the .so file in your jar: How to bundle a native library and a JNI library inside a JAR?

https://www.adamheinrich.com/blog/2012/12/how-to-load-native-jni-library-from-jar/
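Tying these ideas together, the pattern those articles describe is a static initializer that tries the normal library lookup first and falls back to the copy bundled in the jar. A sketch reusing the hypothetical NativeLoader above:

```java
public final class NativeBinding {

    static {
        try {
            // Works when the library is on java.library.path / LD_LIBRARY_PATH.
            System.loadLibrary("mylib");
        } catch (UnsatisfiedLinkError e) {
            try {
                // Fall back to the copy packaged inside the job jar
                // (resource path is hypothetical; see NativeLoader above).
                NativeLoader.loadFromClasspath("/native/libmylib.so");
            } catch (java.io.IOException io) {
                throw new ExceptionInInitializerError(io);
            }
        }
    }

    private NativeBinding() {}

    // Hypothetical native entry point.
    public static native String transform(String input);
}
```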
