
Running Google Dataflow job on startup

Our Google Cloud Dataflow pipeline program calls some library which dynamically links to *.so files, so to run it I need to set the Linux environment variable LD_LIBRARY_PATH. There is a hack to do that: https://groups.google.com/forum/#!topic/comp.lang.java.programmer/LOu18-OWAVM , but I wonder whether there is a way to do it with some job that runs a shell script before the pipeline executes?
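For reference, the hack in that link is, as I understand it, of this general shape: since the dynamic linker only reads LD_LIBRARY_PATH at process startup, the program re-execs the JVM with the variable set. A minimal sketch; the library path is illustrative, and note this only affects the process it runs in, not the Dataflow worker VMs:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Relauncher {
    public static void main(String[] args) throws Exception {
        String libDir = "/opt/native-libs";  // illustrative path
        String current = System.getenv("LD_LIBRARY_PATH");
        if (current != null && current.contains(libDir)) {
            // Second pass: the variable is set, run the real entry point here.
            return;
        }
        // First pass: re-exec the same JVM with LD_LIBRARY_PATH set, since
        // the dynamic linker only reads it at process startup.
        String javaBin = System.getProperty("java.home")
                + File.separator + "bin" + File.separator + "java";
        List<String> cmd = new ArrayList<>(Arrays.asList(
                javaBin, "-cp", System.getProperty("java.class.path"),
                Relauncher.class.getName()));
        cmd.addAll(Arrays.asList(args));
        ProcessBuilder pb = new ProcessBuilder(cmd).inheritIO();
        pb.environment().put("LD_LIBRARY_PATH",
                current == null ? libDir : current + ":" + libDir);
        System.exit(pb.start().waitFor());
    }
}
```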

Are you using JNI for this? Can you set the environment variable in the Java code before you set up your JNI code?
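One caveat with that suggestion: the standard Java API cannot modify the environment of the running process (the System.getenv() map is read-only), and LD_LIBRARY_PATH is consulted when the JVM starts. The closest in-process equivalent is to load the library by absolute path with System.load() before the first native call, e.g. in a static initializer so it runs once per worker JVM. A minimal sketch using the Beam DoFn API; the path and native method are illustrative:

```java
import org.apache.beam.sdk.transforms.DoFn;

public class NativeCallingFn extends DoFn<String, String> {
    static {
        // Runs once per worker JVM, before any native method is invoked.
        // System.load takes an absolute path, so LD_LIBRARY_PATH is not needed.
        System.load("/opt/native-libs/libmylib.so");  // illustrative path
    }

    // Hypothetical JNI method implemented by the loaded .so
    private static native String transform(String input);

    @ProcessElement
    public void processElement(ProcessContext c) {
        c.output(transform(c.element()));
    }
}
```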

You may also want to just load the .so file from the classpath and pass it in a stream. Is it possible to link in the .so file that way?
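That could look something like this sketch: copy the .so from a classpath resource to a temporary file and load it from there, since System.load() needs a real file path. The class and resource names are illustrative:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public final class NativeLoader {
    private NativeLoader() {}

    // Copies a .so bundled as a classpath resource to a temp file and loads
    // it, since System.load() requires a path on the local filesystem.
    public static void loadFromClasspath(String resource) {
        try (InputStream in = NativeLoader.class.getResourceAsStream(resource)) {
            if (in == null) {
                throw new IllegalStateException("resource not found: " + resource);
            }
            Path tmp = Files.createTempFile("native-", ".so");
            tmp.toFile().deleteOnExit();
            Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
            System.load(tmp.toAbsolutePath().toString());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

A DoFn's static initializer could then call NativeLoader.loadFromClasspath("/native/libmylib.so") (an illustrative resource path) so the library is loaded once per worker JVM.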

See filesToStage here for how to look up the file: https://cloud.google.com/dataflow/pipelines/specifying-exec-params
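A sketch of setting that option in code, assuming the Dataflow runner's DataflowPipelineOptions interface (which exposes filesToStage; the same option can be passed on the command line as --filesToStage). The paths are illustrative:

```java
import java.util.Arrays;

import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class StagingExample {
    public static void main(String[] args) {
        DataflowPipelineOptions options = PipelineOptionsFactory
                .fromArgs(args)
                .as(DataflowPipelineOptions.class);
        // Staged files are uploaded and then downloaded onto every worker,
        // where a DoFn can locate the .so and System.load it. Note that
        // setting the option explicitly replaces the default list, so the
        // pipeline's own jar must be included too.
        options.setFilesToStage(Arrays.asList(
                "/path/to/pipeline-bundled.jar",  // the pipeline code itself
                "/path/to/libmylib.so"));         // the native library
        // ... construct and run the Pipeline with these options ...
    }
}
```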

Perhaps the suggestions in these links could work as well, and you could package the .so file in your jar: How to bundle a native library and a JNI library inside a JAR?

https://www.adamheinrich.com/blog/2012/12/how-to-load-native-jni-library-from-jar/
