Flink: Wrap executable non-flink jar to run it in a flink cluster

Assume that I have an executable jar file that doesn't have any Flink code inside, and my job is to make it distributed with Flink. I have already done this once by creating and executing the StreamExecutionEnvironment somewhere in the code and placing the distributable parts of the jar's code inside Flink operators (i.e., map functions).

Yesterday, I was asked to do a similar job, but with minimal effort. They told me to find a way to wrap this Flink-less jar so that it can be executed by a Flink cluster (without injecting code and altering the jar as I did above). Is there a way to do this? The docs state that to support execution from a packaged jar, "a program must use the environment obtained by StreamExecutionEnvironment.getExecutionEnvironment()". Is there no other way?

My only guess right now is to wrap the entry point of the jar and place it inside Flink operators, but unfortunately I don't know what this jar does.

You could write a map-only program and package it in a jar. In the map function, you execute the main method of the provided jar through reflection.
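A minimal sketch of this idea, assuming the wrapped jar is already on the classpath (e.g. placed in Flink's lib/ directory) and that its entry-point class and arguments are passed as parameters (the parameter names --mainClass and --jobArgs are made up for illustration):

```java
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class JarWrapperJob {

    public static void main(String[] args) throws Exception {
        ParameterTool params = ParameterTool.fromArgs(args);
        String mainClass = params.getRequired("mainClass"); // hypothetical parameter
        String jobArgs = params.get("jobArgs", "");          // hypothetical parameter

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements(jobArgs)
           .map(argString -> {
               // Look up the wrapped jar's entry point and invoke its main method reflectively.
               Class<?> clazz = Class.forName(mainClass);
               java.lang.reflect.Method main = clazz.getMethod("main", String[].class);
               String[] wrappedArgs = argString.isEmpty() ? new String[0] : argString.split(" ");
               main.invoke(null, (Object) wrappedArgs);
               return "done";
           })
           .print();

        env.execute("wrapper around " + mainClass);
    }
}
```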

Your small wrapper could be put into Flink's lib directory so it is reusable for other jars, or you could add the other jar to the distributed cache.
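The distributed-cache variant could look roughly like the sketch below. The file path, cache name, and entry-point class (com.example.LegacyMain) are placeholders; the idea is to ship the Flink-less jar to every TaskManager via registerCachedFile and load it with a URLClassLoader inside a RichMapFunction:

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CachedJarWrapperJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Ship the Flink-less jar to every TaskManager (path and cache name are assumptions).
        env.registerCachedFile("hdfs:///jobs/legacy-app.jar", "wrappedJar");

        env.fromElements("run")
           .map(new RichMapFunction<String, String>() {
               @Override
               public String map(String value) throws Exception {
                   // Locate the cached jar on the TaskManager and invoke its main method reflectively.
                   File jar = getRuntimeContext().getDistributedCache().getFile("wrappedJar");
                   try (URLClassLoader loader = new URLClassLoader(
                           new URL[]{jar.toURI().toURL()}, getClass().getClassLoader())) {
                       Class<?> clazz = loader.loadClass("com.example.LegacyMain"); // hypothetical entry point
                       clazz.getMethod("main", String[].class).invoke(null, (Object) new String[0]);
                   }
                   return "done";
               }
           })
           .print();

        env.execute("distributed-cache wrapper");
    }
}
```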

Btw, I haven't fully understood the use case, and I find it a bit odd, since it's unclear to me how the parallelism is supposed to work. So sorry if the answer does not help.
