
Multiple Jobs in the Same Apache Flink Project

I have multiple jobs in the same Flink project. I am wondering how, when I submit a job to the Flink cluster, I can tell which job is running from the submitted jar. For example:

I have 2 jobs in the same Flink project, build a jar, and deploy it to the cluster. Then I submit both jobs using the same jar.

Then I change job2, build a new jar file, and submit job2 with the newly uploaded jar.

In this case, when I look at the submission UI, I really can't tell which job was submitted from which jar.

To avoid this, I could create multiple Flink projects with different jar names.

Note: I'm using a CI/CD pipeline, and I cannot generate jar names dynamically if I have multiple jobs in the same project.

What are the best practices for this?

You can specify the job name when executing the flink run command by passing an argument to your main class and then using that value in env.execute(params.get("your-job-name")). This way you can use the same jar and pass different job names, so that you can distinguish the jobs in the Flink dashboard after deployment.
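A minimal sketch of this idea, assuming a streaming job with a placeholder pipeline; the class name Job2 and the argument name job-name are illustrative, not part of any fixed Flink convention:

```java
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class Job2 {
    public static void main(String[] args) throws Exception {
        // Parse the program arguments passed after the jar path on `flink run`
        ParameterTool params = ParameterTool.fromArgs(args);

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder pipeline; replace with the real sources/sinks of job2
        env.fromElements(1, 2, 3).print();

        // Use the passed-in name (e.g. "job2-v2") so the dashboard shows which
        // jar/version this submission came from; fall back to a default if absent
        env.execute(params.get("job-name", "job2"));
    }
}
```

You would then submit each job from the same jar with its own entry class and name, for example (paths and class names are illustrative):

```
flink run -c com.example.Job2 my-project.jar --job-name job2-v2
```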
