
Apache Flink (How to uniquely tag Jobs)

Is it possible to tag jobs with a unique name so I can stop them at a later date? I don't really want to grep and persist Job IDs.

In a nutshell I want to stop a job as part of my deployment and deploy the new one.

You can name a job when you start it via the execute(name: String) call, e.g.:

val env: StreamExecutionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment

val result: DataStream[String] = ???  // your job logic (String is just a placeholder type)
result.addSink(new YourSinkFunction)  // add a sink

env.execute("Name of your job")       // execute and assign a name

The REST API of the JobManager provides a list of job details, which includes the name of each job and its JobID. You can look up the JobID by name and use it to cancel the job during deployment.
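A minimal sketch of that lookup, assuming the JobManager's default REST port 8081 and the job name used in execute() above. The sample payload below mirrors the shape of the /jobs/overview response; the live-cluster curl calls are shown as comments:

```shell
# Against a live cluster you would fetch the overview directly:
#   curl -s http://localhost:8081/jobs/overview
#
# Here we parse a sample payload of the same shape to show how to
# select the JobID of a RUNNING job by the name given to execute():
SAMPLE='{"jobs":[{"jid":"a1b2c3d4e5f6","name":"Name of your job","state":"RUNNING"}]}'
JOB_ID=$(echo "$SAMPLE" | python3 -c '
import json, sys
for job in json.load(sys.stdin)["jobs"]:
    if job["name"] == "Name of your job" and job["state"] == "RUNNING":
        print(job["jid"])
')
echo "$JOB_ID"

# With the JobID in hand, cancel the old job as part of the deployment:
#   curl -X PATCH "http://localhost:8081/jobs/${JOB_ID}?mode=cancel"
```

This keeps the deployment script free of persisted Job IDs: the job name is the stable handle, and the ID is resolved at stop time.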
