
Apache Flink (How to uniquely tag Jobs)

Is it possible to tag jobs with a unique name so I can stop them at a later date? I don't really want to grep and persist Job IDs.

In a nutshell, I want to stop a job as part of my deployment and deploy the new one.

You can name jobs when you start them in the execute(name: String) call, e.g.,

import org.apache.flink.streaming.api.scala._

val env: StreamExecutionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment

val result: DataStream[String] = ???  // your job logic
result.addSink(new YourSinkFunction)  // add a sink

env.execute("Name of your job")       // execute and assign a name

The REST API of the JobManager provides a list of job details, which includes the name of each job and its JobId.
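To illustrate, here is a minimal Python sketch of matching a job name back to its JobId from such a REST response. The endpoint path (/jobs/overview) and the field names (jid, name) are assumptions to verify against your Flink version's REST API documentation; the sample payload below is illustrative, not captured from a real cluster.

```python
import json

def job_id_by_name(overview_json: str, job_name: str):
    """Return the JobId ("jid") of the first job whose name matches, else None.

    Assumes the JobManager's /jobs/overview response shape:
    {"jobs": [{"jid": "...", "name": "...", "state": "..."}, ...]}
    """
    overview = json.loads(overview_json)
    for job in overview.get("jobs", []):
        if job.get("name") == job_name:
            return job.get("jid")
    return None

# Illustrative sample response (hypothetical JobIds):
sample = json.dumps({
    "jobs": [
        {"jid": "a1b2c3d4e5f6", "name": "Name of your job", "state": "RUNNING"},
        {"jid": "ffffffffffff", "name": "other-job", "state": "FINISHED"},
    ]
})

print(job_id_by_name(sample, "Name of your job"))  # prints: a1b2c3d4e5f6
```

In a deployment script you would fetch the overview from the JobManager (e.g., with curl or an HTTP client), resolve the JobId by name this way, and then stop that job before submitting the new one.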


 