
How do I submit a job to a Flink cluster using Java code?

I have already uploaded a fat jar containing my application code to the /lib folder of every node in my Flink cluster. I am trying to start the Flink job from a separate Java application, but can't find a good way to do so.

The closest thing to a solution that I have found so far is the Monitoring REST API, which has a run-job endpoint. However, that only lets you run jobs that were submitted via the jar upload function.
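For reference, the run-job route that API exposes is `POST /jars/:jarid/run`. A rough sketch of assembling that request in plain Java (the jar id below is a made-up placeholder for the id returned by the upload call; the `parallelism` and `programArgs` parameters follow the documented run-job endpoint):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class FlinkRestSubmit {

    // Builds the run-job URI for a jar previously uploaded via POST /jars/upload.
    static URI runJarUri(String host, int port, String jarId, int parallelism) {
        return URI.create(String.format(
                "http://%s:%d/jars/%s/run?parallelism=%d",
                host, port, jarId, parallelism));
    }

    public static void main(String[] args) {
        // "abc123_job.jar" is a placeholder for the id the upload call returns.
        URI uri = runJarUri("localhost", 8081, "abc123_job.jar", 1);
        System.out.println(uri);

        // The actual submission is a POST; program arguments can go in the body:
        HttpRequest request = HttpRequest.newBuilder(uri)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"programArgs\": \"--port 9000\"}"))
                .build();
        System.out.println(request.method() + " " + request.uri().getPath());
        // HttpClient.newHttpClient().send(request, ...) would execute it
        // against a live cluster.
    }
}
```

This only works for jars already uploaded through the REST API, which is exactly the limitation described above.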

I have seen ClusterClient.java in the flink-client module, but could not find any examples of how to use it.

Any examples of how someone has successfully submitted jobs through Java code would be greatly appreciated!

You can use RestClusterClient to run a PackagedProgram that points to your Flink job. If your job accepts arguments, you can pass them along.

Here is an example for a standalone cluster running on localhost:8081:

import java.io.File;

import org.apache.flink.api.common.JobSubmissionResult;
import org.apache.flink.client.deployment.StandaloneClusterId;
import org.apache.flink.client.program.PackagedProgram;
import org.apache.flink.client.program.rest.RestClusterClient;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.JobManagerOptions;
import org.apache.flink.configuration.RestOptions;

String clusterHost = "localhost";
int clusterPort = 8081;

Configuration config = new Configuration();
config.setString(JobManagerOptions.ADDRESS, clusterHost);
config.setInteger(RestOptions.PORT, clusterPort);

String jarFilePath = "/opt/flink/examples/streaming/SocketWindowWordCount.jar";
String[] args = new String[]{ "--port", "9000" };
PackagedProgram packagedProgram = new PackagedProgram(new File(jarFilePath), args);

RestClusterClient<StandaloneClusterId> client =
         new RestClusterClient<StandaloneClusterId>(config, StandaloneClusterId.getInstance());

int parallelism = 1;
JobSubmissionResult result = client.run(packagedProgram, parallelism);

This seems to work for Flink 1.10:

import java.io.File;

import org.apache.flink.api.common.JobID;
import org.apache.flink.client.deployment.StandaloneClusterId;
import org.apache.flink.client.program.PackagedProgram;
import org.apache.flink.client.program.PackagedProgramUtils;
import org.apache.flink.client.program.rest.RestClusterClient;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.JobManagerOptions;
import org.apache.flink.configuration.RestOptions;
import org.apache.flink.runtime.jobgraph.JobGraph;

private static final int PARALLELISM = 8;
private static final Configuration FLINK_CONFIG = new Configuration();

void foo() throws Exception {
    FLINK_CONFIG.setString(JobManagerOptions.ADDRESS, "localhost");
    FLINK_CONFIG.setInteger(RestOptions.PORT, 8081);
    FLINK_CONFIG.setInteger(RestOptions.RETRY_MAX_ATTEMPTS, 3);

    RestClusterClient<StandaloneClusterId> flinkClient =
            new RestClusterClient<>(FLINK_CONFIG, StandaloneClusterId.getInstance());

    String jar = "/path/to/jar";
    String[] args = new String[]{"..."};
    PackagedProgram program = PackagedProgram.newBuilder()
            .setJarFile(new File(jar))
            .setArguments(args)
            .build();

    // Compile the program into a JobGraph on the client side, then submit it.
    JobGraph jobGraph = PackagedProgramUtils.createJobGraph(program, FLINK_CONFIG, PARALLELISM, false);

    JobID jobId = flinkClient.submitJob(jobGraph).get();
    ...
}
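Once submitJob returns a JobID, the job's state can be polled through the same REST interface the client talks to: `GET /jobs/:jobid` returns a JSON document with a top-level `"state"` field. A minimal sketch, using a naive string scan instead of a JSON library to keep it self-contained (the job id and payload below are illustrative samples):

```java
public class JobStatePoller {

    // Builds the job-detail URL for GET /jobs/:jobid.
    static String jobUrl(String host, int port, String jobId) {
        return String.format("http://%s:%d/jobs/%s", host, port, jobId);
    }

    // Extracts the value of the top-level "state" field from the JSON
    // the endpoint returns (a flat field, so a string scan is enough here).
    static String extractState(String json) {
        int key = json.indexOf("\"state\"");
        if (key < 0) return null;
        int start = json.indexOf('"', json.indexOf(':', key)) + 1;
        return json.substring(start, json.indexOf('"', start));
    }

    public static void main(String[] args) {
        // A JobID renders as 32 hex characters, e.g. via JobID.toString().
        System.out.println(jobUrl("localhost", 8081, "b0a742f83b66c9ac533bedcf4a582c8f"));

        // Abbreviated sample of the payload the endpoint returns:
        String sample = "{\"jid\":\"b0a742f83b66c9ac533bedcf4a582c8f\",\"state\":\"RUNNING\"}";
        System.out.println(extractState(sample)); // RUNNING
    }
}
```

In a real client you would fetch the body with an HTTP client and loop until the state reaches a terminal value such as FINISHED or FAILED.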



 