Flink job submitted from sql-client.sh, how to resume from savepoint?
I submitted a job from the Apache Flink SQL client and created a savepoint. The issue is that the savepoint metadata doesn't contain the jar or the entry class name, let alone the program arguments. How do I restart the job from the savepoint?
flink run -s hdfs://ns/flink/flink-checkpoints/savepoint-c5dade-af74904ab30c -m yarn-cluster -yid application_1539849585041_0459 -c org.apache.flink.table.client.SqlClient opt/flink-sql-client-1.6.1.jar embedded -e /opt/flink/flink-bin/yarn-app/a/sql-client-kafka-json.yaml --library /opt/flink/flink-bin//lib -u 'INSERT INTO testjsonSink SELECT * FROM testjsonSource;'
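The idea behind this invocation can be sketched generically: because the savepoint metadata carries no jar or entry class, you re-run the same SqlClient command you used to submit the job, adding only `-s <savepoint>`. A minimal sketch, assuming Flink 1.6-style paths (the environment-file path below is a hypothetical placeholder, not taken from the question):

```shell
# Resuming a SQL-client job = re-running the original SqlClient submission
# with "-s <savepoint>" added. The savepoint stores only state, so the jar,
# entry class, and query must all be supplied again on the command line.
SAVEPOINT="hdfs://ns/flink/flink-checkpoints/savepoint-c5dade-af74904ab30c"
SQL_CLIENT_JAR="/opt/flink/opt/flink-sql-client-1.6.1.jar"  # assumed location
ENV_FILE="/opt/flink/conf/sql-client-env.yaml"              # hypothetical path
QUERY="INSERT INTO testjsonSink SELECT * FROM testjsonSource;"

# Compose the resume command; -c names the SQL client itself as entry class.
CMD="flink run -s $SAVEPOINT \
  -c org.apache.flink.table.client.SqlClient $SQL_CLIENT_JAR \
  embedded -e $ENV_FILE -u '$QUERY'"

echo "$CMD"
```

The key point is that `-s` only restores state; everything else (jar, class, query) must match the original submission closely enough that operator state still maps onto the restored job graph.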
Adjust the jars in the lib directory. It was solved the same way as before. Poor documentation.
Officially working for me with Flink 1.12.1 (Scala 2.12):

flink run -s hdfs://dbt1caw005.webex.com:9000/flink-checkpoints/savepoint-dafd7c-05d66b098493 -C file:///opt/flink/jars/flink-python_2.12-1.12.1.jar -c org.apache.flink.table.client.SqlClient /opt/flink/opt/flink-sql-client_2.12-1.12.1.jar embedded -e /vdb/sql.yml -l /opt/flink/jars -u "INSERT INTO CALL_DURATION_USER SELECT orgId, userId, window_start, window_end, total_minutes, total_calls FROM ( SELECT *, ROW_NUMBER() OVER (PARTITION BY orgId, window_end ORDER BY total_minutes DESC) AS rownum FROM ( SELECT orgId, userId, HOP_START(ts, INTERVAL '1' DAY, INTERVAL '30' DAY) window_start, HOP_END(ts, INTERVAL '1' DAY, INTERVAL '30' DAY) window_end, CAST(SUM(CAST(legDuration AS BIGINT) / 60) AS BIGINT) total_minutes, CAST(COUNT(*) AS BIGINT) total_calls FROM callduration_ts GROUP BY HOP(ts, INTERVAL '1' DAY, INTERVAL '30' DAY), orgId, userId ) ) WHERE rownum < 101"
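For completeness, the savepoint being resumed from has to be created first with the standard Flink CLI. A sketch of that side of the workflow, assuming the Flink 1.12 CLI (the job ID below is a hypothetical placeholder; use the one printed by `flink list`):

```shell
# Creating the savepoint that "flink run -s ..." later resumes from.
# JOB_ID is a placeholder, not a real ID from this thread.
JOB_ID="<your-job-id>"
TARGET="hdfs://dbt1caw005.webex.com:9000/flink-checkpoints"

# Trigger a savepoint while the job keeps running
# (echoed here as a sketch rather than executed):
echo flink savepoint "$JOB_ID" "$TARGET"

# Or stop the job gracefully, taking a final savepoint in one step:
echo flink stop --savepointPath "$TARGET" "$JOB_ID"
```

Either command prints the savepoint path (e.g. `.../savepoint-dafd7c-...`), which is what gets passed to `-s` on resume.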