
Flink - Integration Testing Table API?

I have built a very small and straightforward Flink app which consumes events from Kafka (JSON), deserializes them into a Java object, creates two Tables, uses the Table API to do some simple operations, and finally joins the two Tables and writes the result back to Kafka.
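For reference, here is a minimal sketch of the pipeline being described, written against the Flink 1.8-era Table API. The topic names, the Event POJO, and the toy line parser are illustrative placeholders; a real job would plug in a proper JSON DeserializationSchema.

```java
import java.util.Properties;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class OrdersJoinJob {

    /** POJO for the deserialized events; fields are illustrative. */
    public static class Event {
        public long id;
        public double amount;
        public Event() {}
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // 1.8-era factory; newer versions use StreamTableEnvironment.create(env).
        StreamTableEnvironment tEnv = TableEnvironment.getTableEnvironment(env);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "my-group");

        // Toy "id,amount" parser standing in for a real JSON deserializer.
        MapFunction<String, Event> parse = new MapFunction<String, Event>() {
            @Override
            public Event map(String line) {
                String[] parts = line.split(",");
                Event e = new Event();
                e.id = Long.parseLong(parts[0]);
                e.amount = Double.parseDouble(parts[1]);
                return e;
            }
        };

        // Consume the two topics and deserialize each record to a Java object.
        DataStream<Event> orders = env
                .addSource(new FlinkKafkaConsumer<>("orders", new SimpleStringSchema(), props))
                .map(parse);
        DataStream<Event> payments = env
                .addSource(new FlinkKafkaConsumer<>("payments", new SimpleStringSchema(), props))
                .map(parse);

        // Register both streams as tables, renaming fields to avoid clashes.
        Table ordersTable = tEnv.fromDataStream(orders, "id as orderId, amount as orderAmount");
        Table paymentsTable = tEnv.fromDataStream(payments, "id as payId, amount as payAmount");

        // Simple projection + join, then write the result back to Kafka.
        Table joined = ordersTable
                .join(paymentsTable)
                .where("orderId = payId")
                .select("orderId, orderAmount, payAmount");

        tEnv.toAppendStream(joined, Row.class)
            .map(new MapFunction<Row, String>() {
                @Override
                public String map(Row row) {
                    return row.toString();
                }
            })
            .addSink(new FlinkKafkaProducer<>("localhost:9092", "joined-results",
                    new SimpleStringSchema()));

        env.execute("orders-payments-join");
    }
}
```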

What are the best practices for testing such code? How do I write an integration test that verifies that the code written with the Table API produces the right result?

(Using Flink 1.8.3)

We added an integration test for the Kafka SQL connector in KafkaTableITCase since Flink 1.10. It creates a Kafka table, writes some data into it (using the JSON format), reads it back, applies a window aggregation, and finally checks the window results using a TestingSinkFunction. You can check the code here:

https://github.com/apache/flink/blob/release-1.10/flink-connectors/flink-connector-kafka-base/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaTableTestBase.java
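The same pattern can be reproduced in a condensed form without a Kafka cluster: run the Table API job against known, bounded input and collect the output in a static testing sink that the test thread can assert on. Below is a minimal sketch under those assumptions; CollectingSink is a hand-rolled stand-in for the TestingSinkFunction mentioned above, and env.fromElements() replaces the Kafka round-trip.

```java
import static org.junit.Assert.assertEquals;

import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.SinkFunction;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;
import org.junit.Test;

public class TableJoinITCase {

    /**
     * Collects result rows into a static list so the test thread can
     * inspect them. This works because the test runs the job in a local
     * mini-cluster inside the same JVM.
     */
    public static class CollectingSink implements SinkFunction<Row> {
        static final List<Row> RESULTS = new CopyOnWriteArrayList<>();
        @Override
        public void invoke(Row value, Context context) {
            RESULTS.add(value);
        }
    }

    @Test
    public void joinProducesOneRowPerMatchingKey() throws Exception {
        CollectingSink.RESULTS.clear();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);
        // 1.8-era factory; newer versions use StreamTableEnvironment.create(env).
        StreamTableEnvironment tEnv = TableEnvironment.getTableEnvironment(env);

        // Known, bounded input instead of Kafka topics.
        DataStream<Tuple2<Long, String>> left = env.fromElements(
                Tuple2.of(1L, "a"), Tuple2.of(2L, "b"));
        DataStream<Tuple2<Long, String>> right = env.fromElements(
                Tuple2.of(1L, "x"), Tuple2.of(2L, "y"), Tuple2.of(3L, "unmatched"));

        Table l = tEnv.fromDataStream(left, "lId, lVal");
        Table r = tEnv.fromDataStream(right, "rId, rVal");
        Table joined = l.join(r).where("lId = rId").select("lId, lVal, rVal");

        tEnv.toAppendStream(joined, Row.class).addSink(new CollectingSink());
        env.execute();

        // Keys 1 and 2 match, key 3 does not; a real test would also
        // compare the collected rows field by field.
        assertEquals(2, CollectingSink.RESULTS.size());
    }
}
```

For true end-to-end coverage including Kafka itself, the linked KafkaTableTestBase shows how the Flink project brings up an embedded Kafka environment for its own integration tests.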
