I have a job running with the old Flink Kafka consumer (FlinkKafkaConsumer). Now I want to migrate it to KafkaSource, but I am not sure what wi ...
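A minimal sketch of what the replacement typically looks like, assuming the `flink-connector-kafka` artifact (Flink ≥ 1.14); the broker, topic, and group id are placeholders:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.kafka.clients.consumer.OffsetResetStrategy;

public class KafkaSourceMigration {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // KafkaSource replaces new FlinkKafkaConsumer<>(topic, schema, props)
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("broker:9092")   // placeholder
                .setTopics("input-topic")             // placeholder
                .setGroupId("my-group")               // placeholder
                // mirrors the old consumer's default: committed group offsets,
                // falling back to earliest when none exist
                .setStartingOffsets(OffsetsInitializer.committedOffsets(
                        OffsetResetStrategy.EARLIEST))
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> stream =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source");
        stream.print();
        env.execute("kafka-source-migration");
    }
}
```

One caveat worth checking against the docs for your version: KafkaSource does not restore the offset state of an old FlinkKafkaConsumer operator, so the usual bridge is to let the old job commit its offsets to Kafka and start the new source from the committed group offsets, as above.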
In Spark, we have the MapPartition function, which is used to do some initialization for a group of entries, such as a DB operation. Now I want to do the ...
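The closest DataStream-API analogue is the lifecycle of Flink's rich functions: `open()` runs once per parallel task instance before any records arrive, so expensive setup is amortized over every record that task processes, much like per-partition setup in Spark's mapPartitions. A plain-Java sketch of that lifecycle (no Flink dependency; the class and method names here are illustrative only):

```java
import java.util.ArrayList;
import java.util.List;

public class PerTaskInit {
    static List<String> events = new ArrayList<>();

    // Mimics a RichMapFunction: open() for one-time setup, map() per record.
    static class RichMapper {
        boolean ready = false;

        void open() {                 // one-time init, e.g. open a DB connection
            events.add("setup");
            ready = true;
        }

        int map(int value) {          // called once per record
            if (!ready) throw new IllegalStateException("open() was not called");
            return value * 2;
        }
    }

    public static void main(String[] args) {
        RichMapper m = new RichMapper();
        m.open();                     // the framework calls this once per task
        for (int v : new int[]{1, 2, 3}) events.add("out=" + m.map(v));
        System.out.println(events);   // [setup, out=2, out=4, out=6]
    }
}
```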
The official docs say Flink supports minor version upgrades - restoring a snapshot taken with an older minor version of Flink (1.x → 1.y). Q1. Do ...
Let's assume that I have an input DataStream and want to implement some functionality that requires "memory", so I need a ProcessFunction that gives me ac ...
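The "memory" a keyed ProcessFunction gets is per-key state (e.g. `ValueState`): each key sees its own independent slot, read and written as that key's elements arrive. A plain-Java sketch of that behavior, with an ordinary map standing in for keyed state:

```java
import java.util.HashMap;
import java.util.Map;

public class KeyedMemory {
    // Stands in for ValueState<Long>: one slot per key, scoped automatically
    // by Flink when the stream is keyed.
    static Map<String, Long> state = new HashMap<>();

    // Stands in for processElement(): read state, update it, emit a result.
    static long processElement(String key) {
        long count = state.getOrDefault(key, 0L) + 1;
        state.put(key, count);
        return count;
    }

    public static void main(String[] args) {
        for (String k : new String[]{"a", "b", "a", "a", "b"}) {
            System.out.println(k + " -> " + processElement(k));
        }
        // a -> 1, b -> 1, a -> 2, a -> 3, b -> 2
    }
}
```

In a real job the map disappears: you key the stream with `keyBy(...)` and declare a `ValueStateDescriptor`, and Flink scopes, checkpoints, and restores the state for you.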
Let's say we have an EventTimeSlidingWindow with an EventTime trigger based on some watermark. If the watermark is generated very infrequently, say ev ...
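The core of the issue can be shown without Flink: an event-time window may only fire once the watermark passes the window's end, so with infrequent watermarks the results sit buffered until the next watermark arrives, even though all the window's data came in long before. A small simulation with tumbling windows (the mechanics are the same for sliding windows, just with overlapping buckets):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.TreeMap;

public class WatermarkFiring {
    static final long SIZE = 10;                              // windows [0,10), [10,20), ...
    static TreeMap<Long, Integer> windows = new TreeMap<>();  // windowStart -> element count
    static List<Long> fired = new ArrayList<>();              // starts of fired windows

    static void onElement(long ts) {
        windows.merge(ts - (ts % SIZE), 1, Integer::sum);     // bucket by window start
    }

    static void onWatermark(long wm) {                        // fire every window ending <= wm
        windows.entrySet().removeIf(e -> {
            if (e.getKey() + SIZE <= wm) { fired.add(e.getKey()); return true; }
            return false;
        });
    }

    public static void main(String[] args) {
        for (long ts : new long[]{1, 5, 12, 18, 23}) onElement(ts);
        onWatermark(15);              // only [0,10) is complete
        System.out.println(fired);    // [0]
        onWatermark(40);              // the late, infrequent watermark releases the rest
        System.out.println(fired);    // [0, 10, 20]
    }
}
```

This is why a sparse watermark generator directly bounds output latency: tightening the watermark interval (or using a periodic generator) is usually the fix.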
I am trying to use Flink SQL to read data from a Kafka topic. We have a pattern where, if the payload size is greater than 1 MB, we upload the payload to s ...
We have about 500 million drivers in 12 timezones. We send different communications such as their earnings report, new promotions, policy change updat ...
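Whatever orchestrates the sends, the per-timezone part reduces to a pure function of (UTC instant, driver's zone): the same instant maps to a different local hour in each of the 12 zones, so a "send between 09:00 and 18:00 local" rule can be checked with `java.time`. A sketch (the zone names and the 09:00-18:00 window are example assumptions):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class LocalSendWindow {
    // True when `now` falls inside the driver's local 09:00-18:00 send window.
    static boolean inSendWindow(Instant now, ZoneId zone) {
        int hour = ZonedDateTime.ofInstant(now, zone).getHour();
        return hour >= 9 && hour < 18;
    }

    public static void main(String[] args) {
        Instant t = Instant.parse("2023-06-01T12:00:00Z");
        System.out.println(inSendWindow(t, ZoneId.of("Europe/Berlin")));       // 14:00 local -> true
        System.out.println(inSendWindow(t, ZoneId.of("America/Los_Angeles"))); // 05:00 local -> false
    }
}
```

Using zone IDs rather than fixed UTC offsets matters here, since `java.time` then handles daylight-saving shifts per zone automatically.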
I am creating a Flink application that reads strings from a Kafka topic; for example, "2 5 9" is a value. It then splits the string on the " " delimiter and c ...
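The split-and-convert step described here is plain string handling; a sketch of the body that, inside Flink, would live in a `FlatMapFunction<String, Integer>` and emit each integer via `out.collect(...)`:

```java
import java.util.ArrayList;
import java.util.List;

public class SplitTokens {
    // Split one Kafka record on whitespace and emit one integer per token.
    static List<Integer> flatMap(String value) {
        List<Integer> out = new ArrayList<>();
        for (String token : value.trim().split("\\s+")) {
            out.add(Integer.parseInt(token));   // "2 5 9" -> 2, 5, 9
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(flatMap("2 5 9"));   // [2, 5, 9]
    }
}
```

Splitting on `\s+` after a `trim()` is slightly more robust than splitting on a single space, since it tolerates double spaces and leading/trailing whitespace in the record.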
I have a stream of sensor data that starts from now() and emits data each second, but the timestamps increase by 15 min. Let's say now is 19:00:00, so ...
I am trying to create a data stream processing pipeline for a product scanner which generates events in the form of the following Tuple4: Timestamp(long, in mil ...
I have two data sources - an S3 bucket and a Postgres database table. Both sources have records in the same format with a unique identifier of type uu ...
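Since the question is truncated, here is just the shape such merges usually take: union the two streams and keep one record per UUID. In a Flink job the "seen" map below would be keyed state on a stream keyed by the uuid field; this is a plain-Java sketch of the keep-first rule only:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.UUID;

public class DedupByUuid {
    // Stands in for per-key state: id -> first record accepted for that id.
    static Map<UUID, String> seen = new LinkedHashMap<>();

    // Returns true only for the first record carrying this id.
    static boolean acceptFirst(UUID id, String record) {
        return seen.putIfAbsent(id, record) == null;
    }

    public static void main(String[] args) {
        UUID a = UUID.fromString("00000000-0000-0000-0000-000000000001");
        System.out.println(acceptFirst(a, "from-s3"));        // true
        System.out.println(acceptFirst(a, "from-postgres"));  // false, duplicate id
    }
}
```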
We need to find the number of unique elements in the input stream for multiple time windows. The input data object has the following definition: InputData(ele1: I ...
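The per-window distinct count comes down to keeping a set per window bucket; the window result is the set's size. A plain-Java sketch with tumbling windows (in a Flink job this would typically be an aggregate over a windowed keyed stream, and for very high cardinality the exact set is often swapped for a HyperLogLog sketch to bound state size):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class DistinctPerWindow {
    static final long SIZE = 60_000;                      // 1-minute tumbling windows
    static Map<Long, Set<Integer>> buckets = new HashMap<>();  // windowStart -> distinct elements

    static void add(long ts, int element) {
        buckets.computeIfAbsent(ts - (ts % SIZE), k -> new HashSet<>()).add(element);
    }

    static int distinct(long windowStart) {
        return buckets.getOrDefault(windowStart, Set.of()).size();
    }

    public static void main(String[] args) {
        add(1_000, 7); add(2_000, 7); add(3_000, 9);      // window starting at 0
        add(61_000, 7);                                   // next window
        System.out.println(distinct(0));        // 2  (elements 7 and 9)
        System.out.println(distinct(60_000));   // 1
    }
}
```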
We want to use Apache Flink for the streaming job – read from one Kafka topic and write to another. The infrastructure will be deployed to Kubernetes. ...
I am using Flink v1.13; there are 4 task managers (16 CPUs each) with 3800 tasks (default application parallelism is 28). In my application, one operator ...
I am building a Flink pipeline and, based on live input data, need to read records from archive files in a RichFlatMapFunction (e.g. each day I want to ...
I'm trying to write a Flink streaming application that has a KafkaSource to read from a topic which has an Avro schema defined for its data. I would ...
I am having trouble using AvroParquetReader inside a Flink application (flink>=1.15). Motivation (a.k.a. why I want to use it): according to the official ...
I am trying to parse a nested field in a row of a data stream through a RichMapFunction<Row, Row>. The input and output of this are of type Row. This ...
I am using Apache Flink version 1.16.0. I am trying to do a simple CEP example by printing the elements to the console. For some reason, there is nothing printe ...