
Flink with Kafka Integration

I'm trying to integrate Flink with Kafka and read data from a Kafka producer. I'm running the following code, adapted from the Kafka connector example in the Flink documentation (flink-docs-release-1.11):

import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class Flink_Kafka_Integration {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("group.id", "test");
    
    
        FlinkKafkaConsumer<String> myConsumer = new FlinkKafkaConsumer<>("my-topic", new SimpleStringSchema(), properties);
        DataStream<String> stream = env.addSource(myConsumer);
    
    }
}

I'm getting the following errors:

The method addSource(SourceFunction<OUT>) in the type StreamExecutionEnvironment is not applicable for the arguments (FlinkKafkaConsumer<String>)
The type org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase cannot be resolved. It is indirectly referenced from required .class files

I have included a jar file called flink-streaming-java_2.12-1.11.3.jar in my project's build path.

Any suggestions would be helpful.

The following are the versions of software I'm using:

Flink - 1.11.3

Scala - 2.12

FlinkKafkaConsumer - 2.12

You need to include flink-connector-kafka_2.12-1.11.3.jar in your project build path. FlinkKafkaConsumer is not part of flink-streaming-java; it lives in the separate Kafka connector artifact. The simplest way is to add org.apache.flink:flink-connector-kafka_2.12:1.11.3 as a Maven/Gradle dependency so that its transitive dependencies (such as kafka-clients) are pulled in as well, which also resolves the FlinkKafkaConsumerBase error.
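
With that dependency on the classpath, a minimal runnable version of the job looks like the sketch below. The broker address localhost:9092 and the topic name my-topic are taken from your code; the print() sink and the job name are assumptions for illustration. Note that your original snippet also never calls env.execute(), so even once it compiles, nothing would be consumed until the job is actually submitted.

import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class Flink_Kafka_Integration {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka consumer configuration (assumes a broker on localhost:9092 and a topic "my-topic")
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("group.id", "test");

        FlinkKafkaConsumer<String> myConsumer =
                new FlinkKafkaConsumer<>("my-topic", new SimpleStringSchema(), properties);

        DataStream<String> stream = env.addSource(myConsumer);

        // Print each record so the pipeline has a sink, then submit the job.
        // Without env.execute() the job graph is built but never run.
        stream.print();
        env.execute("Flink Kafka Integration");
    }
}

If you then run a console producer against my-topic, the records should show up on the TaskManager's stdout.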
