
Write kafka topic data to redis using docker

I am using this repository: kafka connect to redis.

Explanation: What I want to do is write Kafka topic data into Redis using Docker. They have created a README file that explains how to set up the Kafka connector configuration:

curl -s -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors

The connector.json file contains:

{
  "config" : {
    "name" : "RedisSinkConnector1",
    "connector.class" : "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector",
    "tasks.max" : "1",
    "topics" : "mostafa"
  }
}

Problem: I know how to create a new topic in Kafka, but I don't know how to change the docker-compose file or how to test the connection. Although I have created a new topic in Kafka, nothing shows up in the Redis database!

I would be thankful if anyone could help me.

For starters, there is no Kafka Connect container in the compose file there, so you'll need to add one, or start Kafka Connect outside of Docker on your host machine.

Then, it's not clear if you've gotten the Redis connector properly loaded, so open up http://localhost:8083/connector-plugins to see if it is listed (this will also verify that you've started the Connect server).
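For example, a quick check from the command line (a sketch, assuming the Connect REST API is reachable on localhost:8083 and that jq is installed for readable output):

curl -s http://localhost:8083/connector-plugins | jq '.[].class'

The Redis sink should show up as com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector; if it doesn't, the plugin isn't on the worker's plugin path.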

Once that's done, you can post your config (you will need to remove the -s that hides the curl output). Once posted, you will want to check the logs of the running Connect process, or you can also go to http://localhost:8083/connectors/RedisSinkConnector1/status
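For instance, a rough way to do both from a terminal (the container name kafka-connect is an assumption; it matches the compose snippet shown further down):

# follow the Connect worker logs and watch for connector start-up messages or stack traces
docker logs -f kafka-connect

# or query the connector status through the REST API
curl -s http://localhost:8083/connectors/RedisSinkConnector1/status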

Given what you've shown, and assuming you got this far, both of the above probably report a Connection Exception to localhost:6379, since that's the default connection. You'll have to provide "redis.hosts": "redis:6379" as a property.

Then, as also mentioned in the documentation:

This connector expects records from Kafka to have a key and value that are stored as bytes or a string

So, it wouldn't hurt to also add key and value converters to your properties to specify the data types. If you are using the Connect container from Confluent directly, it's probably set to use the Avro converter, not the string or bytes one.

Here's an example of a valid configuration that you can POST:

{
  "name" : "RedisSinkConnector1",
  "config" : {
    "connector.class" : "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector",
    "tasks.max" : "1",
    "topics" : "mostafa",
    "redis.hosts": "redis:6379",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter"
  }
}

With those adjustments, I would think sending any simple key-value message would work; then use redis-cli to run some SCAN/GET key queries.
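A minimal sketch of such a test (the broker address localhost:9092, the Redis container name redis, and the key mykey are assumptions):

# produce one keyed string record to the topic; key and value are separated by ':'
kafka-console-producer --bootstrap-server localhost:9092 --topic mostafa \
  --property parse.key=true --property key.separator=:
>mykey:myvalue

# then list the keys and look one up in Redis
docker exec -it redis redis-cli --scan
docker exec -it redis redis-cli GET mykey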

The following configuration would resolve the problem.

{
  "name" : "RedisSinkConnector1",
  "config" : {
    "connector.class" : "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector",
    "tasks.max" : "1",
    "topics" : "mostafa",
    "redis.hosts": "redis:6379",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter"
  }
}

Add kafka-connect to the docker-compose file:

kafka-connect:
    hostname: kafka-connect
    image: confluentinc/cp-kafka-connect:latest
    container_name: kafka-connect
    ports:
      - 8083:8083
    depends_on:
      - schema-registry
      - redis
    environment:
      CONNECT_BOOTSTRAP_SERVERS: kafka:9092
      CONNECT_REST_PORT: 8083
      CONNECT_GROUP_ID: "quickstart-avro"
      CONNECT_CONFIG_STORAGE_TOPIC: "quickstart-avro-config"
      CONNECT_OFFSET_STORAGE_TOPIC: "quickstart-avro-offsets"
      CONNECT_STATUS_STORAGE_TOPIC: "quickstart-avro-status"
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_INTERNAL_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_INTERNAL_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_REST_ADVERTISED_HOST_NAME: "kafka-connect"
      CONNECT_LOG4J_ROOT_LOGLEVEL: DEBUG
      CONNECT_PLUGIN_PATH: "/usr/share/java,/etc/kafka-connect/jars"
    volumes:
      - $PWD/jars:/etc/kafka-connect/jars
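Note that this service only picks up connector plugins from the paths in CONNECT_PLUGIN_PATH, so the Redis sink jars have to end up in the mounted ./jars directory. One way to do that, as a sketch (the zip filename is an assumption, and the lib/ layout follows the usual Confluent Hub packaging; download the connector package from its Confluent Hub page first):

# copy the connector's jars into the host directory that is mounted into the container
mkdir -p jars
unzip -j jcustenborder-kafka-connect-redis-*.zip '*/lib/*.jar' -d jars/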

The depends_on entry for redis and the CONNECT_REST_ADVERTISED_HOST_NAME variable are very important for resolving this issue.

Can you please send the source connector details for RedisSinkConnector1?

{ "name" : "RedisSinkConnector1", "config" : { "connector.class" : "com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector", "tasks.max" : "1", "topics" : "mostafa", "redis.hosts": "redis:6379", "key.converter": "org.apache.kafka.connect.storage.StringConverter", "value.converter": "org.apache.kafka.connect.storage.StringConverter" } } {“名称”:“RedisSinkConnector1”,“配置”:{“connector.class”:“com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector”,“tasks.max”:“1”,“主题”: “mostafa”,“redis.hosts”:“redis:6379”,“key.converter”:“org.apache.kafka.connect.storage.StringConverter”,“value.converter”:“org.apache.kafka.connect .storage.StringConverter" } }
