
Connecting MongoDB sink connector and Kafka

I am trying to connect MongoDB and Kafka by using the sink connector, because I would like to write data from Kafka to MongoDB. I added the MongoDB connector jar file to the libs folder and edited the connect-standalone-demo.properties file as below:

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# These are defaults. This file just demonstrates how to override some settings.
bootstrap.servers=localhost:9092

# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
#value.converter=org.apache.kafka.connect.json.JsonConverter
#key.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=true
value.converter.schemas.enable=true

offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000

# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include 
# any combination of: 
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Note: symlinks will be followed to discover dependencies or plugins.
# Examples: 
# plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
# plugin.path=/home/adminacl/Kafka/kafka_2.13-3.1.0/libs

I have created the file [file-sink-standalone.properties], which has the configuration of the db details.

curl -X POST -H "Content-Type: application/json" -d '{
      "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
      "tasks.max": "1",
      "topics": "departments",
      "connection.uri": "mongodb://localhost:27017",
      "database": "hrmdb",
      "collection": "departments",
      "key.converter": "org.apache.kafka.connect.json.JsonConverter",
      "key.converter.schemas.enable": false,
      "value.converter": "org.apache.kafka.connect.json.JsonConverter",
      "value.converter.schemas.enable": false
}'
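
Note that connect-standalone.sh expects the connector configuration as a Java properties file rather than a JSON payload, so if the curl body above reflects what went into file-sink-standalone.properties, the file needs to be in properties format. A minimal sketch of the equivalent file, assuming the same settings as the JSON above (the connector name is hypothetical; standalone configs require a name entry):

# file-sink-standalone.properties -- properties-file equivalent of the JSON above
name=mongo-sink-departments
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
tasks.max=1
topics=departments
connection.uri=mongodb://localhost:27017
database=hrmdb
collection=departments
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false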

I am running the connector by using the below CLI,

bin/connect-standalone.sh config/connect-standalone-demo.properties config/file-sink-standalone.properties 

I am getting the following error,

ERROR Failed to create job for config/file-sink-standalone.properties (org.apache.kafka.connect.cli.ConnectStandalone:107)
[2022-07-27 17:04:20,424] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:117)

You have to uncomment the plugin.path property in your connect-standalone-demo.properties (the worker config).
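
For example, based on the libs path already shown (commented out) at the bottom of the worker config in the question, the uncommented line would look like the sketch below; the exact path must point at wherever the MongoDB connector jar actually lives:

# in config/connect-standalone-demo.properties -- remove the leading '#'
plugin.path=/home/adminacl/Kafka/kafka_2.13-3.1.0/libs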

In my environment, all I have is plugin.path=/usr/share/java for most of the Sink Connectors, and it works perfectly based on the connector.class value from the connector properties.

Have a read at https://docs.confluent.io/home/connect/self-managed/userguide.html#installing-kconnect-plugins
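
Once the worker is restarted with plugin.path set, one way to confirm the connector class was picked up is to query the Connect REST interface (assuming it is running on its default port 8083):

curl -s http://localhost:8083/connector-plugins
# the output should include com.mongodb.kafka.connect.MongoSinkConnector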
