
Kafka Connect can't find my custom write strategy

I am trying to implement a custom write strategy for a sink connector that writes to MongoDB, as per the documentation here:

https://www.mongodb.com/docs/kafka-connector/current/sink-connector/fundamentals/write-strategies/

I am trying to get my connector to recognize the following (dummy) custom write strategy as a proof of concept:

package com.fu.connect.sink;

import org.bson.*;

import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.WriteModel;
import com.mongodb.kafka.connect.sink.converter.SinkDocument;
import com.mongodb.kafka.connect.sink.writemodel.strategy.WriteModelStrategy;

import org.apache.kafka.connect.errors.DataException;

public class CustomWriteModelStrategy implements WriteModelStrategy {

    private static final UpdateOptions UPDATE_OPTIONS = new UpdateOptions().upsert(true);

    // incoming JSON should have one "message" key, e.g. { "message": "Hello World" }
    @Override
    public WriteModel<BsonDocument> createWriteModel(SinkDocument document) {
        
        // Retrieve the value part of the SinkDocument
        BsonDocument vd = document.getValueDoc().orElseThrow(
                () -> new DataException("Error: cannot build the WriteModel since the value document was missing unexpectedly"));

        // extract message from incoming document
        BsonString message = new BsonString("");
        if (vd.containsKey("message")) {
            message = vd.get("message").asString();
        }

        // Define the filter part of the update statement
        BsonDocument filters = new BsonDocument("counter", new BsonDocument("$lt", new BsonInt32(10)));

        // Define the update part of the update statement
        BsonDocument updateStatement = new BsonDocument();
        updateStatement.append("$inc", new BsonDocument("counter", new BsonInt32(1)));
        updateStatement.append("$push", new BsonDocument("messages", new BsonDocument("message", message)));

        // Return the full update statement
        return new UpdateOneModel<BsonDocument>(
                filters,
                updateStatement,
                UPDATE_OPTIONS
        );
    }
}

(borrowed from https://github.com/felixreichenbach/kafkaWriteStrategy/blob/master/src/main/java/de/demo/kafka/CustomWriteModelStrategy.java ) I am compiling this class, along with some other custom transforms, into a .jar and adding it to my plugin path using the following Dockerfile:

FROM maven:3.6.0-jdk-11-slim AS build
COPY resources/custom_plugins /app/resources/custom_plugins
COPY resources/whitelist.csv /app/config/

WORKDIR /app/resources/custom_plugins
RUN mvn -e clean package

FROM confluentinc/cp-kafka-connect:7.2.2

ARG version
ENV VERSION=$version

USER appuser
RUN mkdir -p app/bin && \
    mkdir -p app/config
COPY --chown=appuser resources/truststore.jks app/config/
COPY --chown=appuser resources/whitelist.csv /app/config/

USER root

RUN confluent-hub install --no-prompt confluentinc/kafka-connect-avro-converter:5.5.3 && \
    confluent-hub install --no-prompt mongodb/kafka-connect-mongodb:1.8.0 && \
    mkdir /usr/share/confluent-hub-components/plugins && \
    mkdir /usr/share/confluent-hub-components/mongo_plugins && \
    cp /usr/share/confluent-hub-components/mongodb-kafka-connect-mongodb/lib/*.jar /usr/share/confluent-hub-components/mongo_plugins && \
    cp /usr/share/confluent-hub-components/confluentinc-kafka-connect-avro-converter/lib/*.jar /usr/share/confluent-hub-components/plugins && \
    cp /usr/share/filestream-connectors/*.jar /usr/share/confluent-hub-components/plugins

USER appuser

ENV ARTIFACT_ID=CustomPlugins-1.0-SNAPSHOT.jar
COPY --from=build /app/resources/custom_plugins/target/$ARTIFACT_ID /usr/share/confluent-hub-components/mongo_plugins/$ARTIFACT_ID

COPY --chown=appuser scripts/*.sh app/bin/
COPY --chown=appuser config/* app/config/

my current plugin path:

plugin.path=/usr/share/confluent-hub-components/plugins,/usr/share/confluent-hub-components/mongo_plugins/CustomPlugins-1.0-SNAPSHOT.jar,/usr/share/confluent-hub-components/mongo_plugins/mongo-kafka-connect-1.8.0-confluent.jar,

in my sink properties I set

writemodel.strategy=com.fu.connect.sink.CustomWriteModelStrategy 
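For context, that property sits alongside the connector's other settings. A minimal sink configuration might look like the following sketch (the connector name, topic, database, collection, and connection string are placeholders, not taken from the original setup):

```properties
name=mongo-sink-poc
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
topics=my-topic
connection.uri=mongodb://mongo:27017
database=mydb
collection=mycollection
# fully qualified name of the custom strategy compiled into the plugin .jar
writemodel.strategy=com.fu.connect.sink.CustomWriteModelStrategy
```

The `writemodel.strategy` value must be resolvable by the class loader that loads the MongoDB sink connector itself, which is what makes the .jar placement below matter.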

I have tried a variety of different path configurations, including but not limited to adding all the .jars to the same directory and specifying only one plugin path, and creating separate .jars for the SMTs and the custom write strategy. When I try to run my connector I always get the same error:

java.util.concurrent.ExecutionException: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 1 error(s):
Invalid value com.fu.connect.sink.CustomWriteModelStrategy for configuration writemodel.strategy: Class not found: com.fu.connect.sink.CustomWriteModelStrategy

My custom transformations work fine, but as far as I can tell the module that loads these transforms is different from the one that loads custom write strategies.

I have tried restructuring the Java code and the plugin path in every permutation I can come up with, and I always get the same error.

Figured it out: the problem was caused by all the .jar file copying that happened here:

mkdir /usr/share/confluent-hub-components/plugins && \
mkdir /usr/share/confluent-hub-components/mongo_plugins && \
cp /usr/share/confluent-hub-components/mongodb-kafka-connect-mongodb/lib/*.jar /usr/share/confluent-hub-components/mongo_plugins && \
cp /usr/share/confluent-hub-components/confluentinc-kafka-connect-avro-converter/lib/*.jar /usr/share/confluent-hub-components/plugins && \
cp /usr/share/filestream-connectors/*.jar /usr/share/confluent-hub-components/plugins

The plugin manager is very finicky about where the .jars are located when it comes to custom write strategies. Simply removing all these lines and updating the plugin path accordingly fixed the problem.
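One layout consistent with this fix (a sketch, not necessarily the author's exact final Dockerfile) keeps the `confluent-hub install` directories intact and places the custom .jar inside the MongoDB connector's own `lib` directory, so the connector's plugin class loader can see the strategy:

```dockerfile
# install connectors without copying their jars elsewhere
RUN confluent-hub install --no-prompt confluentinc/kafka-connect-avro-converter:5.5.3 && \
    confluent-hub install --no-prompt mongodb/kafka-connect-mongodb:1.8.0

# put the custom write strategy on the MongoDB connector's classpath
COPY --from=build /app/resources/custom_plugins/target/CustomPlugins-1.0-SNAPSHOT.jar \
     /usr/share/confluent-hub-components/mongodb-kafka-connect-mongodb/lib/
```

with the worker pointed at the parent directory only:

```properties
plugin.path=/usr/share/confluent-hub-components
```

Because Kafka Connect isolates each plugin directory in its own class loader, a strategy .jar listed as a separate `plugin.path` entry is invisible to the MongoDB connector, while a .jar inside the connector's `lib` directory is loaded with it.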
