
kafka jdbc sink connector standalone error

I am trying to insert data into a Postgres database from a topic in Kafka. I am using the following command to load the connector:

./bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-jdbc/sink-quickstart-mysql.properties

The sink-quickstart-mysql.properties file is as follows:

name=test-sink-mysql-jdbc-autoincrement
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=third_topic
connection.url=jdbc:postgres://localhost:5432/postgres
connection.user=postgres
connection.password=postgres
auto.create=true

The error I am getting is:

[2019-01-29 13:16:48,859] ERROR Failed to create job for /home/ashley/confluent-5.1.0/etc/kafka-connect-jdbc/sink-quickstart-mysql.properties (org.apache.kafka.connect.cli.ConnectStandalone:102)
[2019-01-29 13:16:48,862] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:113)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}
    at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
    at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:110)
Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}
    at org.apache.kafka.connect.runtime.isolation.Plugins.newConnector(Plugins.java:179)
    at org.apache.kafka.connect.runtime.AbstractHerder.getConnector(AbstractHerder.java:382)
    at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:261)
    at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:189)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:107)
[2019-01-29 13:16:48,886] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:65)
[2019-01-29 13:16:48,886] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:223)
[2019-01-29 13:16:48,894] INFO Stopped http_8083@dc4fee1{HTTP/1.1,[http/1.1]}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:341)
[2019-01-29 13:16:48,895] INFO node0 Stopped scavenging (org.eclipse.jetty.server.session:167)
[2019-01-29 13:16:48,930] INFO Stopped o.e.j.s.ServletContextHandler@3c46dcbe{/,null,UNAVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:1040)
[2019-01-29 13:16:48,943] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:241)
[2019-01-29 13:16:48,943] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:95)
[2019-01-29 13:16:48,944] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:184)
[2019-01-29 13:16:48,944] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:66)
[2019-01-29 13:16:48,947] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:205)
[2019-01-29 13:16:48,950] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:112)
[2019-01-29 13:16:48,951] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:70)

The Postgres JAR file is already in the folder. Can someone advise?

These lines are the most important ones in your log:

java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are:...

It seems that you didn't install the kafka-connect-jdbc connector.

Check the plugin.path property in etc/schema-registry/connect-avro-standalone.properties and ensure that the plugin.path line is uncommented.
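For example, the relevant line in that file might look like this (the directory shown is the Confluent default; adjust it to wherever your plugins actually live):

```properties
# plugin.path must be uncommented and point at the directory
# whose subdirectories contain the connector plugins
plugin.path=share/java
```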

If you are not using Confluent Platform, you will need to create, under that plugin.path directory, another directory for the JDBC plugin, e.g. kafka-connect-jdbc, and put all the needed JARs there: e.g. kafka-connect-jdbc-5.1.0.jar, its dependencies, and your JDBC drivers.
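The layout described above can be sketched with a few shell commands (the directory and JAR names below are examples, not your actual paths):

```shell
# Hypothetical plugin directory; plugin.path would point at /tmp/connect-plugins
PLUGIN_DIR=/tmp/connect-plugins/kafka-connect-jdbc

# Each plugin gets its own subdirectory under the plugin.path directory
mkdir -p "$PLUGIN_DIR"

# Copy the connector JAR, its dependencies, and the JDBC driver there, e.g.:
# cp kafka-connect-jdbc-5.1.0.jar "$PLUGIN_DIR/"
# cp postgresql-42.2.5.jar "$PLUGIN_DIR/"

ls -d "$PLUGIN_DIR"
```

After restarting the standalone worker, the JdbcSinkConnector class should appear in the "available connectors" list instead of triggering the ConnectException above.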

More details can be found at: https://docs.confluent.io/current/connect/userguide.html#installing-plugins
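As a separate issue in the posted configuration: the URL scheme jdbc:postgres:// is not one the PostgreSQL JDBC driver recognizes; it expects jdbc:postgresql://. A corrected sketch of that line, with host, port, and database taken from the question:

```properties
connection.url=jdbc:postgresql://localhost:5432/postgres
```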
