
Kafka Connect Elasticsearch sink: Could not connect to Elasticsearch. General SSLEngine problem

I'm trying to deploy the Confluent Kafka Connect Elasticsearch sink connector. My Elastic stack is deployed on Kubernetes with HTTP encryption and authentication enabled, and I'm port-forwarding Elasticsearch from Kubernetes to localhost.
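
The port-forward is set up along these lines (the service name here is just a placeholder for my actual Elasticsearch service):

    # Forward the Elasticsearch service's port 9200 to localhost:9200
    # "elasticsearch-master" is a placeholder service name
    kubectl port-forward service/elasticsearch-master 9200:9200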


    Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 3 error(s):
    Could not connect to Elasticsearch. Error message: General SSLEngine problem
    Could not authenticate the user. Check the 'connection.username' and 'connection.password'. Error message: General SSLEngine problem
    Could not authenticate the user. Check the 'connection.username' and 'connection.password'. Error message: General SSLEngine problem

I'm sure that the username and password are correct. My connector properties file looks like this:


    name=elasticsearch-sink
    connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
    tasks.max=1
    topics=pwp-alerts
    key.ignore=true
    connection.url=https://localhost:9200
    type.name=kafka-connect
    errors.tolerance = all
    behavior.on.malformed.documents=warn
    schema.ignore = true
    connection.username ="elastic"
    connection.password ="my_password"

Does anyone know what can cause the problem?

I guess the failure is caused by an unsuccessful connection to your Elasticsearch instance. That can happen for many reasons, for example a wrong port, or a listener-type mismatch (an advertised listener instead of a simple consumer). As a workaround, I recommend using Logstash with a Kafka input in your Logstash configuration: you can easily set the Kafka consumer group, bootstrap servers, and many other properties in the input section, and your Elasticsearch index, port, and authorization in the output section.
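
As a first sanity check, you can verify that the forwarded endpoint is reachable and that the credentials work, for example with curl (host, port, user, and password are taken from your question; -k skips certificate verification, which helps with the self-signed certificates Kubernetes Elastic deployments often use):

    # -k: skip TLS certificate verification (self-signed cluster certs)
    # -u: the credentials from the connector config
    curl -k -u elastic:my_password https://localhost:9200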

A Logstash configuration file with a Kafka input may look like the following:

    input {
      kafka {
        group_id          => "Your consumer group id"
        topics            => ["Your topic name"]
        bootstrap_servers => "Your Kafka broker host:port, default port is 9092"
        codec             => json
      }
    }
    filter {
    }
    output {
      file {
        path => "Some path"
      }
      elasticsearch {
        hosts         => ["localhost:9200"]
        document_type => "_doc"
        index         => "Your index name"
        user          => "username"
        password      => "password"
      }
      stdout {
        codec => rubydebug
      }
    }
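
Since your cluster uses HTTPS, you will likely also need to enable TLS in the elasticsearch output. A minimal sketch, assuming the cluster's CA certificate is available on the Logstash host (the path below is a placeholder):

    elasticsearch {
      hosts    => ["https://localhost:9200"]
      ssl      => true
      # path to the cluster's CA certificate (placeholder path)
      cacert   => "/path/to/ca.crt"
      index    => "Your index name"
      user     => "username"
      password => "password"
    }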

You can remove the file block from the output section if you don't want to additionally store the data on disk beside your Logstash pipeline.

You can find out more about the Logstash Kafka input properties in the Logstash documentation for the Kafka input plugin.
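
Once the pipeline file is in place, you can start Logstash with it; here kafka-to-es.conf is just a placeholder name for the file above:

    # -f points Logstash at the pipeline configuration file
    bin/logstash -f kafka-to-es.conf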
