
Facing issue while configuring Confluent Kafka with Azure Databricks

I am new to Azure Databricks and this forum. I am working through an exercise that streams data from Confluent Kafka on Azure Databricks, and I am able to stream the content with Spark. However, there is a slight problem: I am exposing the username and password directly in the program. I would rather pass them through a variable (or Azure Key Vault). I have tried passing the username and password using variables, but this approach is not working; I get an error saying "unable to create Kafka consumer". Can you please let me know how I can proceed? I am using Scala for this.

val streamingInputDF = spark
.readStream
.format("kafka")
.option("kafka.bootstrap.servers", host)
.option("kafka.security.protocol", "SASL_SSL")
.option("kafka.sasl.jaas.config", "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required username=\"AAAAAA\" password=\BBBBB\";")
.option("kafka.ssl.endpoint.identification.algorithm", "https")
.option("kafka.sasl.mechanism", "PLAIN")
.option("startingOffsets", "earliest")
.option("failOnDataLoss", "false")
.option("subscribe", "Test")
.load()

This is how I want to pass the credentials, but it is not working:

val streamingInputDF = spark
.readStream
.format("kafka")
.option("kafka.bootstrap.servers", host)
.option("kafka.security.protocol", "SASL_SSL")
.option("kafka.sasl.jaas.config", f"kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required username=$username password=$password";")
.option("kafka.ssl.endpoint.identification.algorithm", "https")
.option("kafka.sasl.mechanism", "PLAIN")
.option("startingOffsets", "earliest")
.option("failOnDataLoss", "false")
.option("subscribe", "streaming_test_6")
.load()
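
For reference on the Key Vault route mentioned in the question: in a Databricks notebook the credentials would normally be pulled from a secret scope rather than typed into a variable. A minimal sketch, assuming a Key Vault-backed secret scope named "confluent" with keys "api-key" and "api-secret" (all three names are hypothetical), could look like this:

// Sketch: fetch the Confluent credentials from a Databricks secret scope
// (backed by Azure Key Vault); scope and key names are placeholders.
val username = dbutils.secrets.get("confluent", "api-key")
val password = dbutils.secrets.get("confluent", "api-secret")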

You need to check the following points:

  1. Make sure you have the kafka-clients jar on your classpath.

  2. Make sure you have compatible versions of Kafka and Spark installed (a quick check is sketched below).
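
As a quick sanity check for both points, something along these lines can be run in the notebook. This is only a sketch: the shaded package prefix "kafkashaded" matches the JAAS config used above, but it may differ on other runtimes, in which case the unshaded org.apache.kafka package name would apply.

// Sketch: print the Spark version and confirm the (shaded) Kafka consumer
// class is on the classpath; Class.forName throws if the jar is missing.
println(s"Spark version: ${spark.version}")
println(Class.forName("kafkashaded.org.apache.kafka.clients.consumer.KafkaConsumer").getName)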

You can refer to the article by Angela Chu, Gianluca Natali and Caio Moreno, where a detailed description is given.

Scala Notebook link

Thank you to all who tried to provide suggestions. I was able to resolve the issue. Below is the line where changes are required:

.option("kafka.sasl.jaas.config",
   "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required username=\"" + confluentApiKey + "\"password=\"" + confluentSecret + "\";")
