
Connection of Event Hubs to Azure Databricks

I want to add libraries in Azure Databricks for connecting to Event Hubs. I will be writing notebooks in Python. So which library should I add to connect to Event Hubs?

From my search so far, I have found a Spark connector library given as Maven coordinates. But I don't think I will be able to import it in Python.

Structured streaming integration for Azure Event Hubs ultimately runs on the JVM, so you'll need to import the library from the Maven coordinates below:

 groupId = com.microsoft.azure
  artifactId = azure-eventhubs-spark_2.11
  version = 2.3.10

Note: For Python applications, you need to add the above library and its dependencies when deploying your application.
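
For illustration, below is a minimal sketch of what a streaming read from Event Hubs could look like in a Python notebook once the library above is attached to the cluster. The connection-string values are placeholders, and the configuration key follows the PySpark integration guide linked below; this is not code from the original answer.

    # Placeholder connection string: fill in your namespace, key, and Event Hub name.
    connection_string = (
        "Endpoint=sb://<namespace>.servicebus.windows.net/;"
        "SharedAccessKeyName=<key-name>;SharedAccessKey=<key>;"
        "EntityPath=<event-hub-name>"
    )

    ehConf = {
        # Connector 2.3.10 accepts the plain connection string; newer releases of the
        # connector expect it to be encrypted first via
        # sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(connection_string)
        "eventhubs.connectionString": connection_string
    }

    # 'spark' is the SparkSession that Databricks notebooks provide by default.
    df = (spark.readStream
              .format("eventhubs")
              .options(**ehConf)
              .load())

    # The event payload arrives as binary in the 'body' column; cast it to a string.
    messages = df.selectExpr("CAST(body AS STRING) AS body")

    # Write the decoded messages to the console sink for a quick check.
    query = (messages.writeStream
                 .outputMode("append")
                 .format("console")
                 .start())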

For more details, refer to the "Structured Streaming + Event Hubs Integration Guide for PySpark" and "Attach libraries to a Spark cluster".

You may also refer to this SO thread, which addresses a similar issue.

Hope this helps.
