
Using Python Qpid/Proton/Messenger(), how do I filter messages from Azure Event Hubs?

This gist shows how to use Messenger() to receive messages from an Azure Event Hub. It works great.

https://gist.github.com/tomconte/e2a4667185a9bf674f59
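For reference, the receive loop in that gist boils down to roughly the following (the namespace, policy, key and hub names below are placeholders, not values from the gist):

```python
from proton import Messenger, Message

# Placeholder Event Hub address -- substitute your own namespace, hub,
# policy name and key. The path targets one partition of the default
# consumer group.
address = ("amqps://<policy-name>:<policy-key>@"
           "<namespace>.servicebus.windows.net/"
           "<eventhub>/ConsumerGroups/$Default/Partitions/0")

messenger = Messenger()
messenger.start()
messenger.subscribe(address)

while True:
    messenger.recv(1)              # block until at least one message arrives
    while messenger.incoming:      # drain everything that was fetched
        msg = Message()
        messenger.get(msg)         # pop the next message into msg
        print(msg.body)
```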

However, using this technique yields all messages in the Event Hub. I would like to read only messages from a given offset or timestamp onward (I don't care which). I can see in the Qpid docs how to set such a filter, but not when using Messenger().

Here's the relevant section in the Qpid docs: https://qpid.apache.org/releases/qpid-proton-0.16.0/proton/python/api/index.html

And a sample that shows how to use it: qpid.apache.org/releases/qpid-proton-0.16.0/proton/python/examples/selected_recv.py.html
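That example uses the reactor-style API rather than Messenger(); adapted to an Event Hub partition it would look something like the sketch below. The `amqp.annotation.x-opt-offset` selector string is my assumption about how the offset annotation would be addressed, and whether Event Hubs honours it is exactly what I'm asking:

```python
from proton.handlers import MessagingHandler
from proton.reactor import Container, Selector


class OffsetRecv(MessagingHandler):
    """Receive from one Event Hub partition, starting after a given offset."""

    def __init__(self, url, offset):
        super(OffsetRecv, self).__init__()
        self.url = url
        self.offset = offset

    def on_start(self, event):
        # Attach a selector filter to the receiving link, as in selected_recv.py.
        selector = Selector(u"amqp.annotation.x-opt-offset > '%s'" % self.offset)
        event.container.create_receiver(self.url, options=selector)

    def on_message(self, event):
        print(event.message.body)


# Placeholder partition address, same shape as in the Messenger example above.
url = ("amqps://<policy-name>:<policy-key>@<namespace>.servicebus.windows.net/"
       "<eventhub>/ConsumerGroups/$Default/Partitions/0")
Container(OffsetRecv(url, "12345")).run()
```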

Question: is this possible, and if so, how?

Based on my understanding, you want to read event data starting from a given offset or timestamp on Event Hub. I reviewed all the classes and methods of the Event Hub SDKs for C#/Java, and none of them support consuming event data from Event Hub partitions this way. Apache Qpid is a library that supports the AMQP protocol for Java/C/Python, and Event Hubs supports AMQP, but that does not mean Event Hubs supports every feature Qpid exposes.

There are two workarounds for you.

  1. Receive all messages from the Event Hub and filter out the ones you don't need on the client side (see the sketch after this list).
  2. Use Azure Stream Analytics to create a pipeline that outputs the messages to another store, such as Table Storage or DocumentDB; you can then query that store by whatever offset/timestamp you need.
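Here is a minimal sketch of option 1, assuming the events carry the x-opt-enqueued-time message annotation (milliseconds since the epoch) that Event Hubs normally stamps on each event; treat the exact key name and value type as assumptions and inspect msg.annotations against your own data first.

```python
from proton import Messenger, Message, symbol

CUTOFF_MS = 1496275200000  # example cut-off: 2017-06-01T00:00:00Z, in epoch milliseconds

def enqueued_time_ms(msg):
    # x-opt-enqueued-time is assumed here; check msg.annotations on real data.
    annotations = msg.annotations or {}
    return annotations.get(symbol('x-opt-enqueued-time'), 0)

messenger = Messenger()
messenger.start()
messenger.subscribe("amqps://...")   # same partition address as in the question

while True:
    messenger.recv(1)
    while messenger.incoming:
        msg = Message()
        messenger.get(msg)
        if enqueued_time_ms(msg) >= CUTOFF_MS:
            print(msg.body)          # keep only messages enqueued after the cut-off
```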
