
Microservices sequential data processing

Suppose I am receiving, over time, a stream of sequential data that arrives out of order.

For example, the input could be:

[
    {id:1, timestamp:1},
    {id:2, timestamp:1},
    {id:2, timestamp:2},
    {id:1, timestamp:2},
    {id:3, timestamp:1}
] 

Each entity is identified by the 'id' field. There could be a large number of entities, and processing each input could take some time. The problem is that I need to process the events for each entity in the order they were received.

I was considering some solutions: should I put the messages into a Kafka topic with partitions and consume them in parallel? Or should I keep local storage of the received messages and mark each message for an entity as processed after successful processing (on another machine, or on the same one using a thread pool)? A sketch of the Kafka option follows below.
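For the Kafka option, a minimal sketch of a producer, assuming a topic named events and a broker at localhost:9092 (both hypothetical). Using the entity id as the message key means Kafka's default partitioner hashes the key, so all events of one entity land in the same partition and keep their send order:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class EventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key = entity id: every event of one entity goes to the same
            // partition, so per-entity order is preserved inside that partition.
            producer.send(new ProducerRecord<>("events", "1", "{id:1, timestamp:1}"));
            producer.send(new ProducerRecord<>("events", "2", "{id:2, timestamp:1}"));
            producer.send(new ProducerRecord<>("events", "2", "{id:2, timestamp:2}"));
            producer.send(new ProducerRecord<>("events", "1", "{id:1, timestamp:2}"));
            producer.send(new ProducerRecord<>("events", "3", "{id:3, timestamp:1}"));
        }
    }
}

Because Kafka assigns each partition to at most one consumer within a consumer group, you can scale by adding consumer instances up to the partition count without losing per-entity ordering.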

Questions: Is this a good solution? How can I keep this behavior while scaling the data consumers (with a fixed number of services, or by creating new instances)? Is there perhaps a better way to solve this kind of problem?

"IF" the sequential data you mention just divided by id, 1 2 and 3, Then Would be the best you make 3 background services as an consumer, just need 1 partition for the case (you can decided this on your own) “ IF”您提到的顺序数据仅由id,1 2和3划分,那么作为消费者,您将最好地制作3个后台服务,只需要1个分区(您可以自行决定)

Then create three topics based on the data, e.g. TOPIC 1, TOPIC 2, TOPIC 3.

This means you need three kinds of consumers, each listening to only one topic.

Then you spawn a new process/thread for each new piece of stream data, and they run in parallel. A sketch of one such consumer follows below.
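A minimal sketch of one such consumer, assuming the topic name TOPIC-1, group id topic1-consumers, and broker localhost:9092 (all hypothetical). With a single partition, records arrive in the order they were produced:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class Topic1Consumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", "topic1-consumers");        // hypothetical group name
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // One consumer per topic, as described above.
            consumer.subscribe(Collections.singletonList("TOPIC-1"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // With a single partition, records come back in produced order.
                    process(record.value());
                }
            }
        }
    }

    private static void process(String event) {
        System.out.println("processing " + event);
    }
}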
