
Enforcing Task Order using the .NET 4.0 Task Parallel Libraries

I have a program that has a ton of sensors producing data at a fairly high rate, and consumers that need to consume it. The consumers consume at very different rates.

Since I am using IObserver/IObservable, the trivial solution was to simply create a Task for each event and wrap the OnNext() call and the data in a lambda. This worked very well; I was surprised how little overhead there was over raw calls.

The problem is that some of these consumers need the order of events strictly enforced, and cannot miss any events. "PreferFairness" isn't good enough.

The best solution I have come up with is, instead of wrapping the event/OnNext() pair, to wrap an insert into a ConcurrentQueue<T>, one queue per consumer, and have a thread on the other end of the queue make the OnNext() calls.

There are three immediate problems with this approach. It's much, much slower than the Task/OnNext() wrapping solution. There is no blocking dequeue (or is there?) for ConcurrentQueue<T>, so the implementation is a bit tricky. The third is that this seems like such a common problem that I can't imagine there isn't some way to enforce order that I missed; maybe something like multiple task factories sharing an underlying pool, each factory with some setting that makes it strictly enforce order.
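One way to approximate the "task factory that strictly enforces order" idea is to chain each piece of work onto the previous Task with ContinueWith: each continuation starts only after its predecessor finishes, so work within one chain runs in submission order while separate chains still share the thread pool. This is an illustrative sketch, not from the original post; the SerialTaskChain name and its Enqueue API are made up for the example:

```csharp
using System;
using System.Threading.Tasks;

// Sketch: one SerialTaskChain per consumer serializes that consumer's
// OnNext() calls without dedicating a thread to it.
class SerialTaskChain
{
    private readonly object _gate = new object();
    private Task _tail;

    public SerialTaskChain()
    {
        // Start the chain from an already-completed task
        // (Task.FromResult doesn't exist in .NET 4.0).
        var done = new TaskCompletionSource<object>();
        done.SetResult(null);
        _tail = done.Task;
    }

    public Task Enqueue(Action action)
    {
        lock (_gate)
        {
            // Each continuation runs only after the previous one completes,
            // so actions execute strictly in the order they were enqueued.
            _tail = _tail.ContinueWith(_ => action());
            return _tail;
        }
    }
}
```

Producers would then call something like `chain.Enqueue(() => consumer.OnNext(value));` instead of spawning a free-standing Task per event.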

Does anyone know the proper way to achieve what I'm trying to do?

EDIT: Any solution that involves a thread per consumer or producer doesn't work. Producers/consumers form long chains; there are hundreds of each.

The TPL Dataflow library might be a good fit for your application. It extends TPL with a dataflow paradigm that lets you configure a processing graph and run it in a high-performance execution environment.

TPL Dataflow is available as a library on top of .NET 4 and should ship as part of .NET 4.5.
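To make the suggestion concrete, here is a minimal sketch (assuming the System.Threading.Tasks.Dataflow package is referenced). An ActionBlock<T> is essentially the per-consumer ordered queue the question describes: Post() doesn't tie up the producer, and by default the block processes messages one at a time in the order they were posted:

```csharp
using System;
using System.Threading.Tasks.Dataflow;

class Program
{
    static void Main()
    {
        // One ActionBlock per consumer; each preserves FIFO order.
        var consumer = new ActionBlock<int>(value =>
        {
            Console.WriteLine("consumed {0}", value); // stands in for OnNext(value)
        });

        for (int i = 0; i < 10; i++)
            consumer.Post(i);          // producers just post and move on

        consumer.Complete();           // signal no more input
        consumer.Completion.Wait();    // wait for the queue to drain
    }
}
```

Because each block uses the shared thread pool rather than a dedicated thread, hundreds of chained producer/consumer blocks are feasible, which matches the constraint in the question's edit.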

No comment on the best abstraction for enforcing partial ordering, but if you use a BlockingCollection<T> wrapper around a ConcurrentQueue<T>, that will give you a blocking Take operation to dequeue elements. For example:

// The default backing store is ConcurrentQueue<T>, so you don't have to
// specify it, but if you wanted different behavior you could use
// e.g. ConcurrentStack<int>.
var coll = new BlockingCollection<int>(new ConcurrentQueue<int>());

coll.Add(5);            // blocks if the collection is at max capacity

int five = coll.Take(); // blocks if the collection is empty
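Building on that, the dequeue side the question found tricky can be written with GetConsumingEnumerable(), which blocks until items arrive and exits cleanly once CompleteAdding() is called. A sketch of one pump draining a per-consumer queue in strict FIFO order:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        var coll = new BlockingCollection<int>(new ConcurrentQueue<int>());

        // The pump replays events to the consumer in exactly the order
        // they were added; no events are skipped.
        var pump = Task.Factory.StartNew(() =>
        {
            foreach (var item in coll.GetConsumingEnumerable())
                Console.WriteLine("consumed {0}", item); // stands in for OnNext(item)
        });

        for (int i = 0; i < 5; i++)
            coll.Add(i);

        coll.CompleteAdding(); // lets the foreach loop terminate
        pump.Wait();
    }
}
```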
