
What is the best way to serialize tasks across threads in the C# .NET Compact Framework?

In my program I am receiving some data and I have to save it to a file (one or many). The order of the data is very important, so it must be first come, first saved.

At the end I get a signal that no more data is available; at that point I have to close all my open files. How can I handle this? That is, how can I make sure that all the threads are done with their work, so that I can close the files?

I am using a ManualResetEvent to control the order of the data, so each thread waits for the previous one to finish its work.

Below is my code sample. I need guidance on how to do this kind of work efficiently, and on how to know that all the threads are done with their work.

using System;
using System.IO;
using System.Threading;

class Program
{
    static StreamWriter _fileStream;

    static void Main(string[] args)
    {
        _fileStream = File.CreateText(@"D:\HelloThread.txt");
        ManualResetEvent currentEvent = new ManualResetEvent(true);   // first item may start at once
        ManualResetEvent nextEvent = new ManualResetEvent(false);
        int length = 60;
        Data data = null;
        Console.WriteLine("Writing started...");
        for (int i = 0; i < length; i++)
        {
            // Each work item waits on the previous item's event and signals its
            // own when done, so items are written first come, first saved.
            data = new Data { CurrentEvent = currentEvent, Number = i, NextEvent = nextEvent };
            ThreadPool.QueueUserWorkItem(PrintMsg, data);
            currentEvent = nextEvent;
            nextEvent = new ManualResetEvent(false);
        }

        Console.ReadLine();
    }

    private static void CloseAll()
    {
        // Never called in this sample -- that is the open question:
        // when is it safe to call this and close the file?
        Console.WriteLine("Requested to close all...");
        _fileStream.Close();
        Console.WriteLine("Done with the writing...");
    }

    private static object _lockObj = new object();   // declared but currently unused

    private static void PrintMsg(object state)
    {
        Data data = state as Data;

        data.CurrentEvent.WaitOne();   // wait for the previous item to finish

        string msg = "Hello times...";
        for (int j = 0; j < 5; j++)
        {
            _fileStream.WriteLine(msg + data.Number);
            // Console.WriteLine(msg + data.Number);
        }

        data.NextEvent.Set();          // let the next item proceed
    }
}

public class Data
{
    public ManualResetEvent CurrentEvent { get; set; }
    public ManualResetEvent NextEvent { get; set; }
    public int Number { get; set; }
}
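One note on the chain above: after the loop, currentEvent refers to the event that the final work item will set, so waiting on it tells you that every item has been written. A minimal sketch of using that to close the file (this call site is an addition for illustration, not part of the original sample):

    // At the end of Main, instead of Console.ReadLine():
    currentEvent.WaitOne();   // blocks until the last PrintMsg has called Set()
    CloseAll();               // everything is written; safe to close the file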

It sounds like you are describing an application pipeline, where there are multiple threads, each working on a separate stage of a work item's life cycle. For example, one thread might do input, one processing, and one output.

Typically, you handle this by creating multiple queues. Say you have those three threads. The input thread reads a record and places it into the input queue. The processing thread reads the input queue, processes an item, and places the result in the output queue. The output thread then reads the output queue and writes the data where it needs to go.

This ensures that work items are processed and written in the proper order, but allows all threads to be working concurrently.

Using BlockingCollection, you can have your threads do non-busy waits on the queues. Also, when the input thread is finished reading, it can call CompleteAdding on the queue to signal that there are no more work items. When the processing thread reads the queue, it can check the IsCompleted property to determine whether all items are done, and thus exit. The same goes for the output thread when reading the output queue.
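Here is a minimal sketch of that three-stage pipeline using BlockingCollection (this assumes the full .NET Framework 4 or later; the record contents and the per-stage logic are placeholders):

using System;
using System.Collections.Concurrent;
using System.Threading;

class PipelineSketch
{
    static void Main()
    {
        var inputQueue = new BlockingCollection<string>();
        var outputQueue = new BlockingCollection<string>();

        var inputThread = new Thread(() =>
        {
            for (int i = 0; i < 10; i++)
                inputQueue.Add("record " + i);   // stand-in for real input
            inputQueue.CompleteAdding();         // signal: no more work items
        });

        var processThread = new Thread(() =>
        {
            // GetConsumingEnumerable blocks (without spinning) until an item
            // arrives, and ends once the queue is completed and drained.
            foreach (var item in inputQueue.GetConsumingEnumerable())
                outputQueue.Add(item.ToUpper()); // stand-in for real processing
            outputQueue.CompleteAdding();
        });

        var outputThread = new Thread(() =>
        {
            foreach (var item in outputQueue.GetConsumingEnumerable())
                Console.WriteLine(item);         // stand-in for writing to file
        });

        inputThread.Start();
        processThread.Start();
        outputThread.Start();

        // When the output thread exits, every item has been written --
        // this is the point where it is safe to close the files.
        outputThread.Join();
    }
}

GetConsumingEnumerable is a convenience that wraps the Take/IsCompleted loop described above, and because each queue is FIFO, items flow through the pipeline in order, which preserves the first-come, first-saved requirement.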

See http://www.informit.com/guides/content.aspx?g=dotnet&seqNum=821 for some simple examples of using BlockingCollection.

If you can't use BlockingCollection (it requires .NET 4 and isn't available on the Compact Framework), then you'll have to wrap a concurrency layer around Queue<T>.
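A rough sketch of such a wrapper, using Monitor.Wait and Monitor.Pulse for the non-busy waits (BlockingQueue, TryTake, and CompleteAdding are names chosen here to mirror BlockingCollection; treat this as an illustration, not a tested Compact Framework implementation):

using System.Collections.Generic;
using System.Threading;

// A minimal blocking queue for platforms without BlockingCollection.
public class BlockingQueue<T>
{
    private readonly Queue<T> _queue = new Queue<T>();
    private bool _completed;

    public void Add(T item)
    {
        lock (_queue)
        {
            _queue.Enqueue(item);
            Monitor.Pulse(_queue);      // wake one waiting consumer
        }
    }

    public void CompleteAdding()
    {
        lock (_queue)
        {
            _completed = true;
            Monitor.PulseAll(_queue);   // wake all consumers so they can exit
        }
    }

    // Blocks until an item is available; returns false once the queue is
    // drained and the producer has called CompleteAdding.
    public bool TryTake(out T item)
    {
        lock (_queue)
        {
            while (_queue.Count == 0 && !_completed)
                Monitor.Wait(_queue);   // non-busy wait for an item or completion

            if (_queue.Count > 0)
            {
                item = _queue.Dequeue();
                return true;
            }
            item = default(T);
            return false;
        }
    }
}

A consumer thread then loops with "T item; while (queue.TryTake(out item)) { ... }" and exits naturally once the producer completes the queue and the remaining items drain.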
