
If I ensure two threads never run in parallel do I still have to make my list variable volatile?

Imagine I have this code where, inside a Windows Forms timer callback, I can spawn some threads - but I ensure that ONLY one thread is running at a time using the following approach (as indicated by one of the answers here, by Matt Johnson):

nb: let's assume for now that this _executing approach works, and that I don't use a BackgroundWorker, etc.

private volatile bool _executing;

private void TimerElapsed(object state)
{
    if (_executing)
        return;

    _executing = true;

    if (smth)
    {
        Thread myThread = new Thread(MainThread1);
        myThread.IsBackground = true;
        myThread.Start();
    }
    else
    {
        Thread myThread = new Thread(MainThread2);
        myThread.IsBackground = true;
        myThread.Start();
    }
}


public void MainThread1()
{
    try
    {
        methodWhichAddelementTomyList(); // e.g., calls list.Add() internally
    }
    finally
    {
        _executing = false;
    }
}

public void MainThread2()
{
    try
    {
        methodWhichAddelementTomyList(); // e.g., calls list.Add() internally
    }
    finally
    {
        _executing = false;
    }
}

Now I also have a List instance variable which, as you can see, I access from both MainThread1 and MainThread2. Since my logic above ensures that MainThread1 and MainThread2 never run in parallel, do I still have to make the list volatile? Can I encounter issues related to caching of the list variable?

EDIT: Also, does this approach actually protect me from running those threads in parallel? (The answer in the linked question is slightly different - it runs the work inside the timer callback itself - so I want to double-check.)

EDIT 2: Honestly, there is no consensus below on whether I should apply the volatile keyword to my list field or not. This state of affairs confuses me, so a documented answer is still welcome; otherwise this is not fully answered.

I'll restate your question:

If I ensure two threads never run in parallel do I still have to make my list variable volatile ?

You don't have two threads, you have three: one thread that launches the two others. That one always runs in parallel with either of the other threads, and it uses a shared flag to communicate with them. Given that, and the code you posted, it is not required to mark the list as volatile.


But in the hypothetical case of two threads, and two threads only, that somehow execute one after the other without interference from a third (i.e., with no reads from a shared variable), making the list volatile would be enough to guarantee that the two threads always see the same data.

For two threads that do not run concurrently to see the list in a consistent state (in other words, up-to-date ), they always have to work on the latest version of what resides in memory. This means that when a thread starts using the list, it has to read from the list after the previous writes have settled.

This implies memory barriers. A thread needs an acquire barrier before using the list, and a release barrier after being done with it. With Thread.MemoryBarrier, you can't control the semantics of barriers that finely; you always get full barriers (release and acquire, which is stronger than what we need), but the end result is the same.

So, if you can guarantee that the threads never run in parallel, the C# memory model can guarantee that the following works as expected:

private List<int> _list;

public void Process() {
    try {
        Thread.MemoryBarrier(); // Release + acquire. We only need the acquire.
        _list.Add(42);
    } finally {
        Thread.MemoryBarrier(); // Release + acquire. We only need the release.
    }
}

Notice how the list is not volatile, because it's not needed: what's needed is the barriers.

Now the thing is, the ECMA C# Language Specification says (emphasis mine):

17.4.3 Volatile fields

  • A read of a volatile field is called a volatile read. A volatile read has "acquire semantics"; that is, it is guaranteed to occur prior to any references to memory that occur after it in the instruction sequence.

  • A write of a volatile field is called a volatile write. A volatile write has "release semantics"; that is, it is guaranteed to happen after any memory references prior to the write instruction in the instruction sequence.

(Thanks to R. Martinho Fernandes for finding the relevant paragraph in the standard!)

In other words, reading from a volatile field has the same semantics as an acquire barrier, and writing to a volatile field has the same semantics as a release barrier. This means that, given your premise, the following code behaves identically(1) to the previous one:

private volatile List<int> _list;

public void Process() {
    try {
        // This is an acquire, because we're *reading* from a volatile field.
        _list.Add(42);
    } finally {
        // This is a release, because we're *writing* to a volatile field.
        _list = _list;
    }
}

And that's enough to guarantee that as long as both threads do not run in parallel, they will always see the list in a consistent state.

(1): The two examples are not strictly identical; the first offers stronger guarantees, but those stronger guarantees are not required in this specific case.

Making the reference to the list volatile does nothing to the list itself; it only affects the guarantees you get when reading and assigning that variable.

You can't apply volatile somewhere and expect it to magically make a non-thread-safe data structure thread-safe. If it were that easy, threading would be easy: just mark everything volatile. It doesn't work that way.

It appears from the code and description given that you are accessing the list from only one thread at a time. That does not require synchronization. Note that even reading the list from a second thread would be unsafe: if there is at least one writer, there cannot be any other concurrent access - not even reads.

Here's a simpler approach:

Task.Run(() => Process(smth));

 ...

public void Process(bool smth)
{
    try
    {
        if (smth) methodWhichAddelementTomyList();
        else otherThing();
    }
    finally
    {
        _executing = false;
    }
}

No more "two threads". That's a confusing concept.

There seem to be 2 issues here:

  1. If you are using two threads but they never run concurrently, then why have two threads at all? Just serialize your methods appropriately, i.e., stick to one thread.

  2. However, if two threads are some sort of requirement (e.g., allowing one thread to continue processing / remain unblocked while another performs some other task): even though you have coded this to ensure no two threads can access the list at the same time, to be on the safe side I would add a locking construct, since a List is not thread-safe. To me, that's the most straightforward option.

You can use a thread-safe collection for this instead, such as one of the collections in System.Collections.Concurrent. Otherwise, you'll need to synchronize all access to the List (i.e., put every Add call within a lock).
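To illustrate (this is a minimal sketch, not the asker's actual code, and ConcurrentBag is just one of several choices in System.Collections.Concurrent), a thread-safe collection tolerates concurrent Add calls without any external lock or volatile:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Demo
{
    // ConcurrentBag<T>.Add is documented as thread-safe,
    // so no lock is needed around it.
    private static readonly ConcurrentBag<int> _bag = new ConcurrentBag<int>();

    static void Main()
    {
        // Two tasks adding concurrently - no additions are lost.
        Task t1 = Task.Run(() => { for (int i = 0; i < 1000; i++) _bag.Add(i); });
        Task t2 = Task.Run(() => { for (int i = 0; i < 1000; i++) _bag.Add(i); });
        Task.WaitAll(t1, t2);

        Console.WriteLine(_bag.Count); // prints 2000
    }
}
```

With a plain List<int> in place of the bag, the same two tasks could corrupt the list's internal state or drop elements.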

I personally avoid using volatile. Albahari has a good explanation of a common misconception about it: that "the volatile keyword ensures that the most up-to-date value is present in the field at all times. This is incorrect, since as we've seen, a write followed by a read can be reordered."

Volatile just ensures the two threads see the same data at the same time. It does nothing to stop them from interleaving their read and write operations.

E.g., declare a synchronisation object:

private static Object _objectLock = new Object();

and use it like so in your methodWhichAddelementTomyList method (and anywhere else your list is accessed) to ensure serial access to the resource from different threads:

lock (_objectLock)
{
    list.Add(item);
}
