Tag Archives: Concurrency

SpotTheDefect[0]

Let’s play a little game of “Spot the Defect”. For the following code, see if you can:

  • Figure out what it would output
  • Find the bug(s)
  • Correct the bugs

(And I’ll do a post next week with the answer)

using System;
using System.Threading;

namespace SpotTheDefect1
{
    class Program
    {
        private static int _x = 0;
        private static Foo _foo = new Foo();
        private static bool _disposed = false;

        static void Main(string[] args)
        {
            Thread thread1 = new Thread(Thread1);
            Thread thread2 = new Thread(Thread2);

            thread1.Start();
            thread2.Start();

            thread1.Join();
            thread2.Join();
        }

        private static void Thread1()
        {
            if (!_disposed)
            {
                _foo.Bar();
            }
        }

        private static void Thread2()
        {
            Console.WriteLine("Disposing");
            _disposed = true;
            _foo = null;
        }

        private class Foo
        {
            public void Bar()
            {
                Console.WriteLine("Hello, World!");
            }
        }
    }
}

Collectively Concurrent

“The concept of a stack in programming is very similar to a stack of plates in real life, except that you can cheat a little – for instance, if you’re willing to accept that plates can float, then you can ignore gravity.”

One of the fantastic things about using a rich framework like .Net is that many of the basic data structures that programmers require already exist in an efficient and easy-to-use form. We also make sure to update these pre-built data structures with the new features that we introduce in the language; for instance, in .Net 2.0 we introduced generics, and so the System.Collections.Generic namespace was created. With .Net 4.5 we are introducing new async APIs and the async\await keyword pair, meaning that programmers will now need to deal with concurrency and multithreading more often, especially if their application has any shared data structures. Luckily, we already have the appropriate APIs, which were introduced in 4.0: the System.Collections.Concurrent namespace.

Stack it. Queue it. Bag it.

The first couple of APIs I’d like to introduce you to are ConcurrentStack and ConcurrentQueue. These are exactly what they sound like: a stack and a queue that permit concurrent operations. One common trap when writing a multithreaded application is the pattern of checking the Count of a collection to see if an item is available before attempting to take one. This is fine when single-threaded, but in a multithreaded environment another thread can jump in between your check and your take, grabbing the last item before you can. Instead, the concurrent collections have the TryPop and TryDequeue methods, which atomically check the size of the structure and return an item to you if one is available.
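
As a rough sketch of that pattern, the following (purely illustrative) example has several threads drain a ConcurrentQueue via TryDequeue; there is no separate Count check for another thread to sneak in between:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class TryDequeueExample
{
    static void Main()
    {
        var queue = new ConcurrentQueue<int>();
        for (int i = 0; i < 1000; i++)
        {
            queue.Enqueue(i);
        }

        // Several consumers drain the queue concurrently. TryDequeue atomically
        // checks for an item and removes it, so there is no gap between the
        // "is there an item?" check and the "take it" step.
        Parallel.For(0, 4, _ =>
        {
            int item;
            while (queue.TryDequeue(out item))
            {
                // Process the item here.
            }
        });

        Console.WriteLine("Remaining items: " + queue.Count);
    }
}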

The other data structure I hinted at is the ConcurrentBag. Unlike the Stack or Queue, the ConcurrentBag makes no guarantees about the order of output versus input – it’s an “Any In, Any Out” collection. This allows ConcurrentBag to have a much more efficient implementation when there is contention, since the collection can return any object that it currently holds, rather than having to coordinate with the other threads accessing it in order to return objects in the correct order. One of the best uses of a ConcurrentBag is for a non-time-sensitive resource cache, like a buffer pool – where you want the best performance even under contention, but it is ok to not return the most recently used object (as you would want to do with a connection pool).
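
For instance, a minimal buffer-pool sketch along those lines might look like the following (the BufferPool class and its Rent\Return methods are just illustrative names):

using System.Collections.Concurrent;

// A simple byte-buffer pool backed by a ConcurrentBag. Which buffer you get
// back does not matter, so the bag's "Any In, Any Out" behavior is a good fit.
class BufferPool
{
    private readonly ConcurrentBag<byte[]> _buffers = new ConcurrentBag<byte[]>();
    private readonly int _bufferSize;

    public BufferPool(int bufferSize)
    {
        _bufferSize = bufferSize;
    }

    public byte[] Rent()
    {
        byte[] buffer;
        // Reuse a pooled buffer if one is available, otherwise allocate a new one.
        return _buffers.TryTake(out buffer) ? buffer : new byte[_bufferSize];
    }

    public void Return(byte[] buffer)
    {
        _buffers.Add(buffer);
    }
}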

Performance

One of the issues with writing multithreaded applications is measuring performance, especially when you have a shared resource. If you run a single-threaded test with the above data structures, you may notice that simply putting a standard Stack or Queue inside of a lock gives better performance than the Concurrent equivalent. However, introduce some contention (i.e. have multiple threads attempting to access the same object) and the Concurrent structures begin to shine. Additionally, you need to be careful when doing multithreaded micro-benchmarks, as you may introduce far more contention than a real application would ever see (since a “real” application is likely to do some work with the object it just obtained, rather than handing it straight back to the collection), which would then skew your results.
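
To make that last point concrete, a contended micro-benchmark sketch might deliberately simulate some per-item work between operations; the thread count, iteration count and SimulateWork helper below are purely illustrative:

using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

class ContentionBenchmark
{
    static void Main()
    {
        var queue = new ConcurrentQueue<int>();
        var stopwatch = Stopwatch.StartNew();

        // Four threads hammer the same queue. Doing a little work between
        // operations keeps the test closer to a real workload instead of
        // measuring nothing but contention on the collection itself.
        Parallel.For(0, 4, _ =>
        {
            for (int i = 0; i < 100000; i++)
            {
                queue.Enqueue(i);
                SimulateWork();

                int item;
                queue.TryDequeue(out item);
                SimulateWork();
            }
        });

        stopwatch.Stop();
        Console.WriteLine("Elapsed: " + stopwatch.ElapsedMilliseconds + "ms");
    }

    // Stand-in for the "real" work an application would do with each item.
    static void SimulateWork()
    {
        Thread.SpinWait(100);
    }
}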

However, unless you have a high-performance single threaded application or a multithreaded application with no shared resources, then a concurrent collection will be your best bet. It may be slower in an application with little load, but it will be much easier to scale it to a larger application if needed.


Thread safety

One of our focuses for .Net 4.5 was async and improving support for using ADO.NET asynchronously. A side effect of this is that we did a lot of work improving our thread-safety story. In .Net 4.0 and prior, we simply had a blanket statement that multithreaded access to any ADO.NET object was not supported, except for cancellation (i.e. SqlCommand.Cancel). Even then, there were some unusual corner cases with cancellation that resulted in unexpected behavior. With 4.5 we could have maintained this stance, but we realized that the use of async makes mistakes with accessing objects on multiple threads much more likely (e.g. when manually calling ContinueWith or if you forget the ‘await’ keyword).

Pending Operations

With the .Net 4.5 Developer Preview, if you try to call any operation on an ADO.NET object while it has an asynchronous operation pending (including the old Begin\End methods), then we throw an InvalidOperationException. While this isn’t the nicest of behaviors, it is much better than the 4.0 and below behavior (which was undefined, although it typically resulted in NullReferenceExceptions and data corruption). The reason that we opted for an exception instead of doing something ‘smarter’ (like waiting for the operation to complete) is that a secondary call to an object is typically a coding mistake (e.g. forgetting the ‘await’ keyword), and anyone who needs to do multiple operations should schedule the second operation via await or ContinueWith.
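
As an illustrative sketch (the query, table and connection string are placeholders), forgetting the ‘await’ keyword is exactly the kind of mistake that now surfaces as an InvalidOperationException rather than undefined behavior:

using System.Data.SqlClient;
using System.Threading.Tasks;

class PendingOperationExample
{
    static async Task RunAsync(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            await connection.OpenAsync();
            var command = new SqlCommand("SELECT COUNT(*) FROM Orders", connection);

            // Mistake: the 'await' was forgotten, so the query is still pending...
            Task<object> pending = command.ExecuteScalarAsync();

            // ...and starting a second operation on the same object would now
            // throw InvalidOperationException instead of corrupting state:
            // command.ExecuteScalar();

            // The correct pattern: await the first operation before starting another.
            object count = await pending;
        }
    }
}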

However, if there is a synchronous operation in progress, we still do not support starting another operation on that object. And, by ‘not supported’, I mean that we have no checks in place and no guarantees on the behavior. Unfortunately, there is no easy way to detect multi-threaded access to an object (even from within a debugger), so you need to make sure that your code is correct. The simplest way to do this is by never sharing an ADO.NET object between multiple threads. This means that if you have a shared ‘Data Access Layer’ in your application, you should be opening a new connection per call (and closing it afterwards) or, if you have something like a singleton logger, you may want to consider a Consumer\Producer pattern such that there is only one thread performing the logging.
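
A sketch of the ‘new connection per call’ approach might look like the following (the GetOrderCountAsync method, table name and connection string are just for illustration):

using System.Data.SqlClient;
using System.Threading.Tasks;

// Each call opens its own connection and disposes it when done, so no
// ADO.NET object is ever shared between threads.
class DataAccessLayer
{
    private readonly string _connectionString;

    public DataAccessLayer(string connectionString)
    {
        _connectionString = connectionString;
    }

    public async Task<int> GetOrderCountAsync()
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand("SELECT COUNT(*) FROM Orders", connection))
        {
            await connection.OpenAsync();
            return (int)await command.ExecuteScalarAsync();
        }
    }
}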

Cancellation

As I mentioned previously, cancellation is the only operation that we have always supported from another thread. In .Net 4.5 we have done a lot of work to ensure that cancellation is still supported, and we have also dealt with quite a few of the corner cases. For instance, any time there is a fatal error on a connection (e.g. the network has gone down), we close the current connection. While this may seem reasonable, it means that cancelling an operation could result in the connection being closed while another operation (i.e. the one being cancelled) was running. While we haven’t changed this behavior in .Net 4.5, we have made sure that any other operation can handle the connection being closed due to an error, even if it means throwing an InvalidOperationException.
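
To make the cross-thread support concrete, here is an illustrative sketch of cancelling a long-running command from another thread (the WAITFOR query, timings and connection string are placeholders):

using System;
using System.Data.SqlClient;
using System.Threading;

class CancellationExample
{
    static void Run(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            var command = new SqlCommand("WAITFOR DELAY '00:00:30'", connection);

            // Cancel is the one call that is safe to make from another thread
            // while an operation is in progress on the command.
            var canceller = new Timer(_ => command.Cancel(), null, 1000, Timeout.Infinite);

            try
            {
                command.ExecuteNonQuery();
            }
            catch (SqlException)
            {
                // A cancelled command surfaces as an exception on the executing thread.
                Console.WriteLine("Command was cancelled");
            }
            finally
            {
                canceller.Dispose();
            }
        }
    }
}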

Multi-threaded MARS

In SQL Server 2005, we introduced a feature called "Multiple Active Result Sets", or MARS, which allowed multiple commands to be executed on a single connection. In .Net 4.0 and prior this came with the caveat that you could not execute multiple commands or use multiple readers simultaneously, which greatly limited the usefulness of MARS. In .Net 4.5 we have done a lot of work to try to enable this scenario for SqlClient and, although we are not yet officially supporting it, it is something that we would like people to try out as part of their testing of the Developer Preview and async. As a side note, there is a performance overhead for enabling MARS, so it may be worth also investigating whether you can disable the feature instead.
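
If you do want to try it out, MARS is opted into via the connection string; a minimal sketch (with placeholder server and database names) might look like this:

using System.Data.SqlClient;
using System.Threading.Tasks;

class MarsExample
{
    static async Task RunAsync()
    {
        // MARS has to be enabled explicitly in the connection string.
        var builder = new SqlConnectionStringBuilder
        {
            DataSource = "myServer",
            InitialCatalog = "myDatabase",
            IntegratedSecurity = true,
            MultipleActiveResultSets = true
        };

        using (var connection = new SqlConnection(builder.ConnectionString))
        {
            await connection.OpenAsync();

            // With MARS enabled, two commands can have readers open on the
            // same connection at the same time.
            using (var reader1 = await new SqlCommand("SELECT 1", connection).ExecuteReaderAsync())
            using (var reader2 = await new SqlCommand("SELECT 2", connection).ExecuteReaderAsync())
            {
                await reader1.ReadAsync();
                await reader2.ReadAsync();
            }
        }
    }
}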
