
How to troubleshoot 100% CPU usage issue with a TCP server?

I've made a TCP server in C# (as a Windows service) that is based largely on the sample Asynchronous Server Socket code on MSDN. However, after a couple of days in operation it starts using 100% of the CPU on the dev machine and stays there until I stop and restart the service. The weird thing is that it keeps working correctly even at maximum CPU usage, so functionality is never an issue.

I asked about it on Stack Overflow here - http://goo.gl/XB2C5 - but I guess there were no obvious issues with the code I had pasted. I've monitored the number of threads the program uses, and it always stays between 14 and 17, so I don't think that's the issue.

Now I'm a bit stuck and don't know how to troubleshoot this problem. Are there any tools I could use, or diagnostic code I could add, to find out what's causing the CPU usage to spike? I just need some guidance on how to investigate the problem further.

Any help would be greatly appreciated. Thanks!

If you add a Thread.Sleep(0) call at the end of the following loop, does that stop the 100% CPU?

 while (true)
 {
     // Set the event to the non-signaled state.
     allDone.Reset();

     // Start an asynchronous socket to listen for connections.
     listener.BeginAccept(new AsyncCallback(AcceptCallback), listener);

     // Wait until a connection is made before continuing.
     allDone.WaitOne();

     Thread.Sleep(0);  // Gives up the rest of the current timeslice; the thread runs again on its next scheduled slice.
     //Thread.Yield(); // Yields to another thread that is ready to run on the current processor, which can resume sooner than Thread.Sleep(0).
 }
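
On the tooling side of the question: Process Explorer (from Sysinternals) shows per-thread CPU usage in a process's Threads tab, so you can spot the ID of the thread that is spinning. You can also log the same information from inside the service itself. Below is a minimal sketch of such diagnostic code; the CpuDiagnostics class name, the one-minute interval, and the Console logging are all illustrative assumptions, not part of the original code - a real service would write to a log file or the event log instead.

 using System;
 using System.Diagnostics;
 using System.Threading;

 static class CpuDiagnostics
 {
     private static Timer _timer; // held in a static field so the timer is not garbage collected

     // Logs each OS thread's accumulated CPU time once a minute. The thread
     // whose TotalProcessorTime keeps climbing between snapshots is the one
     // burning CPU; its Id matches the thread IDs shown in Process Explorer.
     public static void StartLogging()
     {
         _timer = new Timer(_ =>
         {
             foreach (ProcessThread t in Process.GetCurrentProcess().Threads)
             {
                 // Replace Console with your service's logging mechanism.
                 Console.WriteLine("Thread {0}: total CPU {1}", t.Id, t.TotalProcessorTime);
             }
         }, null, TimeSpan.Zero, TimeSpan.FromMinutes(1));
     }
 }

Once you know the offending thread ID, attaching the Visual Studio debugger (Debug > Attach to Process) and inspecting that thread's call stack will usually show you exactly which loop is spinning.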
