
C# TCP unit tests: SocketException when running all tests in sequence

I have a rather bizarre problem. I have some unit tests that are designed to test exceptions thrown in some TCP communication code I am working on. Individually, all of the tests pass. However, if I try to run them in sequence, the first one passes and all of the remaining tests fail. I am not sure why this is happening. I am creating a new socket in my [TestInitialize] each time. I thought maybe the listener was just hanging, so I added a 10-second wait at the start of test 2, and it still fails. But when I run test one and then test two back to back (about one second apart) individually, everything works.

Any idea what might be causing this? My hunch is that the listen port is not released until the overall test run finishes, so I can't rebind a new socket to that port when I run the tests in sequence.

The failing tests throw a SocketException with the message:

"Only one usage of each socket address (protocol/network address/port) is normally permitted"
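For reference, the error above is easy to reproduce outside of a test framework: binding a second socket to an address that is already owned by a live socket (or by a connection lingering in TIME_WAIT) raises a SocketException with SocketError.AddressAlreadyInUse. The sketch below uses a hypothetical port 9050; the original post does not state which port was used.

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class BindTwiceDemo
{
    static void Main()
    {
        // Hypothetical port for illustration only.
        var endpoint = new IPEndPoint(IPAddress.Loopback, 9050);

        // First "test" binds and listens successfully.
        var first = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
        first.Bind(endpoint);
        first.Listen(1);

        // Second "test" tries to bind the same endpoint while the
        // first socket still owns it.
        var second = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
        try
        {
            second.Bind(endpoint); // throws SocketException
        }
        catch (SocketException ex)
        {
            // AddressAlreadyInUse corresponds to the quoted error message.
            Console.WriteLine(ex.SocketErrorCode);
        }
    }
}
```

This matches the observed pattern: each [TestInitialize] plays the role of the "second" socket once the first test has claimed the port.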

I figured out the mistake. I had been creating the socket and binding it in the [TestInitialize] method, so every test tried to rebind the same port. What I needed to do was create and bind the socket in the [ClassInitialize] method, so it exists once for the whole test class. Then I just start my reading threads in [TestInitialize].
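The fix described above can be sketched as an MSTest class like the following. All names, the port number, and the reader-thread body are illustrative assumptions, not the poster's actual code; the point is only the placement of the bind in [ClassInitialize] versus the per-test work in [TestInitialize].

```csharp
using System.Net;
using System.Net.Sockets;
using System.Threading;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class TcpExceptionTests
{
    // Bound once and shared by every test in the class.
    static Socket _listener;

    // Per-test reading thread.
    Thread _readerThread;

    [ClassInitialize]
    public static void ClassSetup(TestContext context)
    {
        // Bind a single listener for the whole class so the tests
        // never compete for the same port.
        _listener = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
        _listener.Bind(new IPEndPoint(IPAddress.Loopback, 9050)); // hypothetical port
        _listener.Listen(10);
    }

    [TestInitialize]
    public void TestSetup()
    {
        // Per-test work only: start the reading thread against the
        // already-bound shared listener.
        _readerThread = new Thread(() => { /* accept/read logic under test */ });
        _readerThread.IsBackground = true;
        _readerThread.Start();
    }

    [ClassCleanup]
    public static void ClassTeardown()
    {
        // Release the port when the whole class is done.
        _listener.Close();
    }
}
```

An alternative, if each test truly needs its own socket, is to close the socket in [TestCleanup] and/or set SocketOptionName.ReuseAddress before binding, but sharing one listener per class as above avoids the rebind entirely.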
