Why does deploying to one app pool affect the CPU usage of the other?

This is not a problem as such, but I am asking out of curiosity because I did not expect this behavior.

On the server, I have two copies of the web app. One copy at location dir1 is used mainly for testing and debugging purposes. The other at dir2 is the production site.

Earlier, both were using the same app pool. I changed this so that the test copy now uses an app pool called testAppPool while the production copy uses DefaultAppPool.

Now, whenever I deploy a new build to the test copy, the CPU usage of the w3wp.exe process for DefaultAppPool also goes up. Why does this happen? Aren't they separate processes that shouldn't be affected by changes to the other?
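For what it's worth, one way to double-check which w3wp.exe belongs to which app pool (so you can see whose CPU is actually climbing in Task Manager) is to ask IIS directly. This is only a sketch: it assumes appcmd.exe is at its default location, that the script runs from an elevated prompt on the IIS server, and that `appcmd list wp` prints lines in the usual `WP "pid" (applicationPool:Name)` shape.

```python
import re
import subprocess

# Path to appcmd.exe in a default IIS install -- adjust if IIS lives elsewhere.
APPCMD = r"C:\Windows\System32\inetsrv\appcmd.exe"


def worker_processes():
    """Return (pid, app_pool) pairs for the currently running w3wp.exe processes.

    Assumes `appcmd list wp` prints lines shaped roughly like:
        WP "1234" (applicationPool:DefaultAppPool)
    Run from an elevated prompt, since appcmd needs admin rights.
    """
    output = subprocess.run(
        [APPCMD, "list", "wp"], capture_output=True, text=True, check=True
    ).stdout
    pattern = re.compile(r'WP\s+"(\d+)"\s+\(applicationPool:([^)]+)\)')
    return [(int(pid), pool) for pid, pool in pattern.findall(output)]


if __name__ == "__main__":
    for pid, pool in worker_processes():
        print(f"w3wp.exe PID {pid} -> app pool {pool}")
```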

In fact, I found out today that a virtual directory in the production site also points to the test folder, and that seems to be the reason: presumably the production application's file-change monitoring picks up the new build landing in that folder and triggers a recompile/recycle in its own worker process.
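A quick way to spot overlaps like this is to dump every virtual directory with its physical path and scan for any production entry that points into the test folder. Again just a sketch, under the same assumptions about where appcmd.exe lives and roughly what its output looks like:

```python
import subprocess

# Path to appcmd.exe in a default IIS install -- adjust if needed.
APPCMD = r"C:\Windows\System32\inetsrv\appcmd.exe"


def print_virtual_dirs():
    r"""Print every virtual directory IIS knows about, one per line.

    Assumes `appcmd list vdir` prints lines shaped roughly like:
        VDIR "Default Web Site/" (physicalPath:C:\inetpub\wwwroot)
    so any entry under the production site whose physicalPath points at the
    test folder is easy to spot by eye or with a simple text search.
    """
    output = subprocess.run(
        [APPCMD, "list", "vdir"], capture_output=True, text=True, check=True
    ).stdout
    print(output)


if __name__ == "__main__":
    print_virtual_dirs()
```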
