Why does deploying to one app pool affect the CPU usage of the other?
This is not really a problem, but I am asking out of curiosity, as I didn't expect this behavior.
On the server, I have two copies of the web app. One copy, at location dir1, is used mainly for testing and debugging; the other, at dir2, is the production site.
Earlier, both were using the same app pool. I changed this so that the test copy now uses an app pool called testAppPool, while the production copy uses DefaultAppPool.
Now, whenever I deploy a new build to the test site, the CPU usage of the w3wp.exe process for DefaultAppPool also goes up. Why does this happen? Aren't they separate processes that shouldn't be affected by changes to the other?
In fact, a virtual directory in the production site also pointed at the test folder. I only found this out today, and it seems to be the reason.
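One way to confirm this kind of overlap is with IIS's appcmd tool, which can list every application's app pool and every virtual directory's physical path. A minimal diagnostic sketch, assuming IIS 7 or later on the server (the output shown is illustrative, not from my setup):

```shell
:: Run in an elevated command prompt on the server (IIS 7+).

:: List every application and the app pool it runs in, e.g.
:: APP "Default Web Site/" (applicationPool:DefaultAppPool)
%windir%\system32\inetsrv\appcmd.exe list app

:: List every virtual directory and its physical path, to spot a
:: production vdir whose physicalPath points at the test folder:
%windir%\system32\inetsrv\appcmd.exe list vdir
```

If a production virtual directory's physicalPath resolves into the test deployment folder, file changes made during a test deploy would be visible to the production worker process as well.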