
Performance implications of monitoring a process?

I have 3 servers with processes that require all the CPU they can get. I let these processes write their standard output to a file:

./run.sh > run.log

Would this writing slow down the process? (The ./run.sh script starts, e.g., a Java program and sometimes a Ruby program.)

Now I want to create a web interface that would display the output from the script while it is running. I can imagine writing a PHP script that refreshes every 5 seconds, creates an SSH connection to the server, and fetches the run.log file.

But wouldn't that interfere with the process or slow it down? It is really crucial that the server can use as much of its power as possible. Or are there better ways to handle this? Instead of creating an SSH connection every 5 seconds, maybe a persistent connection updated with Ajax? (Security is not a requirement.)

Would this writing slow down the process? (The ./run.sh script starts, e.g., a Java program and sometimes a Ruby program.)

Maybe; if the process writes a lot of data, the logging can easily slow the process down, because the process will likely be writing synchronously to the disk. Otherwise, you don't need to worry.
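A quick, hedged way to gauge the cost on your own hardware is to compare writing to disk against discarding the output entirely (the numbers depend entirely on your disk and on how much the Java/Ruby programs actually print; seq stands in for ./run.sh here):

```shell
# seq simulates a chatty process -- substitute your real workload.
time sh -c 'seq 1 100000 > run.log'     # output written to disk
time sh -c 'seq 1 100000 > /dev/null'   # output discarded, for comparison
```

If the difference is significant for your workload, reducing the output volume (or logging less verbosely) will help more than anything else.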

An alternative would be having a setup where the script sends the output to the machine with the web application via some kind of message service. This would avoid polling the server; whether it would be more efficient depends on the details.

A simple and efficient mechanism would be forwarding stdout to a UDP socket and having the web application listen and temporarily store those messages in a circular buffer.
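A minimal sketch of the sending side, assuming bash (for its /dev/udp pseudo-device); WEBHOST and PORT are placeholders for the machine running the web application, which would read the datagrams and keep only the last N lines in memory:

```shell
#!/usr/bin/env bash
# Forward each stdout line of run.sh as a UDP datagram while still keeping
# the local log file. WEBHOST and PORT are assumptions -- replace them with
# the web application's actual address.
WEBHOST=127.0.0.1
PORT=9999

./run.sh | tee run.log | while IFS= read -r line; do
    # bash's /dev/udp/<host>/<port> sends one datagram per redirection
    printf '%s\n' "$line" > "/dev/udp/$WEBHOST/$PORT"
done
```

UDP is connectionless and lossy, but that fits this use case: the sender never blocks waiting on the web machine, and a dropped datagram only costs one log line in the monitoring view.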
