
How to wait until a process terminates before continuing to execute - C#

So I launch a bunch of processes to convert some audio files, and I want my main program to wait until all of those processes complete before continuing.

        System.Diagnostics.Process process = new System.Diagnostics.Process();
        System.Diagnostics.ProcessStartInfo startInfo = new System.Diagnostics.ProcessStartInfo();
        startInfo.WindowStyle = System.Diagnostics.ProcessWindowStyle.Hidden;
        DirectoryInfo di = new DirectoryInfo(dir);
        foreach (FileInfo fi in di.GetFiles())
        {
            startInfo.FileName = "C:\\AudioExtract.exe";
            startInfo.Arguments = "-a \"" + dir + "\\" + fi.Name + "\"";
            process.StartInfo = startInfo;
            process.Start();
        }
        foreach (Process clsProcess in Process.GetProcesses())
        {
            // Note: ProcessName never includes the ".exe" extension
            if (clsProcess.ProcessName.Contains("AudioExtract"))
            {
                StatusLbl.Text = "Found!";
            }
        }

That's what I have to see if it is running, but I need it to keep re-querying GetProcesses and checking whether any of them are still running, and I'm not quite sure how.

My app launches many instances of the same process for different audio files, almost simultaneously. I looked at the link in the comment, and that will set up an event handler; how would I handle many of the same event, every time one of the processes exits?
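One common pattern for fanning in many Exited events is a shared countdown: every child signals the same CountdownEvent, and the main thread waits for it to reach zero. A minimal sketch (not from the answers below; `WaitForAll` is a name made up here, and the behavior of Exited firing immediately for an already-exited process when EnableRaisingEvents is set is the documented .NET behavior):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Sketch: one Exited handler shared by every child process.
// Each handler invocation decrements the same CountdownEvent.
static void WaitForAll(Process[] children)
{
    using var remaining = new CountdownEvent(children.Length);
    foreach (Process p in children)
    {
        p.Exited += (sender, args) => remaining.Signal();
        // Must be enabled for Exited to fire; if the process has
        // already exited by this point, Exited is raised immediately.
        p.EnableRaisingEvents = true;
    }
    remaining.Wait();   // blocks until every child has signalled
}

WaitForAll(Array.Empty<Process>());   // degenerate demo: returns immediately
```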

inside the if, try

clsProcess.WaitForExit();
break;

http://msdn.microsoft.com/en-us/library/fb4aw7b8.aspx

That code will wait until the process exits, then break out of your foreach, allowing your main program to continue running.

There are some issues that will make your code practically unusable:

  1. Firing up as many processes as you have files: big no-no. You will congest your CPU and won't get any benefit from your super multicore machine after all. Rule of thumb: up to 2 processes per core. That will warm up the processor just fine.
  2. Disk fragmentation. Writing to 100 files at once will leave your hard drive so fragmented, you'll have it choke in no time.
  3. Reusing the Process object: again, a bad thing. Instead, create one Process instance per iteration of the loop, and store it in some kind of List. If you really stick with the idea of 'run all at once': start them, store them in a list, then iterate the list and wait for each one to complete!
  4. Creating the processes, then asking the system for its full process list and searching it by name: why, when you created them in the first place?
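Point 3 could be sketched like this (assuming the AudioExtract.exe command line from the question; `StartAll` and `WaitAll` are helper names made up for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;

// Sketch of point 3: a fresh Process per file, collected in a list,
// then waited on one by one after all of them have been started.
static List<Process> StartAll(string exePath, string dir)
{
    var running = new List<Process>();
    foreach (FileInfo fi in new DirectoryInfo(dir).GetFiles())
    {
        var info = new ProcessStartInfo
        {
            FileName = exePath,
            Arguments = "-a \"" + fi.FullName + "\"",
            WindowStyle = ProcessWindowStyle.Hidden
        };
        running.Add(Process.Start(info));   // new Process object every time
    }
    return running;
}

static void WaitAll(List<Process> running)
{
    foreach (Process p in running)
    {
        p.WaitForExit();
        p.Dispose();
    }
}

// e.g.: WaitAll(StartAll("C:\\AudioExtract.exe", dir));
WaitAll(new List<Process>());   // degenerate demo: empty list, returns immediately
```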

EDIT:

How you could do it:

  • investigate how many CPU cores you have
  • create an array twice that size
  • in the foreach loop, do this:
    • determine whether you have a place in your array (any of the slots is null)
    • if so: create a new process and put it into the array
    • if not, loop: check (non-blocking) whether any of your processes has completed; set the first one that has to null (inside the array); if none are done, Sleep a little (my magic number is 350; choose your own)
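The steps above could be sketched like this (the 350 ms sleep is the answer's magic number; `RunThrottled` and its delegate parameter are names made up here, not an existing API):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

// Sketch: at most 2 * core-count children alive at once,
// per the rule of thumb from the answer above.
static void RunThrottled(IEnumerable<string> files, Func<string, Process> startOne)
{
    var slots = new Process[Environment.ProcessorCount * 2];
    foreach (string file in files)
    {
        int free;
        while ((free = Array.FindIndex(slots, p => p == null)) < 0)
        {
            // No empty slot: reap the first finished child, or nap briefly.
            int done = Array.FindIndex(slots, p => p.HasExited);
            if (done >= 0) { slots[done].Dispose(); slots[done] = null; }
            else Thread.Sleep(350);
        }
        slots[free] = startOne(file);
    }
    foreach (Process p in slots)          // drain whatever is still running
    {
        if (p != null) { p.WaitForExit(); p.Dispose(); }
    }
}

RunThrottled(Array.Empty<string>(), f => null);   // degenerate demo: no files, no children
```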

The technical post webpages of this site follow the CC BY-SA 4.0 protocol. If you need to reprint, please indicate the site URL or the original address. Any questions, please contact: yoyou2525@163.com.

 
粤ICP备18138465号  © 2020-2024 STACKOOM.COM