
How can I use Perl to start a remote process and not wait for it to finish?

I am running multiple batch files on a remote machine using a Perl script residing on the local machine, and I want to run these batch files for a long duration.

The problem is that the Perl program running on the local machine halts and executes the subsequent commands only after the batch files end.

I want to start the batch files on the remote machine and then execute the rest of the commands in my Perl script without waiting for them to finish.

Please help me out.

How are you running the remote processes? The best answer will probably depend on the specific implementation. But assuming you're using something like Net::SSH or Expect or some sort of RPC mechanism, the easiest thing is probably to fork a new process to run the remote job and then continue on with your script.

my $pid = fork;
if ( ( defined $pid ) and $pid == 0 ) {       
    # child process
    do_remote_batch_jobs();
} elsif ( defined $pid ) { 
    # parent process
    do_other_stuff();
} else { 
    # fork error
    die "Unable to fork: $!";
}
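Alternatively, if you don't need to manage the child process from Perl at all, you can let the shell background the command so that `system` returns immediately. Here is a minimal sketch; `sleep 5` is a stand-in for whatever remote invocation you actually use (e.g. an `ssh user@host 'batch.bat'` command — hypothetical host and path), and the trailing `&` assumes a Unix-style shell on the local machine:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# The trailing '&' tells the local shell to background the job, so
# system() returns as soon as the shell does -- not when the job ends.
# 'sleep 5' stands in for a long-running remote command such as
#   ssh user@remotehost 'C:\jobs\nightly.bat'   (hypothetical)
my $start = time;
system("sleep 5 &");
my $elapsed = time - $start;

print "system() returned after $elapsed second(s)\n";
# The script is free to continue while the backgrounded job runs.
```

One caveat: a job started this way is fire-and-forget — you get no exit status back, so this only fits cases where you don't need to check whether the batch file succeeded.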

Yes, you could use fork, but I think a better solution would be to have a script on the remote machine that accepts a batch job and returns its id.
The current status of a submitted job could then be retrieved through the same script. This way the client (i.e., your machine) would be independent of managing jobs.
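A minimal sketch of what such a remote-side script might look like, using the child's PID as a stand-in for a job id (a real job manager would persist ids across invocations and reap finished children with `waitpid`; the `submit_job`/`job_status` names are hypothetical):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Fork the requested command into the background and return the
# child's PID as a simple job id.
sub submit_job {
    my ($cmd) = @_;
    my $pid = fork;
    die "Unable to fork: $!" unless defined $pid;
    if ( $pid == 0 ) {
        exec $cmd or die "exec failed: $!";
    }
    return $pid;
}

# kill with signal 0 sends nothing; it only checks whether the
# process still exists. (Simplified: a reaped-vs-zombie distinction
# would need waitpid with WNOHANG.)
sub job_status {
    my ($id) = @_;
    return kill( 0, $id ) ? 'running' : 'finished';
}

my $id = submit_job('sleep 2');
print "submitted job $id, status: ", job_status($id), "\n";
```

The client would then talk to this script (over SSH or an RPC layer) with the job id, instead of holding a live connection open for the whole run.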
