
How to prevent a Perl script from running more than once in parallel

I have a script on Linux that potentially runs for a long time. While it is running, a second invocation by the same or a different user should detect that and refuse to run. I'm trying to figure out how to create a suitable semaphore that gets cleaned up even if the process dies for some reason.

I came across How to prevent PHP script running more than once? which of course can be applied, but I'm wondering whether this can be done better in Perl.

For example, Perl has a "clean up created temp file at process exit" facility (File::Temp's UNLINK option) that I believe triggers regardless of how the process ended. But that API can only create temp files with randomized names, so it won't give me a predictable file name to use as a semaphore. I don't understand the underlying mechanism for how the file gets removed, but it sounds like the mechanism exists. How would I do this for a named file, e.g. /var/run/foobar?
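For reference, the File::Temp usage I mean is roughly this:

use File::Temp qw(tempfile);

# Creates a temp file with a randomized name; with UNLINK => 1,
# File::Temp deletes it when the program exits (via an END block).
my ($fh, $filename) = tempfile(UNLINK => 1);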

Or are there better ways of doing this?

use strict;
use warnings;
use Fcntl ':flock';

# Take an exclusive, non-blocking lock on the script's own DATA handle;
# if another instance already holds it, flock fails and we die.
flock(DATA, LOCK_EX|LOCK_NB) or die "There can be only one! [$0]";

# mandatory section; flocking depends on the DATA file handle
__DATA__
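This works because __DATA__ gives the script a read handle on its own source file, and the kernel releases a flock automatically when the holding process exits, however it dies, so there is no stale semaphore to clean up. The same idea works with a named file such as the /var/run/foobar from the question; a minimal sketch (the path is only an example, and the lock file must be openable by every user who may run the script):

use strict;
use warnings;
use Fcntl ':flock';

my $lockfile = '/var/run/foobar';

# Keep $lock_fh open for the life of the process; the kernel releases
# the lock when the process exits, whether normally or by crashing.
open(my $lock_fh, '>>', $lockfile) or die "Cannot open $lockfile: $!";
flock($lock_fh, LOCK_EX|LOCK_NB)
    or die "Another instance is already running. [$0]";

# ... long-running work goes here ...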

One method is to put the PID in the file; the script can then check whether that process is still running.

open(my $fh, '>', '/var/run/foobar') or die "Cannot write /var/run/foobar: $!";
print $fh "$$\n";   # $$ is the current process ID
close $fh;

Then you can read it with:

if (open(my $fh, '<', '/var/run/foobar')) {
    my $PID = <$fh>;    # the file holds a single PID
    close $fh;
    if (defined $PID) {
        chomp $PID;
        my $proc = `ps hp $PID -o %c`;
        chomp $proc;    # strip the trailing newline from ps output
        exit() if $proc eq "foobar";
    }
}

You probably want to do those in the other order: check for an existing PID first, and only write your own once you know no other instance is running.
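A minimal sketch of the combined flow in that order, reusing the path and the "foobar" process name from the snippets above (note that check-then-write still leaves a small race window between the check and the write, which the flock approach above avoids):

use strict;
use warnings;

my $pidfile = '/var/run/foobar';

# 1. Check whether a previously recorded PID still belongs to a
#    running "foobar" process.
if (open(my $fh, '<', $pidfile)) {
    my $pid = <$fh>;
    close $fh;
    if (defined $pid) {
        chomp $pid;
        my $proc = `ps hp $pid -o %c`;
        chomp $proc;
        die "Already running as PID $pid\n" if $proc eq "foobar";
    }
}

# 2. Only then record our own PID.
open(my $out, '>', $pidfile) or die "Cannot write $pidfile: $!";
print $out "$$\n";
close $out;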
