
Perl mail queueing with threads: 'out of memory'

I created a mail queue script that checks a MySQL table for mail whose timestamp is in the past and is due to be sent. Sometimes there is more mail in the 'mailqueue' than I want to send at once, so I used threads to send the mail in batches. When there is not much mail to send, the system runs steadily, but at other times this mailqueue process, which is always running (via a bash script that calls it every 30 seconds), gets killed because the system runs out of memory. I would like to prevent the 'mailqueue' from running out of memory.

Could anyone take a look at my code below? Maybe I'm doing something wrong.

Thanks in advance.

~$ htop
PID  USER       PRI  NI VIRT  RES   SHR  S  CPU% MEM%   TIME+   Command
5675 root       20   0  969M  528M  3744 S  0.0  14.1   2:33.05  perl /path/mailqueue.pl

Server specs: 1 vCPU; 3.75 GiB memory.

use threads;   # thread support must be loaded first
use DBI;       # $dbh is assumed to be an already-connected handle

sub start_threads {

    my @threads;

    # records found
    my $found = 0;

    my $sth = $dbh->prepare("SELECT mail_queue_id, project_id, type, name, email FROM mail_queue WHERE timestamp < NOW() AND active = 1 ORDER BY timestamp ASC LIMIT 10");
    $sth->execute();

    while (my $ref = $sth->fetchrow_hashref()) {
        # set if records are found
        $found = 1;

        # set email variables 
        my $id = $ref->{'mail_queue_id'};
        my $project_id = $ref->{'project_id'};
        my $type = $ref->{'type'};
        my $name = $ref->{'name'};
        my $email = $ref->{'email'};

        # create array with data
        my @select_arr = ($id, $project_id, $type, $name, $email);

        # start thread to send mail
        my $t = threads->new(\&sendmail, @select_arr);
        push(@threads,$t);
    }

    foreach (@threads) {
        # mail_queue_id
        my $id = $_->join;

        print "set email $id in queue inactive\n";  

        # set mail_queue record inactive -> MYSQL(event) mailqueue cleanup every 10 minutes
        my $sth = $dbh->prepare("UPDATE mail_queue SET active = 0 WHERE mail_queue_id = ? ");
        $sth->execute($id);
    }

    if($found == 1) { # rows were processed; check the queue again shortly
        sleep(10);
        &start_threads;
    }
    else { # no rows found; wait 30 seconds before selecting new rows
        sleep(30);
        &start_threads;
    }
}

# prepare send e-mail
sub sendmail {
    # unpack queue variables
    my ($id, $project_id, $type, $name, $email) = @_;

    print "started sending email $id\n";

    # call the function which actually sends the mail
    my $send = &email($id, $project_id, $type, $name, $email);

    # if mail is sent
    if($send == 1) {
        print "done with sending email " . $id . "\n";

        sleep (1);

        # return unique id
        return $id;
    }
}

&start_threads;

What you're doing is potentially quite expensive: when Perl creates a thread, it makes a copy of your whole process, including imported modules, data, and state. If your table returns a lot of rows, you will eat up memory very quickly.

You can see this by running ps -efT .

For what you're doing, the way you're doing it is a bad idea. I would suggest two alternatives:

  • stick with threads, but start a fixed number (say, 10) and use Thread::Queue to serialise your data to them. This limits the number of process copies and the thread startup overhead.

  • switch to using fork() . Parallel::ForkManager will do more or less what you want here. fork() is a more efficient way of cloning a process: it only copies memory pages on demand (copy-on-write), so your child processes stay much smaller.
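The first option might look something like this minimal sketch, assuming a threads-enabled perl; the job payloads and the worker body (which just prints) are stand-ins for the real mail_queue rows and sendmail call:

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;

my $queue = Thread::Queue->new;

# Fixed pool of 3 worker threads; each drains jobs until the queue ends.
my @workers = map {
    threads->create(sub {
        my $handled = 0;
        while (defined(my $job = $queue->dequeue)) {
            my ($id, $email) = @$job;
            # a real worker would call the existing mail-sending logic here
            print "sent mail $id to $email\n";
            $handled++;
        }
        return $handled;   # per-worker job count, collected via join
    });
} 1 .. 3;

# Enqueue work; in the real script these come from the SELECT on mail_queue.
$queue->enqueue([$_, "user$_\@example.com"]) for 1 .. 10;

$queue->end;               # no more jobs; dequeue now returns undef
my $total = 0;
$total += $_->join for @workers;
print "handled $total jobs\n";
```

Because the workers are created once and reused, you pay the interpreter-copy cost only three times, however many rows the SELECT returns.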

I'll offer some examples I gave in a previous answer: Perl daemonize with child daemons
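For the fork() route, here is a minimal core-Perl sketch of the throttling idea that Parallel::ForkManager packages up for you; the loop over ids and the child body are placeholders for the real queue rows:

```perl
use strict;
use warnings;

my $max_procs = 3;   # never more than 3 children alive at once
my %kids;            # pid => job id, for the children still running

for my $id (1 .. 10) {
    # throttle: when the pool is full, block until one child finishes
    if (keys %kids >= $max_procs) {
        my $done = waitpid(-1, 0);
        delete $kids{$done};
    }

    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # child: do the per-mail work, then exit so its memory is reclaimed
        print "child $$ sending mail $id\n";
        exit 0;
    }
    $kids{$pid} = $id;   # parent: remember the child
}

# reap whatever is still running
while ((my $done = waitpid(-1, 0)) > 0) {
    delete $kids{$done};
}
print "all children reaped\n";
```

One caveat when adding fork() to the existing script: a DBI handle should not be shared across a fork, so each child needs its own connection, or all database work stays in the parent (as the original code already does after join).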
