
PHP/MySQL - Multiple queries at the same time

I have 24 databases with a table labeled email_queue.

I have another database with a list of all the databases that have the email_queue table in it.

I loop through the list of databases and query each database's email_queue table to send its emails.

The problem with this is that the PHP script gets held up on, let's say, the 3rd database while it sends 500 emails, leaving the databases after that one waiting for their turn.

I am trying to figure out how I can query all 24 databases and send their email queues at the same time.

Any suggestions?

I think having this many databases is probably a sign of bad design. If you can't change it and need to move forward now, I suggest one of two options:

  1. Run the same script with a parameter to select which database to use. You should be able to find resources on how to do this.
  2. Use non-blocking queries; the rest of this answer will be spent talking about this.

Here's a somewhat complete example using the mysqli extension (requires the mysqlnd driver):

$credentials = array(
    array(
        'host' => 'host1',
        'user' => 'user',
        'password' => 'password',
        'database' => 'database'
    ),
    array(
        'host' => 'host2',
        'user' => 'user',
        'password' => 'password',
        'database' => 'database'
    ),
    // credentials for other sites
);
$dbcs = array();
foreach ($credentials as $config) {
    // keep each connection in its own single-element array, since mysqli_poll() expects an array of links
    $dbcs[] = array($db = new mysqli(
        $config['host'],
        $config['user'],
        $config['password'],
        $config['database']
    ));
    $query = ""; // here is your query to do whatever it is with your table
    $db->query($query, MYSQLI_ASYNC); // MYSQLI_ASYNC makes query() return immediately; the result is collected below
}

$results = array();
$errors = array();
$rejected = array();
$secondsToWait = 1;

while (!empty($dbcs)) {
    foreach ($dbcs as $key => $c) {
        $db = $c[0];
        // mysqli_poll() reports how many of the connections in $c have a result ready
        if (mysqli_poll($c, $errors, $rejected, $secondsToWait) == 1) {
            $r = $db->reap_async_query();

            // here you would do your fetches for each query, such as
            $results[] = $r->fetch_assoc();

            // do what you need to do with the result

            // then cleanup
            $r->free();
            $db->close();
            unset($dbcs[$key]);
        }
    }
}

Note that this approach does have drawbacks; for example, a failed query may bring down the whole program.
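
One way to soften that (a sketch only, reusing the variables from the loop above) is to check what reap_async_query() returns and log failures instead of letting a bad query take down the whole run:

if (mysqli_poll($c, $errors, $rejected, $secondsToWait) == 1) {
    $r = $db->reap_async_query();
    if ($r === false) {
        // the query on this connection failed; record the error and keep going
        error_log("Async query failed: " . $db->error);
    } elseif ($r instanceof mysqli_result) {
        $results[] = $r->fetch_assoc();
        $r->free();
    }
    $db->close();
    unset($dbcs[$key]);
}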

One way to do this is with curl_multi_init().

Split your script into two: make one PHP file (say email_out.php) take the db name (or some variable used to look up the db name; the switch can live either in the for loop or in email_out.php), and do the mass email from that one script.
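
The email_out.php side might look roughly like this (a sketch only; the databases.php config file and the send_email_queue() helper are hypothetical placeholders for your own lookup and mailing code, and ?t=N matches the parameter used in the loop further down):

<?php
// email_out.php - sends the queue for one database, selected by ?t=N
$databases = require 'databases.php';   // hypothetical: returns an array of connection configs
$t = isset($_GET['t']) ? (int) $_GET['t'] : -1;

if (!isset($databases[$t])) {
    echo json_encode(array('error' => 'unknown database index'));
    exit;
}

$cfg = $databases[$t];
$db  = new mysqli($cfg['host'], $cfg['user'], $cfg['password'], $cfg['database']);

// hypothetical helper wrapping your existing mail-sending loop for this database
$sent = send_email_queue($db);

// the Fork class below json_decode()s each response, so return JSON
echo json_encode(array('database' => $t, 'sent' => $sent));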

The second part uses curl_multi to request email_out.php multiple times, effectively creating multiple separate connections to different dbs; the scripts can all finish at different times since they all run in parallel. Essentially, your loop now adds email_out.php to the multi handle several times with different arguments and then executes all of them asynchronously.

class Fork
{
    private $_handles = array();
    private $_mh;

    function __construct()
    {
        $this->_mh = curl_multi_init();
    }

    function add($url)
    {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($this->_mh, $ch);
        $this->_handles[] = $ch;
        return $this;
    }

    function run()
    {
        $running = null;
        // drive all handles until every request has completed
        do {
            curl_multi_exec($this->_mh, $running);
            usleep(250000);
        } while ($running > 0);

        $data = array();
        for ($i = 0; $i < count($this->_handles); $i++) {
            // each response body is expected to be JSON
            $out = curl_multi_getcontent($this->_handles[$i]);
            $data[$i] = json_decode($out);
            curl_multi_remove_handle($this->_mh, $this->_handles[$i]);
        }
        curl_multi_close($this->_mh);
        return $data;
    }
}

(from http://gonzalo123.com/2010/10/11/speed-up-php-scripts-with-asynchronous-database-queries/ )

So your loop would look something like this:

$fork = new Fork;
for ($i = 0; $i < 24; $i++) {
    // curl needs an absolute URL here, i.e. the full http://... address of email_out.php
    $fork->add("email_out.php?t=" . $i);
}
$fork->run();

In your script, try this:

  1. Use set_time_limit(0); in order to override PHP's max_execution_time.
  2. Use the getopt() function to get the database name when the script is run from the command line (i.e. php script.php -d database1 ).
  3. From there, do the logic; a minimal skeleton is sketched after this list.
  4. In your crontab, make an entry for each database you would like to send emails from, using the switch ( -d ) specified in step 2. If you have 20 databases then you must have 20 entries.
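
A minimal skeleton of that script (the process_email_queue() call is a hypothetical placeholder for your existing per-database logic) might look like this:

<?php
// script.php - one invocation handles one database, e.g. "php script.php -d database1"
set_time_limit(0);                   // 1. override PHP's max_execution_time

$options = getopt('d:');             // 2. read the -d switch from the command line
if (empty($options['d'])) {
    exit("Usage: php script.php -d <database>\n");
}

process_email_queue($options['d']);  // 3. hypothetical: your existing queue-sending logic for this database

// 4. one crontab entry per database, for example:
// */5 * * * * php /path/to/script.php -d database1
// */5 * * * * php /path/to/script.php -d database2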

This way you will see a separate PHP process for each cron job, and you can isolate a database if you ever encounter an error with it.
