
Does using LIMIT in MySQL queries make my application faster and more efficient?

Assuming I have more than 1 million records in a table, which of the code samples below will return results faster and more efficiently? I am using PHP.

Should I use this?

$query = mysql_query("select count(*) as count from reversals where seen = 'false'");
while ($q = mysql_fetch_array($query)) {
    $limit = $q['count'];
}

$query2 = mysql_query("select * from reversals where seen = 'false' limit $limit");
while ($q = mysql_fetch_array($query2)) {
    echo $q['amount'];
}

OR:

$query = mysql_query("select * from reversals where seen = 'false'");
while ($q = mysql_fetch_array($query)) {
    echo $q['amount'];
}

Your first code example counts the number of rows and then selects all of them (assuming no concurrent sessions modify the table in between).

This means you select the whole table anyway, so the LIMIT makes no sense there (and doesn't affect performance either).

Wherever you read it, the assumption that adding LIMIT automagically makes your queries faster is wrong.

MySQL performance optimization IS a complicated topic, and there is rarely generic advice that works for everyone in every case.

So if you have a real issue with MySQL performance, explain exactly what it is: provide the actual database schema, some statistics about the data, and so on.
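The point about the first snippet can be sketched in a short, self-contained example (using Python's `sqlite3` standard library in place of MySQL, with made-up table contents): using the matching row count as the LIMIT returns exactly the same rows as using no LIMIT at all.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reversals (amount INTEGER, seen TEXT)")
conn.executemany(
    "INSERT INTO reversals VALUES (?, ?)",
    [(i, "false" if i % 2 else "true") for i in range(1000)],
)

# Count the matching rows first, then use that count as the LIMIT...
(count,) = conn.execute(
    "SELECT COUNT(*) FROM reversals WHERE seen = 'false'"
).fetchone()
limited = conn.execute(
    "SELECT amount FROM reversals WHERE seen = 'false' LIMIT ?", (count,)
).fetchall()

# ...which fetches exactly the same rows as having no LIMIT at all.
unlimited = conn.execute(
    "SELECT amount FROM reversals WHERE seen = 'false'"
).fetchall()

print(limited == unlimited)  # True: the LIMIT changed nothing
```

The extra COUNT(*) query only adds work; it cannot make the second query cheaper.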

OK, to answer your question: I have a table called paymentlog with 709231 records in it. For the test I made sure it has no indexes at all, in particular none on the column used in the WHERE condition.

I ran EXPLAIN and got the following:

explain select * from paymentlog where transdate = '2012-12-01' limit 10 ; 

+----+-------------+------------+------+---------------+------+---------+------+--------+-------------+
| id | select_type | table      | type | possible_keys | key  | key_len | ref  | rows   | Extra       |
+----+-------------+------------+------+---------------+------+---------+------+--------+-------------+
|  1 | SIMPLE      | paymentlog | ALL  | NULL          | NULL | NULL    | NULL | 709231 | Using where |
+----+-------------+------------+------+---------------+------+---------+------+--------+-------------+

You can see it is scanning all the rows in the table even though I have added the LIMIT. So LIMIT does not make your query faster; it only reduces the volume of data returned.

Now if I add an index on transdate in the above table and run the same EXPLAIN, I get:

+----+-------------+------------+------+--------------------+--------------------+---------+-------+------+-------+
| id | select_type | table      | type | possible_keys      | key                | key_len | ref   | rows | Extra |
+----+-------------+------------+------+--------------------+--------------------+---------+-------+------+-------+
|  1 | SIMPLE      | paymentlog | ref  | plog_transdate_idx | plog_transdate_idx | 3       | const | 1069 |       |
+----+-------------+------------+------+--------------------+--------------------+---------+-------+------+-------+

The reported row count is now 1069. In fact it is unlikely that all 1069 rows will be scanned, because scanning stops as soon as the LIMIT is satisfied; but if you fetch with an offset, say LIMIT 1000,70, it will have to scan up to row 1069. Either way, the index on the WHERE column reduces the rows scanned: at most 1069 instead of 709231.

So the conclusion is: the index is what reduces the number of rows scanned. With the same LIMIT, at most 1069 rows are scanned with the index, versus 709231 without it.
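The effect above can be reproduced in miniature (a sketch using Python's `sqlite3` in place of MySQL; sqlite's EXPLAIN QUERY PLAN plays the role of MySQL's EXPLAIN, and the index name plog_transdate_idx is borrowed from the answer): without the index the planner reports a full scan, and after creating the index the same query becomes an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE paymentlog (transdate TEXT, amount INTEGER)")

def plan(sql):
    # The last column of each EXPLAIN QUERY PLAN row describes the access path
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM paymentlog WHERE transdate = '2012-12-01' LIMIT 10"

before = plan(query)   # no index yet: a full table scan
conn.execute("CREATE INDEX plog_transdate_idx ON paymentlog (transdate)")
after = plan(query)    # the same query now searches the index instead

print(before)  # e.g. "SCAN paymentlog"
print(after)   # e.g. "SEARCH paymentlog USING INDEX plog_transdate_idx (transdate=?)"
```

The LIMIT clause is identical in both runs; only the index changes how many rows must be visited.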

In most cases, like your example, YES, because MySQL will stop scanning for results once the limit is reached.

BUT if producing the results depends on something like ORDER BY, LIMIT still helps, though it may not give the speed you expect:

SELECT * FROM table WHERE indexed_field = 2 LIMIT 10;                       -- fast
SELECT * FROM table WHERE indexed_field = 2 ORDER BY noindexfield LIMIT 10; -- slower, uses filesort

Use MySQL's EXPLAIN to optimize SELECTs. It gets more complicated when using GROUP BY.
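The filesort point can be sketched as well (using Python's `sqlite3` instead of MySQL, with hypothetical table and column names; sqlite reports the equivalent of a filesort as a "temp B-tree"): ordering by an indexed column lets LIMIT stop early, while ordering by an unindexed column forces a full sort first.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b INTEGER)")
conn.execute("CREATE INDEX idx_a ON t (a)")

def plan(sql):
    # The last column of each EXPLAIN QUERY PLAN row describes the access path
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

# Rows come out of the index already sorted, so LIMIT can stop after 10 rows.
indexed_order = plan("SELECT * FROM t ORDER BY a LIMIT 10")

# No index on b: all rows must be sorted before the first 10 can be returned
# (MySQL calls this a filesort; SQLite reports a temp B-tree).
unindexed_order = plan("SELECT * FROM t ORDER BY b LIMIT 10")

print(indexed_order)
print(unindexed_order)
```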

Yes.

Fetching a limited batch of rows from the database is faster and better than fetching the entire contents.

This is also how pagination is implemented in PHP.
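A minimal pagination sketch (in Python with `sqlite3` rather than PHP/MySQL; the posts table, its contents, and the page size are all hypothetical): LIMIT bounds the batch size and OFFSET skips the earlier pages.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany(
    "INSERT INTO posts (title) VALUES (?)",
    [(f"post {i}",) for i in range(1, 96)],  # 95 rows in total
)

PAGE_SIZE = 10

def fetch_page(page):
    # LIMIT bounds the batch; OFFSET skips the rows of the earlier pages
    return conn.execute(
        "SELECT title FROM posts ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, (page - 1) * PAGE_SIZE),
    ).fetchall()

print(fetch_page(1)[0])  # ('post 1',)
print(len(fetch_page(10)))  # 5 — the last page holds only the remaining rows
```

Note that large OFFSET values still make the server walk past all the skipped rows, so very deep pages get slower; keyset pagination (WHERE id > last_seen_id LIMIT n) avoids that.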
