
Handling large data with MySQL

I have a large set of data that I need to pull from MySQL, probably 100,000+ rows. I have already optimized the processing of the data using indexing and other techniques. The problem I am facing right now is a memory overflow: whenever I try to pull more than 100,000 rows, PHP reports a memory-size error. I have raised the memory limit to 512M, but the data may keep growing, so I cannot increase the memory every time. Is there a better way of handling this? I am using CakePHP, and I need all the data at once for my system.

You can't escape the fact that memory is limited. Period. The requirement is silly, but fine: you'll have to process the data in chunks that fit into the available memory and either send each chunk to the client or append it to the file you're writing.
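A minimal sketch of what I mean, assuming CakePHP 2.x, a hypothetical `Record` model, and an arbitrary chunk size (adjust both to your app):

<?php
// Fetch one page of rows at a time and append it to a file, so only one
// chunk is ever held in memory. Model name, path and sizes are placeholders.
$Record = ClassRegistry::init('Record');

$chunkSize = 5000;
$page = 1;
$fh = fopen(TMP . 'records.csv', 'a');

do {
    $rows = $Record->find('all', array(
        'limit' => $chunkSize,
        'page'  => $page,
        'order' => array('Record.id' => 'ASC'), // stable order so pages don't overlap
    ));

    foreach ($rows as $row) {
        fputcsv($fh, $row['Record']);
    }

    $page++;
} while (count($rows) === $chunkSize);

fclose($fh);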

This is basically AJAX pagination or "endless loading", except that you don't change the page but append each new page to the previous one in the DOM tree until you reach the last page. Have fun enjoying a probably very slow site once all records are loaded and millions of elements exist in the DOM tree. ;)
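The server side of that is just a paginated endpoint; the client keeps requesting the next page and appending the rows to the DOM until nothing more comes back. A sketch, again CakePHP 2.x with a made-up `Record` model and page size:

<?php
class RecordsController extends AppController {

    // Returns one page of rows as JSON for the "endless loading" client.
    public function page() {
        $pageSize = 200;
        $page = isset($this->request->query['page']) ? max(1, (int)$this->request->query['page']) : 1;

        $rows = $this->Record->find('all', array(
            'limit' => $pageSize,
            'page'  => $page,
            'order' => array('Record.id' => 'ASC'),
        ));

        $this->response->type('json');
        $this->response->body(json_encode(array(
            'rows'    => $rows,
            'hasMore' => count($rows) === $pageSize, // client stops when this is false
        )));
        return $this->response;
    }
}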

Another way could be to generate the view in a background job (a shell run as a cron task) and then just send the pre-generated file to the client like a static page. The shell would have to paginate the data as well and append it to the file to work around the memory limit.
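Roughly like this, as a CakePHP 2.x shell you would trigger from cron (`Console/cake export`); the class name, model and output path are placeholders:

<?php
class ExportShell extends AppShell {

    public $uses = array('Record');

    public function main() {
        $chunkSize = 5000;
        $page = 1;
        // Write into webroot so the finished file can be served statically.
        $fh = fopen(WWW_ROOT . 'exports' . DS . 'records.csv', 'w');

        do {
            // Paginate inside the shell as well, so the job itself stays
            // within the memory limit.
            $rows = $this->Record->find('all', array(
                'limit' => $chunkSize,
                'page'  => $page,
                'order' => array('Record.id' => 'ASC'),
            ));

            foreach ($rows as $row) {
                fputcsv($fh, $row['Record']);
            }

            $page++;
        } while (count($rows) === $chunkSize);

        fclose($fh);
        $this->out('Export written to webroot/exports/records.csv');
    }
}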
