
How to store millions of rows in a two-column MySQL table?

I have around 600 million records in text and CSV files, and I would like to store them in MySQL for filtering. The files contain only two columns: one for the SKU and another for a unique ID.

So my question is: how can I design the table structure to get faster responses in a shared hosting environment?

600 million rows in a shared hosting environment! Any hosting company that sees you running even a slightly complicated query on a data set of that size is bound to red-flag your account. Depending on the size of the data, I'd suggest getting your own SSD VM or a large dedicated machine. If it's a short-term requirement, I'd even suggest a high-end, memory-intensive EC2 instance.

With regard to structure, there's very little you can do with such a small number of columns, apart from indexing your data correctly. Can you provide a few example rows of data?
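As an illustration, here is a minimal sketch of one possible layout, assuming the SKU is a short string and the unique ID is numeric; the table name, column names, types, and file path are all assumptions, not details from the question.

-- A possible two-column table; names and types are assumptions.
CREATE TABLE sku_lookup (
    unique_id BIGINT UNSIGNED NOT NULL,
    sku       VARCHAR(64)     NOT NULL,
    PRIMARY KEY (unique_id),   -- fast lookups by ID
    KEY idx_sku (sku)          -- secondary index for filtering by SKU
) ENGINE=InnoDB;

-- Bulk-load the CSV file; the path and column order are assumptions.
LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE sku_lookup
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(sku, unique_id);

For an import of this size, LOAD DATA INFILE is typically much faster than row-by-row INSERTs; on shared hosting it may be disabled, in which case LOAD DATA LOCAL INFILE or large batched inserts are the usual fallback.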

With a data set of that size, you may even want to look at a distributed solution such as MongoDB, so that queries can be offloaded onto multiple high-performance servers (again, SSD VMs?).
