
LOAD DATA INFILE without a file in MySQL with PHP

I receive files in a streamed manner once every 30 seconds. The files may have up to 40 columns and 50,000 rows, and they are tab-separated txt files. Right now, I save each file temporarily, load its contents into a temporary table in the database with LOAD DATA INFILE, and delete the file afterwards.

I would like to avoid the save and delete process and instead save the data directly to the database. The stream is the $output here:

protected function run(OutputInterface $output)
{
    $this->readInventoryReport($this->interaction($output));
}

I've been googling for an answer to this that holds up when performance is a big issue, but I can't find a good way of doing it without saving the data to a file and using LOAD DATA INFILE. I need to have the contents available quickly and work with them after they are saved to a temporary table (updating other tables with the contents, ...).

Is there a good way of handling this, or will the file save and delete method together with load data infile be better than other solutions?

The server I'm running this on has SSDs and 32GB of RAM.

LOAD DATA INFILE is your fastest way to do low-latency ingestion of large volumes of data into MySQL.
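Since you mentioned PHP, a minimal sketch of using it from PDO with the LOCAL variant (which lets the client, rather than the MySQL server, read the file). The DSN, credentials, file path, and the `inventory_staging` table and its columns are all assumptions here:

```php
<?php
// Sketch: bulk-load a tab-separated file with LOAD DATA LOCAL INFILE via PDO.
// LOCAL reading must also be permitted server-side (local_infile = 1).
$pdo = new PDO(
    'mysql:host=localhost;dbname=inventory;charset=utf8mb4',
    'user',
    'password',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true]
);

$pdo->exec(
    "LOAD DATA LOCAL INFILE '/tmp/report.txt'
     INTO TABLE inventory_staging
     FIELDS TERMINATED BY '\t'
     LINES TERMINATED BY '\n'"
);
```

This still goes through a file on disk, but with SSDs and 32 GB of RAM the save/load/delete cycle is usually cheap; you could also write the temporary file to a tmpfs mount to keep it out of the storage layer entirely.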

You can write yourself a PHP program that will, using prepared statements and the like, do a pretty good job of inserting rows into your database. If you arrange to do a COMMIT every couple of hundred rows, and use prepared statements, and write your code carefully, it will be fairly fast, but not as fast as LOAD DATA INFILE. Why? Individual row operations have to be serialized onto the network wire, then deserialized, and processed one (or two or ten) at a time. LOAD DATA just slurps up your data locally.
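The prepared-statement approach described above might look like this; a sketch assuming the same hypothetical `$pdo` connection and `inventory_staging` table, with `$rows` being the parsed report data:

```php
<?php
// Sketch: insert rows via a prepared statement, committing every 500 rows
// so each transaction stays small while avoiding per-row commit overhead.
$stmt = $pdo->prepare(
    'INSERT INTO inventory_staging (sku, quantity, price) VALUES (?, ?, ?)'
);

$batchSize = 500;
$pdo->beginTransaction();
foreach ($rows as $i => $row) {
    $stmt->execute($row);
    if (($i + 1) % $batchSize === 0) {
        $pdo->commit();           // flush this batch
        $pdo->beginTransaction(); // start the next one
    }
}
$pdo->commit(); // commit the final, possibly partial, batch
```

Multi-row `INSERT ... VALUES (...), (...), ...` statements reduce the round-trip count further, at the cost of building larger SQL strings.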

It sounds like you have a nice MySQL server machine. But the serialization is still a bottleneck.

50K records every 30 seconds, eh? That's a lot! Is any of that data redundant? That is, do any of the rows in a later batch of data overwrite rows in an earlier batch? If so, you might be able to write a program that would skip rows that have become obsolete.
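One way to handle such overwrites is to let MySQL do the skipping for you with an upsert; a sketch assuming a UNIQUE key on a hypothetical `sku` column:

```php
<?php
// Sketch: overwrite stale rows in place instead of accumulating duplicates.
// Assumes inventory_staging has a UNIQUE index on sku.
$stmt = $pdo->prepare(
    'INSERT INTO inventory_staging (sku, quantity, price)
     VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE quantity = VALUES(quantity),
                             price    = VALUES(price)'
);
$stmt->execute(['SKU-123', 42, 9.99]);
```

With LOAD DATA INFILE, the equivalent is the REPLACE keyword (`LOAD DATA INFILE ... REPLACE INTO TABLE ...`), which swaps out rows that collide on a unique key.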

