
How to read a large file from FTP location in server

I have to read a large (say 20 MB) pipe-delimited file from an FTP location and store it into a DB. What would be the best way to do so?

Some approaches:

1. Copy the file locally, then read it and save it to the DB.
2. Read directly from the stream and save it to the DB.

Please suggest an efficient way to do this.

I think option 1 will be better.

With option 2, if an exception occurs while reading the file from the stream or while writing it to the DB, how will you know exactly where it failed?

With option 1 you can store the file in a temp directory, write it to the DB, and then delete or archive it from the temp directory.

Archiving the file on the file system is preferable - it will be useful for future reference.
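The option 1 workflow described above (temp copy → parse → batch insert → archive) can be sketched as follows. This is a minimal sketch, not a full implementation: the table `records` with columns `col1..col3`, the `Connection` coming from elsewhere, and the batch size are all assumptions you would adapt to your schema.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class PipeFileLoader {

    static final int BATCH_SIZE = 1000;

    // Split one pipe-delimited line; the -1 limit keeps trailing empty fields.
    static String[] parseLine(String line) {
        return line.split("\\|", -1);
    }

    // Read the temp copy of the file line by line and batch-insert its rows.
    // Table and column names here are placeholders.
    static void loadIntoDb(Path file, Connection conn) throws IOException, SQLException {
        String sql = "INSERT INTO records (col1, col2, col3) VALUES (?, ?, ?)";
        try (BufferedReader reader = Files.newBufferedReader(file);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            conn.setAutoCommit(false);
            String line;
            int count = 0;
            while ((line = reader.readLine()) != null) {
                String[] fields = parseLine(line);
                ps.setString(1, fields[0]);
                ps.setString(2, fields[1]);
                ps.setString(3, fields[2]);
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch();  // flush a full batch
                }
            }
            ps.executeBatch();          // flush the remainder
            conn.commit();              // only commit once everything inserted cleanly
        }
    }

    // After a successful load, move the temp copy into an archive directory.
    static void archive(Path file, Path archiveDir) throws IOException {
        Files.createDirectories(archiveDir);
        Files.move(file, archiveDir.resolve(file.getFileName()),
                   StandardCopyOption.REPLACE_EXISTING);
    }
}
```

Because the whole load runs in one transaction, a failure part-way leaves the DB unchanged and the file still in the temp directory, which addresses the "where did it fail" concern above.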

There are many examples around for both.

For downloading a file from FTP you can follow http://www.codejava.net/java-se/networking/ftp/java-ftp-file-download-tutorial-and-example
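That tutorial uses the Apache Commons Net library; if you would rather avoid an extra dependency, the JDK's built-in `ftp://` URL handler can also download a file. A rough sketch (the host, credentials, and remote path below are placeholders, and a password containing special characters would need URL-encoding):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class FtpDownload {

    // Assemble an ftp:// URL from its parts (all parameters are assumptions
    // to be replaced with the real FTP details).
    static String ftpUrl(String user, String password, String host, String remotePath) {
        return "ftp://" + user + ":" + password + "@" + host + "/" + remotePath;
    }

    // Copy the remote file to a local temp file and return its path.
    static Path download(String url) throws IOException {
        Path local = Files.createTempFile("ftp-", ".dat");
        try (InputStream in = new URL(url).openStream()) {
            Files.copy(in, local, StandardCopyOption.REPLACE_EXISTING);
        }
        return local;
    }

    public static void main(String[] args) throws IOException {
        // Placeholder credentials and host - replace with the real FTP details.
        Path file = download(ftpUrl("user", "secret", "ftp.example.com", "data/export.txt"));
        System.out.println("Downloaded to " + file);
    }
}
```

Commons Net is still the better choice when you need passive/active mode control, resuming, or directory listings; the JDK handler only covers simple retrieval.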

When it's a SQL or CSV file you can import it over SSH using the mysql client:

mysql -u root -p

set global net_buffer_length=1000000; -- set network buffer length to a large byte value

set global max_allowed_packet=1000000000; -- set maximum allowed packet size to a large byte value

SET foreign_key_checks = 0; -- disable foreign key checking to avoid delays, errors and unwanted behaviour

source file.sql -- import your SQL dump file

SET foreign_key_checks = 1; -- remember to re-enable foreign key checks when the import is complete!
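If you want to drive a similar import from Java instead of an interactive mysql session, a rough JDBC equivalent is sketched below. Note that `source` is a mysql client command with no server-side counterpart, so the dump's statements have to be executed one by one; the naive split on `;` assumed here breaks if the dump contains semicolons inside string literals, and the JDBC URL and file name are placeholders.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class SqlDumpImport {

    // Naive statement splitter: assumes no ';' occurs inside string literals.
    static List<String> splitStatements(String dump) {
        List<String> out = new ArrayList<>();
        for (String part : dump.split(";")) {
            String trimmed = part.trim();
            if (!trimmed.isEmpty()) {
                out.add(trimmed);
            }
        }
        return out;
    }

    // Mirror the interactive session: disable FK checks, run the dump, re-enable.
    static void importDump(Connection conn, Path dumpFile) throws IOException, SQLException {
        try (Statement st = conn.createStatement()) {
            st.execute("SET foreign_key_checks = 0");
            for (String sql : splitStatements(Files.readString(dumpFile))) {
                st.execute(sql);
            }
            st.execute("SET foreign_key_checks = 1");
        }
    }
}
```

The `SET GLOBAL` buffer settings from the session above need server-level privileges and are best applied once by a DBA rather than from application code.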
