
How to insert records faster

I have to read records from a CSV file and store them in a MySQL database.

I know about "LOAD DATA INFILE", but in my case I have to take a single record from the file, check whether it has a valid format/length etc., and then store it in the database.

// list to store records from CSV file
ArrayList<String> list = new ArrayList<String>();

//Read one line at a time
while ((nextLine = reader.readNext()) != null) 
{
   for (String number : nextLine) 
   {
      if (number.length() > 12 && number.startsWith("88"))
      {        
         list.add(number);
      } else if (number.length() > 9 && number.startsWith("54")) 
      {
         list.add(number);
      }
      else if (number.length() > 8 && number.startsWith("99"))
      {
         list.add(number);
      }
      else
      {
        // ....
      }

      // method to insert data in database
      insertInToDatabase(list);                     
   }
}

and the method to insert records into the DB (taken from here):

private void insertInToDatabase(ArrayList<String> list) 
{
    String query = "INSERT INTO mytable(numbers) VALUES(?)";

    try
    {
        prepStm = conn.prepareStatement(query);

        for (String test : list) 
        {
            prepStm.setString(1, test);

            prepStm.addBatch(); // add to batch
            prepStm.clearParameters();
        }

        prepStm.executeBatch();
    }
    catch (SQLException e)
    {
        e.printStackTrace();
    }
}

This is working, but the rate at which the records are inserted is very slow. Is there any way I can insert records faster?

You would need to use "rewriteBatchedStatements", as that is a MySQL optimization which attempts to reduce round trips to the server by consolidating the inserts or updates into as few packets as possible.

Please refer to: https://anonymousbi.wordpress.com/2014/02/11/increase-mysql-output-to-80k-rowssecond-in-pentaho-data-integration/

Also, there are other optimizations in that article. Hope this speeds up the batching.

EDIT 1: There is a lucid explanation of this parameter on this site as well; refer to: MySQL and JDBC with rewriteBatchedStatements=true
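For reference, a minimal sketch of how that flag might be enabled with MySQL Connector/J; the URL, credentials and class name below are placeholders, not taken from the question:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInsertExample
{
    public static void insertNumbers(List<String> numbers) throws SQLException
    {
        // rewriteBatchedStatements=true lets Connector/J rewrite the batch
        // into multi-row INSERTs instead of sending one statement per row.
        // Host, database, user and password are placeholders.
        String url = "jdbc:mysql://localhost:3306/mydb?rewriteBatchedStatements=true";

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO mytable(numbers) VALUES (?)"))
        {
            conn.setAutoCommit(false); // one commit for the whole batch

            for (String number : numbers)
            {
                ps.setString(1, number);
                ps.addBatch();
            }

            ps.executeBatch();
            conn.commit();
        }
    }
}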

I think the better approach is to process the CSV file with the rules you have defined and create another CSV out of it; once the output CSV is prepared, do LOAD DATA INFILE.

It'll be pretty quick.
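As an illustration only, here is a rough sketch of that approach, assuming a one-column CSV and MySQL Connector/J; the file names, URL and credentials are made up, and allowLoadLocalInfile=true is needed for LOAD DATA LOCAL INFILE with recent connector versions:

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CsvPreprocessAndLoad
{
    public static void main(String[] args) throws Exception
    {
        Path input = Paths.get("numbers.csv");        // original file (placeholder name)
        Path output = Paths.get("numbers_clean.csv"); // validated records only

        // Apply the validation rules from the question and write a clean CSV.
        try (BufferedReader reader = Files.newBufferedReader(input);
             BufferedWriter writer = Files.newBufferedWriter(output))
        {
            String line;
            while ((line = reader.readLine()) != null)
            {
                for (String number : line.split(","))
                {
                    number = number.trim();
                    if ((number.length() > 12 && number.startsWith("88"))
                            || (number.length() > 9 && number.startsWith("54"))
                            || (number.length() > 8 && number.startsWith("99")))
                    {
                        writer.write(number);
                        writer.newLine();
                    }
                }
            }
        }

        // Bulk-load the clean file in a single statement.
        String url = "jdbc:mysql://localhost:3306/mydb?allowLoadLocalInfile=true";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement())
        {
            stmt.execute("LOAD DATA LOCAL INFILE '" + output.toAbsolutePath()
                    + "' INTO TABLE mytable (numbers)");
        }
    }
}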

If you want to insert through your own application, create a batch query like this and execute it against the MySQL server.

String query = "INSERT INTO mytable(numbers) "
             + "VALUES (0), "
             + "       (1), "
             + "       (2), "
             + "       (3)";

@Khanna111's answer is good.

I don't know if it helps, but try checking the table engine type. I once encountered a problem in which records were inserting very slowly; I changed the engine from InnoDB to MyISAM and insertion became very fast.
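If you want to check or change the engine from JDBC, something like the following sketch could work (the table name is a placeholder); note that MyISAM gives up InnoDB's transactions and crash recovery, so the trade-off is yours to weigh:

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class TableEngineCheck
{
    public static void checkAndSwitch(Connection conn) throws SQLException
    {
        try (Statement stmt = conn.createStatement())
        {
            // Report the current storage engine of the table.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT ENGINE FROM information_schema.TABLES "
                    + "WHERE TABLE_SCHEMA = DATABASE() AND TABLE_NAME = 'mytable'"))
            {
                if (rs.next())
                {
                    System.out.println("Current engine: " + rs.getString(1));
                }
            }

            // Switch the engine (this rewrites the table, so run it once, offline).
            stmt.execute("ALTER TABLE mytable ENGINE = MyISAM");
        }
    }
}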
