
How to improve the processing of a large CSV file and insert the data into a remote database (MySQL) in Node.js?

This is a working example. I am looking for ways to improve it further and get a performance boost at the code level, if not to look at other scalability options.

I need to process data from a CSV file that can be as large as 1 million rows.

After each row is processed, values in the database must be checked for some validation, and the processed data is then inserted into a different table.

I started with a simple Node stream and the async library, but the speed was very low: about 2k rows in 3-5 minutes, and it sometimes threw a Node.js out-of-memory error.

After some iterations and research I got the speed up to 5k rows in 20-25 seconds and avoided the memory errors entirely.

The bottleneck is that the time does not change even when I change the batch size. I added pooling for the database and even increased the pool size.

I can increase the pool size to improve speed, but I need to know how to decide the maximum pool size, and whether pool connections are counted per connection or overall, given that I cannot change MySQL's default connection limit because I do not have admin access.
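
For reference, one way to sanity-check the server-side ceiling without admin access (this helper is only a sketch, not part of the code below) is to read max_connections through the existing pool and keep connectionLimit well under it:

import pool from './dbConfig';

// Sketch: read the server-side connection ceiling. SHOW VARIABLES does not
// require admin rights, so it works even without access to the MySQL config.
const showMaxConnections = () =>
  new Promise<number>((resolve, reject) => {
    pool.query(`SHOW VARIABLES LIKE 'max_connections'`, (error, results) => {
      if (error) return reject(error);
      resolve(Number(results[0].Value));
    });
  });

showMaxConnections().then((max) =>
  console.log(`Server allows at most ${max} connections; keep connectionLimit below this`)
);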

Does adding child processes make any difference, given that Node.js itself already calls dataForProcessing asynchronously using all the available pool connections?

Is there any way to improve this speed?

Here is the code:

insertBig.ts

import { Request, Response } from 'express';
import * as fs from 'fs';
import * as path from 'path';
import { getProductById, insertProduct } from '../repo';
import { performance } from 'perf_hooks';

import Papa from 'papaparse';

let processedNum = 0;

const insertBig = async (req: Request, res: Response) => {
  try {
    const filePath = path.normalize(`${__dirname}./../assets/test.csv`);
    importCSV(filePath).catch((error) => {
      return res.status(500).send('Some error here in isert biG 123');
    });
    return res.status(200).json({ data: 'Data send for processing 123' });
  } catch (error) {
    return res.status(500).send('Some error here in isert biG 123');
  }
};

async function importCSV(filePath: fs.PathLike) {
  let parsedNum = 0;
  const dataStream = fs.createReadStream(filePath);
  const parseStream = Papa.parse(Papa.NODE_STREAM_INPUT, {
    header: true
  });
  dataStream.pipe(parseStream);
  let buffer = [];
  let totalTime = 0;
  const startTime = performance.now();
  for await (const row of parseStream) {
    //console.log('PA#', parsedNum, ': parsed');
    buffer.push(row);
    parsedNum++;
    if (parsedNum % 400 == 0) {
      await dataForProcessing(buffer);
      buffer = [];
    }
  }
  // Flush any rows left over after the last full batch of 400.
  if (buffer.length > 0) {
    await dataForProcessing(buffer);
  }
  totalTime = totalTime + (performance.now() - startTime);
  console.log(`Parsed ${parsedNum} rows and took ${totalTime} milliseconds`);
}

const wrapTask = async (promise: any) => {
  try {
    return await promise;
  } catch (e) {
    return e;
  }
};

const handle = async (promise: Promise<any>) => {
  try {
    const data = await promise;
    return [data, undefined];
  } catch (error) {
    return await Promise.resolve([undefined, error]);
  }
};

const dataForProcessing = async (arrayItems: any) => {
  const tasks = arrayItems.map(task);
  const startTime = performance.now();
  console.log(`Tasks starting...`);
  console.log('DW#', processedNum, ': dirty work START');
  try {
    await Promise.all(tasks.map(wrapTask));

    console.log(
      `Task finished in ${performance.now() - startTime} milliseconds with,`
    );
    processedNum++;
  } catch (e) {
    console.log('should not happen but we never know', e);
  }
};

const task = async (item: any) => {
  let table = 'Product';

  if (item.contactNumber == '9999999999') {
    table = 'random table'; // to create read error
  }
  if (item.contactNumber == '11111111111') {
    item.randomRow = 'random'; // to create insert error
  }

  // To add some read process
  const [data, readError] = await handle(getProductById(2, table));
  if (readError) {
    return 'Some error in read of table';
  }
  //console.log(JSON.parse(JSON.stringify(data))[0]['customerName']);
  data;
  // To add some write process
  const [insertId, insertErr] = await handle(insertProduct(item));
  if (insertErr) {
    return `Some error to log and continue process for ${item}`;
  }
  return `Done for ${insertId}`;
};

export { insertBig };

repo.ts

import pool from './dbConfig';

const getProducts = (table: any) => {
  return new Promise((resolve, reject) => {
    pool.query(`SELECT * FROM ${table}`, (error, results) => {
      if (error) {
        return reject(error);
      }
      resolve(results);
    });
  });
};
const getProductById = (id: any, table: any) => {
  return new Promise((resolve, reject) => {
    pool.query(`SELECT * FROM ${table} WHERE id = ${id}`, (error, results) => {
      if (error) {
        return reject(error);
      }
      resolve(results);
    });
  });
};

const insertProduct = (data: any) => {
  return new Promise((resolve, reject) => {
    pool.query(
      `INSERT INTO Product SET ?`,
      [
        {
          ...data
        }
      ],
      (error, results) => {
        if (error) {
          return reject(error);
        }
        resolve(results);
      }
    );
  });
};

export { insertProduct, getProducts, getProductById };

dbConfig.ts

import mysql from 'mysql';
import * as dotenv from 'dotenv';
dotenv.config();

const dbConn = {
  connectionLimit: 80,
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME
};

const pool = mysql.createPool(dbConn);

pool.getConnection((err, connection) => {
  if (err) {
    if (err.code === 'PROTOCOL_CONNECTION_LOST') {
      console.error('Database connection was closed.');
    }
    if (err.code === 'ER_CON_COUNT_ERROR') {
      console.error('Database has too many connections');
    }
    if (err.code === 'ECONNREFUSED') {
      console.error('Database connection was refused');
    }
    return;
  }

  if (connection) {
    connection.release();
  }
  console.log('DB pool is Connected');
  return;
});

// pool.query = promisify(pool.query);

export default pool;

Seed data

import { OkPacket } from 'mysql';
import pool from './dbConfig';

const seed = () => {
  const queryString = `CREATE TABLE IF NOT EXISTS Product (
  id int(11) NOT NULL,
  customerName varchar(100) DEFAULT NULL,
  contactNumber varchar(100) DEFAULT NULL,
  modelName varchar(255) NOT NULL,
  retailerName varchar(100) NOT NULL,
  dateOfPurchase varchar(100) NOT NULL,
  voucherCode varchar(100) NOT NULL,
  voucherValue int(10) DEFAULT NULL,
  surveyUrl varchar(255) NOT NULL,
  surveyId varchar(255) NOT NULL,
  createdAt timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
  updatedAt timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;`;

  pool.query(queryString, (err, result) => {
    if (err) {
      return console.log(err);
    }

    const insertId = (<OkPacket>result).insertId;
    console.log(insertId);
  });
};

export default seed;

Indexing can be added to speed up the MySQL reads.
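
The seed table above has no index on id, so every getProductById lookup scans the whole table. A one-off statement like the following (just a sketch; the index name is made up) would add one:

import pool from './dbConfig';

// One-off migration sketch: index the column used by getProductById so the
// validation reads stop doing full table scans. Assumes the index does not exist yet.
pool.query(`CREATE INDEX idx_product_id ON Product (id)`, (error) => {
  if (error) return console.error('Index creation failed', error);
  console.log('Index idx_product_id created');
});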

As per one of the comments: I split the file into 8 parts; it actually reduced the processing speed and seems to make no real difference in terms of performance gain.

child.ts

import * as fs from 'fs';
import * as path from 'path';
import { getProductById, insertProduct } from '../repo';
import { performance } from 'perf_hooks';

import Papa from 'papaparse';

let processedNum = 0;

process.on('message', async function (message: any) {
  console.log('[child] received message from server:', message);
  JSON.stringify(process.argv);
  const filePath = path.normalize(
    `${__dirname}./../output/output.csv.${message}`
  );

  let time = await importCSV(filePath, message);
  if (process.send) {
    process.send({
      child: process.pid,
      result: message + 1,
      time: time
    });
  }

  process.disconnect();
});

async function importCSV(filePath: fs.PathLike, message: any) {
  let parsedNum = 0;
  const dataStream = fs.createReadStream(filePath);
  const parseStream = Papa.parse(Papa.NODE_STREAM_INPUT, {
    header: true
  });
  dataStream.pipe(parseStream);
  let buffer = [];
  let totalTime = 0;
  const startTime = performance.now();
  for await (const row of parseStream) {
    // console.log('Child Server # :', message, 'PA#', parsedNum, ': parsed');
    buffer.push(row);
    parsedNum++;
    if (parsedNum % 400 == 0) {
      await dataForProcessing(buffer, message);
      buffer = [];
    }
  }

  // Flush any rows left over after the last full batch of 400.
  if (buffer.length > 0) {
    await dataForProcessing(buffer, message);
  }

  totalTime = totalTime + (performance.now() - startTime);
  //   console.log(
  //     `Child Server ${message} : Parsed ${parsedNum} rows and took ${totalTime} seconds`
  //   );
  return totalTime;
}

const wrapTask = async (promise: any) => {
  try {
    return await promise;
  } catch (e) {
    return e;
  }
};

const handle = async (promise: Promise<any>) => {
  try {
    const data = await promise;
    return [data, undefined];
  } catch (error) {
    return await Promise.resolve([undefined, error]);
  }
};

const dataForProcessing = async (arrayItems: any, message: any) => {
  const tasks = arrayItems.map(task);
  const startTime = performance.now();
  console.log(`Tasks starting... from server ${message}`);
  console.log('CS#: ', message, 'DW#:', processedNum, ': dirty work START');
  try {
    await Promise.all(tasks.map(wrapTask));

    console.log(
      `Task finished in ${performance.now() - startTime} milliseconds with,`
    );
    processedNum++;
  } catch (e) {
    console.log('should not happen but we never know', e);
  }
};

const task = async (item: any) => {
  let table = 'Product';

  if (item.contactNumber == '8800210524') {
    table = 'random table'; // to create read error
  }
  if (item.contactNumber == '9134743017') {
    item.randomRow = 'random'; // to create insert error
  }

  // To add some read process
  const [data, readError] = await handle(getProductById(2, table));
  if (readError) {
    return 'Some error in read of table';
  }
  //console.log(JSON.parse(JSON.stringify(data))[0]['customerName']);
  data;
  // To add some write process
  const [insertId, insertErr] = await handle(insertProduct(item));
  if (insertErr) {
    return `Some error to log and continue process for ${item}`;
  }
  return `Done for ${insertId}`;
};

parent.ts

import { Request, Response } from 'express';
var child_process = require('child_process');

const insertBigChildProcess = async (req: Request, res: Response) => {
  try {
    var numchild = require('os').cpus().length;
    var done = 0;
    let totalProcessTime: any[] = [];
    for (var i = 1; i <= numchild; i++) {
      const child = child_process.fork(__dirname + '/child.ts');
      child.send(i);
      child.on('message', function (message: any) {
        console.log('[parent] received message from child:', message);
        totalProcessTime.push(message.time);
        const sum = totalProcessTime.reduce(
          (partial_sum, a) => partial_sum + a,
          0
        );
        console.log(sum); // 6
        console.log(totalProcessTime);
        done++;
        if (done === numchild) {
          console.log('[parent] received all results');
        }
      });
    }

  
    return res.status(200).json({ data: 'Data send for processing 123' });
  } catch (error) {
    return res.status(500).send('Some error here in isert biG 123');
  }
};

After each row is processed, values in the database must be checked for some validation, and the processed data is then inserted into a different table.

Can this validation be done from a cache? It could improve performance, because when we are talking about millions of rows, validating every single row takes a long time.
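
The answer does not include code, but a minimal sketch of what validating from a cache could look like (the Map-based cache and the way it wraps getProductById are my own assumptions) is:

import { getProductById } from '../repo';

// Cache validation lookups in memory so each distinct id hits MySQL only once
// per import run instead of once per CSV row. In-flight lookups are shared too,
// because the Promise itself is what gets cached.
const validationCache = new Map<string, Promise<any>>();

const getProductCached = (id: number, table: string) => {
  const key = `${table}:${id}`;
  const cached = validationCache.get(key);
  if (cached) return cached;
  const lookup = getProductById(id, table);
  validationCache.set(key, lookup);
  return lookup;
};

// Inside task(), getProductById(2, table) could then be replaced with
// getProductCached(2, table).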

You can split the CSV file into smaller CSV files and then process those files in parallel. You can use a package for that.
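
The answer does not name the package; a dependency-free sketch of the split step (the file names, chunk count, and the assumption that rows contain no quoted line breaks are all mine) could look like this:

import * as fs from 'fs';
import * as path from 'path';
import * as readline from 'readline';

// Split one large CSV into `parts` chunk files (output.csv.1 ... output.csv.N),
// repeating the header in every chunk so each child process can parse it alone.
async function splitCSV(inputPath: string, outDir: string, parts: number) {
  const rl = readline.createInterface({
    input: fs.createReadStream(inputPath),
    crlfDelay: Infinity
  });

  const writers: fs.WriteStream[] = [];
  let header = '';
  let rowIndex = 0;

  for await (const line of rl) {
    if (!header) {
      header = line;
      for (let i = 1; i <= parts; i++) {
        const out = fs.createWriteStream(path.join(outDir, `output.csv.${i}`));
        out.write(header + '\n');
        writers.push(out);
      }
      continue;
    }
    // Round-robin the data rows across the chunk files.
    writers[rowIndex % parts].write(line + '\n');
    rowIndex++;
  }
  writers.forEach((w) => w.end());
}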

Once you have done that, use child processes to run these files in parallel.

// parent.js
var child_process = require('child_process');

var numchild  = require('os').cpus().length;
var done      = 0;

for (var i = 0; i < numchild; i++){
  var child = child_process.fork('./child');
  child.send((i + 1) * 1000);
  child.on('message', function(message) {
    console.log('[parent] received message from child:', message);
    done++;
    if (done === numchild) {
      console.log('[parent] received all results');
      ...
    }
  });
}

// child.js
process.on('message', function(message) {
  console.log('[child] received message from server:', message);
  setTimeout(function() {
    process.send({
      child   : process.pid,
      result  : message + 1
    });
    process.disconnect();
  }, (0.5 + Math.random()) * 5000);
});

Copied from this thread. You can give it a try and see how much time it takes now.

I approach a task like this from the angle of "can I do it all in SQL?" That is likely to be the most performant. I would pick between two ways of doing the "validation" and "processing":

During LOAD DATA

    LOAD DATA ...
        ( ... @a, ..., @b, ...)
        SET cola = ... @a ...,
            colb = ... @b ...

Explanation:

  • As the rows are read, some columns are put into @variables.
  • Those variables are then used in expressions/functions to compute the desired values for the actual columns.
  • Note that this is one way to "ignore" a column (by not using it in the SET) or to combine columns.
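
As a concrete sketch of that pattern against the Product table from the question (the column order, the clean-up rules, and the assumption that both server and client permit LOAD DATA LOCAL INFILE are mine), it could be driven through the existing pool:

import pool from './dbConfig';

// Bulk-load the CSV and do the per-row transformation inside MySQL.
// @contact and @voucher are read into variables and cleaned up in SET,
// following the pattern above; createdAt/updatedAt fall back to their defaults.
const loadCsvDirectly = (filePath: string) =>
  new Promise((resolve, reject) => {
    const sql = `
      LOAD DATA LOCAL INFILE ?
      INTO TABLE Product
      FIELDS TERMINATED BY ','
      IGNORE 1 LINES
      (id, customerName, @contact, modelName, retailerName,
       dateOfPurchase, voucherCode, @voucher, surveyUrl, surveyId)
      SET contactNumber = TRIM(@contact),
          voucherValue  = NULLIF(@voucher, '')`;
    pool.query(sql, [filePath], (error, results) => {
      if (error) return reject(error);
      resolve(results);
    });
  });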

After LOADing

Run UPDATE statements for the bulk post-processing. That is likely to be much faster than fixing one row at a time.
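
For that second option, a single set-based statement (the clean-up rules below are only illustrative, not from the answer) replaces the per-row fix-ups:

import pool from './dbConfig';

// Post-load clean-up in one statement instead of one query per imported row.
pool.query(
  `UPDATE Product
      SET contactNumber = TRIM(contactNumber),
          modelName     = UPPER(modelName)
    WHERE createdAt >= CURDATE()`,
  (error, results) => {
    if (error) return console.error('Bulk post-processing failed', error);
    console.log(`Post-processed ${results.changedRows} rows`);
  }
);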
