
Best way to save large data to SQL server using Nodejs

I have a buffer of data that is approximately 320 MB. I am trying to save it into a SQL Server table from node.js.

What is the best way to do this? When I try to insert the buffer data directly, I get Cannot create a string longer than 0x1fffffe8 characters.

What can I do to save it?

This is my buffer data:

<Buffer 53 6f 75 72 63 65 53 63 68 65 6d 61 2c 43 6f 75 6e 74 72 79 4e 61 6d 65 2c 49 44 2c 41 4d 54 5f 45 58 43 4c 5f 54 41 58 2c 41 4d 54 5f 49 4e 43 4c 5f ... 340629006 more bytes>
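(As a side note for anyone reproducing this: the hex preview above decodes to a CSV header row, so the buffer appears to hold plain text rather than opaque binary data. A quick check in Node, using the bytes shown above:)

```javascript
// Decode the first 50 bytes of the buffer preview shown above.
const preview = Buffer.from(
  "536f75726365536368656d612c436f756e7472794e616d652c49442c" +
  "414d545f4558434c5f5441582c414d545f494e434c5f",
  "hex"
);

console.log(preview.toString("utf8"));
// → SourceSchema,CountryName,ID,AMT_EXCL_TAX,AMT_INCL_
```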

I am trying to save this data to the database using sequelize, as below:

const fs = require("fs");
const DBMODEL = require("../../../models/SaveBufferData");

module.exports = async (req, res) => {
  try {
    var bufdata = req.file.buffer;

    // Write the uploaded buffer to disk, then stream it back to the console
    fs.writeFile("files/uploaded_files/" + req.body.fileName, bufdata, "utf8", function (err) {
      console.log("write error:", err);
      var stream = fs.createReadStream("files/uploaded_files/" + req.body.fileName);

      // Read and display the file data on the console
      stream.on("data", function (chunk) {
        console.log(chunk);
      });
      stream.on("end", function () {
        console.log("stream ended");
      });
      stream.on("open", function () {
        console.log("File opened");
      });
    });

    // This is the call that fails for the 320 MB buffer
    var response = await DBMODEL.SaveBufferData(req.body.fileId, bufdata);
  } catch (error) {
    console.log("error:", error);
    res.status(500).json(ResponseManager(false, error.message));
  }
};

I ended up with the error below:

node:buffer:669
slice: (buf, start, end) => buf.hexSlice(start, end),
                                ^

Error: Cannot create a string longer than 0x1fffffe8 characters
    at Object.slice (node:buffer:669:37)
    at Buffer.toString (node:buffer:811:14)
    at BLOB._stringify (C:\IWA-BACKEND\PRIMS-local\PRIMS - Iwa\BACKEND\node_modules\sequelize\dist\lib\data-types.js:423:23)
    at BLOB.stringify (C:\IWA-BACKEND\PRIMS-local\PRIMS - Iwa\BACKEND\node_modules\sequelize\dist\lib\data-types.js:22:19)
    at escape (C:\IWA-BACKEND\PRIMS-local\PRIMS - Iwa\BACKEND\node_modules\sequelize\dist\lib\sql-string.js:40:48)
    at C:\IWA-BACKEND\PRIMS-local\PRIMS - Iwa\BACKEND\node_modules\sequelize\dist\lib\sql-string.js:101:14
    at String.replace (<anonymous>)
    at Object.formatNamedParameters (C:\IWA-BACKEND\PRIMS-local\PRIMS - Iwa\BACKEND\node_modules\sequelize\dist\lib\sql-string.js:96:14)
    at Object.formatNamedParameters (C:\IWA-BACKEND\PRIMS-local\PRIMS - Iwa\BACKEND\node_modules\sequelize\dist\lib\utils.js:112:20)
    at Sequelize.query (C:\IWA-BACKEND\PRIMS-local\PRIMS - Iwa\BACKEND\node_modules\sequelize\dist\lib\sequelize.js:283:21) {
  code: 'ERR_STRING_TOO_LONG'
}

Looking at the call stack, this is what I diagnose is happening:

  1. You are passing a file as a buffer (from multer, I am guessing) to the sequelize node module
  2. The sequelize node module is calling BLOB.stringify on that buffer

Step 2) is where you are probably hitting Node's upper limit on string length.
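The arithmetic backs this up. The BLOB stringifier hex-encodes the buffer (note buf.hexSlice in the trace), which produces two characters per byte, so the ~340 MB upload needs a string well past Node's hard cap. This sketch just checks the numbers reported in the error output:

```javascript
// Node.js refuses to create strings longer than 0x1fffffe8 characters.
const MAX_STRING_LENGTH = 0x1fffffe8; // 536,870,888 characters (~512 MiB)

// Buffer size from the question: 50 bytes shown + 340,629,006 more.
const bufferBytes = 340629056;

// Hex encoding emits two characters per byte.
const hexChars = bufferBytes * 2; // 681,258,112 characters

console.log(hexChars > MAX_STRING_LENGTH); // true: stringify has to throw
```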

Try the following to test this hypothesis:

Try this endpoint with a smaller file and keep incrementing the file size. If the endpoint succeeds for smaller files and fails once a certain size is reached, we can say with high probability that the file-to-buffer-to-string conversion is the issue (i.e., we are hitting the ceiling on allowed string size, imposed by either the Node process or system memory).
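If the hypothesis holds, one workaround is to stop handing sequelize the whole buffer at once. A minimal sketch, assuming the data can be appended in pieces (the AppendBufferChunk model method here is hypothetical, not from the question): split the buffer into chunks that stay far below the string cap and persist them one at a time.

```javascript
// Split a large buffer into fixed-size chunks. subarray() returns views
// into the original memory, so no extra copies of the payload are made.
const CHUNK_SIZE = 64 * 1024 * 1024; // 64 MiB per database round trip

function splitBuffer(buf, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let offset = 0; offset < buf.length; offset += chunkSize) {
    chunks.push(buf.subarray(offset, offset + chunkSize));
  }
  return chunks;
}

// Hypothetical usage inside the handler:
// for (const chunk of splitBuffer(req.file.buffer)) {
//   await DBMODEL.AppendBufferChunk(req.body.fileId, chunk);
// }
```

Separately, it may be worth checking whether the insert goes through sequelize replacements (which are escaped into the SQL string, triggering the stringify call in the trace) rather than bind parameters, which are passed to the driver outside the SQL text and so never require building a 680-million-character string.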
