
Node GET request returns a zip file; need to import the JSON into MSSQL

I am using Node to make a GET request for JSON. It is returning a zip file, and all the data is just a bunch of numbers. I can't find anything online that shows how to extract the data from the zip file so I can actually read the JSON. How do I open the zip at this point so I can put the data into MSSQL Server?

I get the response back, but I am not sure how to read it from here; it just says it's an attachment:

"content-disposition":["attachment; filename=FileName.zip"]

This is the get request I am calling in node:

require('dotenv').config();
const fetch = require('fetch-everywhere');
const base64 = require('base-64');
process.env["NODE_TLS_REJECT_UNAUTHORIZED"] = "0"; // warning: disables TLS certificate verification

const user = "Enter Username Here";
const pass = "Enter Password Here";
const headers = new Headers({
  "Authorization": `Basic ${base64.encode(`${user}:${pass}`)}`
});

fetch('Link', {
  method: 'GET',
  headers: headers,
})
  .then(function(response) {
    // nothing is parsed here, so the raw Response object is passed along
    return response;
  })
  .then(function(myJson) {
    // this stringifies the Response object itself, not the body,
    // which is why the output is unreadable
    console.log(JSON.stringify(myJson));
  }).catch(error => { throw error; });
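
(As a first step, rather than calling JSON.stringify on the Response, you can pipe the binary body straight to disk. A minimal sketch, reusing the fetch and headers setup above, and assuming fetch-everywhere delegates to node-fetch on Node, whose Response exposes a readable stream at response.body; the filename matches the content-disposition header quoted earlier:)

const fs = require('fs');

fetch('Link', { method: 'GET', headers: headers })
  .then(function(response) {
    // response.body is a readable stream in node-fetch; pipe it to a file
    const out = fs.createWriteStream('FileName.zip');
    response.body.pipe(out);
    out.on('finish', function() {
      // the complete zip is now on disk, ready to be extracted
    });
  })
  .catch(error => { throw error; });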

So the return is a zip file. Where do I go from here if I need to dump it into MSSQL Server?

Any help would be greatly appreciated!

You mentioned in an edit that the file is very large (10 GB at first, later edited to 100 MB). It would not be viable to store this file in MSSQL, as RDBMS systems are not designed to store large files. You may want to consider storing this object in Azure Storage or AWS S3, for example, and keeping only a reference to it in the database. Please read this if you need more information: https://softwareengineering.stackexchange.com/questions/150669/is-it-a-bad-practice-to-store-large-files-10-mb-in-a-database
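
(If you still need MSSQL involved, the usual pattern is to store only a pointer row, the object-storage URL plus any metadata, in the database. A minimal sketch using the mssql package; the connection settings, the LargeFiles table, and the BlobUrl column are all hypothetical:)

const sql = require('mssql');

// Insert a reference to the uploaded blob; the file itself lives in
// object storage, and only its URL is kept in MSSQL.
async function saveReference(fileUrl) {
  const pool = await sql.connect({
    user: 'dbUser',            // hypothetical credentials
    password: 'dbPassword',
    server: 'localhost',
    database: 'MyDb',
    options: { trustServerCertificate: true }
  });
  await pool.request()
    .input('url', sql.NVarChar, fileUrl)
    .query('INSERT INTO LargeFiles (BlobUrl) VALUES (@url)');
  await pool.close();
}

saveReference('https://my-bucket.s3.amazonaws.com/FileName.json')
  .catch(error => { throw error; });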

I suspect Node.js may not be the best technology for this download/extraction either, and you may want to consider a technology that can take better advantage of higher resource limits and multithreading. However, if you wish to do this in Node.js, it boils down to three steps:

  • Download the large file
  • Unzip the large file
  • Store the extracted file somewhere (like Azure or Amazon storage)

In order to download your file without running out of memory, you need to stream/pipe it. I don't know how fetch-everywhere handles this, so I have provided an example that streams the download using request instead. I also picked node-stream-zip because it has a streaming API that does not load the entire archive into memory.

I have left the "storage" part of the code empty, as I could not possibly know where you want to store the file, and MSSQL is not an option; a sketch of one storage option follows after the code.

const fs = require('fs');
const request = require('request');
const progress = require('request-progress');
const StreamZip = require('node-stream-zip');

// download the large file to disk as a stream, so it never sits in memory
const url = 'https://example.com/path/to/large/file';
const zipPath = 'generateATmpFileName.zip';
const jsonPath = 'generateATmpFileName.json';

const zipFile = fs.createWriteStream(zipPath);
progress(request(url))
    .on('error', function(error) { throw error; })
    .pipe(zipFile);

zipFile.on('finish', function() {
    // the zip has been completely written to disk, let's unzip it
    var zip = new StreamZip({
        file: zipPath,
        storeEntries: true
    });
    zip.on('ready', function() {
        // stream a single entry out of the archive without loading it all
        zip.stream('path-to-json-file-in-zip.json', function(error, zstream) {
            if (error) throw error;
            zstream.pipe(fs.createWriteStream(jsonPath));
            zstream.on('end', function() {
                zip.close();
                // file has been completely extracted, do what you want with it
            });
        });
    });
});
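
For the "storage" step left empty above, here is a minimal sketch of one option: streaming the extracted JSON to AWS S3 with the aws-sdk package. The bucket name and object key are hypothetical, and you would call this where the extraction completes:

const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3();

// s3.upload accepts a readable stream as Body and performs a managed
// (multipart) upload, so the JSON file is never loaded into memory.
s3.upload({
    Bucket: 'my-bucket-name',   // hypothetical bucket
    Key: 'FileName.json',       // hypothetical object key
    Body: fs.createReadStream(jsonPath)
}, function(error, data) {
    if (error) throw error;
    console.log('Uploaded to', data.Location);
});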
