
Node.js + Socket.IO on Heroku - File Downloading

As my first Node.js project, I've been building a reporting app for my work where people can search and then download the results to their computer as a CSV.

To accomplish this, I've been using Socket.IO to pass the JSON data back to my application.js file on a button click event. From there I use the json2csv module to format the data.

Here's where I run into issues...

  1. I know Heroku uses Ephemeral File Storage (which should be fine since I only need the file to be on the server for the session anyway and the added cleanup is nice), however my file exists check comes back positive even though I can't see the file when I run

     heroku run bash
     ls
  2. Since I'm using Socket.IO, the normal request and response callback parameters aren't available (as far as I can tell, anyhow). Can I set the headers for the CSV using data.setHeader() (where data is the socket callback argument) instead of response.setHeader() ? Or do I need to break that event listener out of the sockets and run it directly from app.get ?

    Here's the code I have that takes the JSON data from the event and formats it based on my searches:

     socket.on('export', function (data) {
         jsoncsv({data: data, fields: ['foo', 'bar'], fieldNames: ['Foo', 'Bar']}, function (err, csv) {
             if (err) console.log(err);
             fs.writeFile('file.csv', csv, function (err) {
                 if (err) console.log(err);
                 console.log('File Created');
             });
             fs.exists('file.csv', function (err) {
                 if (err) console.log(err);
                 console.log('File Exists, Starting Download...');
                 var file = fs.createReadStream('file.csv');
                 file.pipe(data);
                 console.log('File Downloaded');
             });
         });
     });
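For comparison, here is a minimal self-contained sketch of what the same export looks like as a plain HTTP route handler, where a real response object (and thus res.setHeader) is available. toCsv is a hand-rolled stand-in for json2csv, and handleExport is a hypothetical handler, not code from the app above:

```javascript
// Hypothetical sketch: build the CSV in memory and send it straight to the
// response, so no temp file ever touches the dyno's filesystem.
function toCsv(rows, fields, fieldNames) {
    var lines = [fieldNames.join(',')];
    rows.forEach(function (row) {
        lines.push(fields.map(function (f) { return String(row[f]); }).join(','));
    });
    return lines.join('\n');
}

function handleExport(req, res, rows) {
    var csv = toCsv(rows, ['foo', 'bar'], ['Foo', 'Bar']);
    res.setHeader('Content-Type', 'text/csv');
    res.setHeader('Content-Disposition', 'attachment; filename="file.csv"');
    res.end(csv); // streamed straight to the client; nothing written to disk
}
```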

UPDATE

Here's the actual client-side code I'm using to build and send the JSON as an event. The export event itself is emitted from the $('#export').on('click', function () {}); handler.

server.on('listTags', function (data) {
    var from = new Date($('#from').val()), to = new Date($('#to').val()), csvData = [];
    var table = $('<table></table>');
    $('#data').empty().append(table);
    table.append('<tr>'+
                    '<th>Id</th>' +
                    '<th>First Name</th>' +
                    '<th>Last Name</th>' +
                    '<th>Email</th>' +
                    '<th>Date Tag Applied</th>' +
                  '</tr>');
    $.each(data, function(i, val) {
        var dateCreated = new Date(data[i]['DateCreated']);
        if (dateCreated >= from && dateCreated <= to) {
            data[i]['DateCreated'] = dateCreated.toLocaleString();
            var tableRow = 
            '<tr>' +
                '<td>' + data[i]['ContactId'] + '</td>' +
                '<td>' + data[i]['Contact.FirstName'] + '</td>' +
                '<td>' + data[i]['Contact.LastName'] + '</td>' +
                '<td>' + data[i]['Contact.Email'] + '</td>' +
                '<td>' + data[i]['DateCreated'] + '</td>' +
            '</tr>';
            table.append(tableRow);
            csvData.push(data[i]);
        }
    });
    $('.controls').html('<p><button id="export">Export '+ csvData.length +' Records</button></p>');
    $('#export').on('click', function () {
        server.emit('export', csvData);
    });
});
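The date filter inside the loop above boils down to the following (filterByDate is a hypothetical helper, shown only for clarity):

```javascript
// Keep only the rows whose DateCreated falls inside the [from, to] range,
// mirroring the `dateCreated >= from && dateCreated <= to` check above.
function filterByDate(rows, from, to) {
    return rows.filter(function (row) {
        var d = new Date(row.DateCreated);
        return d >= from && d <= to;
    });
}
```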

As you point out yourself, Heroku's file system can be a bit tricky. I can help with your question (1): you are not connected to the same virtual machine (dyno) that your application is running on. When you run heroku run bash you get a fresh one-off dyno with a clean copy of the filesystem containing the files your app needs to run, and it runs your run command (as opposed to the web process you have specified in your Procfile).

This makes sense when you consider that one of the advantages of using Heroku is that you can easily scale from 1 dyno to several when needed. You would still expect heroku run bash to work the same way when you have 10 web dynos running your code. Which one should you then be connected to? :)

See https://devcenter.heroku.com/articles/one-off-dynos#an-example-one-off-dyno for some more details.

Hope this is helpful. Best of luck!

/Wille

So instead of using socket.io, we are just going to use an HTTP server. I have a lot of code for you, as it is a partially stripped-down version of my own HTTP server, and it should of course also serve files (e.g. your HTML, CSS and JS files).

var http = require('http'),
    url = require('url'),
    fs = require('fs'),
    path = require('path'),
    zlib = require('zlib'), // needed by compressSend below
    stream = require('stream'),
    jsoncsv = require('json2csv'); // the module the question already uses
var server = http.createServer(function (req, res) {
    var location = path.join(__dirname, url.parse(req.url).pathname),
        ext = location.split('.').slice(-1)[0];
    if (req.headers['request-action']&&req.headers['request-action'] == 'export'&&req.headers['request-data']) { //this is your export event
        jsoncsv({data: JSON.parse(req.headers['request-data']), fields: ['foo', 'bar'], fieldNames: ['Foo', 'Bar']}, function(err, csv) { // the header arrives as a string, so parse it back into objects
            if (err){
                console.log(err);
                res.writeHead(500, {'content-type':'text/plain'}); // a server-side failure, so 500 rather than 404
                res.end('Error at jsoncsv function: ' + err);
                return;
            }
            res.setHeader('content-type', 'text/csv');
            var pass = new stream.PassThrough(); // a pass-through stream we can both write to and pipe from
            compressSend(req, res, pass); //this is the equivalent of pass.pipe(res), but with an encoding system in between to compress the data
            pass.end(csv, 'utf8', function(){
                console.log('done writing csv to stream');
            });
        });
    } else {//here we handle normal files
        fs.lstat(location, function(err, info){
            if(err||info.isDirectory()){
                res.writeHead(404, {'content-type':'text/plain'});
                res.end('404 file not found');
                console.log('File '+location+' not found');
                return;
            }
            //yay, the file exists
            var reader = fs.createReadStream(location); // this creates a read stream from a normal file
            reader.on('error', function(err){
                console.log('Something strange happened while reading: ', err);
                res.writeHead(404, {'content-type':'text/plain'});
                res.end('Something strange happened while reading');
            });
            reader.on('open', function(){
                res.setHeader('Content-Type', getHeader(ext)); //of course we should send the correct header for normal files too
                res.setHeader('Content-Length', info.size); //this sends the size of the file in bytes
                //the reader is now streaming the data
                compressSend(req, res, reader); //this function checks whether the receiver (the browser) supports a compressed encoding and encodes the response with it. Saves bandwidth
            });
            res.on('close', function(){
                if (reader.fd) //stop reading the file if the connection is interrupted before the whole file has been streamed
                    reader.destroy();
            });
        });
    }
}).listen(80);
function compressSend(req, res, input){
    var acceptEncoding = req.headers['accept-encoding']; // Node lowercases incoming header names
    if (!acceptEncoding){
        res.writeHead(200, {});
        input.pipe(res);
    } else if (acceptEncoding.match(/\bgzip\b/)) {
        res.writeHead(200, { 'Content-Encoding': 'gzip' });
        input.pipe(zlib.createGzip()).pipe(res);
    } else if (acceptEncoding.match(/\bdeflate\b/)) {
        res.writeHead(200, { 'Content-Encoding': 'deflate' });
        input.pipe(zlib.createDeflate()).pipe(res);
    } else {
        res.writeHead(200, {});
        input.pipe(res);
    }
}
function getHeader(ext){
    var header;
    ext = ext.toLowerCase();
    switch(ext) {
        case 'js': header = 'text/javascript'; break;
        case 'html': header = 'text/html'; break;
        case 'css': header = 'text/css'; break;
        case 'xml': header = 'text/xml'; break;
        default: header = 'text/plain'; break;
    }
    return header;
}
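The negotiation that compressSend performs can be distilled into a pure function (pickEncoding is a hypothetical helper, shown only to make the branching explicit):

```javascript
// Choose a response encoding from the Accept-Encoding request header,
// using the same regex checks as compressSend above.
function pickEncoding(acceptEncoding) {
    if (!acceptEncoding) return 'identity';
    if (/\bgzip\b/.test(acceptEncoding)) return 'gzip';
    if (/\bdeflate\b/.test(acceptEncoding)) return 'deflate';
    return 'identity';
}
```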

The top part is the interesting bit for you, especially the inside of the first if. There it checks whether the header request-action is present. This header carries your event name (like export ). The header request-data contains the data you would otherwise send over the socket. You will probably also want to know how to manage this on the client side:

$('#export').on('click', function () {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/'); // request the same origin the page was served from
    xhr.setRequestHeader('request-action', 'export'); //here we set that 'event' header, so the server knows what it should do
    xhr.setRequestHeader('request-data', JSON.stringify(csvData)); //headers carry strings, so serialize the data that has to be sent to the server
    xhr.send();
    xhr.onloadend = function(){//we got all the data back from the server
        var file = new Blob([xhr.response], {type: 'text/csv'}); //xhr.response contains the data. A blob wants the data in array format so that is why we put it in the brackets
        //now the download part is tricky. We first create an object URL that refers to the blob:
        var url = URL.createObjectURL(file);
        //if we just set the window.location to this url, it downloads the file with the url as name. we do not want that, so we use a nice trick:
        var a = document.createElement('a');
        a.href = url;
        a.download = 'generatedCSVFile.csv'; //this does the naming trick
        a.click(); //simulate a click to download the file
    }
});

I have tried to add comments on the crucial parts. Because I do not know your current knowledge level, I have not added a comment on every part of the system, but if anything is unclear feel free to ask.
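One detail worth spelling out: since HTTP headers can only carry strings, the client JSON.stringifys the rows and the server has to parse them back before handing them to jsoncsv. A minimal sketch of that server-side step (parseRequestData is a hypothetical helper, not part of the code above):

```javascript
// Parse the request-data header back into an array of row objects,
// returning null instead of throwing on a missing or malformed header.
function parseRequestData(headerValue) {
    try {
        return JSON.parse(headerValue);
    } catch (e) {
        return null;
    }
}
```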
