Node.js Express send huge data to client vanilla JS

In my application I read a huge amount of image data and send it all to the client:

const imagesPaths = await getFolderImagesRecursive(req.body.rootPath);
// fs here is the promise-based API (require('fs/promises'));
// passing an encoding makes readFile resolve to a base64 string
const dataToReturn = await Promise.all(imagesPaths.map(async (imagePath) => {
    const imageB64 = await fs.readFile(imagePath, 'base64');

    return {
        filename: imagePath,
        imageData: imageB64,
    };
}));

return res.status(200).send({
    success: true,
    message: 'Successfully retrieved folder images data',
    data: dataToReturn,
});

Here is the client side:

const getFolderImages = (rootPath) => {
    return fetch('api/getFolderImages', {
        method: 'POST',
        headers: { 'Content-type': 'application/json' },
        body: JSON.stringify({ rootPath }),
    });
};

const getFolderImagesServerResponse = await getFolderImages(rootPath);
const getFolderImagesServerData = await getFolderImagesServerResponse.json();

When I send the data, the request fails because of its size. Sending the data with a plain res.send(<data>) is impossible. So how can I bypass this limitation, and how should I receive the data on the client side with the new approach?

The answer to your problem requires some reading:

Link to the solution

One thing you probably haven't taken full advantage of before is that a webserver's HTTP response is a stream by default.

Frameworks just make it easier for you to pass in synchronous data, which is split into chunks under the hood and sent as HTTP packets.

We are talking about huge files here; naturally, we don't want them stored entirely in memory, at least not the whole blob at once. The excellent solution for this dilemma is a stream.

We create a read stream with the help of the built-in Node package 'fs', then pass it to the stream-compatible response.send parameter.

const fs = require('fs');

const readStream = fs.createReadStream('example.png');
// Fastify's reply.send accepts a readable stream and pipes it to the client
return response.headers({
  'Content-Type': 'image/png',
  'Content-Disposition': 'attachment; filename="example.png"',
}).send(readStream);

I used the Fastify webserver here, but it should work similarly with Koa or Express.

There are two more configurations here: the 'Content-Type' and 'Content-Disposition' headers.

The first one indicates the type of blob we are sending chunk-by-chunk, so the frontend can automatically associate the right file extension with it.

The latter tells the browser that we are sending an attachment, not something renderable like an HTML page or a script. This triggers the browser's download functionality, which is widely supported. The filename parameter is the download name of the content.

Here we are; we accomplished minimal memory stress, minimal coding, and minimal error opportunities.

One thing we haven't mentioned yet is authentication.

Because the frontend won't send an Ajax request, we can't expect a JWT auth header to be present on the request.

Here we will take the good old cookie auth approach. Cookies are attached automatically to every request that matches the criteria defined by the cookie options. More on this in the frontend implementation part.
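To illustrate what those cookie options look like, here is a sketch of the Set-Cookie header a server might issue at login time. The cookie name and value mirror the ones checked later in this answer; the options themselves are assumptions, not part of the original answer: HttpOnly hides the cookie from page scripts, Path scopes which requests it rides along with, and SameSite limits cross-site sending.

```javascript
// Hypothetical helper that builds the Set-Cookie header value at login.
// The option list is an illustrative assumption, not the answer's config.
function buildAuthCookie(value) {
  return `auth=${encodeURIComponent(value)}; HttpOnly; Path=/; SameSite=Strict`;
}

console.log(buildAuthCookie('authenticated'));
// → auth=authenticated; HttpOnly; Path=/; SameSite=Strict
```

In Fastify or Express this string would normally be produced by the cookie plugin rather than built by hand.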

By default, cookies arrive as semicolon-separated key-value pairs in a single string. To simplify the parsing, we will use Fastify's Cookieparser plugin.
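To show what that semicolon-separated format amounts to, here is a minimal hand-rolled parser, roughly what such a plugin does under the hood (the function name is made up, and real plugins also handle quoting and malformed pairs):

```javascript
// Minimal cookie-header parser sketch, for illustration only.
function parseCookies(header) {
  const cookies = {};
  for (const pair of header.split(';')) {
    const idx = pair.indexOf('=');
    if (idx === -1) continue; // skip flag-style or malformed entries
    const key = pair.slice(0, idx).trim();
    cookies[key] = decodeURIComponent(pair.slice(idx + 1).trim());
  }
  return cookies;
}

console.log(parseCookies('auth=authenticated; theme=dark'));
// → { auth: 'authenticated', theme: 'dark' }
```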

await fastifyServer.register(cookieParser);

Later in the handler method, we simply get the cookie we are interested in and compare it to the expected value. Here I used plain strings as auth tokens; this should be replaced with some sort of hashing-and-comparison algorithm.

const cookies = request.cookies;
// 401 is the conventional status code for a failed authentication check
if (cookies['auth'] !== 'authenticated') {
   throw new APIError(401, 'Unauthorized');
}
}

That's it. We have authentication on top of the file streaming endpoint, and everything is ready to be connected to the frontend.
