
await fetch(file_url) sometimes doesn't return the full file contents

I have the following JavaScript code to fetch and process the contents of a .csv file:

async function fetchCsv() {
    const response = await fetch("levels.csv");
    const reader = response.body.getReader();
    const result = await reader.read();
    const decoder = new TextDecoder("utf-8");
    const csv = await decoder.decode(result.value);
    return csv;
}

useEffect(() => {
    fetchCsv().then((csv) => {
        // process csv
        (...)

When running this code, 99% of the time the csv variable contains the correct contents of the file, but in rare cases it contains only a truncated part of the actual file.

What could be the reason, and how can I improve the code to handle it?

It's in a React app, if that's relevant.

Extra info:

  • I have verified that when the problem occurs, the network response for the levels.csv file is a proper response (200, and the full 38 kB are returned)

What you get when calling response.body.getReader() is a ReadableStreamDefaultReader object.

Calling its .read() method returns a Promise that resolves with either the full content of the response body (if the request completed fast enough and the body isn't too big, apparently 256 MB in Firefox), or with just a single chunk of the response body.
This allows you to handle the response as a stream, before it's entirely fetched.
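To see that a single read() only ever yields one chunk, here is a self-contained sketch: the body is enqueued as two chunks (the boundaries are artificial, purely for illustration), and counting the reads shows there is one per chunk, not one for the whole body. twoChunkBody and countChunks are hypothetical helper names.

```javascript
// A stream whose body arrives in two chunks (artificial boundaries).
function twoChunkBody() {
  const encoder = new TextEncoder();
  return new ReadableStream({
    start(controller) {
      controller.enqueue(encoder.encode("id,name\n"));
      controller.enqueue(encoder.encode("1,foo\n"));
      controller.close();
    },
  });
}

// Count how many read() calls it takes to drain a stream.
async function countChunks(stream) {
  const reader = stream.getReader();
  let chunks = 0;
  while (true) {
    const { done } = await reader.read();
    if (done) return chunks;
    chunks += 1;
  }
}
```

Here `countChunks(twoChunkBody())` resolves to 2: each read() returned one chunk, which is exactly why a single read() on a fetch body can hand you a truncated file.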

If you wish to process this stream as text, you could either use a TextDecoderStream, which finally got support in all major browsers:

const response = await fetch("levels.csv");
const textStream = response.body.pipeThrough(new TextDecoderStream());
// now you can handle each chunk as text from textStream.getReader();
// or pipe it in yet another TransformStream
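One way to consume that decoded stream is to read it chunk by chunk and concatenate. readAllText below is a hypothetical helper (not part of any API), and the Response is built locally so the sketch runs without a server; with a real fetch() you would pass its response body instead.

```javascript
// Pipe a body stream through TextDecoderStream and collect the text.
async function readAllText(bodyStream) {
  const reader = bodyStream.pipeThrough(new TextDecoderStream()).getReader();
  let text = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) return text;
    text += value; // value is already a decoded string here
  }
}

// Usage with a real request (assumption: same levels.csv as the question):
// const csv = await readAllText((await fetch("levels.csv")).body);
```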

or, in more old-school style, you could use the { stream: true } option of the TextDecoder#decode() method and handle each chunk one by one in there:

const response = await fetch("levels.csv");
const decoder = new TextDecoder();
const reader = response.body.getReader();
const csv_chunks = [];
while (true) {
  const { value, done } = await reader.read();
  if (value) {
    csv_chunks.push(decoder.decode(value, { stream: true }));
    // do something with all the chunks we have so far
  }
  if (done) {
    csv_chunks.push(decoder.decode()); // flush the decoder's internal buffer
    break;
  }
}
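Applied to the original problem, that loop gives a corrected version of the question's function. Splitting the fetch from the decoding into a readCsvBody helper is a choice made here (so the decoding can be exercised on any Response), not something the original code did.

```javascript
// Read an entire Response body as UTF-8 text, chunk by chunk.
async function readCsvBody(response) {
  if (!response.ok) throw new Error(`HTTP ${response.status}`);
  const decoder = new TextDecoder("utf-8");
  const reader = response.body.getReader();
  const parts = [];
  while (true) {
    const { value, done } = await reader.read();
    if (value) parts.push(decoder.decode(value, { stream: true }));
    if (done) break;
  }
  parts.push(decoder.decode()); // flush any bytes still buffered
  return parts.join("");
}

// Usage, matching the question's fetchCsv():
// const csv = await readCsvBody(await fetch("levels.csv"));
```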

But maybe you don't want to handle this response as a stream at all, in which case it may well be enough to ask the browser to fetch the whole response body before decoding it as text. For this, if you need to decode the text as UTF-8, use the Response#text() method:

const response = await fetch("levels.csv");
if (!response.ok) { // don't forget to handle possible network errors
  throw new Error("NetworkError");
}
return response.text();

And if you need to handle another encoding, first consume the response as an ArrayBuffer and then decode it to text:

const response = await fetch("levels.csv");
if (!response.ok) { // don't forget to handle possible network errors
  throw new Error("NetworkError");
}
const buf = await response.arrayBuffer();
const decoder = new TextDecoder(encoding);
return decoder.decode(buf);
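For a concrete value of that encoding variable, here is a sketch using "windows-1252" (chosen purely as an example): the byte 0xE9 is not valid UTF-8 on its own, but decodes cleanly once the right label is passed to TextDecoder.

```javascript
// "café" encoded in windows-1252: the last byte, 0xE9, is "é".
const bytes = Uint8Array.from([0x63, 0x61, 0x66, 0xe9]);
const decoded = new TextDecoder("windows-1252").decode(bytes);
```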
