
Concurrent large delete operations hang in NodeJS

I have a workspace cleanup function in my Node.js application, running on a CentOS 7.2 server, that takes a list of directory paths and deletes them. When the function receives a list of paths, it uses Promise.all() to perform the deletes concurrently, in the following manner:

const fs = require("fs/promises");

/**
 * Deletes directories.
 *
 * @param   {Array} directories Array of directories to be deleted.
 *
 * @returns {Object} Response object with the status code and data.
 */
const cleanup = async (directories) => {
  if (!Array.isArray(directories)) {
    return await responseStatementHandler(
      400,
      `Provide a list of directories as an array - [dir1, dir2].`,
      null,
      console.error
    );
  }

  if (directories.length === 0) {
    return await responseStatementHandler(
      400,
      `Directory list cannot be empty.`,
      null,
      console.error
    );
  }

  try {
    const results = await Promise.all(
      directories.map((d) => deleteDirectory(d))
    );
    return await responseStatementHandler(
      207,
      results,
      null,
      console.log
    );
  } catch (err) {
    return await responseStatementHandler(
      500,
      `Directory cleanup failed.`,
      err,
      console.error
    );
  }
};

/**
 * Performs a force delete on a provided directory.
 *
 * @param   {String} directory  Path of the directory to be deleted.
 *
 * @returns {Object} Response object with the status code and data.
 */
const deleteDirectory = async (directory) => {
  console.log(`Deleting directory: ${directory}`);
  try {
    if ((await fs.stat(directory)).isDirectory()) {
      await fs.rm(directory, { recursive: true, force: true });
      return await generateIndividualResponseObj(
        directory,
        200,
        `Successfully deleted directory: ${directory}`,
        null,
        console.log
      );
    }
    // Path exists but is not a directory; report it instead of returning undefined.
    return await generateIndividualResponseObj(
      directory,
      400,
      `Path is not a directory: ${directory}`,
      null,
      console.error
    );
  } catch (err) {
    if (err.code === "ENOENT") {
      return await generateIndividualResponseObj(
        directory,
        404,
        `Could not find directory: ${directory}`,
        null,
        console.error
      );
    }
    return await generateIndividualResponseObj(
      directory,
      500,
      `Failed to delete directory: ${directory}`,
      err,
      console.error
    );
  }
};

The issue here is that the directories are large, around 1 GB each. So when there are many such deletes (more than 10, i.e. over 10 GB in total), the operation just hangs.

I know this because as soon as I delete the directories manually, the application runs without issues.

Is this a limitation of the fs module, or of the way the logic is written? Would a timeout on the delete operations help in this scenario? If so, how would I implement such a timeout with the fs module?

It turned out the issue was not with Node. The server had a slow HDD that took very long to perform large deletions; the application was not hung, it was simply waiting for the delete to complete. Moving the server to faster storage resolved the issue.
