
Resizing images with sharp before uploading to Google Cloud Storage

I tried to resize or compress an image before uploading it to Google Cloud Storage. The upload works fine, but the resizing does not seem to work.

Here is my code:

const uploadImage = async (file) => new Promise((resolve, reject) => {
    let { originalname, buffer } = file
    sharp(buffer)
        .resize(1800, 948)
        .toFormat("jpeg")
        .jpeg({ quality: 80 })
        .toBuffer()


    const blob = bucket.file(originalname.replace(/ /g, "_"))
    const blobStream = blob.createWriteStream({
        resumable: false
    })
    blobStream.on('finish', () => {
        const publicUrl = format(
            `https://storage.googleapis.com/${bucket.name}/${blob.name}`
        )
        resolve(publicUrl)
    }).on('error', () => {
            reject(`Unable to upload image, something went wrong`)
        })
        .end(buffer)
}) 

I ran into the same issue with a project I was working on. After lots of trial and error I found the following solution. It might not be the most elegant, but it worked for me.

In my upload route function, I created a new thumbnail object from the original file's values and passed it as the file parameter to the uploadFile function for Google Cloud Storage.

Inside my upload image route function:

const file = req.file;

const thumbnail = {
  fieldname: file.fieldname,
  originalname: file.originalname,
  encoding: file.encoding,
  mimetype: file.mimetype,
  buffer: await sharp(file.buffer).resize({ width: 150 }).toBuffer()
}

const uploadThumbnail = await uploadFile(thumbnail);

My Google Cloud Storage upload file function:

const uploadFile = async (file) => new Promise((resolve, reject) => {

  const gcsname = `thumbnail_${file.originalname}`;
  const bucketFile = bucket.file(gcsname);

  const stream = bucketFile.createWriteStream({
    resumable: false,
    metadata: {
      contentType: file.mimetype
    }
  });

  stream.on('error', (err) => {
    reject(err);
  });

  stream.on('finish', (res) => {
    resolve({ 
      name: gcsname
    });
  });

  stream.end(file.buffer);
});
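
For context, this assumes the route receives the file through something like multer with memoryStorage, so that req.file carries the raw buffer. A rough sketch of how the route could tie the two snippets above together (the route path, field name and error handling are my assumptions, not part of the original setup):

const express = require('express')
const multer = require('multer')
const sharp = require('sharp')

// keep the upload in memory so req.file.buffer is available for sharp
const upload = multer({ storage: multer.memoryStorage() })
const app = express()

app.post('/upload', upload.single('image'), async (req, res) => {
  try {
    const file = req.file

    // build the resized thumbnail object exactly as shown above
    const thumbnail = {
      fieldname: file.fieldname,
      originalname: file.originalname,
      encoding: file.encoding,
      mimetype: file.mimetype,
      buffer: await sharp(file.buffer).resize({ width: 150 }).toBuffer()
    }

    // uploadFile is the Google Cloud Storage helper defined above
    const uploadThumbnail = await uploadFile(thumbnail)
    res.json(uploadThumbnail)
  } catch (err) {
    res.status(500).send('Unable to upload image')
  }
})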

I think the problem is with toFormat() . I could not find that function in the docs. Can you try removing it and checking whether it works?

sharp(buffer)
  .resize(1800, 948)
  .jpeg({ quality: 80 })
  .toBuffer()
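
One more thing worth checking: toBuffer() returns a Promise, so the resized data has to be awaited and then written to the upload stream; in the code from the question the original buffer is passed to end(), so the resized image is never used. A rough sketch of that change, keeping the names from the question:

const uploadImage = async (file) => {
    const { originalname, buffer } = file

    // wait for sharp to produce the resized/compressed buffer
    const resizedBuffer = await sharp(buffer)
        .resize(1800, 948)
        .jpeg({ quality: 80 })
        .toBuffer()

    const blob = bucket.file(originalname.replace(/ /g, "_"))
    const blobStream = blob.createWriteStream({ resumable: false })

    return new Promise((resolve, reject) => {
        blobStream.on('finish', () => {
            resolve(`https://storage.googleapis.com/${bucket.name}/${blob.name}`)
        })
        blobStream.on('error', () => {
            reject(`Unable to upload image, something went wrong`)
        })
        // write the resized buffer, not the original one
        blobStream.end(resizedBuffer)
    })
}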

Modify the metadata once you have finished uploading the image.

import * as admin from "firebase-admin";
import * as functions from "firebase-functions";
import { log } from "firebase-functions/logger";
import * as sharp from "sharp";

export const uploadFile = functions.https.onCall(async (data, context) => {
  const bytes = data.imageData;

  const bucket = admin.storage().bucket();

  const buffer = Buffer.from(bytes, "base64");

  const bufferSharp = await sharp(buffer)
    .png()
    .resize({ width: 500 })
    .toBuffer();

  const nombre = "IMAGE_NAME";

  const fileName = `img/${nombre}.png`;
  const fileUpload = bucket.file(fileName);

  const uploadStream = fileUpload.createWriteStream();

  // Wait for the stream to finish so the callable does not return
  // before the upload and metadata update have completed
  await new Promise<void>((resolve, reject) => {
    uploadStream.on("error", (err) => {
      log("Error uploading image", err);

      reject(new functions.https.HttpsError("unknown", "Error uploading image"));
    });

    uploadStream.on("finish", async () => {
      // Modify the metadata once the image has finished uploading
      await fileUpload.setMetadata({ contentType: "image/png" });

      log("Upload success");
      resolve();
    });

    uploadStream.end(bufferSharp);
  });

  return { name: fileName };
});
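
For completeness, a callable like this is invoked from the client with the Firebase SDK, passing the image as a base64 string, since the function decodes it with Buffer.from(bytes, "base64"). A rough client-side sketch, assuming the web modular SDK and a browser File as input (the helper and function names here are just illustrative):

import { getFunctions, httpsCallable } from "firebase/functions";

// read a File (e.g. from an <input type="file">) as raw base64
const toBase64 = (file: File): Promise<string> =>
  new Promise((resolve, reject) => {
    const reader = new FileReader();
    // strip the "data:image/...;base64," prefix from the data URL
    reader.onload = () => resolve((reader.result as string).split(",")[1]);
    reader.onerror = reject;
    reader.readAsDataURL(file);
  });

const uploadFileFn = httpsCallable(getFunctions(), "uploadFile");

export const sendImage = async (file: File) => {
  const imageData = await toBase64(file);
  // the callable reads this as data.imageData on the server
  await uploadFileFn({ imageData });
};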
