Firebase is awesome. As shown in the previous JSMonday article, it lets us write serverless applications, authenticate users, and much more.

Firebase also offers a service called Firebase Storage, which is somewhat similar to AWS S3. You can create a bucket and then store any kind of file in it. It is commonly used for storing images, but here comes the problem: what if a user uploads an incredibly heavy image? You may want to resize it!
Let’s see how you can do it using Firebase Functions.

First of all, let's initialize a new Firebase Functions project:

$ firebase init functions

Then, let's install a few npm packages:

$ cd functions && npm install --save sharp fs-extra @google-cloud/storage

As you can see, we’re installing these three packages:

  • @google-cloud/storage, which lets us download files from and upload files to Firebase Storage.
  • fs-extra, which wraps Node.js' built-in fs module and exposes its functions as promises.
  • sharp, an amazing, high-performance image processing library for Node.js.

We’re now ready to write our image resizer!
Let’s navigate to our functions/index.js file and begin to import our needed dependencies:

const functions         = require("firebase-functions");
const { tmpdir }        = require("os");
const { Storage }       = require("@google-cloud/storage");
const { dirname, join } = require("path");
const sharp             = require("sharp");
const fs                = require("fs-extra");
const gcs               = new Storage();

Great! Now we can register our function:


exports.resizeImg = functions
                    .runWith({ memory: "2GB", timeoutSeconds: 120 })
                    .storage
                    .object()
                    .onFinalize(handler);

As you can see, we're allocating 2GB of memory to our Firebase Function and giving it a maximum of 120 seconds to run. We also declare that this function should be triggered every time a file finishes uploading to our bucket (storage.object().onFinalize).
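For reference, the object passed to the handler describes the uploaded file. A minimal sketch of its shape, with purely illustrative values (the real metadata carries more fields, such as size and timeCreated):

```javascript
// Illustrative shape of the metadata object passed to onFinalize
// (bucket name and path are hypothetical):
const object = {
  bucket: "my-project.appspot.com", // bucket the file was uploaded to
  name: "images/photo.jpg",         // full path of the file inside the bucket
  contentType: "image/jpeg",        // MIME type, used later to skip non-images
};

console.log(object.name.split("/").pop()); // "photo.jpg"
```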

Let’s write the handler function:

async function handler(object) {
  const bucket      = gcs.bucket(object.bucket);
  const filePath    = object.name;
  const fileName    = filePath.split("/").pop();
  const bucketDir   = dirname(filePath);

  const workingDir  = join(tmpdir(), "resize");
  const tmpFilePath = join(workingDir, "source.png");
}

We need to get a bunch of information from our object argument (which describes the file that has just been uploaded):

  • Bucket: the bucket the image was uploaded to.
  • File Path: the file's path inside the bucket.
  • File Name: the uploaded file's name.
  • Bucket Directory: the directory inside the bucket where the image was uploaded.

Now we can define a temporary working directory where we'll run our resizing code, plus a temporary file path where we'll download the source image for our manipulations.

Here comes a problem: our function gets triggered every time a new file is created inside our bucket... but we're creating new resized images ourselves, so how can we avoid an infinite resizing loop?

async function handler(object) {
  const bucket      = gcs.bucket(object.bucket);
  const filePath    = object.name;
  const fileName    = filePath.split("/").pop();
  const bucketDir   = dirname(filePath);

  const workingDir  = join(tmpdir(), "resize");
  const tmpFilePath = join(workingDir, "source.png");

  if (fileName.includes("@s_") || !object.contentType.includes("image")) {
    return false;
  }
}

We'll name resized images like myImage@s_1920.jpg (where s_ stands for "size"), so we can check whether a newly created file is itself the product of a resize.
If it is, or if the file isn't an image at all, we quit the function.
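A quick illustration of the check, with hypothetical file names:

```javascript
// The "@s_" marker lets us tell resized outputs apart from fresh uploads:
const freshUpload = "myImage.jpg";
const resizedCopy = "myImage@s_1920.jpg";

console.log(freshUpload.includes("@s_")); // false: process it
console.log(resizedCopy.includes("@s_")); // true:  skip it, avoiding the loop
```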

Now we just need to download the newly uploaded image to the temporary file path we defined earlier:

async function handler(object) {
  const bucket      = gcs.bucket(object.bucket);
  const filePath    = object.name;
  const fileName    = filePath.split("/").pop();
  const bucketDir   = dirname(filePath);

  const workingDir  = join(tmpdir(), "resize");
  const tmpFilePath = join(workingDir, "source.png");

  if (fileName.includes("@s_") || !object.contentType.includes("image")) {
    return false;
  }

  // make sure the temporary working directory exists before downloading
  await fs.ensureDir(workingDir);
  await bucket.file(filePath).download({ destination: tmpFilePath });
}

Now we're ready to start resizing! Let's say we need to create three different sizes: 1920px, 720px, and 100px. Let's put these values into an array:

async function handler(object) {
  const bucket      = gcs.bucket(object.bucket);
  const filePath    = object.name;
  const fileName    = filePath.split("/").pop();
  const bucketDir   = dirname(filePath);

  const workingDir  = join(tmpdir(), "resize");
  const tmpFilePath = join(workingDir, "source.png");

  if (fileName.includes("@s_") || !object.contentType.includes("image")) {
    return false;
  }

  // make sure the temporary working directory exists before downloading
  await fs.ensureDir(workingDir);
  await bucket.file(filePath).download({ destination: tmpFilePath });

  const sizes = [ 1920, 720, 100 ];
}

Now we need to run the resizer, so we’ll create a promise for each size inside our array:

async function handler(object) {
  const bucket      = gcs.bucket(object.bucket);
  const filePath    = object.name;
  const fileName    = filePath.split("/").pop();
  const bucketDir   = dirname(filePath);

  const workingDir  = join(tmpdir(), "resize");
  const tmpFilePath = join(workingDir, "source.png");

  if (fileName.includes("@s_") || !object.contentType.includes("image")) {
    return false;
  }

  // make sure the temporary working directory exists before downloading
  await fs.ensureDir(workingDir);
  await bucket.file(filePath).download({ destination: tmpFilePath });

  const sizes = [ 1920, 720, 100 ];

  const uploadPromises = sizes.map(async (size) => {

    const ext        = fileName.split('.').pop();
    const imgName    = fileName.replace(`.${ext}`, "");
    const newImgName = `${imgName}@s_${size}.${ext}`;
    const imgPath    = join(workingDir, newImgName);
    await sharp(tmpFilePath).resize({ width: size }).toFile(imgPath);

    return bucket.upload(imgPath, {
      destination: join(bucketDir, newImgName)
    });

  });
}

As you can see, the process is pretty simple:

  • Get the image extension.
  • Get the original image name.
  • Build the new image name.
  • Build its path inside the working directory.
  • Run sharp to resize the image and save it to a file.
  • Last but not least, upload the resized image to the original bucket under the new file name.

Now we just need to run these three promises... but we don't want to run them sequentially, since that could take too long! So we'll use Promise.all to run them concurrently:

async function handler(object) {
  const bucket      = gcs.bucket(object.bucket);
  const filePath    = object.name;
  const fileName    = filePath.split("/").pop();
  const bucketDir   = dirname(filePath);

  const workingDir  = join(tmpdir(), "resize");
  const tmpFilePath = join(workingDir, "source.png");

  if (fileName.includes("@s_") || !object.contentType.includes("image")) {
    return false;
  }

  // make sure the temporary working directory exists before downloading
  await fs.ensureDir(workingDir);
  await bucket.file(filePath).download({ destination: tmpFilePath });

  const sizes = [ 1920, 720, 100 ];

  const uploadPromises = sizes.map(async (size) => {

    const ext        = fileName.split('.').pop();
    const imgName    = fileName.replace(`.${ext}`, "");
    const newImgName = `${imgName}@s_${size}.${ext}`;
    const imgPath    = join(workingDir, newImgName);
    await sharp(tmpFilePath).resize({ width: size }).toFile(imgPath);

    return bucket.upload(imgPath, {
      destination: join(bucketDir, newImgName)
    });

  });

  await Promise.all(uploadPromises);
  return fs.remove(workingDir);
}
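The Promise.all step can be seen in isolation with a few stand-in promises (the delays below are hypothetical, standing in for the three uploads):

```javascript
// Promise.all starts every promise immediately, waits for all of them,
// and resolves with the results in the original order:
async function demo() {
  const results = await Promise.all([
    Promise.resolve(1),
    new Promise((resolve) => setTimeout(() => resolve(2), 10)),
    Promise.resolve(3),
  ]);
  console.log(results.join(",")); // "1,2,3"
}

demo();
```

Note that Promise.all rejects as soon as any single promise rejects, so a failed upload will surface as an error from the function.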

And we're done!
Let's deploy the function and test it by uploading a file to Firebase Storage:

$ firebase deploy

We’re now ready to run our resizer!
At JSMonday, we're actually using the code above to resize our images:
jsmonday/Daguerre

Feel free to reuse the code above for your image resizer!

Did you like this article? Consider becoming a Patron!

This article is CC0 1.0 (Public Domain) licensed.