Jay Gould

Resizing and uploading images to object storage with Sharp and Backblaze

February 27, 2022


I am working on a small side project which I’ll write about next month, but in the meantime I wanted to share my solution to one of its requirements. Specifically, I was grabbing images from an external source, but the images were not sized correctly, so my solution involves resizing each image using Sharp and uploading it to a reasonably priced storage solution, Backblaze.

Creating a script to pull images from an external source

I decided to create a script to handle the process of getting the images, processing them, and uploading them to object storage. A script fits this situation because it only needs to be executed manually every now and then - certainly not something that will need to be run remotely.
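A quick note on dependencies before diving in: the script uses node-fetch, bluebird, sharp, axios, backblaze-b2, and dotenv, which can all be installed in one go with npm install node-fetch@2 bluebird sharp axios backblaze-b2 dotenv (node-fetch is pinned to version 2 here because version 3 onwards is ESM-only and won’t work with require).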

The script is run in Node using node my-script.js:

// my-script.js

const fetch = require("node-fetch")
const Promise = require("bluebird")
const fs = require("fs")

const { uploadToB2 } = require("./images.js")

;(async function () {
  const response = await fetch(process.env.API_URL)
  const images = await response.json()

  const finalArray = await Promise.map(
    images.results,
    async function (image) {
      let newImageUrls = []

      try {
        newImageUrls = await uploadToB2({
          imageName: image.name,
          imageUrl: image.url,
        })
      } catch (e) {
        // Log and skip any image that fails to resize or upload
        console.error(`Failed to process ${image.name}: ${e.message}`)
      }

      return newImageUrls
    },
    { concurrency: 5 }
  )
})()

The script file contains a self-executing async function which starts off by hitting an external API to request the initial image URLs. Then, each image is mapped over with Bluebird’s Promise.map function. I have done this because Bluebird allows promises to be mapped with a concurrency option, meaning only a certain number of items are processed at any one time. This is great, as my uploads to Backblaze were failing with too many concurrent requests during my initial attempt.
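If you’d rather avoid the Bluebird dependency, here’s a minimal sketch of the same idea using only native promises. This helper is an assumption on my part rather than part of the original script, and it batches rather than using a sliding window, so each batch waits for its slowest item:

// Hypothetical helper: map over items in fixed-size batches so at
// most `concurrency` promises are in flight at once.
async function mapWithConcurrency(items, fn, concurrency) {
  const results = []
  for (let i = 0; i < items.length; i += concurrency) {
    const batch = items.slice(i, i + concurrency)
    // Wait for the whole batch before starting the next one
    results.push(...(await Promise.all(batch.map(fn))))
  }
  return results
}

Bluebird’s Promise.map is a bit smarter - it starts a new item as soon as any in-flight one finishes - which is why I stuck with it here.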

Processing images from external URL with Sharp in Node.js

The main part of this solution is handled in the uploadToB2 function referenced above. This function handles two aspects - resizing the images, and uploading them to object storage. First, I’ll cover how the images are resized:

const sharp = require("sharp")
const axios = require("axios")

module.exports = {
  uploadToB2: async function ({ imageName, imageUrl }) {
    // ...

    // Fetch the source image as a raw buffer so Sharp can read it
    const input = (await axios({ url: imageUrl, responseType: "arraybuffer" })).data

    const image = await sharp(input)
      .resize(400, 400, {
        kernel: sharp.kernel.nearest,
      })
      .toBuffer()

    // ...
  },
}

As you can see, image manipulation is easy with Sharp. It’s slightly more complicated than it could be in my situation, as I’m providing the images to Sharp via an external URL, whereas most of the official docs show how to process images that are stored locally. When providing an image from a URL, the data must first be pulled down into a buffer to be fed to Sharp as input - which is where Axios comes in.

Once Axios has pulled down the image buffer, Sharp takes it, resizes the image to 400px square (from a previous 100px square), and outputs the new image to a second buffer.

It’s worth noting that Sharp can do many other great image processing operations, so it’s worth checking out if you haven’t already.
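For example, here’s a quick sketch of a few other operations chained into one pipeline (the angle, quality, and output format are arbitrary values for illustration, not something from my project):

const sharp = require("sharp")

// Rotate an image, convert it to grayscale, and output it as WebP
async function transformImage(input) {
  return sharp(input)
    .rotate(90) // rotate 90 degrees clockwise
    .grayscale() // strip colour information
    .webp({ quality: 80 }) // encode as WebP at quality 80
    .toBuffer()
}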

Uploading images to Backblaze object storage

The final part of the script takes the buffer output from Sharp, and uploads it to Backblaze:

const dotenv = require("dotenv")
dotenv.config({ path: "../.env.local" })

const B2 = require("backblaze-b2")
const sharp = require("sharp")
const axios = require("axios")

module.exports = {
  uploadToB2: async function ({ imageName, imageUrl }) {
    const input = (await axios({ url: imageUrl, responseType: "arraybuffer" })).data

    const image = await sharp(input)
      .resize(400, 400, {
        kernel: sharp.kernel.nearest,
      })
      .toBuffer()

    const b2 = new B2({
      applicationKeyId: process.env.BB_APP_ID,
      applicationKey: process.env.BB_APP_KEY,
    })
    await b2.authorize()

    const bucketData = await b2.getUploadUrl({
      bucketId: process.env.BB_BUCKET_ID,
    })
    const uploadUrl = bucketData.data.uploadUrl
    const authToken = bucketData.data.authorizationToken

    const uploaded = await b2.uploadFile({
      uploadUrl: uploadUrl,
      uploadAuthToken: authToken,
      fileName: imageName,
      data: image,
    })

    return `${process.env.BB_BUCKET_URL}${uploaded.data.fileName}`
  },
}

The remaining code has been filled in above, firstly authorizing the Backblaze client we’re using (const b2). The await b2.authorize() call handles this, and needs to be run separately for each image being uploaded.

Once authorized using the env credentials, we’re required to get an upload URL from Backblaze using await b2.getUploadUrl. Each upload URL we receive is different, which is another way Backblaze provides a security layer, alongside the authorization requirement mentioned above. b2.getUploadUrl returns the URL and a token, which together with the file name and the image buffer from the Sharp output allow us to (finally) upload the image to storage using b2.uploadFile.

The final upload function returns the new storage URL, which we pass back to the script for use in the rest of the process!

A side note - using environment variables within a separately running script file

There are a lot of environment variables used here, but .env files aren’t loaded automatically unless you’re running within a Node framework that handles environment variables as part of its setup. This is where the massively popular dotenv package comes in useful - requiring it means we can read the .env file from a simple, one-off Node script.
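For reference, the .env.local file would contain something along these lines (the values below are placeholders - yours will come from the Backblaze dashboard, and the bucket URL format may differ depending on your account’s download URL):

# .env.local
BB_APP_ID=your-application-key-id
BB_APP_KEY=your-application-key
BB_BUCKET_ID=your-bucket-id
BB_BUCKET_URL=https://f000.backblazeb2.com/file/your-bucket-name/

Note the trailing slash on BB_BUCKET_URL, since the script appends the uploaded file name directly to it.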

