
Using CloudFlare workers to add CORS support to a backend API

This week I needed to expose a backend API, written in a system that doesn’t support CORS configuration, to a JavaScript client running on a Web page. I didn’t have many options: the system couldn’t be changed, and the team didn’t have any existing infrastructure in place that would support deploying custom code with the required level of availability and TLS support.

Fortunately, they were using CloudFlare as a Content Distribution Network (CDN). That gave me the idea to try out the (relatively new) CloudFlare Workers. CloudFlare Workers let you run JavaScript (or WebAssembly) within CloudFlare’s CDN, very similar to Lambda@Edge, which I’ve used in AWS.
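
For anyone who hasn’t seen one, a Worker is just a script that responds to fetch events at the edge. A minimal one looks something like this:

addEventListener("fetch", (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  return new Response("Hello from the edge", {
    headers: { "content-type": "text/plain" },
  });
}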

I’ve never used them before, but I’m very familiar with Lambda and I was pleased to find that it was a great developer experience for a few reasons.

Firstly, there’s a great CLI tool (@cloudflare/wrangler) that’s easy to install with yarn (yarn global add wrangler) or npm. Normally I prefer tools to be installed via static binaries to avoid dependency problems (think Terraform or ECS-CLI v2), but since you end up writing JavaScript here, it’s totally reasonable.

Secondly, there’s a free tier. I signed up without putting a credit card in, and I was able to use the wrangler preview to test my code locally and even use wrangler publish to publish my test code to a domain under my user name. All for free. I think this saved me a couple of days. By the time my client’s IT team got me access to the CloudFlare account, I’d already built and tested the system on my own account.

The wrangler tool has Webpack support built in, so I could write modern JavaScript, including async/await, without extra hassle. It also has a --watch option to update the test harness automatically.

I was concerned about the 50ms CPU limit [0], but in practice it’s less restrictive than it sounds: time spent awaiting downstream requests doesn’t count as CPU time, because the Worker is suspended while the await is pending. I couldn’t see any easy way to find out how much CPU I’d actually used, but my code passed load testing without issues.
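
To illustrate, a handler like this one spends nearly all of its wall-clock time outside the CPU budget (the origin URL is a placeholder):

async function handleRequest(request) {
  // The Worker is suspended while this await is pending, so time spent
  // waiting on the origin is wall-clock time, not CPU time.
  const upstream = await fetch("https://origin.example.com/api");
  return new Response(upstream.body, upstream);
}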

I picked up the example CORS proxy from [1] and modified it for my use case. I then added some document-structure rewriting, because I was unhappy with the shape of the 3rd party API’s return value. This let me accept a GET request in CloudFlare and issue a POST request to the downstream server, tidying up the API and making it easier to cache. I was able to add 3rd party dependencies using yarn, while the wrangler tool looked after packaging and deployment.
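
The shape of the final Worker was roughly this; the downstream URL and the response rewriting are placeholders for my real ones:

addEventListener("fetch", (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  // Take a cache-friendly GET from the browser and turn it into
  // the POST the downstream API actually expects.
  const url = new URL(request.url);
  const upstream = await fetch("https://downstream.example.com/api", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ query: url.searchParams.get("q") }),
  });

  // Rewrite the awkward upstream document into the shape we want.
  const doc = await upstream.json();
  const body = JSON.stringify({ results: doc.data });

  // Add the CORS headers the original API couldn't provide.
  return new Response(body, {
    status: upstream.status,
    headers: {
      "content-type": "application/json",
      "Access-Control-Allow-Origin": "*",
    },
  });
}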

The wrangler tool uses a wrangler.toml file to store configuration, and supports multiple environments. That made it really easy to set up dev, uat and prod environments: it was just a case of adding an [env.uat] section to the wrangler.toml file to specify the differences between environments.

[env.uat]
webpack_config = "webpack.config.uat.js"
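
For context, the whole file ends up looking something like this; the name is a placeholder and the account id is elided:

name = "cors-proxy"
type = "webpack"
account_id = "..."
webpack_config = "webpack.config.dev.js"

[env.uat]
webpack_config = "webpack.config.uat.js"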

That also allowed me to use different Webpack configurations so that I could push different values into the code depending on the environment:

const webpack = require("webpack");

module.exports = () => ({
  entry: "./index.js",
  // Workers run in a service-worker-style environment, not Node.
  target: "webworker",
  mode: "production",
  plugins: [
    // Bake environment-specific values into the bundle at build time.
    new webpack.DefinePlugin({
      "process.env.ENVIRONMENT_VARIABLE": JSON.stringify("value"),
    }),
  ],
});

With the Webpack configuration in place, I could just reference process.env.ENVIRONMENT_VARIABLE in my code, and DefinePlugin would substitute the environment-specific value at build time.
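
In the Worker code it then reads like an ordinary environment variable (the variable name matches the DefinePlugin example above):

// Substituted with the literal string "value" at build time by DefinePlugin.
const setting = process.env.ENVIRONMENT_VARIABLE;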

Once I’d been in production for a few days, I decided to try to get rid of the downstream service completely to improve performance. CloudFlare has Workers KV [2], a lightweight NoSQL database that can be used to store data. The pricing was good and the API looked simple enough, so I was able to replace the backend service and remove its latency from the response time. It was at this point that I found out that Workers KV isn’t included in the free tier, so I paid the $5 per month fee to enable it on my personal account rather than test on my customer’s production account.
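
Reading from KV inside the Worker is then a single call against the namespace binding (MY_KV is a binding name I’ve made up here; you configure it in wrangler.toml):

async function handleRequest(request) {
  const key = new URL(request.url).searchParams.get("q");

  // MY_KV is the KV namespace binding configured in wrangler.toml.
  const cached = await MY_KV.get(key);
  if (cached === null) {
    return new Response("Not found", { status: 404 });
  }

  return new Response(cached, {
    headers: {
      "content-type": "application/json",
      "Access-Control-Allow-Origin": "*",
    },
  });
}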

It’s a shame there’s not even a tiny free tier, but it’s only a bit more than I pay for coffee.

I was also pleased to see that Workers KV supports one of my favourite DynamoDB features: giving data an expiry date. The data for this service doesn’t need to last more than a few months, and the API is straightforward enough [3].
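
Writes take an optional TTL in seconds (or an absolute Unix timestamp). A sketch, with a made-up key and value:

await MY_KV.put("some-key", JSON.stringify({ results: [] }), {
  expirationTtl: 60 * 60 * 24 * 90, // expire automatically after ~90 days
});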

All in all, I’m really happy with the experience. If they added a zero-conf CI pipeline, I’d be even happier, but I’m not updating this code very often, so I’m happy to do that manually for now.