

Feature Flags for Netlify Functions

Jeff Dwyer · 4 min read


How should we integrate feature flags into Netlify functions? It's a bit tricky with lambdas. I'll explain why, then walk through the best approaches to make it work efficiently.

The Lambda Challenge

Lambdas, like those behind Netlify functions, are transient: they don't run indefinitely, and they're frozen after execution. This behavior poses a unique challenge for feature flags, which need to be fast and typically achieve that speed by using a background process to keep flag definitions up to date.

Understanding Feature Flag Paradigms

Feature flags generally operate in two ways:

  1. Server-Side Flags: Here, your server connects to the flag server, downloads the necessary data, and performs local flag evaluations. This setup ensures no network calls during flag evaluations. Plus, we can manage telemetry asynchronously to avoid slowing down requests.

  2. Client-Side Flags: Common in web browsers, this approach makes a network call to fetch flag values. For example, the page sends user data to an evaluation endpoint on load, and the endpoint returns the flag states. These endpoints need to be optimized for low latency, because they get called on every request.
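To make the server-side model concrete, here's a minimal sketch: flag rules are downloaded once, and every evaluation afterward is a pure in-memory lookup with no network call. The rule shape and function names below are a made-up simplification for illustration, not any real SDK's data model.

```javascript
// Illustrative server-side evaluation: rules live in memory, so
// checking a flag involves no network call per evaluation.
// The rule format here is invented for this sketch.
const flagRules = {
  "my-flag": { enabledForKeys: ["user-1234"], default: false },
};

function evaluate(flagName, userKey) {
  const rule = flagRules[flagName];
  if (!rule) return false; // unknown flags evaluate to off
  return rule.enabledForKeys.includes(userKey) || rule.default;
}
```

In this model, the expensive part is refreshing `flagRules`, which is exactly what a background process usually handles — and exactly what a frozen lambda can't do.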

Netlify Functions: A Middle Ground

Netlify functions are neither purely server-side nor client-side. They can't run background processes in the traditional way, but they are more persistent than a web browser page, so it would be nice to avoid a network call on every request. What, then, is the best approach?

Feature Flags in Netlify: The Browser-Like Approach

A practical solution is to treat Netlify functions like a browser. Prefab's JavaScript client, for instance, caches flag evaluations per user in a CDN. Here's a sample code snippet for this approach:

import { prefab, Context } from "@prefab-cloud/prefab-cloud-js";

export default async (req, context) => {
  const clientOptions = {
    apiKey: process.env.PREFAB_API_KEY,
    context: new Context({ user: { key: 1234 } }),
  };

  await prefab.init(clientOptions);

  if (prefab.get("my-flag")) {
    // Your code here
  }

  return new Response("ok");
};
In my testing from a Netlify function, I see around 50ms of latency for the initial request and around 10ms for each subsequent request with the same context. That may be too slow for some applications, but it's a good starting point and very easy to set up.

The nice thing about this solution is that you get instant updates when you change a flag: the very next request will have up-to-date data.

The Server-Side Alternative

Alternatively, you can implement a server-side strategy using the Prefab NodeJS client. The key is to configure the client to disable background updates and background telemetry, then perform updates on our own timeline.

Here's a sample code snippet for this approach:

import { Prefab } from "@prefab-cloud/prefab-cloud-node";

const prefab = new Prefab({
  apiKey: process.env.PREFAB_API_KEY,
  enableSSE: false, // we don't want any background process in our function
  enablePolling: false, // we'll handle updates ourselves
  collectLoggerCounts: false, // turn off background telemetry
  contextUploadMode: "none", // turn off background telemetry
  collectEvaluationSummaries: false, // turn off background telemetry
});

// initialize once on cold start
await prefab.init();

export default async (req, context) => {
  const { userId } = context.params;
  const prefabContext = { user: { key: userId } };

  return prefab.inContext(prefabContext, (prefab) => {
    if (prefab.get("my-flag")) {
      // Your code here
    }

    // every 60 seconds, check for updates in-process
    updateIfStalerThan(60 * 1000);

    return new Response("ok");
  });
};

export const config = { path: "/users/:userId" };

With this approach, most of our requests will be fast, but the periodic update will take a bit longer, about 50ms in my testing from a Netlify function. We're entirely in control of the frequency here, so it's a judgment call how real-time you want your feature flag updates to be. You could even disable the updates altogether if tail latency is of utmost concern and you don't mind redeploying to update your flags.
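The snippet above calls an updateIfStalerThan helper without showing it. Here's a minimal sketch of what such a helper could look like: it tracks when we last synced and kicks off a refresh once the data is older than the threshold. The refresh function is injected rather than hard-coded, because the exact client call for re-fetching definitions depends on the SDK; treat the whole shape as an assumption, not Prefab's documented API.

```javascript
// Sketch of an updateIfStalerThan helper: remember when we last synced,
// and trigger an in-process refresh once the data exceeds maxAgeMs.
// `refresh` stands in for whatever call your flag client exposes to
// re-fetch flag definitions.
function makeUpdater(refresh) {
  let lastUpdate = Date.now();
  return function updateIfStalerThan(maxAgeMs) {
    if (Date.now() - lastUpdate > maxAgeMs) {
      lastUpdate = Date.now(); // reset first so overlapping requests don't pile on
      refresh(); // fire and forget; don't block this request on the update
    }
  };
}
```

Firing the refresh without awaiting it keeps the current request fast; the updated flag data lands in time for a later invocation instead.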

Is there a better way?

The best way to solve this problem would be to use a Lambda Extension which could run a sidecar process to update the flags, then serve the flag data over localhost to your function. Unfortunately, Netlify doesn't support Lambda Extensions yet, but this is an exciting avenue to explore for other serverless platforms.
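As a sketch of what that sidecar pattern could look like from the function's side, here's a hypothetical localhost lookup. The port, path, and response shape are all invented for illustration; no such Netlify facility exists today.

```javascript
// Hypothetical sidecar lookup: an extension process keeps flag
// definitions warm and serves them over localhost, so the function's
// "network" hop never leaves the machine. fetchImpl is injectable so
// the function can be exercised without a real sidecar running.
async function getFlags(fetchImpl = fetch) {
  const res = await fetchImpl("http://localhost:2772/flags");
  if (!res.ok) {
    throw new Error(`sidecar responded with status ${res.status}`);
  }
  return res.json();
}
```

A localhost round-trip like this would typically cost well under a millisecond, which is why the extension model is so appealing compared to either approach above.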


Deciding between a browser-like or server-side approach depends on your specific use case in Netlify functions. Both methods have their merits. The browser-like method offers simplicity and instant updates to feature flags, whereas the server-side approach gives a much better average response time at the cost of some tail latency and a configurable delay in seeing flag changes. Choose what fits best for your application's architecture and performance requirements. Happy coding!

Like what you read? You might want to check out what we're building at Prefab. Feature flags, dynamic config, and dynamic log levels. Free trials and great pricing for all of it.
See our Feature Flags