Edge vs Serverless vs Containers: The Decision Matrix
Three computing paradigms dominate backend deployment in 2026, each with distinct trade-offs. Containers (Docker/K8s): full control, any language, any dependency, long-running processes; you manage scaling and infrastructure. Best for: complex backends, databases, ML inference.
Serverless (AWS Lambda, Google Cloud Functions): auto-scaling, pay-per-invocation, cold starts (100-500ms). Supports most languages and npm packages. Best for: event-driven tasks, API endpoints with variable traffic.
Edge (Cloudflare Workers, Vercel Edge, Deno Deploy): runs on 300+ global servers, sub-50ms responses worldwide, V8 isolates (not containers). Limited runtime (no Node.js APIs like fs). Best for: auth, personalization, A/B testing, API routing, geolocation.
Cloudflare Workers: The Edge Platform
Cloudflare Workers run JavaScript/TypeScript on Cloudflare's global network (300+ cities). They use V8 isolates (the same engine that powers Chrome) instead of containers, starting in under 1ms versus the 100-500ms cold starts typical of Lambda. This gives Workers some of the fastest start times of any serverless platform.
Workers support the Web Standard APIs (fetch, Request, Response, crypto, URL) but NOT Node.js APIs (fs, path, child_process). They have a 128MB memory limit and a 30-second CPU-time limit.
The Cloudflare ecosystem: Workers KV (key-value storage at the edge), D1 (SQLite database at the edge), R2 (S3-compatible object storage), and Durable Objects (stateful edge computing).
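As a sketch of how these bindings get used, here is a read-through cache over a KV-style store. The `KVLike` interface and the Map-backed stub are stand-ins for a real `env.KV` binding (which a Worker would receive via wrangler.toml), not Cloudflare's actual types:

```typescript
// Read-through cache over a KV-style binding.
// KVLike mimics the get/put shape of a Workers KV namespace;
// the in-memory stub is for illustration only.
interface KVLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

function memoryKV(): KVLike {
  const store = new Map<string, string>();
  return {
    async get(key) { return store.get(key) ?? null; },
    async put(key, value) { store.set(key, value); },
  };
}

async function getCached(
  kv: KVLike,
  key: string,
  loadFromOrigin: () => Promise<string>
): Promise<string> {
  const hit = await kv.get(key);
  if (hit !== null) return hit;        // served from the edge copy
  const fresh = await loadFromOrigin(); // one trip to the origin
  await kv.put(key, fresh);            // future requests stay at the edge
  return fresh;
}
```

The point of the pattern: after the first request per key, every subsequent read is answered from the edge store without touching the origin.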
// Cloudflare Worker: API with edge caching and geolocation
export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    // Geolocation-based routing
    const country = request.cf?.country || 'US';

    // Cache API responses at the edge
    const cacheKey = new Request(url.toString(), request);
    const cache = caches.default;

    let response = await cache.match(cacheKey);
    if (response) {
      return response; // Cache HIT: respond from edge, no origin call
    }

    // Cache MISS: fetch from origin
    if (url.pathname.startsWith('/api/blog')) {
      const posts = await env.DB.prepare(
        'SELECT id, title, excerpt FROM posts WHERE region = ? ORDER BY created_at DESC LIMIT 20'
      ).bind(country === 'IN' ? 'asia' : 'global').all();

      response = new Response(JSON.stringify(posts.results), {
        headers: {
          'Content-Type': 'application/json',
          'Cache-Control': 'public, max-age=300', // Cache for 5 minutes
        },
      });

      // Store in edge cache
      await cache.put(cacheKey, response.clone());
      return response;
    }

    return new Response('Not found', { status: 404 });
  },
};

// wrangler.toml configuration
// name = "ink-api"
// main = "src/index.ts"
// compatibility_date = "2026-03-01"
//
// [[d1_databases]]
// binding = "DB"
// database_id = "abc-123"
Vercel Edge Functions & Next.js Middleware
Vercel Edge Functions run at the edge (Cloudflare Workers under the hood) and integrate natively with Next.js. They are used for: middleware (auth, redirects, A/B testing before page rendering), API routes (fast responses from the nearest edge location), and streaming (SSR streaming from the edge).
In Next.js, Edge Functions are activated by exporting a runtime configuration. Middleware (middleware.ts in the project root) automatically runs on the edge for every request.
// Next.js Edge API Route
// app/api/hello/route.ts
export const runtime = 'edge'; // Run on the edge network

export async function GET(request: Request) {
  const { searchParams } = new URL(request.url);
  const name = searchParams.get('name') || 'World';
  return new Response(
    JSON.stringify({ message: `Hello, ${name}! Served from the edge.` }),
    { headers: { 'Content-Type': 'application/json' } }
  );
}

// Next.js Middleware (always runs on edge)
// middleware.ts
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';

export function middleware(request: NextRequest) {
  // A/B testing at the edge
  const bucket = request.cookies.get('ab-bucket')?.value;
  if (!bucket) {
    const newBucket = Math.random() < 0.5 ? 'control' : 'variant';
    // Set the cookie on whichever response we return, so the
    // assignment persists for variant visitors too
    const response =
      newBucket === 'variant'
        ? NextResponse.rewrite(new URL('/variant-home', request.url))
        : NextResponse.next();
    response.cookies.set('ab-bucket', newBucket, { maxAge: 60 * 60 * 24 * 30 });
    return response;
  }
  if (bucket === 'variant') {
    return NextResponse.rewrite(new URL('/variant-home', request.url));
  }
  return NextResponse.next();
}
Deno Deploy: The Full-Stack Edge Platform
Deno Deploy is Deno's edge platform — it runs Deno (not Node.js) on a global network. It supports TypeScript natively, has built-in KV storage (Deno KV), and uses web standard APIs. The framework Deno Fresh is built specifically for this platform.
Deno Deploy's unique advantage: Deno KV is a globally distributed, strongly consistent key-value store with zero configuration. It supports atomic transactions and secondary indexes — making it more capable than Workers KV.
// Deno Deploy: API with Deno KV
import { Hono } from 'https://deno.land/x/hono/mod.ts';

const app = new Hono();
const kv = await Deno.openKv(); // Globally distributed KV store

app.get('/api/posts', async (c) => {
  const entries = kv.list({ prefix: ['posts'] });
  const posts = [];
  for await (const entry of entries) {
    posts.push(entry.value);
  }
  return c.json(posts);
});

app.post('/api/posts', async (c) => {
  const body = await c.req.json();
  const id = crypto.randomUUID();

  // Atomic transaction: write the post and bump a counter together
  await kv.atomic()
    .set(['posts', id], { ...body, id, createdAt: new Date().toISOString() })
    .sum(['stats', 'postCount'], 1n) // Atomic counter
    .commit();

  return c.json({ id }, 201);
});

Deno.serve(app.fetch);
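Secondary indexes in Deno KV are not automatic: you write one extra entry per index inside the same atomic commit, so the index can never drift out of sync with the record. This pure helper sketches one possible key layout (the key names and `Post` shape are illustrative, not a Deno API):

```typescript
// Sketch of a Deno KV key layout with two secondary indexes.
// Each entry here would be one .set() call in a single kv.atomic() commit.
type Post = { id: string; slug: string; authorId: string };

function kvEntriesFor(post: Post): Array<{ key: string[]; value: unknown }> {
  return [
    { key: ['posts', post.id], value: post },                             // primary record
    { key: ['posts_by_slug', post.slug], value: post.id },                // lookup by slug
    { key: ['posts_by_author', post.authorId, post.id], value: post.id }, // list by author
  ];
}

// In a real Deno Deploy handler the entries would be committed together:
//   let tx = kv.atomic();
//   for (const { key, value } of kvEntriesFor(post)) tx = tx.set(key, value);
//   await tx.commit();
```

Reading by slug is then a point lookup on `['posts_by_slug', slug]` to get the id, followed by a lookup on `['posts', id]`.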
When to Choose Each Platform
Choose by asking: "What is my primary constraint?" If latency → Edge (Cloudflare Workers, Vercel Edge). If cost-efficiency → Serverless (Lambda). If flexibility → Containers (Docker/K8s).
Edge is perfect for: authentication checks, personalization, A/B testing, API routing, geolocation-based content, rate limiting, and bot protection. These are lightweight operations that benefit from being close to the user.
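Rate limiting is a good example of such a lightweight operation. A fixed-window counter, sketched as a plain in-memory class; note that a production Worker would keep the counters in a Durable Object or KV rather than isolate memory, since isolates are ephemeral:

```typescript
// Fixed-window rate limiter: at most `limit` requests per client per window.
// In-memory Map is a sketch; real edge deployments need shared state.
class RateLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(private limit: number, private windowMs: number) {}

  allow(clientId: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(clientId);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // First request of a fresh window: reset the counter
      this.counts.set(clientId, { windowStart: now, count: 1 });
      return true;
    }
    entry.count++;
    return entry.count <= this.limit;
  }
}
```

The handler would call `allow(clientIp)` before doing any work and return a 429 on `false`, so abusive traffic is rejected at the edge without ever reaching the origin.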
Serverless is perfect for: webhooks, scheduled jobs, file processing, API backends with variable traffic, and any workload that is idle most of the time.
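A serverless webhook handler is just an exported async function. A minimal sketch; the `ApiEvent`/`ApiResult` types below are a simplified stand-in for API Gateway's proxy-integration shapes, not the real AWS types:

```typescript
// Minimal Lambda-style webhook handler.
// Event and result types are simplified sketches of the API Gateway shapes.
type ApiEvent = { body: string | null };
type ApiResult = { statusCode: number; body: string };

export async function handler(event: ApiEvent): Promise<ApiResult> {
  if (!event.body) {
    return { statusCode: 400, body: JSON.stringify({ error: 'empty body' }) };
  }
  const payload = JSON.parse(event.body);
  // ... the event-driven work goes here (enqueue a job, write to a DB, etc.) ...
  return {
    statusCode: 200,
    body: JSON.stringify({ received: payload.type ?? 'unknown' }),
  };
}
```

Because the function is stateless and only runs when an event arrives, you pay nothing while it is idle, which is exactly the economics these workloads want.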
Containers are perfect for: complex backends with many dependencies, long-running processes, WebSocket servers, ML inference, and workloads that need full Node.js/Python/Java runtime.
Key Takeaways
Edge computing is the hottest deployment target in 2026. Cloudflare Workers, Vercel Edge Functions, and Deno Deploy offer sub-50ms response times globally. The trade-off: limited runtime (Web APIs only, no Node.js modules like fs).
For interviews: explain the difference between edge, serverless, and containers. Discuss cold start mitigation, edge caching strategies, and when each platform is appropriate. Know that Cloudflare Workers use V8 isolates (not containers) for instant starts.
The future is hybrid: edge for latency-sensitive logic, serverless for event-driven tasks, containers for complex backends. Most production apps use all three.