Ink&Horizon
DevOps

Edge Computing & Serverless: Vercel, Cloudflare Workers & Deno Deploy

Deploy globally in milliseconds — edge functions, serverless architecture, cold starts, and when to use each platform

2026-01-15 · 22 min read
Contents

  • Edge vs Serverless vs Containers: The Decision Matrix
  • Cloudflare Workers: The Edge Platform
  • Vercel Edge Functions & Next.js Middleware
  • Deno Deploy: The Full-Stack Edge Platform
  • When to Choose Each Platform
  • Key Takeaways

Edge vs Serverless vs Containers: The Decision Matrix

There are three computing paradigms in 2026, each with distinct trade-offs. Containers (Docker/K8s): full control, any language, any dependency, long-running processes. You manage scaling and infrastructure. Best for: complex backends, databases, ML inference.

Serverless (AWS Lambda, Google Cloud Functions): auto-scaling, pay-per-invocation, cold starts (100-500ms). Supports most languages and npm packages. Best for: event-driven tasks, API endpoints with variable traffic.

Edge (Cloudflare Workers, Vercel Edge, Deno Deploy): runs in 300+ locations worldwide, sub-50ms responses everywhere, V8 isolates (not containers). Limited runtime (no Node.js APIs like fs). Best for: auth, personalization, A/B testing, API routing, geolocation.

Key Takeaways

Edge: 300+ locations, sub-50ms globally, limited runtime (Web APIs only).
Serverless: auto-scale, cold starts (100-500ms), full Node.js runtime.
Containers: full control, any language, you manage scaling.
Edge for latency-sensitive, serverless for variable-traffic, containers for complex.

Cloudflare Workers: The Edge Platform

Cloudflare Workers run JavaScript/TypeScript on Cloudflare's global network (300+ cities). They use V8 isolates (the same engine that powers Chrome) instead of containers, starting in under 1ms compared with the 100-500ms cold starts typical of Lambda. That makes them among the fastest serverless platforms available.

Workers support the Web Standard APIs (fetch, Request, Response, crypto, URL) but NOT Node.js APIs (fs, path, child_process). They have a 128MB memory limit and up to 30 seconds of CPU time per request on paid plans (the free plan allows far less).

The Cloudflare ecosystem: Workers KV (key-value storage at the edge), D1 (SQLite database at the edge), R2 (S3-compatible object storage), and Durable Objects (stateful edge computing).

Snippet
// Cloudflare Worker: API with edge caching and geolocation
export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    
    // Geolocation-based routing
    const country = request.cf?.country || 'US';
    
    // Cache API responses at the edge
    const cacheKey = new Request(url.toString(), request);
    const cache = caches.default;
    let response = await cache.match(cacheKey);
    
    if (response) {
      return response; // Cache HIT — respond from edge, no origin call
    }
    
    // Cache MISS — fetch from origin
    if (url.pathname.startsWith('/api/blog')) {
      const posts = await env.DB.prepare(
        'SELECT id, title, excerpt FROM posts WHERE region = ? ORDER BY created_at DESC LIMIT 20'
      ).bind(country === 'IN' ? 'asia' : 'global').all();
      
      response = new Response(JSON.stringify(posts.results), {
        headers: { 
          'Content-Type': 'application/json',
          'Cache-Control': 'public, max-age=300', // Cache 5 min
        },
      });
      
      // Store in edge cache
      await cache.put(cacheKey, response.clone());
      return response;
    }
    
    return new Response('Not found', { status: 404 });
  },
};

// wrangler.toml — Configuration
// name = "ink-api"
// main = "src/index.ts"
// compatibility_date = "2026-03-01"
// 
// [[d1_databases]]
// binding = "DB"
// database_id = "abc-123"

Key Takeaways

V8 isolates: < 1ms cold start (vs 100-500ms for Lambda containers).
request.cf: built-in geolocation data (country, city, timezone).
Cache API: edge-level caching without CDN configuration.
D1: SQLite at the edge — full SQL, globally distributed.
wrangler: Cloudflare's CLI for developing and deploying Workers.
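Workers KV from the ecosystem above is a natural fit for the lightweight edge rate limiting mentioned later in this article. Below is a minimal fixed-window sketch; the `KVLike` interface and the key scheme are illustrative, not Cloudflare APIs, and because Workers KV is eventually consistent and get-then-put is not atomic, the count is approximate (Durable Objects are the tool for exact limits).

```typescript
// Sketch of a fixed-window rate limiter suited to the edge.
// KVLike mirrors the subset of the Workers KV binding we need;
// the interface name and key layout are illustrative assumptions.
interface KVLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
}

async function isAllowed(
  kv: KVLike,
  ip: string,
  limit = 100,     // max requests per window
  windowSecs = 60, // window length in seconds
): Promise<boolean> {
  // All requests in the same window share one counter key
  const windowId = Math.floor(Date.now() / 1000 / windowSecs);
  const key = `rl:${ip}:${windowId}`;
  const count = parseInt((await kv.get(key)) ?? '0', 10);
  if (count >= limit) return false;
  // Note: get-then-put is not atomic, so the limit is approximate
  await kv.put(key, String(count + 1), { expirationTtl: windowSecs * 2 });
  return true;
}
```

Inside a Worker, `kv` would be the KV namespace binding from `env`, and a blocked request would return a `429` response before ever reaching the origin.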

Vercel Edge Functions & Next.js Middleware

Vercel Edge Functions run at the edge (Cloudflare Workers under the hood) and integrate natively with Next.js. They are used for: middleware (auth, redirects, A/B testing before page rendering), API routes (fast responses from the nearest edge location), and streaming (SSR streaming from the edge).

In Next.js, Edge Functions are activated by exporting a runtime configuration. Middleware (middleware.ts in the app root) automatically runs on the edge for every request.

Snippet
// Next.js Edge API Route
// app/api/hello/route.ts
export const runtime = 'edge'; // Run on the edge network

export async function GET(request: Request) {
  const { searchParams } = new URL(request.url);
  const name = searchParams.get('name') || 'World';
  
  return new Response(
    JSON.stringify({ message: `Hello, ${name}! Served from the edge.` }),
    { headers: { 'Content-Type': 'application/json' } }
  );
}

// Next.js Middleware (always runs on edge)
// middleware.ts
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';

export function middleware(request: NextRequest) {
  // A/B Testing at the edge
  const bucket = request.cookies.get('ab-bucket')?.value;
  
  if (!bucket) {
    const newBucket = Math.random() < 0.5 ? 'control' : 'variant';
    // Set the cookie on whichever response we return, so the
    // variant assignment also persists across requests
    const response =
      newBucket === 'variant'
        ? NextResponse.rewrite(new URL('/variant-home', request.url))
        : NextResponse.next();
    response.cookies.set('ab-bucket', newBucket, { maxAge: 60 * 60 * 24 * 30 });
    return response;
  }
  
  if (bucket === 'variant') {
    return NextResponse.rewrite(new URL('/variant-home', request.url));
  }
  
  return NextResponse.next();
}
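By default, middleware runs on every request, including requests for static assets. Next.js supports a `config.matcher` export in `middleware.ts` to restrict which paths trigger it; the path pattern below is an illustrative example, not taken from the article's app.

```typescript
// middleware.ts (continued) — limit middleware to page routes,
// skipping Next.js internals and static files
export const config = {
  matcher: [
    // Run on everything except _next internals, image optimizer, and favicon
    '/((?!_next/static|_next/image|favicon.ico).*)',
  ],
};
```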

Deno Deploy: The Full-Stack Edge Platform

Deno Deploy is Deno's edge platform: it runs Deno (not Node.js) on a global network. It supports TypeScript natively, has built-in KV storage (Deno KV), and uses web standard APIs. Fresh, Deno's web framework, is built specifically for this platform.

Deno Deploy's standout feature is Deno KV: a globally distributed, strongly consistent key-value store with zero configuration. It supports atomic transactions and makes secondary indexes easy to maintain via multi-key atomic writes, which puts it ahead of Workers KV for transactional workloads.

Snippet
// Deno Deploy: API with Deno KV
import { Hono } from 'https://deno.land/x/hono/mod.ts';

const app = new Hono();
const kv = await Deno.openKv(); // Globally distributed KV store

app.get('/api/posts', async (c) => {
  const entries = kv.list({ prefix: ['posts'] });
  const posts = [];
  for await (const entry of entries) {
    posts.push(entry.value);
  }
  return c.json(posts);
});

app.post('/api/posts', async (c) => {
  const body = await c.req.json();
  const id = crypto.randomUUID();
  
  // Atomic transaction
  await kv.atomic()
    .set(['posts', id], { ...body, id, createdAt: new Date().toISOString() })
    .sum(['stats', 'postCount'], 1n) // Atomic counter
    .commit();
  
  return c.json({ id }, 201);
});

Deno.serve(app.fetch);

Key Takeaways

Deno Deploy: TypeScript native, web standard APIs, global edge network.
Deno KV: globally distributed, strongly consistent, atomic transactions.
No package.json — imports from URLs (or import maps).
Hono: ultra-light web framework that works on every edge platform.
Deno.serve: built-in HTTP server, no Express/Fastify needed.
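A secondary index in Deno KV is simply a second key written in the same atomic transaction as the primary record. The sketch below shows the key layout; the `posts_by_author` prefix is an illustrative convention of this example, not a built-in Deno KV feature.

```typescript
// Sketch: primary record plus a secondary index key for Deno KV.
// Primary key:   ['posts', id]
// Secondary key: ['posts_by_author', author, id]  (illustrative naming)
function postKeys(post: { id: string; author: string }): [string[], string[]] {
  return [
    ['posts', post.id],
    ['posts_by_author', post.author, post.id],
  ];
}

// In a Deno Deploy handler (assumes `const kv = await Deno.openKv()`):
//   const [primary, secondary] = postKeys(post);
//   await kv.atomic().set(primary, post).set(secondary, post).commit();
//
// Listing one author's posts then becomes a prefix scan:
//   for await (const e of kv.list({ prefix: ['posts_by_author', 'ada'] })) {
//     console.log(e.value);
//   }
```

Because both keys are written in one `kv.atomic()` commit, the index can never drift out of sync with the primary record.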

When to Choose Each Platform

Choose by asking: "What is my primary constraint?" If latency → Edge (Cloudflare Workers, Vercel Edge). If cost-efficiency → Serverless (Lambda). If flexibility → Containers (Docker/K8s).

Edge is perfect for: authentication checks, personalization, A/B testing, API routing, geolocation-based content, rate limiting, and bot protection. These are lightweight operations that benefit from being close to the user.

Serverless is perfect for: webhooks, scheduled jobs, file processing, API backends with variable traffic, and any workload that is idle most of the time.

Containers are perfect for: complex backends with many dependencies, long-running processes, WebSocket servers, ML inference, and workloads that need full Node.js/Python/Java runtime.

Key Takeaways

Latency-sensitive? → Edge (sub-50ms globally).
Variable traffic? → Serverless (pay-per-invocation, auto-scale).
Complex runtime? → Containers (full language support, any deps).
Edge limitations: no Node.js APIs, limited CPU time, small memory.
Serverless limitations: cold starts (100-500ms), 15min timeout.
Containers limitations: you manage scaling, always running (costs).

Key Takeaways

Edge computing is the hottest deployment target in 2026. Cloudflare Workers, Vercel Edge Functions, and Deno Deploy offer sub-50ms response times globally. The trade-off: limited runtime (Web APIs only, no Node.js modules like fs).

For interviews: explain the difference between edge, serverless, and containers. Discuss cold start mitigation, edge caching strategies, and when each platform is appropriate. Know that Cloudflare Workers use V8 isolates (not containers) for instant starts.

The future is hybrid: edge for latency-sensitive logic, serverless for event-driven tasks, containers for complex backends. Most production apps use all three.

Key Takeaways

Edge: V8 isolates, < 1ms cold start, 300+ global locations.
Cloudflare Workers: D1 (SQL), KV (key-value), R2 (object storage).
Vercel Edge: native Next.js integration, middleware runs on edge.
Deno Deploy: TypeScript native, Deno KV with atomic transactions.
Hybrid architecture: edge + serverless + containers = optimal.
Key limitation: edge runtime = Web Standard APIs only, no Node.js modules.
Article Author
Ashutosh, Lead Developer
