Physics is the constraint that edge computing exists to address. Light travels roughly 200km per millisecond in fiber. A user in Sydney making a request to a server in Virginia experiences at minimum 80ms of one-way latency — 160ms round trip — before a single byte of computation has happened.
Edge computing moves compute closer to where users are. Instead of routing requests to a single origin data center, edge platforms run your code at hundreds of Points of Presence (PoPs) globally. A Sydney user's request might be handled 5ms away instead of 160ms.
But edge computing isn't always the answer. The constraints are real, and understanding them determines whether edge is a significant win or unnecessary complexity for your use case.
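The speed-of-light arithmetic above is easy to sketch yourself. A rough estimator, assuming ~200 km per millisecond in fiber and an illustrative 1.5x detour factor (real fiber paths are longer than the great-circle distance):

```javascript
// Rough fiber RTT estimate. Assumes light in fiber covers ~200 km per
// millisecond; pathFactor is an illustrative fudge for routing detours.
function estimateRttMs(greatCircleKm, pathFactor = 1.5) {
  const oneWayMs = (greatCircleKm * pathFactor) / 200;
  return 2 * oneWayMs;
}

// Sydney to Virginia is roughly 15,700 km great-circle:
console.log(estimateRttMs(15700, 1)); // 157 — theoretical minimum RTT
console.log(estimateRttMs(15700));    // 235.5 — with the detour factor
```

The detour factor is a rule of thumb, not a measurement, but it explains why observed round trips routinely exceed the theoretical floor.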
The Edge Computing Landscape
Major Edge Platforms
- Cloudflare Workers → V8 isolates, 300+ PoPs, no cold starts
- AWS Lambda@Edge → Lambda at CloudFront PoPs, heavier runtime
- AWS CloudFront Functions → Lightweight JS at CloudFront, very fast
- Vercel Edge Functions → Built on Cloudflare Workers
- Fastly Compute@Edge → WebAssembly runtime, language flexible
- Deno Deploy → V8 isolates, global Deno runtime
These platforms differ significantly in capability, runtime constraints, and pricing. Cloudflare Workers and Vercel Edge Functions are the most developer-friendly entry points.
What Edge Can and Cannot Do
Edge can:
- Serve responses directly from cache
- Transform requests and responses
- Make fast key-value lookups (KV stores)
- Run authentication and routing logic
- Personalize cached content
- A/B test at the edge
- Route based on geolocation
- Rate-limit at scale
Edge cannot (easily):
- Connect to your relational database (latency defeats the purpose)
- Maintain WebSocket connections reliably
- Run long-duration computations (limits range from milliseconds to roughly 30 seconds, depending on platform and plan)
- Access your internal network directly
- Use arbitrary Node.js packages (limited runtime)
Where Edge Genuinely Helps
Latency-Sensitive Read Operations
If your users hit an endpoint that does a database read and returns relatively static data, edge KV stores eliminate the origin round trip:
```javascript
// Cloudflare Worker: serve user profile from edge KV
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    const userId = url.pathname.split('/').pop();

    // Check edge KV first (sub-millisecond)
    const cached = await env.USER_PROFILES.get(userId, { type: 'json' });
    if (cached) {
      return Response.json(cached, {
        headers: {
          'Cache-Control': 'max-age=60',
          'X-Cache': 'HIT'
        }
      });
    }

    // Fall back to origin for cache miss
    const response = await fetch(`https://api.myapp.com/users/${userId}`, {
      headers: request.headers
    });
    if (!response.ok) return response; // don't cache errors

    const profile = await response.json();

    // Populate edge cache with a TTL; waitUntil lets the write
    // complete after the response has been returned
    ctx.waitUntil(env.USER_PROFILES.put(userId, JSON.stringify(profile), {
      expirationTtl: 300 // 5 minutes
    }));

    return Response.json(profile, {
      headers: { 'X-Cache': 'MISS' }
    });
  }
};
```
The key insight: cache population happens at the edge PoP. Subsequent requests from users near that PoP get sub-10ms responses instead of 200ms+ origin round trips.
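For reference, the USER_PROFILES binding above comes from Wrangler configuration. A minimal sketch — the worker name and namespace id are placeholders you would replace with your own:

```toml
# Sketch of a wrangler.toml for the Worker above; "profile-edge-cache"
# and the namespace id are placeholders.
name = "profile-edge-cache"
main = "src/index.js"
compatibility_date = "2024-01-01"

[[kv_namespaces]]
binding = "USER_PROFILES"
id = "<your-kv-namespace-id>"
```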
Authentication at the Edge
Validating JWTs at the edge prevents unauthenticated requests from ever reaching your origin:
```javascript
// Cloudflare Worker: JWT validation at edge
import { jwtVerify, importSPKI } from 'jose';

export default {
  async fetch(request, env) {
    // Allow public routes through
    const url = new URL(request.url);
    if (url.pathname.startsWith('/public/')) {
      return fetch(request);
    }

    // Validate JWT
    const authHeader = request.headers.get('Authorization');
    if (!authHeader?.startsWith('Bearer ')) {
      return new Response('Unauthorized', { status: 401 });
    }
    const token = authHeader.slice(7);

    try {
      const publicKey = await importSPKI(env.JWT_PUBLIC_KEY, 'RS256');
      const { payload } = await jwtVerify(token, publicKey);

      // Add user context to the forwarded request. Headers.set()
      // overwrites any X-User-Id a client tried to smuggle in.
      const headers = new Headers(request.headers);
      headers.set('X-User-Id', String(payload.sub));
      headers.set('X-User-Role', String(payload.role ?? ''));
      const modifiedRequest = new Request(request, { headers });
      return fetch(modifiedRequest);
    } catch (err) {
      return new Response('Invalid token', { status: 401 });
    }
  }
};
```
This pattern is powerful: your origin doesn't need to do JWT validation at all. The edge handles authentication, and the origin trusts the X-User-Id header. That trust is only safe if the Worker overwrites any client-supplied X-User-Id on every request and the origin accepts traffic exclusively through the edge.
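Making the origin's trust in X-User-Id explicit deserves a concrete shape. A framework-agnostic sketch, assuming a hypothetical shared secret (EDGE_SHARED_SECRET) that the edge Worker attaches and the origin verifies, so only edge-originated traffic is trusted:

```javascript
// Hypothetical origin-side guard: trust X-User-Id only when the request
// carries the shared secret the edge Worker attaches. The header name and
// secret are illustrative, not a Cloudflare convention.
function isFromEdge(headers, sharedSecret) {
  // Use a constant-time comparison in production; kept simple here.
  return sharedSecret.length > 0 && headers['x-edge-secret'] === sharedSecret;
}

function trustedUserId(headers, sharedSecret) {
  if (!isFromEdge(headers, sharedSecret)) return null;
  return headers['x-user-id'] ?? null;
}
```

On the Worker side, this just means one more `headers.set('X-Edge-Secret', env.EDGE_SHARED_SECRET)` before forwarding.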
Geolocation-Based Routing
Route users to the closest regional cluster without touching your origin:
```javascript
export default {
  async fetch(request) {
    const country = request.cf?.country;
    const continent = request.cf?.continent;

    // Route to regional origin based on location
    const origins = {
      EU: 'https://eu.api.myapp.com',
      AS: 'https://ap.api.myapp.com',
      OC: 'https://ap.api.myapp.com',
      default: 'https://us.api.myapp.com'
    };
    const origin = origins[continent] || origins.default;

    // Rewrite the hostname, keeping path and query intact
    const url = new URL(request.url);
    url.hostname = new URL(origin).hostname;

    // Copy method, headers, and body from the original request,
    // then add geo context for the origin
    const modifiedRequest = new Request(url.toString(), request);
    modifiedRequest.headers.set('X-User-Country', country ?? 'unknown');
    modifiedRequest.headers.set('X-User-Continent', continent ?? 'unknown');
    return fetch(modifiedRequest);
  }
};
```
Rate Limiting at Edge Scale
Edge-based rate limiting can absorb massive traffic spikes before they hit your origin:
```javascript
export default {
  async fetch(request, env) {
    const ip = request.headers.get('CF-Connecting-IP');
    const key = `rate_limit:${ip}`;

    // Use a Cloudflare Durable Object for consistent per-IP state;
    // check() is an RPC method you define on the Durable Object class
    const id = env.RATE_LIMITER.idFromName(key);
    const limiter = env.RATE_LIMITER.get(id);
    const { allowed, remaining } = await limiter.check(100, 60); // 100 req/min

    if (!allowed) {
      return new Response('Too Many Requests', {
        status: 429,
        headers: {
          'Retry-After': '60',
          'X-RateLimit-Remaining': '0'
        }
      });
    }

    const response = await fetch(request);
    // Clone into a mutable Response so we can attach the header
    const limited = new Response(response.body, response);
    limited.headers.set('X-RateLimit-Remaining', remaining.toString());
    return limited;
  }
};
```
A DDoS or credential stuffing attack can send millions of requests per minute. Blocking them at the edge means your origin never sees them. This is qualitatively different from rate limiting at the origin.
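The check() call above is a method you would define on the Durable Object yourself; the counting logic inside it is just a fixed-window counter. A storage-agnostic sketch of that logic (a real Durable Object would persist `state` in its transactional storage):

```javascript
// Fixed-window rate limit: allow `limit` requests per `windowSeconds`.
// `state` stands in for Durable Object storage; nowMs is injectable for tests.
function checkRateLimit(state, limit, windowSeconds, nowMs = Date.now()) {
  const windowStart = Math.floor(nowMs / 1000 / windowSeconds) * windowSeconds;
  if (state.windowStart !== windowStart) {
    // New window: reset the counter
    state.windowStart = windowStart;
    state.count = 0;
  }
  state.count += 1;
  return {
    allowed: state.count <= limit,
    remaining: Math.max(0, limit - state.count)
  };
}
```

Fixed windows permit bursts at window boundaries; sliding-window or token-bucket variants smooth that out at the cost of slightly more state.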
Content Personalization Without Origin Round Trips
A common pattern: serve a cached HTML page but personalize it at the edge based on cookies or headers:
```javascript
export default {
  async fetch(request, env) {
    // Fetch the cached page
    const response = await fetch(request);
    if (!response.ok) return response;
    const contentType = response.headers.get('Content-Type') || '';
    if (!contentType.includes('text/html')) return response;

    // Read user preferences from edge cookie
    const cookie = request.headers.get('Cookie') || '';
    const theme = parseCookie(cookie, 'theme') || 'light';
    const locale = request.cf?.country === 'DE' ? 'de' : 'en';

    // Transform HTML at edge
    return new HTMLRewriter()
      .on('html', {
        element(element) {
          element.setAttribute('data-theme', theme);
          element.setAttribute('lang', locale);
        }
      })
      .on('[data-locale-key]', {
        element(element) {
          const key = element.getAttribute('data-locale-key');
          const translation = getTranslation(locale, key); // your lookup table
          if (translation) element.setInnerContent(translation);
        }
      })
      .transform(response);
  }
};

function parseCookie(cookieHeader, name) {
  const match = cookieHeader.match(new RegExp(`(?:^|;\\s*)${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}
```
HTMLRewriter is Cloudflare's streaming HTML transformation API. It modifies HTML as it streams through, adding zero latency beyond the transformation itself.
Where Edge Computing Falls Short
Database Access
This is the most common mistake. An edge function that reads from a database in us-east-1 to serve a user in Tokyo:
User (Tokyo) → Edge PoP (Tokyo) → Database (us-east-1) → Edge PoP → User
The edge hop is now 5ms. The database round trip is still 150ms. Total latency is slightly worse than origin serving (extra hop added). You've paid for edge compute to make things slower.
Edge is only fast when it can serve from local storage (KV, caches, durable objects) or when it routes to a geographically appropriate origin. It does not make database queries faster.
Complex Business Logic
Edge runtimes are constrained:
```
Cloudflare Workers limits:
  CPU:            10ms-30s depending on plan
  Memory:         128MB
  Request size:   100MB
  Startup:        no cold start (V8 isolate)
  Node.js APIs:   not all available
  Native modules: not supported
```
Complex order processing, payment workflows, or anything requiring heavy computation belongs at the origin.
Stateful Operations
Edge functions are stateless by default. Cloudflare Durable Objects provide a form of state, but they're location-specific — which partially defeats the geo-distribution benefit.
For stateful operations (shopping carts, sessions with complex state, WebSocket connections), the origin is still the right answer.
Architecture Pattern: Edge + Origin Separation
The clean pattern is to separate what runs at edge vs origin based on data freshness requirements and latency sensitivity:
```
Edge layer:
├── Static assets (cached, max-age=31536000)
├── Authenticated routing (JWT validation)
├── Rate limiting
├── Bot detection
├── User profile data (cached, TTL=300s)
└── A/B testing assignment

Origin layer:
├── Write operations (all mutations)
├── Complex queries needing real-time data
├── Payment processing
├── Admin operations
└── Webhook receivers
```
Requests that match edge capabilities get handled there. Everything else is forwarded to origin. This is straightforward to implement and reason about.
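The dispatch logic for a split like this is mostly a routing table. A sketch of the classifier (the path prefixes are illustrative examples, not a prescribed layout):

```javascript
// Classify a request as edge-handled or origin-bound.
// Path prefixes here are illustrative, not a Cloudflare convention.
function routeAtEdge(method, pathname) {
  // All mutations go to origin
  if (method !== 'GET' && method !== 'HEAD') return 'origin';
  if (pathname.startsWith('/static/')) return 'edge-cache';
  if (pathname.startsWith('/api/profile/')) return 'edge-kv';
  if (pathname.startsWith('/admin/') || pathname.startsWith('/webhooks/')) return 'origin';
  return 'origin'; // default: anything unrecognized is forwarded
}
```

The Worker's fetch handler then switches on the classification: serve from cache or KV for the edge cases, `fetch(request)` for everything else.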
Testing and Observability at Edge
Local Testing
```shell
# Cloudflare Workers local development
npx wrangler dev
# Runs a local V8 isolate that simulates the edge environment
# Supports KV bindings with the --local flag

# Test a specific country
curl http://localhost:8787/ \
  -H "CF-IPCountry: DE"
```
Tracing Edge + Origin
Distributed tracing becomes critical when requests span edge and origin:
```javascript
// Propagate trace context from edge to origin
export default {
  async fetch(request, env) {
    const traceId = request.headers.get('X-Trace-Id') || crypto.randomUUID();
    const spanId = crypto.randomUUID().slice(0, 16);

    // Headers.set() overwrites rather than appending duplicates
    const headers = new Headers(request.headers);
    headers.set('X-Trace-Id', traceId);
    headers.set('X-Span-Id', spanId);
    headers.set('X-Edge-PoP', request.cf?.colo || 'unknown');
    headers.set('X-Edge-Country', request.cf?.country || 'unknown');
    const modifiedRequest = new Request(request, { headers });

    const start = Date.now();
    const response = await fetch(modifiedRequest);

    // Log edge performance
    console.log(JSON.stringify({
      traceId,
      edgePop: request.cf?.colo,
      country: request.cf?.country,
      durationMs: Date.now() - start,
      status: response.status,
      cacheStatus: response.headers.get('CF-Cache-Status')
    }));

    return response;
  }
};
```
Deciding When Edge Is Worth It
Strong case for edge:
- P50 latency improvement > 50ms for significant user population
- Read-heavy, cacheable data with acceptable staleness
- Authentication/routing logic running on every request
- Rate limiting needing to absorb high-volume attacks
- Geolocation-specific content delivery
Edge adds complexity without clear benefit:
- Single-region user base (geography advantage is minimal)
- Highly dynamic, non-cacheable content
- Complex business logic with many dependencies
- Teams without JavaScript/Wasm expertise
- Write-heavy workloads
Edge computing is a genuine performance tool, not just marketing. But its benefits are specific. Used where it fits — routing, caching, auth, rate limiting — it delivers measurable latency improvements. Used where it doesn't fit — as a general compute layer for complex application logic — it adds cost and complexity without the payoff.
Building something that needs to scale? We help teams architect systems that grow with their business. scopeforged.com