A South African edge hosting platform that started with static site deployment and grew into a full e-commerce engine — all running on Cloudflare's global network.
Wranglr started as an edge hosting platform. The original idea was simple: let anyone deploy a static website to Cloudflare R2 and have it served instantly from 300+ global locations. No server, no SSH, no complicated configs — just upload and your site is live worldwide.
The platform evolved from there. Static hosting led to a Dispatcher Worker that routes traffic based on custom domains. The Dispatcher led to A/B testing and dynamic theming at the edge. And the theming engine made it possible to build multi-tenant e-commerce — one shared storefront that looks different for every merchant.
Today, Wranglr handles two types of sites: static sites (HTML/CSS/JS deployed via CLI to R2) and e-commerce stores (a shared React SPA with products, checkout, and order management). Both run on the same edge infrastructure.
Deploy any static site to the edge. Files stored in R2, served by the Dispatcher from the nearest PoP. SPA fallback, custom 404s, and A/B testing built in.
A blueprint called "ecommerce-store" — one shared React app with products, cart, checkout, and order management. Theming injected by the Dispatcher.
South African payment processing built in. Yoco Checkout API integration with HMAC-verified webhooks. Money goes directly to the merchant.
For static sites, a visual editor injected via HTMLRewriter. For e-commerce, a drag-and-drop page builder with section types (hero, product grid, trust badges, FAQ).
Understanding this distinction is critical — the Dispatcher handles both types differently:
// User deploys via CLI or dashboard
// Files uploaded to R2 bucket:
BUCKET/{userId}/{siteId}/index.html
BUCKET/{userId}/{siteId}/style.css
BUCKET/{userId}/{siteId}/app.js
// Dispatcher serves directly from R2
// with SPA fallback + custom 404
// + A/B variant path prefix
// + visual builder injection
No per-store files. The shared React storefront is deployed once to Cloudflare Pages (wranglr-stores.pages.dev).
The Dispatcher proxies requests to the storefront, passing the hostname. The storefront reads the hostname, calls the API to resolve a site_id, and fetches that store's products/settings.
The blueprint_id column in D1 tells the Dispatcher which path to take.
Wranglr competes differently depending on the use case — as an edge hosting platform and as an e-commerce platform:
| Feature | Wranglr | Shopify | Netlify/Vercel |
|---|---|---|---|
| Static Site Hosting | ✓ R2 + Dispatcher | × Not supported | ✓ Core feature |
| E-Commerce | ✓ Built-in blueprint | ✓ Core feature | × Needs CMS/Snipcart |
| SA Payment (Yoco) | ✓ Native | Via third-party apps | × Build your own |
| Edge Network | 300+ PoPs (incl. JHB, CPT, DBN) | Canada/US primarily | Edge functions available |
| A/B Testing | ✓ At the Dispatcher level | × Needs apps | ✓ Edge middleware |
| Visual Builder | ✓ HTMLRewriter injection | ✓ Theme editor | × Code only |
| Pricing Currency | ZAR (Rands) | USD | USD |
| Multi-Tenant | ✓ Shared engine | Isolated per store | Per-project |
Unlike Shopify where each store is a separate application, Wranglr runs one storefront engine for all merchants. Think of it like Gmail — there's one Gmail app, but it shows different emails depending on who's logged in. Wranglr shows a different store depending on which domain you visit. For static sites, each site has its own files in R2, but they all share the same Dispatcher Worker that handles routing, theming, and A/B testing. This is called multi-tenancy.
Every app has characters — the main actors that each play a specific role. Here are Wranglr's.
The merchant control panel — products, orders, settings, page builder.
The brain — auth, data, payments, uploads.
The traffic controller — routes every request to the right place.
One shared React app that renders as any merchant's store.
D1 = SQL database. R2 = object/file storage.
Why running code at the edge is fundamentally different — and better — than running on a traditional server. This is the core innovation that makes Wranglr possible.
In a traditional web app, you rent a server — usually a virtual machine in a data center in the US or EU. Let's say you pick AWS us-east-1 (Virginia, USA). Here's what happens when a customer in Johannesburg visits your store:
Browser resolves your domain. DNS points to your server in Virginia. ~50ms
Browser establishes a connection across the Atlantic Ocean and back. Light through undersea fiber from JHB to Virginia takes ~130ms one way, so one round trip is ~260ms. The TCP three-way handshake spends 1.5 round trips = ~390ms.
The TLS handshake (TLS 1.2) needs another 2 round trips to establish encryption. That's 2 × 260ms = ~520ms more.
Your Node.js/PHP app wakes up, queries the database, renders HTML. If it's a cold start (nobody visited recently), the runtime needs to initialize: 500-2000ms on Lambda, 100-500ms on a VM.
The HTML response crosses the Atlantic again. ~130ms
And that's just the HTML. The browser then needs to download CSS, JavaScript, images — each requiring its own round trip across the Atlantic. A typical e-commerce page loads in 3-8 seconds from South Africa on a traditional US-hosted server.
Now here's what happens on Wranglr when that same Johannesburg customer visits the same store:
DNS resolves to Cloudflare, which uses Anycast routing to send the request to the nearest data center. In JHB, that's Cloudflare's Johannesburg PoP. ~5ms
The TCP and TLS handshake happens with a server ~2ms away (Johannesburg data center), not 130ms away. Total: ~10ms instead of ~910ms.
The Dispatcher Worker is already running in Johannesburg. No cold start. It queries D1 (read replicas are global) and proxies to the storefront. ~5ms
Response travels ~2ms back to the customer.
That's 50-150x faster than a traditional US-hosted server. Not because the code is faster — because the distance is shorter. Physics wins.
Speed of light through fiber optic cable is ~200,000 km/s. These are the physical distances and round-trip times for a South African customer:
| Server Location | Distance from JHB | Light RTT | Real RTT | With Edge? |
|---|---|---|---|---|
| Johannesburg (CF Edge) | ~0 km | ~0ms | 2-5ms | ✓ Wranglr uses this |
| Cape Town (CF Edge) | ~1,400 km | ~14ms | 15-25ms | ✓ Falls back to this |
| London (AWS eu-west-1) | ~9,000 km | ~90ms | 150-180ms | ✗ Traditional hosting |
| Frankfurt (AWS eu-central-1) | ~8,500 km | ~85ms | 140-170ms | ✗ Traditional hosting |
| Virginia (AWS us-east-1) | ~13,000 km | ~130ms | 200-280ms | ✗ Shopify uses this |
| Oregon (AWS us-west-2) | ~16,500 km | ~165ms | 250-350ms | ✗ Many SaaS platforms |
A 250ms round-trip time sounds small. But a typical page load involves 20-50 HTTP requests (HTML, CSS, JS, images, API calls). Fully serialized, that's 250ms × 50 = 12.5 seconds of pure latency; browsers parallelize requests, but dependency chains (HTML before CSS, CSS before render) still compound round trips. At the edge, the same worst case is 5ms × 50 = 250ms total. Google research shows 53% of mobile users abandon sites that take longer than 3 seconds to load.
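Sketched as arithmetic (a deliberately pessimistic model in which every request waits for the previous one; the numbers are the illustrative figures from the text, not measurements):

```typescript
// Worst-case latency model: total = RTT x request count,
// i.e. every request blocks on the previous one.
function totalLatencyMs(rttMs: number, requestCount: number): number {
  return rttMs * requestCount;
}

console.log(totalLatencyMs(250, 50)); // 12500 ms from a US-hosted server
console.log(totalLatencyMs(5, 50));   // 250 ms at the edge
```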
A cold start is when a serverless function hasn't been called recently, so the cloud provider needs to boot it up. Here's how different platforms compare:
| Platform | Cold Start Time | When It Happens | Impact |
|---|---|---|---|
| Cloudflare Workers | 0ms — no cold starts | Never | Every request is instant |
| AWS Lambda (Node.js) | 100-500ms | After ~15min idle | First visitor after quiet period waits |
| AWS Lambda (Java) | 500-5,000ms | After ~15min idle | Can add 5 seconds to load |
| Google Cloud Run | 300-2,000ms | After idle period | Scale-to-zero has a cost |
| Vercel Edge Functions | ~0ms (uses Workers) | Never | Also edge-based, minimal |
| Traditional VM (EC2) | 0ms (always running) | Never (always paying) | You pay 24/7 whether busy or not |
Why are Cloudflare Workers cold-start-free? Because they use V8 Isolates — tiny, sandboxed environments that spin up in under 1 millisecond. Compare this to Lambda, which boots an entire Node.js runtime in a container. It's like the difference between opening a new browser tab (instant) versus booting a new computer (slow).
The Dispatcher is the single most important piece of Wranglr. It runs on every one of Cloudflare's 300+ edge locations and handles every single request to any Wranglr-hosted domain. Here's what it does, step by step:
// 1. Read the incoming hostname
const hostname = request.headers
  .get('host')?.toLowerCase() ?? '';

// 2. Bypass for the admin portal
if (hostname === 'admin.wranglr.co.za')
  return fetch(request);

// 3. Query D1 for the site record
//    (join keys inferred from the schema
//    described elsewhere in this article)
const site = await env.DB.prepare(`
  SELECT s.id, s.blueprint_id, s.user_id,
         u.subscription_expiry,
         ss.theme_colors, ss.font_family
  FROM sites s
  JOIN users u ON u.id = s.user_id
  LEFT JOIN site_settings ss ON ss.site_id = s.id
  WHERE s.domain_name = ?`
).bind(hostname).first();

// 4. Check the subscription
if (new Date(site.subscription_expiry) < new Date())
  return expiredPage;

// 5. Route based on blueprint
if (site.blueprint_id === 'ecommerce-store') {
  // Proxy to the shared storefront
  targetUrl.hostname = 'wranglr-stores.pages.dev';
  return applyAesthetics(await fetch(proxyReq), site);
} else {
  // Serve static files from R2
  const obj = await env.BUCKET
    .get(`${site.user_id}/${site.id}/..`);
  return serveObject(obj, site);
}
Step 1: Read which domain the customer is visiting (e.g., "ronins-gear.co.za").
Step 2: If it's the admin portal, don't intercept — let Cloudflare Pages handle it directly.
Step 3: Query the database: "Who owns this domain? What type of site is it? What does it look like? Is their subscription still valid?" This single query joins 3 tables to get everything needed in one shot.
Step 4: If the merchant's subscription has expired, show a "subscription expired" page. The store is gated behind a paywall — not accessible for free.
Step 5a: If it's an ecommerce store, proxy the request to the shared storefront React app, then use HTMLRewriter to paint the merchant's brand on top.
Step 5b: If it's a static site (uploaded via CLI), fetch the file directly from R2 object storage and serve it.
This is the most clever part of Wranglr's edge architecture. Every store on Wranglr shares the exact same React application. But each store looks completely different — different colors, fonts, logos, and layouts. How?
The answer is HTMLRewriter — a streaming HTML transformer that runs at the edge. Think of it like a painter standing between the factory and the delivery truck, painting each product a different color as it comes off the assembly line.
function applyAesthetics(response, site) {
let cssVars = ':root {\n';
cssVars += ` --primary: ${site.primary};\n`;
cssVars += ` --bg: ${site.background};\n`;
cssVars += ` --surface: ${site.surface};\n`;
cssVars += '}\n';
return new HTMLRewriter()
.on('head', {
element(el) {
el.append(
`<style>${cssVars}</style>`,
{ html: true }
);
}
})
.on('title', {
element(el) {
el.setInnerContent(
site.store_name
);
}
})
.transform(response);
}
Build the CSS: Create a set of CSS variables from the merchant's stored settings — primary color, background color, surface color. These are the "paint buckets."
Intercept the <head>: As the HTML streams by, when the <head> tag appears, inject a <style> block containing the merchant's CSS variables. The storefront React app reads these variables to color its components.
Intercept the <title>: When the <title> tag streams by, replace "Wranglr Store" with "Ronin's Gear" so the browser tab shows the merchant's brand.
Key insight: The React app code is IDENTICAL for every store. The only thing that changes is the CSS variables injected by HTMLRewriter. One codebase → 10,000 unique-looking stores.
The edge isn't just faster — it changes the economics and reliability of running a platform:
Traditional: Buy faster servers. Edge: Move closer to users. You can't buy a faster speed of light, but you CAN reduce the distance to zero. A JHB customer to a JHB edge node = 2ms. To Virginia = 260ms. That's 130x faster — for free.
Traditional server: R1,500-15,000/month whether you have 0 or 100,000 visitors. Workers: R0 for the first 100,000 requests/day, then R5 per million. For small merchants, Wranglr's infrastructure cost per store is nearly zero.
Traditional: Server hits 100% CPU → crash. Need to pre-provision capacity for Black Friday. Workers: Each request gets its own isolate. 1 request or 10 million — the platform handles it identically. No capacity planning needed.
Traditional: Server goes down → site is dead. Edge: If Johannesburg's data center burns to the ground, Cape Town's picks up automatically. Then London. Then Frankfurt. There's no single machine to fail.
Cloudflare absorbs DDoS attacks at the edge before they reach your code. Traditional servers can be overwhelmed. Workers are protected by the same network that guards 20% of all websites on Earth.
Traditional: Your app is in Virginia. Full stop. Want to go global? Buy servers in 5+ regions, set up load balancers, manage databases in each. Workers: Deploy once → live in 300+ cities instantly.
Edge computing isn't magic. There are real constraints that shaped Wranglr's architecture:
Workers get about 10ms of CPU time per request on the free plan and up to 30 seconds on paid plans. You can't run heavy computation like image processing or AI inference. Wranglr solves this by keeping Workers lightweight — they just route, query, and proxy.
D1 reads are globally replicated and fast. But writes go to one primary database. If a merchant in JHB writes, it might hit a primary in the US. This is fine for admin operations (saving products) but wouldn't work for real-time gaming.
Workers, D1, R2, Pages — this is all Cloudflare. If Cloudflare doubled their prices tomorrow, migrating to AWS would require significant rewrites. This is the biggest strategic trade-off.
Wranglr's workload is read-heavy — customers browse, view products, read pages. Writes are rare (placing orders, saving settings). This is PERFECT for edge computing. A different app (like a real-time chat or a spreadsheet) might not benefit as much from the edge.
Trace what happens from the moment a customer clicks "Add to Cart" to when the order is confirmed.
Imagine a customer on a Wranglr store adds a T-shirt to their cart and clicks checkout. The data travels through multiple actors — let's trace it.
Every storefront API call includes a siteId — the unique identifier for that merchant's store. This is how one API serves thousands of stores without mixing up data. Think of it like a hotel concierge who asks for your room number before doing anything.
const segments = path.split('/');
const siteId = segments[0];
const resource = segments[1];
// Verify site exists
const site = await env.DB.prepare(`
SELECT s.id FROM sites s
WHERE s.id = ? AND s.status != 'deleted'
`).bind(siteId).first();
Split the URL path to extract the store ID and what resource they want (products, orders, settings...)
The first segment is always the store ID — like the "room number" at a hotel.
Before doing anything, verify this store actually exists and isn't deleted. This prevents any store from accessing another store's data.
Wranglr uses two types of storage — each optimized for a different job:
D1 stores structured data: users, products, orders, site settings. Think of it as organized spreadsheets. Reads are replicated globally — the Dispatcher in Durban can query it instantly.
R2 stores files: product images, uploaded static sites, KYC documents. Think of it as a massive filing cabinet. The key feature? Zero egress fees — downloads never cost extra.
You wouldn't store a family photo in a spreadsheet, and you wouldn't store an order history as a loose file. D1 handles relationships (which products belong to which store), while R2 handles blobs (images, files). When you tell AI to add a feature, knowing whether it needs structured data (D1) or file storage (R2) saves design mistakes.
How money flows from customer to merchant — and why Wranglr never touches the money itself.
Here's the clever part: Wranglr doesn't process payments. Each merchant has their own Yoco account with their own API keys. When a customer pays, the money goes directly from customer → Yoco → merchant's bank. Wranglr is just the plumbing that connects them.
Think of it like a shopping mall: the mall provides the building and foot traffic, but each shop has its own cash register. Wranglr provides the storefront and checkout UI, but each merchant's own Yoco handles the money.
The storefront calls the API with the order total and merchant's site ID.
Each merchant's Yoco keys are stored securely in D1. The API grabs the right one.
A temporary payment page is generated with the merchant's branding.
Card details never touch Wranglr's servers. Yoco handles PCI compliance.
Yoco calls Wranglr's webhook endpoint saying "payment succeeded."
Wranglr checks the signature, updates the order to "paid," deducts inventory.
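Steps 1-3 above can be sketched as follows. The endpoint and amount-in-cents convention follow Yoco's public Checkout API, but treat the exact payload shape as an assumption to verify against Yoco's docs; the URLs and order ID are illustrative:

```typescript
// Sketch: build a Yoco checkout request for an order.
// Field names follow Yoco's Checkout API; verify against current docs.
interface CheckoutRequest {
  amount: number;      // in cents: R150.00 -> 15000
  currency: 'ZAR';
  successUrl: string;
  cancelUrl: string;
}

function buildCheckout(totalRands: number, orderId: string): CheckoutRequest {
  return {
    amount: Math.round(totalRands * 100),
    currency: 'ZAR',
    successUrl: `https://example.com/orders/${orderId}/success`, // illustrative
    cancelUrl: `https://example.com/orders/${orderId}/cancel`,   // illustrative
  };
}

// The API Worker would then POST this with the merchant's own secret key:
// await fetch('https://payments.yoco.com/api/checkouts', {
//   method: 'POST',
//   headers: { Authorization: `Bearer ${merchantSecretKey}`,
//              'Content-Type': 'application/json' },
//   body: JSON.stringify(buildCheckout(150, 'order-1')),
// });
```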
When Yoco says "payment succeeded," how does Wranglr know it's really Yoco and not someone faking the request? Through HMAC signature verification — like a secret handshake.
const signature = request.headers
.get('X-Yoco-Signature');
if (!await verifyYocoSignature(
body, signature, env.YOCO_WEBHOOK_SECRET
)) {
return new Response(
'Invalid signature', { status: 401 }
);
}
Grab the digital signature Yoco attached to this webhook request — like checking a wax seal on a letter.
Re-create the signature using our shared secret and compare it to Yoco's. If they don't match, reject the request immediately.
This prevents attackers from sending fake "payment succeeded" webhooks to steal products.
Wranglr uses PBKDF2 with 100,000 iterations for password hashing — plus constant-time comparison to prevent timing attacks.
If you ever ask AI to implement authentication, make sure it uses proper hashing (bcrypt, scrypt, or PBKDF2) — never plain text or simple MD5. And always use constant-time comparison for password verification. Wranglr gets this right.
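Constant-time comparison is worth seeing concretely. A minimal sketch, same idea as Node's crypto.timingSafeEqual (a naive `a === b` can exit at the first mismatched character, leaking how many leading bytes matched through response timing):

```typescript
// Sketch: compare two equal-length strings without early exit.
function timingSafeEqualHex(a: string, b: string): boolean {
  if (a.length !== b.length) return false;
  let diff = 0;
  for (let i = 0; i < a.length; i++) {
    diff |= a.charCodeAt(i) ^ b.charCodeAt(i); // accumulate, never break early
  }
  return diff === 0;
}
```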
An honest look at Wranglr's strengths, weaknesses, and how it stacks up against alternatives.
Workers are always warm. No waiting 2-5 seconds for a server to spin up — every request is instant.
Stores are served from the nearest of 300+ locations. A customer in Durban gets sub-20ms response times.
No USD subscription that fluctuates with exchange rates. Priced in Rands for South African businesses.
No third-party payment plugins to install and maintain. Yoco works out of the box with the merchant's own keys.
Cloudflare handles millions of requests automatically. No server sizing, no "Black Friday crash" fears.
Merchants can test different page variants with traffic splitting — no extra app needed.
Built entirely on Cloudflare's stack (Workers, D1, R2, Pages). Migrating to AWS or Vercel would require rewriting large portions.
D1 is newer than PostgreSQL/MySQL. Some advanced queries, full-text search, and write-heavy workloads may hit limits.
Yoco-only payments limit international expansion. Merchants wanting Stripe or PayPal need alternatives.
Shopify has 8,000+ apps and themes. Wranglr is early-stage with a smaller feature set and no plugin marketplace yet.
| Aspect | Wranglr | Shopify | WooCommerce | Ecwid |
|---|---|---|---|---|
| Architecture | Edge-first, serverless | Monolithic, centralized | PHP on hosting | SaaS widget |
| Speed (SA) | ~20ms (edge) | 200-400ms | Varies by host | 100-300ms |
| SA Payments | Yoco native | Via apps | Via plugins | Limited |
| Pricing | ZAR, competitive | $29-299 USD/mo | Free + hosting | Free-$82/mo |
| Customization | Page builder + CSS | Liquid themes | PHP themes | Limited |
| Scale Ceiling | Virtually unlimited | High (enterprise) | Host-dependent | Medium |
| Maintenance | Zero (serverless) | None (SaaS) | High (updates) | None (SaaS) |
| App Ecosystem | Growing | 8,000+ apps | 55,000+ plugins | 200+ apps |
A deep dive into the 14-route API Worker that handles authentication, data, payments, uploads, and everything else.
The API Worker (apps/workers/api) is one Cloudflare Worker with 14 route modules. Every request lands in index.ts, which reads the URL path and delegates to the right handler. Think of it like a switchboard operator — one phone line, 14 departments.
Here's how the main index.ts routes requests — it's a simple if/else chain that strips the prefix and passes the remaining path to each handler:
const path = url.pathname;

if (path.startsWith('/api/auth'))
  response = await authRoutes(
    request, env,
    path.replace('/api/auth', '')
  );
else if (path.startsWith('/api/sites'))
  response = await sitesRoutes(...);
else if (path.startsWith('/api/payments'))
  response = await paymentsRoutes(...);
// ... 11 more route handlers
else if (path === '/api/store-lookup')
  // PUBLIC endpoint, no auth required:
  // resolves hostname → site_id for
  // the storefront React app
  response = await storeLookupRoute(request, env); // handler name illustrative
Read the URL path (e.g., "/api/auth/login")
If it starts with "/api/auth", strip that prefix and pass the rest ("/login") to the auth handler.
Each handler receives: the request, the environment bindings (DB, R2, secrets), and the remaining sub-path.
The /api/store-lookup route is special — it's the ONLY fully public endpoint. It lets the storefront app resolve a hostname to a site ID, which is how multi-tenancy begins.
The API Worker has access to 12 environment bindings — secrets and services injected by Cloudflare:
| Binding | Type | Purpose |
|---|---|---|
| DB | D1 Database | All structured data — users, sites, products, orders |
| BUCKET | R2 Bucket | File storage — images, static site files, KYC documents |
| JWT_SECRET | Secret | Signs/verifies JWT authentication tokens |
| YOCO_SECRET_KEY | Secret | Server-side Yoco API key for checkout creation |
| YOCO_WEBHOOK_SECRET | Secret | Verifies incoming Yoco webhook signatures |
| RESEND_API_KEY | Secret | Email service (Resend) for notifications |
| ENCRYPTION_KEY | Secret | Encrypts sensitive data (Yoco secret keys per merchant) |
| CF_API_TOKEN | Secret | Cloudflare API — creates DNS records for custom domains |
| CF_ACCOUNT_ID | Secret | Cloudflare account identifier |
| CF_ZONE_ID | Secret | DNS zone for wranglr.co.za |
| GEMINI_API_KEY | Secret | Google AI for smart features |
| CORS_ORIGIN | Variable | Allowed origins for cross-origin requests |
The complete journey from merchant signup to a live, customer-facing store — every step, every API call, every database write.
When a merchant signs up at dashboard.wranglr.co.za/register, the following chain of events begins:
// POST /api/auth/register
const { email, password, name } = body;

// Generate a random salt, then hash the
// password with PBKDF2 (Web Crypto)
const salt = crypto.getRandomValues(
  new Uint8Array(16)
);
const key = await crypto.subtle.importKey(
  'raw',
  new TextEncoder().encode(password),
  'PBKDF2', false, ['deriveBits']
);
const hash = await crypto.subtle.deriveBits(
  { name: 'PBKDF2',
    salt, iterations: 100000,
    hash: 'SHA-256' },
  key, 256
);

// Set a 14-day free trial
const trialEnd = new Date();
trialEnd.setDate(trialEnd.getDate() + 14);

// Insert the user into D1
await env.DB.prepare(`
  INSERT INTO users
    (id, email, password_hash,
     salt, name, plan_tier,
     subscription_expiry)
  VALUES (?,?,?,?,?,?,?)`
).bind(id, email, hashHex,
  saltHex, name, 'trial',
  trialEnd.toISOString()
).run();
Step 1: Extract email, password, and name from the registration form.
Step 2: Generate a random 16-byte salt (like adding unique seasoning to each password before cooking). Hash the password using PBKDF2 with 100,000 iterations, which makes brute-force attacks dramatically slower: each guess costs tens of milliseconds of CPU instead of microseconds.
Step 3: Set the subscription expiry to 14 days from now. Every new user gets a free trial.
Step 4: Insert the user into the D1 database with their hashed password, salt, name, plan tier ("trial"), and expiry date. The password is never stored in plain text.
After login, the merchant creates a "site" — the container for their store, domain, and settings:
Sends the store name, selected blueprint (e.g., "ecommerce-store"), and desired subdomain.
INSERT into sites table with status="active", blueprint_id="ecommerce-store", and domain_name="storename.wranglr.co.za".
INSERT into site_settings with defaults: theme_colors (JSON), font_family, button_style, etc. This row drives HTMLRewriter theming.
Calls Cloudflare's DNS API to create a CNAME record: storename.wranglr.co.za → proxy.wranglr.co.za. This tells the internet to route traffic for this subdomain through the Dispatcher Worker.
The Dispatcher Worker will find this site record in D1 and proxy traffic to the shared storefront. The store is accessible immediately — no deployment needed.
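Step 4's DNS call can be sketched against Cloudflare's public "create DNS record" endpoint. The record fields below match the documented API; the subdomain and target names mirror the article's example:

```typescript
// Sketch: the CNAME record the API Worker creates for a new store.
interface DnsRecord {
  type: 'CNAME';
  name: string;     // e.g. storename.wranglr.co.za
  content: string;  // e.g. proxy.wranglr.co.za
  proxied: boolean; // route through Cloudflare so the Dispatcher sees it
}

function cnameFor(subdomain: string): DnsRecord {
  return {
    type: 'CNAME',
    name: `${subdomain}.wranglr.co.za`,
    content: 'proxy.wranglr.co.za',
    proxied: true,
  };
}

// The Worker would then call Cloudflare's API with its token:
// await fetch(
//   `https://api.cloudflare.com/client/v4/zones/${env.CF_ZONE_ID}/dns_records`,
//   { method: 'POST',
//     headers: { Authorization: `Bearer ${env.CF_API_TOKEN}`,
//                'Content-Type': 'application/json' },
//     body: JSON.stringify(cnameFor('storename')) },
// );
```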
When a merchant brings their own domain (e.g., "ronins-gear.co.za"), Wranglr automates the DNS setup:
Dashboard PATCH → /api/sites/:id with the custom_domain field.
Updates site_settings.custom_domain = "ronins-gear.co.za"
They add a CNAME record: ronins-gear.co.za → proxy.wranglr.co.za. This routes ALL traffic for their domain through Wranglr's Dispatcher.
When someone visits ronins-gear.co.za, the Dispatcher queries D1: WHERE domain_name = ? OR custom_domain = ?. It finds the site and proxies to the storefront.
Unlike Shopify where a custom domain requires configuration and verification, Wranglr just needs one CNAME record. The Dispatcher's D1 query handles the rest — no server restarts, no DNS propagation waits on Wranglr's side. The moment the CNAME propagates (usually 1-5 minutes), the store is live on the custom domain.
How one React app serves thousands of unique-looking stores — the dynamic theming engine, page builder rendering, and multi-tenant architecture.
The storefront (templates/ecommerce-store) is a single React SPA deployed once to Cloudflare Pages at wranglr-stores.pages.dev. When a customer visits ANY Wranglr store, they're loading the same React app. Here's how it knows which store to render:
The storefront reads window.location.hostname or the X-Forwarded-Host header set by the Dispatcher.
This public endpoint resolves the domain to a site_id. The storefront now knows "I am store abc-123."
All subsequent API calls include /{siteId}/products, /{siteId}/settings, etc. The API ensures every query is WHERE site_id = ? — isolating each tenant.
The storefront's CSS uses var(--primary), var(--background), var(--surface). These were injected by the Dispatcher's HTMLRewriter. The React app never "knows" its colors — it just reads CSS variables.
If the merchant built a custom layout with the page builder, the storefront reads the JSON layout and renders matching React components (hero banners, product grids, trust badges, FAQs, etc.).
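The bootstrap sequence above can be sketched like this. The store-lookup endpoint comes from the text, but the query-string shape and response field names are assumptions:

```typescript
// Sketch: how the storefront resolves "which store am I?" on boot.
function storeLookupUrl(apiBase: string, hostname: string): string {
  // Query-param name is an assumption; the endpoint itself is /api/store-lookup.
  return `${apiBase}/api/store-lookup?hostname=${encodeURIComponent(hostname)}`;
}

async function bootStorefront(apiBase: string, hostname: string): Promise<string> {
  const res = await fetch(storeLookupUrl(apiBase, hostname));
  const { site_id } = (await res.json()) as { site_id: string };
  // Every subsequent call is tenant-scoped by this id,
  // e.g. /{siteId}/products, /{siteId}/settings.
  return site_id;
}
```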
Merchants can customize their storefront using a drag-and-drop page builder in the dashboard. The builder produces a JSON array of "sections" stored in D1. The storefront's BuilderSections.tsx reads this JSON and renders matching React components:
[
{
"type": "hero",
"title": "Welcome to Ronin's",
"subtitle": "Premium Gear",
"bg_image": "https://..."
},
{
"type": "product-grid",
"columns": 3,
"limit": 6
},
{
"type": "trust-badges",
"badges": [
{ "icon": "truck",
"text": "Free Shipping" },
{ "icon": "shield",
"text": "Secure Checkout" }
]
}
]
Section 1 (Hero): A full-width banner with the title "Welcome to Ronin's", subtitle "Premium Gear", and a background image. Colors come from CSS variables injected by the Dispatcher.
Section 2 (Product Grid): A 3-column grid showing the first 6 products fetched from the API. Category filters appear in a sidebar.
Section 3 (Trust Badges): Small badges with SVG icons (Lucide) — "truck" renders a delivery truck icon, "shield" renders a shield icon. These replaced the old emoji badges.
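Rendering dispatch can be sketched as a switch on the "type" discriminator in the JSON above. Component names here are illustrative stand-ins for the real React components in BuilderSections.tsx:

```typescript
// Sketch: map each builder section type to a component name.
type Section =
  | { type: 'hero'; title: string; subtitle: string; bg_image: string }
  | { type: 'product-grid'; columns: number; limit: number }
  | { type: 'trust-badges'; badges: { icon: string; text: string }[] };

function componentFor(section: Section): string {
  switch (section.type) {
    case 'hero': return 'HeroBanner';
    case 'product-grid': return 'ProductGrid';
    case 'trust-badges': return 'TrustBadges';
    default: return 'Unknown'; // ignore unrecognized sections
  }
}

const layout: Section[] = [
  { type: 'hero', title: "Welcome to Ronin's", subtitle: 'Premium Gear', bg_image: '' },
  { type: 'product-grid', columns: 3, limit: 6 },
];
console.log(layout.map(componentFor)); // ['HeroBanner', 'ProductGrid']
```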
PBKDF2 hashing, JWT tokens, constant-time comparison, CORS, and how each API request is validated — explained for humans.
When a merchant logs in, here's the complete chain — from typing their password to receiving a JWT that authenticates every subsequent request:
Dashboard sends { email, password } over HTTPS. Password is encrypted in transit.
API queries D1: SELECT id, password_hash, salt FROM users WHERE email = ?
Using PBKDF2 with the stored salt (16 bytes) and 100,000 iterations, the API re-hashes the submitted password. If the user entered "correct horse battery staple", PBKDF2 transforms it into a 256-bit hash.
Compare the computed hash with the stored hash byte-by-byte. The comparison ALWAYS takes the same time, whether 0 or all 32 bytes match (prevents timing attacks).
If hashes match, sign a JWT containing: user ID, email, plan tier, and a 7-day expiry. This token is returned to the dashboard.
Every subsequent API call includes Authorization: Bearer {token}. The API middleware verifies the JWT signature without querying D1 — making authenticated requests fast.
The auth middleware (middleware/auth.ts) sits between the request and every protected route. It verifies the JWT token before the route handler ever sees the request:
export async function verifyAuth(request, env) {
  const authHeader = request.headers.get('Authorization');
  if (!authHeader?.startsWith('Bearer ')) return null;

  const token = authHeader.slice(7);
  // JWT = header.payload.signature, each base64url-encoded
  const [headerB64, payloadB64, signatureB64] = token.split('.');
  if (!signatureB64) return null;

  const key = await crypto.subtle.importKey(
    'raw', encode(env.JWT_SECRET),
    { name: 'HMAC', hash: 'SHA-256' },
    false, ['verify']
  );

  const valid = await crypto.subtle.verify(
    'HMAC', key,
    base64urlDecode(signatureB64),        // signature bytes
    encode(`${headerB64}.${payloadB64}`)  // the signed content
  );
  if (!valid) return null;

  // base64urlDecode / encode: small helpers (not shown) converting
  // between base64url strings and Uint8Array
  const user = JSON.parse(
    new TextDecoder().decode(base64urlDecode(payloadB64))
  );
  if (user.exp && user.exp < Date.now() / 1000) return null; // expired
  return user;
}
Step 1: Check if the request has an Authorization header with a "Bearer" token. No header = reject immediately.
Step 2: Extract the JWT token (everything after "Bearer "). The token has 3 parts: header.payload.signature — all base64-encoded.
Step 3: Import the server's JWT_SECRET as an HMAC signing key using the Web Crypto API. This is the server's private key — only it can verify signatures.
Step 4: Verify the signature. If someone modified the payload (e.g., changed their user ID or plan tier), the signature won't match. The token is rejected.
Why this is fast: The JWT contains the user's info (ID, email, plan). The server doesn't need to query D1 on every request — it just verifies the signature mathematically.
Only requests from dashboard.wranglr.co.za and whitelisted origins are allowed. Prevents malicious sites from calling Wranglr's API.
The Dispatcher strips .., null bytes, and encoded traversal sequences from file paths. Prevents attackers from accessing files outside a site's R2 directory.
Each merchant's Yoco secret key is encrypted with AES before storing in D1. Even if the database leaks, the keys are useless without the ENCRYPTION_KEY.
Every response includes HSTS (force HTTPS), X-Content-Type-Options (prevent MIME sniffing), Referrer-Policy, and Content-Security-Policy headers.
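The path-sanitization rule can be sketched as a pure function. The rules (strip traversal segments, null bytes, and encoded traversal sequences) come from the text; the exact implementation is an assumption, not the Dispatcher's source:

```typescript
// Sketch: normalize a requested file path before building an R2 key.
// Drops empty, ".", and ".." segments (including %2e%2e-style encodings)
// and removes null bytes, so the result always stays inside the site's
// R2 directory. Also strips the leading slash.
function sanitizePath(raw: string): string {
  let path = decodeURIComponent(raw); // catch %2e%2e-style encoding
  path = path.replace(/\0/g, '');     // null bytes
  return path
    .split('/')
    .filter((seg) => seg !== '' && seg !== '.' && seg !== '..')
    .join('/');
}

console.log(sanitizePath('/a/../b/%2e%2e/c')); // "a/b/c"
```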
How Wranglr lets merchants test variants, preview changes before going live, and visually edit their static sites — all at the edge.
Wranglr's A/B testing runs in the Dispatcher — not in JavaScript, not in a third-party tool. The Dispatcher splits traffic before the page loads. Here's exactly how:
// Check for an active A/B test
const abTest = await env.DB
  .prepare(`SELECT * FROM ab_tests
            WHERE site_id = ?
              AND status = 'running'
            LIMIT 1`)
  .bind(site.id).first();

// Sticky assignment via cookie
const existing = cookies[`wranglr_ab_${abTest.id}`];
let variant;
if (existing === 'A' || existing === 'B')
  variant = existing;
else
  variant = Math.random() * 100 < abTest.split_ratio
    ? 'B' : 'A';

// Prepend the variant's path prefix
// (e.g. "/v1" or "/v2")
filePath = `${variantPath}${filePath}`;

// Track the view (fire-and-forget)
ctx.waitUntil(
  trackABView(env.DB, abTest.id, variant, today)
);
Step 1: Query D1 for any running A/B test on this site. Tests define two variant paths (e.g., "/v1" and "/v2") and a split ratio (e.g., 50/50).
Step 2: Check if this visitor already has a variant cookie. Returning visitors ALWAYS see the same variant (consistency). Without this, they'd randomly bounce between versions.
Step 3: New visitors get randomly assigned. If split_ratio is 30, 30% get variant B. A cookie is set so they stay locked in.
Step 4: The variant path is prepended to the file path. So variant A loads from /v1/index.html and variant B from /v2/index.html in R2.
Step 5: ctx.waitUntil() records the view in a stats table WITHOUT blocking the response. The user gets their page instantly; stats are tracked asynchronously. This is a fire-and-forget pattern.
Before configuring a custom domain, merchants can preview their site at {siteId}-preview.wranglr.co.za. The Dispatcher handles these automatically:
const previewMatch = hostname.match(
  /^([a-f0-9-]+)-preview\.wranglr\.co\.za$/
);
if (previewMatch) {
  const siteId = previewMatch[1];
  return handlePreviewRequest(
    request, env, siteId, url
  );
}
The Dispatcher checks if the hostname matches the pattern: some-uuid-preview.wranglr.co.za
If it matches, extract the site ID from the subdomain and serve the site in preview mode. Subscription validation still applies — even previews require an active account.
For ecommerce stores, the preview also proxies to the shared storefront with full HTMLRewriter theming. For static sites, it serves directly from R2.
For static sites, the Dispatcher injects an entire visual editor into the page when viewed inside the dashboard's iframe. This 200-line JavaScript toolkit enables click-to-select, drag-to-reorder, and inline editing — all without any browser extension.
Clicking any element shows a floating toolbar with move up/down and delete buttons. The selected element gets a blue outline.
The dashboard sends UPDATE_ELEMENT messages via postMessage. The builder script applies text, color, padding, and URL changes in real time.
After editing, the builder clones the entire document, strips its own injected elements, and sends the clean HTML back to the dashboard via postMessage. The dashboard saves it to R2.
The builder script only activates inside an iframe (window !== window.parent). Live visitors never see the editing tools — it's invisible to customers.
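The iframe guard and message protocol can be sketched like this. Structural types stand in for the real DOM types so the idea is self-contained; the UPDATE_ELEMENT message name mirrors the text, while the payload shape is an assumption:

```typescript
// Sketch: only activate the builder inside the dashboard's iframe.
function isInsideBuilderIframe(win: { parent: unknown }): boolean {
  return win !== win.parent; // top-level visitors: window.parent === window
}

// Sketch: apply one UPDATE_ELEMENT message from the dashboard.
function onBuilderMessage(
  data: { type?: string; selector?: string; text?: string },
  doc: { querySelector(sel: string): { textContent: string | null } | null },
): void {
  if (data.type !== 'UPDATE_ELEMENT' || !data.selector) return;
  const el = doc.querySelector(data.selector);
  if (el && data.text !== undefined) el.textContent = data.text; // live edit
}

// In the real script this would be wired up roughly as:
// if (isInsideBuilderIframe(window))
//   window.addEventListener('message', (e) => onBuilderMessage(e.data, document));
```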
Putting it all together — the full architecture in one view, and why it matters when working with AI assistants.
Here's every actor, every connection, every data flow in one mental model:
Merchant manages their store. Dashboard sends auth'd requests to the API. API writes to D1 (products, settings) and R2 (images).
Customer visits a domain. Dispatcher resolves it, checks subscription, proxies to the shared storefront, injects theme via HTMLRewriter.
Storefront fetches products, settings, and page layouts from the API — always scoped by siteId for multi-tenancy.
Customer pays via Yoco (off-site). Yoco webhook hits the API. API verifies signature, marks order paid, deducts inventory.
Static site users deploy via CLI to R2. Dispatcher serves files directly from R2 with SPA fallback and custom 404s.
When you ask an AI assistant (like me!) to modify Wranglr, knowing these actors and flows is the difference between a 5-minute fix and an hour of debugging.
"Fix the checkout" → API Worker (payments route). "Change the header" → Storefront or Dispatcher. "Add a product field" → API + Dashboard + Storefront.
Structured data (users, products) → D1. Files (images, sites) → R2. Never put structured data in R2 or binary files in D1.
Customer → Dispatcher → Storefront → API → D1. Don't bypass the pipeline. Don't put business logic in the Dispatcher.
You now understand how Wranglr works — from the edge network to payment webhooks. You can confidently direct AI assistants when building features for the platform.