Your best salesperson can't be everywhere
The best shopping experiences happen in conversation. A knowledgeable person shows you exactly what you need, answers your questions, and helps you decide. That's how people buy suits, furniture, jewelry — anything where seeing it matters.
But in-store personal shopping doesn't scale. Your best salesperson can only be in one place at one time. Meanwhile, your website converts at 2-3% while your in-store close rate is 10x that.
A global B2B commerce platform — one that provides ecommerce infrastructure to some of the world's largest retailers, processing over $14 billion in annual transactions — needed to solve this. They brought us in to engineer the product. The platform later launched it at a major industry event with 12,000+ attendees and 60+ speakers, alongside enterprise retailers in fashion, electronics, and sporting goods. We did the engineering. They took it to market. See the complete live commerce case study for the visual breakdown.
What the product does
A personal shopper connects with a customer through live video — directly in the browser, no downloads. During the call, they browse the full catalog, push product cards to the customer's screen, add items to their cart, and close the sale. The customer checks out without ever leaving the video session.
Desktop and mobile. English, Spanish, Portuguese, and Catalan. 16 countries from Argentina to the UK. Designed for enterprise retailers who want to offer high-value clients and B2B customers a concierge-level shopping experience at digital scale.
But the product description only tells half the story. The other half is how we built it.
The architecture: full serverless, full event-driven
This was not a monolith. Every component of the system is serverless and event-driven — from the video sessions to the analytics pipeline to the notification engine. The entire system runs on AWS with minimal infrastructure cost because there are no servers to manage, no containers to orchestrate, no capacity to pre-provision.
Why serverless matters here
Personal shopping traffic is inherently spiky. A luxury retailer might have 50 concurrent sessions during a product launch and 2 sessions on a Tuesday afternoon. Traditional infrastructure means paying for peak capacity 24/7. Serverless means paying only for the sessions that actually happen.
But the real reason we went full serverless was multi-tenancy at scale. This product doesn't serve one retailer — it serves hundreds of retailers on the same platform. Every retailer has their own shoppers, their own branding, their own analytics, their own configuration. The architecture was designed from day one to scale to millions of sessions across thousands of tenants without any infrastructure changes.
Lambda functions don't care if they're handling one retailer or a thousand. DynamoDB tables partition naturally by tenant. EventBridge routes events by tenant context. SQS queues process work regardless of which retailer generated it. Adding a new enterprise retailer to the platform requires zero infrastructure provisioning — just a new configuration entry.
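The tenant isolation described above can be sketched in code. This is a hypothetical illustration of tenant-prefixed DynamoDB key design, not the production schema; the function names, key formats, and ID values are all assumptions.

```python
# Hypothetical sketch of tenant-scoped DynamoDB key design. Prefixing
# every partition key with the tenant means items from different
# retailers can never collide, and every query is tenant-scoped by
# construction. Names and key formats are illustrative assumptions.

def build_session_key(tenant_id: str, session_id: str) -> dict:
    """Primary key for a session item, scoped to one retailer."""
    return {
        "PK": f"TENANT#{tenant_id}#SESSION#{session_id}",
        "SK": "METADATA",
    }

def build_shopper_gsi_key(tenant_id: str, shopper_id: str) -> dict:
    """GSI key for a 'my sessions' view, also tenant-scoped."""
    return {
        "GSI1PK": f"TENANT#{tenant_id}#SHOPPER#{shopper_id}",
    }

key = build_session_key("acme-retail", "sess-123")
# A query must supply the full tenant prefix, so cross-tenant reads
# are impossible rather than merely forbidden.
```

Onboarding a new retailer under this scheme is exactly the "new configuration entry" the text describes: no new tables, no new infrastructure, just a new tenant ID.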
The cost implications are significant. "How much will this cost to run?" is always the second question after "What does it do?" Our answer: almost nothing when nobody's using it, scales automatically to millions, and the per-tenant cost approaches zero at scale.
The event bus: EventBridge + SQS
Every meaningful action in the system produces an event. We built the entire architecture around EventBridge as the central event bus, with SQS queues handling the workloads that need guaranteed delivery and ordered processing.
Here's what a single personal shopping session looks like from the event bus perspective:
- Customer requests a session → session.requested
- System matches available shopper → session.matched
- Shopper notified → notification.sent (email, WhatsApp, or both)
- Video call starts → session.started
- Shopper searches catalog → product.searched
- Shopper shares product → product.shared
- Customer adds to cart → cart.updated
- Customer completes checkout → order.created
- Video call ends → session.ended
- NPS survey sent → survey.sent
- Customer rates experience → survey.completed
- Analytics pipeline processes all events → dashboards update in real time
Every one of these events is published to EventBridge. Downstream consumers — analytics, notifications, the commerce integration layer, the recording system — subscribe to the events they care about. No service calls another service directly. Everything is decoupled.
Why EventBridge + SQS instead of just one or the other? EventBridge handles the routing — deciding which events go where based on rules. SQS handles the delivery — buffering events, retrying failures, and ensuring ordered processing for critical sequences like payment flows. Together they give us both flexibility and reliability.
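The routing split can be made concrete with a small sketch. EventBridge rule patterns match an event when every field in the pattern appears in the event with one of the listed values; the rules below are illustrative examples, not the production configuration, and real rules live in AWS rather than in application code.

```python
# Minimal sketch of EventBridge-style routing rules. A rule matches
# when every field it names is present in the event with one of the
# listed values. Rule contents are illustrative assumptions.

PAYMENT_RULE = {
    "source": ["personal-shopping"],
    "detail-type": ["cart.updated", "order.created"],  # -> payment FIFO queue
}

ANALYTICS_RULE = {
    "source": ["personal-shopping"],
    # No detail-type filter: the analytics queue consumes every event.
}

def matches(rule: dict, event: dict) -> bool:
    """Simplified EventBridge matching: all rule fields must match."""
    return all(event.get(field) in allowed for field, allowed in rule.items())

event = {"source": "personal-shopping", "detail-type": "product.shared"}
assert not matches(PAYMENT_RULE, event)   # not a payment event
assert matches(ANALYTICS_RULE, event)     # analytics sees everything
```

The target of each rule is an SQS queue, which is where the delivery guarantees come from: EventBridge decides where an event goes, the queue makes sure it gets there.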
SQS queues for critical paths
Not all events are equal. A product search event can be eventually consistent — if the analytics dashboard updates a few seconds late, nobody notices. But payment events, session recordings, and notification deliveries need guaranteed processing.
We use dedicated SQS queues for:
- Payment events — Every cart update and checkout trigger goes through a FIFO queue. Order matters, and FIFO deduplication means duplicates can't happen
- Recording pipeline — Session recordings are large files that need reliable processing. SQS ensures no recording is lost even if the processing Lambda times out
- Notification delivery — Email and WhatsApp messages go through a queue with retry logic. If the messaging provider is temporarily down, the notification is retried automatically
- Analytics ingestion — High-volume event processing with batching. The queue absorbs spikes during peak hours and processes them at a steady rate
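For the payment path, the ordering and deduplication guarantees come from how each message is constructed. The sketch below shows one plausible way to build a FIFO message for a cart event; the field values and hashing choice are assumptions, though MessageGroupId and MessageDeduplicationId are the real SQS FIFO parameters.

```python
# Hedged sketch of sending a cart event to an SQS FIFO queue.
# Grouping by session preserves per-session ordering; a deterministic
# deduplication ID makes SQS drop duplicates within its 5-minute
# deduplication window. Payload shape is an assumption.
import hashlib
import json

def to_fifo_message(event: dict) -> dict:
    body = json.dumps(event, sort_keys=True)
    return {
        "MessageBody": body,
        # All events for one session share a group, so SQS delivers
        # them in order relative to each other.
        "MessageGroupId": event["session_id"],
        # Identical payload -> identical dedup ID -> duplicate dropped.
        "MessageDeduplicationId": hashlib.sha256(body.encode()).hexdigest(),
    }

msg = to_fifo_message({"session_id": "sess-1", "type": "cart.updated", "sku": "A1"})
```

Sorting the keys before serializing is what makes the deduplication ID stable: two logically identical events hash the same even if their fields arrive in a different order.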
DynamoDB everywhere
DynamoDB is the data layer for virtually everything in the system. Every table is designed for the specific access pattern it serves:
Session table — Partition key: session ID. Stores all session metadata, participant info, status transitions, and timestamps. GSI on shopper ID for the "my sessions" view. GSI on status for the queue management dashboard.
Shopper table — Partition key: shopper ID. Stores profile, availability windows, assigned collections, contact info. GSI on email for authentication lookups.
Analytics table — Partition key: shopper ID, sort key: timestamp. Time-series data for all performance metrics. Query patterns optimized for "show me this shopper's last 30 days" and "show me all shoppers for this date range."
Notification table — Tracks every notification sent, delivery status, and retry attempts. Used for both operational monitoring and compliance audit trails.
Configuration table — Stores all retailer-specific settings: branding, notification templates, privacy links, cart behavior, CSS overrides.
Why DynamoDB for everything? Two reasons: well-defined access patterns and unpredictable multi-tenant scale. A retailer with 5 shoppers and a retailer with 500 shoppers use the same tables. Hundreds of retailers share the same infrastructure, each isolated by partition key design. With on-demand billing, DynamoDB costs near zero when idle and handles thousands of concurrent reads during a product launch across multiple tenants simultaneously — all without capacity planning.
The platform's own data layer handles the commerce data — products, inventory, customer profiles, order history. Our DynamoDB tables handle everything specific to the personal shopping product. Clean separation of concerns. The multi-tenant isolation is entirely logical — same tables, same infrastructure, zero cross-tenant data leakage through careful key design.
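The "last 30 days" access pattern on the analytics table can be shown as a single key-condition query. The expression below is the shape DynamoDB's Query API expects; the attribute names and the helper itself are assumptions for illustration.

```python
# Illustrative sketch of the "show me this shopper's last 30 days"
# access pattern on a (shopper_id, timestamp) table. Attribute names
# are assumptions; the expression syntax is DynamoDB's Query API.
from datetime import datetime, timedelta, timezone

def last_30_days_query(shopper_id: str, now: datetime) -> dict:
    start = (now - timedelta(days=30)).isoformat()
    return {
        "KeyConditionExpression": "shopper_id = :sid AND #ts >= :start",
        # "timestamp" is a DynamoDB reserved word, hence the alias.
        "ExpressionAttributeNames": {"#ts": "timestamp"},
        "ExpressionAttributeValues": {":sid": shopper_id, ":start": start},
    }

q = last_30_days_query("shopper-42", datetime(2024, 3, 31, tzinfo=timezone.utc))
# One Query call, no scan, no join: the sort key does the filtering.
```

Because the sort key is the timestamp, the date-range filter is applied at the storage layer, which is what keeps these dashboard queries fast regardless of table size.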
AWS Chime: managed video infrastructure
We didn't build video infrastructure from scratch. AWS Chime SDK provides the managed WebRTC layer — STUN/TURN servers, media processing, adaptive bitrate streaming, and recording capabilities. We built the product experience on top of it.
Why AWS Chime over raw WebRTC
Raw WebRTC works for simple 1:1 calls when both browsers can reach each other directly. It falls apart when you need:
- NAT traversal at scale — Enterprise customers are behind corporate firewalls. Chime's TURN infrastructure handles this
- Adaptive bitrate — Automatic quality adjustment based on network conditions. Customer on hotel WiFi in Rome gets a usable experience, not a frozen screen
- Server-side recording — Sessions recorded for QA and training without relying on browser-based recording that breaks when tabs close
- Multi-participant support — Some sessions have multiple shoppers or a supervisor observing. Chime handles the media routing
What we built on top of Chime
Chime gives us the video pipe. We built everything around it:
- Session management — Create, schedule, join, end sessions with full lifecycle tracking
- Camera controls — Front/rear switching on mobile, video on/off, mute
- Screen sharing — Shopper shares their screen to walk through the catalog or show product details
- In-call chat — Text messaging alongside video for sharing links, sizes, and notes
- Recording pipeline — Automatic session recording with processing through SQS, stored and accessible from the admin
- Participant management — Invite additional participants, see who's in the call, manage permissions
The React frontend handles all video UI — call controls, participant tiles, the commerce sidebar, chat panel. Everything renders in the browser. Zero installs.
Adaptive quality in practice
Not every customer has perfect internet. The system handles this at multiple levels:
- Chime layer — Automatically adjusts video resolution and bitrate based on available bandwidth
- Our layer — Detects degraded conditions and adjusts the UI. On very poor connections, we switch to audio-only with product images replacing the video feed
- Shopper always gets priority — The shopper's video quality is maintained at the expense of the customer's outgoing video. The customer needs to SEE the products. The shopper needs to SEE the customer's reactions. When bandwidth is limited, we optimize for the product demonstration direction
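The asymmetric degradation policy above can be sketched as two small decision functions. The bandwidth thresholds and profile names here are illustrative assumptions, not the production values; the point is only that the two directions degrade on different schedules.

```python
# Hypothetical sketch of the asymmetric quality policy: the customer's
# incoming feed (the product demo) degrades last, their outgoing video
# is sacrificed first. Thresholds and profile names are assumptions.

def pick_customer_downlink(kbps: int) -> str:
    """Quality profile for what the customer sees."""
    if kbps >= 1200:
        return "hd-video"
    if kbps >= 400:
        return "sd-video"
    # Very poor connection: drop video, keep audio + product images.
    return "audio-plus-images"

def pick_customer_uplink(kbps: int) -> str:
    """Quality profile for what the customer sends back."""
    return "low-video" if kbps >= 800 else "audio-only"

# At the same constrained bandwidth, the demo direction keeps video
# while the reaction direction falls back to audio.
assert pick_customer_downlink(500) == "sd-video"
assert pick_customer_uplink(500) == "audio-only"
```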
The commerce integration layer
The hardest engineering challenge wasn't video. It was making the commerce platform's entire catalog, cart, and checkout available inside a video call in real time.
Deep platform integration
The commerce toolkit sidebar had to work with the platform's own APIs — not a simplified version, the real ones:
- Catalog API — Full product search with filters, collection scoping, and real-time pricing. The shopper sees the same products at the same prices the customer would see on the website
- Cart API — Add, remove, and modify items in the customer's actual cart. Not a shadow cart — the real one. If the customer opens the website in another tab, they see the same items
- Inventory API — Live stock levels queried before every recommendation. No suggesting products that are out of stock
- Checkout API — Trigger the platform's native checkout flow. All payment methods, all promotions, all tax calculations — exactly as if the customer checked out on the website
- Customer data API — Access to the customer's profile, purchase history, and preferences. The shopper walks into the call already knowing what the customer has bought before
Real-time product sharing
When a shopper finds a product to show, they click "Share" and the product card appears on the customer's screen instantly. This happens through WebSockets — not polling, not page refreshes.
The shared product card includes:
- Product images (multiple angles)
- Price (with any applicable promotions)
- Available sizes/variants
- Stock status
- "Add to cart" button that the customer can click directly
The customer can also add items to their cart from the shared product cards without the shopper's involvement. It's a collaborative shopping experience, not a one-way presentation.
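A shared product card is ultimately just a message pushed over the WebSocket. The sketch below shows one plausible wire format mirroring the card contents listed above; the field names and message envelope are assumptions, not the actual protocol.

```python
# Sketch of a product card pushed to the customer over the WebSocket.
# The card fields mirror the list above; the envelope and field names
# are assumptions about the wire format.
import json

def product_card_message(product: dict, session_id: str) -> str:
    return json.dumps({
        "type": "product.shared",
        "session_id": session_id,
        "card": {
            "images": product["images"],             # multiple angles
            "price": product["price"],               # promotions applied
            "variants": product["variants"],         # sizes / colors
            "in_stock": product["in_stock"],         # live stock status
            "add_to_cart_url": product["cart_url"],  # customer clicks directly
        },
    })
```

Because the card carries its own add-to-cart URL, the customer can act on it without the shopper doing anything, which is what makes the session collaborative rather than a one-way presentation.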
Collection-based expertise
Each shopper is assigned to specific product collections — luxury watches, new arrivals, menswear, skincare, etc. When a shopper searches the catalog in-call, results are scoped to their assigned collections first, with the full catalog available as a secondary search.
This serves two purposes:
- Expertise routing — When a customer asks about watches, they're connected to the watch specialist, not a generalist
- Performance tracking — Analytics track "collection revenue" separately from total revenue. The retailer knows exactly how much each specialist generates from their assigned category
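The collection-first ordering described above is a simple ranking rule: results from the shopper's assigned collections come first, the rest of the catalog after. A minimal sketch, with assumed data shapes:

```python
# Illustrative sketch of collection-scoped search ordering. Results
# from the shopper's assigned collections rank ahead of the rest of
# the catalog. Result and collection shapes are assumptions.

def scope_results(results: list[dict], assigned: set[str]) -> list[dict]:
    in_scope = [r for r in results if r["collection"] in assigned]
    rest = [r for r in results if r["collection"] not in assigned]
    return in_scope + rest

hits = [
    {"sku": "B1", "collection": "skincare"},
    {"sku": "A1", "collection": "luxury-watches"},
]
ordered = scope_results(hits, assigned={"luxury-watches"})
# The watch specialist sees watches first, full catalog after.
```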
Multi-channel messaging: email, WhatsApp, and notifications
Personal shopping runs on relationships, and every touchpoint needs a notification — and in Latin America and Europe, WhatsApp is the primary communication channel, not email.
WhatsApp integration
We integrated WhatsApp as a first-class notification channel alongside email. Shoppers and customers receive session notifications, reminders, and links through WhatsApp — the channel they actually check.
The notification flow:
- Event fires (session requested, shopper ready, reminder due)
- EventBridge routes to notification Lambda
- Lambda checks customer's preferred channel
- Message sent via WhatsApp API or email provider
- Delivery tracked in DynamoDB
- Failed deliveries retried through SQS with exponential backoff
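Two steps of that flow lend themselves to a quick sketch: the channel choice and the retry schedule. The fallback default and backoff parameters below are illustrative assumptions.

```python
# Hedged sketch of the channel pick and the exponential-backoff retry
# schedule for failed deliveries. Channel names, the email fallback,
# and the backoff parameters are illustrative assumptions.

def pick_channel(customer: dict) -> str:
    """Prefer the customer's stated channel; fall back to email."""
    return customer.get("preferred_channel", "email")

def backoff_delays(attempts: int, base: float = 2.0, cap: float = 300.0) -> list[float]:
    """Delays between retries: 2s, 4s, 8s, ... capped at 5 minutes."""
    return [min(base * 2 ** i, cap) for i in range(attempts)]

assert pick_channel({"preferred_channel": "whatsapp"}) == "whatsapp"
assert backoff_delays(4) == [2.0, 4.0, 8.0, 16.0]
```

In SQS terms the backoff maps naturally onto per-message visibility timeouts, so a temporarily down messaging provider just means the message reappears a little later rather than being lost.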
The notification lifecycle
Every session generates a sequence of notifications:
- Session requested → Shopper gets WhatsApp/email: "A customer is waiting"
- Session scheduled → Customer gets confirmation with date, time, shopper name
- Reminder → Both sides get a reminder before the scheduled session
- Shopper ready → Customer gets "Your personal shopper is ready — click to join"
- Session ended → Customer gets NPS survey link
- Cancellation → Both sides notified with clear messaging
All templates are customizable from the admin panel. Marketing teams write the copy, set the tone, localize for each market. Each message supports the retailer's branding — custom CSS for emails, formatted WhatsApp messages with the retailer's name and logo.
WhatsApp wasn't optional — it was essential. In Latin America, WhatsApp has 95%+ penetration. An email-only notification system would have meant most customers never see the notification. We built WhatsApp as a primary channel, not an afterthought.
Analytics pipeline
Every event in the system flows into the analytics pipeline. With the event-driven architecture already in place, analytics wasn't a separate system — it was a natural consumer of the event bus.
How events become dashboards
- Session events published to EventBridge
- EventBridge routes analytics-relevant events to an SQS queue
- Lambda processes events in batches, aggregates metrics
- Aggregated data written to DynamoDB analytics tables
- Admin dashboard queries DynamoDB with optimized access patterns
- Results rendered in real time
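The batching step in the middle of that pipeline is where the cost savings come from. A minimal sketch of a Lambda-style aggregator, with assumed event shapes and metric names:

```python
# Sketch of the batch aggregation step: fold a batch of events into
# per-shopper counters before writing to the analytics table. Event
# shapes and metric names are assumptions.
from collections import defaultdict

def aggregate(batch: list[dict]) -> dict:
    metrics = defaultdict(lambda: {"sessions": 0, "revenue": 0.0})
    for event in batch:
        m = metrics[event["shopper_id"]]
        if event["type"] == "session.ended":
            m["sessions"] += 1
        elif event["type"] == "order.created":
            m["revenue"] += event["amount"]
    return dict(metrics)

out = aggregate([
    {"shopper_id": "s1", "type": "session.ended"},
    {"shopper_id": "s1", "type": "order.created", "amount": 120.0},
])
# One DynamoDB write per shopper per batch instead of one per event.
```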
Three analytics views
General overview — Total calls, total revenue, collection revenue, total products sold, total call duration, average NPS. Filterable by date range and searchable by shopper. Shows a table ranking all shoppers by performance.
Per-shopper detail — Drill into any shopper's history. Every session listed with: date, product discussed, session status, customer name, carts created, products sold, revenue, collection revenue, call duration, recording link, NPS score, and device type (mobile/desktop).
Session history — Every session ever conducted. Searchable by product, customer, shopper, or any other criteria. Each entry shows: product thumbnail, product ID, product name, customer name, customer email, customer phone, creation date, shopper name, session status, chat transcript download, recording link, and NPS.
Everything exportable. Everything downloadable. Managers who never touch a video call use these dashboards daily to optimize their personal shopping operations.
Role-based permissions
Enterprise retailers need granular access control. A shopper shouldn't see revenue data. A regional manager shouldn't be able to change global settings. We built a full permissions system with three modules:
- Video Calls — Initiate, receive, create, and manage video calls
- Analytics — View performance metrics, session history, and NPS data
- Settings — Configure shoppers, notifications, branding, and privacy flows
Admins create custom roles combining any of these modules. A personal shopper gets Video Calls only. A floor manager gets Video Calls + Analytics. A regional director gets everything. Users without permission see a clear access-denied message directing them to their admin.
The permissions system integrates with the platform's existing role management — admins use the same interface they already know. No separate auth system, no additional credentials. One account, scoped access.
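The module-based model above reduces to a small lookup. The role-to-module mapping below restates the examples from the text; the function and data structures are assumptions about the implementation, not the platform's actual API.

```python
# Minimal sketch of the module-based permission check. The three
# modules and the example roles come from the text; the data
# structures and function are illustrative assumptions.

MODULES = {"video_calls", "analytics", "settings"}

ROLES = {
    "personal_shopper": {"video_calls"},
    "floor_manager": {"video_calls", "analytics"},
    "regional_director": set(MODULES),
}

def can_access(role: str, module: str) -> bool:
    """Unknown roles get no access, which fails safe."""
    return module in ROLES.get(role, set())

assert can_access("floor_manager", "analytics")
assert not can_access("personal_shopper", "analytics")  # no revenue data
```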
Shopper management
Managing a fleet of personal shoppers across multiple stores and regions requires its own infrastructure.
Onboarding a new shopper
- Admin creates shopper profile — name, email, phone
- Assigns product collections matching the shopper's expertise
- Sets weekly availability windows with recurring schedules
- Assigns role-based permissions
- Shopper receives access credentials and onboarding instructions
Day-to-day operations
- Shoppers toggle their availability status directly from the admin dashboard
- Queue view shows which customers are waiting, with context on what they're looking for
- Shoppers create private video call links for VIP clients — personalized URLs they can send via WhatsApp
- Calendar view of all scheduled sessions with customer details and product context
Management layer
- Search shoppers by name, email, phone, or product collection
- Edit profiles, reassign collections as inventory or seasons change
- Remove shoppers when they leave — permanent deletion of profile and associated data
- Performance visible per-shopper in the analytics dashboard
Operations teams — not developers — run everything. A regional manager can onboard a new shopper, assign them to seasonal collections, and track their performance from a single admin panel. After deployment, we're not needed for day-to-day operations. That was a deliberate design goal.
Configuration and branding
The entire customer-facing experience is customizable:
- Custom CSS — Full stylesheet override. The video call UI matches the retailer's brand
- Email templates — Custom invitation designs, HTML emails with retailer branding
- WhatsApp templates — Formatted messages with retailer name and context
- Notification copy — Every message customizable per touchpoint and per language
- Privacy flows — Configurable privacy policy and terms links. Consent captured before any session starts
- Cart behavior — Redirect to minicart or full order summary after checkout
- CMS integration — Script-based installation on any storefront template. One snippet, and the personal shopping experience is live on the retailer's website
Subscription plans and usage tracking
The product ships with two subscription tiers — Standard and Pro — each with different monthly minutes of live streaming. The admin dashboard shows real-time usage against plan limits so retailers always know where they stand.
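The usage readout is a straightforward computation against the tier's allowance. The minute limits below are invented placeholders, not the product's actual plan sizes.

```python
# Illustrative sketch of plan usage tracking: minutes consumed versus
# the tier's monthly allowance. The limits are invented placeholders.

PLAN_MINUTES = {"standard": 1000, "pro": 5000}

def usage_status(plan: str, minutes_used: int) -> dict:
    limit = PLAN_MINUTES[plan]
    return {
        "used": minutes_used,
        "limit": limit,
        "remaining": max(limit - minutes_used, 0),
        "over_limit": minutes_used > limit,
    }

assert usage_status("standard", 950)["remaining"] == 50
```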
Privacy and compliance
The product handles personal data across 16 countries — Argentina, Australia, Brazil, Canada, Chile, Colombia, France, Germany, Italy, Mexico, Netherlands, Peru, Portugal, Spain, United Kingdom, United States. Different privacy regulations in each.
We built compliance into the architecture:
- Consent flows — Configurable privacy policy and terms acceptance before any session
- Recording consent — Explicit opt-in before session recording begins
- Data processing agreements — Privacy links configurable per market
- Chat data — Transcripts stored securely, downloadable only by admins with Analytics permissions
- Audit trails — Every access to personal data logged through the event system
This wasn't bolted on after launch. The privacy layer was designed alongside the video and commerce layers from day one.
The full scope of what we shipped
Full serverless on AWS
Lambda, EventBridge, SQS, DynamoDB — zero servers, auto-scaling, near-zero idle cost
AWS Chime integration
Managed WebRTC, adaptive quality, recording, screen sharing, multi-participant support
Event-driven backbone
EventBridge as central bus, SQS for critical paths, every action is an immutable event
Deep platform integration
Catalog, cart, inventory, checkout, customer data — all real-time inside the video call
WhatsApp + email notifications
Multi-channel messaging with retry logic, delivery tracking, and customizable templates
Full analytics pipeline
3 views, 15+ metrics per shopper, session recordings, NPS, chat transcripts, full export
Self-service admin panel
Shopper management, permissions, branding, notifications, privacy, plan tracking
16 countries, 4 languages
Timezone-aware scheduling, localized notifications, market-specific compliance
Our role: the engineering
We didn't sell this product or take it to market — the platform did that. They're a global B2B commerce company that provides ecommerce infrastructure to some of the largest retailers in the world. Our job was the engineering.
We built the serverless infrastructure. The event-driven backbone. The video integration. The commerce layer. The analytics pipeline. The notification engine with WhatsApp. The permissions system. The admin panel. The multi-country deployment. The privacy and compliance layer.
The platform designed the product vision. We made it real — reliable enough for live video with in-call financial transactions, flexible enough for 16 countries, and cost-efficient enough that retailers pay near-zero when the system is idle and scale automatically during product launches.
What we learned building this
Multi-tenant serverless is the ultimate scaling story. This isn't one retailer's personal shopping tool — it's a product serving hundreds of retailers on the same platform. Serverless + DynamoDB partition design means adding a new tenant requires zero infrastructure changes. The same Lambda functions, the same tables, the same queues serve one retailer or a thousand. We designed for millions of sessions from day one, and the architecture has never required a scaling intervention.
Serverless is the right call for spiky workloads. Personal shopping traffic is unpredictable. A luxury retailer might have 200 sessions during a fashion week launch and 3 on a Tuesday. Serverless means the cost tracks usage perfectly. No over-provisioning, no capacity planning, no wasted spend.
EventBridge + SQS is a powerful combination. EventBridge for routing, SQS for guaranteed delivery. Every event in the system is decoupled, retryable, and auditable. When something goes wrong (and in a 16-country deployment, things go wrong), we can trace every event, replay failed processing, and fix issues without data loss.
Video is commoditized. Commerce integration is not. AWS Chime gives you video. What it doesn't give you is a product catalog inside the call, real-time cart management, inventory visibility, and native checkout. That integration layer — the bridge between video and commerce — was 60% of the engineering effort.
WhatsApp is infrastructure in Latin America. We initially built email notifications first. When we added WhatsApp, session attendance rates jumped dramatically. In markets like Brazil, Colombia, and Mexico, WhatsApp isn't a nice-to-have — it's the primary way people communicate. Building it as a first-class channel from the start would have saved us time.
DynamoDB with well-designed access patterns is unbeatable for this use case. Every table is designed for the queries it serves. No joins, no schema migrations, no capacity planning. The same tables serve a retailer with 5 shoppers and a retailer with 500. The cost scales linearly with usage and drops to near-zero when idle.
Analytics is the product for the buyer. The person who approves the purchase of a personal shopping tool is not the person who uses the video calls. It's the operations director who uses the analytics dashboards. We spent as much engineering time on analytics as on the video layer — and that was exactly the right allocation.
Live video commerce turns your best salespeople into a scalable channel. We engineered the full stack — serverless infrastructure, event-driven architecture, managed video, deep commerce integration, multi-channel messaging, and enterprise analytics. See more in our case studies, or if you're building a product for a commerce ecosystem, let's talk.