# Marketing Video System
Automated pipeline to generate real screen-recording demo videos from a running iOS simulator and display them in the website's phone mockup. Mirrors the existing screenshot generation system (`scripts/gen-screenshots.sh` + `MarketingSnapshotTests.swift`).
## Architecture Overview
```
iOS Simulator (XCUITest drives app)
  + xcrun simctl io recordVideo
  + MarketingTapOverlayWindow (orange tap ripple)
  + MarketingMockLLMService (instant scripted responses)
  + MarketingDataSeeder (fake realistic data)
        ↓
gen-videos.sh (ffmpeg encode → WebM + MP4)
        ↓
apps/lucidpal-website/public/videos/
        ↓
lp-phone-video Angular component (A/B crossfade, tab nav)
        ↓
lp-hero (replaces static phone HTML)
```

## Scenes
| Scene | File | Duration | Launch arg |
|---|---|---|---|
| Onboarding | onboarding.webm/.mp4 | 8s | --scene onboarding |
| Agent | agent.webm/.mp4 | 8s | --scene agent |
| Chat | chat.webm/.mp4 | 8s | --scene chat |
| Notes | notes.webm/.mp4 | 8s | --scene notes |
| Habits | habits.webm/.mp4 | 8s | --scene habits |
| Live Notes | live-notes.webm/.mp4 | 10s | --scene live-notes |
Tab labels (conversion-optimized, not feature names):
| Scene | Tab label |
|---|---|
| onboarding | Setup in 30 seconds |
| agent | Ask your AI anything |
| chat | Fully private chat |
| notes | Capture every thought |
| habits | Build real habits |
## Implementation Phases
### Phase 1 — Add XCUITest Target to project.yml

No UITest target currently exists. Add `LucidPalUITests`:
```yaml
LucidPalUITests:
  type: bundle.ui-testing
  platform: iOS
  deploymentTarget: '18.0'
  sources:
    - path: UITests
  settings:
    base:
      SWIFT_VERSION: '6.0'
      PRODUCT_BUNDLE_IDENTIFIER: app.lucidpal.uitests
      GENERATE_INFOPLIST_FILE: YES
      CODE_SIGN_STYLE: Automatic
      DEVELOPMENT_TEAM: 'KRPUAN3FFA'
      TEST_TARGET_NAME: LucidPal
  dependencies:
    - target: LucidPal
```

Add to scheme test targets:
```yaml
test:
  targets:
    - LucidPalTests
    - LucidPalUITests
```

Run `xcodegen generate` after editing. This creates the `apps/lucidpal-ios/UITests/` directory.
### Phase 2 — iOS Marketing Infrastructure (`#if DEBUG` only)

All files live in `Sources/Marketing/`. Zero prod impact — the entire directory is excluded from Release builds.
#### MarketingEnvironment.swift

Central launch-arg reader. Everything else reads from here.
```swift
#if DEBUG
enum MarketingEnvironment {
    static var isActive: Bool {
        ProcessInfo.processInfo.arguments.contains("--marketing-demo-mode")
    }

    static var scene: String? {
        guard let i = ProcessInfo.processInfo.arguments.firstIndex(of: "--scene"),
              ProcessInfo.processInfo.arguments.indices.contains(i + 1)
        else { return nil }
        return ProcessInfo.processInfo.arguments[i + 1]
    }
}
#endif
```

#### MarketingDataSeeder.swift
Seeds fake but realistic data into the app stores at launch. Called once from `LucidPalApp.init()` when `MarketingEnvironment.isActive`.
Seeded content:
- Calendar: 6 events across next 3 days (Dentist, Team Standup, Lunch with Alex, Gym, Weekly Review, Pick up kids)
- Chat history: 4 pre-scripted conversation pairs demonstrating agent capability
- Notes: 3 notes (Meeting notes, Grocery list, Book ideas)
- Habits: 4 habits with streaks (Morning run 12-day, Read 20min 7-day, Sleep 8h 3-day, Cold shower 1-day)
No personal data. All content is generic and fictional.
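A minimal sketch of the seeder's shape — note that the store APIs shown here (`CalendarStore`, `add(title:date:)`) are illustrative placeholders, not the app's actual persistence layer:

```swift
#if DEBUG
// Sketch only — CalendarStore and its methods are hypothetical stand-ins
// for whatever persistence layer the app actually uses.
enum MarketingDataSeeder {
    static func seedIfNeeded() {
        guard MarketingEnvironment.isActive else { return }
        // Wipe any real data first so nothing personal appears on screen.
        CalendarStore.shared.removeAll()

        let titles = ["Dentist", "Team Standup", "Lunch with Alex",
                      "Gym", "Weekly Review", "Pick up kids"]
        // 6 events spread across the next 3 days (two per day).
        for (offset, title) in titles.enumerated() {
            let day = Calendar.current.date(byAdding: .day,
                                            value: offset / 2, to: .now)!
            CalendarStore.shared.add(title: title, date: day)
        }
        // ...notes, habits, and chat history seeded the same way.
    }
}
#endif
```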
#### MarketingMockLLMService.swift

`LLMServiceProtocol` conformance for demo mode:

- `isLoaded = true` immediately (skips the model download screen)
- `generate(...)` streams pre-scripted tokens at 22ms intervals (looks like real LLM typing)
- Response bank keyed by scene name
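A sketch of the streaming behaviour, assuming the protocol exposes an async token stream — the real `LLMServiceProtocol` signature may differ:

```swift
#if DEBUG
// Sketch — method names and protocol shape are assumptions.
final class MarketingMockLLMService: LLMServiceProtocol {
    var isLoaded: Bool { true } // never shows the download screen

    func generate(_ prompt: String) -> AsyncStream<String> {
        let response = Self.responses[prompt]
            ?? "I'm not sure — try asking about your calendar."
        return AsyncStream { continuation in
            Task {
                for token in response.split(separator: " ") {
                    continuation.yield(String(token) + " ")
                    try? await Task.sleep(for: .milliseconds(22))
                }
                continuation.finish()
            }
        }
    }

    private static let responses: [String: String] = [
        "What's on my calendar tomorrow?":
            "You have 3 events tomorrow: Team Standup at 9 AM, …",
        // ...remaining scripted pairs from the response table below.
    ]
}
#endif
```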
Wiring in LucidPalApp.swift:
```swift
#if DEBUG
private static func makeLLMService() -> any LLMServiceProtocol {
    MarketingEnvironment.isActive ? MarketingMockLLMService() : LLMService()
}
#else
private static func makeLLMService() -> any LLMServiceProtocol { LLMService() }
#endif

private let llmService = LucidPalApp.makeLLMService()
```

#### MarketingTapOverlayWindow.swift
`UIWindow` subclass that intercepts `sendEvent(_:)` and shows a branded tap ripple at every `.began` touch.

Visual: orange circle (`#f97316`, 22pt) → spring expands to 60pt → fades → gone. Total: 450ms.
```swift
#if DEBUG
final class MarketingTapOverlayWindow: UIWindow {
    override func sendEvent(_ event: UIEvent) {
        super.sendEvent(event) // never consumes touches
        guard let touches = event.allTouches else { return }
        for touch in touches where touch.phase == .began {
            showRipple(at: touch.location(in: self))
        }
    }
    // ...
}
#endif
```

Wired in `AppDelegate.application(_:didFinishLaunchingWithOptions:)` inside the existing `#if DEBUG` block. Window level `.alert + 1`, `isUserInteractionEnabled = false`.
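The elided `showRipple(at:)` could be implemented roughly like this — a sketch matching the visual spec above; exact animation curves are a design choice:

```swift
#if DEBUG
// Sketch of the ripple described above: orange #f97316, 22pt circle
// spring-expanding to 60pt while fading, ~450ms total.
extension MarketingTapOverlayWindow {
    func showRipple(at point: CGPoint) {
        let ripple = UIView(frame: CGRect(x: 0, y: 0, width: 22, height: 22))
        ripple.center = point
        ripple.layer.cornerRadius = 11
        ripple.backgroundColor = UIColor(
            red: 0xF9 / 255, green: 0x73 / 255, blue: 0x16 / 255, alpha: 0.8)
        ripple.isUserInteractionEnabled = false
        addSubview(ripple)

        UIView.animate(withDuration: 0.45, delay: 0,
                       usingSpringWithDamping: 0.6,
                       initialSpringVelocity: 0.5) {
            ripple.transform = CGAffineTransform(scaleX: 60.0 / 22.0,
                                                 y: 60.0 / 22.0)
            ripple.alpha = 0
        } completion: { _ in
            ripple.removeFromSuperview()
        }
    }
}
#endif
```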
#### Scene Navigation

`LucidPalApp` reads `MarketingEnvironment.scene` at launch and sets the initial navigation state:
| Value | Behaviour |
|---|---|
| `onboarding` | Clear all state, show fresh onboarding |
| `agent` | Skip onboarding, navigate to agent tab, pre-seed 2 past messages |
| `chat` | Skip onboarding, navigate to chat tab, fresh session |
| `notes` | Skip onboarding, navigate to notes tab |
| `habits` | Skip onboarding, navigate to habits tab |
| `live-notes` | Skip onboarding, navigate to live notes |
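The table could map onto a single switch at launch — a sketch where the tab enum cases and helper functions (`resetAllState`, `skipOnboarding`, …) are assumptions, not the app's actual names:

```swift
#if DEBUG
// Sketch — Tab cases and helpers are illustrative placeholders.
func applyMarketingScene(_ scene: String?) {
    guard let scene else { return }
    switch scene {
    case "onboarding": resetAllState()                 // fresh onboarding
    case "agent":      skipOnboarding(); select(.agent); seedPastMessages(2)
    case "chat":       skipOnboarding(); select(.chat) // fresh session
    case "notes":      skipOnboarding(); select(.notes)
    case "habits":     skipOnboarding(); select(.habits)
    case "live-notes": skipOnboarding(); openLiveNotes()
    default:           break
    }
}
#endif
```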
### Phase 3 — MarketingVideoTests.swift (XCUITest)

Location: `apps/lucidpal-ios/UITests/MarketingVideoTests.swift`

One test method per scene. Not testing correctness — testing visual storytelling. `continueAfterFailure = true` so one flaky tap doesn't kill the recording.
Sleep pauses are deliberate storytelling beats, not just technical waits. Every pause gives the viewer time to register what happened.
#### Known element identifiers (from source reading)
| View | Element | XCUITest query |
|---|---|---|
| OnboardingCarouselView | CTA button | `app.buttons.matching(.button, identifier: "Continue").firstMatch` or `app.buttons["Get Started"]` — label set via `.accessibilityLabel(ctaLabel)` |
| OnboardingCarouselView | Skip | `app.buttons["Skip"]` |
| ContentView tab bar | Agent tab | `app.tabBars.buttons["Agent"]` |
| ContentView tab bar | Chat tab | `app.tabBars.buttons["Chat"]` |
| ContentView tab bar | Notes tab | `app.tabBars.buttons["Notes"]` |
| ContentView tab bar | Habits tab | `app.tabBars.buttons["Habits"]` |
| AgentView | Text input | `app.textFields["Ask the agent…"]` |
| ChatView+InputBar | Text input | `app.textFields["Ask anything…"]` |
| NoteEditorView | Title field | `app.textFields["Title"]` |
| NoteEditorView | Save button | `app.buttons["Save"]` |
| NotesListView | New note (toolbar) | `app.navigationBars.buttons.element(boundBy: 0)` — square.and.pencil icon, no label |
| HabitDashboardView | Add habit (toolbar) | `app.navigationBars.buttons["Add"]` — plus icon, no explicit label |
| HabitDetailView | Log button | `app.buttons["Log"]` — from `Label("Log", ...)` |
#### Identifiers to add before XCUITests can be written reliably

Several buttons have no `accessibilityIdentifier`. Add these during Phase 2 implementation:
```swift
// NotesListView — new note toolbar button
Image(systemName: "square.and.pencil")
    .accessibilityIdentifier("new-note-button")

// HabitDashboardView — add habit toolbar button
Image(systemName: "plus")
    .accessibilityIdentifier("add-habit-button")

// AgentView — send button (currently no label)
Button { ... }
    .accessibilityIdentifier("agent-send-button")

// ChatView+InputBar — send button
Button { ... }
    .accessibilityIdentifier("chat-send-button")
```

#### Mock LLM scripted responses
MarketingMockLLMService returns these token streams keyed by the text typed:
| Query | Streamed response |
|---|---|
"What's on my calendar tomorrow?" | "You have 3 events tomorrow: Team Standup at 9 AM, Lunch with Alex at 12:30 PM, and Gym at 6 PM." |
"Schedule dentist Friday at 3pm" | "Done! Added \"Dentist\" on Friday at 3:00 PM → 4:00 PM." |
"When am I free this afternoon?" | "You have a 2-hour gap from 2–4 PM today. Nothing scheduled." |
"Remind me to call mom at 6pm" | "Reminder set for 6:00 PM — \"Call mom\"." |
Token delay: 22ms between tokens. Gives natural typing feel without being slow.
#### Per-scene choreography
##### testOnboardingFlow — 8s

Goal: show how fast setup is. Viewer sees 3 info pages then lands on the agent tab.
```
0.0s  app.launchArguments += ["--scene", "onboarding"]
      app.launch()
      // Onboarding page 1 visible: "Your Pocket AI"
1.2s  sleep(1.2)   // viewer reads page 1
      app.buttons["Continue"].tap()
1.8s  sleep(0.6)   // page 2 transition
      // Page 2: "Knows Your Schedule"
2.8s  sleep(1.0)   // viewer reads page 2
      app.buttons["Continue"].tap()
3.4s  sleep(0.6)
      // Page 3: "Type or Speak"
4.4s  sleep(1.0)   // viewer reads page 3
      app.buttons["Continue"].tap()
5.0s  sleep(0.6)
      // Model selection page — mock LLM already loaded, "Get Started" visible
5.8s  sleep(0.8)   // viewer sees "no download needed"
      app.buttons["Get Started"].tap()
6.5s  sleep(1.5)   // agent tab appears — REVEAL moment
      // End
```

##### testAgentFlow — 8s
Goal: show agent answering a calendar query instantly. Showcase on-device speed.
```
0.0s  app.launchArguments += ["--scene", "agent"]
      app.launch()
      // Agent tab visible, orb idle, seeded calendar data in background
1.2s  sleep(1.2)   // viewer registers orb + abilities
      app.textFields["Ask the agent…"].tap()
1.6s  sleep(0.4)   // keyboard slides up
      app.typeText("What's on my calendar tomorrow?")
3.2s  sleep(0.4)   // viewer sees typed query
      app.buttons["agent-send-button"].tap()
      // Mock LLM streams response tokens at 22ms each (~1.8s for full response)
5.4s  sleep(2.2)   // response finishes streaming
      // Calendar event cards rendered below response
6.8s  sleep(1.4)   // viewer reads the answer
      // End
```

##### testChatFlow — 8s
Goal: show private chat. Emphasise "no cloud" — tap send, answer appears, no spinner.
```
0.0s  app.launchArguments += ["--scene", "chat"]
      app.launch()
      // Chat tab, empty session, clean slate
1.0s  sleep(1.0)
      app.textFields["Ask anything…"].tap()
1.4s  sleep(0.4)
      app.typeText("When am I free this afternoon?")
2.8s  sleep(0.3)
      app.buttons["chat-send-button"].tap()
      // Typing indicator briefly visible, then response streams
4.6s  sleep(1.8)   // response finishes
      // "You have a 2-hour gap from 2–4 PM today."
6.2s  sleep(1.8)   // viewer reads — no cloud icon, no spinner
      // End
```

##### testNotesFlow — 8s
Goal: capture an idea fast. Show note created in seconds.
```
0.0s  app.launchArguments += ["--scene", "notes"]
      app.launch()
      // Notes list with 3 seeded notes visible
1.0s  sleep(1.0)   // viewer sees existing notes
      app.buttons["new-note-button"].tap()
1.5s  sleep(0.5)   // NoteEditorView slides up
      app.textFields["Title"].tap()
2.0s  sleep(0.3)
      app.typeText("Product ideas")
3.0s  sleep(0.4)   // cursor moves to body
      // Type into body (tap below title)
      app.textViews.firstMatch.tap()
      app.typeText("AI-powered habit suggestions based on calendar patterns")
5.5s  sleep(0.6)
      app.buttons["Save"].tap()
6.2s  sleep(0.5)   // note list reappears
      // New note "Product ideas" visible at top of list
7.5s  sleep(1.3)   // viewer sees note saved
      // End
```

##### testHabitsFlow — 8s
Goal: show streaks are alive and logging is one tap.
```
0.0s  app.launchArguments += ["--scene", "habits"]
      app.launch()
      // Habits list: Morning run (12🔥), Read 20min (7🔥), Sleep 8h (3🔥)
1.2s  sleep(1.2)   // viewer registers streaks
      // Tap "Morning run" habit card (first item in list)
      app.collectionViews.cells.firstMatch.tap()
2.0s  sleep(0.8)   // HabitDetailView slides in, streak calendar visible
      // Viewer sees streak calendar filled in
3.5s  sleep(1.5)
      app.buttons["Log"].tap()
4.2s  sleep(0.4)   // success animation plays
      // Checkmark / celebration animation
5.5s  sleep(1.3)   // viewer sees it logged
      // Streak counter increments to 13
6.5s  sleep(1.0)
      // End
```

##### testLiveNotesFlow — 10s
Goal: show real-time transcription. Microphone active, text streams in live.
```
0.0s  app.launchArguments += ["--scene", "live-notes"]
      app.launch()
      // Notes list visible
1.0s  sleep(1.0)
      // Tap "Start Live Note" button (onStart action in NotesListView)
      app.buttons.matching(NSPredicate(format: "label CONTAINS 'Live'")).firstMatch.tap()
1.6s  sleep(0.6)   // live notes recording view opens
      // Microphone active indicator, waveform animating
      // Mock transcription streams in: "Team meeting notes — Q2 roadmap discussion..."
3.0s  sleep(4.0)   // viewer watches text appear in real time
      // Full sentence visible: "Next sprint: focus on onboarding improvements and live note export."
7.5s  sleep(1.5)   // viewer reads the transcribed text
      app.buttons["Cancel"].tap()   // or let it run to clip end
      // End
```

#### setUp / boilerplate
```swift
import XCTest

final class MarketingVideoTests: XCTestCase {
    var app: XCUIApplication!

    override func setUp() {
        super.setUp()
        continueAfterFailure = true
        app = XCUIApplication()
        app.launchArguments = ["--marketing-demo-mode"]
    }

    private func launch(scene: String) {
        app.launchArguments += ["--scene", scene]
        app.launch()
    }
}
```

### Phase 4 — scripts/gen-videos.sh
Mirrors the `gen-screenshots.sh` structure. Same flags:

- bare → generate + copy
- `--only copy` → copy last encoded files without re-running tests
- `--deploy-dev` → generate + copy + build + deploy to `dev.lucidpal.pages.dev`
- `--scene <name>` → regenerate a single scene only
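A plausible argument-parsing skeleton for the script — flag names come from the list above, everything else (variable names, error messages) is an assumption; the actual work is elided:

```shell
#!/usr/bin/env bash
# Sketch of gen-videos.sh flag handling — the generate/copy/deploy
# implementations are placeholders.
set -euo pipefail

MODE="generate"
ONLY_SCENE=""

while [[ $# -gt 0 ]]; do
  case "$1" in
    --only)       MODE="${2:?--only needs an argument, e.g. --only copy}"; shift 2 ;;
    --deploy-dev) MODE="deploy"; shift ;;
    --scene)      ONLY_SCENE="${2:?--scene needs a name}"; shift 2 ;;
    *)            echo "unknown flag: $1" >&2; exit 1 ;;
  esac
done

echo "mode=$MODE scene=${ONLY_SCENE:-all}"
```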
#### Key implementation details

**Recording:** use SIGINT, not SIGTERM, to stop `xcrun simctl io` — the MP4 container needs a clean close; SIGTERM produces a malformed file.
```bash
xcrun simctl io booted recordVideo --codec=h264 --force "/tmp/lp_${name}.mp4" &
REC_PID=$!
sleep 0.5   # let recorder initialize before xcodebuild starts

xcodebuild test -project "$PROJ" -scheme "$SCHEME" \
  -destination "$DEST" \
  -only-testing:"LucidPalUITests/MarketingVideoTests/${test}"

kill -INT $REC_PID
wait $REC_PID 2>/dev/null || true
```

**Encoding:** trim 1.5s off the head (app launch flash), encode to both formats.
```bash
# WebM VP9 — smaller, all modern browsers
ffmpeg -ss 1.5 -i "/tmp/lp_${name}.mp4" \
  -vf "scale=390:844,fps=30" \
  -c:v libvpx-vp9 -crf 35 -b:v 0 -an \
  "$WEBSITE_DIR/${name}.webm" -y

# MP4 H.264 — Safari fallback, faststart for HTTP partial content
ffmpeg -ss 1.5 -i "/tmp/lp_${name}.mp4" \
  -vf "scale=390:844,fps=30" \
  -c:v libx264 -crf 28 -preset slow -an \
  -movflags +faststart \
  "$WEBSITE_DIR/${name}.mp4" -y
```

**Prerequisite check:**
```bash
command -v ffmpeg >/dev/null || { echo "✗ ffmpeg not found — brew install ffmpeg"; exit 1; }
command -v xcodegen >/dev/null || { echo "✗ xcodegen not found — brew install xcodegen"; exit 1; }
```

#### Target file sizes
| Clip | Duration | WebM | MP4 |
|---|---|---|---|
| onboarding | 8s | ~380KB | ~580KB |
| agent | 8s | ~380KB | ~580KB |
| chat | 8s | ~380KB | ~580KB |
| notes | 8s | ~380KB | ~580KB |
| habits | 8s | ~380KB | ~580KB |
| live-notes | 10s | ~480KB | ~720KB |
| **Total** | 50s | ~2.4MB | ~3.6MB |
Total committed: ~6MB. Within Cloudflare Pages limits, no LFS needed.
### Phase 5 — Website: lp-phone-video Component

New standalone Angular component.
```
src/app/shared/components/lp-phone-video/
  lp-phone-video.component.ts
  lp-phone-video.component.html
  lp-phone-video.component.scss
```

#### State
```typescript
const SCENES = [
  { id: 'onboarding', label: 'Setup in 30 seconds', src: '/videos/onboarding' },
  { id: 'agent', label: 'Ask your AI anything', src: '/videos/agent' },
  { id: 'chat', label: 'Fully private chat', src: '/videos/chat' },
  { id: 'notes', label: 'Capture every thought', src: '/videos/notes' },
  { id: 'habits', label: 'Build real habits', src: '/videos/habits' },
];

activeIndex = signal(0);
transitioning = signal(false);
progress = signal(0);          // 0–1, drives thin progress bar
showFallback = signal(false);  // prefers-reduced-motion
```

#### A/B Crossfade
Two stacked `<video>` elements. On scene change:

1. Set the next video's `src`, call `.load()`
2. Start playing (opacity 0)
3. `transitioning = true` → CSS `transition: opacity 300ms ease` swaps them
4. After the transition: swap roles, `transitioning = false`

No library. Pure CSS transition.
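The role-swap logic, stripped of Angular and DOM specifics — a sketch where `VideoSlot` stands in for a `<video>` element ref; the real component would also call `.load()`/`.play()` and wait for the CSS transition before swapping:

```typescript
// Minimal model of the A/B crossfade role swap (DOM-free sketch).
interface VideoSlot {
  src: string;
  opacity: number; // 1 = visible, 0 = hidden; CSS animates the change
}

class Crossfader {
  constructor(public front: VideoSlot, public back: VideoSlot) {}

  /** Load the next scene into the hidden slot, then swap visibility. */
  showScene(src: string): void {
    this.back.src = src;   // 1. set next video src (+ .load() in real code)
    this.back.opacity = 1; // 2–3. hidden video plays, CSS fades it in (300ms)
    this.front.opacity = 0; //     while the current one fades out
    // 4. after the transition: swap roles so "back" is free for the next scene
    [this.front, this.back] = [this.back, this.front];
  }
}

const fader = new Crossfader(
  { src: '/videos/onboarding.webm', opacity: 1 },
  { src: '', opacity: 0 },
);
fader.showScene('/videos/agent.webm');
console.log(fader.front.src); // '/videos/agent.webm'
```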
#### Preload Strategy

When the current video reaches 50% playback, inject `<link rel="preload" as="video">` for the next scene. The next video is in the browser cache before the crossfade begins.
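One way to express the trigger as a pure function — a sketch; the function name is an assumption, and the real component would create the `<link rel="preload">` elements and append them to `document.head`:

```typescript
// Sketch: decide when to preload the next scene's files.
// Fires once per scene, when playback passes 50%.
function nextPreloadTargets(
  currentTime: number,
  duration: number,
  nextSrcBase: string,       // e.g. '/videos/agent'
  alreadyPreloaded: boolean, // reset on each scene change
): string[] {
  if (alreadyPreloaded || duration <= 0) return [];
  if (currentTime / duration < 0.5) return [];
  // In the component: for each URL, create <link rel="preload" as="video">
  // and append it to document.head.
  return [`${nextSrcBase}.webm`, `${nextSrcBase}.mp4`];
}

console.log(nextPreloadTargets(5, 8, '/videos/agent', false));
// → ['/videos/agent.webm', '/videos/agent.mp4']
```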
#### Browser Compatibility

```html
<video autoplay muted playsinline>
  <source [src]="scene.src + '.webm'" type="video/webm" />
  <source [src]="scene.src + '.mp4'" type="video/mp4" />
</video>
```

WebM VP9 first (Chrome, Firefox, Edge). MP4 H.264 fallback (Safari).
#### Guards

```typescript
// SSR: skip entirely, render static screenshot fallback
if (!isPlatformBrowser(this.platformId)) return;

// Reduced motion: show static phone mockup (existing screenshots)
if (window.matchMedia('(prefers-reduced-motion: reduce)').matches) {
  this.showFallback.set(true);
  return;
}

// Battery: pause when tab hidden
document.addEventListener('visibilitychange', () =>
  document.hidden ? this.pauseAll() : this.resumeCurrent(),
);
```

### Phase 6 — Website: Hero Integration
Swap the static `.phone-screen` HTML in `lp-hero.component.html` for `<lp-phone-video />`.

**Remove:** `.chat-bubble` divs, `.phone-cal-card`, `.phone-input`, `.phone-status`, `.phone-header` — all replaced by video content.

**Keep:** the `.phone` shell div (hardware frame), `.orbit-ring-1/2` (CSS, now ambient/subtle), `.hero-float-cards`.

Reduce orbit ring animation speed and opacity once the video is the focal point.
### Phase 7 — Three.js Enhancements (separate PR)

Independent of Phase 6. Can ship any time after the video player is confirmed working.
#### Hero neural net

Replace `#particle-canvas` (custom 2D canvas) with Three.js `Points` + `LineSegments`. Adds true z-axis depth and GPU-accelerated rendering. Reuses the existing mouse/scroll signal wiring.
#### Phone orbital rings

Replace the CSS `.orbit-ring-1/2` with Three.js 3D orbital planes at 40° and 70° inclination. Nodes travel ellipses with z-depth. Rendered on a `position: absolute` canvas behind the phone div — nodes naturally appear "behind" the phone frame.

Both use dynamic `import('three')` — a single shared chunk that loads only in the browser, with zero SSR impact.
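The shared-chunk pattern reduces to memoizing one dynamic import so both effects trigger a single download — a sketch; the stub loader below stands in for the real `import('three')`:

```typescript
// Sketch: memoize a dynamic import so two effects share one chunk.
function memoize<T>(load: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | null = null;
  // Both call sites receive the same promise, so load() runs once.
  return () => (cached ??= load());
}

// In the app this would be:
//   const loadThree = memoize(() => import('three'));
// Stubbed here for illustration:
const loadThree = memoize(async () => ({ name: 'three-stub' }));

loadThree(); // neural net effect
loadThree(); // orbital rings effect — reuses the cached promise
```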
## Dependency Graph

```
Phase 1 (UITest target in project.yml) ── independent, start now
Phase 2 (iOS infra) ───────────────────── independent, start now
Phase 5 (website component) ───────────── independent, start now
Phase 3 (XCUITests) ──────────── needs Phase 1 + 2
Phase 4 (gen-videos.sh) ──────── needs Phase 3 working + ffmpeg
Phase 6 (hero integration) ───── needs Phase 5 + video files from Phase 4
Phase 7 (Three.js) ───────────── fully independent, any time
```

## File Inventory
### New iOS files

| File | Notes |
|---|---|
| `Sources/Marketing/MarketingEnvironment.swift` | `#if DEBUG` |
| `Sources/Marketing/MarketingDataSeeder.swift` | `#if DEBUG` |
| `Sources/Marketing/MarketingMockLLMService.swift` | `#if DEBUG` |
| `Sources/Marketing/MarketingTapOverlayWindow.swift` | `#if DEBUG` |
| `UITests/MarketingVideoTests.swift` | XCUITest target |
### Modified iOS files

| File | Change |
|---|---|
| `project.yml` | Add LucidPalUITests target + scheme entry |
| `Sources/App/LucidPalApp.swift` | `makeLLMService()` factory + scene nav |
| `Sources/App/AppDelegate.swift` | Wire tap overlay window in `#if DEBUG` block |
### New website files

| File | Notes |
|---|---|
| `src/app/shared/components/lp-phone-video/lp-phone-video.component.ts` | |
| `src/app/shared/components/lp-phone-video/lp-phone-video.component.html` | |
| `src/app/shared/components/lp-phone-video/lp-phone-video.component.scss` | |
| `public/videos/onboarding.webm` + `.mp4` | Binary, ~960KB |
| `public/videos/agent.webm` + `.mp4` | Binary, ~960KB |
| `public/videos/chat.webm` + `.mp4` | Binary, ~960KB |
| `public/videos/notes.webm` + `.mp4` | Binary, ~960KB |
| `public/videos/habits.webm` + `.mp4` | Binary, ~960KB |
| `public/videos/live-notes.webm` + `.mp4` | Binary, ~1.2MB |
### Modified website files

| File | Change |
|---|---|
| `src/app/sections/lp-hero/lp-hero.component.html` | Swap phone content for `<lp-phone-video />` |
| `src/app/sections/lp-hero/lp-hero.component.ts` | Import component, lighten orbit ring styles |
| `src/app/sections/lp-hero/lp-hero.component.scss` | Reduce orbit ring opacity/speed |
### New scripts

| File | Notes |
|---|---|
| `scripts/gen-videos.sh` | Mirrors `gen-screenshots.sh` |
## Current Status
| Phase | Status |
|---|---|
| Phase 1 — UITest target in project.yml | ✅ Done |
| Phase 2 — iOS marketing infrastructure | ✅ Done |
| Phase 3 — MarketingVideoTests.swift | ✅ Done |
| Phase 4 — gen-videos.sh | ✅ Done |
| Phase 5 — lp-phone-video Angular component | ✅ Done |
| Phase 6 — Hero integration | ⬜ Not started |
| Phase 7 — Three.js enhancements | ⬜ Not started |
Branch: `feat/llm-control-plan-gemini` (active branch when this plan was written).
## Risks
| Risk | Mitigation |
|---|---|
| XCUITest timing flaky | `continueAfterFailure = true`; mock LLM makes responses deterministic |
| MP4 malformed on stop | Use SIGINT, not SIGTERM, to stop the recorder |
| Personal data in videos | MarketingDataSeeder wipes real stores, seeds only fake content |
| `#if DEBUG` leaks to prod | All marketing code in `#if DEBUG` blocks; App Store = Release config |
| Video autoplay blocked | `autoplay muted playsinline` — muted autoplay is universally allowed |
| SSR plays video on server | `isPlatformBrowser` guard in `afterViewInit` |
| Cloudflare Pages file limit | ~6MB total, well within the 25MB per-file limit |