CareCam · AI-assisted therapy intelligence · Product Design Lead (0→1 UX)

CareCam:
turning therapy session video into therapist-validated evidence.

CareCam turns therapy session video into structured, therapist-validated reports. I led the 0 to 1 product strategy and UX architecture for a live pilot across five therapy centers, designing the review loop where AI detects patterns but clinicians stay accountable.

CareCam research synthesis and AI-assisted therapy workflow framing
From research synthesis to AI-assisted therapy workflow framing.
Role

Founding Product Design and Strategy Lead

Ownership

Product vision, UX architecture, research synthesis, clinical workflow design, AI review model, and roadmap framing.

Evidence

5 pilot centers, 50+ therapy sessions analyzed, therapist annotation used as ground truth.

Core Product Decision

Validate session intelligence before moving into goal optimization.

01 · Discover · The Problem

The invisible gap in therapy delivery:
observation was rich, documentation was fragile.

Therapists observed rich behavioral evidence live, but by the time notes were written, timing was lost, context was compressed, and detail faded. Two gaps became clear: a Documentation Gap and a Goal Optimization Gap.

Field research included interviews with 5 therapy center owners and clinical leads, workflow observation across ABA and Speech Therapy, and analysis of 50+ live therapy sessions to identify where observation, documentation, and decision quality broke down.

Documentation Gap

Manual notes were retrospective and therapist-dependent, producing inconsistent reporting quality.

Goal Optimization Gap

Goal updates were delayed and often memory-based, so session-level progress signals were missed.

Field visits across therapy centers
Field visits across 5 therapy centers: contextual inquiry in live therapy environments.
02 · Define · Strategy

A two-phase roadmap built around clinical trust:
validation before optimization.

Phase 1 focused on structured session intelligence: converting session video into therapist-validated reports that clinicians could trust. The objective was not automating decisions, but making observation reproducible and defensible.

Phase 2 goal optimization was intentionally sequenced after Phase 1: cross-checking session behaviors with treatment plans, surfacing alignment gaps, and identifying plateau signals only once the behavioral baseline was reliable.

Behavior detection framework

Before model training, a clinically grounded taxonomy was defined: 7 behavior categories and 32 behaviors. ML scope was staged deliberately, prioritizing higher-confidence behaviors first to protect therapist trust during pilot validation.
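As a minimal sketch of how that staging could be represented, assuming illustrative category and behavior names rather than the actual 7-category, 32-behavior taxonomy:

```python
from dataclasses import dataclass

# Illustrative taxonomy sketch. Category and behavior names are placeholders,
# not the actual CareCam taxonomy (7 categories, 32 behaviors in the real framework).

@dataclass(frozen=True)
class Behavior:
    category: str              # one of the clinical categories
    name: str                  # one of the defined behaviors
    detection_confidence: str  # "high" | "medium" | "low" from early model checks

TAXONOMY = [
    Behavior("Engagement", "eye_contact", "high"),
    Behavior("Engagement", "joint_attention", "medium"),
    Behavior("Communication", "verbal_request", "high"),
    Behavior("Regulation", "self_stimulatory_motion", "low"),
    # ...remaining behaviors omitted in this sketch
]

# Staged ML scope: only higher-confidence behaviors surface in the pilot UI,
# protecting therapist trust while lower-confidence detections stay internal.
PILOT_SCOPE = [b for b in TAXONOMY if b.detection_confidence == "high"]
```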

Critical Product Decision 01

Reduce MVP to the core clinical loop for faster validation and lower delivery complexity risk.

Critical Product Decision 02

Reframe AI strategy around privacy, regional deployment constraints, and model availability reality.

What I Chose Not To Automate

I deliberately kept diagnosis, goal changes, and clinical interpretation outside the automated layer. The product's job was to make session evidence easier to review, not to replace therapist judgment.

03 · Architect · Workflow

Upload, analyze, review, report:
AI prepares the evidence; clinicians make the call.

01

Upload Session Video

Therapist uploads a recorded therapy session to the secure platform.

02

AI Analysis

Platform detects structured behavioral events and generates initial insights.

03

Therapist Review

Clinician reviews, edits, and validates all insights and recommendations.

04

Export & Share

Generate a professional PDF report for parents, insurance, or clinical records.

Therapist Control Model

Accept / Reject / Relabel actions are core clinical controls, not secondary UI actions.

Temporal Evidence Logic

A behavior label without timestamp context is clinically weak; sequence defines interpretation.
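A minimal sketch of what a timestamped, clinician-controlled detection record could look like; field and function names here are assumptions, not the production schema:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

# Illustrative data shape only; names are assumptions, not the production schema.

class ReviewAction(Enum):
    PENDING = "pending"   # AI detection not yet reviewed
    ACCEPT = "accept"     # clinician confirms the detection
    REJECT = "reject"     # clinician removes it from the record
    RELABEL = "relabel"   # clinician keeps the event but corrects the label

@dataclass
class BehaviorEvent:
    session_id: str
    label: str                          # AI-proposed behavior from the taxonomy
    start_s: float                      # timestamps carry the sequence that
    end_s: float                        # defines clinical interpretation
    model_confidence: float             # surfaced in the review UI, never hidden
    review: ReviewAction = ReviewAction.PENDING
    relabeled_as: Optional[str] = None  # set only when review == RELABEL

def report_ready(events: list[BehaviorEvent]) -> list[BehaviorEvent]:
    """Only clinician-validated events flow into the exported report."""
    return [e for e in events if e.review in (ReviewAction.ACCEPT, ReviewAction.RELABEL)]
```

The design intent is that Accept, Reject, and Relabel are first-class states of the evidence itself, so nothing reaches the exported report without an explicit clinician action.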

CareCam UX architecture
UX architecture from intake to report outputs.
ML visual and audio detection scope
Detection scope: visual + audio channels aligned to clinical taxonomy.
04 · Design · Validation & Workflow Impact

From memory-based reporting to evidence-based review:
documentation moved out of live therapy moments.

Before CareCam
  • Manual and retrospective note workflows.
  • In-session note-taking interrupted therapy attention.
  • Quarterly reporting often reconstructed from memory.
After CareCam
  • Sessions became reviewable, timestamped evidence.
  • Therapists stayed present in-session and reviewed post-session.
  • Reports generated from validated session data.
Ground truth validation process

Therapist manual annotation was used as the benchmark for AI detections. Precision, recall, and F1 variation by behavior category directly influenced interface treatment and uncertainty communication.
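For reference, the per-category metrics reduce to standard precision, recall, and F1 over matched detections; the sketch below uses hypothetical counts, not pilot data:

```python
def prf1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    """Precision, recall, F1 from matched / unmatched detection counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical per-category counts of AI detections matched against therapist
# annotations (the ground truth); not the pilot's actual numbers.
counts = {
    "Engagement":    {"tp": 42, "fp": 6,  "fn": 9},
    "Communication": {"tp": 30, "fp": 11, "fn": 14},
    "Regulation":    {"tp": 12, "fp": 10, "fn": 18},
}

for category, c in counts.items():
    p, r, f = prf1(**c)
    # Categories with weaker scores get more conservative interface treatment:
    # explicit uncertainty cues and mandatory review before export.
    print(f"{category:14s} precision={p:.2f} recall={r:.2f} f1={f:.2f}")
```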

Session review interface
Session review interface: clinician validation layer for AI-generated insights.
Manual annotation with AI metrics
Manual annotation vs AI counts with precision/recall/F1 by behavior category.
Independent Build · Skill-based End-to-End

In parallel, I independently scoped and shipped a skill-detection MVP end-to-end: framing, UX architecture, interaction design, and live ML integration. It validated the same core loop quickly: session signals in, clinician-reviewed outputs out.

View full build documentation
Skill-detection exploration
Parallel exploration: skill-detection review surface with evidence timeline.
Skill-detection UI reference
Confidence cues and defensible validation actions in skill-based workflow exploration.
05 · Deliver · Outcomes

Pilot outcomes and platform signal:
usable now, expandable by design.

5

Live pilot centers

Actively uploading session data.

50+

Therapy sessions analyzed

AI detections compared against therapist annotations as ground truth.

5 domains

Platform vision

Therapy, nursing, elderly, neonatal, ICU/rehab.

The same loop scales beyond therapy: upload session, review AI evidence, validate clinically, generate report. Taxonomy changes by domain; accountability pattern remains stable.
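A minimal sketch of that claim, with placeholder domains, behaviors, and stub stages standing in for the real pipeline:

```python
from typing import List

# Sketch of the claim above: the taxonomy is the only domain-specific input,
# the accountability loop is not. Domains, behaviors, and function names here
# are placeholders, not the production CareCam pipeline.

DOMAIN_TAXONOMIES = {
    "therapy":  ["eye_contact", "verbal_request", "self_regulation"],
    "nursing":  ["fall_risk_posture", "missed_mobility_check"],
    "neonatal": ["feeding_cue", "distress_cry"],
}

def detect(video_path: str, taxonomy: List[str]) -> List[str]:
    # Stub for the AI analysis stage.
    return [f"{label}@00:00" for label in taxonomy]

def clinician_validate(detections: List[str]) -> List[str]:
    # Stub for the human review stage; nothing is auto-approved in the real loop.
    return detections

def run_loop(domain: str, video_path: str) -> str:
    """Upload -> analyze -> validate -> report, stable across domains."""
    taxonomy = DOMAIN_TAXONOMIES[domain]
    validated = clinician_validate(detect(video_path, taxonomy))
    return f"{domain} report: {len(validated)} validated events"

print(run_loop("neonatal", "session_001.mp4"))
```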

carecam.in
Report front page
Structured report output for clinical and parent communication.
Report back page
Detailed behavior event records and notes.
NIEPID conference
NIEPID conference presentation and therapist adoption interest.
06 · Reflect

In clinical AI, interface decisions
are accountability decisions.

Model accuracy is not enough. Uncertainty must be visible where clinician decisions are made.

Therapist controls are not confirmations; they are responsibility boundaries inside high-stakes workflows.

If repeated, trust instrumentation should start earlier: overrides, relabel frequency, skips, and completion behavior.

Design principles
  • Human-in-the-loop review at every critical decision node.
  • Temporal clarity over isolated labels.
  • Progressive disclosure for cognitive control.
  • Visible system state for trust and accountability.
Core takeaway

CareCam proved that structured, therapist-validated evidence can replace retrospective memory workflows without sacrificing clinical agency.