2024 · Designer & Developer
GestureSketch
A webcam-driven mid-air drawing tool for expressive therapy
The Problem
Conventional drawing apps require styluses, menus, and technical confidence. For therapy clients seeking expressive creative outlets, this friction kills the flow before it starts. GestureSketch strips the interface down to a single camera and a set of natural hand gestures.
Role
Full-stack designer and developer — research, interaction design, prototyping, and implementation.
Timeline
6 weeks
Tools & Methods
- MediaPipe (hand landmark detection)
- p5.js (real-time stroke rendering)
- Figma
- SVG export
Interaction Model
I designed a three-gesture vocabulary that maps to natural hand movements: index finger + thumb pinch to draw, thumb + ring finger to cycle colors, thumb + pinky to undo. Button backups for undo and clear sit at screen edges as a confidence-building safety net for new users.
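The vocabulary above can be sketched as a small classifier over MediaPipe's 21-landmark hand model (normalized coordinates). The landmark indices are MediaPipe's standard ones (4 = thumb tip, 8 = index tip, 16 = ring tip, 20 = pinky tip); the threshold value and function names are illustrative, not the shipped code.

```javascript
// Illustrative gesture classifier over MediaPipe hand landmarks.
// Indices follow the standard 21-landmark model; PINCH_THRESHOLD is
// an assumed value and would be tuned per camera setup.
const THUMB_TIP = 4;
const FINGER_TIPS = { draw: 8, cycleColor: 16, undo: 20 };
const PINCH_THRESHOLD = 0.05; // normalized [0,1] distance

function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Returns "draw", "cycleColor", "undo", or null for an open hand.
function classifyGesture(landmarks) {
  const thumb = landmarks[THUMB_TIP];
  let best = null;
  let bestDist = PINCH_THRESHOLD;
  for (const [gesture, tipIndex] of Object.entries(FINGER_TIPS)) {
    const d = dist(thumb, landmarks[tipIndex]);
    if (d < bestDist) {
      best = gesture;
      bestDist = d;
    }
  }
  return best;
}
```

Picking the *closest* pinching finger (rather than the first match) keeps the classifier stable when two fingertips hover near the thumb.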
Key Design Decisions
- Dead-zones around each gesture trigger to absorb jitter from natural hand movement
- Immutable strokes — each stroke stores its own size and color, preventing accidental overrides
- Minimal UI — no toolbars, no menus, nothing that interrupts creative flow
- SVG export for persistent, shareable creations
- LLM prompts deferred to v2 after tuning issues — shipped a clean, fast core first
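The immutable-stroke decision can be sketched as follows: each stroke snapshots the brush color and size at creation and is frozen on commit, so later tool changes can't restyle earlier strokes, and undo reduces to popping the last committed stroke. All names here are illustrative, not the shipped API.

```javascript
// Minimal sketch of immutable per-stroke metadata (hypothetical API).
function createStroke(color, size) {
  return {
    color,                 // snapshotted at stroke start
    size,
    timestamp: Date.now(),
    points: [],
  };
}

function addPoint(stroke, x, y) {
  stroke.points.push({ x, y });
}

// Freezing on commit (e.g. pinch release) makes the stroke immutable.
function commitStroke(stroke, strokes) {
  Object.freeze(stroke.points);
  strokes.push(Object.freeze(stroke));
}

// Undo simply removes the most recently committed stroke.
function undo(strokes) {
  strokes.pop();
}
```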
Therapist Validation
An art therapist who reviewed the prototype praised its potential to bridge the digital/human divide in online therapy sessions: the naturalness of mid-air gestures made the tool feel less like software and more like an expressive medium.
Technical Architecture
- MediaPipe hand-landmark model running in-browser at 60fps
- p5.js canvas for real-time stroke rendering
- Gesture buffer system with configurable dead-zones to prevent false triggers
- Per-stroke metadata storage (color, size, timestamp)
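The gesture buffer above can be sketched as a per-frame debouncer: a gesture only fires after the classifier reports it for a run of consecutive camera frames, which absorbs single-frame jitter. The class name and the default frame count are assumptions for illustration.

```javascript
// Illustrative gesture buffer: confirms a gesture only after it has
// been seen for `holdFrames` consecutive frames (dead-zone debouncing).
class GestureBuffer {
  constructor(holdFrames = 5) {
    this.holdFrames = holdFrames;
    this.candidate = null; // gesture currently being debounced
    this.count = 0;
    this.active = null;    // last confirmed gesture
  }

  // Call once per frame with the raw classifier output (string or null).
  // Returns the newly confirmed gesture on the frame it triggers, else null.
  update(raw) {
    if (raw === this.candidate) {
      this.count++;
    } else {
      this.candidate = raw;
      this.count = 1;
    }
    if (this.count >= this.holdFrames && this.candidate !== this.active) {
      this.active = this.candidate;
      return this.active;
    }
    return null;
  }
}
```

A single jittery frame of a different gesture resets the candidate counter but never dislodges the active gesture, which is what keeps strokes from breaking mid-draw.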
Outcomes