Body Control - Gesture-based Control System

Full-body tracking and gesture recognition for real-time touchless interactions

Category: foundation
Year: 2024

Overview

Body Control is our full-body tracking and gesture recognition technology that transforms human movement into digital input in real time. We use computer vision and machine learning to track body poses, hand movements, and gestures, enabling natural and intuitive touchless interactions.

The technology is designed to be deployable on any platform: from public installations on LED walls to web applications accessible via browser, from Unity standalone apps to interactive kiosks. It works with standard webcams or depth cameras (RealSense, Kinect), making it versatile across very different contexts.

Why use it? Hygienic installations, immersive experiences, accessibility, gamification, interactive retail. No touch, just movement.

Demos & Proofs of Concept

🎮 Neon Bubble Pop

Try the live demo →

Web-based mini game to demonstrate the potential of full-body control. Pop the bubbles using your hands in front of the webcam and get the best score!

  • Technology: MediaPipe Pose + WebGL
  • Platform: Web (accessible from any browser)
  • Features: Hand tracking, collision detection, score system, real-time feedback
  • Deployment: Hosted on Vercel, scalable and performant

Perfect for showing how body tracking can create engaging gaming experiences without physical controllers.
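
The demo's source isn't reproduced here, but the core pop-detection loop can be sketched as a hit test between tracked hand positions and bubble centers in normalized camera coordinates. The `Bubble` type and `POP_RADIUS` value below are illustrative assumptions, not the demo's actual code:

```ts
// Hypothetical sketch of the pop-detection loop; the Bubble type,
// Point type, and POP_RADIUS value are illustrative, not the demo's code.
interface Bubble { x: number; y: number; radius: number; popped: boolean }
interface Point { x: number; y: number }

const POP_RADIUS = 0.04; // hand "hitbox" size in normalized units (assumption)

/** Pop every bubble whose center is close enough to a tracked hand. */
function checkCollisions(hands: Point[], bubbles: Bubble[]): void {
  for (const bubble of bubbles) {
    if (bubble.popped) continue;
    for (const hand of hands) {
      // Circle-vs-circle test in normalized [0, 1] camera coordinates.
      const dist = Math.hypot(hand.x - bubble.x, hand.y - bubble.y);
      if (dist < bubble.radius + POP_RADIUS) {
        bubble.popped = true;
        break;
      }
    }
  }
}
```

Working in normalized [0, 1] coordinates keeps the pop logic independent of screen resolution, so the same code runs unchanged on a laptop preview and a large display.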

🔥 Durex Hot Match — OOH Installation

See the full project →

Large-scale interactive installation for public spaces: a multi-user body tracking system with a thermal-camera effect, matchmaking mechanics based on proximity, and "heat" visualization.

  • Technology: Kinect + custom computer vision + Unity
  • Platform: Standalone installation (outdoor LED wall)
  • Features: Multi-user tracking, real-time graphics, weatherproof hardware
  • Deployment: Physical installation in Milan, outdoor-rated monitors

A concrete example of deploying the technology on custom hardware for a temporary installation in an urban environment.

🍔 Feed the Face

Try the live demo →

Head-controlled mini game showcasing face tracking capabilities. Control the game with your head position and mouth opening: eat the food, avoid obstacles, and get the best score!

  • Technology: MediaPipe Face Mesh + WebGL
  • Platform: Web (accessible from any browser)
  • Features: Head position tracking, mouth opening detection, food/obstacle mechanics, score system
  • Deployment: Hosted on Vercel, scalable and performant

Demonstrates how face tracking can power innovative game mechanics based on natural facial movements and expressions.
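
Mouth opening can be estimated from the Face Mesh inner-lip landmarks (indices 13 and 14 in MediaPipe's 468-point topology). A minimal sketch; the threshold and face-height normalization are assumptions, not the demo's actual tuning:

```ts
// Sketch of mouth-open detection from MediaPipe Face Mesh landmarks.
// Indices 13/14 are the inner upper/lower lip midpoints and 10/152 the
// forehead/chin in the 468-point topology; OPEN_THRESHOLD is an assumption.
interface Landmark { x: number; y: number; z: number }

const OPEN_THRESHOLD = 0.05; // lip gap as a fraction of face height (assumption)

function isMouthOpen(landmarks: Landmark[]): boolean {
  const gap = Math.abs(landmarks[14].y - landmarks[13].y);
  // Normalize by face height so the signal is independent of the
  // user's distance from the camera.
  const faceHeight = Math.abs(landmarks[152].y - landmarks[10].y);
  return gap / faceHeight > OPEN_THRESHOLD;
}
```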

Deployment Options

🎯 Unity Application

Standalone builds for installations, LED walls, interactive kiosks, and screens of any size.

  • Platforms: Windows, macOS, Linux
  • Output: Standalone executables, no internet connection required
  • Ideal for: Permanent installations, events, trade shows, museums, retail stores
  • Hardware: Any PC with dedicated GPU + webcam or depth camera
  • Scale: From small touchpoint screens to multi-meter LED walls

Use case: Install on a mini-PC connected to an LED wall; the system starts automatically and tracks passersby for public interactions.

🌐 Web Application

Web-based deployment accessible from any modern browser, no installation required.

  • Technologies: WebRTC, WebGL, MediaPipe, Three.js
  • Platforms: Desktop browsers (Chrome, Firefox, Safari), mobile with limitations
  • Ideal for: Quick demos, POCs, distributed experiences, universal accessibility
  • Deployment: Standard hosting (Vercel, Netlify, AWS), no complex backend
  • Latency: <100ms on stable connections

Use case: A shareable link that lets clients or end users test the technology wherever they are.
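
As a sketch of what such a web deployment involves, here is a minimal browser setup using the MediaPipe Tasks Vision API; the CDN path, model file, and option values are illustrative assumptions:

```ts
// Minimal browser setup: webcam stream + MediaPipe pose landmarker.
// The CDN path, model file, and option values are illustrative.
import { FilesetResolver, PoseLandmarker } from "@mediapipe/tasks-vision";

async function start(video: HTMLVideoElement): Promise<void> {
  // Ask the browser for the webcam stream (WebRTC getUserMedia).
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();

  // Load the WASM runtime and a pose model.
  const vision = await FilesetResolver.forVisionTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm"
  );
  const landmarker = await PoseLandmarker.createFromOptions(vision, {
    baseOptions: { modelAssetPath: "pose_landmarker_lite.task" },
    runningMode: "VIDEO",
    numPoses: 1,
  });

  // Per-frame loop: detect, then hand the keypoints to the experience.
  const loop = (): void => {
    const result = landmarker.detectForVideo(video, performance.now());
    const pose = result.landmarks[0]; // up to 33 normalized body keypoints
    if (pose) {
      // drive rendering / gesture logic here
    }
    requestAnimationFrame(loop);
  };
  requestAnimationFrame(loop);
}
```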

📱 Cross-Platform Ready

Same codebase, multiple deployments.

  • Mobile: Android app (iOS with hardware limitations)
  • Embedded: Raspberry Pi 4+, Jetson Nano for compact installations
  • Cloud: Remote processing with video streaming, keeping client-side hardware lightweight

🖥️ LED Wall & Screen Installations

Optimized for displays of any size:

  • Small screens: Kiosks, tablet stands (10"-32")
  • Medium displays: Retail walls, reception areas (40"-65")
  • Large LED walls: Public installations, events, facades (multi-meter)
  • Multi-display: Multi-screen setups for extended experiences

Hardware considerations:

  • Scalable output resolution (from 1080p to 4K+)
  • Adaptive frame rate (30-60fps depending on GPU)
  • Camera FOV optimized for user distance
  • Automatic calibration for different configurations (see the mapping sketch below)
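
Calibration ultimately reduces to mapping normalized camera coordinates onto display pixels. A minimal sketch; `DisplayConfig` and its values are illustrative, not a shipped API:

```ts
// Sketch of the camera-to-display mapping behind calibration.
// DisplayConfig and its values are illustrative, not a shipped API.
interface DisplayConfig {
  widthPx: number;   // output resolution
  heightPx: number;
  mirrored: boolean; // mirror horizontally so users see themselves
}

interface Point { x: number; y: number } // normalized [0, 1] camera space

/** Map a normalized landmark into display pixel coordinates. */
function toDisplay(p: Point, cfg: DisplayConfig): Point {
  const x = cfg.mirrored ? 1 - p.x : p.x;
  return { x: x * cfg.widthPx, y: p.y * cfg.heightPx };
}

// Example: a 3840x1080 LED wall, mirrored for natural interaction.
const wall: DisplayConfig = { widthPx: 3840, heightPx: 1080, mirrored: true };
```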

Technical Stack

Computer Vision Pipeline

  • Pose Estimation: MediaPipe Pose, OpenPose, BlazePose (33 full-body keypoints; key indices sketched below)
  • Hand Tracking: MediaPipe Hands (21 landmarks per hand)
  • Face Tracking: MediaPipe Face Mesh (optional, 468 landmarks)
  • Processing: Real-time on GPU (CUDA, Metal, WebGL)
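
For reference, gesture logic typically reads specific indices out of the 33-keypoint pose topology. The `getHands` helper below is illustrative, not part of any SDK:

```ts
// Selected indices in MediaPipe Pose's 33-keypoint topology.
// The getHands helper is illustrative, not part of any SDK.
const NOSE = 0;
const LEFT_WRIST = 15;
const RIGHT_WRIST = 16;
const LEFT_ANKLE = 27;
const RIGHT_ANKLE = 28;

interface Landmark { x: number; y: number; z: number; visibility?: number }

/** Extract the two wrist positions, skipping low-confidence detections. */
function getHands(pose: Landmark[], minVisibility = 0.5): Landmark[] {
  return [pose[LEFT_WRIST], pose[RIGHT_WRIST]].filter(
    (lm) => (lm.visibility ?? 1) >= minVisibility
  );
}
```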

Gesture Recognition

  • Pre-configured gesture library (wave, swipe, point, grab, jump)
  • Custom gesture trainer for application-specific gestures
  • State machine to prevent false triggers (sketched after this list)
  • Confidence thresholds and temporal smoothing
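
Put together, the debouncing logic can be sketched as a small state machine: confidence is smoothed with an exponential moving average, and a gesture fires only after it stays above threshold for several consecutive frames. All constants below are illustrative:

```ts
// Sketch of debounced gesture triggering: EMA smoothing plus a
// hold-to-confirm state machine. All constants are illustrative.
type GestureState = "idle" | "candidate" | "active";

class GestureTrigger {
  private state: GestureState = "idle";
  private smoothed = 0;   // EMA of the raw classifier confidence
  private heldFrames = 0; // consecutive frames above threshold

  constructor(
    private threshold = 0.8, // confidence needed to trigger
    private holdFrames = 5,  // frames the confidence must hold
    private alpha = 0.3      // EMA smoothing factor
  ) {}

  /** Feed one frame's raw confidence; returns true on the rising edge. */
  update(confidence: number): boolean {
    this.smoothed = this.alpha * confidence + (1 - this.alpha) * this.smoothed;

    if (this.smoothed < this.threshold) {
      this.state = "idle";
      this.heldFrames = 0;
      return false;
    }

    if (this.state === "active") return false; // already fired; wait for reset
    this.state = "candidate";
    this.heldFrames += 1;
    if (this.heldFrames >= this.holdFrames) {
      this.state = "active"; // fire once, then stay latched until idle
      return true;
    }
    return false;
  }
}
```

Latching in the "active" state until confidence drops back below threshold is what prevents a single held pose from firing the same gesture on every frame.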

Hardware Support

  • RGB Cameras: Standard webcams (1080p+ recommended)
  • Depth Cameras: Intel RealSense D400 series, Azure Kinect, Kinect v2
  • Range: 1-5m (typical), up to 10m with appropriate lenses
  • Multi-user: Simultaneous tracking of up to 6-10 people

Performance

  • Latency: 30-100ms end-to-end (camera → gesture → response); instrumentation sketch below
  • Frame rate: 30-60fps depending on hardware
  • Accuracy: ±5cm position, ±5° orientation in optimal conditions
  • GPU: GTX 1060 or equivalent minimum; RTX series recommended
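
A rough way to instrument the processing share of that latency in a browser build (this captures detection and response logic only, not camera capture delay; the names and reporting window are illustrative):

```ts
// Rough latency instrumentation for the detect -> respond path.
// Does not include camera capture delay; names are illustrative.
const samples: number[] = [];

function timeFrame(detectAndRespond: () => void): void {
  const start = performance.now();
  detectAndRespond(); // run detection + gesture logic + response
  samples.push(performance.now() - start);
  if (samples.length === 120) { // report every ~2s at 60fps
    const avg = samples.reduce((a, b) => a + b, 0) / samples.length;
    console.log(`avg processing latency: ${avg.toFixed(1)} ms`);
    samples.length = 0;
  }
}
```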

Use Cases

Public Installations
Museums, exhibitions, trade shows. Touchless and hygienic interaction, accessible and engaging.

Retail & OOH
Virtual fitting rooms, product exploration, wayfinding, brand activations.

Gaming & Entertainment
Arcade games, party games, performance art, interactive concerts.

Accessibility
Interfaces for users with reduced mobility, alternatives to traditional touch and controllers.

Fitness & Wellness
Exercise tracking, rep counting, form checking without wearables.

Education & Training
Interactive simulations, hands-free training for sterile or hazardous contexts.

Integration & Development

Available SDKs

  • Unity Plugin: Ready-to-use prefabs, complete C# API
  • Unreal Engine: Blueprint support + C++ API
  • Web SDK: JavaScript library for browsers
  • Python SDK: For rapid prototyping and ML experiments

Custom Development

We can integrate Body Control into any existing stack or develop custom solutions from scratch. From POC to production, we manage the entire development cycle.


Body Control — Natural movement, digital control. Deployable anywhere, scalable for any context.