Signsight Interactive Hub

Signsight Interactive Hub is a gesture-based learning platform built on the Signsight system. Designed for families using both spoken and signed languages, it breaks sign language into visual, trackable components through real-time motion feedback. The platform helps users learn gestures, practice together, and understand communication in shared space.

Category:

Product Design
UI/UX Design
Interaction Design

Tools:

SolidWorks, Rhino
Adobe Creative Suite
TouchDesigner

Core Technology:

Built with MediaPipe, this system tracks hand gestures and body movements in real time, enabling seamless, non-verbal interaction through sign language.
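As a sketch of what "real-time motion feedback" can mean in practice: MediaPipe's hand tracker emits 21 normalized (x, y, z) landmarks per hand each frame, and motion can be derived by comparing consecutive frames. The names below (`Landmark`, `motion_speed`) are illustrative assumptions, not part of the Signsight codebase.

```python
# Minimal sketch: derive motion feedback from two consecutive landmark frames.
# MediaPipe hand landmarks are normalized (x, y, z) points; here we model them
# with a plain dataclass so the logic is self-contained.
from dataclasses import dataclass
from math import sqrt

@dataclass
class Landmark:
    x: float  # normalized [0, 1] image coordinate
    y: float
    z: float  # relative depth

def motion_speed(prev: list[Landmark], curr: list[Landmark], fps: float = 30.0) -> float:
    """Average landmark displacement per second between two frames."""
    dist = sum(
        sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)
        for a, b in zip(prev, curr)
    )
    return dist / len(prev) * fps

# Two toy frames: the whole hand shifts 0.01 to the right in one frame.
frame1 = [Landmark(0.5, 0.5, 0.0) for _ in range(21)]
frame2 = [Landmark(0.51, 0.5, 0.0) for _ in range(21)]
speed = motion_speed(frame1, frame2)
```

In the real system this value would be computed inside the camera loop and fed back to the projection as a visual cue.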

HMW
How can we build shared understanding through sign language?
Especially: how can learners grasp its spatial grammar through interactive, visual, and immersive experiences?
Final Direction
Develop an interactive app within the Signsight system
We shifted the concept from a standalone app to a projection-based interactive hub designed to run alongside the Signsight Product System. This allowed us to create a shared, ambient learning experience where families could see, move, and learn together in real time.
Final Design
Signsight Interactive Hub
Built on the Signsight smart projector, the hub functions as a visual detection system powered by MediaPipe. It captures sign language and facilitates interaction, serving as an interactive center for families who use mixed languages and helping bridge the communication gap between parents and children.

(01).

Control Through Signs or Mobile Remote

With projection as the interface and gesture tracking built in, we added something new: use signs to control the system.
Adjust settings with your phone, or with your hands.
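One way to picture sign-based control is a simple dispatch table from recognized gesture labels to the same commands the phone remote issues. The labels and commands below are assumptions for illustration, not the actual Signsight command set.

```python
# Illustrative sketch: map a recognized sign label to a system command,
# mirroring the mobile remote. Labels and commands are hypothetical.
COMMANDS = {
    "palm_up": "volume_up",
    "palm_down": "volume_down",
    "fist": "pause",
    "open_hand": "play",
}

def dispatch(sign_label: str) -> str:
    """Return the command for a recognized sign; unknown signs are ignored."""
    return COMMANDS.get(sign_label, "noop")
```

Because both input paths resolve to the same command names, hands and phone stay interchangeable.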

(02).

MediaPipe Breakdown

MediaPipe doesn’t just track gestures; its landmark data lets us break each sign down into the five components that form the foundation of sign language: handshape, orientation, movement, location, and facial expression (non-manual signals).
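A rough sketch of how those five components can be derived from raw landmark coordinates: MediaPipe itself outputs only (x, y, z) points, so the component labels below come from heuristics. The thresholds, label names, and the `decompose` function are illustrative assumptions, not the production logic.

```python
# Hypothetical decomposition of one hand frame into the five sign parameters.
# Inputs are normalized (x, y, z) tuples; thresholds are toy values.
def decompose(wrist, index_tip, prev_wrist):
    handshape = "extended" if index_tip[1] < wrist[1] else "closed"   # tip above wrist?
    orientation = "palm_in" if wrist[2] < index_tip[2] else "palm_out"  # relative depth
    dx = wrist[0] - prev_wrist[0]
    movement = "rightward" if dx > 0.01 else "leftward" if dx < -0.01 else "held"
    location = "upper" if wrist[1] < 0.5 else "lower"  # image-space region
    non_manual = None  # facial expression comes from the face tracker, not the hand
    return {"handshape": handshape, "orientation": orientation,
            "movement": movement, "location": location, "non_manual": non_manual}

components = decompose(
    wrist=(0.5, 0.6, 0.1),
    index_tip=(0.5, 0.4, 0.2),
    prev_wrist=(0.45, 0.6, 0.1),
)
```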

Showcase 01.

Use Case 1: Deconstruct Learning

Activate the auto-tracking system and type in “I love you”; the hub shows an example of the sign. You can play, pause, and see which components form the sign. For deeper understanding, click the button to see a detailed breakdown of each gesture. This is how frame learning works.
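The frame-learning interaction above can be modeled as a sign stored as an ordered list of component "frames" that the learner steps through, with play and pause. The `SignPlayer` class and the "I love you" breakdown are illustrative, not the app's actual data model.

```python
# Toy model of frame learning: a sign as an ordered list of component frames.
class SignPlayer:
    def __init__(self, frames):
        self.frames = frames
        self.position = 0
        self.playing = False

    def play(self):
        self.playing = True

    def pause(self):
        self.playing = False

    def step(self):
        """Return the current frame, advancing by one while playing."""
        frame = self.frames[self.position]
        if self.playing and self.position < len(self.frames) - 1:
            self.position += 1
        return frame

# Hypothetical breakdown of the "I love you" sign into two frames.
ily = SignPlayer([
    {"handshape": "ILY (thumb, index, pinky extended)", "location": "chest"},
    {"movement": "small outward push", "orientation": "palm out"},
])
```

Pausing simply freezes `position`, so the learner can inspect one component at a time.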

Showcase 02.

Use Case 2: Reordering Language Syntax

This module addresses the difference in word order between sign language and spoken language, helping you structure your signs correctly.

For instance, if you type “I went to the library yesterday,” the app will break it down into individual signs and reorder them according to sign language syntax. You can tap each sign to see how it’s performed and then replay them together. The app also provides hints, like “fingerspell A” or “move from mouth to ear.”
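The reordering step can be sketched as a toy heuristic: ASL commonly fronts time words, glosses verbs uninflected, and drops most articles and prepositions. The word lists and `to_gloss` function below are simplified assumptions; real reordering needs a proper parser.

```python
# Simplified spoken-to-sign reordering: move time adverbs to the front,
# drop words ASL gloss usually omits, and uninflect verbs.
# All lists here are toy examples, not a full grammar.
TIME_WORDS = {"yesterday", "today", "tomorrow"}
DROPPED = {"to", "the", "a", "an"}       # usually absent in ASL gloss
LEMMAS = {"went": "GO"}                  # glosses use uninflected verbs

def to_gloss(sentence: str) -> list[str]:
    words = [w.lower() for w in sentence.strip(".").split()]
    time = [w.upper() for w in words if w in TIME_WORDS]
    rest = [LEMMAS.get(w, w).upper() for w in words
            if w not in TIME_WORDS and w not in DROPPED]
    return time + rest

gloss = to_gloss("I went to the library yesterday")
# e.g. ["YESTERDAY", "I", "GO", "LIBRARY"]
```

Each gloss token can then be linked to its sign animation and hint, as in the app.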
