PROJECT

Dance

Supervisor: Madhuka de Silva, PhD Candidate

Technologies Used

C++

JavaScript

AI/ML

Python

Dance, as an art form, is traditionally taught through visual cues: dancers learn by watching and mirroring instructors to master posture, timing, and spatial awareness. Visual feedback, such as mirrors and an instructor's body positions, provides the essential real-time adjustments that help dancers refine their technique and choreography. However, this visual approach creates significant barriers for individuals who are blind or have low vision, as they cannot fully engage with or benefit from these methods.

Project Dance addresses this challenge by using body tracking and gesture recognition, built on frameworks such as MediaPipe and TensorFlow, to translate dance movements into non-visual modalities such as sonification and haptics. Our goal is to make dance accessible to everyone, allowing individuals of all abilities to experience the joy of movement through an inclusive, technology-driven platform.
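To illustrate the kind of pipeline this involves, the sketch below tracks a dancer's right wrist with MediaPipe Pose and maps its height to a pitch value, a simple form of sonification. It is a minimal, hypothetical example, not the project's actual implementation: the webcam index, the wrist-to-pitch mapping, and the pitch range are illustrative assumptions, and a real system would route the value to a synthesizer or haptic driver rather than printing it.

```python
# Minimal sonification sketch (illustrative only): track the right wrist with
# MediaPipe Pose and map its vertical position to a pitch in Hz.
# Assumes a webcam at index 0 and the packages `mediapipe` and `opencv-python`.

import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

LOW_HZ, HIGH_HZ = 220.0, 880.0  # illustrative pitch range (A3 to A5)


def wrist_to_pitch(y_norm: float) -> float:
    """Map normalized wrist height (0.0 = top of frame) to a frequency in Hz."""
    height = 1.0 - min(max(y_norm, 0.0), 1.0)  # 1.0 when the hand is raised
    return LOW_HZ + height * (HIGH_HZ - LOW_HZ)


cap = cv2.VideoCapture(0)
with mp_pose.Pose(min_detection_confidence=0.5,
                  min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            wrist = results.pose_landmarks.landmark[
                mp_pose.PoseLandmark.RIGHT_WRIST]
            pitch = wrist_to_pitch(wrist.y)
            # A full system would send `pitch` to an audio or haptic output;
            # here we simply print it.
            print(f"right wrist height -> {pitch:.0f} Hz")
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
```

The same pattern generalizes: any tracked landmark (or gesture classified by a TensorFlow model) can drive a non-visual channel, whether that is pitch, volume, or the intensity of a haptic actuator.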

MEET OUR PROJECT MEMBERS