Robotics with Sensing is an advanced interactive environment designed to explore human-machine collaboration in hybrid physical-digital spaces. Built for a reconfigurable research lab equipped with an array of mobile robot arms carrying 4K displays, the project integrates software and hardware into a unified platform that enables natural and dynamic interaction across modalities.
The system lets developers orchestrate complex spatial applications by coordinating screen movement, positioning, and content display through a distributed control layer. It supports multi-modal input, including voice commands, touch interfaces, hand gestures, and computer vision, turning the space into an intelligent interface capable of responding to, and anticipating, user actions in real time.
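As a rough sketch of what "coordinating screen movement, positioning, and content display" through a control layer might look like, the snippet below queues per-robot commands and dispatches them as one coordinated step. All names here (`Orchestrator`, `move_and_show`, `commit`) are illustrative assumptions, not the project's actual API.

```python
from dataclasses import dataclass

@dataclass
class ScreenCommand:
    """One instruction for a single robot-mounted display."""
    robot_id: str
    x: float          # target position in lab coordinates (assumed units)
    y: float
    content_url: str  # what the 4K display should show on arrival

class Orchestrator:
    """Hypothetical control layer: buffers commands per robot,
    then dispatches the whole set as one coordinated step."""

    def __init__(self):
        self.pending: dict[str, ScreenCommand] = {}
        self.log: list[ScreenCommand] = []

    def move_and_show(self, robot_id: str, x: float, y: float, url: str) -> None:
        # The last command issued for a robot within a step wins.
        self.pending[robot_id] = ScreenCommand(robot_id, x, y, url)

    def commit(self) -> list[ScreenCommand]:
        # Dispatch all buffered commands together (modeled here as a log append).
        step = list(self.pending.values())
        self.log.extend(step)
        self.pending.clear()
        return step

# Usage: reposition two arms and update their content in one step.
orch = Orchestrator()
orch.move_and_show("arm-1", 0.0, 1.5, "dashboard.html")
orch.move_and_show("arm-2", 2.0, 1.5, "sitemap.html")
step = orch.commit()
```

Buffering commands and committing them as a batch is one plausible way to keep several arms moving in a visually coherent, synchronized fashion rather than reacting to each command independently.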
The platform points toward immersive collaboration spaces that merge robotics, AI, and human input into a single responsive system, one supporting fluid information analysis, remote collaboration, and experiential computing.
Key components:
- Robot Control Orchestration
- Computer Vision Pipelines (gesture recognition and spatial awareness)
- Voice and Speech Interfaces
- Modular Application Framework
- Sensor Fusion
- SDKs
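To show how the voice, touch, gesture, and vision components listed above could feed a modular application framework, here is a minimal event-bus sketch: input pipelines publish events by modality, and applications subscribe handlers. `InputBus` and its methods are hypothetical names for illustration, not part of the project's SDKs.

```python
from collections import defaultdict

class InputBus:
    """Hypothetical event bus routing multi-modal input events
    (voice, touch, gesture, vision) to subscribed app handlers."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, modality: str, handler) -> None:
        """Register a callable to receive events for one modality."""
        self._handlers[modality].append(handler)

    def publish(self, modality: str, payload):
        # Invoke every handler registered for this modality; collect results.
        return [handler(payload) for handler in self._handlers[modality]]

# Usage: an application reacts to a recognized voice command.
bus = InputBus()
bus.subscribe("voice", lambda cmd: f"executing: {cmd}")
results = bus.publish("voice", "move screen 2 to the whiteboard")
```

Decoupling input sources from applications this way is a common pattern for systems that must accept new modalities (or new apps) without changing existing code.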