Robotic Arms Orchestration
An advanced interactive environment designed to explore the frontier of human-machine collaboration in hybrid physical-digital spaces. The project integrates 15 robot arms, each carrying a 4K display, with a unified software platform that enables natural, dynamic interaction across modalities.
The orchestration system coordinates the precise real-time movement, positioning, and content synchronization of all 15 robotic displays simultaneously, transforming a physical space into a dynamic, responsive canvas. The software stack handles inverse kinematics computation, collision avoidance between arms, smooth trajectory planning with Bézier curve interpolation, and frame-perfect display synchronization, all with sub-10ms latency from intent to physical motion.
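To illustrate the trajectory-planning step, here is a minimal sketch of cubic Bézier interpolation between two waypoints. The function names and the sampling helper are illustrative assumptions, not the project's actual API; a real planner would also enforce joint limits and collision constraints.

```python
# Hypothetical sketch: cubic Bezier interpolation for smooth arm trajectories.
# p0 and p3 are the start/end waypoints; p1 and p2 are control points that
# shape the motion's acceleration profile. Not the project's real interface.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(
        u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def sample_trajectory(p0, p1, p2, p3, steps=100):
    """Discretize the curve into waypoints a motion controller can track."""
    return [cubic_bezier(p0, p1, p2, p3, i / steps) for i in range(steps + 1)]
```

Pulling the control points toward the start and end of the segment yields gentle acceleration and deceleration, which is why Bézier interpolation is a common choice for visually smooth display motion.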
The architecture is built around a real-time event bus that connects perception systems (depth cameras, LiDAR, presence sensors) to the motion planning pipeline and content rendering engine. Users can interact with the installation through gestures, voice commands, touch, and spatial presence, and the system responds by reconfiguring the physical layout of displays in three-dimensional space. Use cases range from immersive data visualization and collaborative design review to artistic installations and accessibility-focused interaction research. The entire system is reconfigurable through a visual choreography editor, allowing researchers to design complex multi-robot behaviors without writing code.
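The event-bus pattern described above can be sketched as a small publish/subscribe system routing perception events to downstream subscribers. The topic names, `Event` shape, and handler wiring below are illustrative assumptions, not the installation's actual schema.

```python
# Minimal publish/subscribe event bus, sketching how perception events
# (gesture, voice, presence) might reach the motion-planning pipeline and
# rendering engine. Topic names and payloads are hypothetical.
from collections import defaultdict
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    topic: str                                  # e.g. "perception.gesture"
    payload: dict = field(default_factory=dict)

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Event], None]):
        """Register a handler for all future events on this topic."""
        self._subscribers[topic].append(handler)

    def publish(self, event: Event):
        """Deliver the event synchronously to every subscriber of its topic."""
        for handler in self._subscribers[event.topic]:
            handler(event)

# Example wiring: a gesture event triggers a (stubbed) motion-planning step.
bus = EventBus()
planned = []
bus.subscribe("perception.gesture", lambda e: planned.append(e.payload["kind"]))
bus.publish(Event("perception.gesture", {"kind": "swipe_left"}))
```

In a latency-sensitive deployment the synchronous dispatch shown here would typically be replaced by a lock-free queue or a real-time middleware, but the topic-based decoupling between sensors and consumers is the same.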