Welcome to the Robotics and Biology Laboratory showcase at Automatica—where robots figure things out instead of being told what to do.
In this live demonstration, we present an integrated robot platform that combines three core skills developed across multiple research projects
at the Robotics and Biology Lab, TU Berlin.
You'll see the robot spin a cube using its soft fingers, interact with everyday objects like drawers, and even tackle a lockbox challenge.
Each capability you see here is part of a broader effort to create robots that can leave the structured world of factories and engage with the complexity of real-world environments.
See how a robot hand can manipulate objects by leveraging the natural benefits of soft materials. Soft hands turn contact with the environment into an asset rather than a challenge. By embracing their natural compliance, we enable robust, adaptive manipulation that is safe and effective even in uncertain settings. This design philosophy treats the hand itself as a computational resource: intelligence emerges from the interaction between morphology and control. Our work explores how to co-design the body and the software, so that the mechanics do part of the thinking and the robot can focus on acting.
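As a rough illustration of this philosophy, the sketch below shows how little explicit control a compliant hand can need: a coarse approach pose, a single closing command, and success judged from pressure feedback. All class and method names are hypothetical, chosen for illustration, and not an actual RBO interface.

```python
# Hypothetical sketch: letting passive compliance replace precise finger control.

from dataclasses import dataclass


@dataclass
class GraspResult:
    success: bool
    contact_pressure: float  # aggregate actuation pressure after closing


class SoftHandController:
    """Illustrative controller that relies on passive compliance, not precision."""

    def __init__(self, close_pressure: float = 0.6, hold_threshold: float = 0.35):
        self.close_pressure = close_pressure  # commanded pneumatic actuation level
        self.hold_threshold = hold_threshold  # pressure rise indicating firm contact

    def grasp(self, approach_pose) -> GraspResult:
        # Coarse positioning is enough: the fingers conform to the object on contact,
        # so small pose errors do not have to be sensed and corrected explicitly.
        self._move_to(approach_pose)

        # One closing command for all fingers; the soft structure wraps around the
        # object wherever contact happens first.
        pressure = self._inflate(self.close_pressure)

        # Judge success from feedback of the whole hand, not individual fingertips.
        return GraspResult(success=pressure > self.hold_threshold, contact_pressure=pressure)

    def _move_to(self, pose) -> None:
        pass  # placeholder for arm motion

    def _inflate(self, target: float) -> float:
        return target  # placeholder for pneumatic actuation and pressure feedback
```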
Experience how a robot opens a drawer without any planning. What makes this possible is our novel modeling framework called AICON, which is designed to generate robust and adaptive behavior by telling robots how the world behaves. Providing robots with a physically grounded model of the world enables them to adapt to dynamic environments and circumvent the need for explicit planning and edge-case handling.
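To give a flavor of what acting without explicit planning can look like, here is a minimal sketch under strong simplifying assumptions: a one-dimensional sliding drawer and invented helper functions, not the actual AICON implementation. The robot pulls along its current estimate of the drawer axis and recursively corrects that estimate from the motion it observes.

```python
# Minimal sketch of model-driven, planning-free behavior (illustrative only).

import numpy as np


def normalize(v: np.ndarray) -> np.ndarray:
    return v / np.linalg.norm(v)


def open_drawer(observe_handle_motion, pull, steps: int = 50, gain: float = 0.5):
    """Interleave acting and estimating instead of planning the whole motion upfront."""
    axis_estimate = normalize(np.array([1.0, 0.0, 0.0]))  # initial guess of the slide axis

    for _ in range(steps):
        pull(direction=axis_estimate, distance=0.01)   # act on the current belief
        observed = observe_handle_motion()              # how the handle really moved

        if np.linalg.norm(observed) > 1e-6:
            # Recursive update: blend the belief toward the observed motion direction.
            axis_estimate = normalize(
                (1 - gain) * axis_estimate + gain * normalize(observed)
            )
    return axis_estimate
```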
Discover how our robot, inspired by nature's cleverest problem-solvers like mice and cockatoos, masters the Lockbox Challenge. This intricate puzzle consists of interlocking joints, requiring the robot to strategically interact with it until the final joint can be opened. Our approach integrates perception, action, and planning into a tightly coupled system: a perception module locates handles, an action module controls the robot's hand and arm, and a planning module devises an optimal strategy. What sets our system apart are the active interconnections between these modules, which dynamically route information based on the current situation and enable the robot to solve the lockbox reliably.
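The sketch below illustrates this idea of actively interconnected modules at a toy level; the interfaces and the simple blocking model are assumptions for illustration, not the real system. The key point is that failed manipulation attempts are routed back into the planner, which updates its belief about which joints block which before choosing the next interaction.

```python
# Hypothetical sketch of a lockbox loop with feedback between action and planning.

from typing import Dict, List, Set


def solve_lockbox(perceive_handles, try_to_move, joints: List[str], target: str) -> List[str]:
    blocked_by: Dict[str, Set[str]] = {j: set() for j in joints}  # learned locking structure
    opened: Set[str] = set()
    history: List[str] = []

    while target not in opened:
        # Planning: prefer joints not believed to be blocked by still-closed joints.
        candidates = [j for j in joints if j not in opened and not (blocked_by[j] - opened)]
        joint = candidates[0] if candidates else next(j for j in joints if j not in opened)

        # Perception: locate the handle of the selected joint in the current scene.
        handle_pose = perceive_handles()[joint]

        # Action: attempt the manipulation and report whether the joint actually moved.
        moved = try_to_move(joint, handle_pose)
        history.append(joint)

        if moved:
            opened.add(joint)
        else:
            # Active interconnection: the failure updates the planner's model,
            # since some still-closed joint must be blocking this one.
            blocked_by[joint].update(j for j in joints if j not in opened and j != joint)

    return history
```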