A bimanual robot controlled by a new artificial intelligence system responds to real-time tactile feedback so precisely that it can pick up individual Pringles chips without breaking them. Despite the delicacy the feat requires, the system learns specific tasks entirely within simulated scenarios in just a couple of hours.
Researchers at the University of Bristol’s Bristol Robotics Laboratory detailed their “Bi-Touch” system in a paper published on August 23 in IEEE Robotics and Automation Letters. The AI directs its pair of robotic limbs to “solve tasks even under unexpected perturbations and manipulate delicate objects in a gentle way,” lead author and engineering professor Yijiong Lin said in a statement on Thursday.
What makes the team’s advance so promising is its use of two robotic arms, rather than the single limb seen in most tactile robotics projects. Despite doubling the number of limbs, however, training still takes only a few hours. To accomplish this, the researchers first train their AI in a simulated environment, then apply the finalized Bi-Touch system to the physical robot arms.
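One common way to make a policy trained purely in simulation survive contact with real hardware is domain randomization: varying the simulator’s physics from episode to episode so the AI never overfits to one idealized world. The coverage here does not spell out Bi-Touch’s exact transfer technique, so treat the sketch below, including every parameter name and range, as an illustrative assumption rather than the team’s actual recipe.

```python
import random

# Hypothetical sketch of domain randomization for sim-to-real transfer.
# The parameter names and ranges below are invented for illustration;
# they are not taken from the Bi-Touch paper.

def randomized_sim_params() -> dict:
    """Sample fresh physics parameters for each training episode."""
    return {
        "object_mass_kg": random.uniform(0.002, 0.05),   # chip-light to apple-heavy
        "surface_friction": random.uniform(0.3, 1.0),
        "tactile_noise_std": random.uniform(0.0, 0.02),  # sensor reading noise
        "actuator_delay_ms": random.uniform(0.0, 20.0),
    }

for episode in range(3):
    params = randomized_sim_params()
    # A real trainer would run one simulated grasp under these physics
    # and update the policy; here we just show the sampling loop.
    print(f"episode {episode}: simulating with {params}")
```

A policy that succeeds across all of these randomized worlds has, in effect, already seen conditions at least as messy as a real lab, which is what lets it run on hardware with no extra training.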
[Related: This agile robotic hand can handle objects just by touch.]
“With our Bi-Touch system, we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards the touch,” Lin continued. “And more importantly, we can directly apply these agents from the virtual world to the real world without further training.”
The Bi-Touch system’s success is owed to its reliance on deep reinforcement learning (Deep-RL), in which robots attempt tasks through copious trial-and-error experimentation. When the robot succeeds, the researchers give the AI a “reward,” much as one would when training a pet. Over time, the AI learns the best steps to achieve its given goal: in this case, using its two limbs, each capped with a single soft pad, to pick up and maneuver objects such as a foam brain mold, a plastic apple, and an individual Pringles chip. With no visual inputs, the Bi-Touch system relies solely on tactile and proprioceptive feedback: force, physical position, and its own movement.
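For a concrete sense of that reward-driven loop, here is a minimal, self-contained sketch. It substitutes simple tabular learning over a handful of made-up grip-force levels for the deep neural networks and tactile observations the real system uses; every state, action, and reward value below is an illustrative assumption, not a detail from the paper.

```python
import random

# Minimal trial-and-error learning sketch: the agent repeatedly tries
# grip-force levels and is "rewarded" for a gentle, successful grasp.
# Toy setup: level 2 lifts the chip intact, harder levels crush it,
# softer levels fail to lift it. All values are hypothetical.
ACTIONS = [0, 1, 2, 3, 4]  # grip force, from barely touching to crushing

def reward(force_level: int) -> float:
    if force_level == 2:
        return 1.0   # chip picked up without breaking
    if force_level > 2:
        return -1.0  # chip crushed
    return 0.0       # grip too weak; chip not lifted

q = {a: 0.0 for a in ACTIONS}  # learned value estimate per action
alpha, epsilon = 0.1, 0.2      # learning rate, exploration rate

for episode in range(1000):
    # Epsilon-greedy: mostly exploit the best-known action,
    # occasionally explore a random one (the trial-and-error part).
    a = random.choice(ACTIONS) if random.random() < epsilon else max(q, key=q.get)
    # Nudge the estimate for that action toward the observed reward.
    q[a] += alpha * (reward(a) - q[a])

print({a: round(v, 2) for a, v in q.items()})  # approaches 0, 0, 1, -1, -1
print("best grip level:", max(q, key=q.get))   # settles on the gentle grasp, 2
```

Over many episodes the value estimate for the gentle grip climbs toward its reward of 1.0 while the crushing grips sink toward -1.0, so the greedy choice settles on the delicate grasp. Bi-Touch applies the same reward-shaped convergence at far larger scale, with rich tactile feedback in place of five discrete actions.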
The team hopes that their new Bi-Touch system could one day be deployed in industries such as fruit picking and domestic service, and potentially even be integrated into artificial limbs to recreate touch sensations. According to the researchers, the Bi-Touch system’s use of “affordable software and hardware,” coupled with the planned open-source release of its code, ensures that additional teams around the world can experiment with and adapt the program to their own goals.