OpenAI wants to make a walking, talking humanoid robot smarter

Figure founder Brett Adcock says a new partnership with OpenAI could help its robots hold conversations and learn from their mistakes over time.
OpenAI is partnering with Figure to help it develop a general-purpose humanoid robot capable of working alongside humans and holding conversations. Credit: Figure


Just a few years ago, attempts at autonomous, human-shaped bipedal robots were laughable and far-fetched. Two-legged robots competing in high-profile Pentagon challenges famously stumbled and fell their way through obstacle courses like inebriated pub-crawlers, while Tesla’s highly hyped humanoid bot, years later, turned out to be nothing more than a man dancing in a skin-tight bodysuit.

But despite those gaffes, robotics firms pressed on, and now several believe their walking machines could work alongside human manufacturing workers in only a few short years. Figure, one of the more prominent companies in the humanoid robot space, this week told PopSci it raised $675 million in funding from some of the tech industry’s biggest players, including Microsoft, Nvidia, and Amazon founder Jeff Bezos. The company also announced it has struck a new agreement with generative AI giant OpenAI to “develop next generation AI models for humanoid robots.” The partnership marks one of the most significant examples yet of an AI software company working to integrate its tools into physical robots.

[ Related: BMW plans to put humanoid robots in a South Carolina factory to do… something ]

Figure founder and CEO Brett Adcock described the partnership as a “huge milestone for robotics.” Eventually, Adcock hopes the partnership with OpenAI will lead to a robot that can work side-by-side with humans, completing tasks and holding a conversation. By working with OpenAI, creator of the world’s most popular large language model, Adcock says Figure will be able to further improve the robot’s “semantic” understanding, which should make it more useful in work scenarios.

“I think it’s getting more clear that this [humanoid robotics] are becoming more and more an engineering problem than it is a research problem,” Adcock said. “Actually being able to build a humanoid [robot] and put it into the world of useful work is actually starting to be possible.” 

Why is OpenAI working with a humanoid robotics company? 

Founded in 2021, Figure is developing a 5-foot-6, 130-pound bipedal “general purpose” robot it claims can lift objects of around 45 pounds and walk at 2.7 miles per hour. Figure believes its robots could one day help address possible labor shortages in manufacturing jobs and generally “enable the automation of difficult, unsafe, or tedious tasks.” Though it’s unclear just how reliably current humanoid robots can actually execute those types of tasks, Figure recently released a video showing its Figure 01 model slowly walking toward a stack of crates, grabbing one with its two hands, and loading it onto a conveyor belt. The company claims the robot performed the entire job autonomously.

Figure demonstrates its robot autonomously picking up a crate and loading it onto a conveyor belt.

Supporters of humanoid-style robots say their bipedal form factor makes them more adept at climbing stairs and navigating uneven or unpredictable ground than the more typical wheeled or tracked alternatives. The technology underpinning these types of robots has notably come a long way from the embarrassing stumbles of previous years. Speaking with Wired last year, Figure Chief Technology Officer Jerry Pratt said Figure’s robots could complete the Pentagon’s test course in a quarter of the time it took machines to finish it back in 2015, thanks in part to advances in computer vision technology. Other bipedal robots, like Boston Dynamics’ Atlas, can already perform backflips and chuck large objects.

Figure says its new “collaboration agreement” with OpenAI will combine OpenAI’s research with its own experience in robotics hardware and software. If successful, Figure believes the partnership will enhance its robot’s ability to “process and reason from language.” That ability to understand language and act on it could, in theory, allow the robots to better work alongside a human warehouse worker or take verbal commands.

“We see a tremendous advantage of having a large language model or multimodal model on the robot so that we can interact with it and give what we call ‘semantic understanding,’” Adcock said.

Over the long term, Adcock said, people interacting with the Figure robot should be able to speak with it in plain language. The robot could then create a list of tasks and complete them autonomously. The partnership with OpenAI could also help the Figure robot self-correct and learn from its past mistakes, which should lead to quicker improvement at tasks. The Figure robot already possesses the ability to speak, Adcock said, and can use its cameras to describe what it “sees” in front of it. It can also describe what may have happened in a given area over a period of time.

“We’ve always planned to come back to robotics and we see a path with Figure to explore what humanoid robots can achieve when powered by highly capable multimodal models,” OpenAI VP of Product and Partnerships Peter Welinder said in a statement sent to PopSci.

OpenAI and Figure aren’t the only ones trying to integrate language models into human-looking robots. Last year, Elon Musk biographer Walter Isaacson wrote an article for Time claiming the Tesla CEO was exploring ways to integrate his company’s improving Optimus humanoid robot with its “Dojo” supercomputer, with the goal of creating so-called artificial general intelligence, a term some researchers use to describe a machine capable of performing at above-human levels across many tasks.

Tech giants are betting big on Figure to win out in a brewing humanoid robot race 

Figure hopes the support from OpenAI, in addition to its massive new wave of funding, could speed up its timeline for making its product commercially available. The $675 million in funding Figure revealed this week was reportedly over $150 million more than the amount it had initially sought, according to Bloomberg. The company says it’s planning to use that capital to scale up its AI training and robot manufacturing, and to hire new engineers. Figure currently has 80 employees.

But Figure isn’t the only company looking to commercialize humanoid robots. 1X Technologies AS, another humanoid robotics company with significant investment from OpenAI, recently raised $100 million. Oregon-based Agility Robotics, which has demonstrated its robots autonomously performing a variety of simple warehouse tasks, is reportedly already testing machines in Amazon warehouses. Figure, for its part, recently announced a partnership with BMW to bring its humanoid robot to the carmaker’s Spartanburg, South Carolina, manufacturing facility.

All of these companies are racing to cement their place as an early dominant force in an industry some supporters believe could be a real money-maker in the near future. In 2022, Goldman Sachs predicted the global humanoid robot market could reach $154 billion by 2035. If that sounds like a lot, it’s a fraction of the $3 trillion that financial services company Macquarie estimates the industry could be worth by 2050. That’s roughly the value of Apple today.

But much still has to happen before any of those lofty visions resemble reality. These still-developing technologies are only just now being trialed and tested within major manufacturing facilities. The most impressive of these robots, like the dancing giants produced by Boston Dynamics, remain extremely expensive to manufacture. It’s also still unclear whether these robots can, or ever will be able to, respond to complex tasks with the same degree of flexibility as a human worker.

Generally, it’s still unclear what exact problems these robots are best suited to solve. Both Elon Musk and Figure have said their machines could complete assignments too dangerous or unappealing for humans, though what those exact use cases are hasn’t been articulated clearly. BMW, for example, previously told PopSci it was still “investigating concepts” when asked how it plans to deploy Figure’s robots. Adcock went a step further, suggesting the Figure robot could be used to move sheet metal or perform other body shop tasks. Adcock said Figure has five primary use cases in mind for the robot at the facility, which it has not yet publicly announced.

The issue of what to do with these robots once they are made isn’t unique to Figure. In an interview with PopSci, Carnegie Mellon Department of Mechanical Engineering Associate Professor Ding Zhao called that issue of use cases the “billion-dollar question.”

“Generally speaking, we are still exploring the capabilities of humanoid robots, how effectively we can collect data and train them, and how to ensure their safety when they interact with the physical world,” Zhao said.

Zhao went on to say that makers of robots intended to work alongside humans will also have to invest heavily in safety, a cost he argued could match or even exceed development costs.

The robots themselves need to improve as well, especially in real-world work environments that are less predictable and more “messy” than typical robot training facilities. Adcock says the robot’s speed at tasks and its ability to handle larger and more diverse payloads will also need to improve. But all of those challenges, he argued, can be addressed through powerful AI models like the ones OpenAI is building.

“We think we can solve a lot of this with AI systems,” Adcock said. “We really believe here that the future of general purpose robots is through learning, through AI learning.”
