The military wants its robots to be better listeners

Getting soldiers and bots to work together more seamlessly involves giving the machines more autonomy, and eventually having them understand natural language.
West Point cadets and a robotic ground vehicle in April. Eric Bartelt / DOD


It is hard to shout at a robot and get it to obey the same way you might holler at Alexa, but the military is working on it.

Presently, humans pilot military robots into action. This is true for remotely guided drones and for ground vehicles, which are steered with hand-held tablets. Autonomous features, such as navigation to human-set GPS waypoints, help streamline this process, though issuing those commands still requires a soldier to enter them into a tablet or computer in a form the robot can understand.
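
To make that workflow concrete, here is a purely illustrative sketch, in Python, of the kind of waypoint instruction a controller app might package for a ground robot. The message format and every field name are invented for illustration; the article does not describe the Army’s actual interfaces.

```python
import json

# Hypothetical example of a tablet-issued waypoint command for a ground robot.
# All field names here are invented for illustration; real military ground
# control software is not described in this article.
waypoint_command = {
    "robot_id": "ugv-01",
    "task": "goto_waypoint",
    "lat": 39.2904,           # example coordinates only
    "lon": -76.6122,
    "speed_mps": 1.5,         # requested travel speed, meters per second
    "hold_on_arrival": True,  # stop and wait for the next order
}

# Serialized, this is the sort of small, structured payload a soldier's
# tablet entry ultimately becomes before a robot can act on it.
print(json.dumps(waypoint_command, indent=2))
```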

To more seamlessly integrate robots into the routines of warfare, the Army is testing a range of robot-human communication tools. In late July, at a campus north of Baltimore, the Robotics Research Center (RRC) at West Point, together with another Army lab and command, tested methods that could make it easier for soldiers to communicate with robots—through tablets and beyond.

At present, remotely piloting a robot, even by tablet, is labor-intensive. The soldier has to actively guide it in real time, avoiding pitfalls and paying full attention. That burden could be reduced in two ways. First, the military could make the robots follow tablet-given instructions more autonomously, demanding less of the soldier’s time in battle. And eventually, with research to improve robots’ ability to understand and act on human language, soldiers could do away with the tablets entirely, making commanding a robot no more burdensome than directing human troops.

This is not an easy task, in part because combat makes communication hard. Soldiers speak softly to each other when they need to conceal their position, and yell over the din of battle when they absolutely need to be heard. Orders shouted by officers are hopefully understood by those in earshot, who follow as best they can. 

[Related: These augmented-reality goggles let soldiers see through vehicle walls]

“The ultimate goal of the RRC’s research is for a squad or platoon to task teams of aerial, wheeled and legged robots the same way they would their Soldiers – no Linux command line required,” Daniel Gonzalez, postdoctoral fellow at RRC in the Department of Electrical Engineering and Computer Science, said in a release.

Using the tablet is a bridge to that future. In the exercise, the cadets trained with a quadcopter and a wheeled robot, connecting to phones, tablets, and a mobile server.

The robots moved autonomously and labeled objects in place using programming from the Army Research Laboratory. They then shared the information they gathered with the soldiers through the tablets. This work builds on previous DARPA research testing swarming and mapping in urban settings. In 2019, the OFFSET program (for OFFensive Swarm-Enabled Tactics) tested swarms at a mock-city training facility at Fort Benning, Georgia. The swarms would fly out to map an area by tracing the perimeter of a given building.
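
The perimeter-tracing idea can be sketched in a few lines of code. The snippet below is only an illustration, not the ARL or OFFSET software: it lays out evenly spaced waypoints around a hypothetical rectangular building footprint, the kind of loop a small drone might fly while mapping a structure.

```python
def perimeter_waypoints(corner, width_m, depth_m, spacing_m=10.0, standoff_m=5.0):
    """Return (x, y) waypoints, in meters, tracing a box around a rectangular footprint.

    `corner` is the footprint's lower-left corner; `standoff_m` keeps the route
    a safe distance outside the walls. All parameters are illustrative.
    """
    x0, y0 = corner[0] - standoff_m, corner[1] - standoff_m
    w, d = width_m + 2 * standoff_m, depth_m + 2 * standoff_m
    corners = [(x0, y0), (x0 + w, y0), (x0 + w, y0 + d), (x0, y0 + d), (x0, y0)]

    points = []
    for (ax, ay), (bx, by) in zip(corners, corners[1:]):
        side_length = abs(bx - ax) + abs(by - ay)   # sides are axis-aligned
        steps = max(1, int(side_length // spacing_m))
        for i in range(steps):
            t = i / steps
            points.append((ax + t * (bx - ax), ay + t * (by - ay)))
    points.append(corners[-1])                      # close the loop
    return points


route = perimeter_waypoints(corner=(0.0, 0.0), width_m=40.0, depth_m=25.0)
print(f"{len(route)} waypoints, starting and ending at {route[0]}")
```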

For this exercise, the humans and robots were tasked with locating an injured person in an urban setting. This is a task with applications far beyond the military, and one that will be vital on the kinds of battlefields the military anticipates over the next several decades. 

“This is a critical capability as we seek to better understand how soldiers will interact with these systems and how they will aid missions,” said ARL researcher Stephen Nogar, in a release. “While the mission was a success, there is work to do to improve the reliability and coordination of the behaviors.”

Some of that work will likely be done by the Robotics Research Center, which will continue to refine the autonomous code. Other work might be done in conjunction with Army Combat Capabilities Development Command, or DEVCOM. 

[Related: How do you make AI trustworthy? Here’s the Pentagon’s plan.]

In fact, DEVCOM researcher Claire Bonial has spent a decade working on natural language understanding, or how machines make sense of the ways humans talk. In a research paper published this year, Bonial and her co-authors worked on a way for robots to first parse human language into a semantic representation, and then interpret that representation in the context of the surrounding conversation.

This part of the processing is vital if robots are to understand commands issued as words rather than as instructions entered directly into a computer. When a person says "wait and then attack," the emphasis is on watching the environment before following through. A robot given a wait command, without the ability to reason about when that wait should turn into an attack, becomes a liability in action instead of an asset.
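
One way to see why that reasoning matters is to represent the command as a sequence of steps, each with a condition that ends it. The sketch below is a toy illustration, not the Army’s or DEVCOM’s approach: the "wait" step only gives way to "attack" when a trigger the robot can actually evaluate, here an invented go signal, is observed.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Step:
    action: str                    # e.g. "wait" or "attack"
    until: Callable[[Dict], bool]  # condition that ends this step


def run_plan(plan: List[Step], observe: Callable[[], Dict]) -> None:
    """Carry out each step in order, holding on it until its condition is met."""
    for step in plan:
        print(f"executing: {step.action}")
        while not step.until(observe()):
            pass  # keep performing the current action and re-checking the world


# "Wait and then attack," with the wait explicitly tied to something the robot
# can evaluate: an invented go_signal standing in for a follow-up order or a
# sensed cue.
plan = [
    Step(action="wait", until=lambda world: world["go_signal"]),
    Step(action="attack", until=lambda world: True),  # one-shot step
]

# Simulated observations: the go signal arrives on the third look at the world.
ticks = {"n": 0}


def observe() -> Dict:
    ticks["n"] += 1
    return {"go_signal": ticks["n"] >= 3}


run_plan(plan, observe)  # prints "executing: wait", then "executing: attack"
```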

“We are optimistic that the deeper semantic representation will provide the structure needed for superior grounding of the language in both the conversational and physical environment such that robots can communicate and act more as teammates to Soldiers, as opposed to tools,” Bonial said in a release.

This research, from the cadets training for a search-and-rescue mission to Bonial’s efforts at language understanding, will pay off in the future, with robots listening to and acting upon human commands expressed in language. In the meantime, the military will continue to adopt robots, incorporate their commands as tablet functions, and adjust to piloted machines on the battlefield.

The ultimate vision, that of human-robot cooperation, will have to wait until the machines can listen in on briefings, or at least listen in on orders. And then, when the order comes, as bluntly as it might under fire, the robots and humans together can seamlessly spring into action, working towards the same purpose and able to communicate through words, instead of just tablet commands.