Squirrels have a habit of storing acorns and other nuts in various spots, then patrolling those stashes. But what happens if another opportunistic squirrel shows up to steal the bounty? The stash-owning squirrel fakes out the would-be thief, “checking” fake cache sites to throw the invader off the trail. Now researchers at the Georgia Institute of Technology have endowed robots with the same ability.

Georgia Tech professor Ronald Arkin and his graduate students programmed a similar strategy into wheeled robots, and the tactic worked: the deceiving robot lured a "predator" to false locations.
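The basic idea can be sketched in a few lines. This is an illustrative toy, not the Georgia Tech controller: the cache coordinates, the `DECOY_SITES` list, and the 3-to-1 decoy bias are all invented for the example. The point is only that a patrolling robot, once it detects a shadower, can shift its waypoints toward fake cache sites.

```python
import random

# Hypothetical cache locations (invented coordinates for illustration).
REAL_CACHES = [(2, 3), (5, 1)]
DECOY_SITES = [(8, 8), (1, 9), (7, 2)]

def next_waypoint(being_followed, rng=random):
    """Patrol real caches normally; favor decoys while being shadowed."""
    if being_followed:
        # Mostly "check" fake sites, with an occasional real visit so the
        # patrol pattern doesn't look obviously staged.
        pool = DECOY_SITES * 3 + REAL_CACHES
    else:
        pool = REAL_CACHES
    return rng.choice(pool)

print(next_waypoint(being_followed=False))  # always a real cache
print(next_waypoint(being_followed=True))   # usually a decoy site
```

The occasional real visit while being followed mirrors the squirrel's behavior: a patrol made up of nothing but decoys would itself be a tell.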

This could have real practical value in military situations, the researchers say. Suppose a small wheeled robot is guarding an ammunition cache. It can perceive a threat from a visitor, robot or otherwise, and put on a little deceptive show.

“The robot could change its patrolling strategies to deceive humans or another intelligent machine, buying time until reinforcements are able to arrive,” Arkin said in a statement.

Now imagine a small robot finds itself ambushed. What should it do? Puff up and show its plumage, or quietly shrink away and hope it isn't caught? Apparently deceit is the key here too, as long as it's used carefully: you don't want the attacker to call your bluff.

Again, Arkin and his team drew inspiration from the animal world, which offers plenty of examples of creatures feigning greater size or strength than they possess. This could be effective for robots too, Arkin's research shows. "Being honest about the robot's abilities risks capture or destruction. Deception, if used at the right time in the right way, could possibly eliminate or minimize the threat," Arkin said.
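That "right time, right way" caveat can be framed as a simple expected-utility check. This is a generic sketch of such a bluffing decision, not the published algorithm; the function name, parameters, and payoff numbers are all assumptions made for illustration.

```python
def should_bluff(p_believed, gain_if_believed, loss_if_called, loss_if_honest):
    """Bluff only when the expected payoff of feigning strength beats
    the certain outcome of honestly revealing weakness.

    p_believed       -- estimated chance the attacker buys the bluff
    gain_if_believed -- payoff if the attacker backs off
    loss_if_called   -- penalty if the bluff is called (fight/capture)
    loss_if_honest   -- penalty for honestly revealing weakness
    """
    expected_bluff = (p_believed * gain_if_believed
                      - (1 - p_believed) * loss_if_called)
    return expected_bluff > -loss_if_honest

# A convincing bluff against a cautious attacker is worth the risk...
print(should_bluff(p_believed=0.7, gain_if_believed=10,
                   loss_if_called=8, loss_if_honest=9))   # True
# ...but a transparent bluff against a bold attacker is not.
print(should_bluff(p_believed=0.2, gain_if_believed=10,
                   loss_if_called=8, loss_if_honest=3))   # False
```

The second call captures the article's warning: when the bluff is unlikely to be believed, quietly shrinking away costs less than getting called.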

Arkin and his fellow roboticists at Georgia Tech previously taught machines how to deceive one another, teaching one robot to leave a fake cookie-crumb trail of evidence that could fool a second robot. But this new research may be even more valuable, because instead of rolling into hiding, the "prey," or vulnerable robot, acts deceptively, using itself as a decoy. The new research is highlighted in the current issue of IEEE Intelligent Systems.
