It’s not hard to notice when your co-worker is grouchy, your friend is exhausted, or your boss is overjoyed. Often without realizing it, we pick up on other people’s emotions by registering certain behavioral cues. In turn, we understand whether we need to back off, lend a helping hand, or, in the case of the boss, ask for a raise. Now comes the question: If we can do this, then why not computers? Why not robots? Indeed, by picking up on some of these same emotional traits, robots today are learning to act more naturally around their human counterparts.
A team of researchers at the University of California, Davis, is programming “follower robots” to react more accurately to their surroundings by first registering their human leader’s behavior. The goal is to use actions occurring in the present to approximate future actions and prepare for them. In this case, the action is movement—where the leader is going next—a characteristic the researchers say is essential for the successful incorporation of robots into the workforce.
Previously, follower robots were equipped with cameras to help them track and follow the movements of the human (or another robot) leader. As in a high-speed car chase, staying with the leader became difficult when it came to turns and zigzags. With the addition of a behavior-cue controller, robots are no longer entirely dependent on visual signals. The new system, which combines cameras and the controller, continually predicts the leader’s next move and directs the robot to that position.
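The article does not describe the Davis team’s actual controller, but the idea of blending a camera’s position estimate with a cue-based prediction of the leader’s next position can be sketched roughly. Everything below—the function name, the 0.5 heading-bias weight, and the 0.2-second lookahead—is an illustrative assumption, not the published system:

```python
import math

def predict_next_position(position, speed, heading, head_turn_deg, dt=0.2):
    """Estimate where the leader will be dt seconds from now.

    position      -- (x, y) of the leader from the camera tracker
    speed         -- current speed estimate, in meters per second
    heading       -- current heading, in radians
    head_turn_deg -- observed head rotation, a cue that a turn is coming
    """
    # Treat a pronounced head turn as an early signal of a course change:
    # bias the predicted heading toward the direction the head is turning.
    # The 0.5 weight is an arbitrary illustrative choice.
    predicted_heading = heading + 0.5 * math.radians(head_turn_deg)

    # Simple constant-speed extrapolation along the predicted heading.
    x, y = position
    return (x + speed * dt * math.cos(predicted_heading),
            y + speed * dt * math.sin(predicted_heading))
```

With no head turn, this reduces to straight-line extrapolation; a head turn tilts the predicted path before the body actually changes direction, which is the advantage the behavior-cue controller adds over pure visual tracking.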
While it is no Freud, a robot can pick up on a slew of different human behaviors that correspond to movement. The cues can be overtly physical: If the leader points or gestures in a certain direction, for example, the robot will know the leader plans to move left, right or straight and move itself accordingly. Cues can also be subconscious. Follower robots can pick up on behavior that signals emotional states like tiredness or sadness, which indicate slower movement. Conversely, signals of stress or excitement can indicate quicker movement. The technology also capitalizes on a computerized robot’s ability to pick up on the subtlest of signs. As has been shown in studies on human walking, a person tends to turn their head up to 25 degrees about 200 milliseconds before turning. A robot can detect such a change, saving itself from a fender bender against a wall or from losing its leader entirely.
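The cue-to-movement mapping described above can be sketched as a small lookup plus a threshold check. This is a hedged illustration only: the speed factors and the 20-degree threshold are invented for the example, and the real system would derive them from the research rather than a hand-written table.

```python
# Illustrative mapping from emotional-state cues to a speed adjustment:
# tiredness and sadness suggest slower movement, stress and excitement
# suggest quicker movement. The numeric factors are assumptions.
SPEED_FACTORS = {
    "tired": 0.6,
    "sad": 0.6,
    "neutral": 1.0,
    "stressed": 1.3,
    "excited": 1.3,
}

def interpret_cues(emotional_state, head_turn_deg, base_speed=1.0):
    """Return (predicted_speed, turn_expected) from behavioral cues."""
    speed = base_speed * SPEED_FACTORS.get(emotional_state, 1.0)
    # Studies on human walking find a head turn of up to ~25 degrees
    # roughly 200 ms before a change of direction; a threshold slightly
    # below that (20 degrees here, an arbitrary choice) flags the turn.
    turn_expected = abs(head_turn_deg) >= 20
    return speed, turn_expected
```

A follower robot running something like this would slow down behind a leader who looks fatigued, and begin steering into a turn the instant the leader’s head swings, rather than after the leader’s body has already changed course.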
Robots are increasingly working side-by-side with humans in a variety of work environments, everywhere from offices to hospitals to airports. Just like any new employee, they are under pressure to be top performers. As researchers point out, organizations welcome new technology when it is helpful and user-friendly, but are quick to reject it when it is not. While, sadly, we can’t expect robots to automatically fetch us coffee when we are tired or dole out massages for stress anytime in the near future, scientists hope to add more behavioral cues to the robot repertoire as research progresses, so that robots can take on tasks beyond simple following. Considering rapid advances in technology, maybe the robot will be getting that raise before you do.