
Read a robot a story, and you will be known as a weirdo. Teach a robot to read, and you might prevent a robot apocalypse.

At least that’s the quixotic idea behind Quixote, a technique developed by Mark Riedl, director of the Entertainment Intelligence Lab at the Georgia Institute of Technology. Building on previous research with Scheherazade, the Quixote technique shows robots stories that demonstrate normal or accepted behavior (helping others, being punctual, not wiping out the human race). The stories are crowdsourced from humans on the internet, who choose the correct or socially accepted actions for a character, just like a “choose your own adventure” novel.

The technique then assigns ‘rewards’ (basically robotic gold stars) when an artificial intelligence makes decisions that align with the good behavior in the stories, and negative reinforcement when its behavior conflicts with the stories’ morals.
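To make the reward idea concrete, here is a minimal, purely illustrative sketch in Python. The scenario (picking up medicine at a pharmacy), the action names, and the scoring are assumptions made for exposition, not Quixote’s actual implementation.

```python
# Illustrative sketch only: a toy story-derived reward signal in the
# spirit of Quixote. The plot sequence and action names are hypothetical.

# Socially acceptable action sequence distilled from crowdsourced stories,
# e.g. a "pick up medicine at the pharmacy" scenario.
STORY_PLOT = ["enter_pharmacy", "wait_in_line", "pay_for_medicine", "leave"]

def reward(action: str, step: int) -> float:
    """Positive reward when the agent's action matches the story-approved
    action for this step; negative reinforcement otherwise."""
    if step < len(STORY_PLOT) and action == STORY_PLOT[step]:
        return 1.0   # robotic gold star
    return -1.0      # penalty for off-script behavior

# Example: an agent that skips the line and grabs the medicine is penalized.
agent_trace = ["enter_pharmacy", "grab_medicine", "leave"]
total = sum(reward(a, i) for i, a in enumerate(agent_trace))
print(total)  # 1.0 - 1.0 - 1.0 = -1.0
```

In this toy version, the agent’s best strategy for maximizing reward is simply to act out the socially accepted sequence the stories describe, which is the intuition behind the approach.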

“We believe that AI has to be enculturated to adopt the values of a particular society, and in doing so, it will strive to avoid unacceptable behavior,” Riedl said. “Giving robots the ability to read and understand our stories may be the most expedient means in the absence of a human user manual.”

Stories are valuable to AI in other ways, too. Companies are exploring the use of children’s stories to teach AI how to parse language, with the goal of building better virtual assistants. So from ethics to office work, basic stories could help robots fit into our world instead of destroying it.

And they all lived happily ever after.