It’s a quiet day in Duckietown. Duckie McDuckface is getting ready for work, but instead of waddling down the streets he hops into a local self-driving taxi and tells it where he needs to go. It gets him there quickly and safely, no steering required.
Duckietown is the culmination of a self-driving vehicles class at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), whose syllabus described the project to students in playfully whimsical terms.
Though it’s a funny and adorable idea, there’s some very serious science behind it. Each of the robot taxis is equipped with only a single camera and makes its way around the roads without any preprogrammed maps. That sets them apart from many of the automated or semi-automated cars in development today, which often rely on multiple sensors and preprogrammed maps telling them what route to take.
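To give a flavor of what camera-only navigation involves, here is a minimal, hypothetical sketch of one frame of lane following: find the bright lane marking in a single grayscale image and estimate how far off-center the robot is. This is an illustrative simplification, not the actual Duckietown pipeline, and all names here (`lane_offset`, the threshold value) are invented for the example.

```python
import numpy as np

def lane_offset(frame: np.ndarray, threshold: int = 200) -> float:
    """Estimate how far the lane marking sits from the image center.

    frame: one grayscale camera image as a 2D uint8 array.
    Returns a value in [-1, 1]; negative suggests steering left,
    positive steering right. No map is used -- just this frame.
    """
    h, w = frame.shape
    # Look only at the bottom third of the image, where the road is.
    roi = frame[2 * h // 3 :, :]
    # Assume bright pixels are lane markings (a toy heuristic).
    ys, xs = np.nonzero(roi >= threshold)
    if xs.size == 0:
        return 0.0  # no markings visible: hold course
    marking_center = xs.mean()
    return float((marking_center - w / 2) / (w / 2))

# Synthetic 100x100 frame with a bright vertical line right of center.
frame = np.zeros((100, 100), dtype=np.uint8)
frame[:, 70] = 255
print(lane_offset(frame))  # → 0.4 (marking is to the right of center)
```

A real system would, at minimum, handle perspective, curves, lighting changes, and noise, but the core loop is the same: each camera frame is turned into a steering correction on the fly, with no stored route.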
The students in the class built the Duckiebots and programmed them to recognize traffic signs. Much of the work mirrored the problems that traffic and automotive engineers face in the real world.
“We thought about key problems like integration and co-design,” says Andrea Censi, one of the leaders of the course and a scientist with MIT’s Laboratory for Information & Decision Systems (LIDS). “How do we make sure that systems that developed separately will work together? How do we design systems that maximize performance while sharing resources? It’s a delicate balancing act in weighing the relative importance of different infrastructure elements.”
The Duckietown teaching materials are open source, and the researchers hope that more Duckietowns will be established all over the world.