A new app can automatically tag your smartphone photos with a wide range of attributes, picking out not only the people but the context of the picture, including emotions, weather conditions and type of activity.
Dubbed TagSense, the new app was developed by students from Duke University and the University of South Carolina who combined smartphones’ many sensors into one all-encompassing tag suite. The technique goes way beyond GPS technology to recreate a photo’s location and context.
A phone’s built-in accelerometer can tell whether a person is standing still, dancing or engaged in some other activity, according to a Duke news release. The light sensors, normally used to dim or brighten the display screen, can indicate whether a picture was taken indoors or outdoors; weather conditions can be checked against the phone’s location; and the microphone can even reveal whether the subject of the photo is laughing or quiet.
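The article does not describe TagSense’s internal logic, but the sensor-fusion idea can be sketched in a few lines. Everything below is illustrative: the function name, thresholds and tag vocabulary are invented, not taken from the actual app.

```python
# Hypothetical sketch of sensor-to-tag fusion in the spirit of TagSense.
# All thresholds and tag names are invented for illustration.

def infer_tags(accel_variance, light_lux, sound_db, laughter_detected):
    """Map raw phone-sensor readings to a set of descriptive photo tags."""
    tags = set()

    # Accelerometer: low variance suggests standing still, high suggests dancing.
    if accel_variance < 0.05:
        tags.add("standing still")
    elif accel_variance > 1.0:
        tags.add("dancing")
    else:
        tags.add("walking")

    # Light sensor: bright ambient light suggests an outdoor shot.
    tags.add("outdoors" if light_lux > 1000 else "indoors")

    # Microphone: laughter detection and overall loudness.
    if laughter_detected:
        tags.add("laughing")
    elif sound_db < 40:
        tags.add("quiet")

    return tags

print(infer_tags(accel_variance=2.0, light_lux=5000,
                 sound_db=70, laughter_detected=True))
# → {'dancing', 'outdoors', 'laughing'} (set order may vary)
```

Each sensor contributes one independent attribute, so the tag set grows naturally as more sensors (GPS-derived weather, for instance) are folded in.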
All these attributes are assigned to each picture, and a user can search according to various categories, the news release says.
“So, for example, if you’ve taken a bunch of photographs at a party, it would be easy at a later date to search for just photographs of happy people dancing,” said Chuan Qin, a visiting graduate student from the University of South Carolina.
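The kind of multi-attribute query Qin describes amounts to filtering photos whose tag sets contain every requested tag. A minimal sketch (the photo metadata here is invented for the example):

```python
# Illustrative photo search over sensor-derived tags; data is made up.
photos = [
    {"file": "img_001.jpg", "tags": {"dancing", "laughing", "indoors"}},
    {"file": "img_002.jpg", "tags": {"standing still", "quiet", "outdoors"}},
    {"file": "img_003.jpg", "tags": {"dancing", "quiet", "indoors"}},
]

def search(photos, wanted):
    """Return filenames of photos whose tags include every wanted tag."""
    return [p["file"] for p in photos if wanted <= p["tags"]]

print(search(photos, {"dancing", "laughing"}))  # → ['img_001.jpg']
```

The subset test (`wanted <= p["tags"]`) is what lets a user combine categories, so “happy people dancing” is just the intersection of two tags.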
The students tested the system using eight Google Nexus One mobile phones, snapping more than 200 photos at various spots on the Duke campus, and found it produced more sophisticated tags than the tagging systems in Google’s Picasa or Apple’s iPhoto, according to Romit Roy Choudhury, assistant professor of electrical and computer engineering at Duke.
The team unveiled the app at the ninth ACM International Conference on Mobile Systems, Applications and Services (MobiSys), held in Washington, D.C.