IBM Predicts: Cognitive Computers That Feel And Smell, Within The Next Five Years
The computing giant's annual list of technology predictions for the next five years foresees computers that can taste, see, smell, hear, and touch.
At the end of each year, IBM releases its “5 in 5”: five technology predictions that IBM researchers foresee coming to fruition within the coming five years. These predictions are based on everything from emerging market trends to cultural and social behaviors to actual technologies IBM has incubating in its many labs. And if this year’s predictions are to be believed, many computational systems, from your tablet and laptop to your smartphone, are about to get a lot more sensory, learning to see, hear, touch, taste, and smell in their own digital ways.
Welcome to the era of cognitive systems, IBM’s researchers say. “Cognitive computing systems will help us see through complexity, keep up with the speed of information, make more informed decisions, improve our health and standard of living, enrich our lives and break down all kinds of barriers—including geographic distance, language, cost and inaccessibility,” the company says in a press release.
How? By mimicking the senses. IBM predicts that technologies like computer vision will revolutionize computing, particularly in health care, where images like MRIs and CT scans won’t just be used by individual doctors to diagnose specific patients, but to find trends and meaning within huge volumes of medical image data. Where sound is concerned, IBM believes distributed sensor systems will begin to capture and analyze sound in new and meaningful ways (by assigning relevance to the inaudible characteristics of sound waves, for instance) to do all kinds of things, from testing materials for weak spots to deciphering baby talk (no joke). Likewise, computers will have a sense of smell. Devices like your smartphone will be able to diagnose illnesses based on biomarkers in your breath, helping to aggregate epidemiological data and keep health authorities out in front of outbreaks.
Perhaps most interesting, though, are IBM’s visions of computers that can taste and feel. Where food is concerned, IBM more or less predicts the end of the chef who creates flavor pairings by intuition. IBM is already working on a system that “experiences” flavor compounds and uses that data to create flavor pairings and recipes at a very fundamental level, based on both food chemistry and human psychology. “In five years a computer system will know what I like to eat better than I do,” says Dr. Lav Varshney, a research scientist in IBM’s Services Research branch.
And then there’s the sense of touch, which IBM thinks will become something we experience through our smartphone screens. Using specially tuned vibrations, it is already possible to create the sensation of textures that aren’t there. What we lack is a “dictionary of textures,” a kind of lexicon of vibrational patterns that would allow us to generate virtually any texture sensation we want. IBM predicts that we will build this lexicon (and is in fact working on doing so), and that it will open up a whole new online experience. Think: the ability to feel the texture of a shirt you are shopping for online through the screen of your smartphone.
If any of this sounds far-fetched, it is. And IBM’s track record at predicting the future isn’t flawless. By its own accounting, there are years (particularly 2011 and 2009) whose predictions have not yet borne fruit, though the five-year clock hasn’t run out on those predictions yet. Others, like real-time speech translation (2006), near-field communication payment technology for cell phones (2007), driver-assist technologies like self-parking and voice-activated commands (2007), and consumer-market mind-reading devices (2011), have all proven true to some degree.
Judging purely from IBM’s previous record, at least some of the technologies described above should be on the five-year horizon. Siri suddenly seems quaint by comparison.