Walk the halls of the Consumer Electronics Show, or browse gadgets online, and you might hear that a gizmo has, or uses, AI. Artificial intelligence is a broad, catchall term, so it can be hard to know what it actually means for a product to have AI. Does it mean you can talk to it, and it talks back? That it can make decisions on its own? That it’s going to lead a robot army to harvest the organs of everyone you know?
The powerful technology is also becoming ubiquitous enough that it’s common to see it employed, and touted, by small companies you haven’t heard of, not just big players like Amazon or Google. Plus, companies that make gadgets that connect to a voice assistant, like Alexa, may use that connection as reason enough to call their products “smart.”
Here’s how to make sense of it all.
A key point to understand is that artificial intelligence isn’t just synonymous with a voice assistant. Those voices, like Alexa, make use of AI, to be sure—but there’s much more going on in the world of artificial intelligence.
Under the umbrella of AI is the large, dynamic field of machine learning. Frequently, when you encounter artificial intelligence in a product, it’s because it’s employing machine learning under the hood to do something, make a decision, or both. At its simplest, machine learning involves engineers feeding data into software, which then learns from it. The resulting algorithms can accomplish different tasks.
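To make that "feed data in, get a decision rule out" idea concrete, here's a minimal sketch of one of the simplest possible learning methods: compute the average feature value for each label, then classify new inputs by the closest average. The task, numbers, and labels are invented for illustration; they aren't from any real product.

```python
# A toy illustration of machine learning's core loop: labeled examples go in,
# a decision rule comes out. All data here is made up.

def train(labeled_points):
    """Learn the average feature value for each label (a 'centroid')."""
    totals, counts = {}, {}
    for value, label in labeled_points:
        totals[label] = totals.get(label, 0.0) + value
        counts[label] = counts.get(label, 0) + 1
    return {label: totals[label] / counts[label] for label in totals}

def predict(centroids, value):
    """Assign a new value to the label with the closest learned average."""
    return min(centroids, key=lambda label: abs(centroids[label] - value))

# Toy task: decide whether a room is "noisy" or "quiet" from one loudness reading.
model = train([(0.9, "noisy"), (0.8, "noisy"), (0.1, "quiet"), (0.2, "quiet")])
print(predict(model, 0.75))  # closer to the learned noisy average (0.85) -> "noisy"
```

Real systems use far richer data and far more sophisticated models, but the shape is the same: the rule is learned from examples rather than written out by hand.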
Here’s an example: the Danish company Jabra announced its latest headphones, the Elite 85h, at CES. The company advertises the new $299 ’phones as using “AI technology,” and they do, in the form of machine learning. They’re not “artificially intelligent” in the sense that they can read your mind and start talking to you, but the way they make use of AI is indeed smart.
Perhaps predictably, they call the feature in question SmartSound. “It listens to the environment the user is in,” says Fred Lilliehook, a senior product marketing manager at Jabra. “It automatically adapts the audio experience.”
If you’re on a bus, they can recognize that sound signature and put themselves in “Commute” mode, meaning that active noise canceling kicks in. In a public space, like a sidewalk, the headphones switch into a mode called “In Public,” which triggers a feature called “HearThrough” that uses the mics—they have eight in total—to amplify the sidewalk sounds.
But first, the headphones had to learn how to do this. For that, Jabra relied on a company it partially owns, called Audeering. “They have developed 6,000 different sound characteristics that they use to analyze sound scenes,” says Lilliehook. “That means they can identify: what does a restaurant sound like? What does a busy street sound like? What does a quiet office sound like?” To be able to do that, they had to first build up a “huge library” of sounds, he says.
Data like that is what engineers need to train machine learning systems to do something.
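To give a rough sense of how a library of sound profiles might drive mode switching, here's a minimal sketch: measure a few characteristics of the current audio, find the closest scene in the library, then pick the mode mapped to that scene. The feature names, numbers, and the "Quiet" mode are hypothetical; the real system's 6,000 characteristics and internals aren't public.

```python
# Hypothetical scene library: each scene is a profile of measured characteristics
# (loudness, low-frequency rumble, speech presence). Values are invented.
SCENE_LIBRARY = {
    "bus":          (0.9, 0.8, 0.2),
    "sidewalk":     (0.6, 0.3, 0.5),
    "quiet office": (0.1, 0.1, 0.3),
}

# Map each recognized scene to a headphone behavior. "Commute" and "In Public"
# are the modes described above; "Quiet" is a made-up placeholder.
SCENE_TO_MODE = {
    "bus": "Commute (noise canceling on)",
    "sidewalk": "In Public (HearThrough on)",
    "quiet office": "Quiet (default settings)",
}

def detect_scene(features):
    """Return the library scene whose profile is closest to the measurement."""
    def distance(scene):
        return sum((a - b) ** 2 for a, b in zip(SCENE_LIBRARY[scene], features))
    return min(SCENE_LIBRARY, key=distance)

def pick_mode(features):
    """Choose a headphone mode from the current audio characteristics."""
    return SCENE_TO_MODE[detect_scene(features)]

print(pick_mode((0.85, 0.75, 0.25)))  # loud and rumbling: closest to "bus"
```

The point of the sketch is the division of labor: the hard part, building profiles that reliably distinguish a bus from a busy street, is what the training data buys you; the mode switch itself is a simple lookup.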
In short, these headphones use, or “have,” artificial intelligence because machine learning has allowed them to be contextually aware and switch dynamically to the correct mode. (Actually, the headphones rely on the app you pair them with to make this happen.) It’s AI, but it’s limited to what Jabra and Audeering program it to do: it’s not going to learn Spanish on its own, or start adapting to an environment in a new way, unless it’s programmed to do so.
The headphones also let you summon Alexa, Amazon’s voice service, as well as the default digital assistant on your phone, like Siri or the Google Assistant. So they can also put you in touch with AI-related tech that lives outside the Jabra ecosystem. That kind of integration can be all a company needs to designate a product as “smart”—another tech buzzword that’s loosely connected to the idea of artificial intelligence.
Of course, Jabra is not the only company making a product with AI. “When I walk around, I see AI everywhere,” Lilliehook says, referring to the Consumer Electronics Show.
A video doorbell from Kasa, for example, has an artificial-intelligence feature. “The quad core processor and the AI engine provides face detection,” the company’s press release boasts. A company called Synamedia is using artificial intelligence and machine learning to look for patterns that indicate someone is sharing their streaming-service login credentials with someone else. Lululab offers an “AI skincare assistant.” The list goes on. Companies use the term because it’s trendy, but under the hood, machine learning is powerful in its own right.
The next time you hear that a product has AI, think: it’s probably using machine learning to do something specific that it’s been trained to do. And of course, this happens outside the hardware world, too: Yelp, for example, uses artificial intelligence to categorize the photos people upload, and can tell the difference between foods like burritos, pizza, and hot dogs.
Finally, at CES, there’s actually a bike that has Alexa in it. So is it an artificially intelligent bike? Nope. Think of it instead as an ebike that can access a voice assistant, which in turn uses AI to do its job.