Amazon Echo Look uses a camera and AI to judge your outfit, sell you stuff

'Alexa, how do I look today?'
Amazon Echo Look

Just last month, Amazon introduced a Prime feature called Outfit Compare, which lets users upload a pair of selfies showing different outfits for evaluation by style experts. Today, Amazon doubled down on its push to guide the style of everyday users with its home assistant and fashion guru, the $200 Echo Look, which can currently only be ordered by invitation.

The oblong device is a version of the Echo smart home hub with Alexa, but with the addition of a built-in camera meant to help users pick out the best possible outfit. While the introduction of Outfit Compare originally leaned heavily on the idea of human "style experts" making fashion calls, the Alexa integration, called Style Check, relies at least in part on AI. Amazon says its algorithm will continue to learn from user and style-expert feedback, making the process more automated over time.

When you want feedback on an outfit, you can ask Alexa to take a picture or video. The camera is oriented so it can capture a full-body image from a reasonably close distance. It then applies a "depth-effect," which strongly resembles the fake background blur of the iPhone 7 Plus's Portrait mode.
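Amazon hasn't published how its depth-effect works, but the general technique behind this kind of fake background blur is straightforward: given a per-pixel depth map, keep the near subject sharp and composite a blurred copy of the frame behind it. Below is a minimal toy sketch of that idea in NumPy; the function names, the box blur, and the hard depth threshold are all illustrative assumptions, not Amazon's actual pipeline (which likely uses a learned depth estimate and a smoother falloff).

```python
import numpy as np

def box_blur(img, k=7):
    """Naive box blur: average each pixel over a k x k window, per channel."""
    pad = k // 2
    padded = np.pad(img, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def depth_effect(img, depth, threshold):
    """Keep pixels nearer than `threshold` sharp; blur everything behind them.

    img:   H x W x 3 float array
    depth: H x W float array (smaller = closer to camera)
    """
    blurred = box_blur(img)
    near = (depth < threshold)[..., None]  # broadcast mask over channels
    return np.where(near, img, blurred)
```

A real implementation would feather the mask at the subject's edges rather than switching abruptly at a single depth value, which is where most of the visual quality in portrait-style blur comes from.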

From there, users dive into the companion app, where they can browse a Look Book full of past styles, share photos, and call up fashion advice via the Style Check feature.

This style feedback certainly won’t appeal to everyone, especially people whose wardrobes have little variety (who can tell one of my black t-shirts from another?), but equipping Alexa with a robotic eye does open up a lot of potential future functionality in terms of usability, entertainment, and of course, shopping.

We’re talking about Amazon after all, and that puts commerce at the core of all of this technology. Amazon started building object recognition into its ill-fated Fire phone back in 2014 with its Firefly tech. Other virtual assistants, like Samsung’s Bixby, have already demonstrated the ability to recognize objects in the real world and allow users to shop for them. The AI shortens the distance between seeing an object in the world and buying it.

Amazon Echo Look app

The outfit evaluations consider fit, color, and other attributes that Amazon could easily use to recommend items from its growing apparel shop. And outfits aren't the only thing the camera could notice about users: changes in body shape, complexion, and any number of other physical traits are also in view.

The specific AI recipe driving the algorithm remains a secret, so one wonders whether Amazon might use the Echo Look to undercut its e-commerce competition. If Alexa can recognize clothes from Target, for example, could it suggest that you swap them out for something from Amazon's own apparel shop?

From a security standpoint, Echo Look users will have to make peace with putting a camera and a microphone in their bedrooms, connecting them to the internet, and letting Amazon collect data from them. Of course, this isn't an entirely new concept, either: Nest cameras and smart TVs have done similar things for a while. But Amazon has already said it will permanently store images and videos, which can be managed through the companion app.

Adding cameras to virtual assistants is an obvious step forward and it seems very likely that the other AI butlers of the world will put more and more emphasis on being able to see us in addition to hearing us. Let’s just hope they have a good sense of style.