Style Is an Algorithm
No one is original anymore, not even you.

Kyle Chayka:

The camera is a small, white, curvilinear monolith on a pedestal. Inside its smooth casing are a microphone, a speaker, and an eye-like lens. After I set it up on a shelf, it tells me to look straight at it and to be sure to smile! The light blinks and then the camera flashes. A head-to-toe picture appears on my phone of a view I’m only used to seeing in large mirrors: me, standing awkwardly in my apartment, wearing a very average weekday outfit. The background is blurred like evidence from a crime scene. It is not a flattering image.
 
Amazon’s Echo Look, currently available by invitation only but also on eBay, allows you to take hands-free selfies and evaluate your fashion choices. “Now Alexa helps you look your best,” the product description promises. Stand in front of the camera, take photos of two different outfits with the Echo Look, and then select the best ones on your phone’s Echo Look app. Within about a minute, Alexa will tell you which set of clothes looks better, processed by style-analyzing algorithms and some assistance from humans. So I try to find my most stylish outfit, swapping out shirts and pants and then posing stiffly for the camera. I shout, “Alexa, judge me!” but apparently that’s unnecessary.
 
What I discover from the Style Check™ function is as follows: All-black is better than all-gray. Rolled-up sleeves are better than buttoned at the wrist. Blue jeans are best. Popping your collar is actually good. Each outfit in the comparison receives a percentage out of 100: black clothes score 73 percent against gray clothes at 27 percent, for example. But the explanations given for the scores are indecipherable. “The way you styled those pieces looks better,” the app tells me. “Sizing is better.” How did I style them? Should they be bigger or smaller?
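
Amazon hasn’t published how Style Check actually arrives at these numbers (per Chayka, it’s some mix of style-analyzing algorithms and human reviewers), but the complementary percentages suggest a pairwise comparison normalized so the two scores sum to 100. A minimal sketch of that idea, assuming a hypothetical raw style score per outfit and a softmax over the pair — an illustration, not Amazon’s method:

```python
import math

def style_check(score_a: float, score_b: float) -> tuple[int, int]:
    """Hypothetical reconstruction of a pairwise style comparison.

    score_a, score_b: raw outfit scores from some unspecified style
    model (Amazon's actual model and human-review step are not public).
    Returns complementary percentages that sum to 100, matching the
    73-percent-vs-27-percent form of the app's output.
    """
    exp_a, exp_b = math.exp(score_a), math.exp(score_b)
    pct_a = round(100 * exp_a / (exp_a + exp_b))
    return pct_a, 100 - pct_a

# e.g. raw scores of 1.0 (all-black) vs. 0.0 (all-gray):
print(style_check(1.0, 0.0))  # -> (73, 27)
```

Whatever the real pipeline looks like, a scheme like this explains why the app can only ever say one outfit beats the other by some margin: the percentages are relative to the pair being compared, not absolute measures of style, which may be why the accompanying explanations feel so indecipherable.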