Amazon’s Alexa may be the doyenne of digital assistants, but she isn’t queen.
Alexa’s vast talents, evidenced by her myriad integrations with everything from televisions to thermostats to microwaves and more, are a big reason media and consumers alike have anointed her the best digital assistant. In this writer’s technological life, however, Alexa plays only a bit part. I use Alexa exclusively in the kitchen, where the aforementioned Amazon Basics microwave and the Echo Wall Clock are both paired with an old Echo Dot that was previously collecting dust in my office. Being able to ask Alexa to start the microwave and to set timers on the clock, with its big LED indicators, is a huge accessibility win for me when cooking. These admittedly niche use cases fly in the face of critics who have derided Amazon for using Alexa to make cheap, so-called “gimmicky” home products. After all, it’s best not to sneer: not everyone can easily use the keypad on a microwave.
Such gimmickry has found its way into Amazon’s wearables, including its $270 Echo Frames glasses. Amazon bills the glasses as helping you “save time so you can focus on what matters most” by leveraging Alexa to control your phone, music, smart home devices, and more. The premise is obvious, and rightly so: Amazon is pushing people to let Alexa be the center of their digital universe.
And therein lies the problem: Alexa has no sovereignty over my life.
Amazon sent me a pair of Echo Frames (the sunglasses version) earlier this year, and I’ve spent the last several months using them. In terms of niceness and build quality, they’re on par with the microwave and wall clock: unassuming and, in all honesty, unexciting, yet wholly serviceable for their intended functionality. In other words, Chanel or Gucci they are not, but they nonetheless get the job done.
At a practical level, the Echo Frames are far more interesting from an accessibility standpoint than for what they can actually do. Conceptually, the hands-free design is seemingly a boon for people with disabilities, particularly those for whom interacting with traditional screens is difficult or downright impossible. The ability to discreetly ask Alexa to play music or put a pound of butter on your grocery list goes beyond sheer convenience; for people with certain needs and tolerances, using one’s voice to manage these tasks can prove far more accessible than even conventional motor-based accessibility software.
The same can be said for Meta’s Ray-Ban Stories, which essentially do the same things as the Echo Frames, but with the obvious emphasis on Meta’s family of services. Meta sent me a pair of Stories for testing a couple of months ago, and I’ve found them practically identical to the Echo Frames in nearly every way imaginable. Both look roughly the same, do more or less the same jobs, and charge similarly. They’re so similar, in fact, that at times it’s been hard to tell which pair I happened to pick up on the way out of the house. In Meta’s case, the big signifier is the Ray-Ban logo on the temple.
The Stories have a camera with which you can take photos, but I have yet to test the feature. Like Amazon, Meta positions the glasses as a hands-free way to interact with technology that helps you stay in the moment with other human beings. Again, it’s fascinating to consider the accessibility implications of these products: the convenience and fashion factors play second fiddle to the questions of how people use technology. The hands-free nature of both devices, and particularly the fact they’re worn on the face, opens up a world of discussion about how the next frontier of wearable technologies will enable access for disabled people over the next decade. The possibilities in this space are tantalizing.
The biggest question is whether the Echo Frames and Ray-Ban Stories are accessible. As ever with accessibility, the answer lies in one’s needs and tolerances, and in where one’s allegiances land. If you’re Blind or low vision, for instance, it’s fair to wonder whether either will really work without robust screen reader-like support. Voice-first interaction is inclusionary in many respects, but it can be exclusionary if your speech patterns veer far from typicality. Likewise, neither is particularly pragmatic if you’re not married to an ecosystem. The Echo Frames’ value proposition rises considerably if you’re all-in on Alexa; as someone who’s all-in on Apple products, especially HomeKit, the fact that Alexa serves only an esoteric function for me limits the Echo Frames’ appeal.
As an ardent supporter of wearable tech by way of the Apple Watch and AirPods, I’ve come to an interesting realization by using the Echo Frames and Stories interchangeably. Perhaps one reason both devices lack true functional appeal is that I always forget to charge the things. To me, they’re much fancier versions of the inexpensive drugstore sunglasses I’ve used for years. I don’t think of the Echo Frames the way I do my Apple Watch: as a computer that needs to be powered to have utility. The Echo Frames (and Stories) are mere dumb sunglasses, something to keep the sun out of my eyes when I’m outdoors. I clearly haven’t internalized wearing a computer on my face, which is what these devices effectively are. Then again, prior to the Apple Watch, I never wore a watch either.
All this is to say that although the Echo Frames and Stories have limited appeal to me (and surely scores of other people) today, both feel like harbingers of tomorrow. It’s still very much early days for face-worn technology, so it’s hard to fault Amazon and Meta for dipping their toes in the water. Products like the Echo Frames and Stories do well now to keep the sun away, but it isn’t hard to imagine the day will soon come when they do a lot more, and accessibly at that.