The Smart Glasses Overload
I am currently wearing a pair of smart glasses called the Even Realities G2. Another two pairs, from Rokid, sit on my desk. A few feet away, I have the Meta Ray-Ban Display charging alongside their Neural Wristband. In my closet are six pairs of $50 smart sunnies sent by an overzealous Walmart representative. Those sit next to some Xreal, RayNeo, and Lucyd glasses, plus an old pair of Razer Anzu. Later, I am calling my optician because I hope to test a pair of the new Ray-Ban Meta Optics, which can supposedly handle my challenging prescription. I am drowning in smart eyewear, and even more is on the horizon.
Right now, it is difficult to tell these devices apart. Not only do they look alike, but most are similarly unsubtle in their attempts to stick AI on your face. They are loaded with promises about how wearable AI can change your life: It will make you healthier by tracking what you eat, make you smarter by capturing notes on every word you utter, and make you more creative by transforming your surroundings into playlists and date ideas. However, after a year of testing, I have yet to see anything that lives up to those promises. If the smart glasses category is going to succeed, it needs a better story for why they should stay on your face all day.
The Spy Fantasy vs. Reality
Regardless of what model I put on in the morning, modern smart glasses make me feel like James Bond. I can walk around the neighborhood wearing a pair of chunky Ray-Bans, listen to my audiobook, and see my texts without pulling out my phone. If I feel like getting a coffee, I can put in the name of a local cafe and get directions. No one looking at me would know. That is doubly true when the glasses come with cameras or gesture-based accessories, like the Even Realities G2 and Meta Ray-Ban Display. Secretly controlling an invisible display that only I can see is incredibly cool. Capturing my cat's antics without him knowing makes me feel like a wildlife documentarian. I have never felt more hip than when I was walking down a Williamsburg street last summer, wearing a pair of Oakley Meta HSTN. The most stylish man in Brooklyn stopped me to ask about the glasses and my experience. However, I have also never felt less like a good citizen than when I unintentionally recorded a florist while testing the Meta Ray-Ban Display.
Good modern smart glasses are defined by how much you can get away with. It is good if no one clocks them, making them stylish and versatile enough for everyday wear. It is good if you have a fancier model that does not require you to speak AI voice commands aloud, making you less conspicuous while still getting benefits. Even Realities G2 glasses can be controlled by tapping on the side of an accompanying smart ring. I could be looking at a teleprompter on the G2 display, and someone standing in front of me would be none the wiser. When I was at my local LensCrafters getting fitted for the Nuance Audio, a pair of glasses that double as over-the-counter hearing aids, the optician asked if I was ready to be a superspy because I would be able to hear all the good gossip from across the room. In reality, good gossip comes straight to your DMs, and I mostly just hear tinny garbling.
There is a reason spies operate incognito. Recognizability is a threat when you are wearing one of these devices, for you and the people around you. In public bathrooms, I now worry about making others uncomfortable. I am not a creep, but strangers do not know that. When I occasionally wear camera glasses to a concert or show, I wonder how long I will be able to do so before venues start banning them. Cruises and courtrooms already have. On one hand, I got okay-ish Stray Kids concert footage last year. On the other hand, will Patti LuPone stop her next Broadway show to berate me if the glasses accidentally turn on, their LED indicator light flashing in a dark crowd? The angrier people get about this tech's privacy invasions, the more nervous and deceitful I feel wearing them. People might know these devices exist, but most still do not expect to see them in their day-to-day lives. I have yet to have a negative in-person interaction, but would the internet be calling these pervert glasses if glassholes were not making a comeback?
The AI Letdown
The optimist in me says this is the most affordable, stylish, comfortable, and capable smart glasses have ever been. The skeptic in me asks whether that is a good thing. It is a big step forward that I do not feel ugly wearing these glasses. The harder thing is convincing myself to keep them on. Big Tech wants smart glasses to be AI wearables, but right now, the AI stinks for most people. Meta AI is not great, and the glasses that come with proprietary models layered over ChatGPT are not much better. These AI integrations are fine for basic tasks like controlling music playback or asking about the weather. However, the advanced AI features are often a battery drain, stupendously basic, or unusable in daily life, sometimes all of the above.
My spouse, who exclusively uses their Meta glasses to identify obscure car models, sometimes drags me along to local car shows. One time, I had to listen to Meta AI fail six times to identify a Ferrari. At the Vatican Museum, it correctly identified the Belvedere Torso, but the lack of a holy Wi-Fi signal rendered the AI otherwise useless. Rokid AI constantly tells me I have not adequately set up permissions for certain features, or that my Bluetooth connection is spotty. I quickly gave up on the Lucyd glasses because using ChatGPT through them was more trouble than it was worth. Even Realities recently built in a Conversate feature, which uses AI to define phrases or present useful factoids related to your conversation. I tried using it in a product briefing, and the feature peppered my vision with the definition of artificial intelligence and wearable technology.
When I go to tech companies' shiny smart glasses demos, I always ask what scenarios I should try. I am usually given examples like identifying a book to read from a shelf of travel books, or getting recipe suggestions from a well-curated shelf of pasta, red wine, and sun-dried tomatoes. Ooh, maybe ask the AI to generate a playlist based on a piece of artwork hanging on the wall? These scenarios feel utterly inorganic. My to-be-read pile of books is a mishmash of genres. When I snapped a photo and asked for a recommendation, Meta AI told me it did not have preferences or opinions, and I should just pick what interests me. My fridge is a hodgepodge of veggies about to wilt, separate from my pantry. I tend to play music based on my mood, not a painting. The features that have felt purposeful are occasional. I like turn-by-turn navigation, except New York City has a handy grid system and every smart glasses maker recommends you do not use the devices for driving. AI translation requires quiet environments without cross talk, which do not materialize often. Same goes for live captioning. Teleprompters can be useful if you are the sort of person who often gives lectures, but I am just not.
Practical Challenges and Prescription Woes
I have found smart glasses to be most useful when traveling. Outside of some accessibility communities, these glasses are best for business people or content creators always on the go, which is maybe why Silicon Valley is so gung-ho on them. For everyone else, they are a cool pair of open-ear headphones. Wearing these glasses, it has never been clearer that companies are inventing scenarios because they so badly want this to work. The better the tech gets, the more I am left asking: But why are you insisting I need this on my face?
Sometimes I feel tech companies have forgotten that, first and foremost, people wear glasses to see. Only in the past few weeks has Meta, the front-runner, come out with a version of its glasses that supports all prescriptions. Of all the brands I have tested, only Even Realities confidently said they could absolutely handle my prescription with zero problem, accommodating up to ±12 diopters. Impressive, though you are still out of luck if you need bifocals. Most of these devices do not support my vision needs, which means every morning, one of the first choices I make is: Do I wear contacts and smart glasses, or my normal dumb glasses? Sometimes that is an easy choice, but most days it is not. As the tech improves, it will be easier to make these devices lighter and incorporate displays for more complex prescriptions. However, because of the countless permutations of face size and vision, this is an infrastructural and supply chain problem too, one that smart rings also share. Fixing this will take time.
Even if I intend to wear smart glasses all day, my eyes sometimes get so dry I have to swap them out for my regular glasses. Also, what happens if your glasses break? This has happened to me a few times in my life. The last time, I was lucky that a pair of pliers and a heat gun did the trick, but these kinds of DIY repairs are impossible with smart glasses, where the tech lives in the frames. New glasses in the US can be an exorbitant expense, and the whole idea that I would not be able to replace nose pads or screws on my own is concerning. I never thought I would have to ponder right to repair for my vision. A smartphone's main benefit is that it is useful for everyone, regardless of their body or needs. There are multiple sizes, plentiful accessibility features, and accessories like cases, straps, and mounts for any situation. Until glasses can claim the same, they are doomed to be niche devices.
A Glimmer of Hope
Oddly, in a way, I am more optimistic about smart glasses than ever. The current crop of smart glasses still is not fully it, but for the first time it is not because the devices just plain suck. It is more that I do not think anyone has presented a clear idea of why you would want these on your face all day, every day. Finally, I can at least see glimmers of why I might want to use these glasses sometimes. Regardless of what Big Tech thinks, AI is not it. In real life, you look unhinged nattering away at your glasses. Companies have also seemingly forgotten gadgets are meant to be put away. A phone can go into your pocket. A laptop gets stashed in your bag. The only time I take my glasses off is to sleep. In an ideal world, I would like the smart part of glasses to be something I can easily remove, depending on the situation. I find certain features potentially useful for my job, but like with my phone, I would love a mode that turns them off when I am off the clock. Big Tech does not seem to agree. It wants the next big thing regardless of whether it makes sense for the device. To me, that is where all this cultural friction comes from.
I would wager most people would be okay with temporarily sacrificing some privacy in specific scenarios where the benefit outweighs the cost. Smart glasses in museum tours are awesome. Using them as tools in factories to help multilingual employees communicate makes sense. A camera on your face 24/7 that can surreptitiously capture images and then feed a faceless corporation's AI your data to ultimately fuel its targeted ad revenue is instantly creepy, and no thank you. I have tested about a dozen of these things. Several more are on the way, and I am sure I will hear companies tell me how the next generation will fix my issues with the current one, or come up with several more half-baked reasons why these should be 24/7 devices. However, so far, none of these fancy AI use cases are what I am enjoying. The smart glasses I enjoy most are the jabroni-chic Oakley Meta Vanguard. I use them exclusively for training and recording race moments, everybody else can clearly see why I look like a mall cop, and no one is likely to punch my brains out because who wants to go near a sweaty cyberpunk doofus while they are running as fast as they can? It is okay that these glasses are not all-purpose. They were never meant to be.



