My first reaction when I put on Meta’s $800 Ray-Ban Display was excitement. As frivolous as it may seem to have yet another screen in your life, there’s something that happens when you basically glue a display to your eyeball. You transform from a person with glasses to, like, a spy, or a cyborg—a cyborg spy! Yeah, that’s it. Ghost in the Shell fans will get it.
When I initially donned these smart glasses at Meta Connect, I smiled because this was what I felt had been missing from my previous Ray-Ban smart glasses experience. A big, bright, full-color screen—the one thing people always wanted to know about when I showed them my deflatingly screenless Ray-Ban Meta AI glasses.
That little dose of magic is even further heightened by Meta’s Neural Band, a small wristband that, when slipped over your hand, reads the electrical signals in your arm, allowing you to navigate the Meta Ray-Ban Display with a series of finger pinches and thumb swipes.
Meta Ray-Ban Display
Meta’s Ray-Ban Display is impressive hardware that’s limited by its lack of apps.
Pros
- Display is impressive
- Neural Band feels like magic
- Navigation and notifications can be useful
- Battery life holds up
Cons
- Not enough apps
- Camera isn’t upgraded
- Neural Band can be uncomfortable over long periods
- Probably a privacy nightmare
- Existentially exhausting
The only other experience I can liken this combo to is the first time I used Apple’s Vision Pro, which creates a similar kind of magic, sans wrist-worn wearable. In the Vision Pro and Meta’s Ray-Ban Display, you’re using technology the same way a wizard casts a spell, waving your hand to make the computer do the things computers do, which, if you’ve watched as much sci-fi and fantasy as I have, is pretty f*cking rad.
Weirdly, I’m reminded of my grandma (my nonna, actually; sorry for being Italian), when I first showed her how to use a computer mouse on my family’s PC when I was a kid. You move this little plastic thing on a desk, and it moves something on a screen! Groundbreaking! It seemed silly to me at the time, but now, as I get older… I get it. Inputs and screens are exciting, no matter how jaded we get with the experience of using them.
So, there it is. Excitement: that was my first reaction to the Meta Ray-Ban Display. My first reaction. It’s not, however, my last.
A see change

If you’re like most people, the first thing you’re probably dying to know about the Meta Ray-Ban Display is how they actually look when they’re on your face.
The titular display part of the Meta Ray-Ban Display is a 90Hz (30Hz minimum) 600 x 600-pixel full-color screen with a 20-degree field of view in the bottom-right corner of the right lens. The good news about having a screen in that area specifically is that it doesn’t obstruct your vision when you’re walking around and doing stuff. The bad part? Well, every time you look at it, you’re looking down and away as though you’re worried a snake might slither in and lunge at you. It’s not what I would call a natural resting face (let’s call it resting Meta face), but let’s be honest, there is nothing natural about walking around with a screen strapped to your eyeball.
The screen inside the Meta Ray-Ban Display is also very bright, with a max brightness of 5,000 nits. That might not seem like a stat worth paying attention to, but believe me, in a pair of smart glasses, it’s critical. I’ve used dimmer screens in other pairs, and they’re hard to see outside. And if you’re spending $800 (before tax) on a pair of smart glasses, you’d better be able to use them while you’re walking around in the real world.

In terms of style, you should know that all Meta Ray-Ban Display frames have transition lenses by default. That may seem like a bummer if transitions aren’t your thing, but it makes sense: the screen needs to be effective indoors and out, and the only way to do that is to give it contrast in direct sun. Indoors, the lenses stay see-through so you don’t go stepping on your cat or something. I find the screen to be very visible even in direct sun, probably because of the added contrast from the transition lenses. Also, you can buy these with prescription lenses, so that’s good news for those reading this from behind a pair of regular glasses.
But just because the Meta Ray-Ban Display are bright does not mean the screen is perfectly sharp. I find it sharp enough to satisfy the dream I had in my head of what a pair of display smart glasses from Meta would look like, but others might be less enthused. Not everyone sees the screen the way I do, either. One colleague in my office described it as “shaky,” though I wouldn’t describe it that way at all. Others said they struggled to see it or found it disorienting.

One thing I definitely found disorienting is that the lenses in the Meta Ray-Ban Display are actually mirrored. This, I assume, is part of the construction of the “geometric waveguides,” which is what the display tech inside the smart glasses is called. Geometric waveguides are special because they use mirrors to cut down on visual artifacts, reflecting light instead of splitting it like the diffractive waveguides in other smart glasses. That also makes the screen hard to see from the other side, and in practice, it holds up: people probably aren’t going to know your screen is on unless you’re in a dark area with the brightness turned up.
The benefits of using a geometric waveguide are clear, but the mirroring can also be distracting at times, since you can see behind you if you look to the right, or even sometimes when you’re looking straight ahead. I do feel like my visibility actually decreases when I’m wearing the Meta Ray-Ban Display, probably more than with other screen-equipped smart glasses I’ve worn.
That being said, I find the screen to be up to snuff, if not the highest resolution in the world, but I highly recommend you go see it for yourself before buying a pair. Luckily, Meta is requiring people to get sized for wristbands in-store anyway. But hey, maybe you’re considering buying them aftermarket! And, if you are, I would suggest… not. The lesson of the screen, if there is one to be had, is that your experience may vary: it’s surprisingly bright, if not always sharp or hi-res. Ultimately, the screen is just a part of the picture; it’s also about what you can do on said screen, and on that front, the possibilities are… not endless.
So… now what?
As cool as controlling a UI by waving your fingers in the air is, that thrill (for most at least) probably won’t last forever, and when it fades, you’re going to wonder to yourself, “Okay, so what now?” In Meta’s case, the “what now” part consists of a few things, and I really mean a few.
You’ve got bread-and-butter phone-type stuff like messaging, which encompasses Meta’s first-party messaging apps: WhatsApp, Messenger, and Instagram. It also, thank god, works with both iOS and Android, allowing you to send and receive messages from your phone. Those notifications can all be shown on the smart glasses’ display as they roll in, popping up as bubbles. You can also opt, via the Meta AI app (where you’ll have to connect Instagram, WhatsApp, and your phone), to have messages read out loud through the built-in speakers. Personally, I find that feature a little annoying. You have a screen now; you might as well use it.

As you might imagine, the Meta Ray-Ban Display are more tightly integrated with Instagram and WhatsApp than with your phone. In the Instagram app, for example, you can even watch Reels that are sent to you via DMs, which is a nice touch if you want to catch up with that one friend who spams you with memes while you’re commuting on the subway. Sending pictures via your phone, however, is a little less clean. In iOS, pictures show up as a link that the recipient has to tap in order to see them. It’s a small hurdle, but one that creates just a little more friction than a first-party gadget would.
Mostly, though, the friction isn’t a dealbreaker. Messaging someone from the Meta Ray-Ban Display connected to your phone (in my case, an iPhone) is pretty simple, though you still have to use the voice assistant on the smart glasses to do so. I tried texting my colleague, Ray Wong, for example, by saying “Text Ray,” and fortunately, Meta AI asked me, “Which Ray?” After that, I was able to use my thumb to select the correct one, then pinch to select “dictate” and say my message out loud, which in this case was, “I’m texting you from my stupid glasses.” I was even able to respond with a thumbs-up emoji after Ray texted back, “You look like a dork.”
There’s nothing revolutionary about being able to send and receive messages with smart glasses, but I will say being able to see notifications as they roll in is a novel experience, and the ease with which you can respond feels more refined than you’d expect from a category of device that feels like it’s only existed for five minutes.
Outside of messaging, there’s also video calling, which works about the same as it did on the Ray-Ban Meta Gen 1 and 2 AI smart glasses, though with a new twist: calls via WhatsApp, Messenger, and Instagram can show your POV from the Meta Ray-Ban Display’s camera. That’s obviously not an ideal way to video chat, but if you’re trying to show someone something, it could be useful. I’m not really sure how the Meta Ray-Ban Display would ever overcome the fact that there’s no camera pointed at your face for a more natural video calling experience, either. A Meta-made version of Apple’s spatial Personas in the Vision Pro, maybe? I certainly hope they’d be higher-res than Meta’s Horizon Worlds avatars.
I tried video calling my partner through Instagram to test the feature out, and the results were… low-res. I compared the quality to the same call from my iPhone 17 to make sure it wasn’t just Instagram, and the camera resolution on the Meta Ray-Ban Display was indeed the issue, since the video from my iPhone looked much clearer. For lots of reasons, I don’t think this is a feature I’d use much.
In addition to calling and messaging, there’s also navigation, which Meta says is still in beta. I used the Meta Ray-Ban Display’s navigation feature to do some walking in New York, and it was decent. I even used dictation to enter the address I was headed to, and it worked on a busy sidewalk in Times Square. Having turn-by-turn navigation glued to your eyeball like that isn’t always going to be useful, but in certain situations it can be, and walking through Times Square definitely felt like one of them. Sure, I could have pulled out my phone just as easily, but there was something more freeing about being able to just glance down at the map on my face to make sure I was headed the right way. It also freed up my hand to double-check the address on my phone, though screen maxing like that felt a little dystopian.

Meta does offer zoomed-in and zoomed-out views inside the UI, which is good if you’re moving fast and need to see farther ahead, like on a bike. Right now, when I’m using a bike share and I want to make sure I’m going the right way, I usually pull over to the curb and check my phone, which is not ideal. Meta even went as far as to integrate some non-essential map features into navigation on the Meta Ray-Ban Display, too, including tabs that help you search for cafes, restaurants, parks, and “attractions.” According to a helpful information tab in the maps app, Meta is pulling that information from sources like OpenStreetMap, which offers publicly accessible map data.
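For the curious, OpenStreetMap’s data really is open: anyone can query it. Here’s a minimal sketch of pulling nearby cafes from the community-run Overpass API; the endpoint, coordinates, and search radius are my own illustrative picks, and this says nothing about how Meta’s actual pipeline works.

```python
import requests

# Query OpenStreetMap's community-run Overpass API for cafes near Times
# Square. Illustrative only -- not Meta's pipeline.
OVERPASS_URL = "https://overpass-api.de/api/interpreter"

query = """
[out:json];
node["amenity"="cafe"](around:500,40.7580,-73.9855);
out body 10;
"""

resp = requests.post(OVERPASS_URL, data={"data": query})
resp.raise_for_status()

for node in resp.json()["elements"]:
    print(node["tags"].get("name", "(unnamed cafe)"))
```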
Another pair of nice-to-have features you might be interested in is live captions and live translation, which are exactly what they sound like. Live captions uses the microphones in the smart glasses to listen to your surroundings and caption them in real time on the display, while live translation does the same thing, converting one language into another as it goes. I tested both: live captions works fairly well, while the latter… well, you’ll find out.
Live captions kept up with a fairly fast-paced YouTube video, and while it didn’t nail 100 percent of the words, the broad strokes were all in place. If you’re hard of hearing or have an auditory impairment, I can see live captions being useful, provided you’re in an environment where the glasses can pick up sound okay. One impressive trick is the Meta Ray-Ban Display’s ability to know when the wearer is talking and leave that speech out of the captions. In real conversations, though, that’s sometimes a downside: two people will often talk over each other, and Meta AI can miss what your conversation partner said in its effort not to capture your voice in the caption.
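To make that wearer-filtering behavior concrete, here’s a toy sketch of the idea, assuming a hypothetical transcript where each chunk of speech is already tagged with its speaker. Meta hasn’t published how its pipeline actually works; this just illustrates the filtering and why overlapping speech gets lost.

```python
# Hypothetical diarized transcript: each segment tagged with a speaker.
segments = [
    {"speaker": "other", "text": "The train leaves at nine."},
    {"speaker": "wearer", "text": "Wait, say that again?"},
    # Failure mode: overlapping speech gets lumped in with the wearer's
    # voice and dropped from the captions entirely.
    {"speaker": "wearer", "text": "(both people talking at once)"},
]

# Caption everything except the wearer's own speech.
captions = [s["text"] for s in segments if s["speaker"] != "wearer"]
print("\n".join(captions))  # -> The train leaves at nine.
```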
Similarly, live translation works, but with some variability. I tested it in a conversation with my partner, who is bilingual (she speaks English and Spanish), and like live captions, it superimposes the translated text of your speaking partner onto the screen in real time. The only problem is, when I read the translated text back to confirm it was correct, my partner frequently reported it being slightly off. It’s not that the meaning was wrong, per se (though sometimes it was), but the text was translated rather than interpreted, if you catch my meaning. That is to say, the words were mostly correct, but they weren’t rephrased to fit English grammar, which made reading and understanding the conversation touch-and-go at points.

These hiccups are bound to happen in any translation app out there, and Meta AI is no different, but I wouldn’t say I was wowed by its acumen. That being said, I can see this feature coming in handy while traveling, if only because glancing down and to the right at a translation on your smart glasses screen is marginally more natural than looking down at a phone. Google Translate is still more refined, though. And if you do need to look at a phone, all of the translated text appears in the Meta AI app, where it can be easily referenced.
One other minor gripe with live translation is that each time you want to switch languages, you have to go into the Meta AI app and download that language onto the smart glasses, which takes a couple of minutes (and that’s on my home Wi-Fi, not LTE). It’s not a huge deal, and you probably won’t be switching between languages a ton, but if you’re visiting a country where multiple languages are frequently spoken, it could get kind of annoying.

There are other apps that I’ll get into later (camera and photos), but right now, those are your main features. And when I say “main features,” what I mean is they’re the only features you’re going to get. There is no app store here, which means if you were excited to doomscroll through TikTok on the Meta Ray-Ban Display, you’re out of luck. There is no Gmail. There is no Slack. And it’s not just third-party apps; there’s no proper Instagram or Facebook app, either. This, to me, seems like an odd choice, given that some other smart glasses out there, made with fewer resources, manage to do a lot more. Even Inmo’s Air 3, which are otherwise not a good pair of smart glasses on a hardware level, have the Google Play Store and let you download pretty much any app and use it in 2D.
It’s deflating, to an extent, to spend $800 on a future-forward device and then find that the future can’t run even a pared-down version of Instagram. I can only assume that will change at some point, but I don’t have a crystal ball, and I can only review the Meta Ray-Ban Display as it is right now. And what it is right now is a pair of smart glasses with an impressive display that you can’t do a ton with.
Time for (neural) band practice
I think what gets lost in the hype of a bright and shiny screen inside Meta’s Ray-Ban Display is the screen’s companion, the Neural Band. One of the biggest hurdles to developing a functional pair of smart glasses with a screen in them is figuring out how exactly you should be controlling the UI that the screen displays. Meta’s solution is a pretty magical wearable called the Neural Band, which uses electromyography (EMG) sensors to detect electrical activity in your muscles and nerves and then translates that activity into inputs.
To control the Meta Ray-Ban Display, you use a series of tiny gestures. For scrolling, make a fist and move your thumb across the top of it to navigate left, right, up, or down, letting you select apps, scroll through menus, and so on. To select, you pinch your thumb and index finger together once. Going back is a single pinch of your middle finger and thumb, a double pinch of those two digits wakes the screen or puts it to sleep, and a long middle-finger pinch brings up a quick menu for going home and jumping to apps. There’s also a double tap of your thumb against your fist that activates Meta AI, if you don’t want to use the wake command, “Hey Meta.” Every time the Neural Band registers an input, it gives you a nice little haptic buzz to tell you it’s working.
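If it helps to see that whole control scheme in one place, here it is condensed into a little lookup table, written as code. To be clear, this is purely my own summary; none of these names come from Meta’s SDK or any real API.

```python
from enum import Enum, auto

class Gesture(Enum):
    THUMB_SWIPE = auto()          # thumb across the top of a closed fist
    INDEX_PINCH = auto()          # thumb + index finger, once
    MIDDLE_PINCH = auto()         # thumb + middle finger, once
    MIDDLE_DOUBLE_PINCH = auto()  # thumb + middle finger, twice
    MIDDLE_LONG_PINCH = auto()    # thumb + middle finger, held
    THUMB_DOUBLE_TAP = auto()     # thumb tapped twice against the fist

# What each gesture does on the Meta Ray-Ban Display.
ACTIONS = {
    Gesture.THUMB_SWIPE: "navigate left/right/up/down (select apps, scroll)",
    Gesture.INDEX_PINCH: "select",
    Gesture.MIDDLE_PINCH: "back",
    Gesture.MIDDLE_DOUBLE_PINCH: "wake or sleep the screen",
    Gesture.MIDDLE_LONG_PINCH: "quick menu (home, apps)",
    Gesture.THUMB_DOUBLE_TAP: "activate Meta AI (instead of 'Hey Meta')",
}
```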
The Neural Band is surprisingly quick at reading your inputs, though there will most likely be times when you have to send an input twice. Accuracy also has a lot to do with whether the band is strapped on tight enough, so if you’re having trouble, try making it a little tighter. And if you’re wearing the Neural Band all day, you can expect some accidental inputs, though I didn’t find this to be a huge issue. A couple of times here and there while I was typing, the Neural Band registered my finger motions as an input, but for the most part, it was reliable and steady.
The band itself doesn’t have much to it; it’s a piece of cloth with plastic sensors inside that you slip on and strap to your wrist just like you would a fitness band from Whoop or Polar. While it doesn’t look like much on the surface, it’s made from a fairly high-tech material called Vectran, which is soft but strong and was used in the crash pads of Mars rovers. The band has 18 hours of battery life, a magnetic charging cable, and an IPX7 rating, which means it can be submerged in 1 meter of water for up to 30 minutes without ingress. (Full disclosure: I did not test the band’s water resistance for fear of damaging it, so dunk it at your own risk.)

To me, it’s all as cool as it sounds. Being able to control smart glasses this way is novel, and companies are still figuring out the best input method (smart rings are also hot right now), but this is my favorite so far in terms of user experience. With that said, it’s also a wristband. I previously wrote that making people put on a wristband to use their smart glasses is a big ask, and I still mostly stick by that statement. Whether the process is difficult or not, wearing the band for long periods can get a little irksome.
The Neural Band leaves a mark on your arm afterward from the sensors pressing into your flesh, and sometimes it has to be strapped on fairly tight to work properly, which isn’t ideal for those of us who can get the ick from wearing something on our wrist all day (this is why I don’t wear watches of any kind). In a world of fitness bands and smartwatches, the Neural Band looks banal enough, but I am unconvinced that this is the solution to navigating glasses UI. My guess? This band is going straight in the trash when smart glasses makers figure out how to fit hand and eye tracking into the frames themselves. Oh, and speaking of the trash, don’t accidentally throw your Neural Band out or lose it; that’ll cost you $199 for a replacement.

Despite those downsides, the Neural Band is impressive overall, but I should warn you that misusing it can have dire consequences. My first word of advice: make sure you put the band on correctly. While testing the Meta Ray-Ban Display, I accidentally put the band on backwards, since knowing which way is which can be confusing at first (just make sure the button on the top sensor is facing you). The result, obviously, was incorrect inputs. No big deal, right? Wrong. Don’t forget, the Meta Ray-Ban Display is connected to your phone, and because of that, I somehow ended up taking a picture and sending it to a friend of mine by accident. Luckily, everything was PG. Crisis averted, but it could have made for a very awkward situation.
If that idea sends a shiver down your spine, my word of advice would be to practice with your Neural Band for a few days before you connect your phone, so by the time you’re ready to start zooming around, you can do so with a reasonable degree of certainty that you won’t accidentally give your Aunt Debra an eyeful.
How does the future feel?
One thing you should be very aware of when you’re choosing a pair of smart glasses is not just how they look, but how they feel on your face. Again, these are $800 smart glasses, and to get your money’s worth, you’re probably going to want to wear them for fairly long periods of time. And if you’re wearing them for long periods of time, they need to not destroy your nose. Mostly, the Meta Ray-Ban Display were comfortable during longer periods of use, though they are objectively heavier than the non-display models: 69g (70g for the larger size) compared to the Ray-Ban Meta Gen 2’s 52g.
The Meta Ray-Ban Display aren’t just heavier, they’re also thicker; the frames are a great deal chunkier than those of Meta’s screenless smart glasses. Still, thanks to the design acumen of EssilorLuxottica, I think they work on most faces. Don’t get me wrong, they make most people look like Nerds (capital N intentional) or Brooklyn hipsters from 2004, but compared to other smart glasses with screens, they’re stylish and comfortable for the most part.

The style choices are limited right now, which is a bit of a bummer since the Ray-Ban Meta AI glasses (the ones without a screen) come in lots of different styles. The Meta Ray-Ban Display only come in Black or Sand, and both finishes are shiny rather than matte, which is not my first preference when it comes to Ray-Bans, both because I don’t love the look and because shiny frames attract fingerprints a lot more easily.
One thing that I love this time around is the case, which is black, uses the same pleather material as the ones for the Ray-Ban Meta AI Gen 1 and 2, and can be collapsed to lie flat, which is great when you want to slip it into your pocket. That’s a bigger perk than you might think, since these are very expensive smart glasses, and you’re going to want to take care of them by putting them back in the case whenever you take them off. I wouldn’t plan on repairs for scratches or damage being cheap or easy, if you can even get them repaired at all.
As long as we’re talking about wearing the Meta Ray-Ban Display for long periods, we should talk about battery life, too. There’s a new battery in all of Meta’s 2025 smart glasses (the same battery in the Oakley Meta Vanguard glasses and the Ray-Ban Meta Gen 2), and it pays off the same way it does in the non-display versions. After a full day’s worth of intermittent use, including about an hour of audio playback, messaging, navigation, and more, starting at about 10:30 a.m., the Meta Ray-Ban Display (which started with a full charge) were at about 18% by the time I got home at 8 p.m.
For me, that feels more than sufficient, though I guess that depends on how much of a smart glasses junkie you are. Charging the Meta Ray-Ban Display is basically the same as always. You just slide them in between the arms inside the case so that the bridge rests on top, and the charging case will do the rest. You’ll get up to 50% charge in about 20 minutes, and the case holds 30 hours of battery in total.
How you think the Meta Ray-Ban Display looks will be subjective, but you should be prepared for a little more heft than screenless versions, even if EssilorLuxottica does a good job of making that extra size work for the smart glasses.
Cameras, speakers, and Meta AI
There are some aspects of the Meta Ray-Ban Display that are mostly unchanged from the screenless Ray-Ban Meta Gen 1 and 2. The audio is as solid as ever, both for calls and music playback, which is great since voice calls are still one of my favorite things to do with these smart glasses. Pictures are… fine. I was a bit disappointed to find that the Meta Ray-Ban Display doesn’t have the 3K, 60 fps video capability of the Ray-Ban Meta Gen 2 and Oakley Vanguard glasses, and instead maxes out at 1440p at 30 fps.
There’s also the same 12-megapixel sensor as the Ray-Ban Meta Gen 1 and Gen 2, which means similar photo and video quality. While shooting with the Meta Ray-Ban Display, I was underwhelmed. For $800, it would have been nice to see an improvement here, but you’re probably buying these smart glasses for the screen, anyway. If you want a more in-depth camera analysis, read our original Ray-Ban Meta Gen 1 review; it’s the same camera hardware, and as a longtime owner of those smart glasses, I can tell you the results from the Meta Ray-Ban Display are about the same.

What is new on the camera front is a real-time viewfinder that shows what you’re looking at in the display, which is nice. To take a picture, you just open the camera app via your preferred method (voice, or selecting it in the UI) and then pinch your index finger and thumb to get snapping. One cool twist here (no pun intended) is that you can zoom using the Neural Band by pinching your index finger and thumb and then twisting your wrist counterclockwise. There’s a very simple photos app where you can look at all the pics and vids you’ve snapped, too, and you can send them to people from that menu as well.
Meta AI works the same as it does on previous generations of glasses, though you get a pop-up circle in the Meta Ray-Ban Display that tells you when Meta AI is activated and thinking. It works fine for simple voice commands like “take a picture” and “launch Spotify,” but uses the same AI models as other generations, so more complex tasks like “what am I looking at?” or “what kind of flower is this?” can be hit or miss. AI is still one of the least compelling parts of Meta’s smart glasses, despite the company’s emphasis on that front. I would like to see Meta focus on making a smoother voice assistant over computer vision capabilities, but that’s also a very tough nut to crack. Just ask Google, Amazon, and Apple, which have been trying for like a decade now.
As always, there’s the Meta AI app, which shoves annoying AI content in your face that I could do without, but if you’re going to use these smart glasses, you’ll have to make peace with that. In fact, there are quite a few things you might have to make peace with if you’re going to use the Meta Ray-Ban Display.
Rose-tinted smart glasses?
There were some things I expected to feel while wearing the Meta Ray-Ban Display, and some things I didn’t. One thing I expected to feel was a little distracted. Turns out I was right. In theory, smart glasses with screens in them could be less all-consuming than phones, but in practice, I just don’t think that pans out. Sure, you don’t have access to apps and all of the things that keep your head glued to a screen, but notifications are also distracting, and even more so when they’re plastered on your eyeballs. There is something that happens when you bring your body and your eyes that close to a screen, and I’m not sure I like what that something is. Which brings me to the next thing I felt, though unexpectedly this time: worried.

Putting on Meta’s smartest glasses had me feeling surprisingly introspective, and a little douchey. On one hand, zooming around a screen floating before your eyes using just one hand is cool, but on the other, it’s a little depressing. Is the novelty worth the hit to your attention? Is it worth the implications for your privacy, or the privacy of the people around you? Is it worth wondering, as you walk around like a screen-zombie staring at the ground, whether people think you’re a total tool? Those are things you’ll have to decide for yourself, but they’re questions worth asking, and it’s better to ask them now, before it’s too late.
And maybe I’m blowing things out of proportion. Maybe we won’t have to reconcile any of those questions. Maybe this whole smart glasses thing will fall flat on its face, and that will be that; just glowing rectangles in our hands from here on out. That’s a possible future, but one that I sincerely doubt. With companies like Google, Apple (reportedly), and Samsung all waiting in the wings to launch their own versions of the Meta Ray-Ban Display, I’m willing to wager we haven’t seen the last of the smart glasses boom, which means we’re going to have to make some decisions.
So, what say you: are smart glasses the future? Or are they just a one-way ticket to glasshole 2.0?
Original Source: https://gizmodo.com/meta-ray-ban-display-smart-glasses-review-is-this-the-future-we-really-want-2000679520