Meta already makes the best AI hardware
An actually useful AI gadget has arrived, even if it's mostly just sunglasses
The thing about wearable gadgets is you have to want to wear them.
Even if the Humane AI Pin worked flawlessly, which by all accounts it very much does not, I still can’t see myself wanting to stick it on my chest. As I wrote last year, it’s the kind of thing you’d need to orient your wardrobe around, or at least the kind of thing that might make you consider your t-shirt’s thread count.
Times can change, of course. AirPods were widely panned for looking silly at launch, for example. But that was a take that was easy to predict would age badly. New form factors have a much higher barrier to adoption because it isn’t immediately obvious how — or whether — they can slot into your life.
That’s why Meta’s take on the concept of AI hardware works so well. Its second-generation smart glasses, produced in tandem with Ray-Ban and now simply named the Ray-Ban Meta Smart Glasses, start with basic functionality that almost everyone has a use for. They’re sunglasses. Just as importantly, they’re Ray-Bans.
I’ve been using the Meta Ray-Bans for several months, but I held off on writing about them until I could test their recently released AI assistant. I’m glad I did, because I’m now convinced this is a form factor that is going to stick around.
The Meta Ray-Bans basically look like Ray-Bans. The arms are a little thicker, and there are small circles on the temples for the camera and the recording light, but you’d have to know what you were looking for to think these are anything other than regular sunglasses.
There are three styles available: Wayfarer, Headliner, and the newly added Skyler. I got the Headliner with teal blue lenses in a “shiny caramel transparent” frame that exposes a little of the tech inside. I think they look pretty good. I am not really a huge sunglasses person, but I’ve had the same pair of Wayfarers for about twelve years and I don’t find the Meta Headliners much different to wear.
The faux-leather charging case is very slick and comes in at roughly the same size as a standard Ray-Ban case. You power the case over USB-C, and it juices up the glasses via electrical contacts located under the nose bridge. Meta cites four hours of battery life for the glasses on a single charge, and the case itself should provide up to eight additional charges.
That battery life really depends on what you’re doing with the glasses; you’re probably not getting four hours of continuous video recording. But I will say I’ve never actually had them die on me or needed to top up the case on the move. Even if I did, they’d still work as sunglasses.
Beyond that useful functionality, the Meta Ray-Bans have three key features: audio playback, a camera, and an AI assistant that leverages the first two.
When paired with your phone, the glasses work just like Bluetooth headphones. There’s a touch-sensitive strip on the right arm that lets you swipe to adjust volume and tap to play and pause. Tiny speakers in the arms direct audio to your ear without blocking any external sound, which makes them handy in situations where you might still want to hear people or vehicles around you. I’ve never enjoyed the “transparency” modes on noise-cancelling headphones before, and this is a more natural way to achieve the same thing.
It does, of course, come at the expense of bass response, and people around you will be able to tell that you’re listening to something — if not what you’re listening to — when you crank the volume up past about halfway. These glasses aren’t practical to use on a subway or in any other loud environment. But the sound quality is surprisingly good for non-critical listening like podcasts, and I generally prefer using them to earbuds when walking or biking around my neighbourhood.
“Surprisingly good” is also how I’d describe the tiny camera embedded in the frame. There’s a physical shutter button located above the capacitive audio controls: you can press it once to take a photo, or hold it down for a second to start recording a video of up to a minute in length.
All photos and videos are captured in portrait orientation, which makes sense for Instagram Reels and Stories but limits their utility for YouTubers hoping to replace their GoPro. Still, the footage that comes out of this tiny sensor is generally crisp, colourful, and pleasingly exposed. It isn’t very good in low light, but well, these are sunglasses.
Composing your shots and footage takes a little practice. The camera is located above your left eye, which means there’s some parallax relative to what you’re actually looking at, and it’s hard to judge distances accurately. I’ve also found that the camera often gets obscured by my admittedly floppy hair, though Meta recently updated the glasses to warn you when it thinks that has happened. (“Camera covered by hair”, it curtly states.) The experience reminds me of using a half-frame film camera with a bad optical viewfinder, especially since you can’t check the results immediately.
But it’s fun to use in situations where your hands are full or where you wouldn’t normally think to pull out your phone. The Meta View app also offers automatic cropping and straightening edits for some photos when you import them, which are usually helpful.
![](https://substackcdn.com/image/fetch/w_474,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd47e972a-0321-4e21-8194-507158805c65.heic)
![](https://substackcdn.com/image/fetch/w_474,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F48fa1ebc-a3bf-4aee-b49c-fa38185339b6.heic)
![](https://substackcdn.com/image/fetch/w_474,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa049f4a3-f962-4ba2-9913-28f986771b7b.heic)
![](https://substackcdn.com/image/fetch/w_474,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99b46169-c830-4c7c-9991-a38e4e0b1529.heic)
![](https://substackcdn.com/image/fetch/w_474,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffdfb7ab4-0e80-4261-a198-2ebc935cbb35.heic)
![](https://substackcdn.com/image/fetch/w_474,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F49f40e26-afe7-416e-9a87-8d73cfa4c039.heic)
![](https://substackcdn.com/image/fetch/w_474,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F49de4d01-b457-4659-9eaa-416fd6d45a1e.heic)
![](https://substackcdn.com/image/fetch/w_474,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc94da023-3b6c-43b6-921b-957ae02e3f97.heic)
![](https://substackcdn.com/image/fetch/w_474,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa9528fd9-a37f-4455-b870-f069330c8cb5.heic)
The Meta Ray-Bans are also serviceable as a video camera. The camera sometimes has trouble adjusting between dark and light scenes, but overall the image quality is better than you might expect and the software stabilisation is solid. It’s a fun way to capture footage you otherwise might not get at all, like when I was mostly holding on for dear life while riding in this car on the Dubai sand dunes last month.
Meta’s multimodal AI assistant, simply called Meta AI, is a recent addition to the Ray-Bans. It’s officially only available in North America and didn’t do anything at all for me until I took a recent trip to Hawaii, which seems to have unlocked the functionality for me everywhere. I’ve since used it in Dubai and at home in Japan, although it’s clearly not yet set up to handle languages like Japanese and Arabic.
You activate the assistant by saying “Hey Meta”. The microphones on the Ray-Bans are really good; it’s easy to invoke the assistant in loud situations even when you’re speaking quietly. I’ve also heard from others that the microphone quality is stand-out impressive on audio and video calls.
Meta AI is a relatively standard Siri-style voice assistant, but the biggest thing that sets it apart is its access to the glasses’ camera. If you start your request with “Look at…”, Meta AI will snap a photo and analyse the image to inform its response. I will just say straight-up that it isn’t very reliable right now, while noting that some of my testing has been outside of its intended market.
Hiking Diamond Head in Hawaii, I’d ask what kind of tree I was looking at and get solid answers. (“The trees appear to be Acacia Koa and eucalyptus.”) I’d ask what I was looking at from various points along the trail and get answers like “The image appears to be taken from a high vantage point, possibly the summit of Diamond Head.” Not bad.
If I held up my Xiaomi 14 Ultra and asked what it was, Meta told me “This appears to be a smartphone with an attached camera lens”. Not exactly true, because the Photography Kit is more of a camera grip, but close. My iPhone 15 Pro, meanwhile, was identified as an iPhone 11 Pro.
While Meta AI isn’t supposed to work in Japan yet, I still got some useful results walking around my neighbourhood. It could accurately identify the types of businesses I was walking past, for example, and gave me a good explanation of the carp-shaped koinobori streamers that are ubiquitous in May. It told me whether a vending machine sold coffee and which row to find it on. I would say its guesses as to what kind of car I was looking at were on point about half the time.
I would have liked to test the translation capability — “Hey Meta, look at this menu and translate it” — but I haven’t spent much time in countries where that would be supported. I did try getting the glasses to read me some French and Spanish newspaper articles from my iPad, and while I can’t speak to the nuances of translation, it seemed like Meta AI provided a reasonable summary. This would probably be a lot more useful if Meta supported languages written in non-Latin scripts.
The need to say “Look at” at the beginning of every command using the camera is pretty awkward. I can see why it’s a hard problem to solve, but it’d be much more natural to be able to say things like “Hey Meta, what kind of car is this?” rather than “Hey Meta, look at this car and tell me what it is.” I don’t want to have to think about my syntax when I’m talking to a computer.
It’s clearly early days for Meta AI. But as I kept on testing these Ray-Bans, I found myself thinking of more situations in which I might actually want to use the assistant. It often made me hope for better results than I got.
Honestly, I think that means Meta has nailed the hard part, or at least an equally hard part. The camera and audio features were genuinely useful long before the AI assistant was added. I want to wear these glasses and I want them to be even more useful. I think the path for Meta to improve its assistant is easier than the path for the Humane AI Pin or the Rabbit R1 to become truly desirable devices.
The second-gen Meta Ray-Bans are just an extremely well-executed product. I find myself using them every day when the weather is appropriate and wanting to do so when it isn’t. I don’t wear glasses or contacts, but I’m honestly tempted to get a second pair with clear or transition lenses.
I never regret wearing the Meta Ray-Bans, even when their more ambitious features don’t work as well as I’d like. If and when they do, they’ll be a truly killer product. Until then, they’re just the world’s most useful sunglasses.