Welcome back to Multicore. This is Instruction Set for the week of Monday, April 1st.
Not the wildest news week ever, but let’s get into it.
Meta is bringing a major update to its Ray-Ban smart glasses sometime this month, adding a multimodal AI voice assistant that can describe what you’re looking at. It’ll tell you what landmark is in front of you, for example, or translate text that the glasses’ camera can see. The New York Times had an entertaining exclusive look at the software, while Mark Zuckerberg hasn’t been shy about showing it off of late.
This feature has been in beta for a few months, but Meta has yet to roll out even its basic voice assistant to countries outside the US. I have a pair of Ray-Ban smart glasses and really like them for audio playback and video recording. But while I can invoke the voice assistant by saying “hey Meta”, it’s been virtually useless during my testing in Japan and Europe.
I can say “hey Meta, take a picture” or “hey Meta, pause [music]” and that’s about it. The only voice-assistant-y thing I can get it to respond to is “how’s the weather”, which is not generally something I need to know while wearing sunglasses.
I keep meaning to write about these glasses because I do think they’re a fascinating product. I guess I’ll get around to it once this update is out and Meta says something about availability in the rest of the world.
Vivo announced the X Fold3 Pro, its latest folding flagship.