Welcome back to Multicore. This is Instruction Set for the week of Friday, September 29th.
Heck of a week for news! Let's just get into it.
Meta announced the Quest 3 VR headset. I don’t have a ton to say about this, since its design, price, and basic capabilities were officially revealed months ago. From the hands-on impressions I’ve seen — always read Adi Robertson on VR — the mixed reality function seems to work better than the Quest Pro’s but worse than the Apple Vision Pro’s, which would be a pretty good outcome for a $500 device.
The Quest 3 uses Qualcomm’s new Snapdragon XR2 Gen 2 chip. Qualcomm hasn’t gone into specific architectural details, but it claims 2.5x GPU performance and 8x AI performance over the Gen 1 chip in the Quest 2. The Gen 2 also has dedicated hardware for tracking hands, objects, and head movements.
This is a legit improvement in silicon, so I was a little surprised that Meta didn’t announce any Quest 3-exclusive software beyond the obligatory mixed reality stuff. That kind of thing will probably come later; the smartly designed VR version of Resident Evil 4 was the first big Quest 2 exclusive, and that was announced more than six months after the headset came out.
I think I’ll be okay with my Quest 2 for now. It should be good enough to handle Meta’s vision for VR until proven otherwise. Plus it lives on a big Anker wireless charging dock on my shelf that I am really not all that motivated to get rid of.
The more interesting announcement was the second generation of Meta and Ray-Ban’s smart glasses, formerly called Ray-Ban Stories and now just referred to as “Ray-Ban Meta Smart Glasses”. The cameras and audio, the core functionality of the previous version, are apparently much improved, and the glasses now support live streaming to Instagram and Facebook.
But the real step forward could be the onboard Meta AI assistant, based on Meta’s Llama 2 large language model. With an update next year, the glasses will be able to recognise what you’re looking at when you ask them questions, sort of like a hands-free Google Lens. Theoretically, you’d be able to look at an unfamiliar building or a foreign-language restaurant menu and simply ask your glasses to explain what’s in front of you.
I’m not ready to give Meta’s AI the benefit of the doubt, but this could be a really compelling product. They more or less look like regular Ray-Bans, they should be able to capture passable photo and video, and they could plausibly replace your headphones for casual listening — to say nothing of the potential AI features down the road. Oh, and they presumably work fine as sunglasses. I actually think I’d buy that for $300.
Speaking of hardware with nebulous AI capability!
The Financial Times:
OpenAI is in advanced talks with former Apple designer Sir Jony Ive and SoftBank’s Masayoshi Son to launch a venture to build the “iPhone of artificial intelligence”, fuelled by more than $1bn in funding from the Japanese conglomerate.
Sam Altman, OpenAI’s chief, has tapped Ive’s company LoveFrom, which the designer founded when he left Apple in 2019, to develop the ChatGPT creator’s first consumer device, according to three people familiar with the plan.
Altman and Ive have held brainstorming sessions at the designer’s San Francisco studio about what a new consumer product centred on OpenAI’s technology would look like, the people said.
They hope to create a more natural and intuitive user experience for interacting with AI, in the way that the iPhone’s innovations in touchscreen computing unleashed the mass-market potential of the mobile internet.
The process of identifying a design or device remains at an early stage with many different ideas on the table, they said.
The Information was first to report details of Ive and Altman’s discussions.
This sounds like it’s pretty far off if it’s going to happen at all, but it would obviously be a fascinating way for Ive to get back into consumer hardware. It’d also raise the same questions I’ve been thinking about ever since the AirPods were announced as essentially AR devices. Apple trades on its industrial design prowess, but how does that square with future technology trending towards being as unobtrusive as possible?
I wouldn’t want to bet against Ive on anything other than laptop keyboard design, but dedicated AI gadgets from the likes of Humane are going to have a tough time looking cool. I think Meta might be onto something by simply making it non-negotiable that its tech needs to fit into a pair of Wayfarers.
Supply chain oracle Ming-chi Kuo has had himself a week. Here he is on the prospects for a presumed non-Pro version of the Apple Vision Pro: