Are smart glasses a vision of our AR future?
A review of the Ray-Ban Meta smart glasses
Last week I was in Vienna to speak at AWE, one of the main conferences for the AR industry. As has been the case for several years, AR glasses of many different types, from contact lenses to industry-friendly form factors, were all being exhibited as the industry makes its slow but steady march towards headworn computing.
At FOV Ventures, we’re excited about this progress but are still cautious about timelines for mainstream adoption. There are many challenges that need solving (perhaps a subject for another post). But we are watching with interest as the category develops and have made some bets in companies like Doublepoint that are looking to solve one of the key challenges around user input.
But as we look towards the future, it’s still useful to reflect on where things are at today. Last month Meta launched its latest Ray-Ban smart glasses, starting at $299. Whilst a long way off from being full AR glasses, they arguably represent the most mainstream head-worn computing devices available today. So we thought we’d better get some and try them out!
And having used them now for the last few weeks, I have to admit I’m more impressed than I thought I would be. But we still have a long way to go! Here’s my review…
Design trumps features
For smart glasses to become mainstream, design will have to trump features. Anything we wear on our faces will have to look amazing. These do!
Design matters when you’re wearing a computer on your face
Meta’s multi-year deal in 2020 with EssilorLuxottica to leverage the Ray-Ban brand and design expertise was a genius move. It raises the question: will Apple design its own or do a similar deal with a third party? If the latter, then with whom? Amazon is working with Carrera on its Echo Frames, for example.
I already wear Ray-Bans for regular sight and own a pair of Ray-Ban sunglasses. If I hadn’t told anyone I don’t think they would have known I was wearing ‘smart glasses’. And setup was frictionless, with no cables and a beautiful charging case.
Unboxing the Ray-Ban Metas
The biggest friction when it comes to design? It’s the sunglasses form factor. If you buy the sunglasses (and live in Northern Europe) you’re not going to wear these a lot. They’ll end up in your gadget graveyard drawer. But if, unlike me, you’re not already a glasses wearer, then it’s going to feel weird buying clear lenses. You’re not used to wearing glasses for extended periods of time and, most importantly, you’ll feel self-conscious.
I initially bought the sunglasses and I’ve now opted to get the transition prescription lenses. But be warned, the cost quickly escalates to a point where this will end up being a tougher purchase for most people. Luckily I can just about justify the cost as a work expense.
Wearing sunglasses on a cloudy autumn day
Utility straight out of the case
Audio smart glasses have been around for a while now. It’s a use case I’ve somewhat dismissed in the past. But as someone who sticks AirPods in their ears for an increasingly long part of the day, just having speakers in my glasses is surprisingly good and gives me instant utility. There is some audio leakage, which would prevent me from using them on the tube, but in other settings this will be fine.
This utility around audio makes me wonder whether Meta should also have shipped a cheaper SKU with no cameras, to compete with Amazon’s Echo Frames and the Bose Frames. But arguably for Meta, adding cameras as default is core to their ambitions around AI and AR.
As an aside, it remains to be seen whether audio will shine as a preferred UI for wearable devices. Again, sitting on the tube, or even with a group of friends or family, is a terrible place to start talking to your glasses. This could be solved by advances in voiceless speech recognition, something that French startup Wisear and others are already working on. But this is likely still some way off.
Camera on your face
So let’s talk about the cameras. The new glasses have an improved 12MP ultrawide camera with 1080p video recording at 60fps. For reference, that’s comparable with my iPhone 12. And these glasses actually take a really good picture and video. You won’t catch me live-streaming to Instagram any time soon (although Insta influencers are likely going to be the early power users), but I am someone who likes to document daily memories of the kids and other interesting tidbits.
Where I hope the glasses will really add utility in the camera department is in capturing things I normally wouldn’t (or couldn’t) get my phone out for — like kayak trips, bike rides, fleeting moments whilst walking the dog, spontaneous shots of the kids. And later this year we have a big family trip planned where I hope I’ll be able to capture lots of great memories. That’s my intention at least, let’s see if the novelty has worn off by then.
One pleasant surprise is that the Meta View app makes syncing and sharing media more straightforward than I was expecting (something the Quest app has been terrible at). But it’s definitely optimised for Meta’s own properties, and this is probably an area that will suit me better on an Apple-made device. For many reasons, though, Apple will likely be a much later entrant into this market. I’d happily bet that if they were to release a similar form factor, their sales could easily be 10x.
As Arian pointed out though, in our recent podcast on Meta Connect, Meta’s first glasses shipped with a lot of resistance internally. However, the company’s philosophy is to ship early and ship regularly. So even with this latest release, it seems that Meta’s main goal here is not to have a mainstream hardware hit for the masses. They’ll be happy to sell a couple hundred thousand of these and get both data and feedback from the users that will help them iterate for future devices.
One of the likely reasons that a player like Apple is not the first mover in this smart glasses category is undoubtedly privacy concerns. A discussion of this would be a whole separate blog post, but needless to say, whilst the possibilities unlocked by recording everything you see are endless, a lot of attention will need to be paid to this area to limit potential abuses. The brighter privacy LED light on the front of the glasses is a simple but good start.
More prominent privacy light. The glasses are disabled if it’s covered.
But it’s not just video data being collected; it’s footage of people who potentially don’t know they’re being recorded, as well as additional data including the wearer’s voice, biometrics etc. For me, wearing these smart glasses more regularly has been a useful way to start thinking about and calibrating my views on some of these topics.
Is XR the interface for AI?
AI was a big topic of conversation at both Meta Connect and this year’s AWE conference
One of the more exciting parts of Meta’s Connect event this year was the inclusion of AI. It’s perhaps not surprising that AI took center stage; if nothing else, it’s definitely what Meta’s shareholders wanted to hear. But it also means that expectations may still run well ahead of reality. The promise is that the glasses will be able to translate a sign that I’m seeing or locate me based on a building that I’m looking at. But these are all things that won’t ship until early next year. So far, any voice interactions I’ve had with my glasses have been limited to asking them to record a video or play Spotify. I’m looking forward to seeing how this area evolves, especially if it’s something they can open up in some form to third-party developers. But if Arian from Earthling is right, and the AI features are still lagging behind, then maybe I shouldn’t get my hopes up too high.
But these aren’t AR glasses (yet!)
It’s a nice novelty to be wearing a face camera. And as I say, I’ve been surprised by how great the audio utility has been. But I think for many this won’t be a compelling enough case to purchase smart glasses today, especially given the considerable frictions around things like the perceived social stigma and reasonable privacy concerns. And in my short time with the glasses I’ve found myself really wanting some small AR features, even if just limited ones, e.g. a simple readout of what media I’m listening to, text messages, camera status etc.
So when will we get there and what’s next in the roadmap?
What we think we know so far is that there will be a third generation of Meta smart glasses due in 2025. These could have a viewfinder display for viewing incoming texts, scanning QR codes, and translating languages in real-time. They’re even rumored to potentially come with a neural interface band that allows the wearer to control the glasses through hand movements.
Full AR glasses (codenamed Orion) might not come until 2027. That’s if Meta can stay the course and nail the right waveguides, display tech and UI. But a slow ramp into this AR 1.0 generation probably isn’t a bad thing, with the previous form factors (and Meta’s mixed reality efforts) helping to build the AR developer ecosystem and address the biggest privacy concerns. But most importantly, I think each successive generation of new smart glasses will decrease the social stigma of people wearing a computer on their faces.
Still several moves to go until we’re at full and mainstream head-worn AR
I’m glad I bought the Ray-Ban Metas. Even with the flaws and caveats, it’s been a great learning exercise, and I’m looking forward to my prescription lenses being delivered in a few weeks’ time to see if wearing smart glasses regularly is a habit that will stick. This category has a long way to go, but the current generation will certainly provide some clues as to what’s next and a glimpse at our AR future.
If you’re a startup building in this space then we’d love to hear from you.