Ray-Bans, Robots, and Real World AI
Why smart glasses are the interface layer for the trillion-dollar spatial computing shift
Welcome to Viewpoints, news and investment activity covering the next generation of computing - brought to you by FOV Ventures.
This Month:
IXI featured on BBC News
Ray-Ban Meta Sale Numbers
FOV Portfolio Company Hits Half a Million Users
At FOV Ventures, we watch Meta's numbers because they provide one of the most reliable leading indicators for spatial computing adoption.
In his post-earnings interview with The Information, Zuckerberg doubled down on this vision, articulating how "personal super intelligence" differs from today's AI paradigm - emphasising that the future of AI is one that understands and operates in our physical world through spatial computing interfaces.
As part of the earnings, we learned that Ray-Ban Meta glasses outsold VR headsets 2:1, validating Zuckerberg's thesis that glasses will be "the ideal form factor for AI" because they create persistent sensory bridges between the digital and physical worlds. Equally important, this demonstrates that consumer demand for AI is strong enough to drive physical hardware adoption in ways that virtual worlds alone couldn't. While VR is still marching towards mass-market appeal, AI-enhanced glasses are accelerating because they augment rather than replace our physical reality.
Beyond Meta, this signals readiness for the broader physical AI revolution happening right now.
For example, Genesis AI just emerged from stealth with $105M to build robotics foundation models, and video AI companies like Luma and Runway are reportedly also moving toward robotics, recognising that understanding physical movement is the next trillion-dollar opportunity.
Smart glasses represent the consumer interface for this physical AI stack because they're the only device that can seamlessly bridge digital AI capabilities with real-world spatial context. While robotics and foundation models handle the heavy computation and physical manipulation, glasses provide the always-on sensory layer that lets AI see what you see, understand your spatial context, and respond appropriately in the physical world. They're the keyboard and mouse for a world where AI operates across both digital and physical domains.
The companies building the software layers that make this physical/digital bridge programmable - from simulation environments to AI-hardware interfaces - represent the next AWS-scale opportunity.
At FOV Ventures, we've positioned across this entire stack - from portfolio companies automating game testing with AI agents that transfer to real robots, to 3D content generation platforms, to gesture control interfaces. The recent growth of Ray-Ban Metas is the latest early signal of the platform shift happening now.
If you're building or investing in this space, we'd love to hear from you.
FOV NEWS
In case you missed it… last month we published a new analysis on Defence Tech x Spatial Computing: A New Era of Dual-Use Innovation.
Gearing up for the final events of Q3 - come meet the FOV team at:
Petri at Tech BBQ (Copenhagen) - August 27-28
Dave in Zurich - September 9-10
Sointu at European VC Platform Summit (Berlin) - September 10-12
Petri at IPEM (Paris) - September 24-26
Dave at Bits & Pretzels (Munich) - September 29 - October 1
If you're attending any of these events and want to connect, reach out to the team!
FOV PORTFOLIO NEWS
IXI featured on BBC News with their upcoming autofocus glasses, talking about their adaptive tech that enhances everyday clarity without sacrificing style.
Scenario launched "Generate in Parts", which transforms any image into a clean, modular 3D model in just one click.
nunu.ai launched Nexus - AI agents that can control computers, phones, and robots through natural language. Currently helping game studios automate QA testing.
Ray Browser featured Tall Team co-founder John Halloran in an interview about their upcoming game Obby Roads and the future of web gaming.
Flow Computing published their explainer on the Parallel Processing Unit (PPU), a licensable IP block that delivers scalable, high-throughput parallel performance. (more)
GEEIQ's CEO joined a podcast to discuss brand success in virtual worlds - from Roblox's evolution to Fortnite's algorithm shifts. (more)
M-XR rolled out their new website and UI, with Marso now being tested with select users before broader access opens up.
ZENOS unveiled Game Twins - real-time, 1-to-1 recreations of live games inside Unreal Engine, solving esports' problem of outdated observer tools.
Starstuff hit half a million early beta users in just months with their viral playable social platform designed for Gen Z!
Atopia Space won Munich's Innovation Prize and was chosen by the City of Munich to bring public art programs into virtual worlds, co-creating a virtual stage inside a digital twin of an iconic Munich site.
FOV JOBS
IXI - Lead Android Developer (apply)
Doublepoint - Principal ML Engineer in Helsinki, Finland, to advance their gesture control technology for AR experiences. (apply)
M-XR has multiple openings in London, UK: ML/AI Infra, Eng & Data Ops, ML/AI Applied Research, and ML/AI Image Models roles available for their spatial computing platform. (apply)
GEEIQ has multiple positions open in London, UK: Client Services Executive, Head of Marketing, and Fullstack Engineer (£50k-65k/year) for their gaming analytics platform. (apply)
ai|coustics has multiple openings in Berlin, Germany: GTM Engineer, Machine Learning Engineer, and Founding Sales Development Representative roles available for their speech enhancement technology team. (apply)
AQL Robotics is expanding their Tampere, Finland team with three roles: Mechanical Design Engineer/Mechatronics Engineer, Laboratory Technician, and Machine Learning Engineer. (apply)
Iconic.AI - Principal Technical Designer in London, UK (apply)