Stanford's holographic AI glasses are coming for your clunky VR headset

Stanford's glasses prototype
Stanford/ZDNET

Over the past couple of years, with the introduction of the Apple Vision Pro and the Meta Quest 3, I've become a believer in the potential of mixed reality.

First, and this was a big concern for me, it's possible to use VR headsets without barfing. Second, some of the applications are genuinely amazing, especially the entertainment. While the ability to watch a movie on a giant screen is great, the fully immersive 3D experiences on the Vision Pro are really quite compelling.

In this article, I'm going to show you a technology that has the potential to definitively obsolete VR devices like the Vision Pro and Quest 3. But first, I want to recount an experience I had with the Vision Pro that had a bit of a reality-altering effect.

Then later, when we discuss the Stanford research, you'll see how they might expand on something like what I experienced and take it far beyond the next level.

Also: These XR glasses gave me a 200-inch screen to work with

There's a Vision Pro experience called Wild Life. I watched the Rhino episode from early 2024 that told the story of a wildlife sanctuary in Africa. While watching, I really felt like I could reach out and touch the animals; they were that close.

But here's where it gets interesting. Whenever something on TV shows somewhere I've actually been to in real life, I have an internal dialog box pop up in my brain that says, "I've been there."

So, some time after I watched the Vision Pro episode on the rhino sanctuary, we saw a news story about the place. And wouldn't you know it? My brain said, "I've been there," even though I've never been to Africa. Something about the VR immersion indexed that episode in my brain as an actual lived experience, not just something I watched.

To be clear, I knew at the time it wasn't a real experience. I currently know that it wasn't a real-life lived experience. Yet some little bit of internal brain parameterization still indexes it in the lived-experiences table rather than the viewed-experiences table.

Also: I finally tried Samsung's XR headset, and it beats my Apple Vision Pro in meaningful ways

But there are a few widely known problems with the Vision Pro. It's way too expensive, but it's not just that. I own one. I purchased it to be able to write about it for you. Even though I have one right here and movies are insanely good on it, I only use it when I have to for work.

Why? Because it's also quite uncomfortable. It's like strapping a brick to your face. It's heavy, hot, and so intrusive you can't even take a sip of coffee while using it.

Stanford research

All that brings us to some Stanford research that I first covered last year.

A team of scientists led by Gordon Wetzstein, a professor of electrical engineering and director of the Stanford Computational Imaging Lab, has been working on solving both the immersion and the comfort problem using holography instead of TV technology.

Using a combination of optical nanostructures called waveguides, augmented by AI, the team managed to construct a prototype device. By controlling the intensity and phase of light, they're able to manipulate light at the nano level. The challenge is making real-time adjustments to all the nano-light sequences based on the environment.

Also: We tested the best AR and MR glasses: Here's how the Meta Ray-Bans stack up

All of that took a ton of AI to improve image formation, optimize wavefront manipulation, handle wildly complex calculations, perform pattern recognition, deal with the thousands of variables involved in light propagation (phase shifts, interference patterns, diffraction effects, and more), and then correct for changes dynamically.

Add to that real-time processing and optimization done at the super-micro level managing light for each eye, deploying machine learning and constantly refining the holographic images, handling the non-linear and high-dimensional data that comes from dealing with changing surface dimensionality, and then making it all work with optical data, spatial data, and environmental data.

It was a lot. But it was not enough.

Visual Turing Test

The reason I mentioned the rhinos earlier in this article is because the Stanford team has just issued a new research paper, published in the journal Nature Photonics, showing how they are trying to exceed the perception of reality possible from screen display technology.

Back in the 1950s, computing pioneer Alan Turing suggested what has become known as the Turing Test. Basically, if a human can't tell whether the party at the other end of a conversation is a machine or a human, that machine is said to pass the Turing Test.

The Stanford folks are proposing the idea of a visual Turing Test, where a mixed reality device would pass the test if you can't tell whether what you're looking at is real or computer-generated.
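To make that concrete, here's a minimal sketch of how such a test could be scored. This is my own illustration, not anything from the Stanford paper: treat it as a two-alternative forced-choice experiment where an observer sees a real object and a holographic rendering and guesses which is real, and the display "passes" if observers do no better than chance.

```python
# Minimal sketch (my illustration, not the paper's protocol) of scoring a
# "visual Turing Test" as a two-alternative forced-choice experiment.
from math import comb

def p_value_at_least(correct: int, trials: int, chance: float = 0.5) -> float:
    """Probability of getting >= `correct` answers right by pure guessing."""
    return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(correct, trials + 1))

def passes_visual_turing_test(correct: int, trials: int, alpha: float = 0.05) -> bool:
    # The display "passes" if we cannot reject the guessing hypothesis:
    # observers identify the real object no better than chance.
    return p_value_at_least(correct, trials) > alpha

print(passes_visual_turing_test(correct=55, trials=100))  # True: ~chance performance
print(passes_visual_turing_test(correct=70, trials=100))  # False: the fake is detectable
```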

Also: The day reality became unbearable: A peek beyond Apple's AR/VR headset

Putting aside all the nightmares of uber-deep fakes and my little story here, the Stanford team contends that no matter how high-resolution stereoscopic LED technology is, it's still flat. The human brain, they say, will always be able to distinguish 3D represented on a flat display from actual reality.

As real as it might seem, there's still an uncanny valley that lets the brain sense distortions.

But holography bends light the same way physical objects do. The Stanford scientists contend that they can build holographic displays that produce 3D objects that are every bit as dimensional as real objects. By doing so, they'll pass their visual Turing Test.

"A ocular Turing Test past means, ideally, 1 cannot separate betwixt a physical, existent point arsenic seen done nan glasses and a digitally created image being projected connected nan show surface," says Suyeon Choi, a postdoctoral clever clever successful Wetzstein's laboratory and first writer of nan paper.

Also: Meta just launched a $400 Xbox Edition Quest 3S headset - and it's full of surprises

I'm not sure about this. Yes, I accept the idea that they will be capable of producing eyewear that bends light to replicate reality. But I wear glasses. There's always a periphery outside the edge of my glasses that I can see and sense.

Unless they create headsets that block that peripheral vision, they won't be able to truly emulate reality. It's probably doable. The Meta Quest 3 and the Vision Pro both wrap around the eyes. But if Stanford's goal is to make holographic glasses that feel like normal glasses, then peripheral vision could complicate matters.

In any case, let's talk about how far they've come in a year.

That was then, this is now

Let's start by defining the technical term "étendue." According to Dictionnaires Le Robert, translated into English by The Goog, étendue is the "property of bodies to be situated in space and to occupy part of it."

Vision scientists use it to combine two characteristics of a visual experience: the field of view (or how wide an image appears) and the eyebox (the area in which a pupil can move and still see the full image).

Image: Nature Photonics

A large étendue would both provide a wide field of view and allow the eye to move around enough for real life while still seeing the generated image.
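To give a rough sense of why étendue is the hard constraint, here's a back-of-the-envelope sketch. This is a simplified model I'm using for illustration, not the paper's math, and the eyebox dimensions are hypothetical: treat étendue as eyebox area times the solid angle of the field of view.

```python
# Simplified étendue estimate (illustrative model and made-up eyebox size,
# not figures from the Stanford paper).
import math

def solid_angle_sr(fov_h_deg: float, fov_v_deg: float) -> float:
    """Approximate solid angle of a rectangular field of view, in steradians."""
    return math.radians(fov_h_deg) * math.radians(fov_v_deg)

def etendue_mm2_sr(eyebox_w_mm: float, eyebox_h_mm: float,
                   fov_h_deg: float, fov_v_deg: float) -> float:
    # Eyebox area times FOV solid angle: both must be large at the same time.
    return (eyebox_w_mm * eyebox_h_mm) * solid_angle_sr(fov_h_deg, fov_v_deg)

# Hypothetical 8 x 8 mm eyebox at the prototype's reported FOV...
print(etendue_mm2_sr(8, 8, 34.2, 20.2))   # ~13.5 mm^2*sr
# ...versus the same eyebox at something closer to human vision.
print(etendue_mm2_sr(8, 8, 140, 120))     # ~328 mm^2*sr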

Since we reported on the project in 2024, the Stanford team has increased the field of view (FOV) from 11 degrees to 34.2 degrees horizontally and 20.2 degrees vertically.

This is still a far cry from the Quest 3's 110-degree horizontal and 96-degree vertical FOV, or even the estimated 100-degree FOV of the Vision Pro. Of course, human eyes each have a field of view of about 140 degrees and, when combined, give us vision of about 200 degrees.
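If you want a crude sense of scale, here's a quick comparison of how much of the human visual field each device covers. The rectangular-FOV approximation and the roughly 130-degree vertical figure for human vision are my assumptions, so treat the ratios as rough orders of magnitude.

```python
# Crude area comparison of device FOV versus human binocular vision.
# Rectangular approximation and ~130-degree vertical figure are assumptions.
HUMAN_FOV = (200, 130)   # horizontal x vertical, degrees (approximate)

devices = {
    "Stanford prototype (2025)": (34.2, 20.2),
    "Meta Quest 3": (110, 96),
    "Apple Vision Pro (est.)": (100, 100),  # vertical FOV is an assumption
}

human_area = HUMAN_FOV[0] * HUMAN_FOV[1]
for name, (h, v) in devices.items():
    print(f"{name}: {100 * h * v / human_area:.1f}% of the human field (by area)")
```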

Also: This AR headset is changing how surgeons see inside their patients

This year, the team developed a custom-designed angle-encoded holographic waveguide. Instead of the surface relief gratings (SRGs) used in 2024, the new prototype's couplers are constructed of volume Bragg gratings (VBGs). VBGs prevent the "world-side light leakage" and optical noise that can degrade contrast in previous designs, and they also suppress stray light and ghost images.

Both SRGs and VBGs are used to control how light bends or splits. SRGs function via a tiny pattern etched on the surface of a material -- light bounces off that surface. VBGs embed changes within the material and reflect or filter light based on how that internal structure interacts with light waves. VBGs essentially provide more control over light movement.
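Both behaviors follow from simple, well-known relations: a surface grating obeys the grating equation, while a volume grating only responds strongly near its Bragg condition. The sketch below uses those textbook formulas with made-up grating dimensions, not numbers from the Stanford prototype.

```python
# Textbook grating relations with illustrative (not Stanford) parameters.
import math

def srg_diffraction_angle_deg(wavelength_nm: float, period_nm: float,
                              incidence_deg: float = 0.0, order: int = 1) -> float:
    """Surface grating equation: sin(theta_m) = sin(theta_i) + m * lambda / period."""
    s = math.sin(math.radians(incidence_deg)) + order * wavelength_nm / period_nm
    if abs(s) > 1:
        raise ValueError("This diffraction order is evanescent (doesn't propagate).")
    return math.degrees(math.asin(s))

def vbg_bragg_wavelength_nm(period_nm: float, n: float, bragg_angle_deg: float,
                            order: int = 1) -> float:
    """Bragg condition: m * lambda = 2 * n * period * sin(theta_B),
    with theta_B measured from the grating planes."""
    return 2 * n * period_nm * math.sin(math.radians(bragg_angle_deg)) / order

# A 600 nm-period surface grating bends normally incident green (532 nm) light
# by roughly 62 degrees in a single order -- steep enough to trap light in a
# thin slab by total internal reflection.
print(srg_diffraction_angle_deg(532, 600))
# A volume grating with an assumed 180 nm plane spacing in n = 1.5 material is
# Bragg-matched near 532 nm at 80 degrees to the planes; off-Bragg light mostly
# passes straight through, which is one reason stray light and ghosts drop.
print(vbg_bragg_wavelength_nm(period_nm=180, n=1.5, bragg_angle_deg=80))
```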

Another key component of the newest prototype is the MEMS (micro-electromechanical system) mirror. This mirror is integrated into the illumination module along with a collimated fiber-coupled laser and the holographic waveguide we discussed above. It is another tool for steering light, in this case the illumination angles incident on the spatial light modulator (SLM).

This, in turn, creates what the team calls a "synthetic aperture," which has the benefit of expanding the eyebox. Recall that the bigger the eyebox, the more a user's eye can move while using the mixed-reality system.
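Here's a conceptual sketch of why steering the illumination angle grows the eyebox: each illumination angle lands the exit pupil at a slightly different spot, and time-multiplexing several angles tiles those pupils into a larger synthetic one. The geometry and the numbers below are my own simplification, not the paper's model.

```python
# Conceptual eyebox-expansion sketch (my simplified geometry, not Stanford's).
# Pupil shift is approximated as eye-relief distance * tan(steering angle).
import math

def synthetic_eyebox_mm(base_eyebox_mm: float, eye_relief_mm: float,
                        scan_half_angle_deg: float) -> float:
    shift = eye_relief_mm * math.tan(math.radians(scan_half_angle_deg))
    return base_eyebox_mm + 2 * shift  # pupils tiled on both sides of center

# e.g., a 3 mm static eyebox, 15 mm eye relief, +/- 5 degrees of MEMS steering
print(synthetic_eyebox_mm(3.0, 15.0, 5.0))  # ~5.6 mm effective eyebox
```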

Also: HP just turned Google Beam's hologram calls into reality - and you can buy it this year

AI continues to play a key role in the dynamic functionality of the display, compensating heavily for real-world conditions and helping to create a seamless perception of the blend of actual reality and constructed reality. AI optimizes the image quality and three-dimensionality of the holographic images.

Last year, the team did not specify the size of the prototype eyewear, except to say it was smaller than typical VR displays. This year, the team says they've achieved a "total optical stack thickness of less than 3 mm (panel to lens)." By contrast, the lenses on my everyday eyeglasses are about 2 mm thick.

"We want this to beryllium compact and lightweight for all-day use, basically. That's problem number one, nan biggest problem," Wetzstein said.

The thrill of the trilogy

The Stanford team describes these reports on their progress as a trilogy. Last year's paper was Volume One. This year, we're learning about their progress in Volume Two.

It's not clear how far away Volume Three is, which the team describes as real-world deployment. But with the improvements they've been making, I'm guessing we'll see some more progress (and perhaps Volumes Four and Five) sooner rather than later.

Also: I wore Google's XR glasses, and they already beat my Ray-Ban Meta in three ways

I'm not entirely sure that blending reality with holographic images to the point where you can't tell the difference is healthy. On the other hand, actual reality can be pretty disturbing, so constructing our own bubble of holographic reality might offer a respite (or a new pathology).

It's all just so very weird and ever so slightly creepy. But this is the world we live in.

What do you think about the idea of a "visual Turing Test"? Do you believe holographic displays could truly fool the brain into thinking digital imagery is real? Have you tried any of the current-gen mixed reality headsets like the Vision Pro or Quest 3? How immersive did they feel to you? Do you think Stanford's waveguide-based holographic approach could overcome the comfort and realism barriers holding back mainstream XR adoption? Let us know in the comments below.

Get the morning's top stories in your inbox each day with our Tech Today newsletter.


You can follow my day-to-day project updates on social media. Be sure to subscribe to my weekly update newsletter and follow me on Twitter/X at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, on Bluesky at @DavidGewirtz.com, and on YouTube at YouTube.com/DavidGewirtzTV.
