Amalgamated Vision: The Second Coming of Virtual Retinal Displays?
In the world of headworn AR, there are several approaches, each a different combination of optics that channels light toward the eye and brings digital content into one's field of view. The first split is between passthrough AR (Apple Vision Pro) and see-through AR (Meta Orion).
Within see-through AR, there are several optical configurations, such as waveguides paired with LCoS microdisplays. Things branch further into display illumination (OLED, Micro-LED, etc.), each option carrying tradeoffs (e.g., OLED's low brightness), with no silver bullet having yet emerged.
Elsewhere in the mix is a sometimes-debated optical system: laser beam scanning (LBS). It involves lasers beamed into the eye to render digital images: the beams race back and forth in a raster pattern to draw the image directly, with no physical pixels, in an approach sometimes known as a virtual retinal display.
The benefits of this approach include sidestepping issues seen in waveguide-based AR, such as light inefficiency, low resolution, limited field of view, poor contrast, and vergence-accommodation conflict. The downside is a technology that, fairly or unfairly, has a reputation for being impractical.
Part of that stems from North's Focals, an application of LBS that required individual hardware fittings to get the display system calibrated just right, a process done at one of the company's two global locations. Meanwhile, AR optics thought leader Karl Guttag is a well-established LBS detractor.
Palmer Luckey has also publicly challenged the technology. As he asserts in this interview with Peter Diamandis, LBS’ promise of a large field of view requires a wide-angle light source. This, he argues, raises practical challenges in where you place the lens and how big it has to be.
Contrarian View
Synthesizing these LBS misgivings, Amalgamated Vision (AV) founder and CEO Adam Davis tells AR Insider that they're sometimes wrong, sometimes well-reasoned… but often outdated. On the latter point, he says that AV has addressed several of these issues in its latest patents and prototypes.
Before going into those updates, what is AV? Its AR light engine is built on the principles of laser beam scanning to achieve a virtual retinal display. It's a component maker rather than an AR device company, but it has built reference designs to demonstrate what the technology can do.
AV also takes a contrarian view of what AR's UX should be. Rather than graphics that fill one's field of view, it focuses on non-obtrusive, quick-reference information for use cases ranging from navigation to surgery. Davis himself is a former neuroradiology physician.
For that reason, AV's main product is a MEMS-based light engine that's about the size of a penny. It's meant to attach to glasses at the nose bridge for quick information when you need it. It can also be positioned above or below the eyes for look-down or look-up reference, depending on the need.
In each case, the display sits in a blind spot so it doesn't block vision, but it's there when you need it (ambient computing). The UI has an iPhone-like aspect ratio and is perceived as a line-of-sight overlay, but one that's much sharper and higher-contrast than AR glasses (see graphic below).
To do this, AV's light engine contains several parts, including a tiny mirror that oscillates rapidly to achieve the raster scanning that directs light through the lens and into the eye. Along the way, the light is diffused, collimated, and guided to the right place by MEMS components.
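For intuition on how that raster scanning works, here's a minimal sketch in Python of a biaxial MEMS mirror tracing a raster: a fast resonant horizontal axis paired with a slow linear vertical sweep. All parameters are hypothetical, chosen for illustration rather than drawn from AV's specs.

```python
import numpy as np

# Hypothetical scan parameters for illustration (not AV's actual specs).
FAST_HZ = 20_000   # fast-axis (horizontal) resonant frequency, Hz
FRAME_HZ = 60      # slow-axis (vertical) sweep rate, i.e., frames per second
HALF_X = 10.0      # fast-axis mechanical half-angle, degrees
HALF_Y = 7.5       # slow-axis mechanical half-angle, degrees

def mirror_angles(t):
    """Return (x, y) beam deflection in degrees at time t (seconds)."""
    # Fast axis runs at mechanical resonance, so its motion is sinusoidal.
    x = HALF_X * np.sin(2 * np.pi * FAST_HZ * t)
    # Slow axis is a quasi-static linear ramp: one top-to-bottom sweep per frame.
    frame_phase = (t * FRAME_HZ) % 1.0
    y = HALF_Y * (2 * frame_phase - 1)
    return x, y

# Sample one frame of the scan and summarize the raster it traces.
t = np.linspace(0, 1 / FRAME_HZ, 100_000)
x, y = mirror_angles(t)
print(f"Horizontal sweeps per frame: {FAST_HZ / FRAME_HZ:.0f}")
print(f"Scan extents: ±{x.max():.1f}° by ±{y.max():.1f}°")
```

Modulating the laser's intensity in sync with these angles is what paints the image, pixel-free, onto the retina.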
Directing & Diffusing
That brings us back to AV’s approach and how it addresses LBS misgivings. Davis says that AV has solved two overarching issues. The first is that LBS causes distortion when the field of view is enlarged. The second is Palmer Luckey’s point: the lens has to be impractically large.
Taking those one at a time, picture an oscillating mirror that directs laser light from a small point to a wider flat surface (lens) by scanning back and forth. Because the raster pattern originates from a central point, it has a harder time rendering images at the edges of the flat lens.
AV fixed this with a curved pancake lens: the concavity of the lens accommodates the extremities of the laser's oscillating raster pattern. The result is that the field of view can be enlarged without the distortion and chromatic aberration that would normally result.
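To see why a flat target distorts the edges, consider the geometry: a beam pivoting at a fixed point lands on a flat surface at a position proportional to tan(θ), which stretches faster than the scan angle itself, while a surface curved around the scan origin keeps position proportional to θ. Here's a quick first-order illustration in Python (generic geometry, not AV's actual lens prescription):

```python
import numpy as np

d = 10.0  # distance from the scan mirror to the target surface (arbitrary units)
angles = np.radians(np.linspace(0, 20, 5))  # scan angle from center to edge

flat = d * np.tan(angles)   # spot position on a flat surface
curved = d * angles         # arc position on a surface curved about the mirror

for deg, f, c in zip(np.degrees(angles), flat, curved):
    stretch = (f / c - 1) * 100 if c else 0.0
    print(f"{deg:4.0f}°  flat={f:5.2f}  curved={c:5.2f}  stretch={stretch:4.1f}%")
```

The stretch is negligible near the center but grows nonlinearly toward the edge (about 4% at a 20° scan angle in this toy setup), which is why widening the field of view on a flat lens magnifies the problem.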
As a side note, pancake lenses (seen in Meta Quest 2 and other hardware) cause light inefficiency (losing light), as do AR waveguide systems. But in AV's case, that's a welcome side effect because LBS emits plenty of light; in fact, it often requires reducing light intensity.
On to the second innovation: AV needed to achieve a greater exit pupil (the diameter of the light leaving the lens). So it added a diffuser to widen the beam the lens receives. The result is a larger exit pupil from the same small-diameter source, which addresses Palmer Luckey's concern.
The practical result is a forgiving 8mm eyebox: the range over which the eye can shift and still see the intended image (think: binoculars). A smaller eyebox offers less leeway to move around without losing the image, which is why North's Focals had to be finely tuned for each user.
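As a back-of-envelope illustration of how a diffuser grows the eyebox, you can model the exit pupil as the beam's width plus the spread of the diffusion cone by the time it reaches the eye. The numbers below are hypothetical, picked to land near the 8mm figure; they are not AV's published parameters.

```python
import math

# Hypothetical numbers for illustration only (not AV's published parameters).
beam_mm = 1.0      # laser beam diameter at the diffuser
cone_deg = 20.0    # full angle of the diffusion cone
path_mm = 20.0     # effective optical path from diffuser to the eye

def eyebox_mm(beam_mm, cone_deg, path_mm):
    """First-order eyebox estimate: beam width plus the cone's spread at the eye."""
    spread = 2 * path_mm * math.tan(math.radians(cone_deg / 2))
    return beam_mm + spread

print(f"Without diffuser: ~{eyebox_mm(beam_mm, 0.0, path_mm):.1f} mm eyebox")
print(f"With diffuser:    ~{eyebox_mm(beam_mm, cone_deg, path_mm):.1f} mm eyebox")
```

In this toy model, the bare millimeter-wide beam yields an eyebox so tight it would demand per-user fitting, while a modest diffusion cone expands it roughly eightfold.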
Right Time
Beyond the technical challenges of innovating around the laws of physics, AV has faced market challenges. One reason is that its approach, though promising, deviated from consensus: customers and investors had their hearts set on line-of-sight, waveguide-based AR.
One common sentiment from investors was “Why would Microsoft and Magic Leap and others spend so many billions on see-through AR if it wasn’t the right approach?” This was before Apple launched Vision Pro, but there was still ample momentum around see-through AR glasses.
But that consensus has eroded in recent years. The reason? It largely didn’t work. Considering Magic Leap’s market challenges and Microsoft’s abandonment of HoloLens (HoloLens 2 contains some LBS), waveguide-based see-through AR is anything but market-validated.
Meta last week captured the world’s imagination with Orion, but keep in mind that it’s a $10,000 prototype that’s several years from being market-ready. The point is that the consensus is now being shaken, and the industry is open to other paradigms.
That, combined with AV’s breakthroughs, means it could be the right time to reconsider LBS’ role in AR’s future. Lingering concerns will remain, not to mention the vested interests and entrenched opinions in ongoing AR optical-system debates. But in the end, the market will decide.
AV, meanwhile, has been able to sell the vision. It was recently awarded $2 million+ in SBIR funds to design a custom optical engine for the USAF and NASA. It should come out of that process with the velocity to keep developing toward the elusive promise of a virtual retinal display.
Originally published at https://arinsider.co on October 3, 2024.