Magic Leap today revealed a mixed reality headset that it believes reinvents the way people will interact with computers and reality. Unlike the opaque diver’s masks of virtual reality – which replace the real world with a virtual one – Magic Leap’s device, called Lightwear, resembles goggles, which you can see through as if you’re wearing a special pair of glasses. The goggles are tethered to a powerful pocket-sized computer, called the Lightpack, and can inject lifelike, moving, reactive people, robots, spaceships – anything – into a person’s view of the real world.
Where the 1830s-era technology uses two flat images, virtual reality essentially uses two screens. Abovitz thought there had to be a better way. He was uninterested in improving virtual reality; instead, he sought a better way to create images that could be placed into a person’s view of the real world. In short, he was interested in mixed reality.
The first was something called the analog light field signal. The light field is essentially all of the light bouncing off all of the objects in a world. When you take a picture, you’re capturing a very thin slice of that light field. The eye, however, sees much more of that light field, allowing a person to perceive depth, movement and a lot of other visual subtleties. The other thing Abovitz wanted to figure out was how that light field signal makes its way into the brain, through the eye and into the visual cortex. “The world you perceive is actually built in your visual cortex,” he says. “The idea is that your visual cortex and a good part of the brain is like a rendering engine, and that the world you see outside is being rendered by roughly 100 trillion neural connections.”
The technology didn’t need to capture the entirety of the light field and recreate it; it just needed to grab the right bits of that light field and feed them to the visual cortex through the eye. Abovitz calls it a systems-engineering view of the brain. “Our thought was, if we could figure out this signal and/or approximate it, maybe it would be really cool to encode that into a wafer,” he says. “That we could make a small wafer that could emit the digital light field signal back through the front again. That was the key idea.”
Suddenly, Abovitz went from trying to solve the problem to needing to engineer the solution. He was sure that if they could create a chip that delivered the right parts of a light field to the brain, they could trick it into thinking it was seeing real things that weren’t there. The realization meant they were trying to get rid of the display and just use what humans already have. “There were two core zen ideas: the no-display-is-the-best-display and what’s-outside-is-actually-inside. And they turned out to be, at least from what we’ve seen so far, completely true. Everything you think is outside of you is completely rendered internally by you, co-created by you plus the analog light field signal.”
The light field photonics, which can line up a fake reality with your natural-light real one, may be the most obvious of the innovations on display, but there’s much more. The visual perception system actively tracks the world you’re moving through, noting things like flat surfaces, walls and objects. The result is a headset that sees what you see, and that can then have its creations behave appropriately, whether that means hanging a mixed reality monitor next to your real one, or making sure the floating fish in your living room don’t drift through a couch. That room mapping is also used to keep track of the things you place in your world, so they’re there waiting for you when you come back. Line up six monitors above your desk and go to sleep; the next day, they’ll be exactly where you left them.