
SEE THE FUTURE FOR YOURSELF.

I find it hard to recall all this without a touch of nostalgia: I saw the first iPhone just a few days after Apple’s official launch—ironically, in the hands of a Microsoft executive. We happened to be on a joint road show in Bangkok. Corporate loyalty didn’t allow the head of Microsoft Mobile to gush over a rival’s gadget. When I borrowed his iPhone, though, I couldn’t stop playing with it. Tapping. Pinching. Zooming… Now the “rubber band” scrolling feels perfectly natural—back then, it amounted to a miracle.

 

My excitement over multitouch technology was shared by millions of people. But even so, in 2007 it was hard to anticipate all the consequences of this invention.

 

Today, it goes without saying that in a measly ten years, the iPhone has turned the world upside down. The question is whether the world is about to turn again.

 

Believers say humanity is closing in on augmented reality. We already have the HoloLens. Google has just revamped Google Glass. And Apple’s ARKit aims to be a game-changer, potentially drawing thousands of developers into the fray. More level heads argue that there is still a lot of pretty tricky stuff to figure out. Being among the believers, I want to point out one particular problem that is often completely overlooked: the ability to project high-quality images close to the user.

 

It may come as a surprise, but no modern mainstream AR gadget can do that. They all project an image that looks and feels like it’s at least two meters away. And this image is always “flat,” with a fixed focal distance. If you focus your eyes on your hand, every real object two meters away goes out of focus—except the virtual projections. They stay in focus, and it looks weird. Compared to other problems we still have to solve, this one may seem insignificant. However small the distance/focal issue looks, though, it ruins everything.

Imagine a world filled with objects that always stay at a distance. You can’t get close to them. You can’t scrutinize them. You can’t naturally interact with them, because people usually interact with physical objects by touching them with their hands, and it’s a bit hard to find a human being with two-meter arms.

 

So, with such limitations, you simply can’t create virtual content that feels “real.” What’s more, you can’t truly embed it into our world.

 

[Image: avegant1]

A virtual screen can’t be projected onto a wall in your apartment. Your grandpa can’t be projected into an armchair in your living room so the two of you can have a natural conversation despite being a thousand miles apart. A human heart can’t be projected right onto a real one for the surgeon performing an operation.

 

The first-generation Google Glass languished because people saw it as little more than a device for taking photos and recording videos. Its other possibilities were overlooked for fear of “glassholes” secretly photographing every person they looked at—coworkers, shop assistants, even friends drinking beers with them. And a dumb screen floating two meters in front of you doesn’t impress people. Augmented reality fulfills its potential only when it becomes mixed reality: the virtual and real worlds need to be so interconnected that the user has to make a deliberate effort to tell them apart. Only then will endless new possibilities for communication, education, and work open up.

 

It will end the smartphone era the way multitouch ended the PC era. We’ll enter a screenless future, where everything is done through natural interaction—gestures or voice commands—while virtual objects are embedded into the world we once considered the only “real” one. The distance limitation won’t let that happen. The first iPhone was far from perfect—nothing like the polished devices we use now—but its multitouch technology, combined with a well-thought-out user interface, was a groundbreaking invention that started a new era. The next augmented reality device can be far from perfect too, but it will gain traction only if it puts virtuality right into the user’s hands.

 

Luckily, we might already have a solution. Avegant, a company from Belmont, California, has come up with a new technology that tackles this problem. It’s called Light Field, and it doesn’t have any distance limitations.

 

[Image: avegant2]

On top of that, Light Field can project a high-resolution image that feels completely natural. It uses what’s known as a multi-focal-plane approach to send digital imagery directly to your eyes in a way that replicates how our eyes naturally perceive things and shift focus. In the real world, if you look at an object close to you, everything far away becomes blurry, right? As I mentioned earlier, this doesn’t happen with virtual objects. But Light Field’s objects obey the rules of our reality, which makes them so realistic that you have to remind yourself they aren’t real. I couldn’t be more thrilled to see the arrival of a groundbreaking near-eye display technology that solves one of mixed reality’s greatest challenges: making virtual objects appear real at distances both near and far.
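To get a feel for why the focal plane matters, here is a back-of-the-envelope sketch. All the numbers and the function name are my own illustrative assumptions, not Avegant’s specifications: geometric optics says the angular blur of a point is roughly the pupil diameter times the defocus in diopters, i.e. `pupil * |1/d_light − 1/d_focus|`, where `d_light` is where the light actually comes from and `d_focus` is where the eye is accommodated.

```python
# Illustrative sketch (assumed numbers, not Avegant's): how much a point
# blurs when the eye focuses at one distance but the light actually
# originates at another.

PUPIL_M = 0.004  # ~4 mm pupil diameter, a typical indoor value (assumption)

def angular_blur_mrad(light_dist_m: float, focus_dist_m: float) -> float:
    """Approximate angular blur in milliradians for light originating at
    light_dist_m when the eye is accommodated to focus_dist_m."""
    defocus_diopters = abs(1.0 / light_dist_m - 1.0 / focus_dist_m)
    return PUPIL_M * defocus_diopters * 1000.0

# A virtual object "attached" to your hand at 0.3 m, eye focused on the hand:
fixed_plane = angular_blur_mrad(2.0, 0.3)  # light really comes from a 2 m plane
multi_focal = angular_blur_mrad(0.3, 0.3)  # light placed on a matching near plane

print(f"fixed 2 m focal plane: {fixed_plane:.1f} mrad of blur")  # noticeably soft
print(f"matched focal plane:   {multi_focal:.1f} mrad of blur")  # sharp
```

With a fixed two-meter focal plane, the “near” virtual object carries several diopters of defocus the moment you focus on your hand; placing the image on a focal plane that matches its apparent distance drives that blur to zero, which is exactly the mismatch a multi-focal-plane display removes.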

 

High-resolution projections with no pixelation and no screen-door effect. Focus shifting. Isn’t that what we need to finally get augmented reality off the ground and deliver on the promise to change the world—the promise this newborn industry has already been making for a few years?

 

[Image: avegant3]

Don’t get me wrong: the last thing I want is to oversell it (as VR aficionados have done many times). Light Field is still in the works, and it’s far from being a mass-market product. Like Google now does with its Glass, Avegant’s crew is aiming at B2B markets, where customers can tolerate a heavy price tag and a bulky headset—if the gadget makes a real difference in their work. So there’s still a long way to go before an iPhone-like augmented reality device arrives. But with this tech, you can foresee it being built soon—within three to five years.

 

Jeff Han presented an early version of multitouch technology at TED in 2006. Having seen it, one could argue that the arrival of a new-generation smartphone was inevitable. In reality, it happened only when the iPhone creators combined this invention with a new type of user interface.

 

Something similar has to happen now with Light Field technology. Again, it won’t take off unless we come up with a new type of interface. This time, the task is even more challenging, since this new interface has to be almost non-existent—or at least feel non-existent.

I believe we’re witnessing the dawn of ubiquitous computing—a time when we won’t interact with computers through gadgets but will simply use them everywhere. As Walt Mossberg put it, “Tech was once always in your way. Soon, it will be almost invisible.” There’s a catch with mixed-reality experiences, though: you can’t understand or believe in them just by reading about them—you have to experience them. Well, nothing is impossible. We at Colorfy are proud to be Avegant’s partner and part of the team working on Light Field, trying to figure out the best possible UI to help this technology fulfill its potential.

That’s why we’ve got a unique opportunity to build the only Light Field test site in Europe—in our office in Berlin.