In virtual and augmented realities, computers adapt to humans


‘The world itself becomes the interface’ as people exercise more intuitive controls

See the original article from CBC.ca on October 17, 2016

Imagine flying through the sky like an eagle, or buying the clothes displayed in a store window without ever walking inside.

Like a scene out of The Matrix or Minority Report, virtual reality and augmented reality are poised to revolutionize the way we engage with media, transporting us to new worlds and layering digital information over our real environments.

But it’s not just the content that is revolutionary — so is the way we connect with it.

Up to now, we have engaged with digital devices through screens, whether on computers, televisions, smartphones or tablets. Though VR and AR are very different forms, the one thing they have in common is that both are drastic departures from the flat screens we’re accustomed to.

Putting on a headset immerses VR users in new surroundings, real or imaginary. AR, on the other hand, enhances the user’s physical environment by overlaying digital information on the immediate surroundings.

While one transports us to a new world and the other enhances the world we’re in, both fundamentally change the way we interact with technology.

By forgoing the traditional flat screen, “the world itself becomes the interface,” says Keram Malicki-Sanchez, executive director of VRTO Virtual & Augmented Reality World Conference & Expo.

Over the last decade, there has been an evolution of interaction as the chain of command between humans and machines has become shorter. In the early days of personal computers we relied on intermediary hardware. Desktop computing required that we use a mouse and a keyboard to interact with content on a screen.

With the rise of smart screens, the chain of command between user and device has become a step shorter, no longer requiring external hardware. Instead, through touch, taps and swipes, we can directly engage with the device.

More intuitive interaction

With the explosive growth of VR and AR, the next wave of human-computer interaction is set to be even more intuitive. Because we’re moving away from not only the extraneous hardware but also the flat screen, the way we interact in augmented and virtual reality environments is based on ambient cues and movement.

“With AR, the immediate surroundings will become the medium on which data and interactive elements are displayed,” says Craig Alguire, technical director and co-founder of Quantum Capture.

“This creates near-endless possibilities for dynamic content and applications which will respond to a user’s unique physical location.”

AR cannot rely on inputs designed to be easy for computers to understand. Instead, our devices must respond to the ways we naturally interact with each other and with our environments: through gestures, motion, eye movement and speech.

What’s been so impressive about VR is the way it transports us to new environments. We can be in the rubble of war-ravaged Aleppo, or near the North Pole, or in entirely manufactured fantasy worlds. How we interact with those worlds is the next challenge for developers.

Maxime Beland, the studio creative director for Ubisoft Toronto, says, “Right now, we are very close to perfecting sight and sound immersion in interactive experiences using VR. For me, the next big challenge will be to solve the sense of touch in a more natural way. Hand controllers are still a crutch to replace real feeling of grabbing, using and interacting with the virtual environment.”

Beland adds, “Often when we talk about immersion we think about graphics first, but for me the way the player expresses herself inside the game is even more critical than the way the game looks. The best games make you forget about the controls, about the way you interact with the game.”

Reacting to emotions

In addition to personalized experiences that respond to our gestures and our gaze, AR and VR open up possibilities for experiences that react to our emotions.

“It’s absolutely fascinating that we can create worlds that adapt to our moods,” says Loc Dao, the chief digital officer for the National Film Board, widely recognized as being an international leader in VR.

While it sounds like science fiction, products that facilitate this new era of intuitive interactions are making their way to market.

The HTC Vive headset uses infrared emitters to track movement, allowing it to read and interpret gestures. Oculus will begin shipping its Touch hand controllers in December to go along with its headset, letting VR users manipulate simulated environments with their hands.

The new PlayStation VR, which just launched, offers basic motion support, including head tracking and gesture tracking, in addition to controller input.

For anyone who’s ever wanted to fly, Ubisoft Montreal has created an experience called Eagle Flight, in which you embody an eagle, controlling your flight direction by turning your head.
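The head-steering idea is simple to sketch in code. The snippet below is a minimal, hypothetical illustration (not Ubisoft’s actual implementation): it maps the head’s yaw and pitch angles, as reported by a headset, to a unit direction vector that a flight simulation could use as the player’s heading.

```python
import math

def head_to_direction(yaw_deg, pitch_deg):
    """Map head orientation (yaw and pitch, in degrees) to a unit
    flight-direction vector: turn your head, change your course."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    # Spherical-to-Cartesian conversion: x points right, y points up,
    # and z points forward when yaw = pitch = 0.
    x = math.cos(pitch) * math.sin(yaw)
    y = math.sin(pitch)
    z = math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

# Looking straight ahead flies forward...
print(head_to_direction(0, 0))    # ≈ (0.0, 0.0, 1.0)
# ...while turning the head 90 degrees to the right banks right.
print(head_to_direction(90, 0))   # ≈ (1.0, 0.0, 0.0)
```

In a real engine the headset SDK would supply a full orientation quaternion each frame, but the principle is the same: the player’s own body motion, not a button press, becomes the control input.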

The move toward these intuitive interactions is a big shift in the power dynamic between humans and machines. Up to now, we have had to adjust our behaviour to interact with computers in ways that they understand. With AR and VR experiences, the roles are reversed, as computers are required to respond to the way humans naturally engage with objects and each other.

On one hand, this shift puts humans in control, but as Dao notes, it will also make our digital experiences more seamlessly integrated into our environments and lives than ever before.

Technology that responds to our eye movements, moods, and motions “will change our definition of what is real and virtual to the point where our reliance and addiction may border on cybernetic.”