How can we make our virtual worlds more physical?

There are two main challenges for tech developers seeking to drag immersive virtual reality environments out of the realms of science fiction.

The first is the ability for users to interact with the environment – to walk, run, jump and crouch, pick up, drop and throw objects, and handle them as though they had physical properties.

The second, almost as important, is feedback: users must be able to feel the environment if they are to believe they inhabit the worlds they engage with.

There have been huge strides in the development of core VR technologies in recent years, and now the race is on to create supplementary gear to enhance the experience.

Controllers are becoming more sophisticated: Valve’s new prototype for Steam VR lets users pick up and throw objects, as well as manipulate them.

The main problem for developers trying to mimic locomotion within VR is space. Room scale is the current aspirational standard, and the ideal would clearly be complete freedom of movement, but this is far from practical.

Aside from the safety aspects, it would be practically impossible to make the real world simulate what users see through their headsets, most of which are currently designed with standing still or sitting in mind.

Roto VR has developed a motorised chair which it believes could tackle the problem. It allows users to walk, run and jump using touch pedals, tracks head movement and rotates the chair to match, and can accommodate a range of peripherals such as racing pedals, and a table for steering wheels or flight sticks.

This appears to solve a range of issues, including rooting physical movement to the spot. Virtuix has come up with an alternative solution, the Omni, a rig and harness allowing users to walk or run, forwards or backwards, and even strafe. It also decouples body and head, allowing viewing and movement in separate directions.

But both options are rather costly, which is far from ideal for a pastime that already demands a heavy outlay. Both pieces of kit are also too bulky for most domestic settings.

Some indie developers are working on much lower-tech solutions to the problem. Ryan Sullivan (YouTube handle deprecatedcoder) has been working on a static version using HTC Vive controllers held at waist height coupled with running on the spot to move. Others are working on methods to solve the issue of not having a free hand for interaction.
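The run-on-the-spot idea can be sketched very simply: estimate running intensity from the vertical bob of the waist-held controllers and convert it into forward speed. A minimal illustration follows; the function name, thresholds and scaling are hypothetical, not taken from Sullivan’s project:

```python
# Hypothetical sketch of run-on-the-spot locomotion: the vertical
# bob of waist-held controllers is mapped to a forward speed.
# All thresholds and gains are invented for illustration.

def forward_speed(controller_heights_m, threshold_m=0.03, speed_scale=30.0):
    """Map the peak-to-peak vertical bob of recent controller-height
    samples (metres) to a forward speed in m/s; below the threshold,
    the user is treated as standing still."""
    bob = max(controller_heights_m) - min(controller_heights_m)
    if bob < threshold_m:
        return 0.0
    return min(5.0, speed_scale * bob)  # cap at a sprinting pace

standing = forward_speed([1.00, 1.01, 1.00, 1.01])  # 1 cm bob -> 0.0
jogging = forward_speed([0.95, 1.05, 0.96, 1.04])   # 10 cm bob -> ~3 m/s
print(standing, jogging)
```

In a real implementation the samples would come from the headset runtime’s tracking data each frame, smoothed over a short window to avoid jitter.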

Even if developers crack the issue of movement and interaction, VR is likely to continue feeling somewhat sterile unless users are able to gain some level of sensory feedback.

Microsoft Research is working on two devices called NormalTouch and TextureTouch which could offer the ability to feel shapes and the basic texture of objects. Current technology clearly lacks finesse and would need a great deal of work to become at all ergonomic.

An elegant solution to the problem of physical resistance has yet to be found. Several ideas have been put forward for VR gloves, allowing control and feedback, including the Dexta Robotics Dexmo exoskeleton glove.

The company’s demo video explains that the glove “physically pulls back your fingers to fit the shape of the virtual objects, and it dynamically changes the force applied to simulate their stiffness”.

This, it continues, means “you can not only feel the physical presence of the object, but also tell the difference between a virtual stone and a virtual rubber duck by just squeezing them”.
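The stiffness simulation the video describes can be illustrated with a simple penalty-based haptic model, in which resistive force grows linearly with how far the fingers squeeze past an object’s surface. This is only a sketch assuming a basic Hooke’s-law spring; the stiffness values are invented for illustration, not taken from Dexmo:

```python
# Penalty-based haptic sketch: resistive force rises linearly with
# how far a grip compresses a virtual object (Hooke's law, F = k*x).
# Stiffness values are illustrative, not from any real device.

def resistive_force(stiffness_n_per_m, compression_m):
    """Spring force in newtons; zero before the fingers reach the surface."""
    return max(0.0, stiffness_n_per_m * compression_m)

# A virtual stone is far stiffer than a virtual rubber duck, so the
# same 1 cm squeeze produces a very different force at the fingers.
stone_force = resistive_force(stiffness_n_per_m=5000.0, compression_m=0.01)
duck_force = resistive_force(stiffness_n_per_m=200.0, compression_m=0.01)
print(stone_force, duck_force)
```

The large difference in force for the same squeeze is what would let a user tell stone from rubber by feel alone.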

But how about the rest of the body? Perhaps the Synesthesia Suit or Teslasuit have the answer. Synesthesia uses full-body haptics which rely on vibration to simulate sensations and contact with objects. Teslasuit approaches the problem in a similar way using electro-muscular stimulation.

All of these solutions have one thing in common: they affect, or are affected by, the body. Perhaps there’s another way. Scientists from Italy and Japan recently collaborated on a study which allowed paralysed subjects to control a robot using an EEG cap which read electrical activity in their brains.

The subjects at the University of Rome could direct the robot in a lab almost 9,000 kilometres away in Tsukuba to pick up a drink, move across a room and put the drink on a table.

With this kind of cerebral manipulation under development, who knows what might be possible in the future of virtual reality?

