A team of European researchers has "virtually" teleported real objects through cyberspace, touched things in virtual reality and even felt the movements of a virtual dance partner.
It sounds like science fiction, but advances in haptic technology and a new approach to generating virtual reality (VR) content are helping to create virtual experiences that are far more realistic and immersive than anything achieved before.
Not only do users see and hear their virtual surroundings, objects and avatars, but they can touch them as well, paving the way for new applications in telepresence, telemedicine, industrial design, gaming and entertainment.
“The audiovisual aspects of VR have come a long way in recent years, so adding a sense of touch is the next step,” says Andreas Schweinberger, a researcher at Technische Universität München in Germany. “We know that the more senses that can be used, the more interaction, the greater the sense of presence. And a stronger sense of presence means the experience is more immersive and realistic.”
Schweinberger led a team from nine universities and research institutes in developing technology to make VR objects and characters touchable. With funding from the EU in the Immersence project, they developed innovative haptic and multi-modal interfaces, new signal processing techniques and a pioneering method to generate VR objects from real-world objects in real time.
The latter technology, developed at the Computer Vision Laboratory of Swiss project partner ETH Zürich, uses a 3D scanner and advanced modelling system to create a virtual representation of a real object, such as a cup, box or, in one experiment, a green fluffy toy frog. The 3D digital representation of the object can then be transmitted to someone at a remote location, who, by wearing VR goggles and touching a haptic interface, can move, prod and poke it.
“Haptic technology is still in the early stages. For the haptic interface, we used a robotic arm called a PHANTOM that has one contact point. This gives the sense of touching an object, but you can’t pick it up or handle it. However, one of the other project partners, the Universidad Politécnica de Madrid, is developing a haptic device with two contact points that should make it possible to grasp an object with a virtual hand,” Schweinberger explains.
The researchers also worked on techniques that would allow a user to feel different textures and sense the stiffness of an object, enabling them to differentiate between a hard box, a soft fluffy frog and even a liquid.
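The article does not describe the project's algorithms, but a common way to render stiffness at a single contact point, like the PHANTOM's, is a penalty-based spring-damper model: the harder the virtual material, the larger the restoring force per unit of penetration. The sketch below is purely illustrative, with hypothetical material parameters; a liquid is approximated as pure viscous drag with no surface spring.

```python
# Illustrative sketch (not Immersence project code): penalty-based
# haptic rendering for one contact point in 1-D, surface at depth 0.
# Stiff materials get a high spring constant k, soft ones a low k,
# and a liquid is modelled with damping only (no restoring surface).

def contact_force(penetration, velocity, k, b):
    """Spring-damper force pushing the device tip back out of the object.

    penetration: how far the tip is inside the surface (m), <= 0 if outside
    velocity:    rate of penetration (m/s)
    k, b:        spring stiffness (N/m) and damping (N*s/m)
    """
    if penetration <= 0.0:                    # tip is outside the object
        return 0.0
    return k * penetration + b * velocity     # Hooke spring plus damping

# Hypothetical material parameters for the objects mentioned in the text:
hard_box    = dict(k=2000.0, b=5.0)   # barely yields: feels rigid
fluffy_frog = dict(k=150.0,  b=2.0)   # sinks under the finger: feels soft
liquid      = dict(k=0.0,    b=8.0)   # only viscous drag, no surface at all

for name, mat in [("box", hard_box), ("frog", fluffy_frog), ("liquid", liquid)]:
    f = contact_force(penetration=0.01, velocity=0.1, **mat)
    print(f"{name}: {f:.2f} N")
```

In a real haptic loop this force would be recomputed and sent to the device at around 1 kHz, which is why stiffness values are bounded by the update rate and the device's mechanics.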
Would you care to dance?
The Immersence researchers did not stop at human-object interaction, however. Technische Universität München also developed technology to enable human-human interaction in a virtual environment.
At the lab in Munich, they used a mobile robotic platform with two arms to serve as the dance partner for a real human dancer. By wearing VR goggles, the user would see a dancer of the opposite sex and could dance with them by holding the “hands” of the robot.
“To program the robot we first recorded the forces, balance and movement of a real human dancer and applied these to the robot. In a VR environment, the robot could be a computer-controlled agent or the avatar of another person,” the project manager says.
French partner Université d’Evry went one step further and studied how to give two people the sensation of jointly handling an object, such as lifting a heavy box, entirely in a virtual environment.
“It is not as simple as one person taking the lead and the other following. In reality, it is a negotiation process and the robotic interface has to be programmed for that,” Schweinberger notes.
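To make the negotiation idea concrete, here is a minimal sketch of the coupled dynamics, with every name and number a hypothetical assumption rather than the project's actual controller: both partners' lift forces act on the shared box, and the share of the load each partner feels back through the haptic interface shifts continuously with how hard each is currently lifting, instead of one fixed leader carrying everything.

```python
# Illustrative sketch (assumed values, not the Immersence controllers):
# two partners jointly lift a 2 kg virtual box with a simple Euler
# integration of the shared dynamics.

MASS, G, DT = 2.0, 9.81, 0.001   # kg, m/s^2, timestep in s (hypothetical)

def step(height, velocity, f_a, f_b):
    """One physics step: the box responds to the sum of both lift forces."""
    net = f_a + f_b - MASS * G            # combined lift minus gravity
    velocity += (net / MASS) * DT
    height += velocity * DT
    return height, velocity

def load_shares(f_a, f_b):
    """Each partner's haptic feedback is their current share of the load."""
    total = f_a + f_b
    if total <= 0.0:
        return 0.5, 0.5                   # nobody lifting: split evenly
    return f_a / total, f_b / total

# Partner A lifts harder than B, so A ends up carrying more of the weight:
h, v = 0.0, 0.0
for _ in range(1000):                     # one simulated second
    h, v = step(h, v, f_a=15.0, f_b=10.0)
share_a, share_b = load_shares(15.0, 10.0)
print(f"height after 1 s: {h:.3f} m, load split {share_a:.0%}/{share_b:.0%}")
```

The negotiation the researcher describes lives in how these shares are renegotiated every step as the partners push against each other; a real system would also model grip, friction and torque rather than a single vertical force.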
Gamers will obviously be delighted by the developments, which promise to bring a whole new dimension and realism to VR environments. Besides entertainment, however, there are many serious applications for haptic VR technology. Doctors, for example, could use it to treat patients remotely, physiotherapists could use it for training and rehabilitation, and industrial designers could collaborate remotely by virtually “teleporting” touchable digital mock-ups of designs over the internet.
“The research will also help in the development of cognitive robots that are better able to interact with humans,” notes Schweinberger, whose team is continuing research on that aspect of the Immersence project, which received funding under the FET-proactive strand of the EU’s Sixth Framework Programme. ETH Zürich, meanwhile, is set to continue developing its virtual teleporter.
Several of the project partners are also continuing their work in the EU-funded BEAMING project, where they plan to develop a virtual reality room in which several mobile robots move around autonomously and simulated objects, such as a table, chair or door, can be experienced by a user immersed in a virtual world.
Star Trek’s Holodeck, it seems, may only be a few years away.
More information: Immersence project - www.immersence.info/