Researchers pushing boundaries of virtual reality

Feb 05, 2013

UT Dallas researchers are extending the borders of virtual reality, going beyond virtual spaces in which people can see and hear each other to an environment that adds the sense of touch.

The technology would make it possible for doctors, for example, to work with patients in other locations. When a patient pushes down on a device, a doctor's device in another location would also move down with the same force, as if the patient were physically pressing the doctor's hand.

Professors in the Erik Jonsson School of Engineering and Computer Science are creating a multimedia system that uses multiple 3-D cameras to create avatars of humans in two different places, and then puts them in the same virtual space, where they can interact.

In traditional telemedicine, a doctor and patient both appear on the same screen and are able to talk, but they are not in the same physical space.

"With in-home rehabilitation, doctors ask a patient if he or she has done their exercises, but the patient may not be doing them correctly," said Dr. Balakrishnan "Prabha" Prabhakaran, professor of computer science at UT Dallas and a principal investigator of a $2.4 million project funded by the National Science Foundation to create the system.

"It is one thing for a patient to say he or she did their exercises, but it is another to watch them in action, feel the force exerted, be able to correct them on the spot and get immediate response."

With large amounts of data, such as tracking images or movement, there could be significant lag time or delays in transmission. The grant funds creation of the algorithms and software needed to transmit the data through the internet in real time. There are four major areas of this system under research by experts in the Jonsson School.

Haptic Devices

Haptic devices are pieces of equipment with resistance motors that apply force, vibration or motion to the user to provide feedback. For example, touching a virtual stone with a haptic device would feel hard, while touching a virtual sponge would provide less feedback and feel more pliable.
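The stone-versus-sponge contrast comes down to the stiffness used in the device's force-feedback loop. As a rough illustration only (a hypothetical Hooke's-law spring model, not the project's actual controller):

```python
def feedback_force(penetration_m: float, stiffness_n_per_m: float) -> float:
    """Return the restoring force a haptic device applies when the user's
    probe has pushed `penetration_m` metres into a virtual surface.
    A simple spring model: stiffer surfaces push back harder."""
    if penetration_m <= 0:
        return 0.0  # probe is not touching the surface
    return stiffness_n_per_m * penetration_m

# A "stone" is modelled with high stiffness, a "sponge" with low stiffness.
stone_force = feedback_force(0.002, 30000.0)   # 2 mm into stone: strong push-back
sponge_force = feedback_force(0.002, 200.0)    # 2 mm into sponge: gentle push-back
```

In a real device this force computation runs hundreds or thousands of times per second, which is why streaming the resulting data between two sites in real time is demanding.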

If both doctor and patient have haptic devices in their physical environments, the applied force can be sent to the other person. A doctor could feel the strength of a patient's muscle, for example.

"Each device sends lots of data and combining that information in real time is a big challenge," Prabhakaran said.

Prabhakaran has expertise in multimedia systems and using haptic devices in real time.

Teleoperation and Control

Anyone who has used a service such as Skype has likely experienced a delay in communication – suddenly words get lost or are slow to transmit. A similar effect could happen with haptic devices.

"We absolutely do not want instability," Prabhakaran said.

Dr. Mark W. Spong, dean of the Jonsson School and holder of the Lars Magnus Ericsson Chair in Electrical Engineering and the Excellence in Education Chair, is a leading researcher in control and teleoperation – operating machines at a distance. He is developing techniques to eliminate instability in communicating the data from the haptic devices over the network.
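One classical way to keep a delayed force-reflecting link stable is the wave-variable (scattering) transformation from earlier passivity-based teleoperation work; this is a minimal sketch of that general idea, assuming a channel impedance parameter b, and is not necessarily the specific technique being developed here:

```python
import math

def to_wave(force: float, velocity: float, b: float) -> tuple[float, float]:
    """Encode force/velocity into a pair of wave variables before sending
    them over a delayed channel. Transmitting waves instead of raw
    force/velocity keeps the channel passive, so the delay itself cannot
    inject energy and destabilise the two haptic devices."""
    u = (force + b * velocity) / math.sqrt(2 * b)  # outgoing wave
    w = (force - b * velocity) / math.sqrt(2 * b)  # returning wave
    return u, w

def from_wave(u: float, w: float, b: float) -> tuple[float, float]:
    """Recover force and velocity from the wave pair at the far end."""
    force = math.sqrt(b / 2) * (u + w)
    velocity = (u - w) / math.sqrt(2 * b)
    return force, velocity
```

The transformation is exactly invertible: whatever force and velocity go in on one side come back out on the other, delayed but without the energy growth that causes instability.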

3-D Data Compression

To minimize the amount of data that needs to be exchanged, sophisticated algorithms need to be created. That's where Dr. Xiaohu Guo, associate professor of computer science at UT Dallas and a project co-principal investigator, comes in. He's an expert in computer graphics, animation and modeling.

Guo is refining techniques not only to transmit the data between haptic devices over the network more efficiently, but also to create 3-D visual images of the original movements in real time.

"We do not only want the person to be moving the device, we want them to have a visual feel of what the movement is causing," Prabhakaran said.

Guo has had success transforming large amounts of data using what is known as spectral transformation techniques. These techniques rely on manifold harmonics to first transform 3-D images into points that represent the surface of an object. The data is then compressed into a smaller form that can be sent faster over networks.
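The general shape of such a spectral scheme can be sketched in a few lines: project the surface points onto the lowest-frequency eigenvectors of a mesh Laplacian (the "manifold harmonics") and keep only those few coefficients. This is a hypothetical NumPy illustration of the idea, not the project's actual algorithm; the toy path-graph Laplacian stands in for a real 3-D mesh:

```python
import numpy as np

def spectral_compress(points, laplacian, k):
    """Project n 3-D surface points onto the k lowest-frequency
    eigenvectors of the mesh Laplacian and keep only those k
    coefficients per coordinate -- a lossy but compact encoding."""
    _, vecs = np.linalg.eigh(laplacian)  # eigenvectors, low frequency first
    basis = vecs[:, :k]                  # n x k truncated harmonic basis
    coeffs = basis.T @ points            # k x 3 spectral coefficients
    return coeffs, basis

def spectral_decompress(coeffs, basis):
    """Reconstruct an approximation of the surface from the coefficients."""
    return basis @ coeffs

# Toy example: 4 points connected in a chain; Laplacian is degree - adjacency.
L = np.array([[ 1., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])
pts = np.random.rand(4, 3)
coeffs, basis = spectral_compress(pts, L, k=2)  # 2 coefficient rows instead of 4 points
approx = spectral_decompress(coeffs, basis)
```

Dropping the high-frequency coefficients discards fine surface detail first, which is why the compressed form can be much smaller yet still look right after reconstruction.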

Body Sensors

People using this platform would use body sensors similar to those installed in smartphones that can tell whether the user is looking at the device in portrait or landscape views.

"If we put body sensors on the patients, then his or her movements can be tracked with high accuracy," Prabhakaran said. "The advantage of the sensor is the data that is generated is only a few bytes large, so it is easily transmitted over the network.

"You need a 3-D model to provide visual perspective, but if you are dealing with a lousy network and can not have consistent visual perspective, the body sensors could provide that information."

Dr. Roozbeh Jafari, assistant professor of electrical engineering at UT Dallas and a co-principal investigator of the project, is an expert in cyber-physical systems. He has built wearable computers for monitoring different aspects of human health, behavior and thought, and is developing sensors for this project.

Researchers at the University of California, Berkeley and the University of Illinois at Urbana-Champaign are working on other aspects of the system, such as refining the overall user experience and coordination of the cameras used to visually capture the movements and interactions. Rehabilitation specialists at the Dallas VA Medical Center will test the system on patients.

While the main goal of the research, which is about halfway complete, is telemedicine, other applications include dance instruction or any type of education in which people need to be in the same space, Prabhakaran said.
