Future surgeons may use robotic nurse, 'gesture recognition'

Feb 03, 2011 by Emil Venere
Purdue industrial engineering graduate student Mithun Jacob uses a prototype robotic scrub nurse with graduate student Yu-Ting Li. Researchers are developing a system that recognizes hand gestures to control the robot or tell a computer to display medical images of the patient during an operation. Credit: Purdue University photo/Mark Simons

Surgeons of the future might use a system that recognizes hand gestures as commands to control a robotic scrub nurse or tell a computer to display medical images of the patient during an operation.

Both the hand-gesture recognition and robotic nurse innovations might help to reduce the length of surgeries and the potential for infection, said Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University.

The "vision-based hand gesture recognition" technology could have other applications, including the coordination of emergency response activities during disasters.

"It's a concept Tom Cruise demonstrated vividly in the film 'Minority Report,'" Wachs said.

Surgeons routinely need to review medical images and records during surgery, but stepping away from the operating table and touching a keyboard and mouse can delay the surgery and increase the risk of spreading infection-causing bacteria.

The new approach is a system that uses a camera and specialized algorithms to recognize hand gestures as commands to instruct a computer or robot.
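
In outline, such a pipeline takes a stream of camera frames, assigns each one a gesture label, and translates recognized labels into commands. The sketch below illustrates that loop; the gesture names, commands, and stand-in classifier are illustrative assumptions, not the Purdue system's actual vocabulary or code.

```python
# A minimal sketch of the basic loop such a system could run: take camera
# frames, classify each as a gesture (or nothing), and map recognized
# gestures to commands for the robot or the image display.

GESTURE_TO_COMMAND = {
    "swipe_left":  "previous_image",        # browse the medical image record
    "swipe_right": "next_image",
    "open_palm":   "hand_over_instrument",  # request to the robotic scrub nurse
}

def run(frames, classify):
    """`frames` yields camera images; `classify` returns a gesture label or None."""
    for frame in frames:
        gesture = classify(frame)
        command = GESTURE_TO_COMMAND.get(gesture)
        if command is not None:
            yield command

# Usage with a stand-in classifier that ignores the image content:
print(list(run(frames=[object(), object()], classify=lambda f: "swipe_right")))
# ['next_image', 'next_image']
```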

At the same time, a robotic scrub nurse represents a potential new tool that might improve operating-room efficiency, Wachs said.

Findings from the research will be detailed in a paper appearing in the February issue of Communications of the ACM, the flagship publication of the Association for Computing Machinery. The paper, featured on the journal's cover, was written by researchers at Purdue, the Naval Postgraduate School in Monterey, Calif., and Ben-Gurion University of the Negev, Israel.

Research into hand-gesture recognition began several years ago in work led by the Washington Hospital Center and Ben-Gurion University, where Wachs was a research fellow and doctoral student, respectively.

He is now working to extend the system's capabilities in research with Purdue's School of Veterinary Medicine and the Department of Speech, Language, and Hearing Sciences.

"One challenge will be to develop the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions," Wachs said. "You want to use intuitive and natural gestures for the surgeon, to express medical image navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use."

Other challenges include providing computers with the ability to understand the context in which gestures are made and to discriminate between intended gestures versus unintended gestures.

"Say the surgeon starts talking to another person in the operating room and makes conversational gestures," Wachs said. "You don't want the robot handing the surgeon a hemostat."

A scrub nurse assists the surgeon and hands the proper surgical instruments to the doctor when needed.

"While it will be very difficult using a robot to achieve the same level of performance as an experienced nurse who has been working with the same surgeon for years, often scrub nurses have had very limited experience with a particular surgeon, maximizing the chances for misunderstandings, delays and sometimes mistakes in the operating room," Wachs said. "In that case, a robotic scrub nurse could be better."

The Purdue researcher has developed a prototype robotic scrub nurse, in work with faculty in the university's School of Veterinary Medicine.

Researchers at other institutions developing robotic scrub nurses have focused on voice recognition. However, little work has been done in the area of gesture recognition, Wachs said.

"Another big difference between our focus and the others is that we are also working on prediction, to anticipate what images the surgeon will need to see next and what instruments will be needed," he said.

Wachs is developing advanced algorithms that isolate the hands and apply "anthropometry," or predicting the position of the hands based on knowledge of where the surgeon's head is. The tracking is achieved through a camera mounted over the screen used for visualization of images.
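
The idea is that human proportions constrain where the hands can plausibly appear relative to the head, so the search can be limited to a window below and beside the detected head. A minimal sketch of that kind of window follows; the scaling ratios are invented for illustration rather than taken from the Purdue algorithms.

```python
# A minimal sketch of the anthropometric prior: given the head position and
# apparent head size in the image, restrict the hand search to a region
# below and to either side of the head. The offsets are made-up ratios.

def hand_search_window(head_x, head_y, head_height, img_w, img_h):
    """Return (left, top, right, bottom) of a region likely to contain the hands."""
    top = head_y + 1.5 * head_height       # hands sit below the head
    bottom = head_y + 4.0 * head_height
    left = head_x - 2.5 * head_height      # roughly an arm's reach to each side
    right = head_x + 2.5 * head_height
    # Clamp to the image bounds.
    return (max(0, left), min(img_h, top), min(img_w, right), min(img_h, bottom))

print(hand_search_window(head_x=320, head_y=80, head_height=60, img_w=640, img_h=480))
# (170.0, 170.0, 470.0, 320.0)
```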

"Another contribution is that by tracking a surgical instrument inside the patient's body, we can predict the most likely area that the surgeon may want to inspect using the electronic image medical record, and therefore saving browsing time between the images," Wachs said. "This is done using a different sensor mounted over the surgical lights."

The hand-gesture recognition system uses Kinect, a camera developed by Microsoft that senses three-dimensional space. The same sensor is used with consumer video game consoles to track players' bodies and hands without a handheld controller.

"You just step into the operating room, and automatically your body is mapped in 3-D," he said.

Accuracy and gesture-recognition speed depend on advanced software algorithms.

"Even if you have the best camera, you have to know how to program the camera, how to use the images," Wachs said. "Otherwise, the system will work very slowly."

The research paper defines a set of requirements, including recommendations that the system should:

  • Use a small vocabulary of simple, easily recognizable gestures.
  • Not require the user to wear special virtual reality gloves or certain types of clothing.
  • Be as low-cost as possible.
  • Be responsive and able to keep up with the speed of a surgeon's hand gestures.
  • Let the user know whether it understands the hand gestures by providing feedback, perhaps just a simple "OK."
  • Use gestures that are easy for surgeons to learn, remember and carry out with little physical exertion.
  • Be highly accurate in recognizing gestures.
  • Use intuitive gestures, such as two fingers held apart to mimic a pair of scissors.
  • Be able to disregard unintended gestures by the surgeon, perhaps made in conversation with colleagues in the operating room.
  • Be able to quickly configure itself to work properly in different operating rooms, under various lighting conditions and other criteria.
"Eventually we also want to integrate voice recognition, but the biggest challenges are in ," Wachs said. "Much is already known about voice recognition."
