(PhysOrg.com) -- Researchers at the University at Buffalo and Amrita University in India have developed the framework for a smart environment that can track people's whereabouts without the use of invasive technologies such as constant filming or radio frequency identification (RFID) tags. The new tracking method could improve safety and security in nursing homes, hospitals and other closed spaces while providing occupants with freedom from continuous surveillance.
"Our goal is to develop systems that could enhance quality of life at homes and hospitals; productivity at the workplace; and security of critical spaces," said Bharat Jayaraman, a professor of computer science and engineering at UB and a principal investigator of the project. "We want technology to be natural and unobtrusive. We don't want you to carry around an RFID tag, and we don't want cameras everywhere. We want technology to be assistive, and not become Big Brother."
A peer-reviewed paper describing the new tracking method, "Three R's of Cyber-Physical Spaces," appears online in Computer, the flagship magazine of the IEEE Computer Society; the article will also run in a future print issue.
The research also will be presented next week at the "Indo-US Workshop on Developing a Research Agenda in Pervasive Communications and Computing Collaboration (PC3)," co-sponsored by the National Science Foundation.
Here's how the new system works: First, administrators place video cameras that capture a person's face, gait or height at entryways within a building, such as doors that separate one room from another. When a person passes through an access point, the camera registers his presence and feeds the information to a computer.
The computer then compares the individual's biometric characteristics against a database containing the biometrics of all building occupants. Because of variations in room lighting, camera angle, facial expression and other details, the computer can only make an initial guess about who an individual might be.
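The matching step described above can be sketched in a few lines of Python. This is an illustration, not the researchers' code: the feature vectors, occupant names and the softmax conversion are all assumptions made for the example. The point is that a noisy camera image yields a probability distribution over building occupants rather than a single definitive identity.

```python
import math

def match_probabilities(observed, database):
    """Return {occupant_id: probability} from feature-vector similarity.

    `observed` is a list of floats standing in for biometric features
    (face, gait, height); `database` maps occupant ids to their enrolled
    feature vectors. Similarity scores are converted to probabilities
    with a softmax, so an ambiguous image produces a spread-out
    distribution: an "initial guess" rather than a firm identification.
    """
    def similarity(a, b):
        # Negative Euclidean distance: closer vectors score higher.
        return -math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    scores = {pid: similarity(observed, feats) for pid, feats in database.items()}
    mx = max(scores.values())
    exp = {pid: math.exp(s - mx) for pid, s in scores.items()}
    total = sum(exp.values())
    return {pid: e / total for pid, e in exp.items()}

# Invented two-person database for illustration.
db = {"alice": [1.0, 2.0], "bob": [4.0, 1.0]}
probs = match_probabilities([1.1, 2.1], db)
# The observation is much closer to alice's enrolled features,
# so alice receives the larger share of the probability mass.
```

In a real system the feature vectors would come from face, gait or height recognizers and the similarity function would be far more sophisticated, but the output shape is the same: probabilities, not certainties.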
To increase the accuracy of the identification, the computer employs reasoning, judging whether it is possible for a person to be at a certain location based on his trajectory and the building's spatial layout.
For instance, because it would be impossible for a person identified in a hospital lobby to immediately move into a room in a distant wing of the building, the computer would deduce that no person in the lobby could also be moving around the far wing. This "spatio-temporal" reasoning helps to eliminate "false positives," as the system only identifies individuals with valid trajectories.
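The pruning logic in the hospital example can be sketched as a feasibility check over the building's room graph. This is a minimal illustration under invented assumptions (the room names, adjacency graph and 30-second walking time are made up); the idea is that a candidate identity is discarded when the person's last known location could not reach the new camera in the elapsed time.

```python
from collections import deque

# Assumed building layout: which rooms connect directly.
ADJACENT = {
    "lobby": ["hall"],
    "hall": ["lobby", "ward_a", "corridor"],
    "ward_a": ["hall"],
    "corridor": ["hall", "far_wing"],
    "far_wing": ["corridor"],
}
STEP_SECONDS = 30  # assumed minimum time to cross one adjacency

def min_travel_time(src, dst):
    """Shortest walking time between two rooms (BFS over the room graph)."""
    if src == dst:
        return 0
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        room, t = queue.popleft()
        for nxt in ADJACENT[room]:
            if nxt == dst:
                return t + STEP_SECONDS
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, t + STEP_SECONDS))
    return float("inf")

def feasible_candidates(candidates, last_seen, new_room, elapsed):
    """Keep only identities that could have reached `new_room` in time."""
    return [p for p in candidates
            if min_travel_time(last_seen[p], new_room) <= elapsed]

last_seen = {"alice": "lobby", "bob": "far_wing"}
# A face seen in ward_a 70 seconds after the last sightings: alice
# (two hops away, 60 s) remains a candidate, while bob (three hops
# away, 90 s) is ruled out as a false positive.
print(feasible_candidates(["alice", "bob"], last_seen, "ward_a", 70))
```

Combined with the probabilistic matching step, this kind of check lets a low-confidence recognition become a confident identification once impossible trajectories are eliminated.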
When researchers ran computer simulations of the tracking system, they were able to identify and trace the whereabouts of individuals with a high degree of accuracy, even when employing images from low-quality cameras as the means of identification.
The tracking solution that the three collaborators devised is elegant, combining recognition, reasoning and information retrieval (three areas of computer science that are studied heavily, but usually separately) within a unified framework known as a state-transition system.
In computer science, a state-transition system is a way of modeling dynamic environments by monitoring how specific changes alter the state of a given environment. In the case of the new tracking system, the "state" of a building is defined by the location of its occupants. (Specifically, a "state" consists of the set of probabilities describing where every occupant in a building might be at a given time.) A change in state, called a "state transition," takes place each time a person moves from one room to another.
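The state-transition model described above can be sketched concretely. In this illustration (the occupant and room names, and the simple confidence-weighted update rule, are assumptions, not the paper's actual algorithm), the state maps each occupant to a probability distribution over rooms, and a camera sighting triggers a transition to a new state.

```python
def transition(state, person, observed_room, confidence):
    """Return a new state after `person` is sighted at `observed_room`.

    `state` maps each occupant to {room: probability}. `confidence`
    (0..1) is the recognizer's belief in the sighting; the remaining
    probability mass stays on the previous distribution, so an
    uncertain sighting only partially shifts the person's location.
    """
    old = state[person]
    new_dist = {room: (1 - confidence) * p for room, p in old.items()}
    new_dist[observed_room] = new_dist.get(observed_room, 0.0) + confidence
    new_state = dict(state)
    new_state[person] = new_dist
    return new_state

# Initially, alice is known to be in the lobby with certainty.
state = {"alice": {"lobby": 1.0}}
# A camera at the hall door reports alice with 80% confidence.
state = transition(state, "alice", "hall", 0.8)
# alice: {"lobby": 0.2, "hall": 0.8}; the probabilities still sum to 1.
```

Each sighting thus produces a well-defined state transition, and the full history of transitions doubles as a searchable record of everyone's movements, which is where the information-retrieval component comes in.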
Besides Jayaraman, the team that developed the tracking system included Vivek Menon, an assistant professor of information systems at Amrita University in India, and Venu Govindaraju, a SUNY Distinguished Professor of computer science and engineering at UB. Menon was a visiting research scientist at UB's Center for Unified Biometrics and Sensors from 2007 to 2009. Govindaraju is director of that center, and Jayaraman is a member of the advisory board.