(Phys.org) -- AndyVision is a robot that walks around a store taking inventory and making sure that customers find what they are looking for: two seemingly simple tasks that, for retailers, can add up to eliminating inefficiencies and lost sales. Consider, says the Carnegie Mellon robot design team that came up with AndyVision: you cannot find your favorite mustard (the store ran low and then out, and nobody noticed or re-ordered), or the shirt you want is misplaced among other racks (you give up looking and walk out).
Priya Narasimhan, a Carnegie Mellon professor who heads the Intel Science and Technology Center in Embedded Computing, recently demonstrated AndyVision at an Intel Research Labs event in San Francisco.
For Narasimhan, AndyVision reflects a type of computer-vision inventory system that could trump wireless RFID tags. The Carnegie Mellon team behind AndyVision notes that it combines several types of algorithms running on a low-power system, making it easier to implement than RFID tagging.
She and her team interviewed retailers first to tailor the robot to their needs. The retailers said that stores lose out when they run low on a high-demand item, and when a customer carries off a jar of something and drops it in an aisle of completely unrelated items. How unappetizing is a tin of salmon tossed next to roach-killer aerosols? And how disappointing is it when a customer asks a clerk where an item is and the clerk does not know?
The CMU robot addresses those weaknesses. AndyVision has achieved video fame in its red hoodie, and has been spotted moving around the Carnegie Mellon University store since May. Proximity sensors keep the robot from crashing into anything. As it moves, it scans the shelves to generate a real-time interactive map of the store, which is sent to a large in-store touch screen. Without even walking to the aisle, a customer can browse the map, see a product's location, and view its product information. For employees, the robot checks all the shelves, looking this way and that, performing a detailed inventory check, identifying each item on the shelves, and alerting the workers if stock is low or an item has been misplaced.
The robot uses image-processing and machine-learning algorithms: it looks for barcodes and text, and uses an object's shape, size, and color to determine its identity. The robot can also infer an item's identity from what belongs next to it. If an unidentified bright orange box sits near Clorox bleach, it will infer that the box is Tide detergent. Backing the system is a database of 3-D and 2-D images of the store's stock and a basic map of the store's layout.
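The article does not disclose AndyVision's actual code, but the two-stage logic described above (try to identify an item by its visual features, then fall back to inference from neighboring products) can be sketched roughly. Everything here, including the catalog, the neighbor table, and the function name, is a hypothetical illustration, not AndyVision's implementation:

```python
# Hypothetical sketch of feature matching plus contextual inference,
# loosely modeled on the behavior described in the article.

# Known products with simple hand-coded visual features (assumed data;
# a real system would use learned image features, barcodes, and text).
CATALOG = {
    "Clorox bleach":  {"color": "white",  "shape": "jug"},
    "Tide detergent": {"color": "orange", "shape": "box"},
    "canned salmon":  {"color": "silver", "shape": "can"},
}

# Which products are usually shelved together (assumed planogram data).
NEIGHBORS = {
    "Clorox bleach":  ["Tide detergent"],
    "Tide detergent": ["Clorox bleach"],
}

def identify(observed, neighbor=None):
    """Return the best product guess for an observed object.

    First try an exact feature match; if that fails, fall back to
    products expected next to an already-identified neighbor.
    """
    # 1. Direct match on visual features.
    for name, feats in CATALOG.items():
        if feats == observed:
            return name
    # 2. Contextual fallback: prefer items that belong beside the neighbor
    #    and share at least the observed color.
    for name in NEIGHBORS.get(neighbor, []):
        if CATALOG[name]["color"] == observed.get("color"):
            return name
    return "unknown"

# An unreadable bright-orange box next to Clorox bleach is guessed as Tide.
print(identify({"color": "orange", "shape": "unreadable"},
               neighbor="Clorox bleach"))
```

The fallback step is the interesting design choice: shelf context acts as a prior, so a partially occluded or unreadable item can still be labeled from what its neighbors imply should be there.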
After its initial trial at the campus store, said Narasimhan, the robot system will be put to the test in several stores next year.