(Phys.org) — Apple Inc. has been granted a patent for an application filed with the U.S. Patent Office in 2010 for "Synchronized, interactive augmented reality displays for multifunction devices." The patent filing describes Augmented Reality (AR) technology that is familiar to most tech watchers, along with new ideas that are not.
In its filing, Apple describes its approach to providing standard AR functionality, i.e. overlaying information onto a real-world screen image as seen on an electronic hardware device; in this case, the device is any one running Apple's iOS. Along with that, Apple has also described two new innovations.
The first is interactivity. Until now, AR apps have been largely one-way: users look "through" their device at the surrounding environment and see what is out there, with pertinent data displayed on top of it. Apple gives an example of pointing an iPhone at a circuit board; the system recognizes the board and displays pop-up labels identifying its various components. Apple's new idea is to allow the user to add to what is displayed: if a particular chip is missing from the circuit board, for example, an engineer could add a pop-up describing it, right then and there. Taking the idea even further, the words, data, graphs, pictures or even web links a user adds could be shared with other online users.
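The annotate-and-share workflow described above could be sketched roughly as follows. This is a hypothetical model, not code from the patent; the class names, fields, and the `share` method are all assumptions made for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """A user-added pop-up anchored to a recognized object (hypothetical model)."""
    object_id: str   # e.g. the recognized chip on the circuit board
    text: str        # description, data, graph caption, or a web link
    author: str

@dataclass
class InformationLayer:
    """Overlay of labels and user annotations for one AR scene."""
    annotations: list = field(default_factory=list)

    def add(self, annotation: Annotation) -> None:
        self.annotations.append(annotation)

    def share(self) -> list:
        """Return a serializable snapshot to send to other online users."""
        return [vars(a) for a in self.annotations]

# An engineer labels a missing chip, then shares the layer.
layer = InformationLayer()
layer.add(Annotation("U7", "Missing 8-bit microcontroller; replacement needed", "engineer42"))
shared = layer.share()
```

The point of separating the layer from the live video, as the patent does, is that the annotations alone can travel over a network link while each device keeps its own camera feed.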
The second innovation Apple is looking to patent is a split-screen approach to viewing AR environments. On one screen, users see a real-time view of what is on the other side of their device, just as happens currently with an iOS device. The second screen displays a computer-generated view of what appears on the first screen. The purpose of such a second screen would be to allow manipulation of a virtual environment in three dimensions. In the example given in the patent application, a user could point an iPhone at the San Francisco skyline: the top screen would show what the user can already see without the device, while the second screen would show a computer-generated version of the same scene, complete with information from online mapping programs. That would allow the user to view the scene from different angles, for example, or to draw direction lines for sharing with another user online.
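One way to picture the split-screen idea is two renderers driven by a single device pose: the live view simply passes the camera image through, while the synthetic view rebuilds the scene from map data and can apply an extra rotation so the user orbits it. The classes and the `offset_deg` parameter below are illustrative assumptions, not the patent's actual design.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    lat: float
    lon: float
    heading_deg: float  # compass direction the device is facing

class LiveView:
    """Top screen: passes through the camera frame for the current pose."""
    def render(self, pose: CameraPose) -> str:
        return f"live frame @ heading {pose.heading_deg:.0f}"

class SyntheticView:
    """Bottom screen: computer-generated scene built from map data for the same pose."""
    def __init__(self, offset_deg: float = 0.0):
        self.offset_deg = offset_deg  # lets the user view the scene from another angle

    def render(self, pose: CameraPose) -> str:
        virtual_heading = (pose.heading_deg + self.offset_deg) % 360
        return f"generated scene @ heading {virtual_heading:.0f}"

# Pointing the device east at the San Francisco skyline.
pose = CameraPose(37.7749, -122.4194, 90.0)
top = LiveView().render(pose)
bottom = SyntheticView(offset_deg=45.0).render(pose)  # orbit the virtual scene 45°
```

Because both screens consume the same pose, panning the real device updates them together, which is the synchronization the patent's title refers to.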
The patent application indicates that Apple, like most other big names in the tech sector, is taking a very hard look at AR applications and the ways they might be incorporated into both new and existing devices.
More information: Synchronized, interactive augmented reality displays for multifunction devices, United States Patent, 8,400,548.
A device can receive live video of a real-world, physical environment on a touch sensitive surface. One or more objects can be identified in the live video. An information layer can be generated related to the objects. In some implementations, the information layer can include annotations made by a user through the touch sensitive surface. The information layer and live video can be combined in a display of the device. Data can be received from one or more onboard sensors indicating that the device is in motion. The sensor data can be used to synchronize the live video and the information layer as the perspective of video camera view changes due to the motion. The live video and information layer can be shared with other devices over a communication link.
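The abstract's last mechanism, using onboard sensor data to keep the information layer registered with the live video as the camera moves, can be sketched as a simple reprojection: when the sensors report a rotation, every overlay element slides the opposite way across the screen. The function below is a crude one-axis sketch; the `px_per_deg` calibration constant and the tuple layout are made up for illustration.

```python
def reproject(annotations, yaw_delta_deg, px_per_deg=10.0):
    """Shift annotation screen positions to follow a camera pan.

    annotations: list of (label, x, y) screen placements.
    yaw_delta_deg: rotation reported by the device's motion sensors.
    px_per_deg: hypothetical pixels-per-degree calibration constant.
    """
    dx = -yaw_delta_deg * px_per_deg  # overlay moves opposite to the pan
    return [(label, x + dx, y) for (label, x, y) in annotations]

# Two labels over a circuit board; the device pans 5° to the right,
# so the labels slide 50 px left to stay over their objects.
layer = [("resistor R3", 120.0, 80.0), ("chip U7", 300.0, 150.0)]
panned = reproject(layer, yaw_delta_deg=5.0)
```

A real implementation would use the full rotation and translation from the motion sensors and the camera's intrinsics, but the principle is the same: sensor deltas drive the overlay so that video and information layer stay in sync.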