According to a pair of patent applications published on Thursday, Apple is investigating augmented reality systems for iOS capable of providing users with enhanced virtual overlays of their surroundings, including an “X-ray vision” mode that peels away walls.

Source: appleinsider.com

Apple filed two applications with the U.S. Patent and Trademark Office, titled "Federated mobile device positioning" and "Registration between actual mobile device position and environmental model," both describing an advanced augmented reality solution that harnesses an iPhone’s camera, onboard sensors and communications suite to offer a real-time world view overlaid with rich location data. 

The system first uses GPS, Wi-Fi signal strength, sensor data or other information to determine a user’s location. From there, the app downloads a three-dimensional model of the surrounding area, complete with wireframes and image data for nearby buildings and points of interest. Aligning that digital representation with the real world, however, is difficult using sensors alone.
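The positioning step might combine several coarse estimates into one fix. The sketch below is an illustrative assumption, not the patents' actual method: it fuses a GPS fix and a Wi-Fi-derived fix by inverse-variance weighting, so the less noisy estimate dominates. All names and numbers are hypothetical.

```python
# Hypothetical sketch: fuse coarse position estimates (GPS, Wi-Fi, etc.)
# by inverse-variance weighting. Estimates with smaller variance (less
# noise) pull the fused result closer to themselves.

def fuse_positions(estimates):
    """estimates: list of ((lat, lon), variance) tuples."""
    total_w = sum(1.0 / var for _, var in estimates)
    lat = sum(p[0] / var for p, var in estimates) / total_w
    lon = sum(p[1] / var for p, var in estimates) / total_w
    return lat, lon

gps = ((37.3318, -122.0312), 25.0)   # noisier fix (illustrative values)
wifi = ((37.3320, -122.0310), 4.0)   # tighter fix
print(fuse_positions([gps, wifi]))   # lands closer to the Wi-Fi estimate
```

A production system would likely use a Kalman filter or similar estimator rather than a one-shot weighted average.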

To accurately place the model, Apple proposes the virtual frame be overlaid atop live video fed by an iPhone’s camera. Users can align the 3D asset with the live feed by manipulating it onscreen through pinch-to-zoom, tap-and-drag and other gestures, providing a level of accuracy not possible through machine reckoning alone. 
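The gesture-based alignment amounts to letting the user apply a transform to the projected wireframe until it matches the camera feed. A minimal sketch, assuming a simple 2D similarity transform (scale from pinch, translation from drag); the function and values are illustrative, not from the filings:

```python
# Illustrative sketch: accumulate pinch (scale) and drag (translation)
# gestures into a 2D transform applied to projected wireframe points,
# so the user can line the model up with the live video frame.

def apply_transform(points, scale=1.0, dx=0.0, dy=0.0):
    """Scale each point about the origin, then translate it."""
    return [(x * scale + dx, y * scale + dy) for x, y in points]

wireframe = [(0, 0), (100, 0), (100, 200)]
# user pinches to zoom 1.5x, then drags 20 px right and 10 px down
aligned = apply_transform(wireframe, scale=1.5, dx=20, dy=10)
print(aligned)  # [(20.0, 10.0), (170.0, 10.0), (170.0, 310.0)]
```

A real implementation would work in three dimensions and also let gestures adjust rotation, but the principle is the same: user input refines the pose estimate the sensors produced.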

Alternatively, users can issue audible commands like "move left" and "move right" to match up the images. The wireframe can be "locked in" once one or more points are correctly aligned, thus calibrating the augmented view.
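The voice flow could be modeled as a small command dispatcher that nudges the overlay and then freezes calibration on "lock in". The command strings follow the article; the step size and state shape are assumptions for illustration:

```python
# Hypothetical dispatcher: map spoken commands to overlay nudges, and
# stop accepting adjustments once the user locks the calibration in.

STEP = 5  # pixels per command (illustrative)

def handle_command(state, command):
    if state["locked"]:
        return state          # calibration frozen; ignore further nudges
    if command == "move left":
        state["dx"] -= STEP
    elif command == "move right":
        state["dx"] += STEP
    elif command == "lock in":
        state["locked"] = True
    return state

state = {"dx": 0, "locked": False}
for cmd in ["move right", "move right", "move left", "lock in", "move right"]:
    state = handle_command(state, cmd)
print(state)  # {'dx': 5, 'locked': True} -- the last nudge is ignored
```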

In yet another embodiment, the user can interact directly with the wire model by placing their hand into the live view area and "grabbing" parts of the virtual image, repositioning them with a special set of gestures. This third method requires object recognition technology to determine when and how a user’s hand is interacting with the environment directly in front of the camera.
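Once a hand detector reports a fingertip position in screen coordinates, deciding what the user is "grabbing" reduces to a hit test against the wireframe's control points. The sketch below is an assumption about how such a test might look, not the patents' method:

```python
# Sketch: given a detected fingertip position in screen space, find
# which wireframe control point (if any) the user is grabbing, using
# a simple nearest-neighbor test within a grab radius.

import math

def nearest_point(fingertip, control_points, radius=30.0):
    """Return the index of the closest control point within radius, else None."""
    best, best_d = None, radius
    for i, (x, y) in enumerate(control_points):
        d = math.hypot(fingertip[0] - x, fingertip[1] - y)
        if d < best_d:
            best, best_d = i, d
    return best

corners = [(50, 50), (300, 50), (300, 400)]
print(nearest_point((310, 45), corners))   # 1 (near the second corner)
print(nearest_point((150, 150), corners))  # None (nothing within reach)
```

The hard part the patents allude to is upstream of this: reliably recognizing the hand and its gesture in the live camera frame in the first place.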
