Apple recently announced ARKit, a framework shipping with iOS 11 that can natively track position and rotation in the real world. That’s a game changer.
Traditionally, anchoring something in Augmented Reality (AR) required a marker: a known image used as a reference point for a 3D world. 3D elements are then rendered in that world and overlaid on top of the camera image.
With ARKit, you no longer need a marker. The framework recognizes planes and surfaces, and generates anchor points in 3D space that you can attach content to.
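To give a sense of how little code this takes, here is a minimal sketch of markerless plane anchoring with ARKit and SceneKit. It assumes an `ARSCNView` wired up in a storyboard; note that API names shifted during the iOS 11 betas (e.g. the world-tracking configuration class was renamed), so this reflects the eventual public API rather than any specific beta.

```swift
import UIKit
import ARKit
import SceneKit

class ViewController: UIViewController, ARSCNViewDelegate {
    // Assumed to be connected in the storyboard.
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // World tracking with horizontal plane detection -- no marker needed.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit detects a surface and adds an anchor for it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // Attach a small box to the detected plane; ARKit keeps it
        // fixed in the real world as the device moves.
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(planeAnchor.center.x, 0.05, planeAnchor.center.z)
        node.addChildNode(boxNode)
    }
}
```

The key point is the delegate callback: ARKit hands you a node already positioned at the detected surface, and anything you parent to it stays anchored in the real world.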
Apple has released the beta version of iOS 11, so I’ve been able to run some pretty cool experiments, as seen below: