AR Tracking Technology
AR tracking takes an object or medium in the real world as a reference, anchors AR content at a position relative to that reference, and updates the content's position as the reference moves. The common types of AR tracking include AR plane tracking, AR image tracking, and AR physical (object) tracking.
AR Plane Tracking: tracks AR content relative to an arbitrary real-world plane, such as a floor or tabletop.
AR Image Tracking: tracks AR content relative to a known real-world image, such as a poster or printed marker.
AR Physical Tracking: tracks AR content relative to a three-dimensional real-world object.
Core Algorithms of AR Tracking:
- Feature detection: local image feature detectors extract features that are invariant to scale and rotation (SIFT and ORB are common examples).
- Feature point matching: descriptors are matched in a high-dimensional feature space using nearest-neighbor data structures (such as k-d trees or other approximate nearest-neighbor indexes) rather than a simple linear scan.
- Object tracking and 3D pose estimation: each tracked frame relies on the feature extraction, matching, and computation above to establish correspondences between 3D real-world points on the subject and their 2D screen-space coordinates, from which the homography transform is obtained.
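As a concrete illustration of the matching step, here is a minimal Python/NumPy sketch: it pairs each descriptor with its nearest neighbor by distance in feature space and keeps only matches that pass Lowe's ratio test. A brute-force linear scan is used for readability; as noted above, production trackers replace it with a k-d tree or another approximate nearest-neighbor index. The function name and descriptor layout are illustrative, not taken from any specific library.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.75):
    """Match each descriptor in desc_a to its nearest neighbor in desc_b,
    keeping a match only if it passes Lowe's ratio test (the nearest
    neighbor is clearly closer than the second nearest)."""
    matches = []
    for i, d in enumerate(desc_a):
        # Brute-force linear scan for clarity; real systems replace this
        # with a k-d tree or other approximate nearest-neighbor structure.
        dists = np.linalg.norm(desc_b - d, axis=1)
        best, second = np.argsort(dists)[:2]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

The ratio test discards ambiguous matches: if the two closest candidates are almost equally near, the correspondence is unreliable and is dropped rather than risk polluting the pose estimate.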
With the resulting set of matched points and the intrinsic camera parameters obtained by calibration, a solver for the Perspective-n-Point (PnP) problem can accurately recover the extrinsic camera parameters, i.e., the exact camera pose, in real time, allowing efficient estimation of the subject's 3D pose.
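The pose-estimation step can be sketched as follows: given matched 3D-2D point pairs and calibrated intrinsics K, a Direct Linear Transform (DLT) recovers the extrinsics (R, t). This is a minimal, noise-free sketch under the assumption of at least six non-coplanar correspondences; real AR pipelines use robust PnP solvers (for example, EPnP inside a RANSAC loop) to handle outliers. The function name is illustrative.

```python
import numpy as np

def solve_pnp_dlt(points_3d, points_2d, K):
    """Recover the camera pose (R, t) from n >= 6 non-coplanar 3D-2D
    correspondences via the Direct Linear Transform, given intrinsics K."""
    # Remove the intrinsics: convert pixels to normalized image coordinates.
    ones = np.ones((len(points_2d), 1))
    norm = np.linalg.inv(K) @ np.hstack([points_2d, ones]).T
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, norm[:2].T):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # The stacked pose matrix [R|t] (up to scale) is the right singular
    # vector of A associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    P = Vt[-1].reshape(3, 4)
    # Project the rotation block onto SO(3) and undo the unknown scale.
    U, S, Vt3 = np.linalg.svd(P[:, :3])
    R, t = U @ Vt3, P[:, 3] / S.mean()
    # Resolve the sign ambiguity: the scene must lie in front of the camera.
    if (R @ np.asarray(points_3d[0]) + t)[2] < 0:
        R, t = -R, -t
    return R, t
```

Each correspondence contributes two linear equations, so six points suffice to fix the eleven degrees of freedom of the projection matrix; once the extrinsics are known, the subject's 3D pose relative to the camera follows directly.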