CPC G06F 3/1454 (2013.01) [G06T 11/00 (2013.01); G11B 27/036 (2013.01)]    8 Claims

1. A method for a service provider to synchronize augmented reality (AR) objects across a network with a device, in which essential meta-data and nonessential data facilitate first and second rendering of an AR presentation, comprising:
receiving, over the network from the device, a video stream associated with the device, the video stream including embedded in-band therein the essential meta-data corresponding to at least motion of the device while capturing the video stream to facilitate the first rendering, and associated separate non-embedded out-of-band nonessential data to facilitate the second rendering, wherein the device dynamically determines, based at least in part on analysis of communication session conditions, which information is embedded in-band and which information is the non-embedded out-of-band nonessential data, the essential meta-data being embedded in the video stream such that the essential meta-data is non-visible;
extracting the essential meta-data from the video stream;
determining, based at least in part on the essential meta-data, initial AR information including placement of one or more AR objects in the video stream;
determining the first rendering based at least in part on the initial AR information;
determining updated AR information by updating the initial AR information based at least in part on the separate nonessential data; and
determining the second rendering based at least in part on refining the first rendering based at least in part on the updated AR information.
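The claim does not specify how the essential meta-data is made non-visible within the video stream. One common technique that satisfies the "embedded in-band … such that the essential meta-data is non-visible" limitation is least-significant-bit (LSB) steganography over a frame's pixel buffer. The sketch below is a minimal illustration of that idea, not the patented method: the frame buffer, field names (`gyro`, `ts`), and the 4-byte length header are all assumptions made for the example.

```python
import json
import struct

def embed_metadata(frame: bytearray, metadata: dict) -> bytearray:
    """Embed a JSON metadata payload into the least-significant bits of a
    frame buffer, preceded by a 4-byte big-endian length header.  Changing
    only the LSB of each byte keeps the payload visually imperceptible."""
    payload = json.dumps(metadata, separators=(",", ":")).encode("utf-8")
    bits = []
    for byte in struct.pack(">I", len(payload)) + payload:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    if len(bits) > len(frame):
        raise ValueError("frame too small for payload")
    out = bytearray(frame)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the LSB
    return out

def extract_metadata(frame: bytes) -> dict:
    """Recover the embedded JSON payload from the frame's LSBs
    (the 'extracting the essential meta-data' step of the claim)."""
    def read_bytes(offset_bits: int, count: int) -> bytes:
        data = bytearray()
        for b in range(count):
            byte = 0
            for i in range(8):
                byte = (byte << 1) | (frame[offset_bits + b * 8 + i] & 1)
            data.append(byte)
        return bytes(data)
    (length,) = struct.unpack(">I", read_bytes(0, 4))
    return json.loads(read_bytes(32, length))
```

A device-side sender would call `embed_metadata` on each outgoing frame with its current motion sample; the service provider recovers it losslessly with `extract_metadata` before determining AR object placement. A production system would more likely carry such data in a codec-level side channel (e.g. an H.264 SEI message) than in raw pixel LSBs, since LSBs do not survive lossy re-encoding.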
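The "wherein" clause leaves open how the device decides which information travels in-band versus out-of-band. A minimal sketch of one plausible policy is shown below; the field names, thresholds, and the bandwidth/loss-rate inputs are all hypothetical choices for illustration, since the claim only requires that the split be made dynamically from analysis of communication session conditions.

```python
# Assumed split: motion fields are essential (needed for the first rendering),
# the rest are nonessential refinements (used for the second rendering).
ESSENTIAL_FIELDS = ("gyro", "accel", "timestamp")
NONESSENTIAL_FIELDS = ("ambient_light", "depth_hints", "scene_tags")

def partition_metadata(sample: dict, est_bandwidth_kbps: float,
                       loss_rate: float) -> tuple[dict, dict]:
    """Split one sensor sample into an in-band (embedded) payload and an
    out-of-band payload.  Essential motion fields always travel in-band;
    nonessential fields are promoted in-band only when the side channel
    looks unreliable and the video channel has headroom."""
    in_band = {k: sample[k] for k in ESSENTIAL_FIELDS if k in sample}
    out_of_band: dict = {}
    side_channel_unreliable = loss_rate > 0.05      # assumed threshold
    video_has_headroom = est_bandwidth_kbps > 2000  # assumed threshold
    for k in NONESSENTIAL_FIELDS:
        if k not in sample:
            continue
        if side_channel_unreliable and video_has_headroom:
            in_band[k] = sample[k]
        else:
            out_of_band[k] = sample[k]
    return in_band, out_of_band
```

Under this policy the service provider can always produce the first rendering from the in-band payload alone, then refine it into the second rendering whenever the out-of-band nonessential data arrives.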