CPC H04L 67/131 (2022.05) [G06F 3/0482 (2013.01); G06F 16/213 (2019.01); G06F 16/245 (2019.01); G06T 19/003 (2013.01); G06T 19/006 (2013.01); H04L 63/1416 (2013.01); H04L 67/02 (2013.01)] | 1 Claim |
1. A method comprising:
generating, based on first sensor data captured by a depth sensor on a mobile device, three-dimensional data representing a physical space that includes a real-world asset;
generating, based on second sensor data captured by an image sensor on the mobile device, two-dimensional data representing the physical space;
generating an extended reality (XR) stream representing at least a portion of a remote collaboration session between a host device and a set of one or more remote devices, wherein the XR stream includes:
a combination of the three-dimensional data and the two-dimensional data that includes a digital representation of the real-world asset,
a set of augmented reality (AR) elements that are associated with the real-world asset, and
a set of performed actions, each action in the set of performed actions associated with (i) at least a portion of the digital representation, or (ii) at least one AR element in the set of AR elements;
serializing the XR stream into a set of serialized chunks, wherein a first combined storage size of the set of serialized chunks is smaller than a storage size of the XR stream;
transmitting the set of serialized chunks to the set of one or more remote devices, wherein the set of one or more remote devices at least partially recreates the XR stream in a set of one or more remote XR environments; and
transmitting the set of serialized chunks to a remote storage device, wherein a device subsequently retrieves the set of serialized chunks to replay the remote collaboration session.
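The serializing step above requires that the chunks, taken together, occupy less storage than the original XR stream, and the final steps require that the chunks be reassembled either live on remote devices or later from storage for replay. One way to satisfy both constraints is to compress the serialized stream before splitting it into fixed-size chunks. The sketch below is a minimal illustration of that idea, not the patented implementation: the `XRStream` container, the chunk size, and the use of JSON plus `zlib` are all assumptions chosen for clarity.

```python
import json
import zlib
from dataclasses import dataclass, asdict

# Hypothetical chunk size; a real system would tune this to its transport.
CHUNK_SIZE = 64 * 1024


@dataclass
class XRStream:
    """Hypothetical container mirroring the claim's XR stream contents."""
    three_dimensional_data: list   # depth-sensor-derived geometry of the space
    two_dimensional_data: list     # image-sensor-derived frames of the space
    ar_elements: list              # AR elements associated with the asset
    performed_actions: list        # actions on the digital representation / AR elements


def serialize_xr_stream(stream: XRStream, chunk_size: int = CHUNK_SIZE) -> list[bytes]:
    """Serialize and compress the stream, then split it into chunks.

    Compression is what makes the combined chunk size smaller than the
    storage size of the uncompressed serialized stream.
    """
    raw = json.dumps(asdict(stream)).encode("utf-8")
    packed = zlib.compress(raw, level=9)
    return [packed[i:i + chunk_size] for i in range(0, len(packed), chunk_size)]


def deserialize_xr_stream(chunks: list[bytes]) -> XRStream:
    """Reassemble chunks (e.g. retrieved from remote storage) to replay a session."""
    raw = zlib.decompress(b"".join(chunks))
    return XRStream(**json.loads(raw))
```

In this sketch the same chunk set serves both transmitting steps: it can be sent to the remote devices for live recreation and written to a storage device for later retrieval and replay, since `deserialize_xr_stream` only needs the ordered chunks.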