CPC H04N 13/117 (2018.05) [H04N 7/15 (2013.01)]. 11 Claims.

1. A one-way telepresence method, comprising:
capturing three-dimensional video from a plurality of stationary cameras at a source location, each camera capturing a subject within a respective point of view;
at each of at least one remote location, determining a viewpoint of a respective Virtual Reality or Augmented Reality headset worn by a remote user, the viewpoint corresponding to the head position and head orientation of the remote user as determined by an orientation tracking system associated with the headset;
transmitting the viewpoint of the respective remote user from the at least one remote location to the source location;
at the source location, creating at least one synthetic three-dimensional video from the captured three-dimensional video, the synthetic three-dimensional video including the subject and corresponding to the determined viewpoint of the respective remote user, the creating including:
a) applying at least one of i) a real-time view interpolation technique to at least a portion of the captured three-dimensional video and ii) a light field rendering technique to at least a portion of the captured three-dimensional video from the plurality of stationary cameras; and
b) integrating at least one of i) chromakey and ii) depth filtering technologies;
transmitting the synthetic three-dimensional video of the subject to the at least one remote location; and
displaying the synthetic three-dimensional video of the subject on the respective headset at the at least one remote location; wherein the one-way telepresence method includes a plurality of remote locations with respective headsets worn by different users, and the synthetic three-dimensional video for each respective headset is different so as to correspond with the determined viewpoint of each user.
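For illustration only, and not part of the claim language: the following is a minimal sketch of the remote-side steps of determining a headset viewpoint and transmitting it to the source location. The `headset` object, its accessor methods, and the length-prefixed JSON-over-TCP transport are all assumptions for the sake of the example, not details drawn from the claim.

```python
# Illustrative sketch only; the headset API and the transport are assumed,
# not specified by the claim.
import json
import socket
import time

def read_viewpoint(headset):
    """Return head position (meters) and orientation (quaternion) reported by
    the headset's orientation tracking system. `headset` is hypothetical."""
    px, py, pz = headset.head_position()         # assumed accessor
    qx, qy, qz, qw = headset.head_orientation()  # assumed accessor
    return {
        "timestamp": time.time(),
        "position": [px, py, pz],
        "orientation": [qx, qy, qz, qw],
    }

def send_viewpoint(sock, headset):
    """Serialize the current viewpoint and transmit it to the source location
    as a length-prefixed JSON frame."""
    payload = json.dumps(read_viewpoint(headset)).encode("utf-8")
    sock.sendall(len(payload).to_bytes(4, "big") + payload)

# Possible usage at a remote location (hostname and port are placeholders):
# sock = socket.create_connection(("source.example.com", 9000))
# while rendering:
#     send_viewpoint(sock, headset)
```

Each remote location would run this loop independently, which is consistent with the claim's requirement that the synthetic video for each headset track that user's own viewpoint.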
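Likewise illustrative: a sketch of the depth-filtering option of limitation (b), used here to isolate the subject from the background before the per-viewer view synthesis of limitation (a). The depth band, the camera-feed format, and the `renderer` object standing in for a real-time view interpolation or light field rendering stage are assumptions, not the patent's implementation.

```python
# Illustrative sketch only; frame formats, depth thresholds, and the renderer
# interface are assumptions made for this example.
import numpy as np

def depth_filter(color_frame, depth_frame, near_m=0.5, far_m=2.5):
    """Keep only pixels whose depth lies within [near_m, far_m] meters,
    isolating the subject from the background (an alternative to chroma key).
    color_frame: HxWx3 uint8; depth_frame: HxW float32 depth in meters."""
    mask = (depth_frame >= near_m) & (depth_frame <= far_m)
    filtered = np.zeros_like(color_frame)
    filtered[mask] = color_frame[mask]
    return filtered, mask

def synthesize_view(camera_frames, viewpoint, renderer):
    """Produce one synthetic three-dimensional view for one remote viewer.
    camera_frames: list of (color_frame, depth_frame) pairs, one per camera.
    `renderer` is purely hypothetical and represents the real-time view
    interpolation or light field rendering stage of limitation (a)."""
    isolated = [depth_filter(color, depth)[0] for color, depth in camera_frames]
    return renderer.render(isolated, viewpoint)
```

At the source location, `synthesize_view` would be invoked once per received viewpoint, so that each remote headset receives a different synthetic video matched to its user's determined viewpoint.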