CPC G06F 21/6245 (2013.01) [G01C 21/00 (2013.01); G02B 27/0172 (2013.01); G06F 3/011 (2013.01); G06F 9/451 (2018.02); G06Q 30/0261 (2013.01); G06T 3/40 (2013.01); G06T 7/20 (2013.01); G06T 11/00 (2013.01); G06T 13/40 (2013.01); G06T 17/00 (2013.01); G06T 19/006 (2013.01); G06V 10/25 (2022.01); G06V 10/60 (2022.01); G06V 10/764 (2022.01); G06V 20/20 (2022.01); G06V 20/50 (2022.01); H04N 7/147 (2013.01); H04N 7/152 (2013.01); H04N 7/157 (2013.01); H04W 4/02 (2013.01); G02B 2027/014 (2013.01); G02B 2027/0141 (2013.01); G02B 2027/0178 (2013.01); G06T 2200/24 (2013.01); G06V 2201/07 (2022.01)] (18 Claims)
1. A non-transitory computer readable medium containing instructions that when executed by at least one processor cause the at least one processor to perform operations for managing extended reality video conferences, the operations comprising:
receiving a request to initiate a video conference between a plurality of participants;
receiving image data captured by at least one image sensor associated with a wearable extended reality appliance, the image data reflecting a layout of a physical environment in which the wearable extended reality appliance is located;
analyzing the image data to identify at least one interference region in the physical environment;
receiving visual representations of the plurality of participants;
causing the wearable extended reality appliance to display the visual representations of the plurality of participants at multiple distinct locations other than in the at least one interference region, such that the at least one interference region is devoid of any of the visual representations of the plurality of participants;
selecting a designated area based on a vision problem of a wearer of the wearable extended reality appliance; and
after causing the wearable extended reality appliance to display the visual representations, identifying a speaking participant and moving a particular visual representation associated with the speaking participant to the designated area.
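The claim's placement logic can be illustrated as a minimal sketch. All names (`Region`, `place_participants`, `move_speaker`) are hypothetical, and interference regions are simplified to axis-aligned 2D boxes in the wearer's view plane; the claim does not specify how regions, locations, or speaker detection are implemented.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Region:
    """Hypothetical axis-aligned interference region in the view plane."""
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


def place_participants(participants, candidates, interference):
    """Assign each participant a distinct candidate location that lies
    outside every interference region, so those regions stay devoid of
    visual representations."""
    free = [p for p in candidates
            if not any(r.contains(*p) for r in interference)]
    if len(free) < len(participants):
        raise ValueError("not enough interference-free locations")
    return dict(zip(participants, free))


def move_speaker(layout, speaker, designated_area):
    """Relocate the identified speaking participant's representation to the
    designated area (e.g. one chosen for the wearer's vision needs); the
    other representations keep their assigned locations."""
    updated = dict(layout)
    updated[speaker] = designated_area
    return updated
```

A short usage example under the same assumptions: with one interference box covering (0,0)-(1,1), the candidate at (0.5, 0.5) is excluded and the remaining participants fill the free spots; the speaker is then moved to the designated area.

```python
interference = [Region(0, 0, 1, 1)]
candidates = [(0.5, 0.5), (2.0, 0.0), (3.0, 0.0), (2.0, 1.0)]
layout = place_participants(["a", "b", "c"], candidates, interference)
layout = move_speaker(layout, "b", (5.0, 5.0))
```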