CPC G06F 3/04815 (2013.01) [G06F 3/011 (2013.01); G06F 3/017 (2013.01); G06F 3/04817 (2013.01); G06F 3/167 (2013.01); G06T 13/40 (2013.01); G06T 19/006 (2013.01); G06T 19/20 (2013.01); H04L 67/131 (2022.05); H04N 7/157 (2013.01); G06F 3/0485 (2013.01); G06F 3/04842 (2013.01); G06F 2203/04803 (2013.01); G06T 2219/2004 (2013.01); G06T 2219/2016 (2013.01)]; 20 Claims
1. A computer-implemented method, comprising:
building an augmented reality (AR) meeting space comprising structured data received from a plurality of apps operating on a mobile device of a user, wherein the plurality of apps are displayed on a home screen of the mobile device;
translating the structured data into a three-dimensional representation of the structured data corresponding to each of the plurality of apps;
rendering a three-dimensional representation of the home screen of the mobile device of the user in the AR meeting space, wherein the three-dimensional representation of the structured data corresponding to the content of each app is separately grouped and displayed within the three-dimensional representation of the home screen, wherein the plurality of apps includes one active app and at least one background app, and wherein the active app is rendered with greater visual prominence relative to the at least one background app;
generating an invisible mesh in front of the mobile device;
detecting that a new image has been captured by the mobile device;
automatically receiving, without user interaction and responsive to the detecting, an image file corresponding to the new image from the mobile device; and
displaying the new image corresponding to the received image file on the invisible mesh in conjunction with a corresponding visual effect simulating the image moving from the mobile device to the invisible mesh.
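The steps recited in claim 1 can be sketched in ordinary code. The following is a minimal, hypothetical Python illustration, not an actual AR SDK: every class and function name (`App`, `ARMeetingSpace`, `translate_to_3d`, `on_new_image`, the `scale` prominence field, the `"fly-to-mesh"` effect tag) is an assumption introduced here to make the claimed sequence concrete — building the meeting space from per-app structured data, translating each app's data into a separately grouped 3-D representation with greater prominence for the active app, and automatically placing a newly captured image on an invisible mesh with a transfer effect.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sketch of the claimed method; names are illustrative
# assumptions, not an actual AR framework API.

@dataclass
class App:
    name: str
    structured_data: Dict[str, str]  # structured data received from the app
    active: bool = False             # one active app vs. background apps

@dataclass
class Rendered3D:
    app_name: str
    content: Dict[str, str]
    scale: float                     # stand-in for visual prominence

@dataclass
class ARMeetingSpace:
    home_screen: List[Rendered3D] = field(default_factory=list)
    mesh_images: List[str] = field(default_factory=list)   # invisible mesh
    effects: List[str] = field(default_factory=list)

def translate_to_3d(app: App) -> Rendered3D:
    # Translate structured data into a 3-D representation per app;
    # the active app is rendered with greater visual prominence.
    return Rendered3D(app.name, dict(app.structured_data),
                      scale=1.5 if app.active else 1.0)

def build_meeting_space(apps: List[App]) -> ARMeetingSpace:
    # Build the AR meeting space; each app's representation is
    # separately grouped within the rendered home screen.
    space = ARMeetingSpace()
    for app in apps:
        space.home_screen.append(translate_to_3d(app))
    return space

def on_new_image(space: ARMeetingSpace, image_file: str) -> None:
    # On detecting a new capture, the image file is received
    # automatically (no user interaction), displayed on the invisible
    # mesh, with an effect simulating the image moving phone-to-mesh.
    space.mesh_images.append(image_file)
    space.effects.append(f"fly-to-mesh:{image_file}")

apps = [App("Photos", {"title": "Camera Roll"}, active=True),
        App("Mail", {"title": "Inbox"})]
space = build_meeting_space(apps)
on_new_image(space, "IMG_0001.jpg")
print(space.home_screen[0].scale)  # active app gets the larger scale
print(space.mesh_images)
```

The `scale` field and `"fly-to-mesh"` string are placeholders for whatever rendering and animation machinery a real embodiment would use; the point is only the ordering and grouping of the claimed steps.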