| CPC G06F 3/011 (2013.01) [G06F 3/0346 (2013.01); G06F 3/04815 (2013.01); G06F 3/0482 (2013.01); G06F 3/04847 (2013.01)] | 20 Claims |

1. A mobile device comprising:
one or more processors; and
one or more memories storing instructions that, when executed by the one or more processors, configure the mobile device to perform operations comprising:
determining a first position of a surface of a physical object based on an image from a first camera, an image from a second camera of the mobile device, and based on (R*X0)/(2*tan((Θ1/2)+C)*(x2−x1)), where R is an optical distance between an optical axis of the first camera and an optical axis of the second camera, X0 is a width of a point of the surface in pixels of an image sensor of the first camera, x2 is based on a number of pixels from the optical axis of the image sensor of the second camera, x1 is based on a number of pixels from the optical axis of the first camera, Θ1 is a view angle of the first camera and the second camera, and C is a compensation for distortions of the mobile device;
generating, based on the first position, an augmented reality (AR) interactive control, wherein the AR interactive control is generated to appear to a user of the mobile device to be on the surface;
displaying, on a display of the mobile device, the AR interactive control;
determining a first plurality of positions of a control indicator controlled by the user;
activating the AR interactive control in response to detecting a position of the first plurality of positions of the control indicator controlled by the user transgresses a first threshold distance from the AR interactive control; and
displaying on the display an indication that the AR interactive control is activated in response to activating the AR interactive control.
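For readers parsing the distance expression recited in claim 1, the following is a minimal sketch of how that expression evaluates, assuming Θ1 is supplied in degrees and C is an angular compensation added to Θ1/2 inside the tangent; the function name, units, and parameter names are illustrative assumptions, not claim language.

```python
import math

def estimate_surface_distance(R, X0, theta1_deg, C_rad, x1, x2):
    """Evaluate (R * X0) / (2 * tan(theta1/2 + C) * (x2 - x1)).

    R          -- distance between the optical axes of the first and
                  second cameras (assumed here in millimeters)
    X0         -- width, in pixels, of the point of the surface on the
                  first camera's image sensor
    theta1_deg -- view angle of the first and second cameras, in degrees
    C_rad      -- compensation for distortions of the mobile device,
                  assumed here to be an angle in radians added to theta1/2
    x1, x2     -- pixel counts from the optical axes of the first and
                  second image sensors, respectively
    """
    theta1 = math.radians(theta1_deg)
    return (R * X0) / (2.0 * math.tan(theta1 / 2.0 + C_rad) * (x2 - x1))
```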
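The activation step amounts to testing whether any position in the tracked plurality of control-indicator positions comes within the first threshold distance of the AR interactive control. A minimal sketch follows, assuming 3-D positions expressed in a common coordinate frame and a Euclidean distance metric; both assumptions, along with the function name, are illustrative and not part of the claim.

```python
import math

def control_is_activated(control_position, indicator_positions, threshold):
    """Return True once any tracked position of the control indicator
    transgresses (comes within) the first threshold distance from the
    AR interactive control."""
    return any(
        math.dist(control_position, p) < threshold
        for p in indicator_positions
    )
```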