CPC G06F 3/011 (2013.01) [G06F 3/017 (2013.01); G06F 3/0488 (2013.01); G06F 3/04815 (2013.01); G06F 3/04845 (2013.01)] — 21 Claims
1. A computer-implemented method comprising:
receiving an input corresponding to interaction with a touch-based control of a device, the input defining an operation related to augmented reality (AR) content that is overlaid onto a view of a real-world space and displayed by the device;
detecting a real-world object within the real-world space visible in the view of the real-world space displayed by the device;
responsive to detecting the real-world object within the real-world space visible in the view of the real-world space displayed by the device and prior to detecting a gesture performed using the real-world object, revealing more of the real-world space in the view of the real-world space displayed by the device by altering what is output for display by the device by at least one of reducing, removing, or altering a graphical user interface element that was overlaid on the view of the real-world space displayed by the device before the real-world object was detected within the real-world space visible in the view of the real-world space;
subsequent to receiving the input corresponding to the interaction with the touch-based control of the device and subsequent to altering what is output for display by the device responsive to detecting the real-world object, detecting the gesture performed using the real-world object within the real-world space visible in the view;
responsive to detecting the gesture, manipulating the AR content; and
responsive to no longer detecting the real-world object within the real-world space visible in the view of the real-world space displayed by the device, further altering what is output for display by the device to restore at least some of the graphical user interface element that was reduced, removed, or altered.
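The claimed method can be read as a small state machine: a touch input selects an operation on the AR content; detecting a real-world object (such as the user's hand) in the camera view causes a GUI element to be reduced or removed, revealing more of the real-world space; a subsequent gesture with that object manipulates the AR content; and losing sight of the object restores the GUI element. The sketch below models only that ordering of steps. All names (`ARSession`, `detect_gesture`, the scale-based manipulation, and the hand as the tracked object) are hypothetical illustrations, not part of the claim, which does not specify any implementation.

```python
from dataclasses import dataclass


@dataclass
class ARSession:
    """Toy model of claim 1's control flow (hypothetical; the claim
    specifies no data structures or APIs)."""
    gui_visible: bool = True        # overlaid GUI element, e.g. a toolbar
    ar_scale: float = 1.0           # stand-in for the displayed AR content
    operation_selected: bool = False
    object_in_view: bool = False

    def receive_touch_input(self) -> None:
        # Claim step: input via a touch-based control defines an
        # operation related to the AR content.
        self.operation_selected = True

    def detect_object(self) -> None:
        # Claim step: a real-world object is detected in the view;
        # before any gesture is detected, the GUI element is reduced
        # or removed to reveal more of the real-world space.
        self.object_in_view = True
        self.gui_visible = False

    def detect_gesture(self, factor: float) -> None:
        # Claim step: a gesture performed using the object manipulates
        # the AR content, but only after the touch input was received
        # and the display was altered in response to detecting the object.
        if self.operation_selected and self.object_in_view:
            self.ar_scale *= factor

    def lose_object(self) -> None:
        # Claim step: the object is no longer detected, so at least
        # some of the reduced/removed GUI element is restored.
        self.object_in_view = False
        self.gui_visible = True
```

A typical pass through the claimed sequence would call `receive_touch_input()`, `detect_object()`, `detect_gesture(...)`, then `lose_object()`; note that `detect_gesture` is a no-op unless the two earlier steps have occurred, mirroring the "subsequent to" limitations in the claim.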