US 12,189,861 B2
Augmented reality experiences with object manipulation
Ilteris Canberk, Marina Del Rey, CA (US)
Assigned to Snap Inc., Santa Monica, CA (US)
Filed by Ilteris Canberk, Marina Del Rey, CA (US)
Filed on Mar. 4, 2021, as Appl. No. 17/192,144.
Claims priority of provisional application 63/045,452, filed on Jun. 29, 2020.
Prior Publication US 2021/0405761 A1, Dec. 30, 2021
Int. Cl. A63F 13/55 (2014.01); G02B 27/01 (2006.01); G06F 3/01 (2006.01); G06F 3/04815 (2022.01); G06F 3/16 (2006.01); G06T 7/20 (2017.01); G10L 15/22 (2006.01)
CPC G06F 3/017 (2013.01) [A63F 13/55 (2014.09); G02B 27/0172 (2013.01); G06F 3/04815 (2013.01); G06F 3/165 (2013.01); G06T 7/20 (2013.01); G02B 2027/0138 (2013.01); G02B 2027/014 (2013.01); G02B 2027/0141 (2013.01); G02B 2027/0178 (2013.01); G06T 2207/10016 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/30196 (2013.01); G10L 15/22 (2013.01)] 7 Claims
OG exemplary drawing
 
1. An interactive augmented reality system comprising:
a position detection system;
a display system;
an image capture system;
an eyewear device comprising the position detection system, the display system, an inertial measurement unit, a processor, and a memory, wherein the interactive augmented reality system renders one or more virtual objects in the display system of the eyewear device, giving a user the impression that the virtual objects are authentically present in a physical environment, and wherein the memory includes a virtual object rendering utility; and
programming in the memory, wherein execution of the programming by the processor configures the eyewear device to perform functions, including functions to:
operate the position detection system to construct a map of the physical environment and determine a location and pose of the eyewear device in the map using images of the physical environment captured using the image capture system and input from the inertial measurement unit in the eyewear device;
display a visual graphic representation of a music player as a first virtual object in the physical environment using the display system of the eyewear device, wherein the first virtual object is stored in the memory of the eyewear device and rendered on a display using the virtual object rendering utility, and wherein the visual graphic representation of the music player includes a plurality of buttons and a progress bar, and display, on a screen separate from the first virtual object, a menu of a plurality of hand gestures and a plurality of control aspects of the music player corresponding to each of the hand gestures;
detect at least one of a hand gesture or movement of the user in the physical environment using images captured from the image capture system, wherein the detected hand gesture or movement corresponds to one or more of the plurality of buttons or the plurality of hand gestures and thereby controls one or more of: a display of a current song, whether the current song is paused, movement backward or forward within a queue of songs, or a volume, wherein the visual graphic representation in the interactive augmented reality system gives the user the impression that the music player is authentically present as a physical object fixed in the physical environment and that the user is interacting with an interface of the music player;
associate the at least one hand gesture or movement with a control setting of the music player corresponding to the first virtual object, the control setting of the music player being stored in the memory of the eyewear device, wherein the control setting of the music player is changed responsive to the at least one detected hand gesture or movement;
provide audio output that reflects the changed control setting of the music player responsive to the at least one hand gesture or movement;
display a visual graphic representation of a virtual game piece as a second virtual object in the physical environment using the display system of the eyewear device; and
detect at least one of a second hand gesture or movement of the user in the physical environment using images captured from the image capture system and input from the inertial measurement unit, wherein the detected at least one second hand gesture or movement controls at least one of a position or an orientation of the virtual game piece.
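The localization step recited above (constructing a map and determining the eyewear pose from captured images plus inertial measurement unit input) is, in current terminology, visual-inertial tracking. A toy complementary-filter sketch of fusing an IMU-propagated heading with a camera-derived correction is shown below; the function name, the blend factor, and the numbers are illustrative assumptions, not part of the patent disclosure.

```python
def fuse_yaw(imu_yaw: float, visual_yaw: float, alpha: float = 0.98) -> float:
    """Blend a fast-but-drifting IMU heading with a slower, drift-free
    heading recovered by matching camera images against the stored map.
    (Hypothetical helper; the patent does not specify a fusion method.)"""
    return alpha * imu_yaw + (1.0 - alpha) * visual_yaw

# The IMU integration has drifted to 1.05 rad while the camera
# re-localizes the eyewear against the map at 1.00 rad.
fused = fuse_yaw(1.05, 1.00)
```

In practice the fused pose is what anchors the virtual objects so they appear "fixed in the physical environment" as the claim requires.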
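The gesture-to-control mapping that the claim describes for the virtual music player (play/pause, queue navigation, volume, with the control setting stored in memory) can be sketched as a simple dispatch over recognized gesture labels. All names here (`MusicPlayerState`, the gesture strings) are illustrative assumptions and do not come from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class MusicPlayerState:
    """Control settings for the virtual music player, held in device memory."""
    queue: list = field(default_factory=lambda: ["song_a", "song_b", "song_c"])
    index: int = 0
    paused: bool = True
    volume: float = 0.5

    @property
    def current_song(self) -> str:
        return self.queue[self.index]

def apply_gesture(state: MusicPlayerState, gesture: str) -> MusicPlayerState:
    """Change the stored control setting responsive to a detected hand gesture."""
    if gesture == "tap":               # toggle play / pause
        state.paused = not state.paused
    elif gesture == "swipe_right":     # move forward within the queue
        state.index = min(state.index + 1, len(state.queue) - 1)
    elif gesture == "swipe_left":      # move back within the queue
        state.index = max(state.index - 1, 0)
    elif gesture == "raise_hand":      # volume up
        state.volume = min(state.volume + 0.1, 1.0)
    elif gesture == "lower_hand":      # volume down
        state.volume = max(state.volume - 0.1, 0.0)
    return state

player = MusicPlayerState()
for g in ["tap", "swipe_right", "raise_hand"]:
    apply_gesture(player, g)
```

The audio output step of the claim would then read the updated `MusicPlayerState` and render sound accordingly.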
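The final claim element, where a second detected gesture controls the position or orientation of the virtual game piece, amounts to applying the hand's frame-to-frame displacement and rotation to the object's pose. The class and method names below are hypothetical sketches of that idea, not the claimed implementation.

```python
import math

class GamePiece:
    """Second virtual object: a game piece posed in the mapped environment."""
    def __init__(self):
        self.position = [0.0, 0.0, 0.0]  # metres, in the environment map frame
        self.yaw = 0.0                   # radians about the vertical axis

    def drag(self, dx: float, dy: float, dz: float) -> None:
        """Translate the piece by the tracked hand's displacement."""
        self.position[0] += dx
        self.position[1] += dy
        self.position[2] += dz

    def twist(self, dyaw: float) -> None:
        """Rotate the piece by the tracked change in hand orientation."""
        self.yaw = (self.yaw + dyaw) % (2 * math.pi)

piece = GamePiece()
piece.drag(0.2, 0.0, -0.1)   # hand moved 20 cm right and 10 cm toward the user
piece.twist(math.pi / 2)     # quarter-turn about the vertical axis
```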