| CPC A63F 13/53 (2014.09) [A63F 13/335 (2014.09); A63F 13/65 (2014.09); A63F 13/90 (2014.09); G02B 27/0172 (2013.01); G06F 3/011 (2013.01); G06F 3/012 (2013.01); G06F 3/04815 (2013.01); G06T 19/006 (2013.01); G06T 19/20 (2013.01); A63F 13/213 (2014.09); A63F 2300/1043 (2013.01); A63F 2300/105 (2013.01); A63F 2300/1093 (2013.01); A63F 2300/205 (2013.01); A63F 2300/303 (2013.01); A63F 2300/69 (2013.01); A63F 2300/8082 (2013.01); G02B 2027/0138 (2013.01); G02B 2027/014 (2013.01); G02B 2027/0178 (2013.01); G06T 2219/2004 (2013.01); G06T 2219/2016 (2013.01); G09G 3/001 (2013.01); G09G 5/003 (2013.01)] | 19 Claims |

|
1. A method performed by a wearable device, the method comprising:
determining, by at least one processor of the wearable device, information related to the wearable device and a plurality of physical entities located in an environment proximate to and viewable through the wearable device, determination of the related information comprising determining a state of a game;
obtaining, by the at least one processor, image information representative of a plurality of virtual entities of the game, the image information representative of the plurality of virtual entities of the game being based at least on the determined related information comprising the determined state of the game;
generating, by the at least one processor and using the image information representative of the plurality of virtual entities of the game, a plurality of images of the plurality of virtual entities of the game as virtual entity display regions of a lens of the wearable device enabling the plurality of virtual entities of the game to appear to be present in the environment using the wearable device;
tracking, by the at least one processor, a position of a movable physical entity in the environment, the movable physical entity being visible through a transparent region of the lens of the wearable device;
optically aligning, by the at least one processor, a first image of a first virtual entity from among the plurality of virtual entities on the lens of the wearable device with the movable physical entity by overlaying at least a portion of the movable physical entity with the first image of the first virtual entity so that the first image of the first virtual entity conceals the at least a portion of the movable physical entity as the movable physical entity is being tracked; and
optically aligning, by the at least one processor, a second image of a second virtual entity from among the plurality of virtual entities on the lens of the wearable device with a non-movable physical entity by overlaying at least a portion of the non-movable physical entity with the second image of the second virtual entity so that the second image of the second virtual entity conceals the at least a portion of the non-movable physical entity.
|
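The tracking and optical-alignment steps recited above can be sketched as a minimal per-frame overlay computation. This is an illustrative assumption, not the claimed implementation: the pinhole-projection parameters, the `TrackedEntity` type, and the `overlay_region` helper are hypothetical names chosen for the sketch, which merely shows how a tracked 3D position of a movable physical entity could be re-projected each frame into a display region of the lens so that a virtual-entity image conceals at least a portion of that entity.

```python
from dataclasses import dataclass


@dataclass
class TrackedEntity:
    """3D position of a physical entity in the wearable device's frame (meters).

    Hypothetical type for illustration; the claim does not specify a coordinate model.
    """
    x: float
    y: float
    z: float  # depth along the viewing axis; must be positive


def project_to_lens(entity, focal_px=800.0, cx=640.0, cy=360.0):
    """Pinhole projection of a tracked 3D point onto lens display coordinates (pixels).

    The focal length and principal point are assumed example values.
    """
    u = cx + focal_px * entity.x / entity.z
    v = cy + focal_px * entity.y / entity.z
    return (u, v)


def overlay_region(entity, half_size_m=0.05, focal_px=800.0, cx=640.0, cy=360.0):
    """Axis-aligned virtual entity display region sized to conceal at least a
    portion of the physical entity behind the virtual-entity image."""
    u, v = project_to_lens(entity, focal_px, cx, cy)
    # Apparent half-size in pixels shrinks with distance, keeping the overlay aligned.
    half_px = focal_px * half_size_m / entity.z
    return (u - half_px, v - half_px, u + half_px, v + half_px)


# Per-frame loop: re-compute the region as the movable physical entity is tracked,
# so the first virtual entity's image stays optically aligned with it.
tracked_positions = [TrackedEntity(0.0, 0.0, 1.0), TrackedEntity(0.1, 0.0, 1.0)]
regions = [overlay_region(p) for p in tracked_positions]
```

A non-movable physical entity (the second alignment step of the claim) would use the same projection but could be computed once rather than per frame, since its position does not change while being concealed.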