| CPC A63F 13/52 (2014.09) [A63F 3/00643 (2013.01); A63F 3/00697 (2013.01); A63F 13/23 (2014.09); A63F 13/235 (2014.09); A63F 13/24 (2014.09); A63F 13/25 (2014.09); A63F 13/40 (2014.09); A63F 13/44 (2014.09); A63F 13/53 (2014.09); A63F 13/537 (2014.09); A63F 13/5375 (2014.09); A63F 13/5378 (2014.09); A63F 13/57 (2014.09); A63F 13/65 (2014.09); A63F 13/655 (2014.09); A63F 13/95 (2014.09); G06F 3/011 (2013.01); G06F 3/038 (2013.01); G06F 3/048 (2013.01); G06F 3/0482 (2013.01); G06F 3/0488 (2013.01); G06T 19/006 (2013.01); A63F 2009/2482 (2013.01); A63F 2009/2486 (2013.01); A63F 2009/2488 (2013.01); A63F 2300/308 (2013.01); A63F 2300/8082 (2013.01); G06F 2203/0383 (2013.01)] | 19 Claims |

1. A method comprising:
displaying, by a computing device, a game element within a user interface of the computing device, the game element comprising a representation of a physical game piece situated on a playing surface;
detecting, by the computing device, a user interaction with the game element using computer vision-based image recognition to identify the physical game piece and its position on the playing surface;
displaying, by the computing device, a list of available interactions in the user interface, the list of available interactions comprising user interface elements overlaid on an image of the game element and selectable by the user;
receiving, by the computing device, a selection of a selected interaction in the list of available interactions by detecting a user input directed to the user interface elements;
determining, by the computing device, positions of the game element and at least one other game element on the playing surface via an image recognition algorithm;
executing, by the computing device, the selected interaction by:
generating a visualization of a field of play based on the determined positions of the game element and the at least one other game element;
coordinating displays across multiple computing devices to show different perspective views of a physical game space;
effectuating both virtual changes in the visualization and positional changes to the physical game piece in response to the selected interaction using a communication protocol to transmit control signals to the physical game piece; and
transforming real-world objects on the playing surface into augmented reality terrain elements having virtual appearances that differ from their real-world appearances;
transmitting, by the computing device, position data and interaction data to a second computing device to enable synchronized rendering of the visualization and positional changes across the multiple computing devices; and
updating the visualization on each computing device based on its respective perspective view of the physical game space, wherein the updating includes dynamically adjusting the visualization in response to detected physical movements of each computing device relative to the physical game space.
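The position-determining step of claim 1 (locating the game element and at least one other game element on the playing surface via image recognition) can be illustrated with a minimal sketch. This is a hypothetical simplification, not the claimed algorithm: it assumes the camera frame has already been segmented into a top-down grid where each cell holds 0 for empty surface or a positive piece ID, and it recovers each piece's centroid in playing-surface coordinates.

```python
# Hypothetical sketch of the position-determination step: given a
# segmented top-down image of the playing surface (a 2D grid where each
# cell is 0 for empty or a positive integer piece ID), locate each
# piece's centroid and convert it to surface coordinates.

def find_piece_positions(segmented, cell_size=1.0):
    """Return {piece_id: (x, y)} centroids in surface units.

    `segmented` is a list of rows; the format is assumed for
    illustration only.
    """
    sums = {}  # piece_id -> [sum_col, sum_row, count]
    for r, row in enumerate(segmented):
        for c, piece_id in enumerate(row):
            if piece_id:
                acc = sums.setdefault(piece_id, [0, 0, 0])
                acc[0] += c
                acc[1] += r
                acc[2] += 1
    return {
        pid: ((sc / n) * cell_size, (sr / n) * cell_size)
        for pid, (sc, sr, n) in sums.items()
    }

grid = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 2],
]
print(find_piece_positions(grid))  # {1: (1.5, 1.5), 2: (3.0, 3.0)}
```

A production system would instead detect pieces in the raw camera image (e.g. via fiducial markers or feature matching) and map pixel coordinates to the surface through a homography, but the output contract is the same: piece identities paired with surface positions.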
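The executing step recites "a communication protocol to transmit control signals to the physical game piece." The claim does not specify a wire format, so the sketch below assumes a hypothetical one: a 1-byte opcode, a 1-byte piece ID, and two little-endian floats giving the target position on the playing surface.

```python
import struct

# Hypothetical wire format for a move command sent to a motorized game
# piece (the claim's "control signals to the physical game piece").
# Layout is an assumption for illustration: opcode, piece ID, target x, y.
OP_MOVE = 0x01

def encode_move(piece_id, x, y):
    """Pack a move command into 10 bytes: <opcode, id, x, y>."""
    return struct.pack("<BBff", OP_MOVE, piece_id, x, y)

def decode_move(payload):
    """Unpack a move command; raises ValueError on a foreign opcode."""
    op, piece_id, x, y = struct.unpack("<BBff", payload)
    if op != OP_MOVE:
        raise ValueError("unexpected opcode")
    return piece_id, x, y

msg = encode_move(7, 3.0, 3.0)
print(decode_move(msg))  # (7, 3.0, 3.0)
```

In practice such a payload would travel over whatever link the piece supports (e.g. Bluetooth LE or a radio bridge); the point is only that effectuating a positional change reduces to serializing a target position for a specific piece.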
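The transmitting and updating steps (sharing position and interaction data, then rendering a device-specific perspective view) can likewise be sketched. The message schema and pose fields below are assumptions: shared world-space positions are serialized once, and each receiving device re-projects them into its own local frame using its position and heading relative to the physical game space.

```python
import json
import math

# Sketch of the synchronization step: one device serializes piece
# positions plus the selected interaction; each receiving device
# re-projects the shared world coordinates into its own view frame.
# Field names ("positions", "interaction", pose keys) are assumptions.

def make_update(positions, interaction):
    """Serialize shared state for transmission to other devices."""
    return json.dumps({"positions": positions, "interaction": interaction})

def world_to_device(point, device_pose):
    """Rotate/translate a world-space (x, y) into the device's frame."""
    dx = point[0] - device_pose["x"]
    dy = point[1] - device_pose["y"]
    c = math.cos(-device_pose["heading"])
    s = math.sin(-device_pose["heading"])
    return (c * dx - s * dy, s * dx + c * dy)

def apply_update(message, device_pose):
    """Deserialize shared state and return device-local piece positions."""
    state = json.loads(message)
    return {
        pid: world_to_device(xy, device_pose)
        for pid, xy in state["positions"].items()
    }

msg = make_update({"1": (1.5, 1.5)}, "move")
local = apply_update(msg, {"x": 1.5, "y": 0.0, "heading": 0.0})
```

Re-running `apply_update` whenever the device's tracked pose changes gives the claim's dynamic adjustment "in response to detected physical movements of each computing device": the shared world state is fixed, and only the per-device projection is recomputed.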