US 11,656,690 B2
User input and virtual touch pad in augmented reality for use in surgical settings
Long Qian, Brooklyn, NY (US); Wenbo Lan, Brooklyn, NY (US); Christopher Morley, New York, NY (US); and Osamah Choudhry, New York, NY (US)
Assigned to Medivis, Inc., New York, NY (US)
Filed by Medivis, Inc., New York, NY (US)
Filed on Apr. 18, 2022, as Appl. No. 17/723,437.
Application 17/723,437 is a continuation-in-part of application No. 17/194,191, filed on Mar. 5, 2021, granted, now 11,307,653.
Prior Publication US 2022/0283647 A1, Sep. 8, 2022
Int. Cl. G06F 3/01 (2006.01); G06T 19/00 (2011.01); G06F 3/04845 (2022.01); G06F 3/04815 (2022.01)
CPC G06F 3/017 (2013.01) [G06F 3/04815 (2013.01); G06F 3/04845 (2013.01); G06T 19/006 (2013.01); G06T 2210/41 (2013.01)] 18 Claims
OG exemplary drawing
 
1. A computer-implemented method, comprising:
generating, within a unified three-dimensional (3D) coordinate space:
(i) a virtual 3D medical model positioned according to a current model pose, the current model pose representing a position and orientation of the virtual 3D medical model in the unified 3D coordinate space; and
(ii) at least one virtual 3D hand representation;
rendering, via an Augmented Reality (AR) headset device, an AR display that includes display of the virtual 3D medical model positioned according to the current model pose and the virtual 3D hand representation;
detecting a first physical gesture;
based on the first physical gesture, identifying selection of a type of slate virtual interaction from a plurality of types of slate virtual interactions, wherein each respective type of slate virtual interaction corresponds to a different type of modification applied to the display of the virtual 3D medical model responsive to detection of one or more physical gestures with respect to a virtual slate;
in response to selection of the type of slate virtual interaction, rendering in the AR display an instance of the virtual slate (“rendered virtual slate”) that corresponds with functionality of the selected type of slate virtual interaction, the rendered virtual slate comprising an AR touchpad concurrently displayed, at a first display position, with the virtual 3D medical model;
detecting a second physical gesture with respect to the virtual slate; and
modifying the AR display by:
(i) displaying a handle overlay at a first handle display position within the virtual slate, the first handle display position based on a projection of a first position and orientation of the virtual 3D hand representation (“first hand position”);
(ii) displaying the handle overlay at a subsequent second handle display position within the virtual slate, the second handle display position based on a projection of a result of a movement from the first hand position to a second position and orientation of the virtual 3D hand representation (“second hand position”); and
(iii) during the movement from the first hand position to the second hand position, concurrently adjusting the display of at least a portion of model data of the virtual 3D medical model, according to the selected type of slate virtual interaction, based at least on an extent of the movement from the first hand position to the second hand position.
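Steps (i) and (ii) of the claim place a handle overlay on the virtual slate at a position derived from a projection of the tracked hand pose. The claim does not prescribe an implementation; the following is a minimal sketch in which the slate is modeled as a plane with orthonormal in-plane axes, and the handle position is the hand's offset from the slate origin dotted with those axes. All function and variable names here are illustrative assumptions, not from the patent.

```python
import numpy as np

def slate_handle_position(hand_pos, slate_origin, slate_u, slate_v):
    """Project a tracked 3D hand position onto the plane of the virtual
    slate, returning 2D (u, v) slate coordinates for the handle overlay.

    slate_u and slate_v are orthonormal in-plane axes of the slate in the
    unified 3D coordinate space; slate_origin is a point on the slate.
    """
    offset = np.asarray(hand_pos, dtype=float) - np.asarray(slate_origin, dtype=float)
    # Dotting with the in-plane axes implicitly drops any displacement
    # along the slate normal, i.e. this is an orthogonal projection.
    return float(np.dot(offset, slate_u)), float(np.dot(offset, slate_v))
```

Calling this once with the first hand position and again with the second yields the first and second handle display positions of steps (i) and (ii); their difference gives the movement extent used in step (iii).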