| CPC A61B 34/20 (2016.02) [G06T 17/20 (2013.01); G06T 19/003 (2013.01); G06T 19/006 (2013.01); G16H 40/60 (2018.01); A61B 2034/2046 (2016.02)] | 14 Claims |

|
1. A computer-implemented method, comprising:
generating within a unified three-dimensional (3D) coordinate space:
(i) a 3D virtual medical model positioned according to a model pose; and
(ii) at least one virtual object associated with a physical instrument, the physical instrument having a current instrument pose in the unified 3D coordinate space based at least on current coordinates of one or more fiducial markers disposed on the physical instrument;
rendering an Augmented Reality (AR) display that includes concurrent display of the 3D virtual medical model and the virtual object;
detecting one or more physical gestures associated with the physical instrument;
identifying a change to the current instrument pose, in the unified 3D coordinate space, resulting from at least one of the detected physical gestures associated with the physical instrument; and
modifying the AR display according to at least one virtual interaction related to the virtual object that incorporates the change of the current instrument pose, wherein modifying the AR display comprises:
(i) generating a virtual trajectory of the physical instrument based on a target point at an internal anatomical location represented by the 3D virtual medical model and an entry point of the 3D virtual medical model;
(ii) displaying the virtual trajectory of the physical instrument in relation to the target point; and
(iii) displaying a visual indicator providing a first visual cue indicating the virtual trajectory of the physical instrument is currently misaligned with a planned virtual trajectory and a second visual cue upon determining the virtual trajectory of the physical instrument has moved into alignment with the planned virtual trajectory.
|
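The alignment check recited in steps (i)-(iii) can be sketched as follows: a planned trajectory is derived from the entry point and target point, the instrument's current direction is compared against it, and a visual cue is selected based on angular deviation. This is a minimal illustrative sketch, not the claimed implementation; all function names and the angular tolerance are assumptions introduced here.

```python
import math

def _unit(v):
    # Normalize a 3D vector to unit length.
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def trajectory_direction(entry_point, target_point):
    # Direction of the planned virtual trajectory, from the entry point
    # on the 3D virtual medical model to the internal target point.
    return _unit(tuple(t - e for t, e in zip(target_point, entry_point)))

def alignment_cue(instrument_dir, planned_dir, tolerance_deg=2.0):
    # Compare the instrument's virtual trajectory against the planned one;
    # the 2-degree tolerance is an illustrative assumption.
    a, b = _unit(instrument_dir), _unit(planned_dir)
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    angle = math.degrees(math.acos(dot))
    # First visual cue while misaligned, second cue once aligned.
    return "aligned" if angle <= tolerance_deg else "misaligned"
```

In use, the planned direction would be computed once from the surgical plan, while `alignment_cue` would be re-evaluated whenever a detected gesture changes the instrument pose, driving which visual cue the AR display renders.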