CPC A61B 34/20 (2016.02) [A61B 34/10 (2016.02); A61B 90/361 (2016.02); A61B 90/37 (2016.02); G06F 3/011 (2013.01); G06T 7/20 (2013.01); G06T 7/70 (2017.01); G06T 17/00 (2013.01); G06T 19/006 (2013.01); G06T 19/20 (2013.01); A61B 2034/102 (2016.02); A61B 2034/105 (2016.02); A61B 2034/2055 (2016.02); A61B 2090/365 (2016.02); A61B 2090/367 (2016.02); A61B 2090/372 (2016.02); G06T 2207/30012 (2013.01); G06T 2207/30052 (2013.01); G06T 2210/41 (2013.01); G06T 2219/008 (2013.01); G06T 2219/2004 (2013.01); G06T 2219/2012 (2013.01); G06T 2219/2016 (2013.01)] | 12 Claims |
1. A method by a navigated surgery system, the method comprising:
obtaining a first two-dimensional (2D) medical image slice of a portion of an internal anatomical structure of a patient;
obtaining a three-dimensional (3D) graphical model of the internal anatomical structure;
determining a first pose of a first virtual cross-sectional plane extending through the 3D graphical model of the internal anatomical structure, wherein the first virtual cross-sectional plane corresponds to the portion of the internal anatomical structure imaged by the first 2D medical image slice; and
controlling an extended reality (XR) headset to simultaneously display on a display device the first 2D medical image slice of the internal anatomical structure of the patient, the 3D graphical model of the internal anatomical structure, and a first graphical object oriented with the first pose relative to the 3D graphical model of the internal anatomical structure,
wherein the XR headset is configured to track hand poses and gestures to enable gesture-based interactions with virtual buttons and interfaces displayed through the XR headset and configured to interpret hand gesturing as defined commands.
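The pose-determination step recited in claim 1 can be illustrated with a short sketch. This is only an illustration, not the patented implementation: the function slice_plane_pose and its inputs are hypothetical names chosen here, and the math simply builds a 4x4 pose for the virtual cross-sectional plane from a slice's origin and in-plane direction cosines (e.g., as carried in the DICOM ImagePositionPatient and ImageOrientationPatient attributes), assuming the slice and the 3D graphical model share a common coordinate frame.

```python
# Illustrative sketch only; names and inputs are hypothetical and do not
# reflect any particular navigated-surgery system or SDK.
import numpy as np

def slice_plane_pose(slice_origin_mm, row_dir, col_dir):
    """Build a 4x4 pose for a virtual cross-sectional plane matching a 2D
    image slice, from the slice origin and in-plane direction cosines."""
    row_dir = np.asarray(row_dir, dtype=float)
    col_dir = np.asarray(col_dir, dtype=float)
    normal = np.cross(row_dir, col_dir)  # plane normal from the two in-plane axes
    pose = np.eye(4)
    pose[:3, 0] = row_dir / np.linalg.norm(row_dir)
    pose[:3, 1] = col_dir / np.linalg.norm(col_dir)
    pose[:3, 2] = normal / np.linalg.norm(normal)
    pose[:3, 3] = np.asarray(slice_origin_mm, dtype=float)  # plane origin in model space
    return pose

# Example: an axial slice positioned 40 mm along the model's z-axis.
pose = slice_plane_pose(slice_origin_mm=(0.0, 0.0, 40.0),
                        row_dir=(1.0, 0.0, 0.0),
                        col_dir=(0.0, 1.0, 0.0))
print(pose)
```

In this sketch the resulting 4x4 matrix is what claim 1 calls the "first pose": it orients a graphical object (the cross-sectional plane) relative to the 3D graphical model so that it coincides with the anatomy imaged by the first 2D slice.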