US 12,220,176 B2
Extended reality instrument interaction zone for navigated robotic surgery
Thomas Calloway, Pelham, NH (US); Isaac Dulin, Somerville, MA (US); Keiichi Matsuda, London (GB); Christine Russ, Stoneham, MA (US); Keerthighaan Kanagasegar, Norristown, PA (US); and Amelia Raphaelson, Upperville, VA (US)
Filed by GLOBUS MEDICAL, INC., Audubon, PA (US)
Filed on Feb. 4, 2020, as Appl. No. 16/781,200.
Application 16/781,200 is a continuation-in-part of application No. 16/709,185, filed on Dec. 10, 2019.
Prior Publication US 2021/0169581 A1, Jun. 10, 2021
Int. Cl. A61B 34/20 (2016.01); A61B 34/00 (2016.01); A61B 34/30 (2016.01); G02B 27/01 (2006.01); G06F 3/01 (2006.01)
CPC A61B 34/20 (2016.02) [A61B 34/25 (2016.02); A61B 34/30 (2016.02); G02B 27/017 (2013.01); G06F 3/011 (2013.01); A61B 2034/2055 (2016.02)] 13 Claims
OG exemplary drawing
 
1. A surgical system comprising:
an extended reality (“XR”) headset configured to be worn by a user during a surgical procedure and including a see-through display screen configured to display a world-registered XR image including a 3D image of an anatomical structure which is derived from a medical imaging system and to allow at least a portion of a real-world scene including an actual patient to pass therethrough for viewing by the user;
a tracking system configured to determine a real-world pose of the XR headset and a real-world pose of a real-world element, the real-world pose of the XR headset and the real-world pose of the real-world element being determined relative to a real-world coordinate system; and
an XR headset controller configured to generate the world-registered XR image based on the real-world pose of the XR headset;
wherein the XR headset controller is configured to generate and display on the see-through display screen the world-registered XR image such that the displayed XR image remains in place as a head of the user moves so as to allow the user to simultaneously see the actual patient and the world-registered XR image floating near the patient;
wherein a portion of the real-world coordinate system is interpreted by the XR headset controller as defining an interaction zone;
wherein the XR headset controller is further configured to determine whether the real-world pose of the real-world element is within the interaction zone;
wherein the XR headset controller is configured to respond to a determination that the real-world pose of the real-world element is within the interaction zone by generating a virtual element within the world-registered XR image;
wherein the XR headset further includes a gesture sensor configured to output to the XR headset controller an indication of a hand gesture formed by the user, and
wherein the XR headset controller is further configured to define and/or modify a dimension, a pose, and/or a type of the interaction zone based on the indication of the hand gesture formed by the user.
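The world-registration recited above (the displayed XR image remaining in place as the head of the user moves) reduces, in rendering terms, to applying the inverse of the tracked headset pose as the view transform on every frame. A minimal sketch follows, assuming the tracking system reports the headset pose as a 4x4 homogeneous matrix in the real-world coordinate system; the function and variable names are illustrative and not drawn from the patent.

    import numpy as np

    def world_to_view(headset_pose_world: np.ndarray) -> np.ndarray:
        """Invert the tracked headset pose (headset -> world) to obtain the
        view matrix (world -> headset). Rendering world-anchored content,
        such as the 3D image of the anatomical structure, with this matrix
        keeps it fixed in the real-world coordinate system as the head moves."""
        R = headset_pose_world[:3, :3]   # rotation block of the rigid pose
        t = headset_pose_world[:3, 3]    # translation block
        view = np.eye(4)
        view[:3, :3] = R.T               # a rotation's inverse is its transpose
        view[:3, 3] = -R.T @ t
        return view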
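The containment determination (whether the real-world pose of the real-world element is within the interaction zone) is a point-in-region test in the real-world coordinate system. The sketch below assumes the zone is an axis-aligned box; that zone type, and the scene interface used to generate or remove the virtual element, are assumptions for illustration only.

    from dataclasses import dataclass

    @dataclass
    class BoxZone:
        """Hypothetical axis-aligned box zone in real-world coordinates."""
        min_corner: tuple[float, float, float]
        max_corner: tuple[float, float, float]

        def contains(self, position: tuple[float, float, float]) -> bool:
            return all(lo <= p <= hi
                       for p, lo, hi in zip(position, self.min_corner, self.max_corner))

    def update_virtual_element(zone: BoxZone, element_position, scene) -> None:
        # Generate the virtual element within the world-registered XR image
        # only while the tracked real-world element is inside the zone.
        if zone.contains(element_position):
            scene.show_virtual_element(element_position)
        else:
            scene.hide_virtual_element()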
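The final two limitations tie the zone's definition to hand gestures: the gesture sensor reports a recognized gesture, and the controller modifies the zone's dimension, pose, and/or type accordingly. The gesture vocabulary and fields below (kind, delta, scale) are assumptions; the patent does not prescribe particular gestures.

    def apply_gesture(zone: BoxZone, gesture) -> BoxZone:
        """Modify the interaction zone from a recognized hand gesture."""
        if gesture.kind == "pinch_drag":
            # Translate the zone: modifies its pose.
            zone.min_corner = tuple(c + d for c, d in zip(zone.min_corner, gesture.delta))
            zone.max_corner = tuple(c + d for c, d in zip(zone.max_corner, gesture.delta))
        elif gesture.kind == "spread":
            # Rescale symmetrically about the zone's center: modifies its dimension.
            s = gesture.scale
            center = [(lo + hi) / 2 for lo, hi in zip(zone.min_corner, zone.max_corner)]
            zone.min_corner = tuple(c + (lo - c) * s for c, lo in zip(center, zone.min_corner))
            zone.max_corner = tuple(c + (hi - c) * s for c, hi in zip(center, zone.max_corner))
        return zone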