US 12,201,375 B2
Extended reality systems for visualizing and controlling operating room equipment
Michael Robinson, Concord, NH (US); Thomas Calloway, Pelham, NH (US); Isaac Dulin, Somerville, MA (US); and Mir Hussain, Downingtown, PA (US)
Assigned to Globus Medical Inc., Audubon, PA (US)
Filed by GLOBUS MEDICAL, INC., Audubon, PA (US)
Filed on Sep. 16, 2021, as Appl. No. 17/476,689.
Prior Publication US 2023/0078919 A1, Mar. 16, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 3/01 (2006.01); A61B 34/00 (2016.01); A61B 34/20 (2016.01); A61B 34/30 (2016.01); A61B 90/00 (2016.01); A61B 90/50 (2016.01); G06T 19/00 (2011.01); G06V 20/40 (2022.01); G06V 40/10 (2022.01); G06V 40/20 (2022.01); A61B 34/10 (2016.01)
CPC A61B 34/20 (2016.02) [A61B 34/25 (2016.02); A61B 34/30 (2016.02); A61B 90/361 (2016.02); A61B 90/37 (2016.02); A61B 90/50 (2016.02); G06F 3/017 (2013.01); G06T 19/00 (2013.01); G06V 20/46 (2022.01); G06V 40/107 (2022.01); G06V 40/28 (2022.01); A61B 2034/105 (2016.02); A61B 2034/107 (2016.02); A61B 2034/2057 (2016.02); A61B 2034/2065 (2016.02); A61B 2034/258 (2016.02); A61B 2090/365 (2016.02); A61B 2090/502 (2016.02)] 5 Claims
OG exemplary drawing
 
1. A camera tracking system comprising at least one processor operative to:
receive equipment reference tracking information indicating poses of medical equipment items and a patient reference array tracked by a tracking camera relative to a reference frame;
determine an extended reality (XR) headset view pose transform between an XR headset reference frame of an XR headset and the reference frame using the equipment reference tracking information;
obtain operator-gesture tracking information from the tracking camera indicating movement of an object relative to the XR headset reference frame by an operator wearing the XR headset;
select an operational command from among a set of operational commands based on the operator-gesture tracking information; and
provide instructions to one of the medical equipment items based on the operational command that is selected, wherein the at least one processor is further operative to:
determine a gesture path relative to the XR headset reference frame based on processing the operator-gesture tracking information through the XR headset view pose transform,
select the operational command from among the set of operational commands based on identifying that the gesture path corresponds to a defined gesture associated with the operational command, wherein the operational commands in the set are associated with different-shaped gesture paths,
wherein the at least one processor is further operative to:
select, from among the set of operational commands, the operational command for relocating an end effector connected to a surgical robot arm that is movable under control of a surgical robot system, based on the gesture path corresponding to the defined gesture associated with the operational command for relocating the end effector;
determine a present pose of the end effector based on end effector tracking information indicating a pose of the end effector tracked by the tracking camera relative to the reference frame;
control movement of the end effector by the surgical robot system from the present pose to a target pose relative to the reference frame based on the operational command for relocating the end effector;
determine a planned end effector trajectory path from the present pose to the target pose based on at least a segment of the gesture path; and
control movement of the end effector by the surgical robot system to conform to the planned end effector trajectory path from the present pose to the target pose.
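The claim does not tie the XR headset view pose transform to any particular representation. The sketch below, in Python with NumPy, assumes the common convention of 4x4 homogeneous matrices: the tracking camera reports the pose of the headset's reference array in the tracking reference frame, and inverting that rigid-body pose yields the transform that re-expresses reference-frame points in the XR headset reference frame. All names (make_pose, invert_pose, headset_view_pose_transform) are illustrative, not from the patent.

```python
import numpy as np

def make_pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous pose from a 3x3 rotation and a 3-vector."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def invert_pose(pose: np.ndarray) -> np.ndarray:
    """Invert a rigid-body pose without a general matrix inverse."""
    rot_t = pose[:3, :3].T
    inv = np.eye(4)
    inv[:3, :3] = rot_t
    inv[:3, 3] = -rot_t @ pose[:3, 3]
    return inv

def headset_view_pose_transform(ref_T_headset: np.ndarray) -> np.ndarray:
    """Given the headset reference array's pose in the tracking-camera
    reference frame, return the transform that maps reference-frame
    points into the XR headset reference frame."""
    return invert_pose(ref_T_headset)

# Example: a tracked point on a piece of equipment, expressed in the
# reference frame, re-expressed in the headset frame for display.
ref_T_headset = make_pose(np.eye(3), np.array([0.1, 0.0, 1.5]))
headset_T_ref = headset_view_pose_transform(ref_T_headset)
point_ref = np.array([0.3, -0.2, 2.0, 1.0])   # homogeneous coordinates
point_headset = headset_T_ref @ point_ref
```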
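Likewise, the claim requires only that the gesture path be matched against the different-shaped gesture paths associated with the commands; it does not name a matching algorithm. A minimal template-matching sketch, assuming gesture samples are 3-D points already mapped into the headset frame through the view pose transform: resample each path evenly by arc length, normalize translation and scale so only shape matters, and pick the command whose template lies closest under a distance threshold. The helper names and the threshold value are hypothetical.

```python
import numpy as np

def resample(path: np.ndarray, n: int = 32) -> np.ndarray:
    """Resample a polyline of 3-D points to n points evenly spaced by
    arc length, so paths traced at different speeds become comparable."""
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, s[-1], n)
    out = np.empty((n, 3))
    for axis in range(3):
        out[:, axis] = np.interp(targets, s, path[:, axis])
    return out

def normalize(path: np.ndarray) -> np.ndarray:
    """Translate the path's centroid to the origin and scale it to unit
    size, so matching depends only on the path's shape."""
    centered = path - path.mean(axis=0)
    scale = np.linalg.norm(centered, axis=1).max()
    return centered / scale if scale > 0 else centered

def select_command(gesture_path, templates, threshold=0.25):
    """Return the command whose template shape is nearest to the gesture
    path, or None if nothing matches closely enough. Note this simple
    matcher is not rotation invariant."""
    probe = normalize(resample(np.asarray(gesture_path, dtype=float)))
    best_cmd, best_dist = None, np.inf
    for command, template in templates.items():
        ref = normalize(resample(np.asarray(template, dtype=float)))
        dist = np.linalg.norm(probe - ref, axis=1).mean()
        if dist < best_dist:
            best_cmd, best_dist = command, dist
    return best_cmd if best_dist < threshold else None
```

A templates table might map, say, a circular path to a "relocate end effector" command; returning None models the case where no defined gesture is recognized and no instruction is issued to the equipment.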
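Finally, the claim plans the end effector trajectory based on at least a segment of the gesture path, without fixing the planning method. One plausible reading, sketched below under stated assumptions: warp the gesture segment so its endpoints coincide with the present and target positions, preserving its lateral shape, and blend orientation between the present and target poses by spherical interpolation (here via SciPy's Rotation and Slerp). The warping scheme is one of many possibilities, not the patent's disclosed method.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def planned_trajectory(present_pose, target_pose, gesture_segment):
    """Warp a segment of the gesture path so that it runs from the
    present end effector position to the target position, blending
    orientation from present to target along the way. Poses are 4x4
    homogeneous matrices expressed in the tracking reference frame."""
    seg = np.asarray(gesture_segment, dtype=float)
    p0, p1 = present_pose[:3, 3], target_pose[:3, 3]
    g0, g1 = seg[0], seg[-1]
    span = g1 - g0
    denom = float(span @ span)

    # Spherical interpolation between the two key orientations.
    slerp = Slerp([0.0, 1.0], Rotation.from_matrix(
        np.stack([present_pose[:3, :3], target_pose[:3, :3]])))

    waypoints = []
    for g in seg:
        # Fraction of progress along the gesture's chord, clipped so the
        # orientation interpolation stays within its key frames.
        t = float(np.clip((g - g0) @ span / denom, 0.0, 1.0)) if denom > 0 else 0.0
        offset = (g - g0) - t * span   # the gesture's lateral deviation
        pose = np.eye(4)
        pose[:3, :3] = slerp([t]).as_matrix()[0]
        pose[:3, 3] = p0 + t * (p1 - p0) + offset
        waypoints.append(pose)
    return waypoints
```

The returned waypoint poses would then be handed to the surgical robot system's motion controller so the end effector conforms to the planned path from the present pose to the target pose.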