US 12,318,150 B2
Camera tracking system for computer assisted surgery navigation
Thomas Calloway, Pelham, NH (US); Sanjay Joshi, Andover, MA (US); Tushar Sawant, Newton, MA (US); Rand Kmiec, Nashua, NH (US); and Norbert Johnson, North Andover, MA (US)
Assigned to Globus Medical Inc., Audubon, PA (US)
Filed by GLOBUS MEDICAL, INC., Audubon, PA (US)
Filed on Oct. 11, 2022, as Appl. No. 18/045,474.
Prior Publication US 2024/0115325 A1, Apr. 11, 2024
Int. Cl. A61B 34/20 (2016.01); A61B 90/00 (2016.01); G06T 7/80 (2017.01)
CPC A61B 34/20 (2016.02) [A61B 90/361 (2016.02); G06T 7/85 (2017.01); A61B 2034/2055 (2016.02); A61B 2090/3937 (2016.02); A61B 2090/3983 (2016.02); G06T 2207/10028 (2013.01); G06T 2207/20081 (2013.01); G06T 2207/30004 (2013.01); G06T 2207/30204 (2013.01); G06T 2207/30244 (2013.01)] 6 Claims
OG exemplary drawing
 
1. A camera tracking system for computer assisted navigation during surgery, comprising at least one processor operative to:
identify locations of markers of a reference array in a set of images obtained from tracking cameras imaging a real device with at least partially overlapping fields-of-view;
determine measured coordinate locations of a feature of the real device in the set of the images based on the identified locations of the markers and based on a relative location relationship between the markers and the feature;
process a region of interest in the set of the images identified based on the measured coordinate locations through a neural network configured to output a prediction of coordinate locations of the feature in the set of the images, wherein the neural network has been trained based on training images containing the feature of a computer model rendered at known coordinate locations;
track pose of the feature of the real device in three-dimensional (3D) space based on the prediction of coordinate locations of the feature of the real device in the set of the images;
determine a predicted 3D pose of the feature of the real device in a tracked space based on triangulation of the prediction of two-dimensional (2D) coordinate locations of the feature of the real device in a pair of the set of the images from a pair of the tracking cameras;
determine a measured 3D pose of the feature of the real device in the tracked space based on triangulation of the locations of the markers of the reference array in the pair of the set of the images; and
calibrate a feature offset based on comparison of the predicted 3D pose of the feature and the measured 3D pose of the feature.
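The final three clauses of the claim describe a concrete geometric pipeline: triangulate the neural network's 2D feature predictions from a stereo pair into a predicted 3D pose, triangulate the marker-derived 2D locations into a measured 3D pose, and take their discrepancy as the calibrated feature offset. A minimal sketch of that step is shown below using standard linear (DLT) triangulation; the camera intrinsics, baseline, and 3D points are all hypothetical values chosen for illustration, not parameters from the patent.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from a stereo pair.

    P1, P2 are 3x4 camera projection matrices; x1, x2 are the 2D pixel
    coordinates of the same feature in each camera's image.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The 3D point is the null vector of A (last right singular vector).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point through a 3x4 projection matrix to 2D pixels."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Hypothetical stereo rig: shared intrinsics K, 10 cm baseline along x.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

# Ground-truth feature location (standing in for the network's 2D
# prediction) and a marker-derived estimate displaced by 2 mm along x,
# simulating an error in the assumed marker-to-feature relationship.
X_pred_true = np.array([0.05, -0.02, 1.0])
X_meas_true = X_pred_true + np.array([0.002, 0.0, 0.0])

pred_3d = triangulate(P1, P2, project(P1, X_pred_true), project(P2, X_pred_true))
meas_3d = triangulate(P1, P2, project(P1, X_meas_true), project(P2, X_meas_true))

# Calibrated feature offset: discrepancy between the NN-predicted and
# marker-measured 3D poses of the same feature.
offset = pred_3d - meas_3d
```

With noise-free projections the triangulation is exact, so `offset` recovers the simulated 2 mm marker-to-feature discrepancy; in practice the claim's neural network prediction and marker measurement would each carry noise, and the offset would be estimated over many frames.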