CPC A61B 34/20 (2016.02) [A61B 34/30 (2016.02); A61B 90/39 (2016.02); B25J 13/089 (2013.01); G06T 7/248 (2017.01); G06T 7/80 (2017.01); G06T 11/003 (2013.01); G06T 19/006 (2013.01); A61B 2034/2051 (2016.02); A61B 2034/2055 (2016.02); A61B 2034/2065 (2016.02); A61B 2090/374 (2016.02); A61B 2090/3762 (2016.02); A61B 2090/378 (2016.02); G06T 2207/10081 (2013.01); G06T 2207/10088 (2013.01); G06T 2207/10116 (2013.01); G06T 2207/10132 (2013.01); G06T 2207/30004 (2013.01); G06T 2207/30204 (2013.01); G06T 2211/424 (2013.01)]

8 Claims

1. A method of providing surgical guidance and targeting in robotic surgery systems comprising an imager comprising a source, a detector having a marker, and a navigation system having a tracker, the method comprising the steps of:
capturing, by an imaging system, pre-operative image data of a patient at predetermined positions and orientations;
reconstructing 3D pre-operative image data from the captured pre-operative image data using a processing system;
capturing 2D intra-operative image data of the patient at the predetermined positions and orientations of the imager during a surgical procedure;
registering the 2D intra-operative image data and data from the navigation system in real time during the surgical procedure to track the position of a tool;
registering the 2D intra-operative image data with the 3D pre-operative image data, wherein the registering comprises:
calibrating one or more intrinsic parameters of the imager;
performing extrinsic calibration of the imager to localize the detector image plane in 3D with respect to a surgical site, the calibration comprising the steps of:
capturing two or more images of a space calibration object placed on a surface using the imager, the detector positioned at two or more predetermined locations and orientations, wherein the object comprises a spiral arrangement of reference indices embedded around a radio-transparent cylindrical structure and arranged around a camera axis and wherein the space calibration object (307) is a cylindrical object and is placed with a first reference index kept proximal to the tracker and wherein the cylindrical object has a reference marker placed proximal to the first reference index;
recording the position and displacement of the detector by a tracker attached to the detector, for each image capture;
identifying the location of the reference indices in each capture, wherein identifying the location of the reference indices is based on image processing techniques selected from thresholding or Hough transform;
computing a projection iteratively from a spiral canonical 3D coordinate system to each of the captured images and obtaining an Euler rotation for each image, wherein computing a projection comprises iterative optimization techniques selected from steepest descent, least-squares minimization, or Frobenius-norm minimization;
obtaining a transform that links the computed projections and the tracker recorded positions and displacements of the detector; and
positioning the pre-operative image data in the spiral canonical 3D coordinate system and applying one or more digital radiographic re-projections (DRR) to obtain a 2D projection of the pre-operative volume that is aligned with the 2D intra-operative images, wherein the positioning of the pre-operative volume in the spiral canonical 3D coordinate system and applying one or more digital radiographic re-projections (DRR) does not require placement of markers on the patient; and
augmenting the intra-operative image data with a rendering of the 3D pre-operative image data that is in registration with the real-time intra-operative 2D image data.
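The following is a minimal illustrative sketch (not part of the claim language) of the reference-index identification step recited above, assuming each capture is available as an 8-bit grayscale array and that the spiral indices appear as small, roughly circular radiopaque features; OpenCV's thresholding and circular Hough transform are used as one possible instance of the image processing techniques named in the claim, and all parameter values are assumptions.

```python
import cv2
import numpy as np


def detect_reference_indices(image_8u, min_radius=3, max_radius=15):
    """Locate candidate reference indices in one captured image.

    Thresholding followed by a circular Hough transform, as one possible
    instance of the image processing techniques named in the claim.
    Returns an (N, 2) array of (x, y) centers in pixel coordinates.
    """
    # Suppress noise before thresholding.
    blurred = cv2.GaussianBlur(image_8u, (5, 5), 0)

    # Otsu threshold (inverted) so the dark radiopaque indices become foreground.
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

    # Circular Hough transform on the thresholded image.
    circles = cv2.HoughCircles(binary, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=2 * min_radius, param1=100, param2=15,
                               minRadius=min_radius, maxRadius=max_radius)
    if circles is None:
        return np.empty((0, 2), dtype=np.float32)
    return circles[0, :, :2]  # drop the radius column
```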
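Likewise, a sketch of the iterative projection computation, assuming the canonical 3D positions of the spiral indices are known from the geometry of the space calibration object, that detected 2D centers have already been matched to those 3D points, and that a pinhole model with the intrinsic matrix from the intrinsic calibration step applies; the Euler-angle parameterization and the SciPy least-squares solver are illustrative choices among the optimization techniques named in the claim.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def estimate_projection(points_3d, points_2d, K):
    """Estimate the Euler rotation and translation projecting the spiral's
    canonical 3D reference indices onto their detected 2D image locations.

    points_3d : (N, 3) index positions in the spiral canonical coordinate system
    points_2d : (N, 2) detected index centers in pixels, matched to points_3d
    K         : (3, 3) intrinsic matrix from the intrinsic calibration step
    """
    def reprojection_residuals(params):
        euler, t = params[:3], params[3:6]
        R = Rotation.from_euler("xyz", euler).as_matrix()
        cam = points_3d @ R.T + t          # transform into the detector frame
        proj = cam @ K.T                   # apply the pinhole intrinsics
        proj = proj[:, :2] / proj[:, 2:3]  # perspective divide
        return (proj - points_2d).ravel()

    # Iterative least-squares minimization of the reprojection error.
    x0 = np.zeros(6)
    x0[5] = 1000.0  # rough initial source-to-object distance in mm (assumed)
    result = least_squares(reprojection_residuals, x0)
    euler_deg = np.degrees(result.x[:3])
    return euler_deg, result.x[3:6]
```

The Euler rotation and translation recovered for each capture could then be paired with the tracker-recorded detector pose for that capture to solve for the fixed transform linking the computed projections to the tracker frame, for example as a hand-eye-style calibration.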