US 12,456,218 B2
Generation of three-dimensional scans for intraoperative imaging
Yang Liu, Iowa City, IA (US); and Maziyar Askari Karchegani, Coralville, IA (US)
Assigned to UNIFY MEDICAL, INC., Cleveland, OH (US)
Filed by Unify Medical, Inc., Cleveland, OH (US)
Filed on Oct. 15, 2024, as Appl. No. 18/916,579.
Application 18/916,579 is a continuation of application No. 18/358,396, filed on Jul. 25, 2023, granted, now 12,118,738.
Application 18/358,396 is a continuation of application No. 17/129,691, filed on Dec. 21, 2020, granted, now 11,710,249, issued on Jul. 25, 2023.
Claims priority of provisional application 63/040,816, filed on Jun. 18, 2020.
Claims priority of provisional application 62/951,480, filed on Dec. 20, 2019.
Prior Publication US 2025/0037292 A1, Jan. 30, 2025
Int. Cl. G06T 7/593 (2017.01); A61B 34/20 (2016.01); G06T 7/33 (2017.01); G06T 7/73 (2017.01); H04N 13/00 (2018.01); H04N 13/239 (2018.01)
CPC G06T 7/593 (2017.01) [A61B 34/20 (2016.02); G06T 7/33 (2017.01); G06T 7/73 (2017.01); H04N 13/239 (2018.05); A61B 2034/2051 (2016.02); A61B 2034/2055 (2016.02); A61B 2034/2065 (2016.02); G06T 2207/10012 (2013.01); G06T 2207/30004 (2013.01); H04N 2013/0081 (2013.01)] 13 Claims
OG exemplary drawing
 
1. A system for executing a three-dimensional (3D) intraoperative scan of a patient to generate a plurality of intraoperative images of the patient that enables a surgeon to navigate during a surgical operation on the patient, comprising:
a 3D scanner that includes a first image sensor and a second image sensor and is configured to capture a first two-dimensional (2D) intraoperative image of a plurality of object points associated with the patient via the first image sensor and a second 2D intraoperative image of the plurality of object points via the second image sensor;
a 3D scanning controller that is configured to:
project the plurality of object points included in the first 2D intraoperative image onto a first image plane associated with the first image sensor and the plurality of object points included in the second 2D intraoperative image onto a second image plane associated with the second image sensor,
determine a plurality of first epipolar lines associated with the first image plane and a plurality of second epipolar lines associated with the second image plane based on an epipolar plane that triangulates the plurality of object points included in the first 2D intraoperative image to the plurality of object points included in the second 2D intraoperative image, wherein each epipolar line provides a depth of each object point as projected onto the first image plane associated with the first image sensor and the second image plane associated with the second image sensor, and
convert the first 2D intraoperative image and the second 2D intraoperative image to the 3D intraoperative scan of the patient based on the depth of each object point provided by each corresponding epipolar line;
a controller that is configured to:
co-register pre-operative image data captured from at least one pre-operative image of the patient with intraoperative image data provided by the 3D intraoperative scan, and
instruct a display to display the co-registered pre-operative image data as captured from the at least one pre-operative image with the intraoperative image data provided by the 3D intraoperative scan as the surgeon navigates during the surgical operation.
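The claim's core imaging step is classical stereo triangulation: matched object points on two image planes, constrained to corresponding epipolar lines, yield a depth for each point, and those depths turn the two 2D images into a 3D scan. Below is a minimal sketch of that step for the simplified case of a rectified stereo rig (where epipolar lines are horizontal scanlines), assuming a pinhole camera model; the focal length, baseline, principal point, and pixel coordinates are illustrative values, not parameters from the patent.

```python
# Sketch of depth-from-stereo triangulation for a rectified camera pair.
# In the rectified case, each epipolar line is a horizontal scanline, so a
# matched point's disparity d = x_left - x_right along that line fixes its
# depth via Z = f * B / d (f: focal length in pixels, B: baseline).

def depth_from_disparity(focal_length_px, baseline_mm, disparity_px):
    """Return depth Z = f * B / d for one matched object point."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_mm / disparity_px

def triangulate(focal_length_px, baseline_mm, cx, cy, matches):
    """Convert matched 2D points from the two image planes into 3D points.

    matches: list of ((x_left, y_left), (x_right, y_right)) pixel pairs.
    Returns (X, Y, Z) coordinates in the left camera's frame, in the same
    length unit as the baseline.
    """
    points_3d = []
    for (xl, yl), (xr, _yr) in matches:
        d = xl - xr  # disparity along the (horizontal) epipolar line
        z = depth_from_disparity(focal_length_px, baseline_mm, d)
        # Back-project through the pinhole model to recover X and Y.
        x = (xl - cx) * z / focal_length_px
        y = (yl - cy) * z / focal_length_px
        points_3d.append((x, y, z))
    return points_3d
```

A general (non-rectified) rig, as the claim covers, would instead intersect the two back-projected rays defined by the epipolar plane for each point pair; the rectified case above is the standard simplification in which that intersection reduces to the disparity formula.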