US 12,232,817 B2
Surgical navigation inside a body
Yahav Tako, New York, NY (US); Alon Yakob Geri, Beachwood, OH (US); Mordechai Avisar, Highland Heights, OH (US); and Eliahu Teichman, Binyamina (IL)
Assigned to Surgical Theater, Inc., Beachwood, OH (US)
Filed by Surgical Theater, Inc., Mayfield Village, OH (US)
Filed on Oct. 12, 2020, as Appl. No. 17/068,466.
Application 17/068,466 is a continuation of application No. 15/699,715, filed on Sep. 8, 2017, granted, now Pat. No. 11,197,722.
Application 15/699,715 is a continuation-in-part of application No. PCT/US2016/056727, filed on Oct. 13, 2016.
Claims priority of provisional application 62/241,447, filed on Oct. 14, 2015.
Prior Publication US 2021/0022812 A1, Jan. 28, 2021
Int. Cl. A61B 34/20 (2016.01); A61B 34/10 (2016.01); A61B 90/00 (2016.01); A61B 90/50 (2016.01); G06T 19/00 (2011.01)
CPC A61B 34/20 (2016.02) [G06T 19/003 (2013.01); A61B 2034/105 (2016.02); A61B 2090/365 (2016.02); A61B 2090/372 (2016.02); A61B 2090/502 (2016.02); G06T 2210/41 (2013.01); G06T 2210/62 (2013.01)] 24 Claims
OG exemplary drawing
 
1. An augmented reality surgical navigation system comprising:
one or more processors;
one or more computer-readable tangible storage devices;
at least one sensor for detecting information about a user's position and motion around a patient;
at least one camera for receiving live images of internal anatomical features of the patient; and
program instructions stored on at least one of the one or more storage devices for execution by at least one of the one or more processors, said program instructions comprising:
first program instructions for preparing a multi-dimension virtual model of the internal anatomical features of the patient, said model configured for providing the user with dynamic interaction with the internal anatomical features provided in the model;
second program instructions for receiving tracking information indicative of the user's current view of the patient, including the user's position and motion around the patient as detected by the sensor and the user's angle of view of the patient;
third program instructions for identifying in the virtual model a virtual view based on the received tracking information, wherein the identified virtual view corresponds to the user's view of the patient;
fourth program instructions for rendering a virtual image from the virtual model based on the identified virtual view, said virtual view showing dynamic interactions of the user with the internal anatomical features of the model based on the user's current view of the patient according to the user's position and motion around the patient, permitting user dynamic interaction with the internal anatomical features of the model in three dimensions such that said user can see and interact with the virtual image over 360 degrees around the image, including the front and back of the image; and
fifth program instructions for communicating the rendered virtual image to a display where the rendered virtual image is combined with the live images of the internal anatomical features of the patient and the user's view to form an augmented reality view of the patient.
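Claim 1 recites a five-step pipeline: build an interactive anatomical model, track the user's pose, map that pose to a virtual view, render the model from that view, and composite the result with live camera images. The following minimal Python sketch, which is not part of the patent, illustrates how such a pipeline could be structured; every name (Pose, VirtualModel, read_tracking, identify_virtual_view, render_virtual_image, composite) and the pinhole-projection and alpha-compositing choices are illustrative assumptions, not Surgical Theater's implementation.

    # Hypothetical sketch of the five claimed program-instruction steps.
    # All names and numeric choices here are illustrative assumptions.
    from dataclasses import dataclass

    import numpy as np


    @dataclass
    class Pose:
        position: np.ndarray  # user's position around the patient (3-vector)
        rotation: np.ndarray  # user's angle of view (3x3 rotation matrix)


    class VirtualModel:
        """First instructions: a multi-dimension model of the internal
        anatomy, reduced here to a point cloud the user can view from
        any angle over 360 degrees."""

        def __init__(self, anatomy_points: np.ndarray):
            self.points = anatomy_points  # (N, 3) patient-space coordinates


    def read_tracking(sensor_sample: dict) -> Pose:
        """Second instructions: turn a raw sensor sample into the user's
        current view (position, motion, and angle of view)."""
        return Pose(np.asarray(sensor_sample["position"], dtype=float),
                    np.asarray(sensor_sample["rotation"], dtype=float))


    def identify_virtual_view(pose: Pose) -> np.ndarray:
        """Third instructions: map the tracked pose to the matching virtual
        camera, expressed as a 4x4 world-to-view transform."""
        view = np.eye(4)
        view[:3, :3] = pose.rotation.T
        view[:3, 3] = -pose.rotation.T @ pose.position
        return view


    def render_virtual_image(model: VirtualModel, view: np.ndarray,
                             size: tuple = (480, 640)) -> np.ndarray:
        """Fourth instructions: render the model from the identified view
        so the virtual image tracks the user around the anatomy. A simple
        pinhole projection (focal length 100 px) stands in for a renderer."""
        image = np.zeros(size)
        homog = np.c_[model.points, np.ones(len(model.points))]
        cam = (view @ homog.T).T
        for x, y, z, _ in cam:
            if z > 0:  # point lies in front of the virtual camera
                u = int(size[1] / 2 + 100 * x / z)
                v = int(size[0] / 2 + 100 * y / z)
                if 0 <= v < size[0] and 0 <= u < size[1]:
                    image[v, u] = 1.0
        return image


    def composite(virtual: np.ndarray, live: np.ndarray,
                  alpha: float = 0.5) -> np.ndarray:
        """Fifth instructions: combine the rendered virtual image with the
        live camera images to form the augmented reality view."""
        return alpha * virtual + (1 - alpha) * live

In use, each tracker sample would flow through read_tracking, identify_virtual_view, and render_virtual_image, and the result would be blended with the live frame from the patient-facing camera before being sent to the display.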