CPC A61B 34/20 (2016.02) [G06T 19/003 (2013.01); A61B 2034/105 (2016.02); A61B 2090/365 (2016.02); A61B 2090/372 (2016.02); A61B 2090/502 (2016.02); G06T 2210/41 (2013.01); G06T 2210/62 (2013.01)] | 24 Claims |
1. An augmented reality surgical navigation system comprising:
one or more processors;
one or more computer-readable tangible storage devices;
at least one sensor for detecting information about a user's position and motion around a patient;
at least one camera for receiving live images of internal anatomical features of the patient; and
program instructions stored on at least one of the one or more storage devices for execution by at least one of the one or more processors, said program instructions comprising:
first program instructions for preparing a multi-dimensional virtual model of the internal anatomical features of the patient, said model configured for providing the user with dynamic interaction with the internal anatomical features provided in the model;
second program instructions for receiving tracking information indicative of the user's current view of the patient, including the user's position and motion around the patient as detected by the sensor and the user's angle of view of the patient;
third program instructions for identifying in the virtual model a virtual view based on the received tracking information, wherein the identified virtual view corresponds to the user's view of the patient;
fourth program instructions for rendering a virtual image from the virtual model based on the identified virtual view, said virtual image showing dynamic interactions of the user with the internal anatomical features of the model based on the user's current view of the patient according to the user's position and motion around the patient, thereby permitting dynamic user interaction with the internal anatomical features of the model in three dimensions such that said user can see and interact with the virtual image over 360 degrees around the image, including the front and back of the image; and
fifth program instructions for communicating the rendered virtual image to a display where the rendered virtual image is combined with the live images of the internal anatomical features of the patient and the user's view to form an augmented reality view of the patient.
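The claim recites a five-stage pipeline: track the user's pose around the patient, map that pose to a corresponding virtual view of the model, render a virtual image from that view, and composite it with the live camera frame. The sketch below illustrates that data flow only; every name, the one-view-per-degree mapping, and the grayscale alpha blend are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch of the claimed pipeline (all names are illustrative):
# tracked pose -> virtual-view selection -> rendering -> compositing.
from dataclasses import dataclass
import math

@dataclass
class Pose:                        # "second program instructions": tracked user pose
    x: float; y: float; z: float   # position around the patient
    yaw: float                     # angle of view, in radians

def identify_virtual_view(pose: Pose) -> int:
    """"Third program instructions": map the tracked pose onto one of 360
    assumed viewpoints spaced over the full circle around the model."""
    return int(math.degrees(pose.yaw) % 360.0)  # one view per degree (assumption)

def render_virtual_image(view_index: int, size=(4, 4)):
    """"Fourth program instructions" stand-in: a real system would rasterize
    the 3-D model; here a fake grayscale image is keyed to the view index."""
    w, h = size
    return [[(view_index + x + y) % 256 for x in range(w)] for y in range(h)]

def composite(live, virtual, alpha=0.5):
    """"Fifth program instructions": blend the rendered overlay pixel-by-pixel
    into the live camera frame to form the augmented reality view."""
    return [[round((1 - alpha) * lv + alpha * vv)
             for lv, vv in zip(lrow, vrow)]
            for lrow, vrow in zip(live, virtual)]

# Walk the user a quarter turn around the patient and build one AR frame.
pose = Pose(0.5, 0.0, 1.2, yaw=math.pi / 2)
view = identify_virtual_view(pose)        # 90-degree viewpoint
overlay = render_virtual_image(view)
live = [[100] * 4 for _ in range(4)]      # stand-in live endoscope frame
ar_frame = composite(live, overlay)
print(view, ar_frame[0][0])
```

Because each limitation is an independent stage, the same structure accommodates swapping any stage (e.g. a real renderer for `render_virtual_image`) without touching the others.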