US 12,419,699 B2
System and method for augmented reality spine surgery
Mordechai Avisar, Highland Heights, OH (US); Alon Yakob Geri, Orange Village, OH (US); and Robert Louis, Newport Beach, CA (US)
Filed by Surgical Theater, Inc., Los Angeles, CA (US)
Filed on Aug. 9, 2021, as Appl. No. 17/397,157.
Claims priority of provisional application 63/062,921, filed on Aug. 7, 2020.
Claims priority of provisional application 63/068,079, filed on Aug. 20, 2020.
Claims priority of provisional application 63/116,457, filed on Nov. 20, 2020.
Prior Publication US 2022/0039881 A1, Feb. 10, 2022
Int. Cl. A61B 34/00 (2016.01); A61B 34/10 (2016.01); A61B 34/20 (2016.01); G06F 3/01 (2006.01)
CPC A61B 34/25 (2016.02) [A61B 34/10 (2016.02); A61B 34/20 (2016.02); G06F 3/011 (2013.01); A61B 2034/102 (2016.02); A61B 2034/105 (2016.02); A61B 2034/2068 (2016.02); A61B 2034/256 (2016.02)] 23 Claims
OG exemplary drawing
 
1. A method for using a computer system for rendering an interactive augmented view of a physical model representing an anatomical feature and a virtual accessory model of an implantable surgical accessory, said method comprising the steps of:
storing data representing the implantable surgical accessory in a database;
providing a physical probe configured for use by a user to interact with the physical model;
the computer system being configured for generating the virtual accessory model of the implantable surgical accessory utilizing the data representing the implantable surgical accessory;
the computer system being configured for determining a direction of view of the user;
the computer system being configured for tracking a position of the physical probe about the physical model;
the computer system being configured for generating an image of the physical model based on the determined direction of view of the user;
the computer system being configured for generating an image of the physical probe based on the tracked position of the physical probe;
the computer system being configured for generating and displaying an augmented reality view to the user, said augmented reality view showing realistic interaction of the virtual accessory model with the generated image of the physical probe and the generated image of the physical model such that the user can realistically manipulate the virtual accessory model based on user interactions with the physical probe about the physical model;
the computer system being configured to display in the augmented reality view a guide marker on the image of the physical model to guide the user in maneuvering the physical probe to a desired placement of the virtual accessory model with respect to the physical model; and
the computer system being configured to perform a proximity test on the virtual probe to detect a distance between the virtual probe and the guide marker as the virtual probe is brought closer to the guide marker, and, upon detecting the distance falling below a predetermined threshold level, displaying in the augmented reality view a guide line in addition to the guide marker, the guide line otherwise not being visible prior to the virtual probe being within the predetermined threshold level of the guide marker, wherein the guide line is utilized to refine guidance and placement of the virtual accessory model into the desired location.
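
The final limitation of claim 1 recites a distance-gated reveal: a guide line is displayed only once the tracked probe comes within a predetermined threshold of the guide marker. The following Python sketch is a minimal illustration of such a proximity test under stated assumptions; it is not drawn from the patent, and the names (Vec3, GuideMarker, proximity_test) and the 10 mm threshold are hypothetical.

    # Illustrative sketch only; names and threshold are assumptions,
    # not identifiers or values from the patent.
    from dataclasses import dataclass
    import math

    @dataclass
    class Vec3:
        x: float
        y: float
        z: float

        def distance_to(self, other: "Vec3") -> float:
            # Euclidean distance between two tracked points.
            return math.sqrt(
                (self.x - other.x) ** 2
                + (self.y - other.y) ** 2
                + (self.z - other.z) ** 2
            )

    @dataclass
    class GuideMarker:
        position: Vec3          # desired placement on the image of the physical model
        trajectory_end: Vec3    # far end of the guide line toward the desired pose

    THRESHOLD_MM = 10.0  # assumed predetermined threshold level

    def proximity_test(probe_tip: Vec3, marker: GuideMarker):
        """Return the guide-line endpoints once the probe is within the
        threshold of the guide marker; otherwise return None so the
        guide line remains hidden."""
        distance = probe_tip.distance_to(marker.position)
        if distance < THRESHOLD_MM:
            # Reveal the guide line to refine placement of the virtual accessory model.
            return (marker.position, marker.trajectory_end)
        return None

    if __name__ == "__main__":
        marker = GuideMarker(Vec3(0.0, 0.0, 0.0), Vec3(0.0, 0.0, 50.0))
        print(proximity_test(Vec3(30.0, 0.0, 0.0), marker))  # None: probe still far away
        print(proximity_test(Vec3(4.0, 3.0, 0.0), marker))   # guide line revealed

In this sketch the proximity test is evaluated each frame against the tracked probe-tip position; the returned endpoints would then be rendered into the augmented reality view alongside the guide marker.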