US 12,201,369 B2
Interactive extended-reality apparatuses for robotic surgery
Jeffrey Roh, Seattle, WA (US); Justin Esterberg, Mesa, AZ (US); John Cronin, Jericho, VT (US); Seth Cronin, Essex Junction, VT (US); and Michael John Baker, Georgia, VT (US)
Assigned to IX Innovation LLC, Seattle, WA (US)
Filed by IX Innovation LLC, Seattle, WA (US)
Filed on Mar. 10, 2023, as Appl. No. 18/182,023.
Prior Publication US 2024/0299094 A1, Sep. 12, 2024
This patent is subject to a terminal disclaimer.
Int. Cl. A61B 34/10 (2016.01); A61B 90/00 (2016.01); A61B 34/00 (2016.01)
CPC A61B 34/10 (2016.02) [A61B 90/36 (2016.02); A61B 2034/104 (2016.02); A61B 2034/105 (2016.02); A61B 2034/252 (2016.02); A61B 2034/254 (2016.02); A61B 2090/367 (2016.02)] 21 Claims
OG exemplary drawing
 
1. A computer-implemented method for performing a surgical procedure by a surgical robot, comprising:
obtaining one or more images for a patient using one or more sensors;
selecting one or more sensory modalities based on the surgical procedure;
generating an immersive experiential multisensory extended-reality (XR) environment by associating one or more virtual models of one or more surgical tools and the surgical robot with the one or more images,
wherein the immersive experiential multisensory XR environment comprises a three-dimensional (3D) digital twin of an anatomy of the patient for performing a simulation of the surgical procedure and provides sensory feedback to a user according to the one or more sensory modalities;
displaying, via an electronic display, the 3D digital twin within the immersive experiential multisensory XR environment for viewing by the user;
identifying virtual surgical actions on the 3D digital twin within the immersive experiential multisensory XR environment;
generating a surgical workflow for the surgical robot, the surgical workflow comprising workflow objects for the surgical procedure based on the virtual surgical actions;
adjusting the surgical workflow based on a comparison of the surgical workflow to stored historical workflows so as to avoid one or more adverse surgical events,
wherein the comparison is used to identify the one or more adverse surgical events associated with the workflow objects based on the virtual surgical actions within the immersive experiential multisensory XR environment;
transmitting the adjusted surgical workflow to the surgical robot to configure the surgical robot with the adjusted surgical workflow,
wherein the adjusted surgical workflow comprises the workflow objects and information describing the surgical actions; and
robotically performing, using the surgical robot, one or more surgical actions on the patient according to the adjusted surgical workflow.
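The claimed steps of generating a surgical workflow from virtual surgical actions and then adjusting it by comparison with stored historical workflows can be sketched as follows. This is purely an illustrative reading of the claim, not the patented implementation; all names (`WorkflowObject`, `SurgicalWorkflow`, `generate_workflow`, `adjust_workflow`, `risk_tag`) are assumptions, and the adjustment rule (dropping objects whose actions match historically adverse ones) stands in for whatever comparison logic the specification actually describes:

```python
from dataclasses import dataclass, field


@dataclass
class WorkflowObject:
    """A single step in the surgical workflow (hypothetical structure)."""
    action: str
    risk_tag: str = "none"  # "adverse" marks an action linked to an adverse event


@dataclass
class SurgicalWorkflow:
    """Ordered collection of workflow objects for the surgical robot."""
    objects: list = field(default_factory=list)


def generate_workflow(virtual_actions):
    """Map virtual surgical actions identified in the XR environment
    to workflow objects (one object per action, for illustration)."""
    return SurgicalWorkflow(objects=[WorkflowObject(a) for a in virtual_actions])


def adjust_workflow(workflow, historical_workflows):
    """Compare the workflow against stored historical workflows and
    remove objects whose actions were tagged adverse in the history."""
    adverse = {
        obj.action
        for hist in historical_workflows
        for obj in hist.objects
        if obj.risk_tag == "adverse"
    }
    kept = [obj for obj in workflow.objects if obj.action not in adverse]
    return SurgicalWorkflow(objects=kept)
```

Under this reading, the adjusted `SurgicalWorkflow` is what would be transmitted to the surgical robot; a real system would of course replace the tag-matching rule with the comparison and adverse-event identification taught in the specification.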