CPC A61B 34/10 (2016.02) [A61B 90/37 (2016.02); G06T 19/006 (2013.01); G06T 19/20 (2013.01); G06V 20/653 (2022.01); G16H 30/40 (2018.01); A61B 2034/105 (2016.02); A61B 2034/107 (2016.02); A61B 2090/365 (2016.02); G06T 2200/04 (2013.01); G06T 2210/41 (2013.01); G06T 2219/2016 (2013.01); G06V 2201/03 (2022.01)]

12 Claims
1. A method to augment medical scan information associated with a target object on a first extended reality image, comprising:
obtaining a three-dimensional image associated with the target object;
identifying a first set of three-dimensional feature points from the three-dimensional image;
identifying anatomical points based on the first set of three-dimensional feature points;
obtaining the first extended reality image associated with the anatomical points;
selecting a second set of three-dimensional feature points from the first extended reality image;
performing a first image matching between the first set of three-dimensional feature points and the second set of three-dimensional feature points;
superimposing the three-dimensional image on the first extended reality image based on the first image matching;
constructing a three-dimensional model based on the medical scan information;
selecting a third set of three-dimensional feature points from the three-dimensional model;
performing a second image matching between the first set of three-dimensional feature points and the third set of three-dimensional feature points to identify a matched surface associated with the three-dimensional image and the three-dimensional model;
matching the three-dimensional model to the matched surface;
obtaining a second extended reality image based on the matched three-dimensional model;
superimposing the second extended reality image on the first extended reality image based on the matched surface to augment the medical scan information associated with the target object;
obtaining a third extended reality image corresponding to an under-surface area of one or more tissues, one or more organs, or one or more organ systems of a patient, wherein the third extended reality image is associated with the under-surface area and simulates a field of view when a surgical tool reaches the under-surface area;
obtaining another three-dimensional image associated with the under-surface area in response to the surgical tool physically reaching the under-surface area; and
calculating a deviation between the third extended reality image and the another three-dimensional image, wherein the deviation corresponds to a shift of the one or more tissues, one or more organs or one or more organ systems of the patient.
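The claimed first image matching and superimposition amount to registering two sets of corresponding three-dimensional feature points and applying the resulting transform. Below is a minimal sketch, assuming the correspondences between the scan-derived points and the extended reality points are already established; the function names and the use of the Kabsch algorithm are illustrative assumptions, not details recited in the claim.

```python
# Illustrative sketch (not the patent's implementation): rigid alignment of two
# corresponding sets of 3-D feature points, as in the claimed "first image matching"
# between scan-derived points and extended-reality points. One row per point.
import numpy as np

def estimate_rigid_transform(src_pts: np.ndarray, dst_pts: np.ndarray):
    """Return rotation R and translation t mapping src_pts onto dst_pts (Kabsch)."""
    src_centroid = src_pts.mean(axis=0)
    dst_centroid = dst_pts.mean(axis=0)
    src_centered = src_pts - src_centroid
    dst_centered = dst_pts - dst_centroid
    # The SVD of the cross-covariance matrix gives the optimal rotation.
    H = src_centered.T @ dst_centered
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_centroid - R @ src_centroid
    return R, t

def superimpose(points: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Apply the estimated transform, e.g. to place scan points in XR coordinates."""
    return points @ R.T + t
```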
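The second image matching, which aligns feature points selected from the reconstructed three-dimensional model to the scan-derived points to identify a matched surface, could be approached with an iterative closest point loop. The sketch below assumes a rough initial overlap of the two point sets and reuses the hypothetical estimate_rigid_transform helper above; none of these specifics come from the claim.

```python
# Illustrative ICP-style sketch of the "second image matching": aligning model
# feature points to scan feature points and recording which scan points they
# match, as a stand-in for identifying the matched surface.
import numpy as np
from scipy.spatial import cKDTree

def icp_match_surface(model_pts: np.ndarray, scan_pts: np.ndarray, iterations: int = 20):
    """Return the accumulated rotation/translation and indices of matched scan points."""
    tree = cKDTree(scan_pts)
    R_total, t_total = np.eye(3), np.zeros(3)
    current = model_pts.copy()
    matched_idx = None
    for _ in range(iterations):
        # Closest scan point for every model point defines the tentative surface match.
        _, matched_idx = tree.query(current)
        R, t = estimate_rigid_transform(current, scan_pts[matched_idx])
        current = current @ R.T + t
        # Compose the incremental transform into the running total.
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total, matched_idx
```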
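The final steps compare the simulated under-surface view with the three-dimensional image acquired once the surgical tool reaches that area and report the deviation as a shift of the tissues, organs, or organ systems. A minimal sketch follows, assuming both images have already been reduced to corresponding three-dimensional points; the function name and the mean/maximum summary are assumptions, not claim language.

```python
# Illustrative sketch of the deviation step: per-point displacement between the
# predicted under-surface points and the points observed once the tool arrives.
import numpy as np

def tissue_shift(predicted_pts: np.ndarray, observed_pts: np.ndarray):
    """Return displacement vectors plus mean and maximum shift magnitudes."""
    displacement = observed_pts - predicted_pts
    magnitudes = np.linalg.norm(displacement, axis=1)
    return displacement, magnitudes.mean(), magnitudes.max()
```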