US 11,672,617 B2
Medical user interfaces and related methods of use
Timothy P. Harrah, Cambridge, MA (US); Christopher L. Oskin, Grafton, MA (US); Derrick Lenz, Pompton Plains, NJ (US); Arpita Banerjee, Bangalore (IN); Sandesh Gavade, Bangalore (IN); Pavan Misra, Bangalore (IN); and Abhijit Takale, Pune (IN)
Assigned to Boston Scientific Scimed, Inc., Maple Grove, MN (US)
Filed by Boston Scientific Scimed, Inc., Maple Grove, MN (US)
Filed on Jan. 13, 2022, as Appl. No. 17/574,617.
Application 17/574,617 is a continuation of application No. 16/925,694, filed on Jul. 10, 2020, granted, now 11,253,326.
Application 16/925,694 is a continuation of application No. 16/290,430, filed on Mar. 1, 2019, granted, now 10,743,946, issued on Aug. 18, 2020.
Application 16/290,430 is a continuation of application No. 15/416,838, filed on Jan. 26, 2017, granted, now 10,258,415, issued on Apr. 16, 2019.
Claims priority of provisional application 62/288,654, filed on Jan. 29, 2016.
Prior Publication US 2022/0133416 A1, May 5, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. A61B 34/00 (2016.01); A61B 18/26 (2006.01); G16H 40/63 (2018.01); G06T 7/00 (2017.01); G06T 7/60 (2017.01); G06T 11/60 (2006.01); A61B 17/22 (2006.01); A61B 18/00 (2006.01); A61B 34/10 (2016.01); A61B 34/20 (2016.01); A61B 6/00 (2006.01)
CPC A61B 34/25 (2016.02) [A61B 18/26 (2013.01); G06T 7/0014 (2013.01); G06T 7/60 (2013.01); G06T 11/60 (2013.01); G16H 40/63 (2018.01); A61B 6/464 (2013.01); A61B 6/465 (2013.01); A61B 17/22004 (2013.01); A61B 2018/00505 (2013.01); A61B 2018/00511 (2013.01); A61B 2018/00517 (2013.01); A61B 2018/00577 (2013.01); A61B 2018/00982 (2013.01); A61B 2034/107 (2016.02); A61B 2034/2065 (2016.02); A61B 2034/2074 (2016.02); A61B 2034/252 (2016.02); A61B 2034/254 (2016.02); A61B 2034/256 (2016.02); A61B 2217/007 (2013.01); A61B 2218/002 (2013.01); G06T 2207/10116 (2013.01); G06T 2207/30084 (2013.01); G06T 2210/41 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A medical system for use in a medical procedure, comprising:
a first imaging device;
a display device; and
a processor configured to receive input from the first imaging device, wherein the first imaging device is configured to send image data representative of a first image captured within a body lumen of a patient to the processor, wherein the processor is coupled to the display device, wherein the processor is configured to:
analyze the first image to sense a presence of an object within the first image;
when an object is sensed within the first image, analyze the first image to estimate a size of the object;
display, on the display device, a first user interface during a first portion of the medical procedure, wherein the first user interface includes the first image and the estimated size of the object;
display, on the display device, a second user interface during a second portion of the medical procedure, the second portion of the medical procedure including inserting a guidewire into the lumen, and wherein the second user interface includes both a second image and a first x-ray image of the guidewire displayed simultaneously on the display device;
display, on the display device, a third user interface during a third portion of the medical procedure, the third portion of the medical procedure including estimating a size of an orifice within a third image or a second x-ray image, and wherein the third user interface includes the third image or the second x-ray image and the estimated size of the orifice overlaid as a numerical value on either the third image or the second x-ray image; and
display, on the display device, a fourth user interface during a fourth portion of the medical procedure, wherein the fourth portion of the medical procedure includes application of laser energy, and wherein the fourth user interface includes an image of an energy delivery element delivering laser energy to the object within the lumen,
wherein the processor is configured to analyze the image of the energy delivery element delivering laser energy to the object within the lumen to sense whether fragments of the object have been removed by the laser energy, and wherein the processor is configured to determine that a size of the object is decreasing based on a reduced proportion of a camera field being occupied by the object.
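The final limitation of claim 1 determines that the object is shrinking when it occupies a reduced proportion of the camera field. The sketch below is purely illustrative and is not the patented implementation: it assumes an upstream segmentation step has already produced a binary mask per frame (True marking pixels classified as the object), and all function names are hypothetical.

```python
# Illustrative sketch only (not the claimed implementation): decide that an
# object's size is decreasing based on a reduced proportion of the camera
# field being occupied by the object, per the last limitation of claim 1.
# Segmentation (how the mask is produced) is out of scope here.

from typing import List

Mask = List[List[bool]]  # binary mask: True = pixel belongs to the object


def occupied_fraction(mask: Mask) -> float:
    """Fraction of the camera field occupied by the object mask."""
    total = sum(len(row) for row in mask)
    if total == 0:
        return 0.0
    occupied = sum(sum(row) for row in mask)
    return occupied / total


def object_size_decreasing(frames: List[Mask], tolerance: float = 0.0) -> bool:
    """True if the occupied proportion falls across successive frames by
    more than `tolerance`, mirroring the claim's size-decrease test."""
    fractions = [occupied_fraction(f) for f in frames]
    return all(later < earlier - tolerance
               for earlier, later in zip(fractions, fractions[1:]))


# Example: a 4x4 camera field where the object shrinks from 4 pixels to 1,
# e.g. after fragments have been removed by laser energy.
frame_a = [[True, True, False, False],
           [True, True, False, False],
           [False, False, False, False],
           [False, False, False, False]]
frame_b = [[True, False, False, False],
           [False, False, False, False],
           [False, False, False, False],
           [False, False, False, False]]
print(object_size_decreasing([frame_a, frame_b]))  # → True
```

A `tolerance` above zero would guard against jitter in the mask from frame to frame; the claim itself does not specify such a threshold.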