US 12,478,433 B2
Image guidance during cannulation
Hiroyuki Mino, Westborough, MA (US); Hirokazu Horio, Allentown, PA (US); Brandon Ranalli, Quincy, MA (US); Mark Lavender, Plymouth, MN (US); Gloria Yee, Westborough, MA (US); Anthony R. Pirozzi, Raleigh, NC (US); Quinn C. Cartall, Atlanta, GA (US); Atanaska Gospodinova, Hamburg (DE); and Katharina Dissmann, Aumühle (DE)
Assigned to Olympus Corporation, Tokyo (JP)
Filed by Olympus Corporation, Tokyo (JP)
Filed on Oct. 18, 2022, as Appl. No. 18/047,555.
Claims priority of provisional application 63/263,711, filed on Nov. 8, 2021.
Claims priority of provisional application 63/262,796, filed on Oct. 20, 2021.
Prior Publication US 2023/0123739 A1, Apr. 20, 2023
Int. Cl. A61B 34/10 (2016.01); A61B 34/20 (2016.01)
CPC A61B 34/10 (2016.02) [A61B 34/20 (2016.02); A61B 2034/107 (2016.02)] 20 Claims
OG exemplary drawing
 
1. An image-guided endoscopic system, comprising:
an endoscope configured to be positioned and navigated in a patient anatomy;
a processor configured to:
receive at least two images of an anatomical target;
reconstruct a three-dimensional (3D) image of the anatomical target using the at least two received images; and
generate an endoscope navigation plan for positioning and navigating the endoscope based at least on the reconstructed 3D image of the anatomical target;
a display configured to display the reconstructed 3D image and the endoscope navigation plan; and
an input unit configured to receive user input for controlling a viewing area and a viewing angle of the reconstructed 3D image,
wherein the display is configured to automatically adjust at least one of the viewing area or the viewing angle of the display of the reconstructed 3D image in accordance with at least one of: i) the endoscope navigation plan or ii) a position or direction of a distal end of the endoscope relative to the anatomical target, and the user input received via the input unit,
wherein the display is configured to automatically zoom in on a portion of the reconstructed 3D image as the distal end of the endoscope gets closer to the anatomical target, and
wherein the processor is further configured to:
determine a projected navigation path for the endoscope toward the anatomical target;
display the projected navigation path on the display overlaid on the reconstructed 3D image;
display a live navigation path on the display overlaid on the reconstructed 3D image as the endoscope moves; and
generate an alert when the live navigation path deviates from the projected navigation path by at least one specified criterion.
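The deviation-alert limitation in the final clause of claim 1 can be illustrated with a minimal sketch. Everything here is an illustrative assumption, not disclosed in the patent: the function names, the modeling of the projected navigation path as a 3D polyline, and the 5 mm threshold standing in for the claim's "specified criterion." The sketch raises an alert when the live tip position strays farther than the threshold from the nearest point on the projected path.

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from 3D point p to the segment from a to b."""
    abx, aby, abz = b[0] - a[0], b[1] - a[1], b[2] - a[2]
    apx, apy, apz = p[0] - a[0], p[1] - a[1], p[2] - a[2]
    ab2 = abx * abx + aby * aby + abz * abz
    # Parameter of the closest point on the segment, clamped to [0, 1].
    t = 0.0 if ab2 == 0 else max(0.0, min(1.0, (apx * abx + apy * aby + apz * abz) / ab2))
    closest = (a[0] + t * abx, a[1] + t * aby, a[2] + t * abz)
    return math.dist(p, closest)

def deviation_alert(live_tip, projected_path, threshold_mm=5.0):
    """Return True when the live endoscope tip position deviates from the
    projected navigation path (a list of 3D waypoints) by more than
    threshold_mm; the threshold is an assumed example criterion."""
    d = min(point_segment_distance(live_tip, a, b)
            for a, b in zip(projected_path, projected_path[1:]))
    return d > threshold_mm

# Example: a straight projected path along the z-axis.
path = [(0.0, 0.0, 0.0), (0.0, 0.0, 10.0)]
print(deviation_alert((1.0, 0.0, 5.0), path))  # within 5 mm: no alert
print(deviation_alert((8.0, 0.0, 5.0), path))  # 8 mm off-path: alert
```

In practice the claim leaves the "specified criterion" open; a real system might instead compare heading direction, accumulated off-path time, or distance along the planned route, but a point-to-polyline distance check is the simplest instance of the comparison the claim recites.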