US 12,079,440 B2
Method for real time update of fly-through camera placement
Jetmir Palushi, Irvine, CA (US); Henry F. Salazar, Pico Rivera, CA (US); Jordan R. Trott, Redondo Beach, CA (US); Moran Levi, Kiryat-Tivon (IL); Itamar Bustan, Zichron Ya'acov (IL); Yoav Pinsky, Yokneam (IL); Noam Racheli, Hadera (IL); and Athanasios Papadakis, Newport Beach, CA (US)
Assigned to Acclarent, Inc., Irvine, CA (US); and Biosense Webster (Israel) Ltd., Yokneam (IL)
Filed by Acclarent, Inc., Irvine, CA (US); and Biosense Webster (Israel) Ltd., Yokneam (IL)
Filed on Apr. 24, 2023, as Appl. No. 18/138,382.
Application 18/138,382 is a continuation of application No. 17/528,429, filed on Nov. 17, 2021, granted, now 11,656,735.
Application 17/528,429 is a continuation of application No. 16/577,019, filed on Sep. 20, 2019, granted, now 11,204,677, issued on Dec. 21, 2021.
Claims priority of provisional application 62/748,571, filed on Oct. 22, 2018.
Prior Publication US 2023/0259248 A1, Aug. 17, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 3/04815 (2022.01); A61B 34/00 (2016.01); A61B 34/20 (2016.01); G06F 3/04842 (2022.01); G06T 15/20 (2011.01); G16H 20/40 (2018.01); G16H 30/20 (2018.01); A61B 34/10 (2016.01); A61B 90/00 (2016.01)
CPC G06F 3/04815 (2013.01) [A61B 34/20 (2016.02); A61B 34/25 (2016.02); G06F 3/04842 (2013.01); G06T 15/205 (2013.01); G16H 20/40 (2018.01); G16H 30/20 (2018.01); A61B 2034/105 (2016.02); A61B 2090/374 (2016.02); A61B 2090/3762 (2016.02); A61B 2090/378 (2016.02); G06F 2203/04803 (2013.01); G06T 2200/24 (2013.01); G06T 2210/41 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A system comprising:
(a) a display;
(b) a user input;
(c) a set of preoperative images associated with a patient; and
(d) a processor configured to provide a virtual camera placement interface to a user via the display and receive inputs via the user input; the virtual camera placement interface comprising a set of preoperative image panes and a virtual camera view, each of the set of preoperative image panes comprising a preoperative image from the set of preoperative images, and the user input being operable to move a cursor over and make selections from the set of preoperative image panes, the processor being further configured to:
(i) define a first point based upon a first selection received via the user input, the first selection comprising a point on one of the set of preoperative image panes that corresponds to a first cursor position of the cursor when the first selection is received,
(ii) define a second point based upon a second cursor position of the cursor on any of the set of preoperative image panes,
(iii) display a real-time virtual endoscopic preview in the virtual camera view based upon the first point and the second point,
(iv) change a value of the second point as the cursor is moved based on the user input and the second cursor position changes,
(v) after display of the real-time virtual endoscopic preview, receive a second selection, the second selection comprising a selected second point determined based upon the second point,
(vi) based upon a modify selection, discard a selected value of one of the first point or the selected second point,
(vii) define a modified point based upon a third cursor position, the modified point being the one of the first point or the selected second point whose value was discarded, and
(viii) display the real-time virtual endoscopic preview in the virtual camera view based upon a retained point and the modified point, the retained point being the one of the first point or the selected second point whose value was not discarded.
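The claim above recites an interactive two-point camera-placement flow: a first selection fixes one point, a provisional second point follows the cursor while a virtual endoscopic preview updates in real time, a second selection locks the second point, and a modify selection lets the user discard and redefine either point. The following is a minimal, hypothetical Python sketch of that flow, written under the assumption that the first point serves as the virtual camera position and the second point as its look-at target; all class, method, and variable names (CameraPlacement, Point3D, render_preview, and so on) are illustrative and are not drawn from the patent.

# Hypothetical sketch of the claimed camera-placement flow; not the
# patented implementation, only an illustration of claim steps (i)-(viii).

from dataclasses import dataclass
from typing import Optional


@dataclass
class Point3D:
    """A point picked on a preoperative image pane, mapped into image space."""
    x: float
    y: float
    z: float


class CameraPlacement:
    """Tracks the two points that define the virtual camera placement."""

    def __init__(self) -> None:
        self.first_point: Optional[Point3D] = None             # assumed camera position
        self.selected_second_point: Optional[Point3D] = None   # assumed camera target

    def on_first_selection(self, cursor: Point3D) -> None:
        # (i) define the first point from the first selection.
        self.first_point = cursor

    def on_cursor_move(self, cursor: Point3D) -> None:
        # (ii)/(iv) the provisional second point follows the cursor, and
        # (iii) the real-time preview is re-rendered on each move.
        if self.first_point is not None and self.selected_second_point is None:
            self.render_preview(self.first_point, cursor)

    def on_second_selection(self, cursor: Point3D) -> None:
        # (v) lock in the selected second point after the preview was shown.
        self.selected_second_point = cursor

    def on_modify_selection(self, which: str, cursor: Point3D) -> None:
        # (vi) discard the chosen point, (vii) redefine it from the current
        # cursor position, and (viii) re-render the preview from the retained
        # point and the modified point.
        if which == "first":
            self.first_point = cursor
        else:
            self.selected_second_point = cursor
        self.render_preview(self.first_point, self.selected_second_point)

    def render_preview(self, position: Point3D, target: Point3D) -> None:
        # Placeholder for the real-time virtual endoscopic preview renderer.
        print(f"rendering preview from {position} toward {target}")


if __name__ == "__main__":
    ui = CameraPlacement()
    ui.on_first_selection(Point3D(10.0, 20.0, 5.0))
    ui.on_cursor_move(Point3D(12.0, 22.0, 6.0))    # preview follows the cursor
    ui.on_second_selection(Point3D(12.0, 22.0, 6.0))
    ui.on_modify_selection("second", Point3D(15.0, 25.0, 7.0))

In this sketch the preview is simply a print statement; in the claimed system it would be a rendered virtual endoscopic view driven by the set of preoperative images.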