US 12,376,914 B2
Registration probe for image guided surgery system
Jetmir Palushi, Irvine, CA (US); Itzhak Fang, Irvine, CA (US); Noam Racheli, Hadera (IL); Oleg Dulger, Yoqneam Ilit (IL); and Itamar Bustan, Zichron Ya'acov (IL)
Assigned to Acclarent, Inc., Irvine, CA (US); and Biosense Webster (Israel) Ltd., Yokneam (IL)
Filed by Acclarent, Inc., Irvine, CA (US); and Biosense Webster (Israel) Ltd., Yokneam (IL)
Filed on Aug. 14, 2023, as Appl. No. 18/233,432.
Application 18/233,432 is a continuation of application No. 16/666,782, filed on Oct. 29, 2019, granted, now 11,744,646.
Claims priority of provisional application 62/778,442, filed on Dec. 12, 2018.
Prior Publication US 2023/0380908 A1, Nov. 30, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. A61B 5/06 (2006.01); A61B 34/00 (2016.01); A61B 34/20 (2016.01)
CPC A61B 34/20 (2016.02) [A61B 5/062 (2013.01); A61B 34/25 (2016.02); A61B 2034/2055 (2016.02)] 20 Claims
OG exemplary drawing
 
1. An image guided surgery (IGS) navigation system comprising:
(a) an instrument including a position sensor;
(b) a tracking field generator operable to provide a tracked area, the tracked area being a three dimensional space;
(c) a display operable to provide an IGS navigation interface to a user;
(d) a memory storing a set of control patterns, each of the set of control patterns comprising a spatial input, each of the set of control patterns corresponding to a gesture-based interface action; and
(e) a processor configured to:
(i) receive signals from the position sensor as a result of an interaction of the instrument with the tracked area,
(ii) determine a motion of the instrument based upon a change in position of the instrument in a first dimension and a second dimension,
(iii) when determining the motion of the instrument, disregard a change in position in a third dimension unless the change in position in the third dimension exceeds a configured distance threshold, and
(iv) where at least one control pattern of the set of control patterns matches the motion based upon a comparison of the spatial input for the at least one control pattern with the motion, determine the gesture-based interface action corresponding to the at least one control pattern and execute the gesture-based interface action on the IGS navigation interface.
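The logic recited in elements (e)(ii)–(e)(iv) can be sketched in code. This is a minimal illustrative sketch only, not the patented implementation: all names (`Z_THRESHOLD`, `ControlPattern`, `motion_delta`, `match_gesture`) and the example "swipe" pattern are hypothetical assumptions, and the patent does not specify units, data structures, or matching criteria.

```python
# Hypothetical sketch of the claimed motion determination and
# control-pattern matching; names and values are illustrative only.
from dataclasses import dataclass
from typing import Callable, Optional

Z_THRESHOLD = 10.0  # configured distance threshold for the third dimension


@dataclass
class ControlPattern:
    action: str  # gesture-based interface action to execute on a match
    matches: Callable[[float, float, float], bool]  # test of the spatial input


def motion_delta(p0, p1, z_threshold=Z_THRESHOLD):
    """Motion between two tracked positions in the first and second
    dimensions; the third dimension is disregarded (zeroed) unless its
    change exceeds the configured distance threshold."""
    dx = p1[0] - p0[0]
    dy = p1[1] - p0[1]
    dz = p1[2] - p0[2]
    if abs(dz) <= z_threshold:
        dz = 0.0  # small depth changes are treated as no motion
    return dx, dy, dz


def match_gesture(patterns, p0, p1) -> Optional[str]:
    """Compare the determined motion against each stored control pattern's
    spatial input; return the corresponding interface action, if any."""
    dx, dy, dz = motion_delta(p0, p1)
    for pattern in patterns:
        if pattern.matches(dx, dy, dz):
            return pattern.action
    return None


# Hypothetical example: a rightward "swipe" mapped to an interface action.
patterns = [
    ControlPattern(
        action="next_view",
        matches=lambda dx, dy, dz: dx > 20 and abs(dy) < 5 and dz == 0,
    )
]
```

Under these assumptions, a motion of (30, 2, 4) would match the example pattern, since the 4-unit depth change falls below the threshold and is disregarded, whereas a depth change above the threshold would defeat the `dz == 0` condition.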