US 11,744,646 B2
Registration probe for image guided surgery system
Jetmir Palushi, Irvine, CA (US); Itzhak Fang, Irvine, CA (US); Noam Racheli, Hadera (IL); Oleg Dulger, Yoqneam Ilit (IL); and Itamar Bustan, Zichron Ya'acov (IL)
Assigned to Acclarent, Inc., Irvine, CA (US); and Biosense Webster (Israel) Ltd., Yokneam (IL)
Filed by Acclarent, Inc., Irvine, CA (US); and Biosense Webster (Israel) Ltd., Yokneam (IL)
Filed on Oct. 29, 2019, as Appl. No. 16/666,782.
Claims priority of provisional application 62/778,442, filed on Dec. 12, 2018.
Prior Publication US 2020/0188031 A1, Jun. 18, 2020
Int. Cl. A61B 34/20 (2016.01); A61B 5/06 (2006.01); A61B 34/00 (2016.01)
CPC A61B 34/20 (2016.02) [A61B 5/062 (2013.01); A61B 34/25 (2016.02); A61B 2034/2055 (2016.02)] 20 Claims
OG exemplary drawing
 
1. An image guided surgery (IGS) navigation system comprising:
(a) an instrument including a position sensor, the instrument comprising a registration probe, the registration probe being configured to register locations of external anatomical features of a patient;
(b) a tracking field generator operable to provide a tracked area, the tracked area being a three-dimensional space;
(c) a processor configured to determine a position of the instrument within the tracked area based upon a set of tracking data, the set of tracking data being based on signals from the position sensor received by the processor as a result of an interaction of the instrument with the tracked area;
(d) a display operable by the processor to provide an IGS navigation interface to a user; and
(e) a memory storing a set of control patterns, each of the set of control patterns comprising a motion-based spatial input, each of the set of control patterns corresponding to a gesture-based interface action;
the processor being further configured to:
(i) monitor a motion of the instrument within the tracked area based on a plurality of positions determined for the instrument,
(ii) perform a comparison of the spatial input for at least one control pattern of the set of control patterns with the motion,
(iii) determine if the at least one control pattern of the set of control patterns matches the motion based upon the comparison of the spatial input for the at least one control pattern with the motion,
(iv) where the at least one control pattern of the set of control patterns matches the motion, determine the gesture-based interface action corresponding to the at least one control pattern and execute the gesture-based interface action on the IGS navigation interface,
(v) determine the motion of the instrument based upon a change in position in a first dimension and a second dimension, and
(vi) when determining the motion of the instrument, disregard a change in position in a third dimension unless the change in position in the third dimension exceeds a configured distance threshold.
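The processor logic recited in steps (i) through (vi) amounts to a gesture recognizer operating on tracked probe positions: derive a planar motion from successive positions, compare it against stored control patterns, and fire the matching interface action. A minimal Python sketch of that logic follows; the displacement-vector representation of a "motion-based spatial input", the cosine similarity metric, the 0.85 match score, and the 5.0 mm depth threshold are all illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch of the claimed gesture-matching logic; names, metrics,
# and thresholds are assumptions for illustration, not the patented method.
import math
from dataclasses import dataclass
from typing import Callable

@dataclass
class ControlPattern:
    name: str
    spatial_input: list[tuple[float, float]]  # motion-based spatial input: 2D displacements
    action: Callable[[], None]                 # gesture-based interface action on a match

Z_THRESHOLD = 5.0  # configured distance threshold for the third dimension (assumed, mm)

def planar_motion(positions, z_threshold=Z_THRESHOLD):
    """Steps (v)-(vi): determine motion from changes in the first two dimensions,
    disregarding the third dimension unless its change exceeds the threshold."""
    motion = []
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        if abs(z1 - z0) > z_threshold:
            return None  # large depth change: treat as non-gesture motion
        motion.append((x1 - x0, y1 - y0))
    return motion

def _resample(disps, n):
    # Pick n displacements spaced evenly through the sequence so that two
    # paths of different lengths can be compared vector-by-vector.
    return [disps[int(i * len(disps) / n)] for i in range(n)]

def similarity(a, b, n=16):
    """Mean cosine similarity of paired displacement vectors; a crude stand-in
    for whatever comparison (step (ii)) a production recognizer would use."""
    total = 0.0
    for (ax, ay), (bx, by) in zip(_resample(a, n), _resample(b, n)):
        na, nb = math.hypot(ax, ay), math.hypot(bx, by)
        if na and nb:
            total += (ax * bx + ay * by) / (na * nb)
    return total / n

def match_gesture(positions, patterns, min_score=0.85):
    """Steps (i)-(iv): monitor motion from a plurality of positions, compare it
    with each control pattern, and execute the matching interface action."""
    motion = planar_motion(positions)
    if not motion:
        return None
    score, best = max(((similarity(motion, p.spatial_input), p) for p in patterns),
                      key=lambda t: t[0])
    if score >= min_score:
        best.action()
        return best.name
    return None

if __name__ == "__main__":
    patterns = [
        ControlPattern("swipe_right", [(1.0, 0.0)] * 8, lambda: print("next image slice")),
        ControlPattern("swipe_up",    [(0.0, 1.0)] * 8, lambda: print("zoom in")),
    ]
    # Simulated sensor positions: steady +x travel with sub-threshold z jitter.
    track = [(2.0 * i, 0.0, 0.1 * math.sin(i)) for i in range(10)]
    print(match_gesture(track, patterns))  # prints "next image slice", then "swipe_right"
```

Projecting the tracked motion onto two dimensions before matching, as step (vi) recites, plausibly suppresses false triggers from small hand movements along the depth axis while still letting a deliberate push or pull (one exceeding the configured threshold) break out of gesture interpretation; the abandon-on-depth-change behavior here is one possible reading of that limitation.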