US 12,262,958 B2
Use of eye tracking input to initiate computer-vision identification of an object in a surgical image
Kevin Andrew Hufford, Cary, NC (US)
Assigned to Asensus Surgical US, Inc., Durham, NC (US)
Filed by Asensus Surgical US, Inc., Durham, NC (US)
Filed on Mar. 8, 2023, as Appl. No. 18/119,228.
Application 18/119,228 is a continuation of application No. 16/237,418, filed on Dec. 31, 2018, granted, now 11,690,677.
Claims priority of provisional application 62/612,554, filed on Dec. 31, 2017.
Prior Publication US 2024/0299121 A1, Sep. 12, 2024
This patent is subject to a terminal disclaimer.
Int. Cl. A61B 34/30 (2016.01); A61B 34/20 (2016.01); A61B 34/37 (2016.01); A61B 90/00 (2016.01); B25J 9/00 (2006.01); B25J 9/02 (2006.01); B25J 9/16 (2006.01); G06F 3/01 (2006.01); A61B 17/00 (2006.01); A61B 34/00 (2016.01)
CPC A61B 34/20 (2016.02) [A61B 34/30 (2016.02); A61B 34/37 (2016.02); A61B 90/361 (2016.02); A61B 90/37 (2016.02); B25J 9/0021 (2013.01); B25J 9/023 (2013.01); B25J 9/16 (2013.01); B25J 9/1682 (2013.01); B25J 9/1697 (2013.01); G06F 3/013 (2013.01); A61B 2017/00216 (2013.01); A61B 2034/2046 (2016.02); A61B 2034/2055 (2016.02); A61B 2034/305 (2016.02); A61B 34/76 (2016.02)] 15 Claims
OG exemplary drawing
 
1. A method of detecting an object within a surgical work site, comprising:
positioning a surgical instrument within a surgical work area within a body cavity;
mounting the surgical instrument to a robotic manipulator;
causing the robotic manipulator to move the surgical instrument in response to input commands from a hand controller;
capturing an image of a portion of the surgical work area within the body cavity;
displaying the image on an image display;
receiving from an eye gaze sensor input corresponding to a direction of a user's gaze towards the image of the work area on the display, the direction of the user's gaze corresponding to a region of the image;
executing a computer vision algorithm, said computer vision algorithm analyzing the region within the captured image to detect in the image at least a portion of the surgical instrument that is at the surgical site and displayed in the image; and
following detection of the surgical instrument, initiating a mode in which input commands from the hand controller do not cause movement of the surgical instrument by the robotic manipulator.
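The claimed sequence can be illustrated with a minimal, hypothetical sketch (all class and function names here are assumptions for illustration, not from the patent): the gaze direction selects a region of the displayed image, a computer-vision step analyzes only that region for the instrument, and a successful detection engages a clutch-like mode in which hand-controller commands no longer move the instrument.

```python
from dataclasses import dataclass

@dataclass
class GazeRegion:
    # Region of the displayed image indicated by the user's gaze
    x: int
    y: int
    w: int
    h: int

def detect_instrument(image, region):
    """Hypothetical computer-vision step: analyze only the gaze-selected
    region of the captured image for a surgical instrument. A brightness
    threshold on the sub-image stands in for a real detector."""
    sub = [row[region.x:region.x + region.w]
           for row in image[region.y:region.y + region.h]]
    return any(px > 200 for row in sub for px in row)

class Manipulator:
    """Toy robotic-manipulator model with a clutch: while clutched,
    hand-controller commands do not move the instrument."""
    def __init__(self):
        self.clutched = False
        self.position = 0.0

    def handle_command(self, delta):
        # Input commands move the instrument only when not clutched
        if not self.clutched:
            self.position += delta

def on_gaze(image, region, manipulator):
    # Gaze input initiates the computer-vision analysis of the region;
    # on detection, enter the mode decoupling controller input.
    if detect_instrument(image, region):
        manipulator.clutched = True

# Usage: a bright patch inside the gazed-at region triggers detection,
# after which a controller command produces no motion.
image = [[0] * 8 for _ in range(8)]
image[3][3] = 255
arm = Manipulator()
on_gaze(image, GazeRegion(2, 2, 4, 4), arm)
arm.handle_command(1.0)  # ignored: clutch mode engaged
```

This sketch is one plausible reading of the claim under the stated assumptions; the patent itself does not specify the detection algorithm or the control architecture.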