US 12,001,602 B2
Brain-computer interface with adaptations for high-speed, accurate, and intuitive user interactions
Ramses Alcaide, Boston, MA (US); Dereck Padden, Newton, MA (US); Jay Jantz, Burlington, MA (US); James Hamet, Cambridge, MA (US); Jeffrey Morris, Jr., Cambridge, MA (US); and Arnaldo Pereira, Acton, MA (US)
Assigned to Neurable Inc., Boston, MA (US)
Filed by Neurable Inc., Boston, MA (US)
Filed on May 12, 2020, as Appl. No. 16/872,730.
Application 16/872,730 is a continuation of application No. PCT/US2018/060797, filed on Nov. 13, 2018.
Claims priority of provisional application 62/585,209, filed on Nov. 13, 2017.
Prior Publication US 2020/0268296 A1, Aug. 27, 2020
Int. Cl. A61B 5/16 (2006.01); G06F 3/01 (2006.01); G06F 3/04842 (2022.01)
CPC G06F 3/013 (2013.01) [G06F 3/015 (2013.01); G06F 3/017 (2013.01); G06F 3/04842 (2013.01); G06F 2203/0381 (2013.01)] 23 Claims
OG exemplary drawing
 
1. An apparatus, comprising:
  a display configured to present a control interface to a user;
  an eye-tracking device configured to record eye-movement signals associated with the user;
  a neural recording device configured to record neural signals associated with the user;
  an interfacing device operatively coupled to the display, the eye-tracking device, and the neural recording device, the interfacing device including:
    a memory; and
    a processor operatively coupled to the memory and configured to:
      receive the eye-movement signals from the eye-tracking device and the neural signals from the neural recording device;
      generate and present a stimulus, via the control interface and to the user, the stimulus including a set of control items, each control item from the set of control items associated with an action from a set of actions;
      provide a sticky control item configured to be (1) transitioned to a picked-up state in response to an eye-movement signal indicating a foveation over the sticky control item, (2) moved based on an eye-movement signal of the user, and (3) transitioned to a dropped state on a target control item to activate the target control item;
      provide a grabber object configured to manipulate the sticky control item;
      associate the sticky control item with the grabber object when the sticky control item is transitioned to the picked-up state based on an eye-movement signal indicating a foveation on the sticky control item;
      determine a point of focus of the user based on at least one of the eye-movement signals or the neural signals, the point of focus being associated with an identified control item from the set of control items;
      dissociate, when the sticky control item is at the dropped state and on the identified control item, the sticky control item from the grabber object and associate the sticky control item with the identified control item; and
      activate the identified control item to implement an action intended by the user.
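The claim's gaze-driven "drag and drop" can be read as a small state machine: a sticky control item is picked up on foveation, associated with a grabber object, moved with the user's gaze, and on being dropped over a target it is dissociated from the grabber, associated with the target, and the target's action fires. The following is a minimal illustrative sketch of that state machine, not the patented implementation; all class and method names (`StickyControlItem`, `Grabber`, `TargetControlItem`, `pick_up`, `drop_on`) are hypothetical, and real eye-movement and neural signal processing is abstracted into a plain gaze coordinate.

```python
from enum import Enum, auto


class StickyState(Enum):
    """States the claim attributes to the sticky control item."""
    IDLE = auto()
    PICKED_UP = auto()
    DROPPED = auto()


class StickyControlItem:
    """Hypothetical model of the claim's sticky control item."""

    def __init__(self, name):
        self.name = name
        self.state = StickyState.IDLE
        self.position = (0.0, 0.0)  # stands in for on-screen location


class TargetControlItem:
    """A control item from the set of control items, with its action."""

    def __init__(self, name, action):
        self.name = name
        self.action = action  # callable implementing the associated action
        self.attached = None  # sticky item associated after a drop


class Grabber:
    """Hypothetical 'grabber object' that manipulates the sticky item."""

    def __init__(self):
        self.held = None

    def pick_up(self, item, gaze_point):
        # (1) foveation over the sticky item transitions it to picked-up
        # and associates it with the grabber.
        item.state = StickyState.PICKED_UP
        item.position = gaze_point
        self.held = item

    def move(self, gaze_point):
        # (2) while held, the sticky item tracks the user's eye movement.
        if self.held is not None:
            self.held.position = gaze_point

    def drop_on(self, target):
        # (3) dropping on the identified control item dissociates the
        # sticky item from the grabber, associates it with the target,
        # and activates the target's action.
        item, self.held = self.held, None
        item.state = StickyState.DROPPED
        item.position = target_position = item.position
        target.attached = item
        return target.action()


# Usage: pick up, drag with gaze, drop on a target to activate it.
grabber = Grabber()
sticky = StickyControlItem("sticky")
target = TargetControlItem("play", action=lambda: "play activated")

grabber.pick_up(sticky, gaze_point=(0.2, 0.4))
grabber.move((0.6, 0.5))
result = grabber.drop_on(target)
print(result)  # -> play activated
```

In this reading, the grabber is what makes the interaction "sticky": a single foveation latches the item to the gaze point, so the user need not sustain an unnatural fixation while dragging.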