US 12,118,190 B2
Method for processing data to adjust data inputting speed in human-computer interface system controlled by eye gaze and electroencephalography data
Ha Thanh Le, Ha Noi (VN); Dang Hai Kieu, Ha Noi (VN); Hoa Minh Nguyen, Ha Noi (VN); Anh Viet Nguyen, Ha Noi (VN); Duyen Thi Ngo, Ha Noi (VN); Hung Ba Nguyen, Ha Noi (VN); and Khanh Quoc Man, Bac Ninh (VN)
Assigned to UNIVERSITY OF ENGINEERING AND TECHNOLOGY—VIETNAM NATIONAL UNIVERSITY, Ha Noi (VN)
Filed by Ha Thanh Le, Ha Noi (VN); Dang Hai Kieu, Ha Noi (VN); Hoa Minh Nguyen, Ha Noi (VN); Anh Viet Nguyen, Ha Noi (VN); Duyen Thi Ngo, Ha Noi (VN); Hung Ba Nguyen, Ha Noi (VN); and Khanh Quoc Man, Bac Ninh (VN)
Filed on Jun. 15, 2023, as Appl. No. 18/210,342.
Prior Publication US 2023/0325059 A1, Oct. 12, 2023
Int. Cl. G06F 3/0484 (2022.01); G06F 3/01 (2006.01); G06F 3/038 (2013.01); G06F 3/0482 (2013.01)
CPC G06F 3/0484 (2013.01) [G06F 3/013 (2013.01); G06F 3/038 (2013.01); G06F 3/0482 (2013.01)] 15 Claims
OG exemplary drawing
 
1. A method for processing data executed by a computing apparatus with at least a storage and a processor in an eye gaze controlled human-computer interface system using electroencephalography (EEG) data and eye tracking (ET) data,
wherein the human-computer interface system includes at least a screen configured to display a control interface to the user via a list of keys associated with user actions, an electroencephalography monitoring device for collecting the electroencephalography data, and an eye tracking device for collecting the eye tracking data,
the method comprising the following steps:
representing, through the control interface, at least one or more sets of the list of keys associated with user actions in a specific context;
collecting and preprocessing an ET dataset while the user is looking at the control interface to determine the eye tracking data, which is data in coordinate form (x,y) corresponding to the position on the screen at which the user's eyes look;
defining the key that the user wants to select based on the (x,y) coordinates on the screen at which the user's eyes are looking, by calculating whether said (x,y) coordinates are within the boundary of a certain key;
setting up and calculating a time counting parameter t, while the (x,y) coordinates are within the boundary of the key defined in the foregoing step, to confirm whether the user is trying to select said key, based on at least a time constraint defined as t≥dwell,
wherein dwell is a known parameter corresponding to a required time duration that the user has to look at a key to select that key, and when the time constraint is satisfied, confirming that the user wants to select that key,
said setting up and calculating the time counting parameter t being executed as follows:
setting up a time counting parameter t=0, an increasing rate of the time counting parameter Δ>0, and an adjusting rate ΔT=0,
performing a loop cycle including:
updating the time counting parameter t according to the formula:
t:=t+Δ+ΔT,
checking the time constraint t≥dwell, and
updating the adjusting rate ΔT based on the EEG data,
wherein said updating the adjusting rate ΔT is carried out as follows:
collecting the EEG data within about n seconds immediately before the time t of the current loop, i.e., the time duration from t−n to t;
preprocessing the obtained EEG data;
extracting features from the preprocessed EEG data;
inputting the extracted features from the EEG data into a classifier M to determine a concentration state p and a confidence score confp; and
setting the adjusting rate ΔT based on the concentration state p and the confidence score confp.
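The claimed loop can be sketched in Python as follows. This is a minimal illustration, not the patented implementation: the key layout, the ΔT policy in `adjust_rate`, and the `classify` stub standing in for classifier M (EEG preprocessing, feature extraction, and classification) are all hypothetical names and values chosen for the example.

```python
# Illustrative sketch of claim 1's dwell-time selection loop with an
# EEG-adjusted rate. DWELL, DELTA, and all function names are assumptions.

DWELL = 1.0   # dwell: required gaze duration to select a key (seconds)
DELTA = 0.1   # Δ: fixed increase of t per loop cycle

def in_bounds(xy, key):
    """Hit-test: is the gaze point (x, y) inside the key's rectangle?"""
    x, y = xy
    return key["x0"] <= x <= key["x1"] and key["y0"] <= y <= key["y1"]

def adjust_rate(p, confp):
    """Map concentration state p and confidence confp to ΔT.
    Illustrative policy: a focused user accumulates t faster (selection
    speeds up); a distracted user accumulates t slower."""
    if p == "focused":
        return DELTA * confp          # positive ΔT shortens effective dwell
    return -0.5 * DELTA * confp       # negative ΔT lengthens it

def dwell_select(gaze_samples, key, classify):
    """Run the cycle t := t + Δ + ΔT until t ≥ dwell or the gaze leaves
    the key. `classify` stands in for classifier M and returns
    (concentration state p, confidence score confp)."""
    t, dT = 0.0, 0.0                  # t = 0, ΔT = 0
    for xy in gaze_samples:
        if not in_bounds(xy, key):
            return False              # gaze left the key: abort selection
        t += DELTA + dT               # update the time counting parameter
        if t >= DWELL:
            return True               # time constraint satisfied: select key
        p, confp = classify()         # EEG features -> state and confidence
        dT = adjust_rate(p, confp)    # update the adjusting rate ΔT
    return False                      # samples exhausted before t ≥ dwell
```

With these example constants, a focused user (positive ΔT) confirms the selection in fewer gaze samples than a distracted one, which is the speed-adjustment effect the claim describes.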