US 12,293,018 B2
Wearable computing apparatus and method
Christopher Allen Aimone, Scarborough (CA); Ariel Stephanie Garten, Toronto (CA); Trevor Coleman, Toronto (CA); Locillo (Lou) Giuseppe Pino, Cambridge (CA); Kapil Jay Mishra Vidyarthi, Toronto (CA); Paul Harrison Baranowski, Toronto (CA); Michael Apollo Chabior, Oakville (CA); Tracy Chong, Toronto (CA); Raul Rajiv Rupsingh, Brampton (CA); Madeline Ashby, Toronto (CA); and Paul V. Tadich, Toronto (CA)
Assigned to INTERAXON INC., Toronto (CA)
Filed by INTERAXON INC., Toronto (CA)
Filed on Dec. 18, 2020, as Appl. No. 17/126,587.
Application 17/126,587 is a continuation of application No. 16/440,252, filed on Jun. 13, 2019, granted, now Pat. No. 10,901,509.
Application 16/440,252 is a continuation of application No. 14/216,925, filed on Mar. 17, 2014, granted, now Pat. No. 10,365,716, issued on Jul. 30, 2019.
Claims priority of provisional application 61/792,585, filed on Mar. 15, 2013.
Prior Publication US 2021/0165490 A1, Jun. 3, 2021
Int. Cl. G06F 3/01 (2006.01); A61B 5/00 (2006.01); A61B 5/16 (2006.01); A61B 5/378 (2021.01); A61B 5/38 (2021.01); A61B 5/398 (2021.01); A61M 21/00 (2006.01); G02B 27/01 (2006.01); G02C 11/00 (2006.01); G06F 3/048 (2013.01); G06F 3/0487 (2013.01); G06F 16/90 (2019.01); G09G 3/00 (2006.01); H04W 4/029 (2018.01); H04W 4/30 (2018.01); A61B 5/024 (2006.01); A61B 5/026 (2006.01); A61B 5/375 (2021.01); A61B 5/389 (2021.01); G02C 7/02 (2006.01); H04L 67/12 (2022.01); H04M 1/05 (2006.01)
CPC G06F 3/015 (2013.01) [A61B 5/165 (2013.01); A61B 5/378 (2021.01); A61B 5/38 (2021.01); A61B 5/398 (2021.01); A61B 5/4064 (2013.01); A61B 5/6803 (2013.01); A61M 21/00 (2013.01); G02B 27/017 (2013.01); G02C 11/10 (2013.01); G06F 3/013 (2013.01); G06F 3/048 (2013.01); G06F 3/0487 (2013.01); G06F 16/90 (2019.01); G09G 3/003 (2013.01); H04W 4/029 (2018.02); H04W 4/30 (2018.02); A61B 5/0024 (2013.01); A61B 5/02416 (2013.01); A61B 5/02438 (2013.01); A61B 5/0261 (2013.01); A61B 5/375 (2021.01); A61B 5/389 (2021.01); A61B 5/7267 (2013.01); A61B 5/744 (2013.01); A61B 2560/0493 (2013.01); A61M 2021/0022 (2013.01); A61M 2021/0027 (2013.01); A61M 2021/0044 (2013.01); A61M 2021/0066 (2013.01); A61M 2230/10 (2013.01); A61M 2230/14 (2013.01); G02B 2027/0187 (2013.01); G02C 7/027 (2013.01); H04L 67/12 (2013.01); H04M 1/05 (2013.01); H04M 2250/12 (2013.01)] 26 Claims
OG exemplary drawing
 
1. A computer-implemented method comprising:
displaying an image on a display;
acquiring at least one bio-signal measurement from a user using at least one bio-signal measuring sensor;
receiving visual input of a current field of view of the user from at least one camera oriented to generally align with the user's field of view;
acquiring at least one eye tracking measurement from the user;
detecting one or more objects within the visual input;
identifying at least one location of at least one object of the one or more objects in the current field of view;
processing the at least one bio-signal measurement to determine at least one brain state of the user;
associating the at least one brain state of the user with the at least one object by:
identifying a pattern of eye movement in the at least one eye tracking measurement,
determining that the user is focused on the at least one object using the pattern of eye movement; and
associating the at least one brain state with the at least one object based on the pattern of eye movement and an evoked potential detected in the at least one bio-signal measurement, wherein evoking the potential comprises:
modifying at least one portion of the display corresponding to the at least one location of the at least one object at a frequency; and
evaluating one or more characteristics of the evoked potential related to the frequency to determine the user's attention to the at least one object; and
modifying the image based at least partly on the at least one brain state of the user associated with the at least one object, wherein the modifying the image comprises reducing visibility of the at least one object in the current field of view.
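The "identifying a pattern of eye movement" and "determining that the user is focused" steps of the claim are commonly realized with a dispersion-threshold fixation test: when recent gaze samples stay within a small bounding box, the user is treated as fixating, and the fixation centroid is compared against a detected object's location. The sketch below illustrates that general technique only; the function names, pixel thresholds, and bounding-box format are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def is_fixating(gaze_xy, dispersion_thresh=30.0, min_samples=10):
    """Dispersion-threshold fixation check.

    gaze_xy is an (n, 2) array of recent gaze coordinates in pixels.
    Returns True when enough samples are present and their combined
    horizontal plus vertical spread stays under the threshold.
    """
    pts = np.asarray(gaze_xy, dtype=float)
    if len(pts) < min_samples:
        return False
    dispersion = (pts[:, 0].max() - pts[:, 0].min()) + \
                 (pts[:, 1].max() - pts[:, 1].min())
    return dispersion <= dispersion_thresh

def fixation_on_object(gaze_xy, bbox):
    """True when the user fixates inside an object's bounding box.

    bbox is (x_min, y_min, x_max, y_max) in the same pixel coordinates
    as the gaze samples; the fixation centroid must fall inside it.
    """
    if not is_fixating(gaze_xy):
        return False
    cx, cy = np.asarray(gaze_xy, dtype=float).mean(axis=0)
    x0, y0, x1, y1 = bbox
    return x0 <= cx <= x1 and y0 <= cy <= y1
```

A tight cluster of samples over a detected object's box would qualify as focus on that object, while a saccade sweeping across the view would not.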
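The "modifying ... at a frequency" step is the stimulus half of a frequency-tagging (SSVEP-style) scheme: the display region holding the object is flickered at a known rate so attention to it evokes a potential at that rate. A minimal per-frame modulation could look like the following; the function name, refresh and stimulation parameters, and modulation depth are assumptions for illustration, not the patent's implementation.

```python
import math

def flicker_alpha(frame_index, refresh_hz, stim_hz, depth=0.3):
    """Overlay opacity for frequency-tagging an on-screen object.

    Sampling a depth-scaled sinusoid once per display frame makes the
    tagged region flicker at stim_hz on a refresh_hz display.
    """
    phase = 2.0 * math.pi * stim_hz * frame_index / refresh_hz
    return depth * (1.0 + math.sin(phase)) / 2.0
```

At a 60 Hz refresh with a 12 Hz tag, the modulation repeats exactly every five frames; choosing stimulation frequencies that divide the refresh rate evenly keeps the flicker phase-locked over time.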
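For the "evaluating one or more characteristics of the evoked potential related to the frequency" step, one conventional reading is spectral: compare EEG power at the tagging frequency (and its harmonics) against broadband power, since attended flicker raises power at the tag. The sketch below shows that generic analysis under stated assumptions (single channel, Hann window, arbitrary band widths); it is not asserted to be the patent's method.

```python
import numpy as np

def ssvep_attention_score(eeg, fs, stim_freq, harmonics=2):
    """Score the user's attention to an object tagged at stim_freq.

    Sums spectral power in narrow bands at the stimulation frequency and
    its harmonics, normalized by mean per-Hz broadband power, for one EEG
    channel sampled at fs Hz. Higher scores suggest attended flicker.
    """
    eeg = np.asarray(eeg, dtype=float)
    n = len(eeg)
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)

    def band_power(center, half_width=0.5):
        mask = (freqs >= center - half_width) & (freqs <= center + half_width)
        return spectrum[mask].sum()

    tagged = sum(band_power(stim_freq * h) for h in range(1, harmonics + 1))
    broadband = band_power(20.5, half_width=19.5) / 39.0  # mean per-Hz, 1-40 Hz
    return tagged / (broadband + 1e-12)
```

Thresholding this score (or comparing scores across objects tagged at different frequencies) yields the per-object attention decision the claim uses to associate a brain state with an object.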