US 12,475,367 B2
Image processing system for extracting a behavioral profile from images of an individual specific to an event
Matteo Sorci, Morges (CH); and Timothy Llewellynn, Saint-Prex (CH)
Assigned to BEEMOTION.AI LTD, Nicosia (CY)
Filed by BEEMOTION.AI LTD, Nicosia (CY)
Filed on Jun. 28, 2021, as Appl. No. 17/359,653.
Application 17/359,653 is a continuation-in-part of application No. 16/403,656, filed on May 6, 2019, granted, now 11,048,921.
Claims priority of provisional application 62/668,856, filed on May 9, 2018.
Prior Publication US 2021/0326586 A1, Oct. 21, 2021
Int. Cl. G06V 40/16 (2022.01); A61B 5/00 (2006.01); A61B 5/16 (2006.01); G05D 1/00 (2006.01); G06N 3/04 (2023.01); G06N 3/08 (2023.01); G06V 10/44 (2022.01); G06V 10/764 (2022.01); G06V 10/94 (2022.01); G06V 20/59 (2022.01); G06V 40/20 (2022.01); G16H 15/00 (2018.01); G16H 40/20 (2018.01); B60W 40/08 (2012.01); G16H 50/20 (2018.01); G16H 50/70 (2018.01)
CPC G06N 3/08 (2013.01) [A61B 5/165 (2013.01); A61B 5/4824 (2013.01); A61B 5/7278 (2013.01); A61B 5/746 (2013.01); G05D 1/0061 (2013.01); G05D 1/0088 (2013.01); G06N 3/04 (2013.01); G06V 10/454 (2022.01); G06V 10/764 (2022.01); G06V 10/95 (2022.01); G06V 20/597 (2022.01); G06V 40/165 (2022.01); G06V 40/171 (2022.01); G06V 40/174 (2022.01); G06V 40/176 (2022.01); G06V 40/20 (2022.01); G16H 15/00 (2018.01); G16H 40/20 (2018.01); A61B 5/0077 (2013.01); A61B 2576/02 (2013.01); B60W 40/08 (2013.01); B60W 2420/403 (2013.01); B60W 2540/22 (2013.01); G06V 2201/03 (2022.01); G16H 50/20 (2018.01); G16H 50/70 (2018.01)] 18 Claims
OG exemplary drawing
 
1. An automated image processing method for assessing facially-expressed emotions of an individual, the facially-expressed emotions being caused by operation of a vehicle, machinery, simulator, or robot by the individual, comprising:
operating a vehicle, machinery, or robot by the individual and thereby exposing a vision of the individual to a stimulus;
detecting, by a first computer algorithm, non-verbal communication from a physiognomical expression of the individual based on image data, the physiognomical expression captured in the image data being caused in response to the stimulus;
assigning features of the non-verbal communication to different types of emotions by a second computer algorithm;
based on the features of the non-verbal communication, generating a first data value associated with a first emotion and a second data value associated with a second emotion;
analyzing the different types of emotions to determine an impairment of the individual, at least in part by determining that the first data value exceeds a first threshold or the second data value exceeds a second threshold; and
generating at least one of a prompt, an alert, or a change in a setting of an operational parameter of the vehicle, machinery, or robot, based on the impairment of the individual.
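The claim describes a two-stage pipeline: a first algorithm extracts non-verbal expression features from image data, a second algorithm maps those features to per-emotion data values, the values are compared against thresholds to decide impairment, and an alert or parameter change is generated. The Python sketch below illustrates that flow under stated assumptions only; every class, function, threshold value, and the toy scoring heuristic are hypothetical stand-ins, not the patented algorithms or any particular library's API.

```python
"""Illustrative sketch of the claimed pipeline. All names and values are
hypothetical; trained detectors/classifiers are replaced with trivial stubs."""
from dataclasses import dataclass
from enum import Enum, auto
from typing import Dict, Sequence


class Emotion(Enum):
    ANGER = auto()
    FEAR = auto()
    STRESS = auto()
    NEUTRAL = auto()


@dataclass
class ImpairmentAssessment:
    impaired: bool
    scores: Dict[Emotion, float]
    action: str


def extract_expression_features(image: Sequence[float]) -> Sequence[float]:
    """First algorithm (stand-in): derive non-verbal expression features
    (e.g. facial landmarks or action-unit activations) from image data.
    A trivial normalization stands in for a trained detector."""
    peak = max(image) or 1.0
    return [px / peak for px in image]


def score_emotions(features: Sequence[float]) -> Dict[Emotion, float]:
    """Second algorithm (stand-in): assign expression features to different
    types of emotions, producing one data value per emotion. A real system
    would use a trained classifier here."""
    mean = sum(features) / len(features)
    return {
        Emotion.ANGER: mean * 0.9,
        Emotion.FEAR: mean * 0.6,
        Emotion.STRESS: mean * 1.1,
        Emotion.NEUTRAL: 1.0 - mean,
    }


# Hypothetical per-emotion thresholds; the claim only requires that the first
# data value exceed a first threshold OR the second exceed a second threshold.
THRESHOLDS = {Emotion.ANGER: 0.7, Emotion.STRESS: 0.8}


def assess_impairment(image: Sequence[float]) -> ImpairmentAssessment:
    features = extract_expression_features(image)
    scores = score_emotions(features)
    impaired = any(scores[e] > t for e, t in THRESHOLDS.items())
    # Generate a prompt/alert or change an operational parameter when impaired.
    action = "alert: limit vehicle speed" if impaired else "none"
    return ImpairmentAssessment(impaired, scores, action)


if __name__ == "__main__":
    frame = [0.2, 0.9, 0.8, 0.7]  # toy stand-in for one frame of image data
    print(assess_impairment(frame))
```

Note that the disjunctive threshold test (`any(...)`) mirrors the claim's requirement that impairment be determined at least in part by the first data value exceeding a first threshold or the second data value exceeding a second threshold; the specific thresholds and the speed-limiting action are illustrative choices only.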