| CPC G06N 3/08 (2013.01) [A61B 5/165 (2013.01); A61B 5/4824 (2013.01); A61B 5/7278 (2013.01); A61B 5/746 (2013.01); G05D 1/0061 (2013.01); G05D 1/0088 (2013.01); G06N 3/04 (2013.01); G06V 10/454 (2022.01); G06V 10/764 (2022.01); G06V 10/95 (2022.01); G06V 20/597 (2022.01); G06V 40/165 (2022.01); G06V 40/171 (2022.01); G06V 40/174 (2022.01); G06V 40/176 (2022.01); G06V 40/20 (2022.01); G16H 15/00 (2018.01); G16H 40/20 (2018.01); A61B 5/0077 (2013.01); A61B 2576/02 (2013.01); B60W 40/08 (2013.01); B60W 2420/403 (2013.01); B60W 2540/22 (2013.01); G06V 2201/03 (2022.01); G16H 50/20 (2018.01); G16H 50/70 (2018.01)] | 18 Claims |

1. An automated image processing method for assessing facially-expressed emotions of an individual, the facially-expressed emotions being caused by operation of a vehicle, machinery, simulator, or robot by the individual, comprising:
operating a vehicle, machinery, or robot by the individual and thereby exposing the vision of the individual to a stimulus;
detecting non-verbal communication from a physiognomical expression of the individual based on image data by a first computer algorithm, the physiognomical expression captured in the image data being caused in response to the stimulus;
assigning features of the non-verbal communication to different types of emotions by a second computer algorithm;
based on the features of the non-verbal communication, generating a first data value associated with a first emotion and a second data value associated with a second emotion;
analyzing the different types of emotions to determine an impairment of the individual, at least in part by determining that the first data value exceeds a first threshold or the second data value exceeds a second threshold; and
generating at least one of a prompt, an alert, or a change in a setting of an operational parameter of the vehicle, machinery, or robot, based on the impairment of the individual.
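A minimal sketch of the pipeline recited in claim 1 is given below for illustration only; it is not the patented implementation. The claim specifies a first computer algorithm detecting non-verbal communication from image data, a second computer algorithm assigning features to emotion types, per-emotion data values compared against thresholds, and a resulting prompt, alert, or change of an operational parameter. All names in the sketch (OperatorSettings, detect_nonverbal_features, assign_emotions, assess_impairment, respond_to_impairment), the two example emotions ("anger", "fear"), the threshold values, and the NumPy-based placeholder computations are assumptions introduced for this sketch and do not appear in the claim.

from dataclasses import dataclass
from typing import Dict

import numpy as np


@dataclass
class OperatorSettings:
    """Hypothetical operational parameters of the vehicle, machinery, or robot."""
    max_speed: float = 1.0   # normalized speed limit
    assist_level: int = 0    # 0 = no assistance, higher = more automation


def detect_nonverbal_features(image: np.ndarray) -> np.ndarray:
    """First computer algorithm (placeholder): detect non-verbal communication
    from the physiognomical expression captured in the image data."""
    # Stand-in for a face-landmark or CNN feature extractor; returns a
    # fixed-length feature vector derived from the image.
    return image.astype(np.float32).ravel()[:128]


def assign_emotions(features: np.ndarray) -> Dict[str, float]:
    """Second computer algorithm (placeholder): assign the features to
    different types of emotions as data values in [0, 1]."""
    # Stand-in for a trained classifier; a real system would learn this mapping.
    first_value = float(np.clip(features.mean() / 255.0, 0.0, 1.0))
    second_value = float(np.clip(features.std() / 255.0, 0.0, 1.0))
    return {"anger": first_value, "fear": second_value}


def assess_impairment(scores: Dict[str, float],
                      first_threshold: float = 0.8,
                      second_threshold: float = 0.7) -> bool:
    """Impairment is indicated when the first data value exceeds the first
    threshold or the second data value exceeds the second threshold."""
    return scores["anger"] > first_threshold or scores["fear"] > second_threshold


def respond_to_impairment(settings: OperatorSettings) -> OperatorSettings:
    """Generate a prompt/alert and change a setting of an operational parameter."""
    print("ALERT: operator impairment detected - reduce speed and increase assistance.")
    return OperatorSettings(max_speed=0.5, assist_level=2)


if __name__ == "__main__":
    frame = np.full((64, 64), 220, dtype=np.uint8)  # image data caused in response to the stimulus
    features = detect_nonverbal_features(frame)     # first computer algorithm
    scores = assign_emotions(features)              # second computer algorithm
    print("emotion data values:", scores)
    settings = OperatorSettings()
    if assess_impairment(scores):                   # threshold comparison
        settings = respond_to_impairment(settings)  # prompt/alert and parameter change
    print("operational settings:", settings)

Under these placeholder computations, the uniform bright frame yields a first data value above the first threshold, so the demonstration exercises the alert path and the change to the operational parameters.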