US 12,116,012 B2
System and method for autonomous vehicle control based on monitoring driver state
Sun Hong Park, Cheonan-si (KR); Young Dal Oh, Cheonan-si (KR); Jin Hae Yae, Cheonan-si (KR); Dong Woon Ryu, Cheonan-si (KR); and Hun Lee, Cheonan-si (KR)
Assigned to Korea Automotive Technology Institute, Cheonan-si (KR)
Filed by Korea Automotive Technology Institute, Cheonan-si (KR)
Filed on Aug. 17, 2021, as Appl. No. 17/404,458.
Claims priority of application No. 10-2020-0169822 (KR), filed on Dec. 7, 2020; application No. 10-2020-0169823 (KR), filed on Dec. 7, 2020; and application No. 10-2020-0169824 (KR), filed on Dec. 7, 2020.
Prior Publication US 2022/0176996 A1, Jun. 9, 2022
Int. Cl. B60K 35/00 (2024.01); B60W 10/20 (2006.01); B60W 30/14 (2006.01); B60W 30/16 (2020.01); B60W 40/02 (2006.01); B60W 40/09 (2012.01); B60W 50/08 (2020.01); B60W 60/00 (2020.01); B60K 35/23 (2024.01); B60K 35/29 (2024.01)
CPC B60W 60/0013 (2020.02) [B60K 35/00 (2013.01); B60W 10/20 (2013.01); B60W 30/143 (2013.01); B60W 30/16 (2013.01); B60W 40/02 (2013.01); B60W 40/09 (2013.01); B60W 50/082 (2013.01); B60K 35/23 (2024.01); B60K 35/29 (2024.01); B60K 2360/188 (2024.01); B60W 2540/225 (2020.02); B60W 2540/229 (2020.02); B60W 2540/30 (2013.01); B60W 2554/801 (2020.02); B60W 2554/802 (2020.02)] 12 Claims
OG exemplary drawing
 
1. A vehicle control system comprising:
an input unit including one or more related sensors and configured to collect driving situation data, infrastructure data, forward situation data, and driver's state data;
a memory configured to store a program for determining a driving pattern using the driving situation data, the infrastructure data, the forward situation data, and the driver's state data in a case of an autonomous driving mode; and
a processor configured to execute the program,
wherein the processor is further configured to:
perform learning by matching autonomous driving data with the driving situation data, the infrastructure data, the forward situation data including information of whether a gaze point of a driver overlaps a location of a front vehicle, and the driver's state data;
determine an autonomous driving control pattern based on a result of the learning, and transmit an autonomous driving control command according to the determined autonomous driving control pattern;
obtain a reaction of the driver to an autonomous driving, which is performed based on the autonomous driving control command, by collecting, using an imaging sensor, an image of the driver and collecting, using a biometric sensor, a change in the driver's state data due to the autonomous driving;
update and store the autonomous driving data based on the obtained reaction of the driver; and
display head-up display (HUD) information on a windshield of a vehicle in an HUD information display position that is aligned with the gaze point of the driver, and
wherein the processor is further configured to, based on the gaze point of the driver being changed:
determine, based on whether navigation information indicating to make a turn within a certain distance from a current position of the vehicle has been provided to the driver, whether a change of the gaze point is a temporary deviation and the gaze point is expected to return to a previous gaze direction after making the turn;
maintain the HUD information display position in response to determining that the change of the gaze point is for making the turn as indicated by the navigation information and thus is the temporary deviation; and
change the HUD information display position to align with the changed gaze point, in response to determining that the change of the gaze point is not for making the turn as indicated by the navigation information and thus is not the temporary deviation,
wherein, in a case of a driver driving mode, the memory stores a second program for determining a head-up display (HUD) control command using the driver's state data including driver's gaze position information and the driving situation data including position and color information of a surrounding object,
wherein the processor executes the second program and transmits the HUD control command in consideration of the color information of the surrounding object displayed in the HUD information display position, and
wherein the processor compares a color of content displayed in the HUD information display position with a color of the surrounding object and transmits the HUD control command for changing at least one of color, luminance, and brightness of a head-up display user interface (HUD UI) and a size of the content in consideration of visibility for each driving time.
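The learn/control/react/update cycle recited in the body of claim 1 (matching collected data against stored autonomous driving data, determining a control pattern, and updating the stored data with the driver's reaction) can be illustrated with a minimal sketch. All names here are assumptions, and the "learning" is a toy frequency count standing in for whatever matching scheme the patent actually contemplates:

```python
# Hypothetical sketch of one iteration of the claim-1 control cycle.
# `autonomous_data` is a dict used as a toy stand-in for the stored
# autonomous driving data; none of these names come from the patent.

def control_cycle(autonomous_data, inputs, reaction):
    """Learn by matching inputs with stored data, determine a control
    pattern, then update the stored data with the driver's reaction."""
    # "Learning" here is a toy count over matched situations.
    key = (inputs["driving"], inputs["gaze_on_front_vehicle"], inputs["driver"])
    record = autonomous_data.setdefault(key, {"count": 0, "reactions": []})
    record["count"] += 1

    # Toy control pattern: slow down when the driver's gaze point does
    # not overlap the front vehicle (attention is elsewhere).
    command = "maintain_speed" if inputs["gaze_on_front_vehicle"] else "decelerate"

    # Update and store the data based on the driver's reaction.
    record["reactions"].append(reaction)
    return command
```

The per-situation `reactions` list is where a real system would fold the imaging-sensor image and biometric change back into the stored autonomous driving data.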
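The gaze-deviation logic in the second "wherein" clause (keep the HUD position when the gaze change is a temporary deviation for a navigation-indicated turn, otherwise realign it) reduces to a small decision function. This is a sketch with hypothetical names and an assumed distance threshold, not an implementation from the patent:

```python
def decide_hud_position(current_hud_pos, prev_gaze, new_gaze,
                        navigation_turn_ahead, turn_distance_m,
                        turn_threshold_m=200.0):
    """Keep or move the HUD information display position when the
    driver's gaze point changes (hypothetical helper).

    The change is treated as a temporary deviation when navigation
    information has indicated a turn within `turn_threshold_m` of the
    vehicle's current position, so the gaze is expected to return to
    the previous direction after the turn.
    """
    if new_gaze == prev_gaze:
        return current_hud_pos  # gaze point unchanged
    temporary = navigation_turn_ahead and turn_distance_m <= turn_threshold_m
    if temporary:
        return current_hud_pos  # maintain position; gaze should return
    return new_gaze             # realign HUD with the changed gaze point
```

The 200 m threshold is purely illustrative; the claim only requires "a certain distance from a current position of the vehicle."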
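The final "wherein" clauses compare the color of displayed HUD content with the color of the surrounding object behind it and, when visibility suffers, command a change of color, luminance/brightness, or content size. A minimal sketch, assuming a Rec. 709 luminance approximation and an arbitrary contrast threshold (neither is specified by the patent):

```python
def relative_luminance(rgb):
    """Approximate luminance of an (R, G, B) color in 0..255 range,
    using Rec. 709 luma weights (an assumption, not from the patent)."""
    r, g, b = (c / 255.0 for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def hud_control_command(content_rgb, surrounding_rgb,
                        min_contrast=0.3, is_night=False):
    """Build a hypothetical HUD control command when displayed content
    does not stand out against the surrounding object's color."""
    contrast = abs(relative_luminance(content_rgb)
                   - relative_luminance(surrounding_rgb))
    cmd = {"change_color": False, "luminance": "normal", "scale": 1.0}
    if contrast < min_contrast:
        cmd["change_color"] = True                     # pick a contrasting UI color
        cmd["luminance"] = "low" if is_night else "high"  # per driving time
        cmd["scale"] = 1.2                             # enlarge content
    return cmd
```

The `is_night` flag is a stand-in for the claim's "visibility for each driving time": at night a bright HUD can itself glare, so luminance is lowered rather than raised.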