US 11,749,026 B2
Automotive and industrial motion sensory device
David S. Holz, San Francisco, CA (US); Justin Schunick, Oakland, CA (US); Neeloy Roy, San Francisco, CA (US); Chen Zheng, San Francisco, CA (US); and Ward Travis, San Mateo, CA (US)
Assigned to Ultrahaptics IP Two Limited, Bristol (GB)
Filed by Ultrahaptics IP Two Limited, Bristol (GB)
Filed on Jun. 23, 2022, as Appl. No. 17/848,181.
Application 17/848,181 is a continuation of application No. 14/826,102, filed on Aug. 13, 2015, granted, now 11,386,711.
Claims priority of provisional application 62/038,112, filed on Aug. 15, 2014.
Prior Publication US 2022/0319235 A1, Oct. 6, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 3/042 (2006.01); G06V 40/20 (2022.01); G01P 13/00 (2006.01); G06F 3/01 (2006.01); G06T 7/70 (2017.01); G06V 20/59 (2022.01); H04N 23/11 (2023.01); G06T 7/20 (2017.01); H04N 7/18 (2006.01)
CPC G06V 40/20 (2022.01) [G01P 13/00 (2013.01); G06F 3/01 (2013.01); G06F 3/011 (2013.01); G06F 3/017 (2013.01); G06F 3/042 (2013.01); G06T 7/20 (2013.01); G06T 7/70 (2017.01); G06V 20/59 (2022.01); H04N 7/181 (2013.01); H04N 23/11 (2023.01); G06T 2207/10012 (2013.01); G06T 2207/10024 (2013.01); G06T 2207/10048 (2013.01); G06T 2207/30196 (2013.01); G06T 2207/30268 (2013.01)] 13 Claims
1. A motion sensory control system, including:
a controller coupled to imaging sensors, sonic transducers, and one or more illumination sources to control operation thereof, the controller being configured to acquire imaging information and sonic information of a scene reflected from a body part of an occupant of a vehicle in the scene to the imaging sensors and the sonic transducers; and
wherein the controller is further configured to interact with an augmented reality system providing a heads-up display (HUD) on a windshield of the vehicle, such that a graphical object representing the body part of the occupant is displayed by the HUD onto the windshield and the graphical object is capable of interacting with other objects displayed by the HUD as a result of movement of the body part of the occupant, and
wherein the controller is further configured to determine an overall level of happiness of occupants of the vehicle according to at least a frequency of changes in environmental controls of the vehicle, and
wherein the controller is further configured to select an icon that illustrates facial features inherently representing the determined overall level of happiness, and
wherein the controller is further configured to interact with the augmented reality system to display an augmented reality presentation by the HUD onto the windshield including the selected icon, such that a driver of the vehicle is informed of the overall level of happiness of other occupants of the vehicle, and
wherein the changes in environmental controls of the vehicle include changes in vehicle temperature control.
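The claim's occupant-comfort limitation (elements recited in the third through fifth "wherein" clauses) describes a simple pipeline: count how often occupants adjust environmental controls such as temperature, map that frequency to an overall happiness level, and select a facial-expression icon for the HUD presentation. The following sketch illustrates that logic; all function names, thresholds, and icon choices are invented for illustration and do not appear in the patent.

```python
# Hypothetical sketch of the claimed happiness-determination pipeline.
# Thresholds and names are assumptions, not taken from the specification.

def happiness_level(control_changes_per_hour: float) -> str:
    """Map the frequency of environmental-control changes (e.g. vehicle
    temperature adjustments) to a coarse happiness level; frequent
    adjustments are treated as a sign of occupant discomfort."""
    if control_changes_per_hour <= 1:
        return "happy"
    if control_changes_per_hour <= 4:
        return "neutral"
    return "unhappy"

# Icons whose facial features inherently represent each level, per the claim.
ICONS = {"happy": ":-)", "neutral": ":-|", "unhappy": ":-("}

def select_icon(control_changes_per_hour: float) -> str:
    """Select the icon to include in the augmented reality presentation
    displayed by the HUD onto the windshield, informing the driver of
    the overall happiness level of the other occupants."""
    return ICONS[happiness_level(control_changes_per_hour)]
```

For example, two temperature adjustments per hour would yield the "neutral" icon under these assumed thresholds; the granted claim itself specifies only that the determination depends at least on the frequency of such changes, not any particular mapping.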