US 12,235,120 B2
Rendering a situational-awareness view in an autonomous-vehicle environment
Robert Earl Rasmusson, San Francisco, CA (US); Taggart Matthiesen, Kentfield, CA (US); Craig Dehner, San Francisco, CA (US); Linda Dong, San Francisco, CA (US); Frank Taehyun Yoo, San Carlos, CA (US); Karina van Schaardenburg, San Francisco, CA (US); John Tighe, San Francisco, CA (US); Matt Vitelli, San Francisco, CA (US); Jisi Guo, San Francisco, CA (US); and Eli Guerron, San Francisco, CA (US)
Assigned to Lyft, Inc., San Francisco, CA (US)
Filed by Lyft, Inc., San Francisco, CA (US)
Filed on Apr. 29, 2021, as Appl. No. 17/244,838.
Application 17/244,838 is a continuation of application No. 15/812,636, filed on Nov. 14, 2017, granted, now Pat. No. 11,010,615.
Claims priority of provisional application 62/422,025, filed on Nov. 14, 2016.
Prior Publication US 2021/0326602 A1, Oct. 21, 2021
Int. Cl. G01C 21/26 (2006.01); B60W 60/00 (2020.01); B62D 15/02 (2006.01); G01C 21/36 (2006.01); G05D 1/00 (2006.01); G06F 18/24 (2023.01); G06T 7/20 (2017.01); G06V 20/56 (2022.01)
CPC G01C 21/3638 (2013.01) [B60W 60/00253 (2020.02); B62D 15/0285 (2013.01); G01C 21/365 (2013.01); G05D 1/0044 (2013.01); G05D 1/0088 (2013.01); G05D 1/0212 (2013.01); G05D 1/0246 (2013.01); G05D 1/0274 (2013.01); G06F 18/24 (2023.01); G06T 7/20 (2013.01); G06V 20/56 (2022.01); B60W 2420/403 (2013.01); B60W 2420/408 (2024.01); B60W 2554/00 (2020.02); B60W 2555/20 (2020.02); B60W 2556/10 (2020.02); G06T 2207/30252 (2013.01); G06T 2207/30256 (2013.01); G06T 2207/30261 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising:
determining a desired field of view for a user in a vehicle while traveling on a road;
receiving sensor data corresponding to the desired field of view from a sensor array of the vehicle;
determining a confidence score for a classification of an object based on the sensor data;
determining that the object is a target object based on the confidence score for the classification of the object;
determining that the object is within a threshold distance from the vehicle;
in response to determining that the object is within the threshold distance, generating an object graphic corresponding to the object;
in response to determining that the object is the target object, retrieving, from one or more third-party map-based systems, third-party data associated with the object;
generating an overlay graphic based on the third-party data; and
providing for display, on a user interface device communicatively coupled with the vehicle, the object graphic and the overlay graphic.
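The exemplary claim recites a pipeline that classifies objects from vehicle sensor data, filters them by classification confidence and by distance from the vehicle, retrieves third-party map data for target objects, and renders object and overlay graphics on a user interface device. The Python sketch below is one illustrative reading of that flow; it is not taken from the patent specification, and every identifier (SensorReading, fetch_third_party_data, the threshold constants) is a hypothetical assumption used only to make the claimed steps concrete.

# Hypothetical sketch of the claimed rendering pipeline. All names and
# thresholds are illustrative assumptions, not identifiers from the patent.
from dataclasses import dataclass
from typing import List
import math

CONFIDENCE_THRESHOLD = 0.8   # assumed cutoff for treating a classification as a target object
DISTANCE_THRESHOLD_M = 50.0  # assumed radius (meters) within which an object graphic is generated


@dataclass
class SensorReading:
    """One detected object from the vehicle's sensor array (assumed schema)."""
    label: str            # e.g. "vehicle", "pedestrian", "landmark"
    confidence: float     # classification confidence score in [0, 1]
    x: float              # position relative to the vehicle, meters
    y: float


@dataclass
class Graphic:
    """A renderable element of the situational-awareness view."""
    kind: str             # "object" or "overlay"
    payload: dict


def distance_from_vehicle(reading: SensorReading) -> float:
    """Euclidean distance of the detected object from the vehicle."""
    return math.hypot(reading.x, reading.y)


def fetch_third_party_data(label: str) -> dict:
    """Stand-in for a query to a third-party map-based system (assumed API)."""
    return {"name": f"{label} info", "source": "third-party-maps"}


def build_view(readings: List[SensorReading]) -> List[Graphic]:
    """Walk the claim's steps for each detection within the desired field of view."""
    graphics: List[Graphic] = []
    for reading in readings:
        # Determine from the confidence score whether the object is a target object.
        is_target = reading.confidence >= CONFIDENCE_THRESHOLD

        # Generate an object graphic when the object is within the threshold distance.
        if distance_from_vehicle(reading) <= DISTANCE_THRESHOLD_M:
            graphics.append(Graphic("object", {"label": reading.label,
                                               "x": reading.x, "y": reading.y}))

        # For target objects, retrieve third-party data and generate an overlay graphic.
        if is_target:
            data = fetch_third_party_data(reading.label)
            graphics.append(Graphic("overlay", {"label": reading.label, "data": data}))
    return graphics


if __name__ == "__main__":
    # Provide the graphics for display on a user-interface device (stdout here).
    sample = [SensorReading("landmark", 0.93, 12.0, 4.0),
              SensorReading("vehicle", 0.55, 30.0, -2.0)]
    for g in build_view(sample):
        print(g)

In this sketch the distance check gates only the object graphic and the confidence check gates only the overlay graphic, mirroring the two separate "in response to" clauses of the claim; the actual ordering and coupling of those determinations in the patented system may differ.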