US 12,228,928 B1
System and method for evaluating the perception system of an autonomous vehicle
Jiajun Zhu, Palo Alto, CA (US); Christopher Paul Urmson, Mountain View, CA (US); Dirk Haehnel, Sunnyvale, CA (US); Nathaniel Fairfield, Rockingham, VA (US); and Russell Leigh Smith, Los Altos, CA (US)
Assigned to Waymo LLC, Mountain View, CA (US)
Filed by WAYMO LLC, Mountain View, CA (US)
Filed on Jul. 11, 2023, as Appl. No. 18/220,628.
Application 18/220,628 is a continuation of application No. 17/387,199, filed on Jul. 28, 2021, granted, now 11,747,809.
Application 17/387,199 is a continuation of application No. 16/692,643, filed on Nov. 22, 2019, granted, now 11,106,893, issued on Aug. 31, 2021.
Application 16/692,643 is a continuation of application No. 16/209,429, filed on Feb. 27, 2019, granted, now 10,572,717, issued on Feb. 25, 2020.
Application 16/209,429 is a continuation of application No. 15/874,130, filed on Jan. 18, 2018, granted, now 10,198,619, issued on Feb. 5, 2019.
Application 15/874,130 is a division of application No. 15/587,680, filed on May 5, 2017, granted, now 9,911,030, issued on Mar. 6, 2018.
Application 15/587,680 is a continuation of application No. 14/792,995, filed on Jul. 7, 2015, granted, now 9,679,191, issued on Jun. 13, 2017.
Application 14/792,995 is a continuation of application No. 13/200,958, filed on Oct. 5, 2011, granted, now 9,122,948, issued on Sep. 1, 2015.
Claims priority of provisional application 61/391,271, filed on Oct. 8, 2010.
Claims priority of provisional application 61/390,094, filed on Oct. 5, 2010.
Int. Cl. G06K 9/00 (2022.01); B60R 1/00 (2022.01); B60T 7/22 (2006.01); B60T 8/00 (2006.01); B60T 8/17 (2006.01); B60T 8/88 (2006.01); B60T 17/18 (2006.01); B60T 17/22 (2006.01); B60W 30/08 (2012.01); B60W 40/06 (2012.01); B60W 50/14 (2020.01); G01C 21/36 (2006.01); G01S 17/86 (2020.01); G01S 17/931 (2020.01); G05D 1/00 (2006.01); G06Q 10/02 (2012.01); G06Q 30/02 (2023.01); G06Q 30/0207 (2023.01); G06Q 40/08 (2012.01); G06T 7/20 (2017.01); G06T 7/223 (2017.01); G06T 7/231 (2017.01); G06T 7/521 (2017.01); G06T 7/73 (2017.01); G06V 20/58 (2022.01); G06V 20/64 (2022.01); G07C 9/00 (2020.01); B60W 30/186 (2012.01); B60W 50/029 (2012.01); B62D 6/00 (2006.01); G01S 13/86 (2006.01); G06V 10/20 (2022.01); G06V 20/56 (2022.01)
CPC G05D 1/0088 (2013.01) [B60R 1/00 (2013.01); B60T 7/22 (2013.01); B60T 8/00 (2013.01); B60T 8/17 (2013.01); B60T 8/885 (2013.01); B60T 17/18 (2013.01); B60T 17/221 (2013.01); B60W 30/08 (2013.01); B60W 40/06 (2013.01); B60W 50/14 (2013.01); G01C 21/3617 (2013.01); G01S 17/86 (2020.01); G01S 17/931 (2020.01); G05D 1/0055 (2013.01); G05D 1/021 (2013.01); G05D 1/0212 (2013.01); G05D 1/0214 (2013.01); G05D 1/0276 (2013.01); G06Q 10/02 (2013.01); G06Q 30/02 (2013.01); G06Q 30/0207 (2013.01); G06Q 40/08 (2013.01); G06T 7/20 (2013.01); G06T 7/223 (2017.01); G06T 7/231 (2017.01); G06T 7/521 (2017.01); G06T 7/74 (2017.01); G06V 20/58 (2022.01); G06V 20/64 (2022.01); G07C 9/00563 (2013.01); B60R 2300/30 (2013.01); B60T 2201/022 (2013.01); B60T 2210/32 (2013.01); B60T 2270/406 (2013.01); B60W 30/186 (2013.01); B60W 2050/0292 (2013.01); B60W 2420/403 (2013.01); B60W 2420/408 (2024.01); B60W 2552/05 (2020.02); B60W 2555/60 (2020.02); B60W 2556/10 (2020.02); B60W 2556/50 (2020.02); B62D 6/00 (2013.01); G01S 13/865 (2013.01); G01S 13/867 (2013.01); G05B 2219/2637 (2013.01); G05D 1/024 (2013.01); G05D 1/0246 (2013.01); G05D 1/0257 (2013.01); G05D 1/0274 (2013.01); G05D 1/0278 (2013.01); G06T 2207/10004 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/30236 (2013.01); G06T 2207/30252 (2013.01); G06T 2207/30261 (2013.01); G06V 10/255 (2022.01); G06V 20/56 (2022.01); G06V 20/588 (2022.01)] 20 Claims
OG exemplary drawing
 
1. A server comprising:
one or more processors configured to:
receive, from an autonomous vehicle remotely located from the server, one or more images and a first set of object labels applied to objects in the one or more images identified at the autonomous vehicle;
compare the first set of object labels to a second set of object labels applied to objects in the one or more images identified by a review; and
when a result of the comparison indicates that there is at least a threshold difference between the first set of object labels and the second set of object labels, generate a recommendation of optimization of object detection parameters for the autonomous vehicle.
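The following is a minimal Python sketch of the compare-and-threshold logic recited in claim 1, included only as an illustration; it is not part of the patent text. The class and function names, the symmetric-difference metric, and the threshold value are hypothetical assumptions, since the claim does not specify how the label sets are represented or how the threshold difference is measured.

from __future__ import annotations
from dataclasses import dataclass


@dataclass(frozen=True)
class LabelledObject:
    # One object label applied to an object detected in an image.
    image_id: str
    object_type: str        # e.g. "vehicle", "pedestrian", "traffic sign"
    bounding_box: tuple     # (x_min, y_min, x_max, y_max); encoding is hypothetical


def label_set_difference(vehicle_labels: set[LabelledObject],
                         review_labels: set[LabelledObject]) -> float:
    # Fraction of labels on which the two sets disagree (symmetric-difference
    # ratio). The patent does not specify a particular difference metric.
    union = vehicle_labels | review_labels
    if not union:
        return 0.0
    return len(vehicle_labels ^ review_labels) / len(union)


THRESHOLD = 0.10  # hypothetical value for the claimed "threshold difference"


def evaluate_perception(vehicle_labels: set[LabelledObject],
                        review_labels: set[LabelledObject]) -> str | None:
    # Generate a recommendation only when the label sets differ by at least
    # the threshold, mirroring the conditional step of claim 1.
    if label_set_difference(vehicle_labels, review_labels) >= THRESHOLD:
        return "Recommend optimization of object detection parameters."
    return None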