US 11,948,227 B1
Eliminating the appearance of vehicles and/or other objects when operating an autonomous vehicle
Benjamin Piya Austin, Saline, MI (US); John K. Lenneman, Okemos, MI (US); George M. Evans, Ann Arbor, MI (US); Takeshi Yoshida, Ann Arbor, MI (US); William Patrick Garrett, Plymouth, MI (US); and Rebecca L. Kirschweng, Whitmore Lake, MI (US)
Assigned to TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC., Plano, TX (US); and TOYOTA JIDOSHA KABUSHIKI KAISHA, Aichi-Ken (JP)
Filed by Toyota Motor Engineering & Manufacturing North America, Inc., Plano, TX (US)
Filed on Apr. 18, 2023, as Appl. No. 18/135,947.
Int. Cl. G06T 11/00 (2006.01); G06F 3/01 (2006.01); G06T 5/00 (2006.01); G06V 20/20 (2022.01); G06V 20/58 (2022.01); G09G 3/00 (2006.01); B60K 35/00 (2006.01); G02B 27/01 (2006.01)
CPC G06T 11/00 (2013.01) [G06F 3/015 (2013.01); G06T 5/002 (2013.01); G06V 20/20 (2022.01); G06V 20/58 (2022.01); G09G 3/002 (2013.01); B60K 35/00 (2013.01); B60K 2370/1529 (2019.05); B60K 2370/177 (2019.05); B60R 2300/205 (2013.01); B60R 2300/308 (2013.01); G02B 27/01 (2013.01); G06F 2203/011 (2013.01); G06T 2207/30252 (2013.01); G06V 2201/08 (2022.01)] 20 Claims
OG exemplary drawing
 
1. A method for eliminating an appearance of one or more objects in a real-world environment surrounding a vehicle during operation of the vehicle, the method comprising:
receiving data from the vehicle indicating at least a location of the vehicle within the real-world environment;
detecting the one or more objects in first sensor data collected from one or more first sensors onboard the vehicle, wherein the first sensor data is representative of the real-world environment surrounding the vehicle that is observable within a field of view of an augmented reality display associated with the vehicle;
receiving, from one or more other vehicles in the real-world environment, a location of each of the one or more other vehicles in the real-world environment;
receiving, from the one or more other vehicles, second sensor data collected from one or more second sensors onboard the one or more other vehicles, wherein the second sensor data is representative of the real-world environment surrounding each of the one or more other vehicles;
generating one or more augmented reality images depicting portions of the real-world environment obstructed by the one or more objects in the first sensor data using the second sensor data collected from the one or more other vehicles and based on the location of each of the one or more other vehicles and the location of the vehicle; and
displaying the one or more augmented reality images in the augmented reality display such that the one or more augmented reality images are positioned to overlay the one or more objects.
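Claim 1 recites a pipeline: the ego vehicle reports its location, detects objects that obstruct the view rendered on its augmented reality display, receives locations and sensor data from nearby vehicles, generates images of the occluded portions of the environment from that shared data, and overlays those images on the detected objects. The Python sketch below illustrates one possible arrangement of those steps under stated assumptions; it is not drawn from the patent specification, and every identifier (Location, SensorFrame, generate_ar_images, and so on) is hypothetical.

# Illustrative sketch only; all names are assumptions, not the patented implementation.
from dataclasses import dataclass
from typing import List


@dataclass
class Location:
    latitude: float
    longitude: float
    heading_deg: float


@dataclass
class SensorFrame:
    source_vehicle_id: str
    location: Location
    image: bytes          # raw camera frame of the sender's surroundings


@dataclass
class DetectedObject:
    object_id: str
    bounding_box: tuple   # (x, y, width, height) within the AR display's field of view


@dataclass
class AugmentedRealityImage:
    object_id: str
    bounding_box: tuple
    pixels: bytes         # reconstructed view of the region hidden by the object


def detect_obstructing_objects(first_sensor_data: SensorFrame) -> List[DetectedObject]:
    """Detect objects in the ego vehicle's own (first) sensor data.
    A real system would run an object detector; this stub returns a fixed placeholder."""
    return [DetectedObject(object_id="lead-truck", bounding_box=(400, 220, 320, 240))]


def generate_ar_images(ego_location: Location,
                       obstructions: List[DetectedObject],
                       remote_frames: List[SensorFrame]) -> List[AugmentedRealityImage]:
    """Synthesize an image of the scenery hidden behind each detected object,
    using second sensor data shared by other vehicles, selected by their locations."""
    ar_images = []
    for obj in obstructions:
        if not remote_frames:
            continue
        # Naively take the first remote frame; a real system would choose the
        # vehicle whose vantage point sees past the obstruction and reproject
        # its imagery into the ego vehicle's field of view.
        source = remote_frames[0]
        ar_images.append(AugmentedRealityImage(
            object_id=obj.object_id,
            bounding_box=obj.bounding_box,
            pixels=source.image,   # placeholder for the reprojected imagery
        ))
    return ar_images


def render_overlays(ar_images: List[AugmentedRealityImage]) -> None:
    """Position each generated image over the corresponding object in the AR display."""
    for ar in ar_images:
        print(f"overlaying object {ar.object_id} at {ar.bounding_box}")


if __name__ == "__main__":
    ego = Location(latitude=42.28, longitude=-83.74, heading_deg=90.0)
    own_frame = SensorFrame("ego", ego, image=b"")
    remote = [SensorFrame("lead-truck", Location(42.281, -83.739, 90.0), image=b"")]

    objects = detect_obstructing_objects(own_frame)
    overlays = generate_ar_images(ego, objects, remote)
    render_overlays(overlays)

The reprojection of the remote vehicle's imagery into the ego vehicle's field of view is where the substance of the claimed generating step would lie; the stub simply copies the remote frame so the example stays self-contained and runnable.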