US 12,073,359 B2
Real-time bill of materials for a vehicle
Thomas Bachant, San Francisco, CA (US); Nadav Ullman, San Francisco, CA (US); Joseph Thibeault, San Francisco, CA (US); Jake McCloskey, San Francisco, CA (US); Jose Arturo Covarrubias Reynoso, San Francisco, CA (US); and Paul Garcia, San Francisco, CA (US)
Assigned to GM Cruise Holdings LLC, San Francisco, CA (US)
Filed by GM Cruise Holdings LLC, San Francisco, CA (US)
Filed on Dec. 20, 2019, as Appl. No. 16/723,670.
Prior Publication US 2021/0192450 A1, Jun. 24, 2021
Int. Cl. G06Q 10/0875 (2023.01); G06Q 10/20 (2023.01); G07C 5/00 (2006.01); G07C 5/08 (2006.01); G06T 19/00 (2011.01)
CPC G06Q 10/0875 (2013.01) [G06Q 10/20 (2013.01); G07C 5/008 (2013.01); G07C 5/0808 (2013.01); G06T 19/006 (2013.01)] 15 Claims
OG exemplary drawing
 
1. A method for automatically updating a bill of materials for an autonomous vehicle, wherein the autonomous vehicle is a first autonomous vehicle in a fleet of vehicles, comprising:
calibrating a first component in the first autonomous vehicle to generate a first updated extrinsic calibration value for a first variable, a second updated extrinsic calibration value for a second variable, and a third updated extrinsic calibration value for a third variable, wherein the first component is one of a LIDAR component and a RADAR component;
determining that a calibration has occurred in the first autonomous vehicle;
running an automatic component detection process for detecting data about a plurality of components of the first autonomous vehicle, wherein the plurality of components includes the first component, and wherein the automatic component detection process is run in response to determining that the calibration has occurred;
receiving self-reported component data from the first component, including receiving an update to component data, wherein the component data includes the first updated extrinsic calibration value, the second updated extrinsic calibration value, and the third updated extrinsic calibration value;
outputting an updated component configuration including updated component data for the plurality of components in the first autonomous vehicle, wherein the updated component data includes the self-reported component data from the first component;
saving the updated component configuration to the bill of materials;
comparing changes to the updated component configuration with recorded acceptable configuration parameters;
identifying a calibration conflict, including determining that at least one of the first updated extrinsic calibration value, the second updated extrinsic calibration value, and the third updated extrinsic calibration value conflicts with a target calibration value;
determining, at a processor, that the updated component configuration is not acceptable based on the calibration conflict;
flagging the updated component configuration for review based on determining that the updated component configuration is not acceptable;
correlating issues associated with the updated component configuration with features present in a plurality of the vehicles in the fleet;
identifying an unacceptable change in the updated component configuration based in part on the correlation, wherein the unacceptable change includes a first change to the first component including one of the first updated extrinsic calibration value, the second updated extrinsic calibration value, and the third updated extrinsic calibration value;
identifying a level of immediacy of service to address the unacceptable change;
scheduling the first autonomous vehicle for service based on the unacceptable change and the level of immediacy;
routing the first autonomous vehicle to a service center;
receiving, from the first autonomous vehicle, image data from at least one of cameras on the first autonomous vehicle and cameras inside the first autonomous vehicle;
presenting data about the first component in the bill of materials to augmented reality glasses, wherein presenting the data includes presenting the component data and presenting a camera feed based on the at least one of cameras on the first autonomous vehicle and cameras inside the first autonomous vehicle; and
receiving, from the augmented reality glasses, a selection of 3-dimensional coordinates for a precise vehicle location, wherein the presenting of the data about the first component is based on the precise vehicle location.
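
The claimed update flow can be pictured with a minimal sketch, assuming a simplified in-memory bill of materials; every name in it (ExtrinsicCalibration, Component, BillOfMaterials, has_calibration_conflict, update_bom_after_calibration) is a hypothetical illustration and does not appear in the patent.

# Illustrative sketch only (not from the patent specification): a simplified
# in-memory bill of materials and calibration-conflict check that mirrors the
# flow of claim 1. All names here are hypothetical.

from dataclasses import dataclass, field


@dataclass
class ExtrinsicCalibration:
    # Updated extrinsic calibration values for three variables of a component.
    first: float
    second: float
    third: float


@dataclass
class Component:
    name: str                          # e.g., "LIDAR" or "RADAR"
    calibration: ExtrinsicCalibration  # self-reported component data


@dataclass
class BillOfMaterials:
    # Recorded component configuration for one vehicle in the fleet.
    vehicle_id: str
    components: dict = field(default_factory=dict)

    def save_configuration(self, component: Component) -> None:
        # Save the updated component configuration to the bill of materials.
        self.components[component.name] = component


def has_calibration_conflict(cal: ExtrinsicCalibration,
                             target: ExtrinsicCalibration,
                             tolerance: float = 0.05) -> bool:
    # A conflict exists when at least one of the three updated extrinsic
    # calibration values deviates from its target by more than the tolerance.
    pairs = [(cal.first, target.first),
             (cal.second, target.second),
             (cal.third, target.third)]
    return any(abs(value - goal) > tolerance for value, goal in pairs)


def update_bom_after_calibration(bom: BillOfMaterials,
                                 component: Component,
                                 target: ExtrinsicCalibration) -> str:
    # Save the self-reported component data, then compare the updated
    # configuration with the acceptable (target) calibration parameters.
    bom.save_configuration(component)
    if has_calibration_conflict(component.calibration, target):
        # Flag the configuration for review and schedule service; the level of
        # immediacy here is a placeholder policy for the sketch.
        immediacy = "immediate"
        return f"flagged: schedule {bom.vehicle_id} for {immediacy} service"
    return "configuration acceptable"


if __name__ == "__main__":
    lidar = Component("LIDAR", ExtrinsicCalibration(0.10, 0.02, 0.01))
    target = ExtrinsicCalibration(0.00, 0.00, 0.00)
    bom = BillOfMaterials(vehicle_id="AV-001")
    print(update_bom_after_calibration(bom, lidar, target))
    # -> flagged: schedule AV-001 for immediate service

In this sketch, the vehicle is flagged for service only when at least one of the three updated extrinsic calibration values conflicts with its target, which loosely corresponds to the comparing, identifying, and flagging steps recited in the claim.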