US 11,747,164 B2
Methods for multi-dimensional lane matching for autonomous vehicle localization
Meena Nagappan, Richardson, TX (US); Hyukseong Kwon, Thousand Oaks, CA (US); Joshua Lampkins, Gardena, CA (US); and Rajan Bhattacharyya, Thousand Oaks, CA (US)
Assigned to GM Global Technology Operations LLC, Detroit, MI (US)
Filed by GM GLOBAL TECHNOLOGY OPERATIONS LLC, Detroit, MI (US)
Filed on Jan. 13, 2021, as Appl. No. 17/147,745.
Prior Publication US 2022/0221301 A1, Jul. 14, 2022
Int. Cl. G01C 21/00 (2006.01); G05D 1/02 (2020.01); B60W 10/04 (2006.01); B60W 10/10 (2012.01); B60W 10/18 (2012.01); B60W 10/20 (2006.01)
CPC G01C 21/3819 (2020.08) [B60W 10/04 (2013.01); B60W 10/10 (2013.01); B60W 10/18 (2013.01); B60W 10/20 (2013.01); G01C 21/3833 (2020.08); B60W 2552/05 (2020.02); B60W 2552/30 (2020.02); B60W 2552/53 (2020.02); B60W 2556/40 (2020.02); B60W 2556/45 (2020.02); B60W 2710/1005 (2013.01); B60W 2710/18 (2013.01); B60W 2710/20 (2013.01); B60W 2720/106 (2013.01); B60W 2720/125 (2013.01); G05D 1/0212 (2013.01)] 3 Claims
OG exemplary drawing
 
1. A method for controlling a vehicle, comprising:
receiving, by a controller, perception input data from a sensor of the vehicle;
receiving, by the controller, map data including road lane information in a vicinity of the vehicle;
processing, by the controller, the perception input data to extract perceived road lane information including a perceived x position, a perceived y position, a perceived z position, a perceived lane type, a perceived lane color, and a perceived lane curvature;
processing, by the controller, the map data to extract map road lane information including a map x position, a map y position, a map z position, a map lane type, a map lane color, and a map lane curvature;
calculating, by the controller, a transformation matrix from the perceived road lane information, including perceived x position, perceived y position, perceived z position, perceived lane type, perceived lane color, and perceived lane curvature, and the map road lane information, including map x position, map y position, map z position, map lane type, map lane color, and map lane curvature;
generating, by the controller, a fitness score evaluating an effectiveness of the transformation matrix;
updating, by the controller, the map data and a localization of the vehicle based on the transformation matrix; and
generating, by the controller, a control signal to control an actuator of the vehicle;
wherein calculating the transformation matrix comprises determining a distance d between a first sample point from the perception input data and a second sample point from the map data using an equation

d = α‖[x y z]P - [x y z]M‖ + β|cP - cM| + γ|tP - tM| + δ|κP - κM|
where [x y z]P, cP, tP, and κP are the perceived x position, perceived y position, perceived z position, perceived lane color, perceived lane type, and perceived lane curvature for a perception input data sample point;
[x y z]M, cM, tM, and κM are the map x position, map y position, map z position, map lane color, map lane type, and map lane curvature for a map data sample point; and
α, β, γ, and δ are corresponding weight factors.
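The weighted distance d in claim 1 lends itself to a compact illustration. The Python sketch below is not the patented algorithm: the weight values, the integer encodings for lane color and type, the function names, and the use of nearest-neighbor matching with a Kabsch rigid alignment and a residual-based fitness score are all assumptions introduced here for illustration; the claim itself specifies only the form of d, the existence of a transformation matrix, and a fitness score evaluating its effectiveness.

```python
import numpy as np

# Illustrative weight factors (assumed values, not taken from the patent).
ALPHA, BETA, GAMMA, DELTA = 1.0, 0.5, 0.5, 2.0


def lane_point_distance(p, m):
    """Weighted distance d between a perceived sample point p and a map sample
    point m, each given as [x, y, z, color_id, type_id, curvature]."""
    pos = np.linalg.norm(p[:3] - m[:3])    # position term ||[x y z]P - [x y z]M||
    color = abs(p[3] - m[3])               # lane-color mismatch |cP - cM|
    ltype = abs(p[4] - m[4])               # lane-type mismatch |tP - tM|
    curv = abs(p[5] - m[5])                # curvature difference |kP - kM|
    return ALPHA * pos + BETA * color + GAMMA * ltype + DELTA * curv


def match_and_align(perceived, map_pts):
    """Match each perceived point to its nearest map point under d, fit a rigid
    2-D transform (rotation R, translation t) to the matched x-y positions via
    the Kabsch method, and return a residual-based fitness score."""
    perceived = np.asarray(perceived, dtype=float)
    map_pts = np.asarray(map_pts, dtype=float)

    matches = np.array([
        map_pts[int(np.argmin([lane_point_distance(p, m) for m in map_pts]))]
        for p in perceived
    ])

    P = perceived[:, :2]                   # perceived x, y positions
    M = matches[:, :2]                     # matched map x, y positions

    # Closed-form rigid alignment (Kabsch) on the matched 2-D positions.
    Pc, Mc = P - P.mean(axis=0), M - M.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Mc)    # 2x2 cross-covariance
    if np.linalg.det(Vt.T @ U.T) < 0:      # guard against a reflection
        Vt[-1, :] *= -1
    R = Vt.T @ U.T
    t = M.mean(axis=0) - R @ P.mean(axis=0)

    residual = np.linalg.norm(P @ R.T + t - M, axis=1).mean()
    fitness = 1.0 / (1.0 + residual)       # higher means a better alignment
    return R, t, fitness
```

In a full pipeline of the kind the claim describes, R and t would populate the transformation matrix used to update the map data and the vehicle localization, and the fitness score would indicate how well the perceived and map lane samples agree after alignment; this sketch shows only one plausible realization of those steps.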