| CPC G06V 20/58 (2022.01) [B60W 40/105 (2013.01); G06V 20/588 (2022.01); B60W 2420/403 (2013.01); B60W 2554/4049 (2020.02)] | 17 Claims |

1. A method of controlling a vehicle, the method comprising:
receiving, via at least one processor, image data from at least one camera of the vehicle;
detecting, via the at least one processor, a 2D measurement location of a static object in an image plane of the camera using the image data;
receiving, via the at least one processor, an input vector determined in response to a steering angle detected by a steering angle sensor of the vehicle, a wheel speed detected by a wheel speed sensor of the vehicle, and an acceleration measured by an inertial measurement unit of the vehicle;
predicting, via the at least one processor, a predicted 3D location of the static object using an Unscented Kalman Filter (UKF) that incorporates a motion model for the vehicle and further using the 2D measurement location of the static object and the input vector, wherein the predicting of the 3D location of the static object using the UKF includes a prediction step and an update step, wherein the prediction step performs the following:
constructing a prediction sigma point matrix,
propagating the prediction sigma point matrix through the motion model to obtain propagated sigma points,
determining an estimated 3D location of the static object using the propagated sigma points,
estimating a propagated estimation error covariance using the propagated sigma points; and
wherein the update step performs the following:
constructing an update sigma point matrix using the estimated 3D location of the static object and the propagated estimation error covariance,
propagating the update sigma point matrix through a measurement model to obtain a measurement sigma point matrix representing predicted 2D measurement locations of the static object in the image plane of the camera,
determining an updated 3D location of the static object using a disparity between the measurement sigma point matrix and the 2D measurement location of the static object in the image plane of the camera;
controlling, via the at least one processor, at least one vehicle feature based on the predicted 3D location of the static object, wherein the UKF is initialized based on a set of sigma points that are generated using a first detection of the 2D measurement location of the static object in the image plane and a range prior, and
wherein the set of sigma points is used to generate a prior 3D location for an initialization of the UKF, and wherein a disparity between the first detection of the 2D measurement location and a predicted 2D state measurement is used to update the estimated 3D location of the static object such that an updated 3D state and an updated covariance are provided, and wherein a subsequent iteration of the UKF is performed in response to the updated 3D state and the updated covariance.
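The prediction step recited in the claim maps onto the standard unscented transform. The sketch below is a minimal, hedged illustration in Python, not the claimed implementation: the state is assumed to be the static object's 3D position in a camera-aligned frame (z forward, x right, y down), the input vector is assumed to be (steering angle, wheel speed, acceleration), the motion model is assumed to be a kinematic bicycle with a hypothetical `wheelbase` parameter, and the axis and sign conventions are assumptions of this sketch.

```python
import numpy as np

def sigma_points(x, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Construct the 2n+1 scaled UKF sigma points and their mean/covariance weights."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)              # matrix square root of the scaled covariance
    X = np.column_stack([x, x[:, None] + S, x[:, None] - S])
    Wm = np.full(2 * n + 1, 0.5 / (n + lam))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = Wm[0] + (1.0 - alpha**2 + beta)
    return X, Wm, Wc

def motion_model(p, u, dt, wheelbase=2.8):
    """Move a static 3D point, expressed in the camera-aligned frame, by the inverse of the
    ego-motion implied by the input vector u = (steering_angle, wheel_speed, accel).
    Frame and sign conventions here are assumptions of this sketch."""
    delta, v, a = u
    v_avg = v + 0.5 * a * dt                            # blend wheel speed with IMU acceleration
    yaw = v_avg * np.tan(delta) / wheelbase * dt        # kinematic-bicycle yaw increment
    t = np.array([0.0, 0.0, v_avg * dt])                # vehicle translation along the optical axis
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[ c, 0.0, -s],
                  [0.0, 1.0, 0.0],
                  [ s, 0.0,  c]])                       # rotate the scene back by the vehicle's yaw change
    return R @ (p - t)

def ukf_predict(x, P, u, dt, Q):
    """Prediction step: build the prediction sigma point matrix, propagate it through the
    motion model, and recover the estimated 3D location and propagated error covariance."""
    X, Wm, Wc = sigma_points(x, P)
    Xp = np.column_stack([motion_model(X[:, i], u, dt) for i in range(X.shape[1])])
    x_pred = Xp @ Wm                                    # estimated 3D location
    d = Xp - x_pred[:, None]
    P_pred = d @ np.diag(Wc) @ d.T + Q                  # propagated estimation error covariance
    return x_pred, P_pred
```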
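The update step can be sketched the same way, reusing `sigma_points` from the sketch above: an update sigma point matrix is built from the predicted state and covariance, projected through a pinhole measurement model (an assumed intrinsic matrix `K`), and corrected by the disparity (innovation) between the detected and predicted pixel locations. Again, this is a hedged illustration under the same frame assumptions, not the claimed implementation.

```python
import numpy as np

def measurement_model(p, K):
    """Pinhole projection of a 3D camera-frame point onto the image plane (assumed intrinsics K)."""
    x, y, z = p
    return np.array([K[0, 0] * x / z + K[0, 2],
                     K[1, 1] * y / z + K[1, 2]])

def ukf_update(x_pred, P_pred, z_meas, K, R_meas):
    """Update step: project the update sigma point matrix into the image plane and correct
    the 3D state with the disparity between the detected and predicted 2D locations."""
    X, Wm, Wc = sigma_points(x_pred, P_pred)            # update sigma point matrix
    Z = np.column_stack([measurement_model(X[:, i], K) for i in range(X.shape[1])])
    z_hat = Z @ Wm                                      # predicted 2D measurement location
    dZ = Z - z_hat[:, None]
    dX = X - x_pred[:, None]
    S = dZ @ np.diag(Wc) @ dZ.T + R_meas                # innovation covariance (pixel noise R_meas)
    C = dX @ np.diag(Wc) @ dZ.T                         # state/measurement cross covariance
    Kg = C @ np.linalg.inv(S)                           # Kalman gain
    x_upd = x_pred + Kg @ (z_meas - z_hat)              # updated 3D location
    P_upd = P_pred - Kg @ S @ Kg.T                      # updated covariance
    return x_upd, P_upd
```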
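Initialization from the first detection and a range prior can likewise be expressed as an unscented transform, again reusing `sigma_points`: sigma points are spread over the detected pixel coordinates and the prior range, back-projected along the corresponding camera rays, and their weighted mean and covariance serve as the prior 3D location and initial covariance. The noise parameters `pix_sigma` and `range_sigma` are hypothetical placeholders of this sketch.

```python
import numpy as np

def initialize_from_detection(uv, K, range_prior, range_sigma, pix_sigma=2.0):
    """Seed the UKF: generate sigma points from the first 2D detection and the range prior,
    back-project each along its pixel ray, and take the weighted mean/covariance as the
    prior 3D location and initial covariance."""
    m = np.array([uv[0], uv[1], range_prior])
    Pm = np.diag([pix_sigma**2, pix_sigma**2, range_sigma**2])
    S, Wm, Wc = sigma_points(m, Pm)                     # sigma points in (u, v, range) space
    pts = []
    for u, v, r in S.T:
        ray = np.array([(u - K[0, 2]) / K[0, 0],
                        (v - K[1, 2]) / K[1, 1],
                        1.0])
        pts.append(ray / np.linalg.norm(ray) * r)       # place the point at range r along the pixel ray
    P3 = np.column_stack(pts)
    x0 = P3 @ Wm                                        # prior 3D location
    d = P3 - x0[:, None]
    P0 = d @ np.diag(Wc) @ d.T                          # prior covariance
    return x0, P0
```

In use, the filter would be seeded once from the first detection and the range prior, after which `ukf_predict` and `ukf_update` would run once per camera frame with that frame's detection and input vector, the updated 3D state and covariance feeding the subsequent iteration as the final clause of the claim describes.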