US 11,983,899 B2
Stereo vision camera system that tracks and filters calibration parameters
Leaf Alden Jiang, Concord, MA (US); Philip Bradley Rosen, Lincoln, MA (US); and Piotr Swierczynski, Providence, RI (US)
Assigned to NODAR Inc., Somerville, MA (US)
Filed by NODAR Inc., Somerville, MA (US)
Filed on Mar. 14, 2022, as Appl. No. 17/693,634.
Application 17/693,634 is a continuation of application No. 17/365,623, filed on Jul. 1, 2021, granted, now Pat. No. 11,282,234.
Application 17/365,623 is a continuation of application No. PCT/US2021/012294, filed on Jan. 6, 2021.
Claims priority of provisional application 62/964,148, filed on Jan. 22, 2020.
Prior Publication US 2023/0005184 A1, Jan. 5, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. G06T 7/80 (2017.01); B60W 40/02 (2006.01); G06T 7/00 (2017.01); G06T 7/579 (2017.01); G06T 7/593 (2017.01); G06T 7/90 (2017.01); G06V 20/58 (2022.01); H04N 13/239 (2018.01); H04N 13/243 (2018.01); H04N 13/246 (2018.01); H04N 13/254 (2018.01); H04N 13/271 (2018.01); H04N 13/282 (2018.01); H04N 17/00 (2006.01); H04N 13/00 (2018.01)
CPC G06T 7/85 (2017.01) [B60W 40/02 (2013.01); G06T 7/579 (2017.01); G06T 7/593 (2017.01); G06T 7/90 (2017.01); G06T 7/97 (2017.01); G06V 20/582 (2022.01); H04N 13/239 (2018.05); H04N 13/243 (2018.05); H04N 13/246 (2018.05); H04N 13/254 (2018.05); H04N 13/271 (2018.05); H04N 13/282 (2018.05); H04N 17/002 (2013.01); B60W 2420/42 (2013.01); G06T 2207/10012 (2013.01); G06T 2207/10024 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/30252 (2013.01); H04N 2013/0081 (2013.01)] 30 Claims
OG exemplary drawing
 
1. A stereo vision system for a vehicle, comprising:
at least one computer processor in communication with at least two camera sensors, the at least one computer processor being configured to:
receive, in real time or nearly real time, video signals from the at least two camera sensors, each of the at least two camera sensors being configured to sense reflected energy of a first scene and to generate a video signal based on the reflected energy;
produce three-dimensional (3D) data corresponding to the first scene; and
calibrate the at least two camera sensors by:
managing outputs from a plurality of calibration engines to determine a best estimate of calibration parameters for each of the at least two camera sensors, and
automatically updating camera parameters for the at least two camera sensors using the best estimate determined from the outputs of the plurality of calibration engines,
wherein:
the best estimate is determined by tracking a series of estimates of the calibration parameters over time and filtering the series of estimates to determine the best estimate, and
the updating of the camera parameters is performed based on frames of the video signals from the at least two camera sensors, such that:
outputs from a first calibration engine of the plurality of calibration engines occur every first number of frames of the video signals, and
outputs from a second calibration engine of the plurality of calibration engines occur every second number of frames of the video signals, the second number being different from the first number.
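The following is a minimal sketch, in Python, of the calibration-management scheme recited in claim 1; it is not NODAR's actual implementation. The class and function names (CalibrationEngine, CalibrationManager, maybe_estimate), the frame intervals, and the exponential-moving-average filter are illustrative assumptions only: the claim requires that a series of calibration-parameter estimates be tracked over time and filtered into a best estimate, and that two calibration engines produce outputs every different numbers of frames, but it does not prescribe a particular filter or interval.

# Sketch only: names, intervals, and the EMA filter are assumptions,
# not the patented method.
from dataclasses import dataclass, field
from typing import Callable, List, Optional

import numpy as np


@dataclass
class CalibrationEngine:
    """Produces a calibration-parameter estimate every `interval` frames."""
    estimate_fn: Callable[[np.ndarray, np.ndarray], np.ndarray]  # (left, right) -> params
    interval: int

    def maybe_estimate(self, frame_idx: int,
                       left: np.ndarray, right: np.ndarray) -> Optional[np.ndarray]:
        # Output occurs only every `interval`-th frame of the video signals.
        if frame_idx % self.interval == 0:
            return self.estimate_fn(left, right)
        return None


@dataclass
class CalibrationManager:
    """Tracks the series of estimates over time and filters it into a best estimate."""
    engines: List[CalibrationEngine]
    alpha: float = 0.1                                # EMA weight (assumed filter choice)
    best_estimate: Optional[np.ndarray] = None
    history: List[np.ndarray] = field(default_factory=list)

    def update(self, frame_idx: int,
               left: np.ndarray, right: np.ndarray) -> Optional[np.ndarray]:
        for engine in self.engines:
            params = engine.maybe_estimate(frame_idx, left, right)
            if params is None:
                continue
            self.history.append(params)               # track the series of estimates
            if self.best_estimate is None:
                self.best_estimate = params
            else:
                # Filter the series; an exponential moving average is used here
                # purely as an example of such a filter.
                self.best_estimate = (1.0 - self.alpha) * self.best_estimate \
                                     + self.alpha * params
        return self.best_estimate


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical engines: a fast one every 10 frames, a slower one every 100,
    # i.e., the "first number" and "second number" of frames differ.
    fast = CalibrationEngine(lambda l, r: 0.01 * rng.standard_normal(3), interval=10)
    slow = CalibrationEngine(lambda l, r: 0.001 * rng.standard_normal(3), interval=100)
    manager = CalibrationManager(engines=[fast, slow])
    best = None
    for frame_idx in range(300):                      # stand-in for incoming video frames
        left = right = np.zeros((2, 2))               # dummy image data
        best = manager.update(frame_idx, left, right)
        # `best` would be used to automatically update the camera parameters
        # (e.g., rectification extrinsics) for the two sensors.
    print("filtered best estimate:", best)

Running the two engines at different frame cadences, as the claim recites, lets an inexpensive estimator refresh the parameters frequently while a more expensive estimator contributes less often; the filtering step then smooths the combined series so the camera parameters are updated from a stable best estimate rather than from any single noisy output.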