US 11,959,749 B2
Mobile mapping system
Naser El-Sheimy, Calgary (CA); and Amr Al-Hamad, Calgary (CA)
Assigned to Profound Positioning Inc., Calgary (CA)
Filed by Profound Positioning Inc., Calgary (CA)
Filed on Jun. 22, 2015, as Appl. No. 14/746,819.
Claims priority of provisional application 62/014,984, filed on Jun. 20, 2014.
Prior Publication US 2017/0227361 A1, Aug. 10, 2017
Int. Cl. G01C 11/02 (2006.01); G01C 11/12 (2006.01); G01C 11/34 (2006.01); G01C 21/16 (2006.01); G01C 21/20 (2006.01); G01S 19/40 (2010.01); G01S 19/42 (2010.01); G01S 19/48 (2010.01); G06T 7/73 (2017.01)
CPC G01C 21/20 (2013.01) [G01C 11/02 (2013.01); G01C 11/12 (2013.01); G01C 11/34 (2013.01); G01C 21/1654 (2020.08); G01C 21/1656 (2020.08); G01S 19/40 (2013.01); G01S 19/42 (2013.01); G01S 19/485 (2020.05); G06T 7/74 (2017.01); G06T 2207/30184 (2013.01); G06T 2207/30244 (2013.01); G06T 2207/30252 (2013.01)] 14 Claims
OG exemplary drawing
 
1. A method comprising:
capturing a plurality of images using a camera of a hand-holdable mobile communication platform;
determining an initial set of orientation parameters associated with the plurality of captured images in response to one or more orientation sensors on the hand-holdable mobile communication platform;
initially matching images of the plurality of captured images based on matching object points between respective pairs of first and second matched images;
calculating, using a selected pair of matched images consisting of a first matched image having a first object point and a second matched image having a second object point matched to the first object point, a corrected set of orientation parameters based on the initial set of orientation parameters and application of a non-linear least squares estimation algorithm to a cost function, wherein the cost function is a separation distance between the second object point and an epipolar line of the second matched image, and wherein the epipolar line is expressed as a function of a plurality of points in the second matched image obtained using a projection between the matched images in the selected pair that is a function of orientation parameters, by projecting the first object point to the plurality of points in the second matched image; and
estimating a three-dimensional ground coordinate associated with the captured images in response to the corrected set of orientation parameters.
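The correction step of the claim minimizes, over the orientation parameters, the separation distance between a matched object point and its epipolar line in the second image. A minimal sketch of that idea follows, assuming normalized image coordinates, a roll/pitch/yaw parameterization of the relative rotation, a known baseline direction `t`, and `scipy.optimize.least_squares` as the non-linear least-squares solver; the epipolar line for a point in the first image is obtained from the essential matrix E = [t]× R. This is an illustrative reconstruction, not the patent's specific estimator.

```python
import numpy as np
from scipy.optimize import least_squares

def rotation_matrix(angles):
    """Rotation from roll/pitch/yaw in radians, R = Rz @ Ry @ Rx."""
    r, p, y = angles
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(r), -np.sin(r)],
                   [0, np.sin(r),  np.cos(r)]])
    Ry = np.array([[ np.cos(p), 0, np.sin(p)],
                   [0, 1, 0],
                   [-np.sin(p), 0, np.cos(p)]])
    Rz = np.array([[np.cos(y), -np.sin(y), 0],
                   [np.sin(y),  np.cos(y), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

def skew(t):
    """Cross-product matrix [t]_x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0]])

def epipolar_residuals(angles, t, pts1, pts2):
    """Point-to-epipolar-line distances in image 2 (normalized coordinates).

    Each residual is the separation distance between a second-image point
    and the epipolar line induced by its first-image match."""
    E = skew(t) @ rotation_matrix(angles)  # essential matrix
    res = []
    for x1, x2 in zip(pts1, pts2):
        line = E @ np.array([x1[0], x1[1], 1.0])  # line coefficients (a, b, c)
        d = abs(np.dot([x2[0], x2[1], 1.0], line)) / np.hypot(line[0], line[1])
        res.append(d)
    return np.asarray(res)

def refine_orientation(initial_angles, t, pts1, pts2):
    """Correct sensor-derived initial angles by non-linear least squares."""
    sol = least_squares(epipolar_residuals, initial_angles,
                        args=(t, pts1, pts2))
    return sol.x
```

In use, `initial_angles` would come from the platform's orientation sensors, and the refined angles replace them before ground-coordinate estimation; with a fixed baseline and well-distributed matches, the solver converges from a sensor-quality starting point.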
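The final step of the claim estimates a three-dimensional ground coordinate from the corrected orientation parameters. A common way to sketch this is linear (DLT) triangulation of a matched point pair, assuming normalized image coordinates and camera matrices P1 = [I | 0] and P2 = [R | t] built from the corrected relative orientation; the patent does not specify this particular estimator, so the function below is illustrative only.

```python
import numpy as np

def triangulate_dlt(x1, x2, R, t):
    """Linear (DLT) triangulation of one matched point pair.

    x1, x2 are (u, v) normalized image coordinates of the same object
    point in the first and second images; R, t map camera-1 coordinates
    into camera-2 coordinates. Returns the 3-D point in camera-1 frame."""
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([R, t.reshape(3, 1)])
    # Each image measurement contributes two linear constraints A X_h = 0.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector of the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize to a 3-D point
```

A useful design note: the linear solution is typically used as-is for mapping-grade output or as the starting point for a reprojection-error refinement, since DLT minimizes an algebraic rather than geometric error.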