US 12,240,571 B2
Video sensor fusion and model based virtual and augmented reality systems and methods
Mark Johnson, Vannes (FR); Richard Jales, Eastleigh (GB); Gordon Pope, Hung Hom (HK); Christopher Daniel Gatland, Fareham (GB); Paul Stokes, Fleet (GB); Aaron Ridout, Chichester (GB); Chris Jones, Fareham (GB); Jay E. Robinson, Ventura, CA (US); Neil R. Owens, Winchester (GB); and Peter Long, Fareham (GB)
Assigned to FLIR Belgium BVBA, Meer (BE)
Filed by FLIR Belgium BVBA, Meer (BE)
Filed on Mar. 19, 2021, as Appl. No. 17/207,562.
Application 17/207,562 is a continuation of application No. PCT/US2019/052300, filed on Sep. 20, 2019.
Claims priority of provisional application 62/901,140, filed on Sep. 16, 2019.
Claims priority of provisional application 62/897,104, filed on Sep. 6, 2019.
Claims priority of provisional application 62/851,025, filed on May 21, 2019.
Claims priority of provisional application 62/734,156, filed on Sep. 20, 2018.
Prior Publication US 2021/0206459 A1, Jul. 8, 2021
Int. Cl. B63B 49/00 (2006.01); B63B 35/00 (2020.01); B63B 51/00 (2006.01); B63H 25/04 (2006.01); G01C 21/00 (2006.01); G01C 21/20 (2006.01); G01C 21/26 (2006.01); G01C 21/28 (2006.01); G01C 21/32 (2006.01); G01C 21/34 (2006.01); G01C 21/36 (2006.01); G01C 23/00 (2006.01); G01S 15/86 (2020.01); G01S 15/89 (2006.01); G06T 7/73 (2017.01); G06T 19/00 (2011.01)
CPC B63B 49/00 (2013.01) [B63B 51/00 (2013.01); B63H 25/04 (2013.01); G01C 21/00 (2013.01); G01C 21/005 (2013.01); G01C 21/20 (2013.01); G01C 21/203 (2013.01); G01C 21/26 (2013.01); G01C 21/28 (2013.01); G01C 21/32 (2013.01); G01C 21/34 (2013.01); G01C 21/3407 (2013.01); G01C 21/3415 (2013.01); G01C 21/36 (2013.01); G01C 21/3647 (2013.01); G01C 23/00 (2013.01); G01S 15/86 (2020.01); G01S 15/8993 (2013.01); G06T 19/006 (2013.01); B63B 2035/009 (2013.01); B63B 2213/02 (2013.01); G06T 7/73 (2017.01); G06T 2207/10028 (2013.01)] 20 Claims
OG exemplary drawing
 
1. An apparatus comprising:
a logic device configured to communicate with a plurality of navigational sensors, at least one orientation and/or position sensor (OPS), and an imaging module coupled to a mobile structure, wherein each navigational sensor is configured to provide navigational data associated with the mobile structure, and wherein the logic device is configured to:
receive at least one image from the imaging module;
receive orientation, position, and/or navigation data corresponding to the at least one image from the OPS and/or the plurality of navigational sensors;
generate an augmented reality (AR) display view for viewing by a user of the mobile structure, the AR display view comprising the at least one image and an AR graphics overlay based, at least in part, on the received orientation, position, and/or navigation data and/or the received at least one image;
determine a boundary of a cross track error corresponding to a track traversed by the mobile structure; and
render, in the AR display view, a cross track boundary line configured to identify the boundary of the cross track error within the at least one image and a traversal direction associated with the track.
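The claimed determination of a cross track error and its boundary can be illustrated with a minimal sketch. This is not code from the patent: the function names, the planar (flat-earth) approximation, and the fixed boundary width are all illustrative assumptions; a real implementation would work in geodetic coordinates and derive the boundary from route data.

```python
import math

def cross_track_error(start, end, pos):
    """Signed perpendicular distance (same units as inputs) from pos to the
    track line running from start to end. Positive values are to port
    (left of the traversal direction); negative values are to starboard."""
    ax, ay = start
    bx, by = end
    px, py = pos
    tx, ty = bx - ax, by - ay          # track direction vector
    norm = math.hypot(tx, ty)          # track length
    # 2-D cross product of the track vector with the displacement from
    # start to pos, normalized by track length, gives the signed distance.
    return ((py - ay) * tx - (px - ax) * ty) / norm

def outside_boundary(start, end, pos, max_xte):
    """True when the vessel has crossed the cross track boundary, i.e. its
    cross track error exceeds the allowed corridor half-width max_xte."""
    return abs(cross_track_error(start, end, pos)) > max_xte

# Example: track from (0, 0) to (10, 0); a vessel at (5, 3) is 3 units
# to port of the track, outside a corridor of half-width 2.
xte = cross_track_error((0.0, 0.0), (10.0, 0.0), (5.0, 3.0))
print(xte)                                                   # 3.0
print(outside_boundary((0.0, 0.0), (10.0, 0.0), (5.0, 3.0), 2.0))  # True
```

In the AR display view described by the claim, the boundary line at distance `max_xte` from the track would then be projected into the camera image using the received orientation and position data, so the rendered line both marks the corridor edge and, by its placement relative to the track, indicates the traversal direction.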