CPC B63B 49/00 (2013.01) [B63B 51/00 (2013.01); B63H 25/04 (2013.01); G01C 21/00 (2013.01); G01C 21/005 (2013.01); G01C 21/20 (2013.01); G01C 21/203 (2013.01); G01C 21/26 (2013.01); G01C 21/28 (2013.01); G01C 21/32 (2013.01); G01C 21/34 (2013.01); G01C 21/3407 (2013.01); G01C 21/3415 (2013.01); G01C 21/36 (2013.01); G01C 21/3647 (2013.01); G01C 23/00 (2013.01); G01S 15/86 (2020.01); G01S 15/8993 (2013.01); G06T 19/006 (2013.01); B63B 2035/009 (2013.01); B63B 2213/02 (2013.01); G06T 7/73 (2017.01); G06T 2207/10028 (2013.01)]

20 Claims

1. An apparatus comprising:
a logic device configured to communicate with a plurality of navigational sensors, at least one orientation and/or position sensor (OPS), and an imaging module coupled to a mobile structure, wherein each navigational sensor is configured to provide navigational data associated with the mobile structure, and wherein the logic device is configured to:
receive at least one image from the imaging module;
receive orientation, position, and/or navigation data corresponding to the at least one image from the OPS and/or the plurality of navigational sensors;
generate an augmented reality (AR) display view for viewing by a user of the mobile structure, the AR display view comprising the at least one image and an AR graphics overlay based, at least in part, on the received orientation, position, and/or navigation data and/or the received at least one image;
determine a boundary of a cross track error corresponding to a track traversed by the mobile structure; and
render, in the AR display view, a cross track boundary line configured to identify the boundary of the cross track error within the at least one image and a traversal direction associated with the track.
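
For illustration, the cross track error recited in claim 1 is conventionally the perpendicular offset of the mobile structure from the great-circle track between two waypoints, and the boundary can be treated as a configured corridor half-width. The sketch below assumes positions supplied as (latitude, longitude) in radians; the function names and the sign convention are assumptions for illustration, not drawn from the specification.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in meters

def bearing_rad(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2 (radians)."""
    dlon = lon2 - lon1
    y = math.sin(dlon) * math.cos(lat2)
    x = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.atan2(y, x)

def angular_distance_rad(lat1, lon1, lat2, lon2):
    """Central angle between two points (radians), haversine form."""
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * math.asin(math.sqrt(a))

def cross_track_error_m(track_start, track_end, position):
    """Signed cross track distance (meters) of `position` from the
    great-circle track from `track_start` to `track_end`.  With this
    convention, positive values lie to starboard of the traversal
    direction.  All inputs are (lat, lon) tuples in radians."""
    d13 = angular_distance_rad(*track_start, *position)
    b13 = bearing_rad(*track_start, *position)
    b12 = bearing_rad(*track_start, *track_end)
    return math.asin(math.sin(d13) * math.sin(b13 - b12)) * EARTH_RADIUS_M

def exceeds_cross_track_boundary(xte_m, boundary_m):
    """True when the magnitude of the cross track error passes the
    configured boundary distance (e.g. a safe-corridor half-width)."""
    return abs(xte_m) > boundary_m
```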
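Likewise, rendering the cross track boundary line within the at least one image implies mapping a world-space line into pixel coordinates using the received orientation and position data together with the imaging module's intrinsic parameters. A minimal pinhole-projection sketch follows, assuming a camera frame with x right, y down, z forward and boundary points expressed in a local metric frame; these conventions and the helper names are assumptions for illustration only, not the patented method.

```python
import numpy as np

def project_to_image(p_world, cam_pos, R_world_to_cam, fx, fy, cx, cy):
    """Project a 3-D world point into pixel coordinates with a pinhole
    model.  `R_world_to_cam` rotates world axes into an assumed camera
    frame (x right, y down, z forward); returns None when the point is
    behind the image plane."""
    p_cam = R_world_to_cam @ (np.asarray(p_world, float) - np.asarray(cam_pos, float))
    x, y, z = p_cam
    if z <= 1e-6:  # behind or on the image plane
        return None
    return fx * x / z + cx, fy * y / z + cy

def boundary_polyline_pixels(boundary_points_world, cam_pos, R_world_to_cam,
                             fx, fy, cx, cy):
    """Map sampled points along the cross track boundary (world frame,
    e.g. local ENU meters) into image pixels so an overlay renderer can
    draw the boundary line; points behind the camera are dropped."""
    pixels = []
    for p in boundary_points_world:
        uv = project_to_image(p, cam_pos, R_world_to_cam, fx, fy, cx, cy)
        if uv is not None:
            pixels.append(uv)
    return pixels
```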