US 12,112,546 B2
Vision guidance system using dynamic edge detection
Aaron G. Vesperman, Ankeny, IA (US); Jeffery J. Adams, Ankeny, IA (US); and Anthony J. Mainelli, Grimes, IA (US)
Assigned to DEERE & COMPANY, Moline, IL (US)
Filed by Deere & Company, Moline, IL (US)
Filed on Apr. 30, 2021, as Appl. No. 17/245,522.
Prior Publication US 2022/0350991 A1, Nov. 3, 2022
Int. Cl. G06V 20/56 (2022.01); A01B 69/04 (2006.01); B62D 15/02 (2006.01); G05D 1/00 (2024.01); G06T 7/40 (2017.01); G06V 10/22 (2022.01); G06V 10/44 (2022.01); G06V 20/10 (2022.01)
CPC G06V 20/56 (2022.01) [A01B 69/008 (2013.01); B62D 15/025 (2013.01); G05D 1/0038 (2013.01); G06T 7/40 (2013.01); G06V 10/225 (2022.01); G06V 10/44 (2022.01); G06V 20/188 (2022.01); G06T 2207/30252 (2013.01)] 21 Claims
OG exemplary drawing
 
1. A method comprising:
accessing a set of images captured by a vehicle while navigating via autonomous steering through an area of different surface types, the set of images comprising images of a ground surface in front of the vehicle;
receiving, from a remote operator, an input representative of a location within the set of images displayed to the remote operator;
identifying a set of candidate edges within an image portion corresponding to the location within the set of images, each candidate edge corresponding to a candidate boundary between two different surface types;
determining, for each of the set of candidate edges, a distance between the candidate edge and the location within the set of images represented by the input received from the remote operator;
applying an edge selection model to the set of candidate edges, the edge selection model configured to select an edge of the set of candidate edges based at least in part on the determined distance for each candidate edge; and
modifying a route being navigated by the vehicle based at least in part on the selected candidate edge.
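The distance-based edge selection recited in claim 1 can be illustrated with a short sketch. The names below (CandidateEdge, distance_to_edge, select_edge, and the distance and strength weights) are illustrative assumptions rather than terms from the patent, and the simple weighted score stands in for whatever edge selection model the claim contemplates, which could equally be a learned model.

```python
# Minimal sketch of distance-based candidate edge selection (cf. claim 1).
# All names and weights are illustrative assumptions, not from the patent.
from dataclasses import dataclass
from typing import List, Tuple

import numpy as np


@dataclass
class CandidateEdge:
    """A candidate boundary between two surface types, as a pixel polyline."""
    points: np.ndarray   # shape (N, 2): (x, y) image coordinates
    strength: float      # e.g. mean gradient magnitude along the edge


def distance_to_edge(edge: CandidateEdge, click: Tuple[float, float]) -> float:
    """Minimum Euclidean distance from the operator's input location to the edge."""
    deltas = edge.points - np.asarray(click, dtype=float)
    return float(np.min(np.linalg.norm(deltas, axis=1)))


def select_edge(candidates: List[CandidateEdge],
                click: Tuple[float, float],
                distance_weight: float = 1.0,
                strength_weight: float = 0.1) -> CandidateEdge:
    """Toy 'edge selection model': prefer candidate edges close to the
    operator's input, with a small bonus for stronger edges."""
    def score(edge: CandidateEdge) -> float:
        return distance_weight * distance_to_edge(edge, click) \
               - strength_weight * edge.strength

    return min(candidates, key=score)
```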
 
13. A system comprising a hardware processor and a non-transitory computer-readable storage medium storing executable instructions that, when executed by the processor, are configured to cause the system to perform steps comprising:
accessing a set of images captured by a vehicle while navigating via autonomous steering through an area of different surface types, the set of images comprising images of a ground surface in front of the vehicle;
receiving, from a remote operator, an input representative of a location within the set of images displayed to the remote operator;
identifying a set of candidate edges within an image portion corresponding to the location within the set of images, each candidate edge corresponding to a candidate boundary between two different surface types;
determining, for each of the set of candidate edges, a distance between the candidate edge and the location within the set of images represented by the input received from the remote operator;
applying an edge selection model to the set of candidate edges, the edge selection model configured to select an edge of the set of candidate edges based at least in part on the determined distance for each candidate edge; and
modifying a route being navigated by the vehicle based at least in part on the selected candidate edge.
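For the system of claim 13, the step of identifying candidate edges within the image portion corresponding to the operator's input might look roughly like the following. OpenCV's Canny detector and contour extraction stand in for whatever edge detection the patent contemplates; the window size, thresholds, and minimum contour length are assumptions.

```python
# Illustrative sketch of identifying candidate edges near the operator's
# input location (cf. claim 13). Detector choice and parameters are assumed.
from typing import List, Tuple

import cv2
import numpy as np


def candidate_edges_near_click(image: np.ndarray,
                               click: Tuple[int, int],
                               window: int = 200) -> List[np.ndarray]:
    """Return candidate edge polylines (pixel contours) found in a square
    window centred on the operator's input location."""
    x, y = click
    h, w = image.shape[:2]
    x0, x1 = max(0, x - window // 2), min(w, x + window // 2)
    y0, y1 = max(0, y - window // 2), min(h, y + window // 2)

    patch = cv2.cvtColor(image[y0:y1, x0:x1], cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(patch, 50, 150)

    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    # Shift contour coordinates back into full-image space; drop tiny fragments.
    return [c.reshape(-1, 2) + np.array([x0, y0]) for c in contours if len(c) > 20]
```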
 
21. A method comprising:
accessing a set of images captured by a vehicle while navigating via autonomous steering through an area of different surface types, the set of images comprising images of a ground surface in front of the vehicle;
identifying a set of candidate edges within the set of images, each candidate edge corresponding to a candidate boundary between two different surface types;
displaying, to a remote operator via a device of the remote operator, the set of images overlaid with the identified set of candidate edges;
receiving, from the remote operator, a selection of a candidate edge from the set of candidate edges displayed to the remote operator overlaid on the set of images; and
modifying a route being navigated by the vehicle based at least in part on the selected candidate edge.
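Claim 21 instead has the remote operator choose among candidate edges overlaid on the camera images. A minimal rendering sketch is shown below, assuming OpenCV for drawing; the colour palette and line width are assumptions, and how the operator's selection is returned (e.g. by tapping a drawn edge) is left to the display device.

```python
# Rough sketch of the operator-facing overlay (cf. claim 21): each candidate
# edge is drawn in a distinct colour so the remote operator can pick one.
from typing import List

import cv2
import numpy as np


def overlay_candidate_edges(image: np.ndarray,
                            candidate_edges: List[np.ndarray]) -> np.ndarray:
    """Return a copy of the image with each candidate edge polyline drawn
    in a distinct colour for display to the remote operator."""
    display = image.copy()
    palette = [(0, 255, 0), (0, 0, 255), (255, 0, 0), (0, 255, 255)]
    for i, edge in enumerate(candidate_edges):
        pts = edge.reshape(-1, 1, 2).astype(np.int32)
        cv2.polylines(display, [pts], isClosed=False,
                      color=palette[i % len(palette)], thickness=3)
    return display
```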