US 11,751,558 B2
Autonomous agricultural treatment system using map based targeting of agricultural objects
Gabriel Thurston Sibley, Portland, OR (US); Lorenzo Ibarria, Dublin, CA (US); Curtis Dale Garner, Modesto, CA (US); Patrick Christopher Leger, Belmont, CA (US); Andre Robert Daniel Michelin, Topanga, CA (US); John Phillip Hurliman, II, Oakland, CA (US); Wisit Jirattigalochote, Palo Alto, CA (US); and Hasan Tafish, Foster City, CA (US)
Assigned to Verdant Robotics, Inc., Hayward, CA (US)
Filed by Verdant Robotics, Inc., Hayward, CA (US)
Filed on Aug. 2, 2021, as Appl. No. 17/392,202.
Application 17/392,202 is a continuation of application No. 17/073,236, filed on Oct. 16, 2020, granted, now 11,076,589.
Prior Publication US 2022/0117213 A1, Apr. 21, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. G06V 20/10 (2022.01); A01M 7/00 (2006.01); A01G 7/06 (2006.01); A01C 21/00 (2006.01); A01C 23/00 (2006.01); A01M 9/00 (2006.01); G05B 19/4155 (2006.01); G06V 10/764 (2022.01); G06F 18/243 (2023.01); A01C 23/04 (2006.01); A01M 21/04 (2006.01); A01C 23/02 (2006.01)
CPC A01M 7/0089 (2013.01) [A01C 21/007 (2013.01); A01C 23/008 (2013.01); A01G 7/06 (2013.01); A01M 7/0014 (2013.01); A01M 7/0085 (2013.01); A01M 7/0092 (2013.01); A01M 9/0084 (2013.01); A01M 9/0092 (2013.01); G05B 19/4155 (2013.01); G06F 18/243 (2023.01); G06V 10/764 (2022.01); G06V 20/10 (2022.01); G06V 20/188 (2022.01); A01C 23/007 (2013.01); A01C 23/023 (2013.01); A01C 23/047 (2013.01); A01M 7/0042 (2013.01); A01M 21/043 (2013.01); G05B 2219/40585 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method performed by an agricultural treatment system comprising one or more processors comprising hardware, one or more sensors, and a treatment unit, the one or more processors configured to perform operations comprising:
receiving sensor data of a plurality of pre-identified agricultural objects, including each pre-identified agricultural object's real-world location;
determining a first real-world location of the agricultural treatment system;
receiving captured sensor data depicting agricultural objects;
selecting one or more pre-identified agricultural objects, wherein the real-world locations of the selected pre-identified agricultural objects are proximate to the first real-world location;
comparing at least a portion of the selected pre-identified agricultural objects with the captured sensor data; and
identifying a target object from the comparing of at least one selected pre-identified agricultural object with at least a portion of the captured sensor data.
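The steps of claim 1 can be illustrated with a minimal sketch. This is not the patented implementation; the class name `AgObject`, the planar (x, y) coordinate frame, the proximity radius, and the match tolerance are all hypothetical choices made for illustration, and "captured sensor data" is reduced here to estimated (x, y) detections:

```python
from dataclasses import dataclass
import math

@dataclass
class AgObject:
    """A pre-identified agricultural object with its real-world location
    (hypothetical planar coordinates in meters)."""
    obj_id: str
    x: float
    y: float

def select_proximate(pre_identified, system_xy, radius_m):
    """Select pre-identified objects whose real-world locations are
    proximate (within radius_m) to the treatment system's location."""
    sx, sy = system_xy
    return [o for o in pre_identified
            if math.hypot(o.x - sx, o.y - sy) <= radius_m]

def identify_target(selected, detections, match_tol_m=0.5):
    """Compare selected pre-identified objects against captured detections
    (here, (x, y) positions estimated from sensor data) and return the
    first pre-identified object that matches a detection."""
    for obj in selected:
        for dx, dy in detections:
            if math.hypot(obj.x - dx, obj.y - dy) <= match_tol_m:
                return obj
    return None

# Example: the system is at (9, 5); only the nearby object is selected,
# then matched against a captured detection to yield the target.
pre = [AgObject("weed-1", 10.0, 5.0), AgObject("weed-2", 50.0, 5.0)]
near = select_proximate(pre, (9.0, 5.0), radius_m=5.0)
target = identify_target(near, [(10.2, 5.1)])
```

The proximity filter keeps the comparison step tractable: only map objects near the system's current pose are matched against live sensor data, rather than the entire field map.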