US 11,980,116 B1
System and a method for automation of agricultural treatments
Josie Lynn Suter, Urbana, IL (US); Michael James Hansen, Champaign, IL (US); Girish Vinayak Chowdhary, Champaign, IL (US); and Chinmay Prakash Soman, Urbana, IL (US)
Assigned to EARTHSENSE INC., Champaign, IL (US)
Filed by EARTHSENSE INC., Champaign, IL (US)
Filed on Jan. 12, 2023, as Appl. No. 18/096,287.
Int. Cl. A01B 79/02 (2006.01); A01B 69/00 (2006.01); A01B 69/04 (2006.01); A01B 79/00 (2006.01); A01C 21/00 (2006.01); A01N 25/02 (2006.01); G05D 1/00 (2006.01); G06V 10/82 (2022.01); G06V 20/56 (2022.01)
CPC A01B 79/005 (2013.01) [A01B 69/008 (2013.01); A01B 79/02 (2013.01); A01C 21/002 (2013.01); A01N 25/02 (2013.01); G05D 1/0223 (2013.01); G05D 1/0246 (2013.01); G06V 10/82 (2022.01); G06V 20/56 (2022.01); G05D 2201/0201 (2013.01)] 18 Claims
OG exemplary drawing
 
1. A method for automation of an agricultural treatment, the method comprising:
receiving, by a processor, a set of instructions for the agricultural treatment, wherein the set of instructions comprises a type of agricultural treatment and a target location, and wherein the target location is at least one of a part of an agricultural field and one or more trees in the agricultural field;
determining, by the processor, chemical parameters comprising a composition, a dosage, and a quantity of a chemical required for the agricultural treatment based on the set of instructions using a machine learning algorithm;
navigating, by the processor, a robot through a route to the target location by processing a set of images obtained using at least one of a vision camera and a depth camera, wherein the route is determined based on a coefficient of traversal which is greater than a pre-defined threshold and at least one navigation algorithm, and wherein the coefficient of traversal indicates traversable terrain for the robot;
detecting, by the processor, a spraying section from the target location based on the type of agricultural treatment using a set of sensors installed on the robot, wherein the spraying section is detected using an image processing model, wherein the image processing model is trained using a training dataset comprising different types of agricultural treatments, a plurality of images of the agricultural field, and a plurality of spraying sections corresponding to the different types of agricultural treatments annotated on the plurality of images, and wherein the spraying section is a part of the target location;
determining, by the processor,
a speed of the robot, a proximity of the robot to the spraying section, and a rate of chemical flow based on the type of agricultural treatment, the dosage, and the spraying section using a robot control model,
an area of the detected spraying section from the set of images by using an image processing algorithm and a geometric algorithm, and
a quantity of chemical required to be sprayed on the area based on a product of the dosage and the area of the spraying section;
controlling, by the processor, the robot to match the determined speed of the robot and the determined proximity of the robot using a navigation algorithm; and
dispensing, by the processor, the chemical using spraying equipment.
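
As an illustration of the route criterion in the navigating step, the following Python sketch selects a candidate route whose coefficient of traversal exceeds a pre-defined threshold. It is a minimal example: the CandidateRoute structure, the 0.7 threshold value, and the shortest-route tie-break are assumptions made for the sketch, not details taken from the claim.

```python
# Illustrative sketch only; "CandidateRoute", the threshold value, and the
# shortest-route tie-break are hypothetical, not taken from the patent.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class CandidateRoute:
    waypoints: List[Tuple[float, float]]  # (x, y) positions toward the target location
    coefficient_of_traversal: float       # 0.0 (impassable) .. 1.0 (fully traversable)

def select_route(candidates: List[CandidateRoute],
                 threshold: float = 0.7) -> Optional[CandidateRoute]:
    """Return a candidate route whose coefficient of traversal exceeds the
    pre-defined threshold, mirroring the claim's route criterion."""
    feasible = [r for r in candidates if r.coefficient_of_traversal > threshold]
    if not feasible:
        return None  # no traversable terrain found; the planner would re-plan
    return min(feasible, key=lambda r: len(r.waypoints))  # prefer the shortest route
```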
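
The detecting step and the area determination could be sketched as below, assuming a trained segmentation model exposing a predict interface and a roughly constant ground sample distance per pixel; both assumptions are illustrative and are not specified by the claim.

```python
# Illustrative sketch; the model interface, the mask threshold, and the
# ground-sample-distance assumption are hypothetical, not the patent's implementation.
import numpy as np

def detect_spraying_section(image: np.ndarray, model) -> np.ndarray:
    """Run a trained image processing model over a camera frame and return a
    binary mask of pixels belonging to the spraying section."""
    scores = model.predict(image)            # hypothetical model interface
    return (scores > 0.5).astype(np.uint8)   # assumed score threshold

def section_area_m2(mask: np.ndarray, ground_sample_distance_m: float) -> float:
    """Geometric step: convert the masked pixel count into ground area,
    assuming each pixel covers a constant square patch of ground."""
    pixel_area_m2 = ground_sample_distance_m ** 2
    return float(mask.sum()) * pixel_area_m2
```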
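
The claim defines the required quantity of chemical as the product of the dosage and the area of the spraying section. The flow-rate helper below additionally assumes that a robot control model spreads that quantity over the time the sprayer takes to pass the section; that relation is one plausible reading for illustration, not language from the claim.

```python
# Quantity follows the claimed product of dosage and area; the flow-rate
# relation and all variable names are assumptions made for this example.
def required_quantity_l(dosage_l_per_m2: float, area_m2: float) -> float:
    """Quantity of chemical to spray: dosage multiplied by the section area."""
    return dosage_l_per_m2 * area_m2

def flow_rate_l_per_s(quantity_l: float, section_length_m: float,
                      robot_speed_m_per_s: float) -> float:
    """One way a robot control model could set the rate of chemical flow:
    dispense the required quantity over the time spent passing the section."""
    time_over_section_s = section_length_m / robot_speed_m_per_s
    return quantity_l / time_over_section_s
```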