US 12,283,053 B2
System and method for determining damage on crops
Aranzazu Bereciartua-Perez, Derio (ES); Artzai Picon Ruiz, Derio (ES); Javier Romero Rodriguez, Utrera (ES); Juan Manuel Contreras Gallardo, Utrera (ES); Rainer Oberst, Limburgerhof (DE); Hikal Khairy Shohdy Gad, Limburgerhof (DE); Gerd Kraemer, Limburgerhof (DE); Christian Klukas, Limburgerhof (DE); Till Eggers, Ludwigshafen am Rhein (DE); Jone Echazarra Huguet, Derio (ES); and Ramon Navarra-Mestre, Limburgerhof (DE)
Assigned to BASF SE, Ludwigshafen am Rhein (DE)
Appl. No. 17/779,819
Filed by BASF SE, Ludwigshafen am Rhein (DE)
PCT Filed Nov. 24, 2020, PCT No. PCT/EP2020/083199
§ 371(c)(1), (2) Date May 25, 2022,
PCT Pub. No. WO2021/110476, PCT Pub. Date Jun. 10, 2021.
Claims priority of application No. 19213250 (EP), filed on Dec. 3, 2019.
Prior Publication US 2023/0017425 A1, Jan. 19, 2023
Int. Cl. G06T 7/00 (2017.01); G06T 3/40 (2024.01); G06T 3/4046 (2024.01); G06T 7/11 (2017.01); G06V 10/44 (2022.01); G06V 10/764 (2022.01); G06V 10/774 (2022.01); G06V 10/776 (2022.01); G06V 10/82 (2022.01); G06V 20/10 (2022.01); G06V 20/70 (2022.01)
CPC G06T 7/11 (2017.01) [G06T 3/4046 (2013.01); G06T 7/0002 (2013.01); G06V 10/454 (2022.01); G06V 10/764 (2022.01); G06V 10/774 (2022.01); G06V 10/776 (2022.01); G06V 10/82 (2022.01); G06V 20/188 (2022.01); G06V 20/70 (2022.01); G06T 2207/10024 (2013.01); G06T 2207/20021 (2013.01); G06T 2207/20081 (2013.01); G06T 2207/20084 (2013.01); G06T 2207/30188 (2013.01)] 15 Claims
OG exemplary drawing
 
1. A computer-implemented method for determining damage on crop plants after herbicide application in an agricultural field, comprising:
receiving an image representing a real world situation in the agricultural field after herbicide application, with at least one crop plant;
rescaling the received image to a rescaled image matching the size of an input layer of a first convolutional neural network (CNN1) referred to as the first CNN,
the first CNN (CNN1) being trained to segment the rescaled image into crop and non-crop portions by using color transformation processes in a data augmentation stage allowing the first CNN to learn to distinguish between soil related pixels and necrotic crop related pixels, and to provide a first segmented output as a mask identifying the crop portions in the rescaled image including necrotic parts of the crop plant;
applying the first CNN (CNN1) to the rescaled image to provide, to a second convolutional neural network (CNN2) referred to as the second CNN, the first segmented output,
the second CNN (CNN2) being a semantic segmentation neural network trained to segment said crop portions into one or more sub-portions with each sub-portion including pixels associated with damaged parts of the crop plant showing a respective damage type being a particular damage type of a plurality of damage types comprising necrosis and at least one further damage type;
applying the second CNN (CNN2) to the crop portions of the rescaled image to identify, in a second segmented output, damaged parts of the at least one crop plant by damage type for the plurality of damage types; and
determining a damage measure for the at least one crop plant for each damage type based on the respective sub-portions of the second segmented output in relation to the crop portion of the first segmented output.
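The claim above recites an image-processing pipeline: rescale the field image to the network input size, segment crop from non-crop with a first CNN, segment the crop area into damage-type sub-portions with a second CNN, and report each damage type as the share of the crop area it covers. The following Python/NumPy sketch illustrates that flow only; the 512x512 input size, the two damage labels, and the stand-in callables cnn1_segment_crop and cnn2_segment_damage are assumptions for illustration and are not taken from the patent or its trained models.

    # Minimal sketch of the claimed two-stage pipeline. The model stand-ins,
    # input size, and damage labels are illustrative assumptions, not the
    # patented implementation.
    import numpy as np

    INPUT_SIZE = (512, 512)                            # assumed CNN input-layer size
    DAMAGE_TYPES = {1: "necrosis", 2: "other_damage"}  # assumed labels; 0 = undamaged crop

    def rescale(image: np.ndarray, size=INPUT_SIZE) -> np.ndarray:
        """Nearest-neighbour rescale of the received 8-bit RGB field image
        to match the CNN input layer."""
        h, w = image.shape[:2]
        rows = np.arange(size[0]) * h // size[0]
        cols = np.arange(size[1]) * w // size[1]
        return image[rows][:, cols].astype(np.float32) / 255.0

    def cnn1_segment_crop(rescaled: np.ndarray) -> np.ndarray:
        """Hypothetical stand-in for CNN1: boolean mask, True on crop pixels
        (including necrotic crop), False on soil and other non-crop pixels."""
        # Placeholder so the sketch runs without trained weights; a real
        # system would apply the trained crop/non-crop segmentation network.
        return rescaled.mean(axis=-1) > 0.1

    def cnn2_segment_damage(rescaled: np.ndarray, crop_mask: np.ndarray) -> np.ndarray:
        """Hypothetical stand-in for CNN2: per-pixel damage class over the crop
        portion (0 = undamaged, keys of DAMAGE_TYPES = damage classes)."""
        labels = np.zeros(crop_mask.shape, dtype=np.uint8)
        # Placeholder: a real system would apply the trained semantic
        # segmentation network to the crop portion of the rescaled image.
        return labels

    def damage_measures(image: np.ndarray) -> dict:
        rescaled = rescale(image)
        crop_mask = cnn1_segment_crop(rescaled)            # first segmented output
        labels = cnn2_segment_damage(rescaled, crop_mask)  # second segmented output
        crop_pixels = max(int(crop_mask.sum()), 1)
        # Damage measure per type: damaged sub-portion relative to the crop portion.
        return {name: float(((labels == cls) & crop_mask).sum()) / crop_pixels
                for cls, name in DAMAGE_TYPES.items()}

In this sketch the denominator is the crop area from the first segmented output, so each returned value is the per-type damaged fraction of the plant, mirroring the claim's determination of the sub-portions "in relation to the crop portion of the first segmented output".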