US 12,073,327 B2
Quantifying objects on plants by estimating the number of objects on plant parts such as leaves, by convolutional neural networks that provide density maps
Aitor Alvarez Gila, Derio (ES); Amaia Maria Ortiz Barredo, Vitoria-Gasteiz (ES); David Roldan Lopez, Dos Hermanas (ES); Javier Romero Rodriguez, Utrera (ES); Corinna Maria Spangler, Ludwigshafen (DE); Christian Klukas, Limburgerhof (DE); Till Eggers, Ludwigshafen (DE); Jone Echazarra Huguet, Derio (ES); Ramon Navarra Mestre, Limburgerhof (DE); Artzai Picon Ruiz, Derio (ES); and Aranzazu Bereciartua Perez, Derio (ES)
Assigned to BASF SE, Ludwigshafen am Rhein (DE)
Appl. No. 17/761,849
Filed by BASF SE, Ludwigshafen am Rhein (DE)
PCT Filed Sep. 29, 2020, PCT No. PCT/EP2020/077197
§ 371(c)(1), (2) Date Mar. 18, 2022,
PCT Pub. No. WO2021/063929, PCT Pub. Date Apr. 8, 2021.
Claims priority of application No. 19200657 (EP), filed on Sep. 30, 2019.
Prior Publication US 2023/0351743 A1, Nov. 2, 2023
Int. Cl. G06T 7/00 (2017.01); G06N 3/082 (2023.01); G06T 7/11 (2017.01); G06V 10/44 (2022.01); G06V 10/56 (2022.01); G06V 10/762 (2022.01); G06V 10/764 (2022.01); G06V 10/82 (2022.01); G06V 20/10 (2022.01)
CPC G06N 3/082 (2013.01) [G06T 7/0012 (2013.01); G06T 7/11 (2017.01); G06V 10/454 (2022.01); G06V 10/56 (2022.01); G06V 10/762 (2022.01); G06V 10/764 (2022.01); G06V 10/82 (2022.01); G06V 20/10 (2022.01); G06V 20/188 (2022.01); G06T 2207/20081 (2013.01); G06T 2207/20084 (2013.01)] 15 Claims
OG exemplary drawing
 
1. A computer-implemented method for quantifying biological objects (132) on parts of plants, by estimating the number (NEST) of the objects (132) on parts (122) of a plant (112), the method comprising:
receiving a plant-image (412) taken from a particular plant (112), the plant-image (412) showing at least one of the parts (122) of the particular plant (112);
using a first convolutional neural network (262) to process the plant-image (412) to derive a leaf-image (422) being a contiguous set of pixels that show a part (422-1) of the particular plant (112) completely, the first convolutional neural network (262) having been trained by a plurality of part-annotated plant-images (461), wherein the plant-images (411) are annotated to identify parts (421-1);
splitting the leaf-image (422) into a plurality of tiles (402-k), the tiles being segments of the plant-image (412) having pre-defined tile dimensions;
using a second convolutional neural network (272) to separately process the plurality of tiles (402) to obtain a plurality of density maps (502-k) having map dimensions that correspond to the tile dimensions, the second convolutional neural network (272) having been trained by processing object-annotated plant-images (471), the processing comprising the calculation of convolutions for each pixel based on a kernel function leading to density maps (502) with different integral values for tiles showing biological objects and tiles not showing biological objects; and
combining the plurality of density maps (502) into a combined density map (555) with the dimensions of the leaf-image (422), and integrating the pixel values of the combined density map (555) to obtain an estimated number of biological objects (NEST) for the main leaf.
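
Claim 1 describes a four-stage pipeline: segment the plant part, split it into tiles, estimate a density map per tile, then recombine and integrate. The sketches that follow illustrate each stage under stated assumptions; none of them reproduces the patented networks or their training data. The first sketch stands in for the "first CNN" (262): a minimal, hypothetical PyTorch encoder–decoder predicts a per-pixel leaf mask, and the leaf-image is taken as the crop around the predicted foreground pixels. The names LeafSegmenter and crop_leaf_image are illustrative, not from the patent.

```python
# Hypothetical sketch of the "first CNN" stage: per-pixel leaf segmentation
# followed by cropping the leaf-image from the plant-image.
# The real network (262) is trained on part-annotated plant-images (461),
# which is not reproduced here; this model is an untrained stand-in.
import torch
import torch.nn as nn

class LeafSegmenter(nn.Module):
    """Toy fully convolutional encoder-decoder that outputs a 1-channel leaf mask."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),            # logits for "leaf" vs "background"
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def crop_leaf_image(plant_image: torch.Tensor, mask_logits: torch.Tensor,
                    threshold: float = 0.5) -> torch.Tensor:
    """Crop the bounding box of the predicted leaf pixels (the 'leaf-image')."""
    mask = torch.sigmoid(mask_logits)[0, 0] > threshold   # H x W boolean mask
    ys, xs = torch.nonzero(mask, as_tuple=True)
    if ys.numel() == 0:
        return plant_image                                 # no leaf found: keep full image
    y0, y1 = ys.min().item(), ys.max().item() + 1
    x0, x1 = xs.min().item(), xs.max().item() + 1
    return plant_image[:, :, y0:y1, x0:x1]

# Usage (random data stands in for a real plant-image, NCHW layout):
plant_image = torch.rand(1, 3, 256, 256)
segmenter = LeafSegmenter()
leaf_image = crop_leaf_image(plant_image, segmenter(plant_image))
```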
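The second stage splits the leaf-image into tiles of pre-defined dimensions. A minimal NumPy sketch, assuming zero-padding at the right and bottom borders so every tile has the same shape; the patent does not prescribe a padding strategy, and split_into_tiles is an illustrative name.

```python
# Hypothetical tiling step: cut a leaf-image (H x W x C array) into
# fixed-size tiles, zero-padding the borders so all tiles share one shape.
import numpy as np

def split_into_tiles(leaf_image: np.ndarray, tile_h: int, tile_w: int):
    """Return a list of (row, col, tile) triples covering the leaf-image."""
    h, w = leaf_image.shape[:2]
    pad_h = (-h) % tile_h                      # padding needed to reach a multiple of tile_h
    pad_w = (-w) % tile_w
    padded = np.pad(leaf_image, ((0, pad_h), (0, pad_w), (0, 0)))
    tiles = []
    for row in range(0, padded.shape[0], tile_h):
        for col in range(0, padded.shape[1], tile_w):
            tiles.append((row, col, padded[row:row + tile_h, col:col + tile_w]))
    return tiles

# Example: a 300 x 500 RGB leaf-image cut into 128 x 128 tiles (with border padding).
leaf_image = np.random.rand(300, 500, 3)
tiles = split_into_tiles(leaf_image, 128, 128)   # 3 x 4 = 12 tiles
```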
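The "second CNN" (272) is trained on object-annotated images so that the integral of its output density map approximates the object count; the claim describes generating such maps by convolving each pixel with a kernel function. A common construction for the training targets, assumed here rather than taken from the patent text, is to place a unit impulse at each annotated object centre and smooth it with a Gaussian kernel so the map integrates to the number of objects. The sketch below shows that target construction with SciPy and a toy fully convolutional regressor in PyTorch; make_density_target and DensityRegressor are illustrative names.

```python
# Hypothetical density-map training target and a toy "second CNN" (272).
# Assumes the common Gaussian-kernel construction: one unit of mass per
# annotated object, smoothed so the map's integral ~= the object count.
import numpy as np
import torch
import torch.nn as nn
from scipy.ndimage import gaussian_filter

def make_density_target(tile_h, tile_w, object_centres, sigma=4.0):
    """Density map whose pixel values sum to roughly len(object_centres)."""
    impulses = np.zeros((tile_h, tile_w), dtype=np.float32)
    for y, x in object_centres:
        impulses[y, x] += 1.0
    # Gaussian smoothing spreads each unit of mass without changing the total.
    return gaussian_filter(impulses, sigma=sigma)

class DensityRegressor(nn.Module):
    """Toy fully convolutional network mapping a tile to a same-size density map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 1),
        )

    def forward(self, x):
        return self.net(x)

# One illustrative training step on a single annotated 128 x 128 tile.
tile = torch.rand(1, 3, 128, 128)
target = torch.from_numpy(make_density_target(128, 128, [(20, 30), (64, 90)]))
model = DensityRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.functional.mse_loss(model(tile)[0, 0], target)
loss.backward()
optimizer.step()
print(float(target.sum()))   # ~2.0: the target's integral matches the annotated count
```

A tile that shows no objects gets an all-zero target, so its predicted map should integrate to roughly zero, which is the "different integral values" behaviour recited in the claim.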
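The final stage stitches the per-tile density maps back into a map with the leaf-image's dimensions and integrates (sums) its pixel values to obtain the estimate NEST. A minimal NumPy sketch that inverts the tiling sketch above, discarding the zero-padded border before summing; stitch_and_count is an illustrative name, and non-overlapping tiles are assumed.

```python
# Hypothetical recombination step: place each tile's density map back at its
# (row, col) offset, crop away the tiling padding, and sum to estimate NEST.
import numpy as np

def stitch_and_count(density_tiles, leaf_h, leaf_w, tile_h, tile_w):
    """density_tiles: list of (row, col, map) with map shape (tile_h, tile_w)."""
    pad_h = (-leaf_h) % tile_h
    pad_w = (-leaf_w) % tile_w
    combined = np.zeros((leaf_h + pad_h, leaf_w + pad_w), dtype=np.float32)
    for row, col, density in density_tiles:
        combined[row:row + tile_h, col:col + tile_w] = density
    combined = combined[:leaf_h, :leaf_w]          # drop the zero-padded border
    n_est = float(combined.sum())                  # "integrating" the density map
    return combined, n_est

# Example with two fake density maps that each integrate to 0.5 objects:
tiles = [(0, 0, np.full((128, 128), 0.5 / (128 * 128))),
         (0, 128, np.full((128, 128), 0.5 / (128 * 128)))]
combined_map, n_est = stitch_and_count(tiles, leaf_h=128, leaf_w=256,
                                       tile_h=128, tile_w=128)
print(round(n_est, 2))   # ~1.0 estimated objects
```

If tiles were allowed to overlap, the overwrite in the loop would have to be replaced by averaging over the overlap region; the claim only requires tiles with pre-defined dimensions, so the simpler non-overlapping layout is assumed here.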