US 11,812,010 B2
Plant feature detection using captured images
Lee Kamp Redden, Palo Alto, CA (US)
Assigned to BLUE RIVER TECHNOLOGY INC., Sunnyvale, CA (US)
Filed by Blue River Technology Inc., Sunnyvale, CA (US)
Filed on Dec. 15, 2022, as Appl. No. 18/082,312.
Application 18/082,312 is a continuation of application No. 17/853,925, filed on Jun. 30, 2022, granted, now 11,570,420.
Application 17/853,925 is a continuation of application No. 17/012,055, filed on Sep. 4, 2020, granted, now 11,425,354, issued on Aug. 23, 2022.
Application 17/012,055 is a continuation of application No. 16/569,649, filed on Sep. 12, 2019, granted, now 10,812,776, issued on Oct. 20, 2020.
Application 16/569,649 is a continuation of application No. 15/407,644, filed on Jan. 17, 2017, granted, now 10,491,879, issued on Nov. 26, 2019.
Claims priority of provisional application 62/279,599, filed on Jan. 15, 2016.
Prior Publication US 2023/0127880 A1, Apr. 27, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. H04N 13/243 (2018.01); H04N 13/204 (2018.01); G06T 7/593 (2017.01); H04N 13/128 (2018.01); H04N 13/239 (2018.01); H04N 13/271 (2018.01); G06V 20/00 (2022.01); G06V 10/764 (2022.01); G06V 20/10 (2022.01); H04N 13/00 (2018.01); G06F 18/2415 (2023.01)
CPC H04N 13/243 (2018.05) [G06F 18/2415 (2023.01); G06T 7/593 (2017.01); G06V 10/764 (2022.01); G06V 20/188 (2022.01); G06V 20/38 (2022.01); H04N 13/128 (2018.05); H04N 13/204 (2018.05); H04N 13/239 (2018.05); H04N 13/271 (2018.05); G06T 2207/10021 (2013.01); G06T 2207/20076 (2013.01); G06T 2207/20081 (2013.01); G06T 2207/30252 (2013.01); G06V 2201/12 (2022.01); H04N 2013/0081 (2013.01); H04N 2013/0092 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method for generating a combined depth map representing at least a portion of a field comprising a plurality of plants, the method comprising:
accessing, from an image acquisition system of a farming machine travelling through the field, a plurality of images of the field, each image including pixels comprising information describing plants in the image, and distances between objects in the field and the farming machine;
classifying, using a plant identification model, pixels in the plurality of images as plants of the plurality of plants;
generating, using a depth map generation model and classified plants in the plurality of images, a plurality of depth maps quantifying distances between classified plants and the farming machine;
combining the plurality of depth maps into a combined depth map spatially locating classified plants in the field, the combined depth map comprising a plant cluster comprising spatially proximal plants identified between two or more of the depth maps of the plurality;
determining the plant cluster is an individual plant of the plurality of plants based on a spatial proximity of plants in the plant cluster and modifying the combined depth map such that the plant cluster is represented as the individual plant in the combined depth map; and
treating at least one of the plurality of plants in the field based on locations of plants in the combined depth map.
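The claim above walks through an image-to-treatment pipeline: classify plant pixels, build a depth map per image, merge the maps into a combined field map, and collapse spatially proximal detections into individual plants before treatment. The Python sketch below illustrates that flow under stated assumptions only; the classifier (a simple excess-green threshold), the stereo depth formula, the greedy distance-based clustering, and all names and parameters (classify_plants, depth_map_for_plants, merge_and_cluster, merge_radius_m, focal_px, baseline_m) are hypothetical placeholders, not the patented plant identification or depth map generation models.

# Minimal sketch of the claimed pipeline stages; all models are placeholders.
from dataclasses import dataclass

import numpy as np


@dataclass
class PlantDetection:
    """A classified plant region located in field coordinates."""
    x: float      # lateral position relative to the machine (m)
    y: float      # position along the direction of travel (m)
    depth: float  # estimated distance from the farming machine (m)


def classify_plants(image: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Placeholder plant identification model: boolean mask of plant pixels.

    Uses an excess-green index on an HxWx3 float RGB image; a real system
    would use a trained segmentation model.
    """
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    return (2.0 * g - r - b) > threshold


def depth_map_for_plants(mask: np.ndarray, disparity: np.ndarray,
                         focal_px: float = 700.0, baseline_m: float = 0.1) -> np.ndarray:
    """Placeholder depth map generation: stereo depth for classified plant pixels only."""
    depth = np.full(mask.shape, np.nan)
    valid = mask & (disparity > 0)
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth


def merge_and_cluster(detections, merge_radius_m: float = 0.05):
    """Combine detections from multiple depth maps into one set of plants.

    Detections from different maps that fall within merge_radius_m of each
    other are treated as a single individual plant (greedy clustering).
    """
    plants = []
    for det in detections:
        for plant in plants:
            if np.hypot(det.x - plant.x, det.y - plant.y) < merge_radius_m:
                # Spatially proximal cluster: represent it as one individual plant.
                plant.x = (plant.x + det.x) / 2.0
                plant.y = (plant.y + det.y) / 2.0
                plant.depth = min(plant.depth, det.depth)
                break
        else:
            plants.append(PlantDetection(det.x, det.y, det.depth))
    return plants

In practice, a system along these lines would run the classifier and depth estimation on each captured image as the machine travels, transform the per-image detections into a common field coordinate frame before merging, and drive the treatment mechanism from the plant locations in the combined map.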