US 11,900,350 B2
Automatic inventory tracking in brick and mortar store based on sensor data
Marvin Balaoro, San Francisco, CA (US); Nicholas Slaney, San Francisco, CA (US); Brett Andler, San Francisco, CA (US); Nikolaj Leschly, Alameda, CA (US); Jeremy Martin, Oakland, CA (US); Joshua Boilard, San Francisco, CA (US); Isreal Blagdan, San Francisco, CA (US); and Patrick Willett, San Francisco, CA (US)
Assigned to Block, Inc., Oakland, CA (US)
Filed by Block, Inc., Oakland, CA (US)
Filed on Jun. 22, 2021, as Appl. No. 17/355,015.
Claims priority of provisional application 63/191,906, filed on May 21, 2021.
Prior Publication US 2022/0374855 A1, Nov. 24, 2022
Int. Cl. G06Q 20/20 (2012.01); H04W 4/80 (2018.01); G06Q 20/40 (2012.01)
CPC G06Q 20/203 (2013.01) [G06Q 20/204 (2013.01); G06Q 20/209 (2013.01); G06Q 20/401 (2013.01); H04W 4/80 (2018.02)] 20 Claims
OG exemplary drawing
 
1. A system for identifying an interaction in a merchant brick-and-mortar (BAM) area, the system comprising:
one or more sensors that are located within a BAM area associated with a merchant and include at least one camera with a field of view covering at least part of the BAM area;
one or more memory units storing instructions; and
one or more processors, wherein execution of the instructions by the one or more processors causes the one or more processors to:
receive, from the one or more sensors, sensor data captured by the one or more sensors, the sensor data including at least one image of the BAM area captured by the at least one camera;
identify, in the sensor data and based on extraction of a first set of features from the sensor data and classification of the first set of features, a representation of an individual who is located in the BAM area, the first set of features including at least a first image feature of the at least one image;
track, based on the representation of the individual in the sensor data and one or more respective locations of the one or more sensors in the BAM area, one or more locations of the individual within the BAM area as the individual moves through the BAM area;
identify, in the sensor data and based on extraction of a second set of features from the sensor data and analysis of the second set of features, a representation of an object that is in the BAM area, the second set of features including at least a second image feature of the at least one image;
determine that the individual has moved the object across a boundary of an inventory area within the BAM area based on at least one of the representation of the individual, the representation of the object, or the one or more locations of the individual within the BAM area;
automatically adjust a merchantable inventory of the object in response to determination of the individual moving the object across the boundary of the inventory area;
automatically track changes over time to a rate of change of the merchantable inventory of the object, the changes including a change based on determination of the individual moving the object across the boundary of the inventory area;
automatically generate, at a first time and based on input into a trained neural network of the tracked changes over time to the rate of change of the merchantable inventory of the object, a predictive recommendation regarding a predicted quantity at a second time of the merchantable inventory of the object in the inventory area, wherein the second time is after the first time;
automatically output the predictive recommendation to a merchant device associated with the merchant; and
automatically train the trained neural network further based on training data that includes the predictive recommendation regarding the predicted quantity compared with an observed quantity of the merchantable inventory of the object in the inventory area at the second time, the observed quantity determined based on additional sensor data from the one or more sensors, the additional sensor data including additional image data from the at least one camera.
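The claim's first identification steps describe extracting sets of image features from sensor data and classifying them to recognize a representation of an individual or an object. As a purely illustrative sketch — not the patented implementation, and with all names hypothetical — a toy version of "feature extraction and classification" using color-histogram features and nearest-centroid matching might look like:

```python
import math

def extract_features(image, bins=4):
    """Reduce an RGB image (nested lists of (r, g, b) values in 0-255) to a
    normalized per-channel color histogram -- a toy stand-in for the claimed
    image features extracted from camera data."""
    hist = [0.0] * (bins * 3)
    n = 0
    for row in image:
        for r, g, b in row:
            hist[r * bins // 256] += 1            # red-channel bin
            hist[bins + g * bins // 256] += 1     # green-channel bin
            hist[2 * bins + b * bins // 256] += 1 # blue-channel bin
            n += 1
    return [v / n for v in hist]

def classify(features, centroids):
    """Nearest-centroid classification of a feature vector. `centroids` maps
    a label (e.g. "individual", "shelf_object") to a reference vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(features, centroids[label]))
```

In practice the claimed system would use learned detectors over real camera frames; this sketch only shows the extract-then-classify structure the claim recites.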
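The claim's core trigger is determining that a tracked object has moved across the boundary of an inventory area and automatically adjusting merchantable inventory in response. A minimal sketch of that inside-to-outside transition test, assuming a hypothetical axis-aligned rectangular inventory area and invented class names:

```python
from dataclasses import dataclass, field

@dataclass
class InventoryArea:
    """Hypothetical axis-aligned rectangular inventory area in floor coordinates."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

@dataclass
class InventoryTracker:
    """Decrements merchantable inventory when a tracked object crosses the
    boundary of the inventory area (inside -> outside)."""
    area: InventoryArea
    counts: dict = field(default_factory=dict)
    _last_inside: dict = field(default_factory=dict)

    def observe(self, object_id: str, x: float, y: float) -> bool:
        """Record one tracked position; return True on a boundary crossing."""
        inside = self.area.contains(x, y)
        was_inside = self._last_inside.get(object_id)
        self._last_inside[object_id] = inside
        if was_inside and not inside:
            # Object moved across the boundary: adjust merchantable inventory.
            self.counts[object_id] = self.counts.get(object_id, 0) - 1
            return True
        return False
```

For example, observing an object at (1.0, 1.0) inside a 4×4 area and then at (5.0, 1.0) outside it reports a crossing and decrements the count. The real system derives these positions from the multi-sensor tracking the claim recites; the geometry here is deliberately simplified.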
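The final claim elements feed the tracked rate of change of inventory into a trained neural network to predict a future quantity. The sketch below is a deliberately tiny stand-in — a one-hidden-layer network trained by plain gradient descent in pure Python, with every function name and hyperparameter invented for illustration — showing the shape of "train on rate-of-change history, then roll the model forward to a later time":

```python
import math
import random

def forecast_quantity(rates, current_qty, horizon, window=3,
                      hidden=4, epochs=2000, lr=0.05):
    """Fit a tiny tanh MLP mapping the last `window` rates of change to the
    next rate, then roll it `horizon` steps forward from `current_qty`."""
    random.seed(0)
    # Training pairs: sliding window of past rates -> next observed rate.
    pairs = [(rates[i:i + window], rates[i + window])
             for i in range(len(rates) - window)]
    W1 = [[random.uniform(-0.5, 0.5) for _ in range(window)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [random.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, y in pairs:
            h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
                 for row, b in zip(W1, b1)]
            pred = sum(w * hi for w, hi in zip(W2, h)) + b2
            err = pred - y
            # Backpropagate the squared-error gradient through both layers.
            for j in range(hidden):
                gh = err * W2[j] * (1 - h[j] ** 2)
                W2[j] -= lr * err * h[j]
                for i in range(window):
                    W1[j][i] -= lr * gh * x[i]
                b1[j] -= lr * gh
            b2 -= lr * err
    # Roll forward, feeding predicted rates back in as the new window.
    recent = list(rates[-window:])
    qty = current_qty
    for _ in range(horizon):
        h = [math.tanh(sum(w * xi for w, xi in zip(row, recent)) + b)
             for row, b in zip(W1, b1)]
        next_rate = sum(w * hi for w, hi in zip(W2, h)) + b2
        qty += next_rate
        recent = recent[1:] + [next_rate]
    return qty
```

The claim's closing element — comparing the prediction against the sensor-observed quantity at the second time and training further on that comparison — corresponds to appending each (prediction, observation) pair to the training data and repeating the fit, which this sketch omits for brevity.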