US 12,243,353 B2
Image processing for tracking actions of individuals
Brent Vance Zucker, Roswell, GA (US); and Adam Justin Lieberman, Suwanee, GA (US)
Assigned to NCR Voyix Corporation, Atlanta, GA (US)
Filed by NCR Voyix Corporation, Atlanta, GA (US)
Filed on Apr. 23, 2021, as Appl. No. 17/239,039.
Application 17/239,039 is a continuation of application No. 16/174,805, filed on Oct. 30, 2018, granted, now 11,055,874.
Prior Publication US 2021/0241490 A1, Aug. 5, 2021
This patent is subject to a terminal disclaimer.
Int. Cl. G06V 40/20 (2022.01); G06T 7/20 (2017.01); G06T 7/73 (2017.01); G06V 10/20 (2022.01); G06V 10/62 (2022.01); G06V 20/52 (2022.01)
CPC G06V 40/20 (2022.01) [G06T 7/20 (2013.01); G06T 7/74 (2017.01); G06V 10/255 (2022.01); G06V 20/52 (2022.01); G06T 2207/20084 (2013.01); G06T 2207/30196 (2013.01); G06V 10/62 (2022.01)] 19 Claims
OG exemplary drawing
 
1. A method comprising:
tracking a person and an item from a time-stamped stream of images;
assigning a unique person identifier for the person;
isolating item pixels associated with the item from the time-stamped stream of images by removing first pixels associated with at least one individual and second pixels associated with at least one known background object;
providing an item identifier for the item based on the item pixels once the isolated item pixels are sufficient to provide the item identifier after iterating the tracking and the isolating as the person travels with the item within a store;
wherein iterating further includes:
updating metadata comprising at least one dynamic attribute for the item of the time-stamped stream of images; and
refining the metadata from each current image processed with at least one additional or modified attribute identified with the current image processed;
tracking movements of the item relative to the person from the time-stamped stream of images;
determining a relationship between the item and the person based on the tracking using a trained neural network that takes as input the time-stamped stream of images and outputs the relationship by:
creating a numerical matrix to hold the time-stamped stream of images;
cropping the time-stamped stream of images to pixels associated with the person being tracked within the numerical matrix; and
providing a reference to the numerical matrix as input to the trained neural network; and
linking the item identifier to a virtual shopping cart maintained for the unique person identifier of the person when the relationship indicates that the person is in possession of the item.
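The input-preparation and cart-linking steps recited in the claim can be sketched in a few lines. This is a minimal illustration only, not the patented implementation: the helper names (`crop_person`, `VirtualCart`, `dummy_model`), the bounding-box representation, and the stand-in classifier are all hypothetical, and a real system would use a trained neural network in place of the dummy callable.

```python
import numpy as np

def crop_person(frame, bbox):
    """Crop the pixels of a tracked person from one frame (hypothetical helper)."""
    x0, y0, x1, y1 = bbox
    return frame[y0:y1, x0:x1]

class VirtualCart:
    """Virtual shopping cart keyed to a unique person identifier."""
    def __init__(self, person_id):
        self.person_id = person_id
        self.items = []

    def add(self, item_id):
        self.items.append(item_id)

def relationship_from_crops(crops, model):
    """Stack the time-stamped crops into one numerical matrix and pass a
    reference to that matrix into the (stand-in) trained network."""
    matrix = np.stack(crops)        # shape: (T, H, W, C)
    return model(matrix)            # e.g. "possession" or "no_possession"

# Stand-in for the trained neural network: any callable over the matrix.
def dummy_model(matrix):
    return "possession" if matrix.mean() > 0 else "no_possession"

# Example: three 4x4 RGB frames; the person's bounding box is fixed here.
frames = [np.ones((4, 4, 3), dtype=np.uint8) * t for t in (1, 2, 3)]
crops = [crop_person(f, (0, 0, 2, 2)) for f in frames]

cart = VirtualCart(person_id="person-42")
if relationship_from_crops(crops, dummy_model) == "possession":
    cart.add("item-7")

print(cart.items)  # ['item-7']
```

The key structural point mirrored from the claim is that the network receives a reference to one numerical matrix holding the cropped, time-stamped image stream, and the cart is updated only when the predicted relationship indicates possession.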