US 12,223,537 B2
Detecting and identifying misplaced items using a sensor array
Shahmeer Ali Mirza, Celina, TX (US); Sarath Vakacharla, Irving, TX (US); Sailesh Bharathwaaj Krishnamurthy, Irving, TX (US); and Deepanjan Paul, Plano, TX (US)
Assigned to 7-ELEVEN, INC., Irving, TX (US)
Filed by 7-ELEVEN, INC., Irving, TX (US)
Filed on Nov. 20, 2023, as Appl. No. 18/515,052.
Application 18/515,052 is a continuation of application No. 17/742,078, filed on May 11, 2022, granted, now 11,847,688.
Application 17/742,078 is a continuation of application No. 16/663,794, filed on Oct. 25, 2019, granted, now 11,367,124, issued on Jun. 21, 2022.
Prior Publication US 2024/0086993 A1, Mar. 14, 2024
This patent is subject to a terminal disclaimer.
Int. Cl. G06Q 30/00 (2023.01); G06Q 30/0601 (2023.01); G06T 7/246 (2017.01); G06T 7/73 (2017.01)
CPC G06Q 30/0633 (2013.01) [G06T 7/246 (2017.01); G06T 7/73 (2017.01)] 20 Claims
OG exemplary drawing
 
1. An object tracking system, comprising:
a first sensor configured to capture a first frame of a global plane for at least a first portion of a space, wherein:
the global plane represents (x,y) coordinates for the at least a first portion of the space;
the first frame comprises a plurality of pixels;
each pixel from the plurality of pixels is associated with a pixel location comprising a pixel row and a pixel column; and
each pixel in the first frame is associated with a pixel value that indicates a distance between the first sensor and a surface in the space;
a second sensor configured to capture a second frame of at least a second portion of the space, wherein the second portion of the space at least partially overlaps with the first portion of the space to define an overlap region; and
a tracking system operably coupled to the first sensor and the second sensor, comprising:
one or more memories operable to store:
a first homography associated with the first sensor, wherein the first homography is configured to:
translate between pixel locations in the first frame and (x,y) coordinates in the global plane; and
translate between pixel values in the first frame and z-coordinates in the global plane;
a first tracking list associated with the first sensor, wherein the first tracking list identifies:
an object identifier for an object being tracked by the first sensor; and
pixel location information corresponding with a location of the object in the first frame; and
a second tracking list associated with the second sensor; and
one or more processors operably coupled to the one or more memories, configured to:
receive the first frame;
identify the object within the first frame;
determine a first pixel location for the object, wherein the first pixel location comprises a first pixel row and a first pixel column of the first frame;
determine a first pixel value at the first pixel location;
determine that the object is within the overlap region with the second sensor based on the first pixel location; and
in response to determining that the object is within the overlap region with the second sensor:
apply the first homography to the first pixel value to determine a first height for the object;
identify the object identifier for the object from the first tracking list; and
store the object identifier for the object in the second tracking list.
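
Outside the claim language, the recited homography, tracking lists, and overlap-region handoff can be read as a small data flow. The sketch below is one way to realize it, assuming a 3x3 planar homography, a fixed sensor mounting height, and a nearest-entry lookup in the first tracking list; none of these specifics come from the claim itself, and the Homography, TrackingList, and hand_off_object names are hypothetical.

# Minimal sketch of the claimed two-sensor handoff. The class names, the
# 3x3 planar-homography form, the nearest-entry matching, and the fixed
# sensor mounting height are illustrative assumptions, not the patented
# implementation.
from dataclasses import dataclass, field
from typing import Dict, Tuple

import numpy as np


@dataclass
class Homography:
    """Per-sensor mapping into the shared global plane.

    `planar` translates a pixel location (row, col) to (x, y) coordinates;
    `sensor_height` lets a pixel value (sensor-to-surface distance) be
    translated to a z-coordinate, i.e. the height of the surface.
    """
    planar: np.ndarray       # 3x3 pixel -> (x, y) projective mapping (assumed form)
    sensor_height: float     # assumed mounting height of the sensor above the floor

    def pixel_to_xy(self, row: int, col: int) -> Tuple[float, float]:
        p = self.planar @ np.array([row, col, 1.0])
        return p[0] / p[2], p[1] / p[2]

    def pixel_value_to_z(self, distance: float) -> float:
        # The pixel value is the distance from the sensor down to the surface,
        # so the surface's height is the mounting height minus that distance.
        return self.sensor_height - distance


@dataclass
class TrackingList:
    """Per-sensor tracking list: object identifier -> last pixel location."""
    entries: Dict[int, Tuple[int, int]] = field(default_factory=dict)


def hand_off_object(
    first_frame: np.ndarray,          # depth-like frame from the first sensor
    pixel_location: Tuple[int, int],  # (row, col) where the object was identified
    homography: Homography,
    overlap_cols: range,              # columns of the first frame inside the overlap region
    first_list: TrackingList,
    second_list: TrackingList,
) -> None:
    """Steps recited in the claim: read the pixel value at the object's
    location, test whether that location falls in the overlap region, and if
    so derive the object's height and copy its identifier into the second
    sensor's tracking list."""
    row, col = pixel_location
    pixel_value = float(first_frame[row, col])

    # Overlap membership is decided from the pixel location alone
    # (modeled here as a band of columns shared with the second sensor).
    if col not in overlap_cols:
        return

    # Apply the homography to the pixel value to determine the object's height,
    # and to the pixel location to determine its (x, y) position in the global plane.
    height = homography.pixel_value_to_z(pixel_value)
    x, y = homography.pixel_to_xy(row, col)

    # Identify the object's identifier from the first tracking list by
    # matching the stored pixel location (simple nearest-entry assumption).
    object_id = min(
        first_list.entries,
        key=lambda oid: abs(first_list.entries[oid][0] - row)
        + abs(first_list.entries[oid][1] - col),
        default=None,
    )
    if object_id is None:
        return

    # Store the identifier in the second sensor's tracking list so tracking
    # can continue across sensors; (x, y, height) would feed any downstream logic.
    second_list.entries[object_id] = (row, col)
    print(f"object {object_id} handed off at ({x:.2f}, {y:.2f}), height {height:.2f}")

Note the division of labor the claim relies on: the pixel location alone decides overlap membership, while the pixel value (a sensor-to-surface distance) is only converted to a height once the object is known to lie in the overlap region.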