US 12,462,127 B2
Assistive detection of visual states for monitoring devices
James Roger Morley-Smith, Oxfordshire (GB); Cameron James Smith, Whitchurch (GB); and Hannah Marie Legg, London (GB)
Assigned to Zebra Technologies Corporation, Lincolnshire, IL (US)
Filed by ZEBRA TECHNOLOGIES CORPORATION, Lincolnshire, IL (US)
Filed on May 21, 2024, as Appl. No. 18/670,570.
Application 18/670,570 is a continuation of application No. 17/828,634, filed on May 31, 2022, granted, now 11,995,506.
Prior Publication US 2024/0311594 A1, Sep. 19, 2024
Int. Cl. G06K 7/10 (2006.01); G06K 7/14 (2006.01); G06V 10/25 (2022.01)
CPC G06K 7/10722 (2013.01) [G06K 7/1413 (2013.01); G06K 7/1417 (2013.01); G06V 10/25 (2022.01)] 18 Claims
OG exemplary drawing
 
1. A method in a computing device, the method comprising:
capturing, via a camera of the computing device, an image of a receptacle containing items;
detecting, within the image of the receptacle, a machine-readable indicium disposed on the receptacle;
determining, from the machine-readable indicium, a lot number associated with the items in the receptacle and a total quantity of the items in the receptacle;
initiating a logging process for the items in the receptacle;
for each item in the receptacle:
capturing, via the camera, an image of a monitoring device on the item, the monitoring device including a sensor presenting a visual state from a set of predefined visual states based on exposure of the monitoring device to an environmental condition;
detecting, within the image of the monitoring device, a machine-readable indicium and a user accessibility feature disposed on the monitoring device;
determining, from the machine-readable indicium disposed on the monitoring device, a state detection parameter associated with the sensor of the monitoring device;
determining a status of the sensor based on the state detection parameter and a visual state of the sensor in the image of the monitoring device;
logging a status of the item as accepted or rejected based on the determined status of the sensor,
wherein the user accessibility feature, the sensor, and the machine-readable indicium of the monitoring device are separate and spaced away from each other.
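The per-item flow recited in the claim (decode the monitoring device's indicium to obtain a state detection parameter, read the sensor's visual state, classify the item as accepted or rejected, and log the result) can be illustrated with a minimal sketch. All names here are hypothetical illustrations, not the patent's implementation: the ordered `VISUAL_STATES` tuple, the payload convention `accept<=<state>`, and the helper functions are assumptions standing in for the image-capture and barcode-decoding steps.

```python
from dataclasses import dataclass, field

# Hypothetical ordered visual states a monitoring-device sensor can
# present, from least to most exposure to the environmental condition.
VISUAL_STATES = ("unexposed", "partially_exposed", "fully_exposed")

@dataclass
class MonitoringDeviceImage:
    """Stand-in for a captured image of a monitoring device on an item."""
    indicium_payload: str  # text decoded from the device's indicium
    visual_state: str      # visual state read from the sensor region

@dataclass
class ReceptacleLog:
    lot_number: str
    total_quantity: int
    entries: list = field(default_factory=list)

def state_detection_parameter(indicium_payload: str) -> str:
    """Hypothetical decode: the payload names the worst visual state
    the sensor may present while the item is still acceptable,
    e.g. "LOT42|accept<=partially_exposed"."""
    return indicium_payload.split("accept<=")[1]

def determine_status(image: MonitoringDeviceImage) -> str:
    """Classify the item from the sensor's visual state and the
    state detection parameter carried by the indicium."""
    threshold = state_detection_parameter(image.indicium_payload)
    ok = VISUAL_STATES.index(image.visual_state) <= VISUAL_STATES.index(threshold)
    return "accepted" if ok else "rejected"

def log_receptacle(lot_number: str, quantity: int,
                   images: list) -> ReceptacleLog:
    """Run the logging process over each item's monitoring-device image."""
    log = ReceptacleLog(lot_number, quantity)
    for img in images:
        log.entries.append(determine_status(img))
    return log
```

Under these assumptions, an item whose sensor still shows "unexposed" is logged as accepted, while one showing "fully_exposed" against a "partially_exposed" threshold is logged as rejected.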