US 12,333,862 B2
Information processing device
Muneaki Onozato, Tokyo (JP); Satoshi Terasawa, Tokyo (JP); Ryoma Oami, Tokyo (JP); and Yadong Pan, Tokyo (JP)
Assigned to NEC CORPORATION, Tokyo (JP)
Filed by NEC Corporation, Tokyo (JP)
Filed on Aug. 25, 2023, as Appl. No. 18/238,376.
Application 18/238,376 is a continuation of application No. 17/421,532, granted, now Pat. No. 12,183,122, previously published as PCT/JP2019/001464, filed on Jan. 18, 2019.
Prior Publication US 2023/0410558 A1, Dec. 21, 2023
Int. Cl. G06V 40/20 (2022.01); G06V 20/52 (2022.01); G06V 40/10 (2022.01)
CPC G06V 40/20 (2022.01) [G06V 20/52 (2022.01); G06V 40/103 (2022.01)] 18 Claims
OG exemplary drawing
 
1. A monitoring device comprising:
a memory storing instructions; and
a processor connected to the memory and configured to execute the instructions to:
receive a captured image of a target place;
extract an image portion of a person in the captured image;
extract, based on the image portion of the person, person attribute information representing an attribute of the person, person action information representing a movement route of the person, and scene information of the target place;
extract, based on a distance between the person and another person, the person action information representing a movement route of a group including a plurality of the persons;
store the person attribute information, the person action information, and the scene information in association with each other;
generate reference attribute information representing an attribute and an action of the plurality of persons in the group corresponding to the target place by learning the person attribute information, the person action information, and the scene information stored in association with each other in a storage device;
acquire environment information representing a surrounding environment of the target place from another information processing device connected via a network;
acquire, based on the environment information, the reference attribute information and reference action information associated with the target place, from the storage device;
extract person attribute information representing an attribute of a different person and person action information of the different person in a different captured image obtained by capturing a different image of the target place; and
detect, in the different captured image, a predetermined person performing a predetermined action in the target place, based on the reference attribute information, the reference action information, the person attribute information representing the attribute of the different person, and the person action information of the different person.
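
As a rough illustration of the flow recited in claim 1 (extracting per-person attributes and movement routes, grouping persons by pairwise distance, learning scene-associated reference information, and later flagging a person who departs from that reference), the Python sketch below renders the pipeline under simplifying assumptions. Every name here (PersonObservation, GROUP_DISTANCE, ReferenceStore, detect_predetermined_person, the "station_entrance" scene key) is a hypothetical stand-in, and the bare accumulation used as "learning" is illustrative only, not the claimed implementation.

# Minimal, illustrative sketch of the claimed monitoring pipeline.
# All names, thresholds, and data structures are hypothetical; an actual
# embodiment would use trained detectors and attribute extractors.

from dataclasses import dataclass, field
from math import dist
from collections import defaultdict

@dataclass
class PersonObservation:
    person_id: int
    position: tuple                      # (x, y) image coordinates, hypothetical
    attributes: dict                     # e.g. {"clothing": "dark"}
    route: list = field(default_factory=list)   # movement route as a position sequence

GROUP_DISTANCE = 50.0                    # hypothetical pixel threshold for grouping

def group_by_distance(observations):
    """Cluster persons whose pairwise distance falls below GROUP_DISTANCE,
    a simple stand-in for the claim's distance-based group extraction."""
    groups = []
    for obs in observations:
        for group in groups:
            if any(dist(obs.position, other.position) < GROUP_DISTANCE for other in group):
                group.append(obs)
                break
        else:
            groups.append([obs])
    return groups

class ReferenceStore:
    """Associates a scene key with reference attribute/action pairs; the
    'learning' here is a bare accumulation, standing in for the claimed
    generation of reference attribute information."""
    def __init__(self):
        self._store = defaultdict(list)

    def learn(self, scene, attributes, action):
        self._store[scene].append((attributes, action))

    def acquire(self, scene):
        return self._store.get(scene, [])

def detect_predetermined_person(store, scene, observation):
    """Flag a person whose attributes match no stored reference for the scene,
    a stand-in for the claimed detection step."""
    references = store.acquire(scene)
    return not any(observation.attributes == attrs for attrs, _ in references)

# Usage sketch: learn from one capture of the target place, then check a later capture.
observations = [PersonObservation(1, (10.0, 20.0), {"clothing": "bright"}),
                PersonObservation(2, (15.0, 22.0), {"clothing": "dark"})]
print(len(group_by_distance(observations)))            # 1 -> the two persons form one group

store = ReferenceStore()
store.learn("station_entrance", {"clothing": "dark"}, action="walk_through")
print(detect_predetermined_person(store, "station_entrance", observations[0]))   # True -> flagged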