US 11,756,036 B1
Utilizing sensor data for automated user identification
Manoj Aggarwal, Seattle, WA (US); Prithviraj Banerjee, Redmond, WA (US); Gerard Guy Medioni, Seattle, WA (US); and Brad Musick, Seattle, WA (US)
Assigned to Amazon Technologies, Inc., Seattle, WA (US)
Filed by Amazon Technologies, Inc., Seattle, WA (US)
Filed on Dec. 13, 2019, as Appl. No. 16/714,348.
Int. Cl. G06V 40/12 (2022.01); G06Q 20/40 (2012.01); G06V 10/46 (2022.01); G06V 40/13 (2022.01); G06F 18/2321 (2023.01)
CPC G06Q 20/40145 (2013.01) [G06F 18/2321 (2023.01); G06V 10/462 (2022.01); G06V 40/13 (2022.01); G06V 40/1359 (2022.01); G06V 40/1376 (2022.01)]
18 Claims
1. A system comprising:
one or more processors; and one or more non-transitory computer-readable media storing computer-executable instructions that, when executed, cause the one or more processors to perform operations comprising:
receiving, from a user-recognition system, a request to verify that a first palm depicted in first image data corresponds to a second palm depicted in second image data;
receiving the first image data depicting the first palm;
receiving the second image data depicting the second palm;
identifying, using the first image data, a first characteristic exhibited in a first portion of the first palm, the first portion being less than an entirety of the first palm;
extracting, from the first image data, first feature data representing the first characteristic exhibited in the first palm;
extracting, from the second image data, second feature data representing the first characteristic exhibited in the second palm;
analyzing, by the system, the first feature data with reference to the second feature data;
calculating, based at least in part on the analyzing the first feature data, a first similarity score representing a measure of similarity between the first characteristic exhibited in the first palm and the first characteristic exhibited in the second palm;
identifying, using the first image data, a second characteristic exhibited in a second portion of the first palm, the second portion being less than an entirety of the first palm;
extracting, from the first image data, third feature data representing the second characteristic exhibited in the first palm;
extracting, from the second image data, fourth feature data representing the second characteristic exhibited in the second palm;
analyzing, by the system, the third feature data with reference to the fourth feature data;
calculating, based at least in part on the analyzing the third feature data, a second similarity score representing a measure of similarity between the second characteristic exhibited in the first palm and the second characteristic exhibited in the second palm;
verifying, based at least in part on the first similarity score and the second similarity score, that the first palm corresponds to the second palm; and
sending, to the user-recognition system, output data verifying that the first palm corresponds to the second palm.
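The verification flow recited in claim 1 — extract per-characteristic feature data from two palm images, score the similarity of each characteristic, and fuse the scores into a verification decision — can be sketched as follows. This is a minimal illustration, not the patented implementation: the claim does not specify a similarity measure or fusion rule, so cosine similarity and a mean-score threshold are assumed here, and the characteristic names (`"creases"`, `"ridges"`) and the `verify_palms` helper are hypothetical.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity score between two 1-D feature vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_palms(first_features, second_features, threshold=0.9):
    """Decide whether a first palm corresponds to a second palm.

    first_features / second_features: dicts mapping a characteristic
    name (e.g. "creases" for one palm portion, "ridges" for another,
    each portion less than the entire palm) to the feature vector
    extracted from the corresponding image data.

    Returns (verified, scores), where scores holds one similarity
    score per characteristic (the claim's first and second scores).
    """
    scores = {
        name: cosine_similarity(first_features[name], second_features[name])
        for name in first_features
    }
    # Fuse the per-characteristic scores; a simple mean over all
    # scores against a fixed threshold is one possible rule.
    verified = bool(np.mean(list(scores.values())) >= threshold)
    return verified, scores

# Example with simulated feature vectors standing in for extracted data.
rng = np.random.default_rng(0)
enrolled = {"creases": rng.normal(size=8), "ridges": rng.normal(size=8)}
verified, scores = verify_palms(enrolled, enrolled)
```

Comparing a feature set against itself yields a score of 1.0 for every characteristic, so verification succeeds; a dissimilar palm produces low scores and the decision fails. In the claimed system, the boolean result would be packaged as output data and returned to the requesting user-recognition system.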