CPC G06Q 30/0631 (2013.01) [A45D 34/04 (2013.01); A45D 34/06 (2013.01); A45D 44/005 (2013.01); A61K 8/02 (2013.01); A61Q 1/02 (2013.01); A61Q 1/12 (2013.01); A61Q 17/005 (2013.01); B08B 3/08 (2013.01); G06T 7/90 (2017.01); G06T 11/001 (2013.01); G06V 10/774 (2022.01); G06V 10/82 (2022.01); G06V 20/40 (2022.01); G06V 40/161 (2022.01); G06V 40/169 (2022.01); G06V 40/171 (2022.01); H04W 4/80 (2018.02); H04W 64/006 (2013.01); A45D 2034/002 (2013.01); A45D 2034/005 (2013.01); A45D 2200/058 (2013.01); A45D 2200/205 (2013.01); A61K 2800/59 (2013.01); G06F 16/9035 (2019.01); G06Q 30/0643 (2013.01); G06Q 50/01 (2013.01); G06T 2207/10016 (2013.01); G06T 2207/30196 (2013.01)] | 5 Claims |
1. A system, comprising:
a mobile user device that includes processing circuitry configured to
execute an application that determines a skintone of a user, and
determine and transmit a recipe for generating a target foundation that is based on a combination of a plurality of separate foundation ingredients that are associated with the skintone of the user; and
a dispensing device configured to receive the transmitted recipe from the mobile user device and dispense each of the plurality of separate foundation ingredients onto a common dispensing surface such that when the dispensed amounts of each of the plurality of separate foundation ingredients are blended on the dispensing surface, the target foundation is achieved,
wherein the processing circuitry of the mobile user device is configured to determine the skintone of the user based on features in a detected face of the user in a self-taken image of the user that is captured by a camera of the mobile user device, and
wherein the self-taken image is a 360° video and the processing circuitry of the mobile user device is configured to predict the skintone of the user further based on a deep learning model based on first metadata associated with the self-taken image that includes information of a season and location for when the self-taken image is captured and second metadata that includes historical climate data.
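The metadata-assisted skintone prediction recited in claim 1 can be sketched, for illustration only, as a function that combines face-pixel color features with the first metadata (season, capture location) and the second metadata (historical climate data). The following is a minimal, hypothetical sketch, not the claimed deep learning model: all weights, feature names, and the 0-to-1 tone scale are assumptions chosen for illustration.

```python
# Illustrative sketch (hypothetical, not the claimed model): predict a
# skintone index by combining detected-face color features with capture
# metadata (season, location) and historical climate data.
import math

def _sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_skintone(mean_rgb, season, latitude, avg_uv_index):
    """Return a tone index in [0, 1] (0 = lightest, 1 = deepest).

    mean_rgb     -- average (R, G, B) of detected face-skin pixels, 0-255
    season       -- 0..3 (winter..autumn), from the first metadata
    latitude     -- capture location, from the first metadata
    avg_uv_index -- historical climate datum, from the second metadata
    """
    r, g, b = (c / 255.0 for c in mean_rgb)
    # Relative luminance of the face region; deeper tones have lower luminance.
    luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b
    # Metadata nudges the estimate: summer capture and high-UV, low-latitude
    # climates make a tanned (slightly deeper) reading more plausible.
    season_term = 0.05 if season in (1, 2) else 0.0          # spring/summer
    uv_term = 0.02 * min(avg_uv_index, 11) / 11.0
    lat_term = 0.02 * (1.0 - min(abs(latitude), 66.0) / 66.0)
    # Squash into [0, 1]; a trained model would learn these weights.
    return _sigmoid(6.0 * (1.0 - luminance) - 3.0
                    + season_term + uv_term + lat_term)
```

A trained deep learning model would replace the fixed weights above with learned parameters, but the input structure (color features plus the two metadata streams) mirrors what the claim recites.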