CPC A45D 44/005 (2013.01) [G06F 3/011 (2013.01); G06F 3/017 (2013.01); G06T 19/20 (2013.01); G06V 20/20 (2022.01); G06V 40/107 (2022.01); G06V 40/161 (2022.01); G06V 40/171 (2022.01); G06V 40/174 (2022.01); G06V 40/28 (2022.01); G06T 2219/2016 (2013.01)] | 20 Claims |
1. A method implemented in a computing device for navigating a user interface using a hybrid touchless control mechanism, comprising:
capturing, by a camera, a live video of a user;
determining a location of a facial region of the user in the live video;
determining a location of the user's hand in the live video and determining a finger vector type based on a direction in which at least one finger is pointing relative to the facial region of the user in the live video;
responsive to detecting in the live video a first finger vector type occurring within the facial region involving a single finger, displaying a makeup effects toolbar in the user interface;
responsive to detecting in the live video a second finger vector type involving the single finger, displaying a selection tool for selecting a makeup effect in the makeup effects toolbar;
obtaining a selected makeup effect based on manipulation by the user of the selection tool; and
responsive to detecting a target user action, performing virtual application of the selected makeup effect on the facial region of the user.
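The claimed method can be read as a small gesture-driven state machine: each detected finger vector type advances the UI from showing the toolbar, to showing the selection tool, to virtually applying the selected effect. The sketch below is a minimal, hypothetical illustration of those transitions only; the computer-vision steps (face-region detection, hand detection, and finger-vector classification from the live video) are assumed to be handled by an upstream detector, and all names (`Gesture`, `MakeupUIState`, `on_gesture`) are invented for this sketch rather than taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class Gesture(Enum):
    """Finger-vector events assumed to come from an upstream hand/face detector."""
    SINGLE_FINGER_IN_FACE_REGION = auto()  # first finger vector type (claim)
    SINGLE_FINGER_SECOND_TYPE = auto()     # second finger vector type (claim)
    TARGET_USER_ACTION = auto()            # e.g., a confirming gesture

@dataclass
class MakeupUIState:
    toolbar_visible: bool = False
    selection_tool_visible: bool = False
    selected_effect: Optional[str] = None
    applied_effect: Optional[str] = None

    def on_gesture(self, gesture: Gesture, effect: Optional[str] = None) -> None:
        # First finger vector type within the facial region: show the toolbar.
        if gesture is Gesture.SINGLE_FINGER_IN_FACE_REGION:
            self.toolbar_visible = True
        # Second finger vector type: show the selection tool; the `effect`
        # argument stands in for the user's manipulation of that tool.
        elif gesture is Gesture.SINGLE_FINGER_SECOND_TYPE and self.toolbar_visible:
            self.selection_tool_visible = True
            if effect is not None:
                self.selected_effect = effect
        # Target user action: virtually apply the selected effect to the face.
        elif gesture is Gesture.TARGET_USER_ACTION and self.selected_effect:
            self.applied_effect = self.selected_effect

# Walk through the claim's sequence of detections.
ui = MakeupUIState()
ui.on_gesture(Gesture.SINGLE_FINGER_IN_FACE_REGION)
ui.on_gesture(Gesture.SINGLE_FINGER_SECOND_TYPE, effect="lipstick")
ui.on_gesture(Gesture.TARGET_USER_ACTION)
print(ui.applied_effect)  # -> lipstick
```

Gating each transition on the prior state (toolbar before selection tool, selection before application) mirrors the ordering the claim imposes on the detected finger vector types.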