CPC G06F 3/014 (2013.01) [G06F 3/017 (2013.01)] — 22 Claims

1. A method comprising:
at an electronic device with one or more processors, a non-transitory memory, an extremity tracking system, a display, and a communication interface provided to communicate with a finger-wearable device:
while displaying a computer-generated object on the display:
obtaining finger manipulation data from the finger-wearable device via the communication interface, wherein the finger manipulation data indicates a respective movement path of a first finger from a first position of an extended reality (ER) environment to a second position of the ER environment during a first time period;
generating, via the extremity tracking system, extremity tracking data by performing computer vision, wherein performing the computer vision includes determining a respective movement path of a second finger across a plurality of images, wherein the respective movement path of the second finger is from a third position of the ER environment to a fourth position of the ER environment during a second time period, and wherein the respective movement path of the first finger is different from the respective movement path of the second finger and the first time period is at least partially overlapping with the second time period;
determining a multi-finger gesture based on the extremity tracking data from the extremity tracking system and the finger manipulation data, wherein determining the multi-finger gesture is based on the respective movement path of the first finger and the respective movement path of the second finger; and
registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
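The claimed method can be pictured as a small data-flow: a movement path for the first finger arrives from the finger-wearable device, a movement path for the second finger is derived via computer vision, the two are combined into a multi-finger gesture when their time periods at least partially overlap, and an engagement event is registered on the computer-generated object. The sketch below is purely illustrative and is not the patented implementation; every name (`MovementPath`, `classify_multi_finger_gesture`, the toy pinch heuristic) is a hypothetical stand-in chosen for this example.

```python
# Illustrative sketch only -- not the patented implementation.
# All class and function names are hypothetical.
from dataclasses import dataclass

@dataclass
class MovementPath:
    start: tuple   # (x, y, z) position in the ER environment
    end: tuple     # (x, y, z) position in the ER environment
    t_start: float # start of the time period, in seconds
    t_end: float   # end of the time period, in seconds

def overlaps(a: MovementPath, b: MovementPath) -> bool:
    """True if the two time periods are at least partially overlapping."""
    return a.t_start < b.t_end and b.t_start < a.t_end

def classify_multi_finger_gesture(first: MovementPath,
                                  second: MovementPath) -> str:
    """Toy classifier: derive a gesture label from the two movement paths.

    first  -- path from the finger-wearable device (first finger)
    second -- path from computer-vision extremity tracking (second finger)
    """
    if not overlaps(first, second):
        return "no-gesture"
    # Hypothetical heuristic: opposing x-axis motion suggests pinch/spread.
    dx1 = first.end[0] - first.start[0]
    dx2 = second.end[0] - second.start[0]
    if dx1 * dx2 < 0:
        return "pinch-or-spread"
    return "two-finger-drag"

def register_engagement_event(obj: dict, gesture: str) -> dict:
    """Record an engagement event on the computer-generated object."""
    obj.setdefault("events", []).append(gesture)
    return obj
```

In use, a pinch-like pair of paths with overlapping time periods yields a gesture label, which is then attached to the displayed object as an engagement event; paths whose time periods do not overlap produce no gesture, mirroring the claim's "at least partially overlapping" condition.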