US 12,242,668 B2
Multi-finger gesture based on finger manipulation data and extremity tracking data
Aaron M. Burns, Sunnyvale, CA (US); Adam G. Poulos, Saratoga, CA (US); Arun Rakesh Yoganandan, San Francisco, CA (US); Benjamin Hylak, San Francisco, CA (US); Benjamin R. Blachnitzky, San Francisco, CA (US); and Nicolai Georg, San Francisco, CA (US)
Assigned to APPLE INC., Cupertino, CA (US)
Filed by Apple Inc., Cupertino, CA (US)
Filed on Mar. 20, 2023, as Appl. No. 18/123,762.
Application 18/123,762 is a continuation of application No. PCT/US2021/049598, filed on Sep. 9, 2021.
Claims priority of provisional application 63/081,446, filed on Sep. 22, 2020.
Prior Publication US 2023/0333651 A1, Oct. 19, 2023
Int. Cl. G06F 3/01 (2006.01)
CPC G06F 3/014 (2013.01) [G06F 3/017 (2013.01)] 22 Claims
OG exemplary drawing
 
1. A method comprising:
at an electronic device with one or more processors, a non-transitory memory, an extremity tracking system, a display, and a communication interface provided to communicate with a finger-wearable device:
while displaying a computer-generated object on the display:
obtaining finger manipulation data from the finger-wearable device via the communication interface, wherein the finger manipulation data indicates a respective movement path of a first finger from a first position of an extended reality (ER) environment to a second position of the ER environment during a first time period;
generating, via the extremity tracking system, extremity tracking data by performing computer vision, wherein performing the computer vision includes determining a respective movement path of a second finger across a plurality of images, wherein the respective movement path of the second finger is from a third position of the ER environment to a fourth position of the ER environment during a second time period, and wherein the respective movement path of the first finger is different from the respective movement path of the second finger and the first time period is at least partially overlapping with the second time period;
determining a multi-finger gesture based on the extremity tracking data from the extremity tracking system and the finger manipulation data, wherein determining the multi-finger gesture is based on the respective movement path of the first finger and the respective movement path of the second finger; and
registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
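The method of claim 1 can be illustrated with a minimal sketch. All names, data shapes, and the gesture-classification heuristic below are hypothetical illustrations, not the patented implementation: the claim does not specify how the two movement paths are combined into a gesture, so this sketch simply assumes 2-D positions, checks the claimed partial overlap of the two time periods, and labels converging fingertips as a "pinch" and diverging fingertips as a "spread".

```python
from dataclasses import dataclass

# Hypothetical illustration only. Positions are 2-D points in the ER
# environment; each path spans a time period [t_start, t_end].
@dataclass
class MovementPath:
    start: tuple    # (x, y) at the first/third position of the ER environment
    end: tuple      # (x, y) at the second/fourth position
    t_start: float  # seconds
    t_end: float

def periods_overlap(a: MovementPath, b: MovementPath) -> bool:
    """Claim 1 requires the first time period to at least partially
    overlap with the second time period."""
    return a.t_start < b.t_end and b.t_start < a.t_end

def distance(p, q) -> float:
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def determine_multi_finger_gesture(first: MovementPath,
                                   second: MovementPath):
    """Combine the wearable-derived path (first finger) with the
    computer-vision path (second finger) into one gesture label.
    The pinch/spread heuristic is an assumption for illustration."""
    if not periods_overlap(first, second):
        return None  # no concurrent motion -> no multi-finger gesture
    if distance(first.end, second.end) < distance(first.start, second.start):
        return "pinch"   # fingertips converge
    return "spread"      # fingertips diverge

def register_engagement_event(obj: dict, gesture: str) -> dict:
    """Record the engagement event on the computer-generated object."""
    obj.setdefault("events", []).append(gesture)
    return obj
```

In use, a path from the finger-wearable device and a path recovered by computer vision across a plurality of images would each be reduced to a `MovementPath`; for example, paths that start far apart and end close together during overlapping time periods classify as `"pinch"`, and the gesture is then recorded on the displayed object via `register_engagement_event`.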