US 12,443,286 B2
Input recognition based on distinguishing direct and indirect user interactions
David J. Meyer, Menlo Park, CA (US); Julian K. Shutzberg, San Francisco, CA (US); David M. Teitelbaum, San Francisco, CA (US); Daniel J. Brewer, San Jose, CA (US); Bharat C. Dandu, Santa Clara, CA (US); and Christopher D. McKenzie, Burlingame, CA (US)
Assigned to APPLE INC., Cupertino, CA (US)
Filed by Apple Inc., Cupertino, CA (US)
Filed on Sep. 29, 2023, as Appl. No. 18/375,280.
Claims priority of provisional application 63/521,807, filed on Jun. 19, 2023.
Claims priority of provisional application 63/470,565, filed on Jun. 2, 2023.
Prior Publication US 2024/0402821 A1, Dec. 5, 2024
Int. Cl. G06F 3/01 (2006.01)
CPC G06F 3/017 (2013.01) [G06F 3/012 (2013.01); G06F 3/013 (2013.01)] 23 Claims
OG exemplary drawing
 
1. A method comprising:
at an electronic device having a processor:
obtaining a position of a virtual object in an extended reality (XR) environment corresponding to a three-dimensional (3D) space;
obtaining a user hand position in the 3D space based on sensor data, wherein the hand position is associated with a hand gesture;
determining an interaction mode based on the user hand position and the position of the virtual object in the 3D space, wherein a direct interaction mode is selected when a relationship between the user hand position and the position of the virtual object in the 3D space satisfies a first criterion and an indirect interaction mode is selected when the relationship satisfies a second criterion different from the first criterion; and
interpreting the hand gesture using an interaction recognition process associated with the determined interaction mode, wherein a direct interaction recognition process is used when the direct interaction mode is selected and an indirect interaction recognition process is used when the indirect interaction mode is selected.
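 
The following is a minimal sketch of the claimed method, not the patented implementation: the type names, the pinch/point gestures, and in particular the use of a simple hand-to-object distance threshold as the first and second criteria are illustrative assumptions layered onto the claim language.

```swift
import simd

// Hypothetical illustration of the exemplary claim. The distance-threshold
// criterion, the 0.15 m radius, and all identifiers are assumptions for
// illustration, not details disclosed by the patent.

/// The two interaction modes the claim distinguishes.
enum InteractionMode {
    case direct
    case indirect
}

/// A hand gesture derived from sensor data (simplified placeholder).
enum HandGesture {
    case pinch
    case point
}

struct VirtualObject {
    var position: SIMD3<Float>   // position of the virtual object in the 3D space
}

struct InteractionRecognizer {
    /// Assumed first criterion: the hand is within this distance of the object.
    let directInteractionRadius: Float = 0.15   // meters, illustrative value

    /// Select an interaction mode from the hand position and the object position.
    func interactionMode(handPosition: SIMD3<Float>, object: VirtualObject) -> InteractionMode {
        let distance = simd_distance(handPosition, object.position)
        // First criterion (assumed): hand within the threshold -> direct.
        // Second criterion (assumed): hand outside the threshold -> indirect.
        return distance <= directInteractionRadius ? .direct : .indirect
    }

    /// Interpret the gesture using the recognition process for the selected mode.
    func interpret(gesture: HandGesture, handPosition: SIMD3<Float>, object: VirtualObject) {
        switch interactionMode(handPosition: handPosition, object: object) {
        case .direct:
            recognizeDirect(gesture: gesture)     // e.g. touch or press on the object itself
        case .indirect:
            recognizeIndirect(gesture: gesture)   // e.g. gesture performed at a distance
        }
    }

    private func recognizeDirect(gesture: HandGesture) {
        print("Direct recognition of \(gesture)")
    }

    private func recognizeIndirect(gesture: HandGesture) {
        print("Indirect recognition of \(gesture)")
    }
}
```

In this sketch, a hand position closer than the assumed radius routes the gesture to the direct recognition path, while a more distant hand routes it to the indirect path, mirroring the claim's selection between two recognition processes based on the relationship between the hand position and the virtual object position.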