US 12,340,024 B2
Enhanced virtual touchpad
Eyal Bychkov, Hod Hasharon (IL); Oren Brezner, Rishon Lezion (IL); Micha Galor, Tel Aviv (IL); Ofir Or, Ramat Gan (IL); Jonathan Pokrass, Bat Yam (IL); and Amir Eshel, Nes Ziona (IL)
Assigned to Apple Inc., Cupertino, CA (US)
Filed by APPLE INC., Cupertino, CA (US)
Filed on Nov. 7, 2021, as Appl. No. 17/520,671.
Application 17/520,671 is a continuation of application No. 13/849,517, filed on Mar. 24, 2013, granted, now Pat. No. 11,169,611.
Claims priority of provisional application 61/663,638, filed on Jun. 25, 2012.
Claims priority of provisional application 61/615,403, filed on Mar. 26, 2012.
Prior Publication US 2022/0164032 A1, May 26, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 3/01 (2006.01); G06F 3/042 (2006.01); G06F 3/04842 (2022.01); G06F 3/0485 (2022.01); G06F 3/0486 (2013.01); G06F 3/0488 (2022.01)
CPC G06F 3/017 (2013.01) [G06F 3/013 (2013.01); G06F 3/0425 (2013.01); G06F 3/04842 (2013.01); G06F 3/0485 (2013.01); G06F 3/0486 (2013.01); G06F 3/0488 (2013.01); G06F 2203/04806 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method, comprising:
presenting, by a computer, multiple interactive items on a display coupled to the computer;
projecting a light toward a scene that includes a user of the computer;
capturing and processing the projected light returned from the scene so as to reconstruct an initial three-dimensional (3D) map;
capturing and processing a two-dimensional (2D) image containing reflections from an eye of the user;
obtaining 3D coordinates of a head of the user based on the initial 3D map;
identifying, based on the 3D coordinates of the head and the reflections from the eye, a direction of a gaze of the user;
detecting, based on the identified direction of the gaze, that the user is gazing toward an area of the display; and
in a gesture-based interaction step, subsequent to detecting that the user is gazing toward the area of the display:
receiving a sequence of 3D maps containing at least a hand of the user;
analyzing the 3D maps to detect an operation performed by the user, in which the user performs a first gesture with a finger of the hand, then moves the finger side-to-side in a requested scroll direction, and then performs a second gesture with the finger;
in response to the operation performed by the user, scrolling the interactive items on the display in the requested scroll direction by an amount that depends on a length of the side-to-side movement of the finger and a distance of the user from the multiple interactive items on the display; and
controlling a speed of the scrolling in response to the direction of the gaze of the user.
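
For illustration, the following is a minimal sketch, in Python, of the gaze-identification steps recited above: 3D head coordinates obtained from the initial 3D map are combined with pupil and glint positions found in the 2D image of reflections from the eye. All function names and parameters, and the simplified pupil-center/corneal-reflection model, are assumptions for illustration only; the claim recites inputs and outputs, not this algorithm.

```python
import numpy as np

def gaze_direction(head_xyz: np.ndarray,
                   pupil_px: np.ndarray,
                   glint_px: np.ndarray,
                   gain_rad_per_px: float = 0.004) -> np.ndarray:
    """Estimate a gaze direction (unit vector) from the 3D head
    coordinates (from the initial 3D map) and the pupil/glint
    positions located in the 2D image of eye reflections.

    Uses a pupil-center/corneal-reflection style offset: the vector
    from the glint to the pupil center, scaled to angles, deflects a
    head-forward axis (an assumed, simplified model).
    """
    # Angular offsets of the eye relative to the head-forward axis.
    dx, dy = (pupil_px - glint_px) * gain_rad_per_px
    # Head-forward axis: from the head toward the display origin.
    forward = -head_xyz / np.linalg.norm(head_xyz)
    # Small-angle deflection of the forward axis by the eye offsets.
    direction = forward + np.array([dx, dy, 0.0])
    return direction / np.linalg.norm(direction)

def gazed_area(head_xyz: np.ndarray, direction: np.ndarray,
               screen_z: float = 0.0) -> np.ndarray:
    """Intersect the gaze ray with the display plane (z = screen_z)
    to decide which area of the display the user is gazing toward."""
    t = (screen_z - head_xyz[2]) / direction[2]
    return head_xyz[:2] + t * direction[:2]
```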
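
Similarly, a hedged sketch of the scrolling computation in the gesture-based interaction step: the scroll amount grows with the length of the side-to-side finger movement and with the user's distance from the displayed items, and the scrolling speed is gated by whether the gaze remains on the scrolled area. The proportional mapping, the gating rule, and all names are assumptions, not the patented method.

```python
import numpy as np

def scroll_amount(finger_path_m: np.ndarray, user_distance_m: float,
                  gain: float = 400.0) -> float:
    """Scroll distance (pixels) from the side-to-side finger motion.

    finger_path_m: (N, 3) array of 3D finger positions in meters,
        taken from the received sequence of 3D maps.
    user_distance_m: distance of the user from the interactive items,
        so a given hand motion scrolls farther when the user stands
        farther from the display (an assumed proportional mapping).
    """
    # Length of the side-to-side (x-axis) movement of the finger.
    lateral = finger_path_m[:, 0]
    length_m = float(np.abs(lateral[-1] - lateral[0]))
    return gain * length_m * user_distance_m

def scroll_speed(base_speed: float, gaze_point: np.ndarray,
                 scroll_target: np.ndarray,
                 radius_px: float = 150.0) -> float:
    """Control speed in response to the gaze direction: full speed
    while the gaze point stays within the scrolled area, reduced
    speed otherwise (one plausible reading of the speed-control
    step)."""
    if np.linalg.norm(gaze_point - scroll_target) <= radius_px:
        return base_speed
    return 0.25 * base_speed
```

Scaling the scroll amount by user distance keeps the interaction comfortable at range: the same physical hand sweep covers more content when the user is farther from the display, mirroring the distance dependence recited in the claim.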