US 12,189,865 B2
Navigating user interfaces using hand gestures
Tu K. Nguyen, Fountain Valley, CA (US); James N. Cartwright, Campbell, CA (US); Elizabeth C. Cranfill, San Francisco, CA (US); Christopher B. Fleizach, Gilroy, CA (US); Joshua R. Ford, San Francisco, CA (US); Jeremiah R. Johnson, Costa Mesa, CA (US); Charles Maalouf, Seattle, WA (US); Heriberto Nieto, Seattle, WA (US); Jennifer D. Patton, Cupertino, CA (US); Hojat Seyed Mousavi, San Jose, CA (US); Shawn R. Scully, Seattle, WA (US); Ibrahim G. Yusuf, Fremont, CA (US); Joanna Arreaza-Taylor, Seattle, WA (US); Hannah G. Coleman, Albuquerque, NM (US); and Yoonju Han, San Francisco, CA (US)
Assigned to Apple Inc., Cupertino, CA (US)
Filed by Apple Inc., Cupertino, CA (US)
Filed on Feb. 14, 2023, as Appl. No. 18/109,808.
Application 18/109,808 is a continuation of application No. 17/747,613, filed on May 18, 2022.
Claims priority of provisional application 63/221,331, filed on Jul. 13, 2021.
Claims priority of provisional application 63/190,783, filed on May 19, 2021.
Prior Publication US 2023/0195237 A1, Jun. 22, 2023
Int. Cl. G06F 3/01 (2006.01); G06F 3/04812 (2022.01); G06F 3/04817 (2022.01); G06F 3/0482 (2013.01); G06F 3/04842 (2022.01); G06F 3/0485 (2022.01)
CPC G06F 3/017 (2013.01) [G06F 3/04812 (2013.01); G06F 3/04817 (2013.01); G06F 3/0482 (2013.01); G06F 3/04842 (2013.01); G06F 3/0485 (2013.01)] 54 Claims
OG exemplary drawing
 
37. A method, comprising:
at a computer system that is in communication with a display generation component and an optical sensor:
detecting, via at least the optical sensor, a first hand gesture;
in response to detecting, via at least the optical sensor, the first hand gesture:
in accordance with a determination that the first hand gesture is a first type of hand gesture and that the first hand gesture is detected while the computer system is not operating in a hand gesture navigation mode, initiating the hand gesture navigation mode on the computer system; and
in accordance with a determination that the first hand gesture is the first type of hand gesture and that the first hand gesture is detected while the computer system is operating in the hand gesture navigation mode, displaying, via the display generation component, a menu; and
while the computer system is operating in the hand gesture navigation mode:
displaying, via the display generation component, a user interface that includes a first user interface object, a second user interface object, a third user interface object, and an indication that the first user interface object is selected;
while displaying the user interface that includes the first user interface object, the second user interface object, the third user interface object, and the indication that the first user interface object is selected, detecting, via at least the optical sensor, a second hand gesture; and
in response to detecting, via at least the optical sensor, the second hand gesture:
in accordance with a determination that the second hand gesture is a second type of gesture different from the first type of gesture, displaying, via the display generation component, an indication that the second user interface object is selected;
in accordance with a determination that the second hand gesture is a third type of gesture that is different from the first type of gesture and the second type of gesture, displaying, via the display generation component, an indication that the third user interface object is selected; and
in accordance with a determination that the second hand gesture is a fourth type of gesture that is different from the first type of gesture, the second type of gesture, and the third type of gesture, performing an operation corresponding to selection of the first user interface object.
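Claim 37 recites a small gesture-driven state machine: a first gesture type either initiates the hand gesture navigation mode or, if the mode is already active, displays a menu; while the mode is active, a second and a third gesture type move the selection indication among displayed user interface objects, and a fourth gesture type performs the operation for the currently selected object. The Swift sketch below is one illustrative reading of that control flow. The gesture names, the wrap-around next/previous interpretation of the second and third gesture types, and every identifier are assumptions made for illustration; they are not Apple's implementation, API, or the patent's disclosed embodiment.

```swift
// Minimal sketch of the control flow recited in claim 37.
// All names below are hypothetical, chosen only to mirror the claim text.

enum HandGesture {
    case typeOne    // toggles the mode / opens a menu
    case typeTwo    // moves the selection indication forward (assumption)
    case typeThree  // moves the selection indication backward (assumption)
    case typeFour   // activates the currently selected object
}

struct GestureNavigator {
    var inNavigationMode = false
    var objects: [String]       // first, second, third user interface objects
    var selectedIndex = 0       // the first object starts out selected

    /// Applies one detected hand gesture and returns a description of the
    /// resulting display or operation, mirroring the claim's branches.
    mutating func handle(_ gesture: HandGesture) -> String {
        switch gesture {
        case .typeOne where !inNavigationMode:
            // First-type gesture outside the mode: initiate the mode.
            inNavigationMode = true
            return "initiate hand gesture navigation mode"
        case .typeOne:
            // First-type gesture while in the mode: display a menu.
            return "display menu"
        case .typeTwo where inNavigationMode:
            // Indicate that the next object (the second one) is selected.
            selectedIndex = (selectedIndex + 1) % objects.count
            return "indicate \(objects[selectedIndex]) is selected"
        case .typeThree where inNavigationMode:
            // Indicate that the previous object (wrapping to the third) is selected.
            selectedIndex = (selectedIndex + objects.count - 1) % objects.count
            return "indicate \(objects[selectedIndex]) is selected"
        case .typeFour where inNavigationMode:
            // Perform the operation corresponding to the selected object.
            return "activate \(objects[selectedIndex])"
        default:
            return "ignored outside hand gesture navigation mode"
        }
    }
}

// Worked example: with the first object selected, a second-type gesture
// indicates the second object, a third-type gesture (from the first object)
// would indicate the third, and a fourth-type gesture activates whatever
// object is currently selected.
var navigator = GestureNavigator(objects: ["first object", "second object", "third object"])
print(navigator.handle(.typeOne))   // initiate hand gesture navigation mode
print(navigator.handle(.typeTwo))   // indicate second object is selected
print(navigator.handle(.typeFour))  // activate second object
```

Note that in the claim the three in-mode branches are alternatives triggered by the same second hand gesture while the first object is selected; the sequential calls above simply exercise each branch of the sketch in turn.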