US 12,333,081 B2
Interacting with a machine using gestures in first and second user-specific virtual planes
Avinash Dabir, San Francisco, CA (US); Paul Durdik, Foster City, CA (US); Keith Mertens, Oakland, CA (US); and Michael Zagorsek, Yountville, CA (US)
Assigned to Ultrahaptics IP Two Limited, Bristol (GB)
Filed by Ultrahaptics IP Two Limited, Bristol (GB)
Filed on Aug. 23, 2021, as Appl. No. 17/409,767.
Application 17/409,767 is a continuation of application No. 16/659,468, filed on Oct. 21, 2019, granted, now 11,099,653.
Application 16/659,468 is a continuation of application No. 15/917,066, filed on Mar. 9, 2018, granted, now 10,452,151, issued on Oct. 22, 2019.
Application 15/917,066 is a continuation of application No. 14/262,691, filed on Apr. 25, 2014, granted, now 9,916,009, issued on Mar. 13, 2018.
Claims priority of provisional application 61/816,487, filed on Apr. 26, 2013.
Prior Publication US 2021/0382563 A1, Dec. 9, 2021
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 3/01 (2006.01); G06V 40/10 (2022.01); G06V 40/20 (2022.01)
CPC G06F 3/017 (2013.01) [G06V 40/11 (2022.01); G06V 40/28 (2022.01)] 18 Claims
OG exemplary drawing
 
1. A method of interacting with a machine using input gestures, the method comprising:
detecting a first finger state of a first finger of a hand of a user relative to a first user-specific virtual plane in a region of space monitored by a 3D sensor, the first finger state being one of the first finger moving closer to or further away from the first user-specific virtual plane;
detecting a second finger state of a second finger of the hand of the user relative to a second user-specific virtual plane in the region of space monitored by the 3D sensor, wherein (i) the second user-specific virtual plane is independent of the first user-specific virtual plane and (ii) the first finger and the second finger belong to the same hand of the user;
interpreting the first finger state as a first input gesture command to interact with a first functionality of the machine, wherein the first input gesture command is relative to the first user-specific virtual plane; and
interpreting the second finger state as a second input gesture command to interact with a second functionality of the machine, wherein the second input gesture command is relative to the second user-specific virtual plane.
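The claim above recites the method in functional terms only and does not prescribe an implementation. Purely as an illustrative sketch of the claimed idea, the snippet below models two independent user-specific virtual planes, classifies each finger's motion as moving closer to or further away from its plane, and maps each state to a machine command. The class and function names (VirtualPlane, finger_state, interpret), the noise threshold, and the command strings are all hypothetical choices, not taken from the patent; a real system would obtain fingertip positions from the 3D sensor referenced in the claim.

```python
"""Hypothetical sketch of per-finger gestures against independent virtual planes."""
from dataclasses import dataclass
import numpy as np


@dataclass
class VirtualPlane:
    """A user-specific virtual plane defined by a point on the plane and a unit normal."""
    point: np.ndarray   # any point on the plane, in 3D-sensor coordinates
    normal: np.ndarray  # unit-length normal vector

    def signed_distance(self, position: np.ndarray) -> float:
        # Positive on the side the normal points toward, negative on the other side.
        return float(np.dot(position - self.point, self.normal))


def finger_state(plane: VirtualPlane,
                 prev_pos: np.ndarray,
                 curr_pos: np.ndarray,
                 eps: float = 1e-3) -> str | None:
    """Classify a finger as 'closer' or 'further' relative to its plane.

    Returns None when the change in distance is below the noise threshold eps.
    """
    delta = abs(plane.signed_distance(curr_pos)) - abs(plane.signed_distance(prev_pos))
    if delta < -eps:
        return "closer"
    if delta > eps:
        return "further"
    return None


# Hypothetical mapping of (finger, state) pairs to first/second machine functionality.
COMMANDS = {
    ("index", "closer"): "activate_first_functionality",
    ("index", "further"): "release_first_functionality",
    ("middle", "closer"): "activate_second_functionality",
    ("middle", "further"): "release_second_functionality",
}


def interpret(finger: str, state: str | None) -> str | None:
    """Interpret a finger state as an input gesture command, if any."""
    return COMMANDS.get((finger, state)) if state else None


# Example: the index finger drifts toward its plane between two sensor frames.
plane_a = VirtualPlane(point=np.array([0.0, 0.0, 0.30]), normal=np.array([0.0, 0.0, 1.0]))
state = finger_state(plane_a,
                     prev_pos=np.array([0.0, 0.0, 0.40]),
                     curr_pos=np.array([0.0, 0.0, 0.35]))
print(interpret("index", state))  # -> "activate_first_functionality"
```

In this sketch the two planes are kept entirely separate, mirroring limitation (i) of the claim: each finger of the same hand is compared only against its own plane, and each resulting state drives its own command.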