US 12,189,853 B2
Object engagement based on finger manipulation data and untethered inputs
Adam G. Poulos, Saratoga, CA (US); Aaron M. Burns, Sunnyvale, CA (US); Arun Rakesh Yoganandan, San Francisco, CA (US); Benjamin R. Blachnitzky, San Francisco, CA (US); and Nicolai Georg, San Francisco, CA (US)
Assigned to APPLE INC., Cupertino, CA (US)
Filed by Apple Inc., Cupertino, CA (US)
Filed on Mar. 11, 2024, as Appl. No. 18/601,376.
Application 18/601,376 is a continuation of application No. 18/114,492, filed on Feb. 27, 2023, now Pat. No. 11,966,510.
Application 18/114,492 is a continuation of application No. PCT/US2021/043312, filed on Jul. 27, 2021.
Claims priority of provisional application 63/107,287, filed on Oct. 29, 2020.
Claims priority of provisional application 63/072,795, filed on Aug. 31, 2020.
Prior Publication US 2024/0211044 A1, Jun. 27, 2024
Int. Cl. G06F 3/01 (2006.01)
CPC G06F 3/014 (2013.01) [G06F 3/013 (2013.01); G06F 3/017 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising:
at an electronic device with one or more processors, a non-transitory memory, a display, and a communication interface provided to communicate with a finger-wearable device and with a physical proxy object:
displaying, on the display, a plurality of computer-generated objects;
obtaining proxy object manipulation data from the physical proxy object via the communication interface, wherein the proxy object manipulation data corresponds to sensor data associated with one or more sensors integrated in the physical proxy object;
obtaining finger manipulation data from the finger-wearable device via the communication interface; and
registering an engagement event with respect to a first one of the plurality of computer-generated objects based on at least a portion of the proxy object manipulation data and the finger manipulation data.
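
Claim 1, in effect, fuses two input streams — sensor data relayed by the physical proxy object and finger manipulation data from the finger-wearable device — to decide whether an engagement event should be registered against a displayed computer-generated object. The following is a minimal Swift sketch of that flow, illustrative only and not Apple's implementation; every type name, field, and threshold in it (ProxyObjectManipulationData, pinchStrength, the 0.8 cutoff, and so on) is a hypothetical stand-in for details the claim leaves open.

    import Foundation

    // All names and fields below are hypothetical; the patent does not specify them.

    // Sensor data relayed by the physical proxy object over the communication interface.
    struct ProxyObjectManipulationData {
        let objectID: UUID                                  // which proxy object reported
        let contactDetected: Bool                           // e.g. an integrated touch/pressure sensor
        let orientation: (x: Double, y: Double, z: Double)
    }

    // Data reported by the finger-wearable device.
    struct FingerManipulationData {
        let pinchStrength: Double                           // 0.0 ... 1.0
        let fingertipPressure: Double
    }

    // One of the displayed computer-generated objects.
    struct ComputerGeneratedObject {
        let id: UUID
        var isEngaged = false
    }

    // Combines both streams and registers an engagement event against a target object.
    final class EngagementRegistrar {
        private(set) var objects: [ComputerGeneratedObject]

        init(objects: [ComputerGeneratedObject]) {
            self.objects = objects
        }

        // Registers an engagement event when the proxy object reports contact and the
        // finger-wearable device reports a strong pinch; the 0.8 threshold is arbitrary.
        func registerEngagementIfNeeded(proxy: ProxyObjectManipulationData,
                                        finger: FingerManipulationData,
                                        targetID: UUID) -> Bool {
            guard proxy.contactDetected,
                  finger.pinchStrength > 0.8,
                  let index = objects.firstIndex(where: { $0.id == targetID })
            else { return false }
            objects[index].isEngaged = true
            return true
        }
    }

    // Usage: one target object, one matched pair of readings.
    let target = ComputerGeneratedObject(id: UUID())
    let registrar = EngagementRegistrar(objects: [target])
    let engaged = registrar.registerEngagementIfNeeded(
        proxy: ProxyObjectManipulationData(objectID: UUID(),
                                           contactDetected: true,
                                           orientation: (x: 0, y: 0, z: 0)),
        finger: FingerManipulationData(pinchStrength: 0.9, fingertipPressure: 0.4),
        targetID: target.id)
    print(engaged)  // true

Registering the event only when both streams satisfy their conditions mirrors the claim's requirement that the engagement event be based on at least a portion of both the proxy object manipulation data and the finger manipulation data.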