US 12,449,914 B2
Interacting with a smart device using a pointing controller
Nathaniel James Martin, London (GB); Charles James Bruce, London (GB); and Lewis Antony Jones, London (GB)
Assigned to ARKH LITHO HOLDINGS, LLC, Dallas, TX (US)
Filed by ARKH Litho Holdings, LLC, Dallas, TX (US)
Filed on Apr. 22, 2024, as Appl. No. 18/641,937.
Application 18/641,937 is a continuation of application No. 17/432,028, granted, now 11,989,355, previously published as PCT/IB2020/051292, filed on Feb. 15, 2020.
Claims priority of provisional application 62/807,094, filed on Feb. 18, 2019.
Prior Publication US 2024/0411385 A1, Dec. 12, 2024
Int. Cl. G06F 3/0346 (2013.01); G06F 3/01 (2006.01); G06F 3/0482 (2013.01); G06F 3/04847 (2022.01)
CPC G06F 3/0346 (2013.01) [G06F 3/017 (2013.01); G06F 3/0482 (2013.01); G06F 3/04847 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method for controlling interactions with a smart device using a pointing controller, the method comprising:
capturing image data of an object within an environment by a camera of a tracking device;
detecting, from the image data, a location of the object within the environment;
directing, via a user interface of the tracking device, a user to point to the object using the pointing controller;
obtaining inertial sensor data from a state sensing device of the pointing controller;
detecting, based on the inertial sensor data of the state sensing device of the pointing controller, when a pitch of the pointing controller matches a pitch vector between the tracking device and the location of the object and when the pointing controller is still;
upon detecting from the inertial sensor data that the pitch of the pointing controller matches the pitch vector between the tracking device and the location of the object and that the pointing controller is still, performing a yaw calibration of the pointing controller to initialize a reference yaw of the pointing controller relative to yaw of the camera of the tracking device;
tracking, based on the inertial sensor data, changes in yaw of the pointing controller relative to the reference yaw and tracking changes in roll and pitch relative to a detected direction of gravity;
tracking movement of a pointing vector through a three-dimensional space based on the changes in yaw, roll, and pitch and a stored arm model;
detecting, based on the movement, an intersection of the pointing vector with coordinates in the three-dimensional space associated with the smart device to place the smart device in a selected state;
causing an augmented reality display device to display a virtual menu associated with the smart device;
detecting a control interaction with the pointing controller associated with the virtual menu when the smart device is in the selected state; and
generating a command to control an operation of the smart device based on the control interaction.
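The calibration and selection steps of the claim can be sketched in code. The following is a minimal illustrative sketch only, not the patented implementation: the function names, thresholds, units (radians, rad/s), and the coordinate convention (x forward, y left, z up) are all assumptions introduced here, and the arm model, gravity tracking, and menu interaction steps are omitted.

```python
import math

def calibration_ready(controller_pitch, target_pitch, angular_speed,
                      pitch_tolerance=0.03, still_threshold=0.01):
    """True when the controller's pitch matches the pitch vector to the
    detected object and the controller is still -- the claimed trigger
    condition for initializing the reference yaw (thresholds hypothetical)."""
    return (abs(controller_pitch - target_pitch) < pitch_tolerance
            and abs(angular_speed) < still_threshold)

def pointing_vector(yaw, pitch):
    """Unit pointing vector from yaw and pitch; roll does not change the ray.
    Yaw is measured relative to the reference yaw set at calibration."""
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))

def intersects(origin, direction, target, radius):
    """True if the ray from `origin` along unit vector `direction` passes
    within `radius` of `target` -- a simple ray-versus-point selection test
    standing in for intersection with the smart device's coordinates."""
    to_target = [t - o for t, o in zip(target, origin)]
    # Project the origin-to-target vector onto the ray direction.
    t = sum(a * b for a, b in zip(to_target, direction))
    if t < 0:
        return False  # target lies behind the controller
    closest = [o + t * d for o, d in zip(origin, direction)]
    return math.dist(closest, target) <= radius
```

For example, a controller at the origin pointing straight ahead (yaw 0, pitch 0) would select a device at (2, 0, 0) within a 0.2 m selection radius, placing it in the selected state before the virtual menu is displayed.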