US 12,287,916 B2
Content movement and interaction using a single controller
Karen Stolzenberg, Venice, CA (US); Marc Alan McCall, Plantation, FL (US); Frank Alexander Hamilton, IV, Martinsburg, WV (US); Cole Parker Heiner, Vista, CA (US); John Austin Day, Miami, FL (US); and Eric Norman Yiskis, Boca Raton, FL (US)
Assigned to MAGIC LEAP, INC., Plantation, FL (US)
Filed by Magic Leap, Inc., Plantation, FL (US)
Filed on Mar. 15, 2023, as Appl. No. 18/184,586.
Application 18/184,586 is a continuation of application No. 17/673,333, filed on Feb. 16, 2022, granted, now 11,619,996.
Application 17/673,333 is a continuation of application No. 17/153,201, filed on Jan. 20, 2021, granted, now 11,294,461, issued on Apr. 5, 2022.
Claims priority of provisional application 62/965,708, filed on Jan. 24, 2020.
Prior Publication US 2023/0214015 A1, Jul. 6, 2023
Int. Cl. G06F 3/01 (2006.01); G02B 27/00 (2006.01); G02B 27/01 (2006.01); G06F 3/0346 (2013.01)
CPC G06F 3/013 (2013.01) [G02B 27/0093 (2013.01); G02B 27/0101 (2013.01); G02B 27/017 (2013.01); G02B 27/0179 (2013.01); G06F 3/012 (2013.01); G06F 3/016 (2013.01); G06F 3/017 (2013.01); G06F 3/0346 (2013.01); G02B 2027/0138 (2013.01); G02B 2027/014 (2013.01); G02B 2027/0187 (2013.01)] 20 Claims
OG exemplary drawing
 
1. An augmented reality (AR) system, comprising:
an AR display configured to present virtual content to a user of the AR system;
one or more sensors configured to sense movement of the user; and
a hardware processor in communication with the AR display and the one or more sensors, the hardware processor programmed to:
receive a first indication via the one or more sensors;
determine, based on the first indication received via the one or more sensors, a first pointing vector of a handheld controller held by the user, the first pointing vector indicating a first direction within a three-dimensional (3D) environment;
determine, based on the first pointing vector, a first location within the 3D environment that is pointed to by the handheld controller;
display on the AR display a first interactive content object at the first location in the 3D environment, wherein the first interactive content object is a control menu with which the user can select, highlight, or otherwise interact, including with information associated with the control menu, or a prism that includes one or more interactive virtual objects;
receive a second indication via the one or more sensors;
determine, based on the second indication received via the one or more sensors, a second pointing vector of the handheld controller held by the user, the second pointing vector indicating a second direction within the 3D environment;
determine a second location within the 3D environment of the user based on the second pointing vector, wherein the second location is different from the first location;
determine a bounded area or volume associated with the first interactive content object;
determine whether the second pointing vector of the handheld controller intersects the bounded area or volume associated with the first interactive content object;
if the second pointing vector of the handheld controller intersects the bounded area or volume associated with the first interactive content object, select content within the control menu or at least one of the one or more interactive virtual objects included in the prism; and
if the second pointing vector of the handheld controller does not intersect the bounded area or volume associated with the first interactive content object, display on the AR display the first interactive content object at the second location in the 3D environment of the user.
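The geometric core of the claim, deriving the controller's pointing vector from its sensed pose and projecting it to a location in the 3D environment, can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: NumPy, a (w, x, y, z) unit-quaternion orientation, a local -Z forward axis, and a fixed 1.5 m placement distance are all assumptions.

import numpy as np

def quat_rotate(q, v):
    # Rotate vector v by unit quaternion q = (w, x, y, z).
    w, x, y, z = q
    u = np.array([x, y, z])
    return 2.0 * np.dot(u, v) * u + (w * w - np.dot(u, u)) * v + 2.0 * w * np.cross(u, v)

def pointing_vector(controller_orientation):
    # The pointing vector is the controller's local forward axis (assumed
    # here to be -Z) expressed in world coordinates.
    return quat_rotate(controller_orientation, np.array([0.0, 0.0, -1.0]))

def pointed_location(controller_position, controller_orientation, distance=1.5):
    # Project the pointing vector out to an assumed placement distance to
    # obtain a world-space location for the interactive content object.
    direction = pointing_vector(controller_orientation)
    return controller_position + distance * direction / np.linalg.norm(direction)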
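The claim's "bounded area or volume" check is a standard ray-versus-bounds query. The sketch below assumes an axis-aligned bounding box (AABB) and uses the slab method to decide whether a pointing vector, treated as a ray from the controller, intersects the object's bounds; the AABB representation is an assumption, since the claim does not fix the shape of the bounds.

def ray_intersects_aabb(origin, direction, box_min, box_max):
    # Slab method: the ray hits the box iff the parameter intervals in
    # which it lies between each pair of parallel planes all overlap.
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        if abs(direction[axis]) < 1e-9:
            # Ray parallel to this slab: it must already lie between the planes.
            if not (box_min[axis] <= origin[axis] <= box_max[axis]):
                return False
        else:
            t1 = (box_min[axis] - origin[axis]) / direction[axis]
            t2 = (box_max[axis] - origin[axis]) / direction[axis]
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
            if t_near > t_far:
                return False
    return True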
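The final two conditional limitations then reduce to a select-or-move dispatch on the intersection result, using the helpers sketched above. The content object and its bounds_min, bounds_max, select_at, and move_to members are hypothetical names introduced only for illustration.

def handle_second_indication(controller_position, controller_orientation, content):
    # content is a hypothetical object exposing its bounded volume
    # (bounds_min, bounds_max) and select_at / move_to operations.
    direction = pointing_vector(controller_orientation)
    if ray_intersects_aabb(controller_position, direction,
                           content.bounds_min, content.bounds_max):
        # Second pointing vector intersects the bounded volume: treat the
        # input as a selection within the control menu or prism.
        content.select_at(controller_position, direction)
    else:
        # Otherwise, move the content object to the newly pointed-to
        # second location in the 3D environment.
        content.move_to(pointed_location(controller_position, controller_orientation))

Branching on whether the same pointing gesture lands inside the object's bounds is what lets a single controller both reposition content and interact with it.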