US 10,802,580 C1 (12,391st)
Technique for controlling virtual image generation system using emotional states of user
George Alistair Sanger, Coronado, CA (US); Samuel A. Miller, Hollywood, FL (US); and Graeme John Devine, Rockwall, TX (US)
Filed by MAGIC LEAP, INC., Plantation, FL (US)
Assigned to MAGIC LEAP, INC., Plantation, FL (US)
Reexamination Request No. 90/019,123, Oct. 17, 2022.
Reexamination Certificate for Patent 10,802,580, issued Oct. 13, 2020, Appl. No. 16/713,434, Dec. 13, 2019.
Application 16/713,434 is a continuation of application No. 15/655,563, filed on Jul. 20, 2017, granted, now Pat. No. 10,540,004.
Claims priority of provisional application 62/364,957, filed on Jul. 21, 2016.
Ex Parte Reexamination Certificate issued on Sep. 6, 2023.
Int. Cl. A63F 13/212 (2014.01); G06F 3/01 (2006.01); G06V 20/20 (2022.01); G06F 3/03 (2006.01); G06V 40/16 (2022.01); G02B 27/01 (2006.01); A63F 13/52 (2014.01); A63F 13/822 (2014.01); A63F 13/65 (2014.01); G06F 16/58 (2019.01); A63F 13/21 (2014.01); G06T 19/00 (2011.01); G06F 16/56 (2019.01)
CPC G06F 3/012 (2013.01) [A63F 13/21 (2014.09); A63F 13/212 (2014.09); A63F 13/52 (2014.09); A63F 13/65 (2014.09); A63F 13/822 (2014.09); G02B 27/017 (2013.01); G06F 3/011 (2013.01); G06F 3/0304 (2013.01); G06F 16/56 (2019.01); G06F 16/5866 (2019.01); G06T 19/006 (2013.01); G06V 20/20 (2022.01); G06V 40/175 (2022.01)]
OG exemplary drawing
AS A RESULT OF REEXAMINATION, IT HAS BEEN DETERMINED THAT:
Claims 1, 2, 4, 6, 8 and 13-16 are determined to be patentable as amended.
Claims 3, 5, 7, 9-12 and 17-20, dependent on an amended claim, are determined to be patentable.
New claims 21-32 are added and determined to be patentable.
1. A method of operating an augmented reality (AR) display system that includes a wearable display device, the method comprising:
presenting an interactive three-dimensional [ (3D) ] environment to a user [ at least by presenting, with a projection subsystem of the wearable display device, an eye of the user with light beams that represent at least one virtual object ] through the wearable display device, the interactive three-dimensional environment comprising a real-world scene and [ the ] at least one virtual object that is generated by the AR display system and presented within the real-world scene;
[ determining an objective for a plurality of emotional states in the user and a threshold duration of time;
performing a first action that presents a stimulus to the user based at least in part upon the objective;]
responsive to presenting a [ the ] stimulus in the three-dimensional environment, sensing at least one biometric parameter of the user, the at least one biometric parameter [ of a plurality of biometric parameters ] sensed using at least one sensor [ of a plurality of sensors ] that is attached to the wearable display device;
generating biometric data for each of the at least one sensed biometric parameter [ that has been sensed using the at least one sensor ];
determining at least one emotional state of the user based on the biometric data for each of the at least one sensed biometric parameter; and
[ determining whether the at least one emotional state is consistent with the objective for at least the threshold duration of time, wherein the objective comprises evoking one or more desired emotional states within the plurality of emotional states for the user;
modifying the at least one virtual object with which the user interacts at least by changing a characteristic pertaining to presenting the at least one virtual object to the user based at least in part upon whether the at least one emotional state is consistent with the objective for the threshold duration of time; and
determining whether the biometric data is to be correlated with the at least one emotional state based at least in part upon whether one or more characteristics of the biometric data satisfy the threshold duration of time; and]
performing an [ a separate ] action through the AR display system based at least partly on determining that the user is in the at least one emotional state, wherein the [ separate ] action includes a presentation of at least one additional virtual object that is selected to evoke another emotional state in the user different from the at least one emotional state of the user.
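The control flow recited in claim 1 (sense a biometric parameter, determine an emotional state, and test whether that state stays consistent with the objective for the threshold duration) can be sketched as follows. All function names, sample shapes, and state labels are hypothetical illustrations; the claim does not specify any particular implementation.

```python
def objective_sustained(samples, objective_states, threshold_s):
    """Return True if some emotional state in objective_states is held
    continuously for at least threshold_s seconds.

    samples: time-ordered list of (timestamp_seconds, state) pairs, where
    each state is the emotion determined from one round of biometric data.
    """
    run_start = None  # timestamp when the current consistent run began
    for t, state in samples:
        if state in objective_states:
            if run_start is None:
                run_start = t
            if t - run_start >= threshold_s:
                return True  # objective met for the threshold duration
        else:
            run_start = None  # consistency broken; restart the tracked run
    return False
```

A caller would branch on the result, e.g. making the virtual object more available and presenting an additional object selected to evoke a different emotional state when the function returns True, and making the object less available otherwise (claims 21 and 26).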
2. The method of claim 1, further comprising determining the at least one emotional state of the user for a [ tracked ] duration, wherein the action is performed through the AR display system based at least partially on determining that the user is in the at least one emotional state for the [ tracked ] duration.
4. The method of claim 1, wherein the at least one biometric parameter of the user is sensed at a plurality of different times in response to the [ first action ] presenting the stimulus, and the biometric data is generated at these different times.
6. The method of claim 1, wherein the at least one sensed biometric parameter comprises a plurality of different sensed biometric parameters, and
wherein determining the at least one specific emotional state of the user comprises performing a pattern recognition analysis on the generated biometric data.
8. The method of claim 7, wherein
the [ biometric data comprises at ] least one facial expression is [ indicating a smile and ] one or both of an attitude of the mouth and crow's feet around the eyes of the end user, and
wherein the at least one emotional state comprises [ is determined to be ] happiness.
13. The method of claim 12, wherein the generated biometric data for one of the at least one sensed biometric parameter is a biometric scalar data value, the reference biometric data comprises a reference biometric value range, and comparing the generated biometric data to the reference biometric data comprises determining whether the biometric scalar data value falls within the reference biometric value range.
14. The method of claim 12, wherein the generated biometric data for one of the at least one sensed biometric parameter is a biometric multi-dimensional data vector, the reference biometric data comprises a reference biometric multi-dimensional data vector, and comparing the generated biometric data to the reference biometric data comprises performing a correlation function between the generated biometric multi-dimensional data vector and the reference biometric multi-dimensional data vector.
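The two comparison forms recited in claims 13 and 14 (a scalar value tested against a reference range, and a multi-dimensional vector tested with a correlation function against a reference vector) can be sketched as below. The function names and the acceptance threshold `min_corr` are assumptions for illustration; the claims name a correlation function without fixing one, and Pearson correlation is used here as one example.

```python
import math

def matches_scalar(value, ref_range):
    # Claim-13 style: does the biometric scalar fall within the reference range?
    lo, hi = ref_range
    return lo <= value <= hi

def matches_vector(vec, ref_vec, min_corr=0.8):
    # Claim-14 style: Pearson correlation between the sensed vector and the
    # reference vector, accepted when it meets an assumed threshold.
    n = len(vec)
    mx, my = sum(vec) / n, sum(ref_vec) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(vec, ref_vec))
    sx = math.sqrt(sum((a - mx) ** 2 for a in vec))
    sy = math.sqrt(sum((b - my) ** 2 for b in ref_vec))
    return cov / (sx * sy) >= min_corr
```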
15. The method of claim 12, wherein determining the at least one emotional state of the user further comprises retrieving the reference biometric data from a custom emotional state profile of the end user.
16. The method of claim 1, wherein presenting the three-dimensional environment to the user through the wearable display device comprises rendering a plurality of synthetic image frames of a three-dimensional environment, and sequentially displaying the [ plurality of ] synthetic image frames to the user through [ the ] wearable display device,
wherein the [ plurality of ] synthetic image frames are projected from a transparent display surface in the [ to a ] field of view of the user via a frame structure mounted to the [ a ] head of the user, and
wherein the [ plurality of ] synthetic image frames are superimposed over a real scene visualized by the user.
[ 21. The method of claim 1, wherein modifying the at least one virtual object comprises:
upon a first determination result that the at least one emotional state is consistent with the objective for at least the threshold duration of time, modifying the at least one virtual object at least by making the virtual object more available to the user; and
upon a second determination result that the at least one emotional state is not consistent with the objective for at least the threshold duration of time, modifying the at least one virtual object at least by making the virtual object less available to the user.]
[ 22. The method of claim 1, wherein determining whether the biometric data is to be correlated with the at least one emotional state comprises:
tracking the one or more characteristics pertaining to the at least one emotional state to generate a tracked duration of time; and
determining whether the one or more characteristics of the biometric data satisfy the threshold duration of time at least by comparing the tracked duration of time to the threshold duration of time.]
[ 23. The method of claim 22, wherein determining whether the biometric data is to be correlated with the at least one emotional state further comprises:
upon a determination that the biometric data is to be correlated with the at least one emotional state, correlating the biometric data with the at least one emotional state based at least in part upon the threshold duration of time and one or more characteristics of the biometric data, wherein
the biometric data is not correlated with the at least one emotional state upon a determination that the one or more characteristics of the biometric data do not satisfy the threshold duration of time.]
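The gating recited in claims 22 and 23 (track a duration for the characteristic in the biometric data, compare it to the threshold duration, and correlate the data with the emotional state only when the threshold is satisfied) can be sketched as follows. The sample format and the convention that a nonzero value means the characteristic is present are assumptions for illustration only.

```python
def correlate_if_sustained(biometric_samples, state, threshold_s):
    """Return (state, tracked_duration) when the characteristic persists long
    enough to be correlated with the emotional state, else None.

    biometric_samples: time-ordered list of (timestamp_seconds, value) pairs;
    a nonzero value means the characteristic (e.g. a smile) is present.
    """
    tracked = 0.0      # longest continuous run of the characteristic so far
    run_start = None   # timestamp when the current run began
    for t, value in biometric_samples:
        if value:
            if run_start is None:
                run_start = t
            tracked = max(tracked, t - run_start)
        else:
            run_start = None
    if tracked >= threshold_s:
        return (state, tracked)  # threshold satisfied: correlate
    return None                  # threshold not satisfied: do not correlate
```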
[ 24. The method of claim 1, wherein the characteristic pertaining to presenting the at least one virtual object comprises a tempo pertaining to a presentation of the at least one virtual object to the user.]
[ 25. The method of claim 1, wherein the characteristic pertaining to presenting the at least one virtual object comprises a frequency pertaining to a presentation of the at least one virtual object to the user.]
[ 26. The method of claim 1, wherein determining whether the at least one emotional state is consistent with the objective comprises:
determining whether the biometric data sensed by the at least one sensor indicates that the user is in one of the one or more desired emotional states.]
[ 27. The method of claim 1, wherein determining whether the at least one emotional state is consistent with the objective further comprises:
determining whether the user is experiencing happiness for the threshold duration of time based at least in part upon the one or more characteristics of the biometric data and the threshold duration of time.]
[ 28. The method of claim 27, wherein determining whether the user is experiencing happiness comprises:
determining a correlation coefficient range from a plurality of correlation coefficient ranges to which a second value in the biometric data belongs, wherein the second value indicates a second extent of crow's feet and is associated with a second weight in determining that the at least one emotional state indicates that the user is experiencing happiness for the duration of time.]
[ 29. The method of claim 28, wherein determining whether the at least one emotional state is consistent with the objective further comprises:
determining a distance range from a plurality of distance ranges to which a value in the biometric data belongs, wherein the value indicates a first extent of a smile and is associated with a first weight in determining that the at least one emotional state indicates that the user is experiencing happiness for the threshold duration of time; and
determining that the at least one emotional state indicates that the user is experiencing happiness based at least in part upon the distance range, the correlation coefficient range, and the duration of time.]
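The banded, weighted determination recited in claims 28 and 29 (a first value mapped to one of a plurality of distance ranges with a first weight, a second value mapped to one of a plurality of correlation coefficient ranges with a second weight, and happiness determined from both together with the duration) can be sketched as below. Every numeric range, weight, and cutoff here is an assumption; the claims recite the structure but no specific numbers.

```python
def pick_band(value, bands):
    # bands: list of ((lo, hi), score); return the score of the band
    # containing value, or 0.0 if no band matches.
    for (lo, hi), score in bands:
        if lo <= value < hi:
            return score
    return 0.0

def is_happy(smile_mm, crows_feet_corr, held_s, threshold_s,
             smile_weight=0.6, crows_weight=0.4):
    # Assumed bands: mouth-corner distance (extent of smile) and a
    # crow's-feet correlation coefficient, each scored by range.
    smile_bands = [((0, 5), 0.2), ((5, 15), 0.6), ((15, 100), 1.0)]
    corr_bands = [((0.0, 0.3), 0.2), ((0.3, 0.7), 0.6), ((0.7, 1.01), 1.0)]
    score = (smile_weight * pick_band(smile_mm, smile_bands)
             + crows_weight * pick_band(crows_feet_corr, corr_bands))
    # Happiness requires both a sufficient weighted score and the
    # threshold duration (an assumed 0.6 cutoff on the score).
    return score >= 0.6 and held_s >= threshold_s
```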
[ 30. The method of claim 27, wherein determining whether the at least one emotional state is consistent with the objective further comprises:
determining whether the one or more characteristics of the biometric data indicate presence of crow's feet; and
determining whether the one or more characteristics of the biometric data last for at least the threshold duration of time.]
[ 31. The method of claim 29, further comprising:
in response to a first determination result that the one or more characteristics of the biometric data do not indicate the presence of crow's feet or a second determination result that the one or more characteristics of the biometric data do not last for at least the threshold duration of time, determining that the biometric data is not to be correlated with the at least one emotional state.]
[ 32. The method of claim 29, wherein the biometric data is determined to be correlated with the at least one emotional state when it is determined that the one or more characteristics of the biometric data indicate the presence of crow's feet and the one or more characteristics of the biometric data last for at least the threshold duration of time.]
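The correlation gate recited in claims 30 through 32 reduces to a conjunction: the biometric data is correlated with the emotional state only when the characteristics indicate the presence of crow's feet and persist for at least the threshold duration. A minimal sketch, with hypothetical names:

```python
def should_correlate(has_crows_feet, held_s, threshold_s):
    # Claims 30-32 gate: correlate only when crow's feet are present
    # AND the characteristics last at least the threshold duration;
    # either failing condition means no correlation (claim 31).
    return has_crows_feet and held_s >= threshold_s
```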