US 11,327,312 C1 (12,523rd)
Imaging modification, display and visualization using augmented and virtual reality eyewear
Nastasja U. Robaina, Coconut Grove, FL (US); Nicole Elizabeth Samec, Fort Lauderdale, FL (US); Christopher M. Harrises, Nashua, NH (US); Rony Abovitz, Weston, FL (US); Mark Baerenrodt, Millbrae, CA (US); and Brian Lloyd Schmidt, Bellevue, WA (US)
Filed by Magic Leap, Inc., Plantation, FL (US)
Assigned to MAGIC LEAP, INC., Plantation, FL (US)
Reexamination Request No. 90/019,137, Dec. 1, 2022.
Reexamination Certificate for Patent 11,327,312, issued May 10, 2022, Appl. No. 16/790,576, Feb. 13, 2020.
Application 90/019,137 is a continuation of application No. 15/657,589, filed on Jul. 24, 2017, granted, now 10,838,210.
Claims priority of provisional application 62/366,599, filed on Jul. 25, 2016.
Claims priority of provisional application 62/396,071, filed on Sep. 16, 2016.
Claims priority of provisional application 62/440,332, filed on Dec. 29, 2016.
Ex Parte Reexamination Certificate issued on Feb. 21, 2024.
Int. Cl. G02B 27/01 (2006.01); A61B 17/00 (2006.01); A61B 34/00 (2016.01); A61B 34/20 (2016.01); A61B 90/00 (2016.01); A61B 90/50 (2016.01); G06F 3/01 (2006.01); G06F 3/03 (2006.01); G06T 19/00 (2011.01)
CPC G02B 27/0172 (2013.01) [A61B 34/25 (2016.02); A61B 90/36 (2016.02); A61B 90/37 (2016.02); G02B 27/01 (2013.01); G06F 3/011 (2013.01); G06F 3/012 (2013.01); G06F 3/013 (2013.01); G06F 3/017 (2013.01); G06F 3/0304 (2013.01); G06T 19/006 (2013.01); A61B 2017/00203 (2013.01); A61B 2017/00207 (2013.01); A61B 2017/00216 (2013.01); A61B 2034/2048 (2016.02); A61B 2034/254 (2016.02); A61B 2034/258 (2016.02); A61B 2090/365 (2016.02); A61B 2090/367 (2016.02); A61B 2090/368 (2016.02); A61B 2090/371 (2016.02); A61B 2090/372 (2016.02); A61B 2090/378 (2016.02); A61B 2090/502 (2016.02)]
OG exemplary drawing
AS A RESULT OF REEXAMINATION, IT HAS BEEN DETERMINED THAT:
Claims 11-13 and 26-31 are cancelled.
Claims 1-10, 14-25, 32 and 33 are determined to be patentable as amended.
New claims 34-44 are added and determined to be patentable.
1. A head-mounted display system configured to project light to an eye of a user to display augmented reality image content in a vision field of said user, [ the vision field of said user being a part of a field of regard surrounding said user and capable of being perceived by said user, ] the vision field having a central region and a peripheral region disposed about said central region, said head-mounted display system comprising:
a frame configured to be supported on a head of the user;
a head-mounted display disposed on the frame, said [ head-mounted ] display configured to project light into said user's eye to display augmented reality image content to the user's vision field, at least a portion of said [ head-mounted ] display being transparent and disposed at a location in front of the user's eye when the user wears said head-mounted display such that said transparent portion transmits light from a portion of the [ an ] environment in front of the user and said head-mounted display to the user's eye to provide a view of said portion of the environment in front of the user and said head-mounted display;
one or more user sensors configured to sense the user;
one or more environmental sensors configured to sense surroundings of the user [ , including one or more objects or events in the field of regard surrounding said user but outside the vision field of said user] ;
processing electronics in communication with the [ head-mounted ] display, the one or more user sensors, and the one or more environmental sensors, the processing electronics configured to:
sense a situation involving increased user focus; and
based at least in part on sensing the increased [ user ] focus on a real or virtual peripheral object in the peripheral region of the user's vision field, de-emphasize image content in a portion of the vision field other than the peripheral object, wherein sensing the increased user focus on the peripheral object is based at least in part on the user's eye squinting at the peripheral object or fluctuations in accommodation of the user's eye when directed at the peripheral object [ ; and
alert said user of an object or an event occurring in the field of regard but outside the vision field of said user, wherein alerting said user of an object or an event occurring in the field of regard but outside the vision field of said user includes displaying an image of the object or the event outside the vision field of said user in the vision field of said user] .
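The processing-electronics limitations of amended claim 1 describe a per-frame control loop: sense increased user focus on a peripheral object from eye squinting or fluctuations in accommodation, de-emphasize image content elsewhere in the vision field, and alert the user to an object or event outside the vision field by displaying an image of it inside the vision field. A minimal Python sketch of that loop follows; the sensor fields, thresholds, and directive names are illustrative assumptions, not terms taken from the patent.

```python
from dataclasses import dataclass
from statistics import pstdev

# Hypothetical per-frame readings from the user-facing sensors
# ("one or more user sensors configured to sense the user").
@dataclass
class EyeSample:
    eyelid_aperture_mm: float        # smaller values suggest squinting
    accommodation_diopters: float    # e.g. from an autorefraction-style measurement
    gaze_target_id: str              # object the gaze ray currently intersects

SQUINT_APERTURE_MM = 6.0             # illustrative threshold
ACCOMMODATION_JITTER_D = 0.25        # illustrative fluctuation threshold

def increased_focus_on(samples, object_id):
    """Sense 'increased user focus' on a peripheral object (claim 1):
    squinting at it, or fluctuating accommodation while looking at it."""
    on_target = [s for s in samples if s.gaze_target_id == object_id]
    if not on_target:
        return False
    squinting = any(s.eyelid_aperture_mm < SQUINT_APERTURE_MM for s in on_target)
    jitter = pstdev(s.accommodation_diopters for s in on_target) if len(on_target) > 1 else 0.0
    return squinting or jitter > ACCOMMODATION_JITTER_D

def frame_directives(samples, peripheral_object_id, out_of_view_events):
    """Return rendering directives for one frame of the head-mounted display."""
    directives = []
    if increased_focus_on(samples, peripheral_object_id):
        # De-emphasize image content in the portion of the vision field
        # other than the attended peripheral object.
        directives.append(("deemphasize_all_except", peripheral_object_id))
    for event in out_of_view_events:
        # Alert the user by displaying an image of the out-of-view object or
        # event inside the vision field.
        directives.append(("display_alert_image", event))
    return directives
```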
2. The [ head-mounted ] display system of claim 1, wherein the processing electronics are configured to de-emphasize the image content in the portion of the vision field other than the peripheral object by altering at least one of contrast, opacity, color, color saturation, color balance, size, background, brightness, edges, or sharpness of the image content in the portion of the vision field other than the peripheral object.
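Claim 2 lists the display attributes that may be altered to achieve the de-emphasis (contrast, opacity, color, color saturation, color balance, size, background, brightness, edges, sharpness). One hypothetical realization scales a subset of those attributes on every render layer except the attended object; the layer dictionary and the 0-1 scaling factor below are assumptions for illustration.

```python
def deemphasize(layers, keep_id, factor=0.4):
    """Reduce contrast, opacity, saturation, brightness, and size of all
    render layers other than `keep_id` by the given factor."""
    for layer in layers:
        if layer["id"] == keep_id:
            continue
        for attr in ("contrast", "opacity", "saturation", "brightness", "size"):
            layer[attr] *= factor
    return layers

# Example: keep a hazard icon at full emphasis, fade everything else.
layers = [
    {"id": "nav_panel",   "contrast": 1.0, "opacity": 1.0, "saturation": 1.0, "brightness": 1.0, "size": 1.0},
    {"id": "hazard_icon", "contrast": 1.0, "opacity": 1.0, "saturation": 1.0, "brightness": 1.0, "size": 1.0},
]
deemphasize(layers, keep_id="hazard_icon")
```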
3. The [ head-mounted ] display system of claim 1, wherein the one or more environmental sensors comprise [ one or more outward-facing image capture devices, ] a depth sensor, a pair of binocular world cameras, a geolocation sensor, a proximity sensor, [ a distance measuring device, ] or a GPS.
4. The [ head-mounted ] display system of claim 1, wherein the one or more user sensors comprise one or more cameras.
5. The [ head-mounted ] display system of claim 1, wherein the processing electronics are further configured to determine the user intent for the situation and alter the user perception of the real or virtual peripheral object within the vision field of the user based at least in part on sensing the increased [ user ] focus.
6. The [ head-mounted ] display system of claim 1, wherein said situation involving increased user focus comprises driving a motor vehicle.
7. The [ head-mounted ] display system of claim 1, wherein said one or more environmental sensors comprise a sensor configured to detect a radio signal.
8. The [ head-mounted ] display system of claim 1, wherein said one or more environmental sensors comprise a sensor configured to detect a blue tooth signal from an automobile.
9. The [ head-mounted ] display system of claim 6, wherein said processing electronics are configured to de-emphasize image content in the portion of the vision field other than the peripheral object based at least in part on one or more data records regarding the user, said one or more data records comprising a driving record of said user.
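Claims 6-9 situate the de-emphasis in a driving context: the environmental sensors may detect a radio or Bluetooth signal from an automobile, and the degree of de-emphasis may depend on one or more data records such as the user's driving record. A hedged sketch of how those pieces could combine is shown below; the field names, speed threshold, and scoring rule are hypothetical.

```python
def driving_situation(env):
    """Infer the 'driving a motor vehicle' situation of claim 6 from
    environmental sensor data (claims 7-8: radio/Bluetooth beacon)."""
    return env.get("automobile_bluetooth_detected", False) and env.get("gps_speed_mps", 0.0) > 5.0

def deemphasis_factor(driving_record):
    """Apply stronger de-emphasis for a driving record with prior incidents
    (claim 9), clamped so content never disappears entirely."""
    incidents = driving_record.get("incidents", 0)
    return max(0.2, 0.5 - 0.1 * incidents)

env = {"automobile_bluetooth_detected": True, "gps_speed_mps": 13.4}
if driving_situation(env):
    factor = deemphasis_factor({"incidents": 2})
    print(f"de-emphasis factor while driving: {factor}")
```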
10. A head-mounted display system configured to project light to an eye of a user to display augmented reality image content, said user's eye having a vision field having a central region and a peripheral region disposed about said central region, [ the vision field of said user being a part of a field of regard surrounding said user and capable of being perceived by said user, ] said head-mounted display system comprising:
a frame configured to be supported on a head of the user;
a head-mounted display disposed on the frame, said [ head-mounted ] display configured to project light into said user's eye so as to present image content at said central region of said user's vision field, at least a portion of said [ head-mounted ] display being transparent and disposed at a location in front of the user's eye when the user wears said head-mounted display such that said transparent portion transmits light from a portion of the [ an ] environment in front of the user and said head-mounted display to the user's eye to provide a view of said portion of the environment in front of the user and said head-mounted display;
[ one or more environmental sensors configured to sense surroundings of the user, including one or more objects or events in the field of regard surrounding said user but outside the vision field of said user] ;
processing electronics in communication with [ the one or more environmental sensors and ] said [ head-mounted ] display to control presentation of image content on said [ head-mounted ] display, the processing electronics configured to:
sense a situation involving increased user focus; and
based at least in part on sensing the increased [ user ] focus on a real or virtual peripheral object in the peripheral region of the user's vision field, de-emphasize image content in a portion of the vision field other than the peripheral object, wherein sensing the increased user focus on the peripheral object is based at least in part on the [ a ] number of occasions the user's eye is directed at the peripheral object, the [ an ] amount of time the user's eye is directed at the peripheral object, the user's eye squinting at the peripheral object, or measurement of accommodation of the user's eye when directed at the peripheral object, wherein the [ head-mounted display ] system is configured to provide an alert to the user to indicate the de-emphasized image content [ ; and
alert said user of an object or an event occurring in the field of regard but outside the vision field of said user, wherein alerting said user of an object or an event occurring in the field of regard but outside the vision field of said user includes displaying an image of the object or the event outside the vision field of said user in the vision field of said user] .
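Amended claim 10 broadens the focus-sensing evidence to include the number of occasions and the amount of time the user's eye is directed at the peripheral object, and requires an alert indicating the de-emphasized content. The sketch below derives both measures from a stream of per-frame gaze targets; the sample format, frame rate, and thresholds are assumptions.

```python
from collections import defaultdict

def fixation_stats(gaze_samples, frame_dt=1 / 60):
    """Return (visit_count, dwell_seconds) per gazed object id.

    `gaze_samples` is an ordered list of object ids, one per display frame;
    a new 'occasion' begins whenever the gaze moves onto an object from
    somewhere else.
    """
    visits, dwell = defaultdict(int), defaultdict(float)
    previous = None
    for target in gaze_samples:
        if target is not None:
            dwell[target] += frame_dt
            if target != previous:
                visits[target] += 1
        previous = target
    return visits, dwell

VISITS_THRESHOLD, DWELL_THRESHOLD_S = 3, 0.5   # illustrative values

def focus_detected(visits, dwell, object_id):
    """Increased user focus per claim 10: repeated or sustained gaze."""
    return visits[object_id] >= VISITS_THRESHOLD or dwell[object_id] >= DWELL_THRESHOLD_S
```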
14. The [ head-mounted display ] system of claim 10, further comprising an eye tracking device configured to track position and/or movement of said user's eye.
15. The [ head-mounted display ] system of claim 10, wherein the head-mounted display system is configured to process image content presented to at least a portion of said central region of the user's vision field differently in comparison to image content presented to the peripheral region of the user's vision field.
16. The [ head-mounted display ] system of claim 10, wherein the head-mounted display system is configured to de-emphasize image content in the portion of the vision field other than the peripheral object by reducing the [ a ] size of image content in the portion of the vision field other than the peripheral object.
17. The [ head-mounted display ] system of claim 10, wherein the head-mounted display system is configured to de-emphasize image content in the portion of the vision field other than of the peripheral object by decreasing brightness, visibility, sharpness, or contrast in image content in the portion of the vision field other than the peripheral object.
18. The [ head-mounted display ] system of claim 1, wherein the processing electronics are configured to de-emphasize the image content in the portion of the vision field other than the peripheral object by reducing the [ a ] size of image content in the portion of the vision field other than the peripheral object.
19. The [ head-mounted display ] system of claim 1, wherein the processing electronics are configured to de-emphasize the image content in the portion of the vision field other than the peripheral object by decreasing brightness, visibility, sharpness, or contrast of image content in the portion of the vision field other than of the peripheral object.
20. The [ head-mounted display ] system of claim 1, wherein the head-mounted display is configured to project light into said user's eye at different amounts of divergence to display augmented reality image content to the user's vision field as if projected from different distances from the user's eye.
21. The [ head-mounted display ] system of claim 1, wherein the head-mounted display is configured to project light into said user's eye that diverges so as to display augmented reality image content to the user's vision field corresponding to a first depth and to project light into said user's eye that is collimated so as to display augmented reality image content to the user's vision field corresponding to a second depth.
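Claims 20-23 tie the amount of divergence of the projected light to the apparent depth of the content: collimated light reads as coming from optical infinity, while more strongly diverging light reads as closer, consistent with the usual vergence relation (vergence in diopters ≈ 1 / distance in meters). A depth-plane selection sketch under that assumption follows; the two planes and their distances are illustrative, not values from the specification.

```python
def vergence_diopters(distance_m):
    """Vergence of light from a point source at `distance_m` (0.0 = collimated)."""
    return 0.0 if distance_m == float("inf") else 1.0 / distance_m

DEPTH_PLANES = {
    "near_plane": 1.5,          # metres; light projected with ~0.67 D of divergence
    "far_plane": float("inf"),  # collimated light, perceived at optical infinity
}

def choose_depth_plane(target_distance_m):
    """Pick the depth plane whose vergence best matches the target distance."""
    target = vergence_diopters(target_distance_m)
    return min(DEPTH_PLANES, key=lambda name: abs(vergence_diopters(DEPTH_PLANES[name]) - target))

print(choose_depth_plane(2.0))            # "near_plane": diverging light
print(choose_depth_plane(float("inf")))   # "far_plane": collimated light
```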
22. The [ head-mounted display ] system of claim 10, wherein the head-mounted display is configured to project light into said user's eye at different amounts of divergence to display augmented reality image content to the user's vision field as if projected from different distances from the user's eye.
23. The [ head-mounted display ] system of claim 10, wherein the head-mounted display is configured to project light into said user's eye that diverges so as to display augmented reality image content to the user's vision field corresponding to a first depth and to project light into said user's eye that is collimated so as to display augmented reality image content to the user's vision field corresponding to a second depth.
24. The [ head-mounted display ] system of claim 1, wherein sensing the increased user focus on the peripheral object is based at least in part on the [ a ] number of occasions the user's eye is directed at the peripheral object.
25. The [ head-mounted display ] system of claim 1, wherein sensing the increased user focus on the peripheral object is based at least in part on the [ an ] amount of time the user's eye is directed at the peripheral object.
32. The [ head-mounted display ] system of claim 10, wherein the alert [ to the user to indicate the de-emphasized image content ] is a visual alert or an audio alert.
33. The [ head-mounted display ] system of claim 1, wherein the processing electronics is [ are ] configured to de-emphasize image content in a portion of the vision field other than the peripheral object based at least in part on sensing the increased [ user ] focus on a real or virtual peripheral object in the peripheral region of the user's vision field while the user is also focused on an object in the central region of the user's vision field.
[ 34. The head-mounted display system of claim 1, wherein the one or more objects in the field of regard surrounding said user but outside the vision field of said user are perceivable by the one or more environmental sensors based on sensor location and sensor field of view.]
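New claim 34 turns on objects that the environmental sensors can perceive, given their placement and field of view, even though those objects lie outside the user's vision field. One way to make that determination is a head-relative angular test, sketched below with illustrative half-angles for the rendered vision field and for a wide-angle outward-facing sensor; neither value comes from the patent.

```python
import math

USER_FOV_HALF_ANGLE_DEG = 30.0      # rendered vision field of the display (assumed)
SENSOR_FOV_HALF_ANGLE_DEG = 110.0   # e.g. a wide-angle outward-facing camera (assumed)

def bearing_deg(object_xy):
    """Head-relative bearing of an object, 0 deg = straight ahead."""
    x, y = object_xy
    return abs(math.degrees(math.atan2(x, y)))

def outside_vision_field_but_in_field_of_regard(object_xy):
    """True when a sensor can perceive the object but the user cannot see it,
    so it qualifies for the out-of-view alert of claims 1 and 10."""
    b = bearing_deg(object_xy)
    return SENSOR_FOV_HALF_ANGLE_DEG >= b > USER_FOV_HALF_ANGLE_DEG

# An object well off to the user's right (bearing ~63 deg): sensor sees it, user does not.
print(outside_vision_field_but_in_field_of_regard((2.0, 1.0)))   # True
```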
[ 35. The head-mounted display system of claim 1, wherein the field of regard includes substantially all of a 4π steradian solid angle surrounding said user.]
[ 36. The head-mounted display system of claim 1, wherein alerting said user of an object or an event occurring in the field of regard but outside the vision field of said user further includes displaying an icon representing an image of the object or the event outside the vision field of said user in the vision field of said user.]
[ 37. The head-mounted display system of claim 1, wherein alerting said user of an object or an event occurring in the field of regard but outside the vision field of said user further includes an audio alert.]
[ 38. The head-mounted display system of claim 10, wherein the one or more objects in the field of regard surrounding said user but outside the vision field of said user are perceivable by the one or more environmental sensors based on sensor location and sensor field of view.]
[ 39. The head-mounted display system of claim 10, wherein the field of regard includes substantially all of a 4π steradian solid angle surrounding said user.]
[ 40. The head-mounted display system of claim 10, wherein alerting said user of an object or an event occurring in the field of regard but outside the vision field of said user further includes displaying an icon representing an image of the object or the event outside the vision field of said user in the vision field of said user.]
[ 41. The head-mounted display system of claim 10, wherein alerting said user of an object or an event occurring in the field of regard but outside the vision field of said user further includes an audio alert.]
[ 42. The head-mounted display system of claim 10, wherein the one or more environmental sensors comprise one or more outward-facing image capture devices, a pair of binocular world cameras, a depth sensor, a geolocation sensor, a proximity sensor, a distance measuring device, or a GPS.]
[ 43. The head-mounted display system of claim 10, wherein the one or more environmental sensors comprise a sensor configured to detect a radio signal.]
[ 44. The head-mounted display system of claim 10, wherein the one or more environmental sensors comprise a sensor configured to detect a blue tooth signal from an automobile.]