US 12,235,446 B2
Wearable image manipulation and control system with high resolution micro-displays and dynamic opacity augmentation in augmented reality glasses
Michael Hayes Freeman, Tulsa, OK (US); Richard C. Freeman, Tulsa, OK (US); Mitchael C. Freeman, Sapulpa, OK (US); Chad Boss, Tulsa, OK (US); Jordan Boss, Tulsa, OK (US); Brian Santee, Tulsa, OK (US); and David Cary, Tulsa, OK (US)
Assigned to RAYTRX, LLC, Tulsa, OK (US)
Filed by Raytrx, LLC, Tulsa, OK (US)
Filed on May 25, 2021, as Appl. No. 17/329,549.
Application 17/329,549 is a continuation of application No. 16/511,451, filed on Jul. 15, 2019, granted, now 11,016,302, issued on May 25, 2021.
Application 17/329,549 is a continuation of application No. 16/511,202, filed on Jul. 15, 2019, granted, now 11,461,936.
Application 17/329,549 is a continuation of application No. 15/073,144, filed on Mar. 17, 2016, granted, now 9,955,862, issued on May 1, 2018.
Application 17/329,549 is a continuation of application No. 15/940,561, filed on Mar. 29, 2018, granted, now 10,111,583, issued on Oct. 30, 2018.
Application 17/329,549 is a continuation of application No. 16/173,719, filed on Oct. 29, 2018, granted, now 10,874,297, issued on Dec. 29, 2020.
Application 17/329,549 is a continuation of application No. 15/962,661, filed on Apr. 25, 2018, granted, now 11,956,414.
Claims priority of provisional application 62/697,854, filed on Jul. 13, 2018.
Claims priority of provisional application 62/489,801, filed on Apr. 25, 2017.
Claims priority of provisional application 62/134,422, filed on Mar. 17, 2015.
Prior Publication US 2021/0382311 A1, Dec. 9, 2021
Int. Cl. G02B 27/01 (2006.01); G06F 3/01 (2006.01); G09G 5/00 (2006.01); G09G 5/377 (2006.01)
CPC G02B 27/0172 (2013.01) [G06F 3/013 (2013.01); G09G 5/003 (2013.01); G09G 5/377 (2013.01); G02B 2027/0118 (2013.01); G02B 2027/0138 (2013.01); G09G 2354/00 (2013.01)] 8 Claims
OG exemplary drawing
 
1. An augmented reality headset system comprising:
one or more cameras;
at least one lens subsystem including:
a transparent layer with a first surface and an opposed second surface;
a reflective coating applied to the first surface of the transparent layer; and
an alpha matte layer abutting the second surface of the transparent layer, the alpha matte layer comprising a cholesteric liquid crystal layer comprising a plurality of pixels of cholesteric liquid crystal, where each pixel is capable of independently becoming opaque or transparent;
one or more micro-displays capable of projecting images onto the reflective coating;
an eye-tracking subsystem capable of identifying a position of each eye;
one or more SLAM sensors capable of detecting a distance of a target object; and
a central processing unit in communication with and capable of controlling the cameras, the micro-displays, the at least one lens subsystem, the eye-tracking subsystem, and the SLAM sensors;
where the central processing unit is programmed to execute an algorithm including the steps of:
displaying differential images on the lenses according to the position of each eye, as identified by the eye-tracking subsystem, based on a focal point of the target object corrected by epipolar geometry triangulation, allowing a user to see a corrected image in each eye as it relates to the distance of the target object for 3D stereo vision; and
rendering opaque a plurality of pixels in the portion of the alpha matte layer aligning with the portion of the reflective coating onto which the differential images are projected, while pixels aligning with any portion of the reflective coating onto which no image is projected remain transparent.
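
The claim recites the algorithm only functionally. The Python sketch below illustrates one plausible reading of the two steps under stated assumptions: epipolar geometry triangulation is reduced to a rectified-stereo disparity (disparity = focal length x baseline / depth), a fixed interpupillary baseline stands in for the eye-tracking output, and a simple luminance threshold decides which alpha matte pixels are driven opaque. All function names and parameter values are hypothetical and are not taken from the patent.

# Illustrative sketch only; names, parameters, and simplifications are assumptions.
import numpy as np

def per_eye_disparity_px(target_distance_m, ipd_m=0.063, focal_length_px=1500.0):
    """Horizontal disparity for a rectified stereo pair (simplified
    epipolar-geometry triangulation): disparity = f * baseline / depth."""
    return focal_length_px * ipd_m / max(target_distance_m, 1e-6)

def render_differential_images(source_image, target_distance_m):
    """Shift the source image in opposite horizontal directions for the left
    and right micro-displays so the virtual object converges at the
    SLAM-reported distance of the target object."""
    disparity = per_eye_disparity_px(target_distance_m)
    half = int(round(disparity / 2.0))
    left = np.roll(source_image, +half, axis=1)   # image for the left eye
    right = np.roll(source_image, -half, axis=1)  # image for the right eye
    return left, right

def alpha_matte_mask(projected_image, threshold=0):
    """Per-pixel opacity command for the cholesteric liquid crystal layer:
    opaque (True) behind every pixel that carries projected content,
    transparent (False) everywhere else."""
    luminance = projected_image.max(axis=2) if projected_image.ndim == 3 else projected_image
    return luminance > threshold

# Example frame: a bright square on an otherwise empty (transparent) field.
frame = np.zeros((720, 1280), dtype=np.uint8)
frame[300:420, 560:720] = 255
left_img, right_img = render_differential_images(frame, target_distance_m=2.0)
left_mask = alpha_matte_mask(left_img)
right_mask = alpha_matte_mask(right_img)
print(left_mask.sum(), right_mask.sum())  # opaque pixel counts per lens

In this reading, the opacity mask is derived directly from the per-eye projected image, so the alpha matte layer blocks ambient light only behind displayed content and leaves the remainder of the lens see-through, consistent with the dynamic opacity augmentation described in the title.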