US 12,008,719 B2
Wide area augmented reality location-based services
David McKinnon, San Francisco, CA (US); Kamil Wnuk, Playa del Rey, CA (US); Jeremi Sudol, New York, NY (US); Matheen Siddiqui, Culver City, CA (US); John Wiacek, Los Angeles, CA (US); Bing Song, La Canada, CA (US); and Nicholas J. Witchey, Laguna Hills, CA (US)
Assigned to Nant Holdings IP, LLC, Culver City, CA (US)
Filed by Nant Holdings IP, LLC, Culver City, CA (US)
Filed on Jan. 28, 2022, as Appl. No. 17/587,183.
Application 17/587,183 is a continuation of application No. 16/864,075, filed on Apr. 30, 2020, granted, now 11,392,636.
Application 16/864,075 is a continuation of application No. 16/168,419, filed on Oct. 23, 2018, granted, now 10,664,518, issued on May 26, 2020.
Application 16/168,419 is a continuation of application No. 15/794,993, filed on Oct. 26, 2017, granted, now 10,140,317, issued on Nov. 27, 2018.
Application 15/794,993 is a continuation of application No. 15/406,146, filed on Jan. 13, 2017, granted, now 9,817,848, issued on Nov. 14, 2017.
Application 15/406,146 is a continuation of application No. 14/517,728, filed on Oct. 17, 2014, granted, now 9,582,516, issued on Feb. 28, 2017.
Claims priority of provisional application 61/892,238, filed on Oct. 17, 2013.
Prior Publication US 2022/0156314 A1, May 19, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. G06T 19/00 (2011.01); G06F 16/29 (2019.01); G06F 16/50 (2019.01); G06F 16/58 (2019.01); G06F 16/583 (2019.01); G06F 16/9535 (2019.01); G06T 15/20 (2011.01)
CPC G06T 19/003 (2013.01) [G06F 16/29 (2019.01); G06F 16/50 (2019.01); G06F 16/58 (2019.01); G06F 16/5854 (2019.01); G06F 16/5866 (2019.01); G06F 16/9535 (2019.01); G06T 15/20 (2013.01); G06T 19/006 (2013.01); H05K 999/99 (2013.01); G06T 2219/024 (2013.01)] 42 Claims
OG exemplary drawing
 
1. A device capable of rendering augmented reality (AR), the device comprising:
a plurality of sensors, including a camera and a location sensor;
a display;
a non-transitory computer readable memory storing software instructions; and
at least one processor coupled with the non-transitory computer readable memory, the plurality of sensors, and the display, wherein the at least one processor, upon execution of the software instructions, is configurable to:
obtain sensor data from at least one sensor, wherein the sensor data corresponds to a real-time perspective of a user and includes image data from the camera and a device location obtained from the location sensor;
obtain an area of interest via an area database based on the sensor data;
access an area tile map of the area of interest, the area tile map represented by a set of tile subareas that includes one or more tessellated tiles from a tessellated tile map;
identify a tile subarea from the set of tile subareas based at least in part on the device location relative to one or more locations of tile subareas from the set of tile subareas, wherein the identified tile subarea covers at least a portion of the area of interest, and wherein one or more tessellated tiles within the identified tile subarea are associated with one or more AR content objects;
populate the non-transitory computer readable memory with at least one of the one or more AR content objects associated with the one or more tessellated tiles within the identified tile subarea; and
render the at least one of the one or more AR content objects that is associated with the identified tile subarea on the display as a visual overlay of a real-world image generated from the image data.
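Claim 1 recites a lookup flow in which a device location selects a tile subarea from a tessellated area tile map, after which the AR content objects bound to the tiles in that subarea are loaded into memory and rendered. The sketch below is a minimal, non-authoritative illustration of that flow, not an implementation from the patent: every name (Tile, TileSubarea, AreaTileMap, identify_subarea, populate_content) is hypothetical, nearest-subarea selection by planar distance is only one plausible reading of "based at least in part on the device location relative to one or more locations of tile subareas," and the final rendering step (compositing content over camera frames) is out of scope here.

```python
# Hypothetical sketch of the tile-map lookup recited in claim 1.
# All class and function names are illustrative, not from the patent.
from dataclasses import dataclass, field
import math

@dataclass
class Tile:
    tile_id: str
    center: tuple                 # (lat, lon) of the tessellated tile
    ar_content_ids: list = field(default_factory=list)  # bound AR content objects

@dataclass
class TileSubarea:
    subarea_id: str
    location: tuple               # representative (lat, lon) of the subarea
    tiles: list                   # one or more tessellated tiles

@dataclass
class AreaTileMap:
    subareas: list                # set of tile subareas covering the area of interest

def identify_subarea(tile_map: AreaTileMap, device_location: tuple) -> TileSubarea:
    """Pick the subarea nearest the device location (an assumed selection
    rule; planar distance is a simplification over true geodesic math)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return min(tile_map.subareas, key=lambda s: dist(s.location, device_location))

def populate_content(subarea: TileSubarea, content_store: dict) -> list:
    """Gather the AR content objects associated with the tessellated tiles
    inside the identified subarea, i.e. the 'populate the memory' step."""
    loaded = []
    for tile in subarea.tiles:
        loaded.extend(content_store[c] for c in tile.ar_content_ids)
    return loaded

# Toy usage: two subareas; the device location is nearest the first.
store = {"c1": "restaurant badge", "c2": "museum tour anchor"}
tmap = AreaTileMap(subareas=[
    TileSubarea("s1", (34.02, -118.49), [Tile("t1", (34.02, -118.49), ["c1"])]),
    TileSubarea("s2", (34.10, -118.30), [Tile("t2", (34.10, -118.30), ["c2"])]),
])
hit = identify_subarea(tmap, device_location=(34.03, -118.48))
print(hit.subarea_id, populate_content(hit, store))  # -> s1 ['restaurant badge']
```

The design point the sketch captures is the claim's two-level structure: the device resolves location to a subarea first, then loads only the content bound to that subarea's tiles, so memory is populated with a location-relevant slice of a wide-area content set rather than the whole map.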