US 12,406,441 B2
Wide area augmented reality location-based services
David McKinnon, San Francisco, CA (US); Kamil Wnuk, Playa del Rey, CA (US); Jeremi Sudol, New York, NY (US); Matheen Siddiqui, Culver City, CA (US); John Wiacek, Los Angeles, CA (US); Bing Song, La Canada, CA (US); and Nicholas J. Witchey, Laguna Hills, CA (US)
Assigned to Nant Holdings IP, LLC, Culver City, CA (US)
Filed by Nant Holdings IP, LLC, Culver City, CA (US)
Filed on Oct. 11, 2023, as Appl. No. 18/378,977.
Application 18/378,977 is a continuation of application No. 17/587,183, filed on Jan. 28, 2022, granted, now 12,008,719.
Application 17/587,183 is a continuation of application No. 16/864,075, filed on Apr. 30, 2020, granted, now 11,392,636, issued on Jul. 19, 2022.
Application 16/864,075 is a continuation of application No. 16/168,419, filed on Oct. 23, 2018, granted, now 10,664,518, issued on May 26, 2020.
Application 16/168,419 is a continuation of application No. 15/794,993, filed on Oct. 26, 2017, granted, now 10,140,317, issued on Nov. 27, 2018.
Application 15/794,993 is a continuation of application No. 15/406,146, filed on Jan. 13, 2017, granted, now 9,817,848, issued on Nov. 14, 2017.
Application 15/406,146 is a continuation of application No. 14/517,728, filed on Oct. 17, 2014, granted, now 9,582,516, issued on Feb. 28, 2017.
Claims priority of provisional application 61/892,238, filed on Oct. 17, 2013.
Prior Publication US 2024/0037857 A1, Feb. 1, 2024
This patent is subject to a terminal disclaimer.
Int. Cl. G06T 19/00 (2011.01); G06F 16/29 (2019.01); G06F 16/50 (2019.01); G06F 16/58 (2019.01); G06F 16/583 (2019.01); G06F 16/9535 (2019.01); G06T 15/20 (2011.01)
CPC G06T 19/003 (2013.01) [G06F 16/29 (2019.01); G06F 16/50 (2019.01); G06F 16/58 (2019.01); G06F 16/5854 (2019.01); G06F 16/5866 (2019.01); G06F 16/9535 (2019.01); G06T 15/20 (2013.01); G06T 19/006 (2013.01); G06T 2219/024 (2013.01)] 39 Claims
OG exemplary drawing
 
1. A device capable of rendering augmented reality (AR), the device comprising:
at least one sensor, including a location sensor;
a display;
a non-transitory computer-readable memory storing software instructions; and
at least one processor coupled with the non-transitory computer-readable memory, the at least one sensor, and the display, wherein the at least one processor, upon execution of the software instructions, is configurable to:
obtain sensor data from the at least one sensor, wherein the sensor data includes a device location obtained from the location sensor;
obtain an area of interest via an area database based on at least the device location within the sensor data;
access an area tile map of the area of interest, the area tile map represented by a set of tile subareas that includes one or more tessellated tiles from a tessellated tile map;
identify a tile subarea from the set of tile subareas based at least in part on the device location relative to one or more locations of tile subareas from the set of tile subareas, wherein the identified tile subarea covers at least a portion of the area of interest, and wherein one or more tessellated tiles within the identified tile subarea are associated with one or more AR content objects;
populate the non-transitory computer-readable memory with at least one of the one or more AR content objects associated with the one or more tessellated tiles within the identified tile subarea to thereby accommodate capabilities of the non-transitory computer-readable memory via more accurately instantiating the at least one of the one or more AR content objects in the non-transitory computer-readable memory when the device is at or near the one or more tessellated tiles within the identified tile subarea; and
render the at least one of the one or more AR content objects that is associated with the identified tile subarea on the display based on a view of interest.
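The claimed flow, locating the device, identifying a tile subarea of a tessellated tile map near that location, and populating device memory with only the AR content objects tied to nearby tiles, can be sketched as below. This is a minimal illustration under assumed simplifications (a planar coordinate system, square tiles, and an in-memory area database); every name here (`Tile`, `AREA_DB`, `populate_cache`, `TILE_SIZE_M`) is a hypothetical stand-in, not terminology from the patent.

```python
import math
from dataclasses import dataclass

TILE_SIZE_M = 100.0  # assumed tile edge length in meters (illustrative)

@dataclass(frozen=True)
class Tile:
    """One square cell of a tessellated tile map, keyed by grid indices."""
    ix: int
    iy: int

def tile_for_location(x_m: float, y_m: float) -> Tile:
    """Map a planar device location (meters) onto its containing tile."""
    return Tile(int(math.floor(x_m / TILE_SIZE_M)),
                int(math.floor(y_m / TILE_SIZE_M)))

def neighborhood(tile: Tile, radius: int = 1) -> list[Tile]:
    """A tile subarea: the device's tile plus its immediate neighbors,
    i.e., the tiles the device is at or near."""
    return [Tile(tile.ix + dx, tile.iy + dy)
            for dx in range(-radius, radius + 1)
            for dy in range(-radius, radius + 1)]

# Hypothetical area database mapping tiles to AR content object IDs.
AREA_DB: dict[Tile, list[str]] = {
    Tile(0, 0): ["statue_model"],
    Tile(1, 0): ["mural_overlay"],
    Tile(5, 5): ["far_away_object"],
}

def populate_cache(x_m: float, y_m: float) -> list[str]:
    """Populate a bounded device cache with only the AR content objects
    associated with tiles in the identified subarea, so memory holds
    content relevant to the device's current location."""
    subarea = neighborhood(tile_for_location(x_m, y_m))
    cache: list[str] = []
    for t in subarea:
        cache.extend(AREA_DB.get(t, []))
    return cache

print(populate_cache(30.0, 40.0))  # device sits in Tile(0, 0)
```

The point of the subarea step is that content for distant tiles (here, `far_away_object` at `Tile(5, 5)`) is never instantiated in memory, which is the memory-accommodation behavior the claim recites.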