US 10,043,312 C1 (12,716th)
Rendering techniques to find new map points in augmented or virtual reality systems
Samuel A. Miller, Hollywood, FL (US); and Randall E. Hand, Clinton, MS (US)
Filed by Magic Leap, Inc., Dania Beach, FL (US)
Assigned to CITIBANK, N.A., New York, NY (US)
Reexamination Request No. 90/019,242, Sep. 1, 2023.
Reexamination Certificate for Patent 10,043,312, issued Aug. 7, 2018, Appl. No. 14/705,994, May 7, 2015.
Application 14/705,994 is a continuation of application No. 14/690,401, filed on Apr. 18, 2015, granted, now 10,262,462.
Application 14/690,401 is a continuation-in-part of application No. 14/331,218, filed on Jul. 14, 2014, granted, now 9,671,566.
Claims priority of provisional application 62/012,273, filed on Jun. 14, 2014.
Claims priority of provisional application 61/981,701, filed on Apr. 18, 2014.
Ex Parte Reexamination Certificate issued on Sep. 25, 2024.
Int. Cl. G06T 19/00 (2011.01); A63F 13/56 (2014.01); A63F 13/57 (2014.01); A63F 13/577 (2014.01); G02B 27/00 (2006.01); G02B 27/01 (2006.01); G06F 3/00 (2006.01); G06F 3/01 (2006.01); G06F 3/16 (2006.01); G06Q 30/02 (2023.01); G06T 7/60 (2017.01); G06T 7/70 (2017.01); G06T 11/60 (2006.01); G06T 13/40 (2011.01); G06T 13/80 (2011.01); G06T 15/10 (2011.01); G06V 20/10 (2022.01); G06V 20/20 (2022.01); G06V 20/64 (2022.01); G06V 20/80 (2022.01); H04S 7/00 (2006.01)
CPC G06T 19/006 (2013.01) [A63F 13/56 (2014.09); A63F 13/57 (2014.09); A63F 13/577 (2014.09); G02B 27/0093 (2013.01); G02B 27/0101 (2013.01); G02B 27/017 (2013.01); G02B 27/0172 (2013.01); G02B 27/0179 (2013.01); G06F 3/005 (2013.01); G06F 3/011 (2013.01); G06F 3/012 (2013.01); G06F 3/013 (2013.01); G06F 3/014 (2013.01); G06F 3/016 (2013.01); G06F 3/017 (2013.01); G06F 3/16 (2013.01); G06F 3/167 (2013.01); G06Q 30/02 (2013.01); G06T 7/60 (2013.01); G06T 7/70 (2017.01); G06T 11/60 (2013.01); G06T 13/40 (2013.01); G06T 13/80 (2013.01); G06T 15/10 (2013.01); G06T 19/003 (2013.01); G06V 20/10 (2022.01); G06V 20/20 (2022.01); G06V 20/653 (2022.01); G06V 20/80 (2022.01); H04S 7/304 (2013.01); G02B 2027/0138 (2013.01); G02B 2027/014 (2013.01); G02B 2027/0178 (2013.01); G02B 2027/0187 (2013.01); G06T 2200/04 (2013.01); G06T 2213/08 (2013.01); G06T 2215/16 (2013.01); G06T 2219/024 (2013.01); H04S 2400/11 (2013.01); H04S 2400/15 (2013.01)]
OG exemplary drawing
AS A RESULT OF REEXAMINATION, IT HAS BEEN DETERMINED THAT:
Claims 1-19 are determined to be patentable as amended.
New claims 20-21 are added and determined to be patentable.
1. An augmented reality system, comprising:
one or more sensors to capture a set of map points pertaining to the real world,
wherein
the set of map points are captured through a plurality of augmented reality systems [ , and
a portion of the augmented reality system is head-wearable or mountable] ; and
[ a first printed circuit board that functions in conjunction with ] a processor [ that executes instructions ] to
determine a position of a plurality of keyframes that captured the set of map points,
render lines from the determined positions of the plurality of keyframes to respective map points captured from the plurality of keyframes,
identify points of intersection between the rendered lines, and to determine a set of new map points based at least in part on [ likelihood of existence of a new map point, wherein
the likelihood of existence of the new map point is determined based at least in part upon a total number of rendered lines that intersect at ] the identified points of intersection. [ new map point; and
a second printed circuit board that is operatively coupled to an inward-facing image sensor for tracking an eye of a user of the augmented reality system and is connected to the first printed circuit board.]
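Editor's illustration of the technique recited in amended claim 1: rays are cast from each keyframe position toward the features that keyframe observed, and a candidate new map point is scored by how many rays pass through, or nearly through, it. The sketch below assumes NumPy arrays of ray origins and unit directions, a distance tolerance, and a vote threshold; these names and values are assumptions for the sketch, not limitations drawn from the claim.

```python
import numpy as np

def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment joining two rays (origin o, unit direction d)."""
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    if abs(denom) < 1e-9:            # near-parallel rays: no stable intersection
        return None, np.inf
    s = (b * e - c * d) / denom      # parameter along ray 1
    t = (a * e - b * d) / denom      # parameter along ray 2
    p1, p2 = o1 + s * d1, o2 + t * d2
    return (p1 + p2) / 2.0, float(np.linalg.norm(p1 - p2))

def candidate_map_points(origins, dirs, tol=0.05, min_votes=2):
    """Score candidate new map points by how many rendered rays meet there."""
    candidates = []
    n = len(origins)
    for i in range(n):
        for j in range(i + 1, n):
            p, gap = closest_point_between_rays(origins[i], dirs[i], origins[j], dirs[j])
            if p is not None and gap < tol:
                candidates.append(p)
    results = []
    for p in candidates:
        # a ray supports the candidate if the point-to-line distance is within tol
        votes = sum(np.linalg.norm(np.cross(d, p - o)) < tol for o, d in zip(origins, dirs))
        if votes >= min_votes:
            results.append((p, votes))   # more intersecting rays -> higher likelihood
    return results
```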
2. The augmented reality system of claim 1, [ wherein the processor further executes the instructions to:
respectively render a cone of a plurality of cones from each determined position of the determined positions to each respective map point of the respective map points;
project the cone to determine a bisector of the cone with a first projected side of the cone projected on a first side of the bisector and a second projected side of the cone projected on a second side of the bisector; and
determine a half angle to the first and the second projected sides by a pixel pitch of an outward-facing image sensor of the one or more sensors of the augmented reality system, ] wherein
the set of new map points are determined based at least in part on a [ the ] pixel pitch corresponding to the identified points of intersection.
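Claim 2 ties the cone's half angle to the pixel pitch of the outward-facing image sensor. One natural reading, sketched below, is that each captured feature is uncertain by one pixel on the sensor, so the cone opens half a pixel to either side of its bisector; the symbols (pixel pitch and focal length in metres) are assumptions for the illustration.

```python
import math

def cone_half_angle(pixel_pitch_m, focal_length_m):
    """Angular half-width of the uncertainty cone: half a pixel either side of the bisector."""
    return math.atan2(0.5 * pixel_pitch_m, focal_length_m)

# e.g. a 3 um pixel behind a 4 mm lens subtends about 0.0215 degrees per half pixel
print(math.degrees(cone_half_angle(3e-6, 4e-3)))
```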
3. The augmented reality system of claim 1, [ the processor further executing the instructions to:
determine a total number of pixel coordinates of a point based at least in part upon a total number of rays intersecting at the point, ] wherein
[ the total number of pixel coordinates associated with the point indicates likelihood that the point is to be included in ] the set of new map points are [ that is ] added to a map of the real world.
4. The augmented reality system of claim 3, [ further comprising:
one or more outward-facing image sensors that are communicatively coupled to the second printed circuit board that comprises an audio sensor connector;
a microphone that is operatively coupled to the second printed circuit board through the audio sensor connector; and
a belt pack that is separate from and is operatively coupled to a head-worn portion of the augmented reality system with a cable, ] wherein
[ the one or more outward-facing image sensors are integrated into the head-worn portion of the augmented reality system,
the head-worn portion of the augmented reality system is wearable by a user,
a first cone of a plurality of cones rendered from a first keyframe that is farther away is larger than a second cone of the plurality of cones rendered from a second keyframe that is closer to a user of the augmented reality system, and]
the virtual content is delivered to one or more augmented reality display systems based at least in part on the map of the real world.
5. An augmented reality system, comprising:
one or more sensors to capture a set of map points pertaining to the real world, wherein the set of map points are captured through a plurality of augmented reality systems; and
[ a first printed circuit board that functions in conjunction with ] a processor [ that executes instructions ] to
determine a position of a [ for each keyframe of ] plurality of keyframes that captured the set of map points,
render triangular cones from the determined positions of the plurality of keyframes to respective map points captured from the plurality of keyframes, wherein
the captured map points lie on [ respective ] bisectors of the triangular cones and determine a set of new map points based at least in part on the rendered triangular cones . [ that have been rendered,
the set of new map points is determined based at least in part upon likelihood of existence of a new map point, and
the likelihood for the new map point is determined based at least in part upon a total number of the triangular cones that have been rendered and intersect at the new map point; and
a second printed circuit board that is operatively coupled to an inward-facing image sensor for tracking an eye of a user of the augmented reality system and is connected to the first printed circuit board.]
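For claim 5's construction, a triangular cone can be built directly from a keyframe position and a captured map point so that the point lies on the triangle's bisector. The 2-D sketch below uses assumed names and an arbitrary overshoot factor; a 3-D version would render a thin wedge or cone rather than a triangle.

```python
import numpy as np

def triangular_cone_2d(keyframe_pos, map_point, half_angle, overshoot=1.5):
    """Vertices of a 2-D triangular cone cast from a keyframe toward a map point.

    The bisector runs from the keyframe through the map point; the two far
    vertices sit half_angle either side of it, slightly past the point.
    """
    apex = np.asarray(keyframe_pos, dtype=float)
    target = np.asarray(map_point, dtype=float)
    v = target - apex
    length = np.linalg.norm(v) * overshoot
    theta = np.arctan2(v[1], v[0])
    left = apex + length * np.array([np.cos(theta + half_angle), np.sin(theta + half_angle)])
    right = apex + length * np.array([np.cos(theta - half_angle), np.sin(theta - half_angle)])
    return apex, left, right   # map_point lies on the bisector of this triangle
```

Because the triangle's width at the map point grows with the keyframe's distance, a cone rendered from a farther keyframe is larger than one rendered from a nearer keyframe, consistent with the language of amended claim 4.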
6. The augmented reality system of claim 5, [ the processor further executing the instructions to:
determine a new virtual keyframe based at least in part upon a viewpoint for the virtual keyframe that is used to identify a set of new map points based at least in part upon a plurality of existing features in the plurality of keyframes; and
position the new virtual keyframe based at least in part upon a characteristic of the plurality of keyframes, ] wherein
the processor selectively shades the triangular cones such that [ a bisector of ] the bisectors of the triangular cones are [ is ] the brightest portions [ portion ] of [ a corresponding triangular cone of ] the triangular cones.
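Claim 6's selective shading makes the bisector the brightest portion of each cone. A linear falloff from the bisector to the cone edges, as in the assumed profile below, is one simple way to realize that property.

```python
def cone_brightness(angle_offset, half_angle):
    """Brightness within a cone: 1.0 on the bisector, falling linearly to 0 at the edges."""
    return max(0.0, 1.0 - abs(angle_offset) / half_angle)
```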
7. The augmented reality system of claim 6, [ the processor further executing the instructions to:
identify one or more points of intersection as the set of new map points from a viewpoint of the new virtual keyframe that has been positioned relative to the plurality of keyframes; and
render a feature buffer for the triangular cones from each keyframe of the plurality of keyframes to each feature of a plurality of features;
select a candidate location of one or more features based at least in part upon a characteristic of the candidate location stored in the feature buffer;
store a virtual feature for the candidate location;
create a mask radius indicative of an error of a virtual camera around the candidate location, ] wherein
the processor identifies [ the one or more ] points of intersection between at least two rendered triangular cones, wherein [ and]
the set of new map points are based at least in part on the identified [ one or more ] points of intersection.
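Claims 7 through 9 locate intersections of the rendered cones. Rasterizing each cone into a shared buffer and counting overlaps, as sketched below with assumed boolean coverage masks, yields the cells where at least two cones intersect; with weighted masks, the summed value can also stand in for the brightness used in claim 8.

```python
import numpy as np

def accumulate_cones(cone_masks):
    """Sum per-cone coverage masks; cells covered by two or more cones are
    candidate points of intersection between rendered cones."""
    masks = [np.asarray(m) for m in cone_masks]
    counts = np.zeros(masks[0].shape, dtype=np.float64)
    for m in masks:
        counts += m
    return counts, np.argwhere(counts >= 2)
```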
8. The augmented reality system of claim 7, [ the processor further executing the instructions to render a label buffer and the feature buffer that are used in rendering the triangular cones from the determined positions of the plurality of keyframes to the respective map points, ] wherein the set of new map points are determined based at least in part on the [ respective ] brightness of the identified [ one or more ] points of intersection [ , and the respective brightness of the one or more points of intersection] .
9. The augmented reality system of claim 7, [ further comprising:
masking a label buffer into a masked label buffer with at least the mask radius;
filtering a poorly conditioned ray by using at least the masked radius at least by collecting a triangular cone where multiple sides of the triangular cone are captured inside a circle corresponding to the masked radius, ] wherein
the set of new map points are determined based at least in part on a pixel pitch corresponding to the identified [ one or more ] points of intersection.
10. The augmented reality system of claim 5, wherein the set of new map points are added to a map of the real world [ , and the processor further executes the instructions to recognize an object in a passable world model using a first type of object recognizer and a second type of object recognizer both of which run on the set of new map points] .
11. The augmented reality system of claim 10, [ further comprising:
one or more outward-facing image sensors that are communicatively coupled to the second printed circuit board that comprises an audio connector;
a microphone that is operatively coupled to the second printed circuit board through the audio connector;
a belt pack that is separate from and is operatively coupled to a head-worn portion of the augmented reality system with a cable, wherein
the one or more outward-facing image sensors are integrated into the head-worn portion of the augmented reality system, and
the head-worn portion of the augmented reality system is wearable by a user; and
the processor further executing the instructions to associate the first type of object recognizer or the second type of object recognizer with an application in which identification of the object is useful for the application,] wherein
the virtual content is delivered to one or more augmented reality display systems based at least in part on the map of the real world [ , and
recognizing at least some data of the object programmatically invokes the application to execute on the extended-reality device] .
12. An augmented reality system, comprising:
one or more sensors to capture a set of map points pertaining to the real world, wherein the set of map points are captured through a plurality of augmented reality systems; and
[ a first printed circuit board (PCB) comprising ] a processor [ and a first board-to-board connection, the processor that executes instructions ] to
determine a position of a plurality of keyframes that captured the set of map points,
place a virtual keyframe in relation to an existing set of keyframes, and
determine a set of new map points based at least in part on the virtual keyframe, wherein
the processor determines a most orthogonal direction to the existing set of keyframes, and positions the virtual keyframe at the determined [ most ] orthogonal direction . [ ,
the set of new map points is determined based at least in part upon likelihood of existence of a new map point, and
the likelihood for the new map point is determined based at least in part upon a total number of the triangular cones that have been rendered and intersect at the new map point; and
a second printed circuit board that is operatively coupled to an inward-facing image sensor for tracking an eye of a user of the augmented reality system and is connected to the first printed circuit board.]
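Claim 12's "most orthogonal direction" to the existing set of keyframes can be read, for illustration, as the axis along which the keyframe positions vary least. The smallest principal axis of the position cloud is one assumed realization (claim 13 then restricts the choice to the x, y, or z axis); the function names and offset below are illustration choices.

```python
import numpy as np

def most_orthogonal_direction(keyframe_positions):
    """Unit vector along the axis least spanned by the keyframe position cloud.

    Assumes at least three keyframes so the SVD covers all spatial axes.
    """
    P = np.asarray(keyframe_positions, dtype=float)
    centered = P - P.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[-1]   # right singular vector with the smallest singular value

def place_virtual_keyframe(keyframe_positions, offset=1.0):
    """Position a virtual keyframe offset along the most orthogonal direction."""
    P = np.asarray(keyframe_positions, dtype=float)
    return P.mean(axis=0) + offset * most_orthogonal_direction(P)
```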
13. The augmented reality system of claim 12, [ the processor further executing the instructions to select the virtual keyframe based at least in part upon a viewpoint for the virtual keyframe that is used to identify the set of new map points from the plurality of keyframes, ] wherein the most orthogonal direction is determined along one of an x coordinate [ axis] , a y coordinate [ axis] , and a z coordinate [ axis] .
14. The augmented reality system of claim 12, [ the processor further executing the instructions to render a feature buffer for rendering lines for determining the set of new map points, ] wherein the processor renders [ the ] lines from the virtual keyframe to the set of map points, and determines the [ set of ] new map points based at least in part on one or more points of intersection of the rendered lines.
15. The augmented reality system of claim 14, [ the processor further executing the instructions to select a candidate location of one or more features based at least in part upon a characteristic of the candidate location stored in the feature buffer; store a virtual feature for the candidate location; and create a mask radius indicative of an error of a virtual camera around the candidate location, ] wherein the processor applies a summing [ the feature ] buffer to determine the [ one or more ] points of intersection.
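Claims 14 and 15 determine points of intersection with a summing (feature) buffer, select candidate locations, and create a mask radius reflecting virtual-camera error around each candidate. The peak-picking sketch below assumes a 2-D summing buffer, a pixel-unit mask radius, and a positive score threshold.

```python
import numpy as np

def select_candidates(summing_buffer, mask_radius, threshold):
    """Pick peaks from a summing buffer, suppressing neighbours within mask_radius."""
    buf = summing_buffer.astype(float).copy()
    picks = []
    while True:
        idx = np.unravel_index(np.argmax(buf), buf.shape)
        if buf[idx] < threshold:
            break
        picks.append(idx)
        r0, c0 = idx
        rr, cc = np.ogrid[:buf.shape[0], :buf.shape[1]]
        # zero out the error radius around the accepted candidate
        buf[(rr - r0) ** 2 + (cc - c0) ** 2 <= mask_radius ** 2] = 0.0
    return picks
```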
16. The augmented reality system of claim 12, wherein the processor renders triangular cones from the virtual keyframe to the set of map points, and determines [ the processor further executes the instructions to render a feature buffer and a label buffer that are used in rendering lines from the position determined for each keyframe of the plurality of keyframes to each map point of the set of map points and to determine ] the [ set of ] new map points based at least in part on one or more points of intersection [ of the lines] .
17. The augmented reality system of claim 16, wherein the processor performs a bundle adjust to correct a location of a new map point of the set of new map points [ , wherein the bundle adjust alleviates a stress caused by the location of the new map point and a geometric map at least by radially pushing the stress out in one or more waves that propagate from the location of the new map point to a network of nodes that surround the location of the new map point, and the geometric map represents the plurality of keyframes as respective nodes and connects the respective nodes with a plurality of edges, and a thickness of an edge indicates strength of the edge] .
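Amended claim 17 describes relieving the stress introduced by a corrected map point by pushing it outward in waves across the geometric map of keyframe nodes. A breadth-first relaxation that damps the correction ring by ring, as in the assumed sketch below, captures that propagation pattern; a production system would instead run a full bundle adjustment over poses and points.

```python
from collections import deque

def propagate_adjustment(neighbors, start, delta, damping=0.5, max_waves=3):
    """Push a correction outward in waves from the adjusted node.

    neighbors: dict mapping node -> iterable of connected nodes (the geometric map).
    Returns node -> fraction of `delta` applied, decaying by `damping` per ring.
    """
    applied = {start: delta}
    frontier, seen, wave = deque([start]), {start}, 0
    while frontier and wave < max_waves:
        wave += 1
        for _ in range(len(frontier)):       # process one ring of the graph
            node = frontier.popleft()
            for nb in neighbors.get(node, ()):
                if nb not in seen:
                    seen.add(nb)
                    applied[nb] = delta * damping ** wave
                    frontier.append(nb)
    return applied
```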
18. The augmented reality system of claim 11, [ further comprising:
one or more outward-facing image sensors that are communicatively coupled to the second printed circuit board that comprises an audio connector; and
a microphone that is operatively coupled to the second printed circuit board through the audio connector, ] wherein
the set of new map points are added to a map of the real world [ ,
the one or more outward-facing image sensors are integrated into a head-worn portion of the augmented reality system, and
the head-worn portion of the augmented reality system is wearable by a user] .
19. The augmented reality system of claim 18, [ further comprising a belt pack that is separate from and is operatively coupled to the head-worn portion of the augmented reality system with a cable, ] wherein [ the belt pack comprises a battery, and ] virtual content is delivered to one or more augmented reality display systems based at least in part on the [ a ] map of the real world.
[ 20. The augmented reality system of claim 1, further comprising:
a photosensor responsive to electromagnetic energy in an infrared range; and
an eye sensor that is positioned near a bottom of a frame of a head-wearable portion of the augmented reality system and is used to track movement of an eye of a user wearing at least the head-wearable portion of the augmented reality system.]
[ 21. The augmented reality system of claim 20, further comprising a depth sensor, wherein the processor further executes the instructions to:
detect a user gesture by using at least the depth sensor;
encode a Z-buffer in a color space;
encode a plurality of depth planes as respective layer-masks in individual color channels; and
blend multiple levels of depth data into a single frame.]
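New claim 21 encodes a Z-buffer in a color space, with depth planes packed as layer-masks in individual color channels and multiple levels of depth data blended into a single frame. The sketch below assumes 24 planes packed one bit each into the 8 bits of three color channels; the quantization range and plane count are illustration choices, not claim limitations.

```python
import numpy as np

def encode_depth_planes(depth, num_planes=24, z_near=0.5, z_far=4.0):
    """Quantise a Z-buffer into depth planes and pack them as per-channel bit masks.

    Each pixel's plane index is stored as a single set bit, 8 planes per
    colour channel, so one RGB frame carries 24 layer-masks at once.
    """
    z = np.clip((np.asarray(depth, dtype=float) - z_near) / (z_far - z_near), 0.0, 1.0)
    plane = np.minimum((z * num_planes).astype(np.uint32), num_planes - 1)
    channel, bit = np.divmod(plane, 8)       # which channel, which bit within it
    rgb = np.zeros(plane.shape + (3,), dtype=np.uint8)
    for ch in range(3):
        sel = channel == ch
        rgb[..., ch][sel] = np.left_shift(1, bit[sel]).astype(np.uint8)
    return rgb
```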