US RE50,637 E1
Augmented reality using projector-camera enabled devices
Aditi Majumder, Irvine, CA (US); and Behzad Sajadi, New York, NY (US)
Assigned to THE REGENTS OF THE UNIVERSITY OF CALIFORNIA, Oakland, CA (US)
Filed by The Regents of the University of California, Oakland, CA (US)
Filed on Jun. 22, 2017, as Appl. No. 15/630,550.
Application 15/630,550 is a reissue of application No. 13/625,657, filed on Sep. 24, 2012, granted, now 9,064,312, issued on Jun. 23, 2015.
Claims priority of provisional application 61/538,414, filed on Sep. 23, 2011.
Int. Cl. G09G 5/00 (2006.01); G06T 3/00 (2024.01); G06T 3/14 (2024.01); G06T 7/00 (2017.01); G06T 7/521 (2017.01); G09F 19/22 (2006.01); H04N 9/31 (2006.01)
CPC G06T 3/14 (2024.01) [G06T 7/521 (2017.01); G09F 19/226 (2013.01); H04N 9/3147 (2013.01); H04N 9/3185 (2013.01); H04N 9/3194 (2013.01); G06T 2207/10028 (2013.01)] 53 Claims
OG exemplary drawing
 
1. A system for projecting an image of a scene onto an arbitrary surface that is non-planar in both horizontal and vertical directions, the system comprising:
a first un-calibrated camera coupled to [ un-calibrated in relative camera calibration parameters; ]
a first projector, the first un-calibrated camera being disposed to capture at least [ a part of at least ] a first portion of the image projected onto the arbitrary surface by the first projector, wherein the first portion of the image includes a first point;
an adjacent un-calibrated [ a second ] camera coupled to [ un-calibrated in relative camera calibration parameters; ]
a second projector, wherein the adjacent un-calibrated [ second ] camera is disposed to capture at least [ a part of at least ] a second portion of the image projected onto the arbitrary surface by the second projector, wherein the second portion of the image includes the first point;
[ wherein the first camera and the second camera are interchangeably coupled and uncoupled from the first projector and the second projector such that any camera can be coupled to any projector; ] and
a [ one or more ] processing computing device coupled to the first camera configured to [ coupled to a non-transitory computer readable storage medium that includes computer readable code that, when executed by the one or more processing computing devices, causes the one or more processing computing devices to] :
[ perform configuration identification of the projectors and cameras based on an adjacency graph representation in context of overlap in their display space or sensing space, ]
recognize [ reconstruct ] an underlying surface geometry of the arbitrary surface by analyzing data in the at least [ the part of at least the ] first portion of the image captured by the first camera and [ , ] the at least [ the part of at least the ] second portion of the image captured by the adjacent [ second ] camera, [ or a combination thereof, ]
register an overlapping portion of the image of the scene, that is projected by the first projector and projected by the second projector, within a three dimensional coordinate system for display on the arbitrary surface [ based on an analysis of the data in the at least the part of at least the first portion of the image captured by the first camera, the at least the part of at least the second portion of the image captured by the second camera, or a combination thereof] , and
control the at least one of the first projector and the second projector to display the overlapping portion of the image of the registered scene onto the arbitrary surface based on an analysis of the data in the at least first portion of the image captured by the first camera and the at least second portion of the image captured by the adjacent camera.
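The claim's "configuration identification of the projectors and cameras based on an adjacency graph representation in context of overlap in their display space" can be illustrated with a minimal sketch. This is not the patented method itself: for illustration, each projector's footprint on the display surface is approximated by a hypothetical 2D axis-aligned bounding box, whereas the claimed system operates on arbitrary non-planar surfaces via un-calibrated cameras.

```python
# Hypothetical sketch: build an adjacency graph of projectors whose
# display-space footprints overlap. Footprints are assumed here to be
# axis-aligned boxes (xmin, ymin, xmax, ymax) -- an illustrative
# simplification, not the patent's surface-geometry reconstruction.
from itertools import combinations

def boxes_overlap(a, b):
    """True if two axis-aligned footprints share any display-surface area."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def adjacency_graph(footprints):
    """Return {projector_id: set(neighbor_ids)} for overlapping footprints."""
    graph = {pid: set() for pid in footprints}
    for (pa, fa), (pb, fb) in combinations(footprints.items(), 2):
        if boxes_overlap(fa, fb):
            graph[pa].add(pb)
            graph[pb].add(pa)
    return graph

# Two projectors sharing a vertical overlap strip, plus one isolated projector:
footprints = {
    "P1": (0.0, 0.0, 1.0, 1.0),
    "P2": (0.8, 0.0, 1.8, 1.0),   # overlaps P1 in the strip x in [0.8, 1.0]
    "P3": (3.0, 0.0, 4.0, 1.0),   # no overlap with P1 or P2
}
graph = adjacency_graph(footprints)
```

In a graph built this way, each edge marks a pair of projectors whose overlapping portion of the image (such as the claim's "first point" seen by both cameras) must be registered within a common three-dimensional coordinate system before blended display.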