US 12,112,441 B2
Content transformations based on reflective object recognition
Yutaka Yokokawa, Belmont, CA (US); Devin W. Chalmers, Oakland, CA (US); Brian W. Temple, Santa Clara, CA (US); Rahul Nair, Santa Clara, CA (US); and Thomas G. Salter, Foster City, CA (US)
Assigned to Apple Inc., Cupertino, CA (US)
Filed by Apple Inc., Cupertino, CA (US)
Filed on Jun. 27, 2023, as Appl. No. 18/214,575.
Claims priority of provisional application 63/357,503, filed on Jun. 30, 2022.
Prior Publication US 2024/0005612 A1, Jan. 4, 2024
Int. Cl. G06F 3/01 (2006.01); G06T 7/246 (2017.01); G06T 7/73 (2017.01); G06T 19/00 (2011.01); G06V 20/50 (2022.01); H04S 7/00 (2006.01)
CPC G06T 19/006 (2013.01) [G06F 3/012 (2013.01); G06T 7/246 (2017.01); G06T 7/73 (2017.01); G06V 20/50 (2022.01); H04S 7/303 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/30201 (2013.01); H04S 2400/11 (2013.01)] 23 Claims
OG exemplary drawing
 
1. A method comprising:
at an electronic device having a processor and a sensor:
obtaining sensor data from the sensor of the electronic device in a physical environment that includes one or more objects;
detecting a reflective surface of a reflective object amongst the one or more objects based on the sensor data and detecting a reflection of the electronic device upon the reflective surface of the reflective object;
determining a three-dimensional (3D) position of the reflective object in the physical environment based on determining a 3D position of the reflection of the electronic device; and
presenting virtual content within a view of an extended reality (XR) environment that includes a view of the physical environment, wherein the virtual content is positioned at a 3D location within the view of the XR environment based on the determined 3D position of the reflective object in the physical environment.
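The claim's core geometric step, recovering the reflective object's 3D position from the 3D position of the device's own reflection, can be illustrated with a short sketch. This is only one plausible reading of the claim, not the patent's disclosed implementation: assuming the device pose and the estimated position of its mirror image are expressed in the same world coordinates, the reflective surface lies on the perpendicular bisector plane between the device and its reflection. All names below (Vec3, Plane, estimateReflectiveSurface, virtualContentAnchor) are illustrative and do not appear in the patent.

```swift
import Foundation

/// Minimal 3D vector type for the sketch (no ARKit/RealityKit dependency assumed).
struct Vec3 {
    var x, y, z: Double
    static func + (a: Vec3, b: Vec3) -> Vec3 { Vec3(x: a.x + b.x, y: a.y + b.y, z: a.z + b.z) }
    static func - (a: Vec3, b: Vec3) -> Vec3 { Vec3(x: a.x - b.x, y: a.y - b.y, z: a.z - b.z) }
    static func * (a: Vec3, s: Double) -> Vec3 { Vec3(x: a.x * s, y: a.y * s, z: a.z * s) }
    var length: Double { (x * x + y * y + z * z).squareRoot() }
    var normalized: Vec3 { self * (1.0 / length) }
}

/// A plane described by a point on it and a unit normal.
struct Plane {
    var point: Vec3
    var normal: Vec3
}

/// Given the device's world position and the world position of its detected
/// reflection, the reflective surface lies on the perpendicular bisector
/// plane between the two: its point is the midpoint of the segment, and its
/// normal points from the reflection toward the device.
func estimateReflectiveSurface(devicePosition: Vec3, reflectionPosition: Vec3) -> Plane {
    let midpoint = (devicePosition + reflectionPosition) * 0.5
    let normal = (devicePosition - reflectionPosition).normalized
    return Plane(point: midpoint, normal: normal)
}

/// Place virtual content a small offset in front of the estimated surface so
/// it appears anchored to the reflective object in the XR view.
func virtualContentAnchor(on surface: Plane, offset: Double = 0.02) -> Vec3 {
    surface.point + surface.normal * offset
}

// Example: device at the origin; its reflection is estimated 2 m away along +z.
let device = Vec3(x: 0, y: 0, z: 0)
let reflection = Vec3(x: 0, y: 0, z: 2)
let mirror = estimateReflectiveSurface(devicePosition: device, reflectionPosition: reflection)
let anchor = virtualContentAnchor(on: mirror)
print("Mirror plane point: \(mirror.point), normal: \(mirror.normal)")
print("Virtual content anchor: \(anchor)")
```

In this sketch the mirror plane passes through (0, 0, 1) with normal (0, 0, -1), so the virtual content is anchored just in front of the reflective surface on the device's side, consistent with positioning virtual content "based on the determined 3D position of the reflective object" as recited in the claim.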