US 12,327,324 B2
Image processing apparatus, image processing method, and program
Akihiko Kaino, Kanagawa (JP); Masaki Fukuchi, Tokyo (JP); Tatsuki Kashitani, Tokyo (JP); Kenichiro Ooi, Kanagawa (JP); and Jingjing Guo, Tokyo (JP)
Assigned to SONY GROUP CORPORATION, Tokyo (JP)
Filed by Sony Group Corporation, Tokyo (JP)
Filed on Feb. 29, 2024, as Appl. No. 18/591,011.
Application 18/591,011 is a continuation of application No. 17/938,957, filed on Sep. 7, 2022, granted, now 11,941,766.
Application 17/938,957 is a continuation of application No. 17/140,144, filed on Jan. 4, 2021, granted, now 11,468,647, issued on Oct. 11, 2022.
Application 17/140,144 is a continuation of application No. 16/587,070, filed on Sep. 30, 2019, granted, now 10,902,682, issued on Jan. 26, 2021.
Application 16/587,070 is a continuation of application No. 16/051,893, filed on Aug. 1, 2018, granted, now 10,453,266, issued on Oct. 22, 2019.
Application 16/051,893 is a continuation of application No. 15/459,711, filed on Mar. 15, 2017, granted, now 10,068,382, issued on Sep. 4, 2018.
Application 15/459,711 is a continuation of application No. 14/994,950, filed on Jan. 13, 2016, granted, now 9,626,806, issued on Apr. 18, 2017.
Application 14/994,950 is a continuation of application No. 13/824,140, granted, now 9,292,974, issued on Mar. 22, 2016, previously published as PCT/JP2012/005582, filed on Sep. 4, 2012.
Claims priority of application No. 2011-235749 (JP), filed on Oct. 27, 2011.
Prior Publication US 2024/0203070 A1, Jun. 20, 2024
Int. Cl. G06T 19/00 (2011.01); G06T 7/70 (2017.01); G06T 7/73 (2017.01); G06T 19/20 (2011.01); H04N 7/18 (2006.01)
CPC G06T 19/006 (2013.01) [G06T 7/70 (2017.01); G06T 7/73 (2017.01); G06T 19/20 (2013.01); H04N 7/183 (2013.01); G06T 2207/10021 (2013.01); G06T 2207/30244 (2013.01); G06T 2219/004 (2013.01)] 21 Claims
OG exemplary drawing
 
1. An information processing system, comprising:
circuitry configured to
receive a captured image including a real object in a physical environment,
receive at least one of detected three-dimensional position information and posture information of a camera relative to the real object,
initiate to a display an appearance of an augmented reality (AR) object based on the real object, and
control the display to continue displaying the AR object associated with the real object when the real object in the physical environment is no longer detected on the display based on the detected information.
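The claimed behavior can be illustrated with a minimal sketch. This is a hypothetical illustration only, not the patent's implementation: the `ARRenderer` class, its `on_frame` method, and the simplified 3-tuple pose are all invented names for this example. The key element is that, once the AR object has been shown, rendering continues at the last detected camera pose even after detection of the real object fails.

```python
# Hypothetical sketch of the claim's persistence behavior (not from the patent):
# keep rendering the AR object at the last known pose after tracking is lost.
from dataclasses import dataclass
from typing import Optional, Tuple

# Simplified stand-in for the detected 3-D position/posture of the
# camera relative to the real object.
Pose = Tuple[float, float, float]

@dataclass
class ARRenderer:
    last_pose: Optional[Pose] = None  # most recent detected pose
    displaying: bool = False          # AR object has appeared at least once

    def on_frame(self, detected_pose: Optional[Pose]) -> Optional[Pose]:
        """Return the pose at which to render the AR object, or None."""
        if detected_pose is not None:
            # Real object detected in the captured image: update the
            # stored pose and initiate display of the AR object.
            self.last_pose = detected_pose
            self.displaying = True
        # When detection fails but the AR object was already displayed,
        # continue displaying it at the last known pose.
        return self.last_pose if self.displaying else None
```

In this sketch, a frame with no detection before any appearance yields nothing, while a frame with no detection after an appearance still returns the previously stored pose.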