US 11,989,938 B2
Real-time tracking-compensated image effects
Samuel Edward Hare, Los Angeles, CA (US); Fedir Poliakov, Marina Del Rey, CA (US); Guohui Wang, Los Angeles, CA (US); Xuehan Xiong, Los Angeles, CA (US); Jianchao Yang, Los Angeles, CA (US); Linjie Yang, Los Angeles, CA (US); and Shah Tanmay Anilkumar, Los Angeles, CA (US)
Assigned to Snap Inc., Santa Monica, CA (US)
Filed by Snap Inc., Santa Monica, CA (US)
Filed on May 4, 2023, as Appl. No. 18/312,479.
Application 18/312,479 is a continuation of application No. 17/248,393, filed on Jan. 22, 2021, granted, now 11,676,381.
Application 17/248,393 is a continuation of application No. 16/654,898, filed on Oct. 16, 2019, granted, now 10,929,673.
Application 16/654,898 is a continuation of application No. 15/706,096, filed on Sep. 15, 2017, granted, now 10,474,900.
Prior Publication US 2023/0274543 A1, Aug. 31, 2023
Int. Cl. G06V 20/40 (2022.01); G06T 1/20 (2006.01); G06T 7/246 (2017.01)
CPC G06V 20/40 (2022.01) [G06T 1/20 (2013.01); G06T 7/248 (2017.01); G06V 20/46 (2022.01); G06T 2200/28 (2013.01); G06T 2207/10016 (2013.01); G06T 2207/20081 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising:
generating, using one or more processors of a device, a video sequence comprising a plurality of frames;
detecting a lag of an editing engine configured to apply a machine learning scheme to the video sequence, a previous frame having a corresponding modified previous frame and a current frame that does not have a corresponding modified current frame;
in response to detecting the lag, generating a map between the previous frame and the current frame; and
generating the corresponding modified current frame by applying the map to the modified previous frame.
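
The following is a minimal illustrative sketch, not the patented implementation, of the technique recited in claim 1: when the machine-learning editing engine lags and has not yet produced a modified current frame, a map between the previous frame and the current frame is generated and applied to the already-modified previous frame to stand in for the missing output. The sketch assumes dense optical flow (OpenCV Farneback) as one possible form of the claimed map; the function names, the non-blocking engine.try_get_modified call, and the flow parameters are hypothetical and are not specified by the patent.

    # Illustrative sketch only; names and parameters are assumptions.
    import cv2
    import numpy as np

    def warp_modified_previous(prev_frame, curr_frame, modified_prev):
        """Approximate the missing modified current frame by applying a
        frame-to-frame map (dense optical flow here) to the modified
        previous frame."""
        prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
        curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)

        # Map between the current frame and the previous frame:
        # for each pixel of the current frame, the flow points to its
        # location in the previous frame.
        # Positional args: pyr_scale, levels, winsize, iterations,
        # poly_n, poly_sigma, flags.
        flow = cv2.calcOpticalFlowFarneback(
            curr_gray, prev_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)

        # Sampling coordinates into the previous frame for every
        # current-frame pixel.
        h, w = curr_gray.shape
        grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
        map_x = (grid_x + flow[..., 0]).astype(np.float32)
        map_y = (grid_y + flow[..., 1]).astype(np.float32)

        # Apply the map to the modified previous frame to synthesize a
        # stand-in modified current frame.
        return cv2.remap(modified_prev, map_x, map_y, cv2.INTER_LINEAR)

    def render_frame(engine, prev_frame, curr_frame, modified_prev):
        """Prefer the editing engine's output; fall back to warping when
        the engine lags.  engine.try_get_modified is a hypothetical
        non-blocking call returning None while the engine is behind."""
        modified_curr = engine.try_get_modified(curr_frame)
        if modified_curr is None:  # lag detected: no modified current frame
            modified_curr = warp_modified_previous(
                prev_frame, curr_frame, modified_prev)
        return modified_curr

In this sketch the flow is computed from the current frame back to the previous frame so that cv2.remap can sample the modified previous frame directly; any transformation relating corresponding pixels in the two frames (for example a homography or sparse feature tracks) could serve as the map in the claim.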