US 12,452,384 B2
User interface for pose driven virtual effects
Amir Alavi, Los Angeles, CA (US); Olha Rykhliuk, Marina Del Rey, CA (US); Xintong Shi, Los Angeles, CA (US); Jonathan Solichin, Arcadia, CA (US); Olesia Voronova, Santa Monica, CA (US); and Artem Yagodin, Playa del Rey, CA (US)
Assigned to Snap Inc., Santa Monica, CA (US)
Filed by Snap Inc., Santa Monica, CA (US)
Filed on Nov. 27, 2023, as Appl. No. 18/520,255.
Application 18/520,255 is a continuation of application No. 17/445,043, filed on Aug. 13, 2021, granted, now 11,832,015.
Claims priority of provisional application 62/706,391, filed on Aug. 13, 2020.
Prior Publication US 2024/0098211 A1, Mar. 21, 2024
This patent is subject to a terminal disclaimer.
Int. Cl. H04N 5/262 (2006.01); G06F 3/01 (2006.01); G06T 7/246 (2017.01); G06T 11/00 (2006.01); G06V 40/20 (2022.01); H04N 23/611 (2023.01); H04N 23/63 (2023.01)
CPC H04N 5/2621 (2013.01) [G06F 3/017 (2013.01); G06T 7/251 (2017.01); G06T 11/00 (2013.01); G06V 40/28 (2022.01); H04N 23/611 (2023.01); H04N 23/631 (2023.01); G06T 2207/10016 (2013.01); G06T 2207/30196 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising:
causing display of a skeletal pose tracking system on a graphical user interface of a computing device, the graphical user interface comprising an image of a human body;
receiving, from the computing device, a selection of one or more regions of the human body;
receiving augmented reality effect data for each of the one or more regions of the human body, the augmented reality effect data comprising a virtual effect, a trigger condition and an accuracy threshold; and
causing modification of a video comprising a human user based on the received augmented reality effect data.
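The claimed method associates, per selected body region, a virtual effect, a trigger condition, and an accuracy threshold, and applies the effect to a video when the tracked pose satisfies both the trigger and the threshold. A minimal sketch of that per-frame evaluation is below; all names (`RegionPose`, `EffectData`, `effects_for_frame`, the gesture labels) are hypothetical illustrations, not the patent's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class RegionPose:
    """Hypothetical skeletal-tracking output for one body region."""
    gesture: str       # e.g. "arm_raised"
    confidence: float  # tracking confidence in [0, 1]

@dataclass
class EffectData:
    """Per-region AR effect data, mirroring the claim: a virtual effect,
    a trigger condition, and an accuracy threshold."""
    virtual_effect: str                    # e.g. "sparkles"
    trigger: Callable[[RegionPose], bool]  # trigger condition on the pose
    accuracy_threshold: float              # minimum tracking confidence

def effects_for_frame(effects: Dict[str, EffectData],
                      frame_pose: Dict[str, RegionPose]) -> List[str]:
    """Return the virtual effects to apply to the current video frame."""
    applied = []
    for region, data in effects.items():
        pose = frame_pose.get(region)
        if pose is None:
            continue
        # Apply only when tracking accuracy suffices AND the trigger fires.
        if pose.confidence >= data.accuracy_threshold and data.trigger(pose):
            applied.append(data.virtual_effect)
    return applied

# Illustrative configuration: two selected regions with distinct thresholds.
effects = {
    "right_arm": EffectData("sparkles", lambda p: p.gesture == "arm_raised", 0.8),
    "head": EffectData("halo", lambda p: p.gesture == "nod", 0.9),
}
frame = {
    "right_arm": RegionPose("arm_raised", 0.92),
    "head": RegionPose("nod", 0.85),  # below the 0.9 accuracy threshold
}
print(effects_for_frame(effects, frame))  # ['sparkles']
```

In this sketch the head effect is suppressed despite its trigger matching, because the tracking confidence falls short of that region's accuracy threshold, which is the gating role the claim assigns to the threshold.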