US 11,670,059 B2
Controlling interactive fashion based on body gestures
Itamar Berger, Hod Hasharon (IL); Gal Dudovitch, Tel Aviv (IL); Gal Sasson, Kibbutz Ayyelet Hashahar (IL); Ma'ayan Shuvi, Tel Aviv (IL); and Matan Zohar, Rishon LeZion (IL)
Assigned to SNAP INC., Santa Monica, CA (US)
Filed by Snap Inc., Santa Monica, CA (US)
Filed on Sep. 1, 2021, as Appl. No. 17/446,691.
Prior Publication US 2023/0065031 A1, Mar. 2, 2023
Int. Cl. G06T 19/00 (2011.01)
CPC G06T 19/006 (2013.01) 20 Claims
OG exemplary drawing
 
1. A method comprising:
receiving, by one or more processors of a client device, a video that includes a depiction of a person wearing a fashion item;
generating, by the one or more processors, a segmentation of the fashion item worn by the person depicted in the video;
applying one or more augmented reality elements to the fashion item worn by the person based on the segmentation of the fashion item worn by the person;
detecting a gesture performed by the person in the video;
determining that the gesture corresponds to a position of a body part of the person overlapping a given portion of the segmentation of the fashion item; and
in response to determining that the position of the body part of the person overlaps the given portion of the segmentation of the fashion item, adjusting one or more attributes of the one or more augmented reality elements that have been applied to the fashion item worn by the person.
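
The steps recited in claim 1 (segmenting the fashion item, applying augmented reality elements, detecting a body-part gesture that overlaps the segmentation, and adjusting an attribute of the applied elements) can be illustrated with a minimal, hypothetical per-frame sketch. The sketch below uses plain NumPy; the function names (segment_fashion_item, detect_body_part, apply_ar_elements), the placeholder segmentation and gesture logic, and the choice of a brightness value as the adjusted attribute are assumptions for illustration only and are not disclosed by, or representative of, the patented implementation.

    # Hypothetical sketch of the claim-1 pipeline; all helpers are illustrative
    # placeholders, not the patent's disclosed implementation.
    import numpy as np


    def segment_fashion_item(frame: np.ndarray) -> np.ndarray:
        """Return a boolean mask of the fashion item (placeholder: central region)."""
        mask = np.zeros(frame.shape[:2], dtype=bool)
        h, w = mask.shape
        mask[h // 4: 3 * h // 4, w // 4: 3 * w // 4] = True
        return mask


    def detect_body_part(frame: np.ndarray) -> tuple[int, int]:
        """Return (row, col) of a tracked body part, e.g. a fingertip (placeholder)."""
        h, w = frame.shape[:2]
        return h // 2, w // 2


    def apply_ar_elements(frame: np.ndarray, mask: np.ndarray, brightness: float) -> np.ndarray:
        """Overlay a simple AR effect on the segmented fashion item (placeholder tint)."""
        out = frame.astype(float)
        out[mask] = np.clip(out[mask] * brightness, 0, 255)
        return out.astype(np.uint8)


    def process_video(frames, brightness: float = 1.2):
        """Per-frame pipeline mirroring the steps of claim 1."""
        for frame in frames:
            mask = segment_fashion_item(frame)      # generate segmentation of the fashion item
            row, col = detect_body_part(frame)      # detect gesture / body-part position
            if mask[row, col]:                      # body part overlaps the segmentation
                brightness += 0.05                  # adjust an attribute of the AR elements
            yield apply_ar_elements(frame, mask, brightness)


    if __name__ == "__main__":
        # Synthetic two-frame "video" for demonstration.
        video = [np.full((240, 320, 3), 128, dtype=np.uint8) for _ in range(2)]
        for i, out in enumerate(process_video(video)):
            print(f"frame {i}: mean intensity {out.mean():.1f}")

In this sketch the "gesture" is reduced to a single tracked point whose overlap with the segmentation mask triggers the attribute adjustment; the claim itself is not limited to any particular segmentation, gesture-detection, or rendering technique.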