US 12,272,341 B2
Controllable music generation
Zhihao Ouyang, Los Angeles, CA (US); and Keunwoo Choi, Los Angeles, CA (US)
Assigned to LEMON INC., Grand Cayman (KY)
Filed by LEMON INC., Grand Cayman (KY)
Filed on Nov. 8, 2021, as Appl. No. 17/521,435.
Prior Publication US 2023/0147185 A1, May 11, 2023
Int. Cl. G10H 1/00 (2006.01); G06N 20/00 (2019.01)
CPC G10H 1/0025 (2013.01) [G06N 20/00 (2019.01); G10H 2210/036 (2013.01); G10H 2210/115 (2013.01); G10H 2210/571 (2013.01); G10H 2250/005 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method, comprising:
extracting latent vectors from unlabelled data by an encoder, the unlabelled data comprising a plurality of music note sequences, the plurality of music note sequences indicating a plurality of pieces of music;
clustering the latent vectors into a plurality of classes corresponding to a plurality of music styles by a machine learning model;
generating a plurality of labelled latent vectors corresponding to the plurality of music styles by the machine learning model, each of the plurality of labelled latent vectors comprising information indicating features of a corresponding music style among the plurality of music styles; and
generating, by a decoder, a first music note sequence indicating a first piece of music in a particular music style among the plurality of music styles based at least in part on a particular labelled latent vector among the plurality of labelled latent vectors, the particular labelled latent vector corresponding to the particular music style.
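The claimed pipeline can be illustrated with a minimal, hypothetical sketch: a toy encoder maps note sequences to latent vectors, a simple k-means step clusters them into style classes whose centroids play the role of the "labelled latent vectors," and a toy decoder generates a new note sequence conditioned on one such vector. Every component below (the statistics-based encoder, the k-means routine, the sampling decoder, and the synthetic pitch data) is an illustrative stand-in, not the models described in the patent.

```python
# Hypothetical sketch of the claimed flow; none of these toy components
# are the patented encoder, clustering model, or decoder.
import numpy as np

rng = np.random.default_rng(0)

def encode(note_sequences):
    """Toy encoder: summary statistics of each note sequence as its latent vector."""
    return np.array([[np.mean(s), np.std(s), np.ptp(s)] for s in note_sequences])

def kmeans(latents, k, iters=50):
    """Minimal k-means: cluster latent vectors into k style classes."""
    centroids = latents[rng.choice(len(latents), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(latents[:, None] - centroids[None], axis=2)
        labels = np.argmin(dists, axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = latents[labels == j].mean(axis=0)
    # Each centroid acts as a "labelled latent vector" for one style class.
    return labels, centroids

def decode(labelled_latent, length=8):
    """Toy decoder: sample a MIDI-like pitch sequence conditioned on a style vector."""
    mean, std = labelled_latent[0], max(labelled_latent[1], 1e-6)
    notes = rng.normal(mean, std, size=length).round()
    return np.clip(notes, 0, 127).astype(int)

# Unlabelled data: synthetic pitch sequences drawn from two informal "styles".
data = [rng.integers(40, 60, 16) for _ in range(20)] + \
       [rng.integers(70, 90, 16) for _ in range(20)]
latents = encode(data)                  # extract latent vectors
labels, styles = kmeans(latents, k=2)   # cluster into style classes
new_piece = decode(styles[0])           # generate a sequence in style 0
```

Under this reading, the key design point of the claim is that no style labels exist in the training data: the style classes emerge from clustering the latent space, and the resulting labelled latent vectors give the decoder a controllable handle on which style to generate.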