US 12,131,260 B2
Discriminative cosine embedding in machine learning
Marios Savvides, Pittsburgh, PA (US); and Dipan Kumar Pal, Pittsburgh, PA (US)
Assigned to Carnegie Mellon University, Pittsburgh, PA (US)
Filed by Carnegie Mellon University, Pittsburgh, PA (US)
Filed on Apr. 10, 2023, as Appl. No. 18/132,509.
Application 18/132,509 is a continuation of application No. 16/299,498, filed on Mar. 12, 2019, granted, now 11,636,344.
Claims priority of provisional application 62/761,144, filed on Mar. 12, 2018.
Prior Publication US 2023/0281454 A1, Sep. 7, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. G06N 3/084 (2023.01); G06F 17/16 (2006.01); G06N 20/00 (2019.01)
CPC G06N 3/084 (2013.01) [G06F 17/16 (2013.01); G06N 20/00 (2019.01)] 11 Claims
OG exemplary drawing
 
1. A method, in a deep neural network, for training the network to learn increased discrimination of feature vectors, the method comprising:
inputting a batch of training samples to the deep neural network;
receiving a batch of feature vectors generated by the deep neural network; and
refining the deep neural network by backpropagating a loss representing differences between the batch of feature vectors and ground truth results into the deep neural network, the loss determined by:
a primary loss function; and
a Copernican loss (Lc) comprising:
a planetary loss function that minimizes intra-class variation of the feature vectors by minimizing a cosine distance of the feature vectors to their corresponding class centers; and
a sun loss function that maximizes the inter-class variation of the feature vectors by maximizing a cosine distance of the feature vectors away from a mean of the batch of training samples;
wherein the Copernican loss is given by:
Lc = Lprimary + λLp + Ls
where:
λ is a loss weight;
Lprimary is the primary loss function;
Lp is the planetary loss function; and
Ls is the sun loss function.
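
To make the structure of the claimed loss concrete, the sketch below computes a Copernican-style loss in plain NumPy. It is an illustrative reading of the claim only: the names (copernican_loss, cosine_distance, lam, class_centers) are not from the patent, the primary loss is passed in as a precomputed scalar, the class centers are assumed to be maintained elsewhere (for example as running means of per-class features), the batch mean of the feature vectors is assumed as the "sun," and the sun loss is expressed as minimizing cosine similarity to that mean, which is one way to "maximize a cosine distance" inside a loss that is minimized overall.

```python
import numpy as np

def cosine_distance(a, b):
    """Cosine distance (1 - cosine similarity) between corresponding row vectors of a and b."""
    a_n = a / (np.linalg.norm(a, axis=-1, keepdims=True) + 1e-12)
    b_n = b / (np.linalg.norm(b, axis=-1, keepdims=True) + 1e-12)
    return 1.0 - np.sum(a_n * b_n, axis=-1)

def copernican_loss(features, labels, class_centers, primary_loss, lam=1.0):
    """Illustrative forward computation of Lc = Lprimary + lam * Lp + Ls.

    features:      (batch, dim) feature vectors produced by the network
    labels:        (batch,) integer class labels for the batch
    class_centers: (num_classes, dim) class centers (assumed maintained externally)
    primary_loss:  precomputed scalar value of the primary loss
    lam:           loss weight applied to the planetary term
    """
    # Planetary loss Lp: minimize the cosine distance of each feature
    # vector to its corresponding class center.
    centers = class_centers[labels]                       # (batch, dim)
    l_p = np.mean(cosine_distance(features, centers))

    # Sun loss Ls: maximize cosine distance from the batch mean ("sun"),
    # expressed here as minimizing cosine similarity, since
    # distance = 1 - similarity. Using the mean of the batch feature
    # vectors as the "sun" is an assumption of this sketch.
    sun = np.mean(features, axis=0, keepdims=True)        # (1, dim)
    l_s = np.mean(1.0 - cosine_distance(features, sun))

    # Total Copernican loss as in the claim: Lc = Lprimary + lam*Lp + Ls
    return primary_loss + lam * l_p + l_s
```

In practice the same computation would be written in an autodiff framework so that the combined loss can be backpropagated into the deep neural network as the claim describes; the NumPy version above only illustrates the arithmetic of the three terms.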