US 11,855,970 B2
Systems and methods for blind multimodal learning
Gharib Gharibi, Overland Park, MO (US); Greg Storm, Kansas City, MO (US); Ravi Patel, Kansas City, MO (US); and Riddhiman Das, Parkville, MO (US)
Assigned to TripleBlind, Inc., Kansas City, MO (US)
Filed by TripleBlind, Inc., Kansas City, MO (US)
Filed on Sep. 7, 2022, as Appl. No. 17/939,351.
Application 17/939,351 is a continuation of application No. 17/743,887, filed on May 13, 2022, granted, now 11,531,782.
Application 17/743,887 is a continuation of application No. 17/742,808, filed on May 12, 2022, granted, now 11,599,671.
Application 17/742,808 is a continuation of application No. 17/180,475, filed on Feb. 19, 2021.
Application 17/180,475 is a continuation in part of application No. 17/176,530, filed on Feb. 16, 2021.
Application 17/176,530 is a continuation in part of application No. 16/828,085, filed on Mar. 24, 2020, granted, now 11,582,203.
Application 17/176,530 is a continuation of application No. 16/828,354, filed on Mar. 24, 2020, granted, now 10,924,460, issued on Feb. 16, 2021.
Application 17/180,475 is a continuation in part of application No. 16/828,420, filed on Mar. 24, 2020, granted, now 11,363,002, issued on Jun. 14, 2022.
Application 16/828,420 is a continuation in part of application No. 16/828,216, filed on Mar. 24, 2020.
Claims priority of provisional application 63/241,255, filed on Sep. 7, 2021.
Claims priority of provisional application 63/020,930, filed on May 6, 2020.
Claims priority of provisional application 62/948,105, filed on Dec. 13, 2019.
Prior Publication US 2023/0074339 A1, Mar. 9, 2023
Int. Cl. H04L 9/40 (2022.01); G06N 3/082 (2023.01); H04L 9/00 (2022.01); G06F 17/16 (2006.01); G06N 3/04 (2023.01); H04L 9/06 (2006.01); G06F 18/24 (2023.01); G06F 18/2113 (2023.01); G06N 3/098 (2023.01); G06N 3/048 (2023.01); G06F 16/13 (2019.01); G06F 21/62 (2013.01)
CPC H04L 63/0428 (2013.01) [G06F 16/13 (2019.01); G06F 17/16 (2013.01); G06F 18/2113 (2023.01); G06F 18/24 (2023.01); G06F 21/6245 (2013.01); G06N 3/04 (2013.01); G06N 3/048 (2023.01); G06N 3/082 (2013.01); G06N 3/098 (2023.01); H04L 9/008 (2013.01); H04L 9/0625 (2013.01); H04L 2209/46 (2013.01)] 19 Claims
OG exemplary drawing
 
1. A method comprising:
creating, at a server device and based on assembled data from n client devices, a neural network having n bottom portions and a top portion, wherein the assembled data comprises different types of data;
transmitting, from the server device, each respective bottom portion of the n bottom portions to a respective client device of the n client devices;
during a training iteration for training the neural network:
accepting, at the server device, a respective output from each respective bottom portion of the neural network to yield a plurality of respective outputs;
joining the plurality of respective outputs at a fusion layer on the server device to generate fused respective outputs;
passing the fused respective outputs to the top portion of the neural network;
carrying out a forward propagation step at the top portion of the neural network;
calculating a loss value after the forward propagation step;
calculating a set of gradients of the loss value with respect to server-side model parameters; and
passing, from the server device, respective subsets of the set of gradients of the fusion layer to a respective client device of the n client devices, wherein each of the n client devices calculates a local set of gradients corresponding to its local model, which is used to update local parameters associated with the respective local model on the respective client device to yield a respective trained bottom portion of the neural network; and
after training, receiving and combining the respective trained bottom portion of the neural network from each respective client device into a combined model.
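The steps of claim 1 describe a split-learning arrangement: each client trains a bottom portion of the network on its own modality, the server fuses the clients' activations at a fusion layer, runs the top portion forward, computes the loss, and returns to each client its subset of the fusion-layer gradients so the client can finish backpropagation locally. The following is a minimal single-process sketch of that flow, assuming PyTorch, two clients, concatenation as the fusion operation, and synthetic data; all class names, layer sizes, and the optimizer choice are illustrative assumptions and are not specified by the patent.

```python
# Illustrative sketch of the claim-1 training flow (not the patented implementation).
import torch
import torch.nn as nn

torch.manual_seed(0)

N_CLIENTS = 2                      # n client devices, each holding a different data type
CLIENT_INPUT_DIMS = [16, 32]       # e.g., tabular features vs. embedded image features
BOTTOM_OUT_DIM = 8                 # width of each bottom portion's output

class ClientBottom(nn.Module):
    """Bottom portion of the network, held and trained on a client device."""
    def __init__(self, in_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, BOTTOM_OUT_DIM), nn.ReLU())
    def forward(self, x):
        return self.net(x)

class FusionTop(nn.Module):
    """Fusion layer plus top portion, kept on the server device."""
    def __init__(self):
        super().__init__()
        self.top = nn.Sequential(nn.Linear(N_CLIENTS * BOTTOM_OUT_DIM, 16),
                                 nn.ReLU(), nn.Linear(16, 1))
    def forward(self, client_outputs):
        # "Fusion" here is simple concatenation of the clients' outputs (an assumption).
        fused = torch.cat(client_outputs, dim=1)
        return self.top(fused)

# Server creates the bottom portions and "transmits" one to each client.
bottoms = [ClientBottom(d) for d in CLIENT_INPUT_DIMS]
server_top = FusionTop()
loss_fn = nn.BCEWithLogitsLoss()
client_opts = [torch.optim.SGD(b.parameters(), lr=0.1) for b in bottoms]
server_opt = torch.optim.SGD(server_top.parameters(), lr=0.1)

# Synthetic, vertically partitioned batch: each client holds a different modality;
# labels are assumed to reside on the server.
batch = 64
client_data = [torch.randn(batch, d) for d in CLIENT_INPUT_DIMS]
labels = torch.randint(0, 2, (batch, 1)).float()

for step in range(100):
    for opt in client_opts + [server_opt]:
        opt.zero_grad()

    # Each client runs its bottom portion; only the activations travel to the server.
    client_outputs = [b(x) for b, x in zip(bottoms, client_data)]

    # Server side: treat the received activations as leaf inputs, join them at the
    # fusion layer, run the top portion forward, and compute the loss and the
    # gradients with respect to the server-side model parameters.
    server_inputs = [o.detach().requires_grad_(True) for o in client_outputs]
    logits = server_top(server_inputs)
    loss = loss_fn(logits, labels)
    loss.backward()
    server_opt.step()

    # Server passes each client its subset of the fusion-layer gradients; each client
    # continues backpropagation locally and updates its own bottom-portion parameters.
    for out, s_in, opt in zip(client_outputs, server_inputs, client_opts):
        out.backward(s_in.grad)
        opt.step()

# After training, the trained bottom portions are received and combined with the
# top portion into a single combined model.
class CombinedModel(nn.Module):
    def __init__(self, bottoms, top):
        super().__init__()
        self.bottoms = nn.ModuleList(bottoms)
        self.top = top
    def forward(self, inputs):
        return self.top([b(x) for b, x in zip(self.bottoms, inputs)])

combined = CombinedModel(bottoms, server_top)
print("final loss:", loss.item())
```

In this sketch the "transmission" steps are ordinary tensor hand-offs within one process; in the claimed method the activations and the fusion-layer gradient subsets would instead cross the network between the server device and the n client devices, so that raw data never leaves a client.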