US 11,792,010 B2
Distributed machine learning via secure multi-party computation and ensemble learning
Ramtin Mehdizadeh Seraj, Burnaby (CA); and Nicholas Chow, Vancouver (CA)
Assigned to Dapper Labs, Inc., Vancouver (CA)
Filed by Dapper Labs, Inc., Vancouver (CA)
Filed on Feb. 8, 2022, as Appl. No. 17/666,801.
Application 17/666,801 is a continuation of application No. 17/363,615, filed on Jun. 30, 2021, granted, now Pat. No. 11,265,166.
Claims priority of provisional application 63/046,362, filed on Jun. 30, 2020.
Prior Publication US 2022/0166624 A1, May 26, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 21/00 (2013.01); G06N 20/00 (2019.01); H04L 9/32 (2006.01); G06N 5/04 (2023.01); G06F 21/64 (2013.01)
CPC H04L 9/3218 (2013.01) [G06N 5/04 (2013.01); G06N 20/00 (2019.01)] 20 Claims
OG exemplary drawing
 
1. A method for providing distributed machine learning, the method comprising:
receiving, at a controller device, an electronic communication from a requester device, wherein the electronic communication comprises a request that corresponds with a single response;
parsing and translating, by the controller device, the request to determine a set of machine learning (ML) models that are each associated with a different computing node;
executing each ML model in the set of ML models, each executed ML model determining a corresponding response in a first set of responses to the request;
providing each response of the first set of responses to the request to a computational layer configured to:
assign a corresponding weight to each response of the first set of responses, and
generate a single combined response to the request based on the weighted first set of responses to the request; and
providing, by the controller device to the requester device, the single combined response to the request.
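Claim 1 recites a fan-out/combine pattern: a controller parses and translates the request into a set of ML models on different computing nodes, executes each model to obtain a first set of responses, and a computational layer weights those responses and returns a single combined response. The following is a minimal sketch of that flow, assuming plain Python, local stand-in models, and a weighted average as the combination rule; the class names (ModelNode, EnsembleLayer, Controller), the weight values, and the feature format are illustrative assumptions and are not taken from the patent.

    # Hypothetical sketch of the claimed flow; stand-in models and weights
    # are illustrative only and do not reflect the patented implementation.
    from dataclasses import dataclass
    from typing import Callable, Dict, List


    @dataclass
    class ModelNode:
        """A computing node hosting one ML model (here, a simple callable)."""
        name: str
        predict: Callable[[dict], float]


    class EnsembleLayer:
        """Computational layer: assigns a weight to each response and combines them."""

        def __init__(self, weights: Dict[str, float]):
            self.weights = weights

        def combine(self, responses: Dict[str, float]) -> float:
            # Weighted average of the per-node responses (one common choice;
            # the claim only requires some weighted combination).
            total_weight = sum(self.weights[name] for name in responses)
            return sum(self.weights[name] * value
                       for name, value in responses.items()) / total_weight


    class Controller:
        """Controller device: parses the request, fans out to nodes, returns one response."""

        def __init__(self, nodes: List[ModelNode], layer: EnsembleLayer):
            self.nodes = nodes
            self.layer = layer

        def handle(self, request: dict) -> float:
            # "Parse and translate" the request into model inputs (trivially here).
            features = request["features"]
            # Execute each ML model to obtain the first set of responses.
            responses = {node.name: node.predict(features) for node in self.nodes}
            # Provide the responses to the computational layer for weighting
            # and combination into a single response.
            return self.layer.combine(responses)


    if __name__ == "__main__":
        nodes = [
            ModelNode("node_a", lambda f: 0.8 * f["x"]),
            ModelNode("node_b", lambda f: 0.5 * f["x"] + 1.0),
        ]
        layer = EnsembleLayer(weights={"node_a": 0.7, "node_b": 0.3})
        controller = Controller(nodes, layer)
        print(controller.handle({"features": {"x": 2.0}}))  # single combined response

In this sketch the combination rule is a simple weighted average for concreteness; the claim itself only requires that the computational layer assign a weight to each response and generate one combined response from the weighted set.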