| CPC G06N 20/00 (2019.01) [H04L 67/10 (2013.01); G06F 11/3006 (2013.01); G06N 3/082 (2013.01); G06N 3/084 (2013.01); G06N 3/098 (2023.01); G06N 20/20 (2019.01); H04L 41/0654 (2013.01); H04L 41/149 (2022.05)] | 18 Claims |

1. A distributed training method performed by an edge cloud server connected to a central cloud server, the distributed training method comprising:
downloading a machine learning model for training from the central cloud server;
identifying a plurality of user terminals located in a local network coverage of the edge cloud server;
providing machine learning models to the plurality of user terminals based on the downloaded machine learning model and current load conditions or current performance conditions of the plurality of user terminals;
aggregating results of the training of the machine learning models from the plurality of user terminals when the training of the machine learning models is completed in the plurality of user terminals;
evaluating the results of the training of the machine learning models by the plurality of user terminals;
updating the machine learning models based on evaluation results; and
providing rewards to the plurality of user terminals for training the machine learning models.
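The steps recited in claim 1 can be sketched as a single training round on the edge cloud server. This is an illustrative assumption, not the patent's implementation: the function name `run_round`, the load threshold, the local-training stub, and the deviation-based evaluation heuristic are all invented for demonstration.

```python
import random

def run_round(global_model, terminals):
    """One round of the claimed method (illustrative sketch).

    global_model: weight vector downloaded from the central cloud server (step 1).
    terminals: dicts with an "id" and a current "load" in [0, 1] (step 2).
    """
    # Steps 2-3: select terminals whose current load condition permits training,
    # and provide each a copy of the downloaded model. The 0.8 cutoff is an
    # assumed placeholder policy.
    eligible = [t for t in terminals if t["load"] < 0.8]
    results = {}
    for t in eligible:
        local = list(global_model)
        rng = random.Random(t["id"])
        # Step 4 (stub): "local training" just perturbs the weight copy.
        results[t["id"]] = [w + rng.uniform(-0.1, 0.1) for w in local]

    # Step 5 (stub evaluation): score each terminal's result by closeness
    # to the mean of all aggregated results.
    n = len(global_model)
    mean = [sum(r[i] for r in results.values()) / len(results) for i in range(n)]
    scores = {
        tid: 1.0 / (1e-6 + sum(abs(r[i] - mean[i]) for i in range(n)))
        for tid, r in results.items()
    }
    total = sum(scores.values())

    # Step 6: update the model as a score-weighted average of the results.
    updated = [
        sum(scores[tid] * results[tid][i] for tid in results) / total
        for i in range(n)
    ]
    # Step 7: reward each participating terminal in proportion to its score.
    rewards = {tid: s / total for tid, s in scores.items()}
    return updated, rewards
```

A caller might build `terminals = [{"id": i, "load": 0.1 * i} for i in range(10)]` and invoke `run_round([0.0, 0.0, 0.0], terminals)`; terminals with load 0.8 or above are skipped, and the returned rewards sum to 1.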