CPC H04L 9/085 (2013.01) [G06F 18/2148 (2023.01); G06N 3/063 (2013.01); G06N 5/04 (2013.01)] | 9 Claims

1. A learning system that trains a learning model, the learning system comprising:
  a model generation device; and
  n calculation devices connected to the model generation device via a network, n being an integer equal to or greater than three, wherein
  the model generation device comprises:
    a first hardware processor configured to function as:
      an acquisition unit that acquires m×n pieces of training data used for training the learning model, m being an integer equal to or greater than two,
      a splitting unit that splits the m×n pieces of training data into n groups each including m pieces of training data, the n groups corresponding to the respective n calculation devices on a one-to-one basis,
      a secret sharing unit that generates m pieces of distribution training data for each of the n groups by a distribution process of a secret sharing scheme, the distribution training data for each of the m pieces of training data included in an i-th group among the n groups being generated using an i-th element Pi among n elements P1, P2, . . . , Pi, . . . , Pn, i being an integer equal to or greater than one and equal to or less than n, and
      a share transmission unit that transmits the corresponding m pieces of distribution training data to each of the n calculation devices,
  each of the n calculation devices comprises:
    a second hardware processor configured to function as:
      a share reception unit that receives the m pieces of distribution training data from the model generation device,
      a training unit that trains a distributed learning model having a same configuration as a configuration of the learning model, using the received m pieces of distribution training data, and
      a parameter transmission unit that transmits a trained distribution parameter group of the distributed learning model to the model generation device,
  the first hardware processor of the model generation device further functions as:
    a parameter reception unit that receives the trained distribution parameter group from each of k1 calculation devices among the n calculation devices, k1 being a predetermined integer equal to or greater than two and equal to or less than n−1, and
    a parameter restoration unit that generates a parameter group of the learning model, based on the distribution parameter group received from each of the k1 calculation devices, by a restoration process of the secret sharing scheme, and
  the parameter restoration unit:
    substitutes the corresponding distribution parameter group into an inverse function of a model expression represented by an equation using result data as a variable and using training data as a value, for each of the k1 calculation devices,
    restores an inverse function of the learning model from the k1 inverse functions into which the corresponding distribution parameter groups have been substituted, by the restoration process of the secret sharing scheme, and
    generates the parameter group of the learning model based on the restored inverse function of the learning model.
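The distribution and restoration processes recited above are consistent with a standard Shamir-style (k, n) threshold secret sharing scheme, in which each device i evaluates a random polynomial at its element Pi and any k1 shares suffice to reconstruct the secret. The sketch below is illustrative only: the prime field, the helper names `distribute` and `restore`, and the concrete n = 3, k1 = 2 values are assumptions, not part of the claim.

```python
# Illustrative sketch of a (k, n) Shamir-style secret sharing scheme over a
# prime field. PRIME and all helper names are assumptions for illustration.
import random

PRIME = 2**61 - 1  # field modulus (assumption)

def distribute(secret, n, k, points):
    """Distribution process: share `secret` among n parties.

    `points` are the n distinct nonzero elements P1, ..., Pn from the claim;
    party i receives the evaluation of a random degree-(k-1) polynomial,
    whose constant term is the secret, at Pi.
    """
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod PRIME
            acc = (acc * x + c) % PRIME
        return acc
    return [poly(p) for p in points]

def restore(shares):
    """Restoration process: Lagrange interpolation at x = 0 from any
    k (point, share) pairs recovers the constant term, i.e. the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Usage: n = 3 devices with elements P1..P3, threshold k1 = 2.
points = [1, 2, 3]
shares = distribute(42, n=3, k=2, points=points)
print(restore(list(zip(points, shares))[:2]))  # any 2 shares recover 42
```

Under this reading, each of the m pieces of distribution training data sent to device i is device i's share, the trained distribution parameter groups play the role of shares of the model parameters, and the parameter restoration unit's reconstruction from k1 devices corresponds to the interpolation step.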