| CPC G06V 10/82 (2022.01) [G06N 3/04 (2013.01); G06N 3/082 (2013.01); G06V 10/764 (2022.01); G06V 10/776 (2022.01); G06F 18/217 (2023.01); G06N 3/063 (2013.01)] | 30 Claims |

1. A method for training or prediction using a cluster-connected neural network at a local endpoint device, the method comprising:
storing a cluster-connected neural network at the local endpoint device, the cluster-connected neural network having a neural network axis in an orientation extending from an input layer to an output layer and orthogonal to a plurality of intermediate layers, wherein the cluster-connected neural network is divided into a plurality of clusters, wherein each cluster comprises a different plurality of artificial neurons or convolutional channels, wherein the artificial neurons or convolutional channels of each cluster are in a region extending parallel to the direction of the neural network axis, thereby resulting in a predominant direction of neuron activation extending from the input layer toward the output layer, and wherein each pair of neurons or channels is uniquely connected by a weight or convolutional filter;
within each cluster of the cluster-connected neural network, generating or maintaining a locally dense sub-network of intra-cluster weights or filters, in which a majority of pairs of neurons or channels within the same cluster are connected by intra-cluster weights or filters, such that the connected majority of pairs of neurons or channels in each cluster are co-activated together as an activation block during training or prediction using the cluster-connected neural network;
outside each cluster of the cluster-connected neural network, generating or maintaining a globally sparse network of inter-cluster weights or filters, in which a minority of pairs of neurons or channels separated by a cluster border across different clusters are connected by inter-cluster weights or filters; and
performing prediction using the cluster-connected neural network at the local endpoint device.
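The following is a minimal, non-limiting sketch of the connectivity pattern recited in claim 1, assuming a PyTorch setting: neurons of adjacent layers are partitioned into clusters, intra-cluster weights are kept dense, and only a small minority of inter-cluster weights crossing a cluster border are kept. The names `cluster_mask` and `ClusterConnectedLinear`, the cluster count, and the inter-cluster keep probability are illustrative assumptions and are not part of the claim or the claimed implementation.

```python
# Illustrative sketch only (not the claimed implementation): a fully connected
# layer whose weight matrix is masked so that intra-cluster blocks are dense
# (locally dense sub-network) and cross-cluster entries are sparse
# (globally sparse network). Cluster count and sparsity are example values.
import numpy as np
import torch
import torch.nn as nn


def cluster_mask(out_features, in_features, num_clusters, inter_keep_prob, rng):
    """Binary mask with dense intra-cluster blocks and sparse inter-cluster entries."""
    # Assign each input/output neuron to a cluster along the layer width,
    # so clusters extend parallel to the input->output (network-axis) direction.
    in_clusters = np.floor(np.arange(in_features) * num_clusters / in_features).astype(int)
    out_clusters = np.floor(np.arange(out_features) * num_clusters / out_features).astype(int)
    same_cluster = out_clusters[:, None] == in_clusters[None, :]
    # Keep all intra-cluster weights (the locally dense sub-network) ...
    mask = same_cluster.astype(np.float32)
    # ... and keep only a small minority of inter-cluster weights
    # (the globally sparse network across cluster borders).
    inter = (~same_cluster) & (rng.random((out_features, in_features)) < inter_keep_prob)
    mask[inter] = 1.0
    return torch.from_numpy(mask)


class ClusterConnectedLinear(nn.Module):
    """Linear layer whose weights are zeroed outside the cluster mask."""

    def __init__(self, in_features, out_features, num_clusters=4,
                 inter_keep_prob=0.05, seed=0):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        rng = np.random.default_rng(seed)
        mask = cluster_mask(out_features, in_features, num_clusters,
                            inter_keep_prob, rng)
        self.register_buffer("mask", mask)

    def forward(self, x):
        # Masked matrix multiply: neurons within the same cluster are
        # co-activated as a block, while only the retained minority of
        # cross-cluster connections contribute to the output.
        return nn.functional.linear(x, self.linear.weight * self.mask,
                                    self.linear.bias)


if __name__ == "__main__":
    layer = ClusterConnectedLinear(in_features=64, out_features=32)
    y = layer(torch.randn(8, 64))   # prediction step at the local endpoint device
    print(y.shape)                  # torch.Size([8, 32])
```

Because the mask is fixed, the same construction can be used during training (gradients flow only through retained weights) or during prediction, which is one plausible reading of how the locally dense / globally sparse structure recited above could be exercised at an endpoint device.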