US 12,437,524 B2
Cluster-connected neural network
Eli David, Tel Aviv (IL); and Eri Rubin, Kibbutz Ma'ale Ha'hamisha (IL)
Assigned to NANO DIMENSION TECHNOLOGIES, LTD., Ness Ziona (IL)
Filed by Nano Dimension Technologies, Ltd., Ness Ziona (IL)
Filed on Oct. 28, 2021, as Appl. No. 17/513,189.
Application 17/513,189 is a continuation of application No. 17/095,154, filed on Nov. 11, 2020, granted, now 11,164,084, issued on Nov. 2, 2021.
Prior Publication US 2022/0147828 A1, May 12, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. G06V 10/82 (2022.01); G06F 18/21 (2023.01); G06N 3/04 (2023.01); G06N 3/063 (2023.01); G06N 3/082 (2023.01); G06V 10/764 (2022.01); G06V 10/776 (2022.01)
CPC G06V 10/82 (2022.01) [G06N 3/04 (2013.01); G06N 3/082 (2013.01); G06V 10/764 (2022.01); G06V 10/776 (2022.01); G06F 18/217 (2023.01); G06N 3/063 (2013.01)] 30 Claims
OG exemplary drawing
 
1. A method for training or prediction using a cluster-connected neural network at a local endpoint device, the method comprising:
storing a cluster-connected neural network at the local endpoint device, the cluster-connected neural network having a neural network axis in an orientation extending from an input layer to an output layer and orthogonal to a plurality of intermediate layers, wherein the cluster-connected neural network is divided into a plurality of clusters, wherein each cluster comprises a different plurality of artificial neurons or convolutional channels, wherein the artificial neurons or convolutional channels of each cluster are in a region extending parallel to the direction of the neural network axis resulting in a predominant direction of neuron activation extending from the input layer toward the output layer, wherein each pair of neurons or channels are uniquely connected by a weight or convolutional filter;
within each cluster of the cluster-connected neural network, generating or maintaining a locally dense sub-network of intra-cluster weights or filters, in which a majority of pairs of neurons or channels within the same cluster are connected by intra-cluster weights or filters, such that the connected majority of pairs of neurons or channels in each cluster are co-activated together as an activation block during training or prediction using the cluster-connected neural network;
outside each cluster of the cluster-connected neural network, generating or maintaining a globally sparse network of inter-cluster weights or filters, in which a minority of pairs of neurons or channels separated by a cluster border across different clusters are connected by inter-cluster weights or filters; and
performing prediction using the cluster-connected neural network at the local endpoint device.
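The connectivity pattern recited in claim 1 (locally dense intra-cluster blocks, globally sparse inter-cluster links) can be illustrated with a minimal sketch. This is not the patented implementation, only an assumed binary-mask construction: `cluster_mask`, its parameters (`n_neurons`, `n_clusters`, `inter_density`), and the equal-size cluster partition are all illustrative choices, not taken from the patent.

```python
import numpy as np

def cluster_mask(n_neurons, n_clusters, inter_density, seed=None):
    """Build a binary weight-connectivity mask for a cluster-connected layer:
    dense within each cluster, sparse between clusters (illustrative only)."""
    rng = np.random.default_rng(seed)
    assert n_neurons % n_clusters == 0, "sketch assumes equal-size clusters"
    size = n_neurons // n_clusters
    mask = np.zeros((n_neurons, n_neurons), dtype=bool)
    # Locally dense sub-network: every intra-cluster pair is connected,
    # so each cluster can act as a single activation block.
    for c in range(n_clusters):
        s = c * size
        mask[s:s + size, s:s + size] = True
    # Globally sparse network: only a small random minority of
    # inter-cluster pairs receive a weight.
    mask |= rng.random((n_neurons, n_neurons)) < inter_density
    return mask
```

Applying such a mask elementwise to a layer's weight matrix (e.g. `w * mask`) would zero out most inter-cluster weights while keeping each cluster fully connected internally, which is one plausible way to realize the claimed dense/sparse split on an endpoint device.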