| CPC G06V 10/82 (2022.01) [G06F 18/2113 (2023.01); G06F 18/217 (2023.01); G06F 18/2431 (2023.01); G06N 3/08 (2013.01); G06V 10/764 (2022.01)] | 19 Claims |

10. A neural network system comprising:
one or more processors; and
a non-transitory memory storing instructions for reducing a degree of catastrophic forgetting in a neural network by scoring training data samples according to an ability to preserve latent decision boundaries for previously observed classes while promoting learning from an input batch of new images from an online data stream, wherein the instructions, when executed by the one or more processors, cause the one or more processors to perform:
receiving the input batch of the new images from the online data stream;
obtaining an evaluation set for a first type of training data and a second type of training data from a first class-balanced random subset of the training data samples from the memory, and a first candidate set from a second class-balanced random subset of the training data samples from the memory, excluding any training data included in the second type of training data, wherein the evaluation set and the first candidate set comprise different data points,
wherein the second type of training data corresponds to cooperative data points that are representative of training data samples in the memory to retain latent decision boundaries for previously observed classes,
wherein the first type of training data corresponds to adversarial data points that are near samples in the input batch but have different labels, to differentiate current classes from previously seen classes, and
wherein the adversarial data points are adversarial to the new images from the online data stream;
determining a K-Nearest Neighbor Shapley value (KNN-SV) of first candidate points among the first candidate set with respect to evaluation points among the evaluation set and with respect to the new images, for the first type of training data and the second type of training data;
selecting a subset of the first candidate points for memory replay to reduce the degree of catastrophic forgetting, by aggregating the determined KNN-SVs of the first candidate points, wherein a size of the subset of the first candidate points is the same as a size of the received input batch of the new images;
concatenating the subset of the first candidate points selected for memory replay to the received input batch of new images to form a mini-batch; and
training the neural network to perform image recognition based on the formed mini-batch; and
a memory update process comprising:
obtaining an evaluation set for a third type of training data from the first class-balanced random subset of the training data samples from the memory, and a second candidate set from a randomly selected subset of the training data samples from the memory and the new images from the input batch, wherein a size of the second candidate set corresponds to a number of the new images in addition to a size of the randomly selected subset of the training data samples from the memory;
determining a KNN-SV of second candidate points among the second candidate set with respect to evaluation points among the evaluation set for the third type of training data, by obtaining latent features of the third type of training data from the evaluation set and the second candidate set;
determining a mean of the determined KNN-SVs of the second candidate points across the evaluation points; and
replacing the second candidate points that are the training data samples in the memory having a smaller average KNN-SV with the training data samples from the input batch determined to have a higher average KNN-SV.
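
A minimal NumPy sketch of the class-balanced sampling used to build the evaluation set and the disjoint first candidate set in claim 10 is given below; the function name, the per-class quota n_per_class, and the synthetic mem_labels are illustrative assumptions rather than anything recited in the claim.

```python
import numpy as np

def class_balanced_sample(labels, n_per_class, exclude=None, rng=None):
    """Return indices of a class-balanced random subset of a memory buffer.

    Draws up to `n_per_class` indices per observed class without replacement,
    optionally excluding indices already taken (e.g. by the evaluation set).
    """
    rng = np.random.default_rng() if rng is None else rng
    labels = np.asarray(labels)
    allowed = np.ones(labels.shape[0], dtype=bool)
    if exclude is not None:
        allowed[np.asarray(exclude)] = False
    picked = []
    for c in np.unique(labels[allowed]):
        idx = np.flatnonzero(allowed & (labels == c))
        picked.append(rng.choice(idx, size=min(n_per_class, idx.size), replace=False))
    return np.concatenate(picked)

# Illustrative memory labels for five previously observed classes.
mem_labels = np.repeat(np.arange(5), 20)
eval_idx = class_balanced_sample(mem_labels, n_per_class=3)                    # evaluation set
cand_idx = class_balanced_sample(mem_labels, n_per_class=6, exclude=eval_idx)  # first candidate set
assert not set(eval_idx) & set(cand_idx)   # the two sets comprise different data points
```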
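The KNN-SV recited in the claim can be computed with the closed-form recursion of Jia et al. (2019) over distances between latent feature vectors. The sketch below assumes Euclidean distance and a small neighborhood size k; both choices are assumptions, since the claim does not fix a metric or a value of K.

```python
import numpy as np

def knn_shapley_values(cand_feats, cand_labels, eval_feats, eval_labels, k=3):
    """Closed-form KNN Shapley values (Jia et al., 2019).

    Returns an array of shape (n_eval, n_cand) whose entry (j, i) is the
    KNN-SV of candidate point i with respect to evaluation point j, computed
    from distances between latent feature vectors.
    """
    cand_feats = np.asarray(cand_feats, dtype=float)
    cand_labels = np.asarray(cand_labels)
    n_cand = len(cand_labels)
    sv = np.zeros((len(eval_labels), n_cand))
    for j, (x, y) in enumerate(zip(np.asarray(eval_feats, dtype=float), eval_labels)):
        order = np.argsort(np.linalg.norm(cand_feats - x, axis=1))   # nearest candidate first
        match = (cand_labels[order] == y).astype(float)
        s = np.zeros(n_cand)
        s[-1] = match[-1] / n_cand                                   # farthest candidate
        for i in range(n_cand - 2, -1, -1):                          # recurse toward the nearest
            s[i] = s[i + 1] + (match[i] - match[i + 1]) / k * min(k, i + 1) / (i + 1)
        sv[j, order] = s                                             # undo the distance sort
    return sv

# Tiny synthetic check: same-label neighbours of the evaluation point score higher.
feats = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
labels = np.array([0, 1, 0])
print(knn_shapley_values(feats, labels, [[0.0, 0.1]], [0], k=2))
```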
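For the retrieval step, the claim only requires aggregating the determined KNN-SVs of the first candidate points and selecting a subset whose size matches the input batch. The sketch below reuses knn_shapley_values from the previous sketch and adopts one plausible aggregation, mean value with respect to the evaluation set minus mean value with respect to the new images, so that candidates that are cooperative toward the memory and adversarial toward the incoming batch rank highest; the specific aggregation rule is an assumption.

```python
import numpy as np

def select_replay_subset(cand_feats, cand_labels,
                         eval_feats, eval_labels,      # evaluation set drawn from the memory
                         batch_feats, batch_labels,    # latent features of the new images
                         k=3):
    """Score first candidate points and pick a batch-sized replay subset.

    Reuses `knn_shapley_values` from the preceding sketch; the aggregation
    (mean cooperative SV minus mean SV toward the new images) is one choice
    among several that the claim language would allow.
    """
    coop = knn_shapley_values(cand_feats, cand_labels, eval_feats, eval_labels, k)
    adv = knn_shapley_values(cand_feats, cand_labels, batch_feats, batch_labels, k)
    scores = coop.mean(axis=0) - adv.mean(axis=0)
    n_replay = len(batch_labels)                       # same size as the received input batch
    return np.argsort(scores)[::-1][:n_replay]         # indices of the top-scoring candidates
```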
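The concatenation and training steps amount to an ordinary replay update on the formed mini-batch. A minimal PyTorch sketch follows; the cross-entropy objective and the externally supplied model and optimizer are assumptions, since the claim only requires training the neural network for image recognition on the concatenated mini-batch.

```python
import torch
import torch.nn.functional as F

def replay_training_step(model, optimizer, batch_x, batch_y, mem_x, mem_y):
    """One online update on the mini-batch formed from the new images plus the
    subset of candidate points selected for memory replay."""
    model.train()
    x = torch.cat([batch_x, mem_x], dim=0)    # concatenate the replay subset to the input batch
    y = torch.cat([batch_y, mem_y], dim=0)
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x), y)       # image-recognition objective (assumed)
    loss.backward()
    optimizer.step()
    return loss.item()
```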
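The memory update process scores a candidate pool made of a randomly selected memory subset plus the new images, averages each candidate's KNN-SVs over the evaluation points, and swaps lower-valued memory samples for higher-valued new ones. The sketch below again reuses knn_shapley_values; for brevity it stores latent features directly in the memory arrays, whereas the claimed memory holds training data samples whose latent features would be produced by the network, so that simplification and the pool size n_mem_cand are assumptions.

```python
import numpy as np

def shapley_memory_update(mem_feats, mem_labels,        # memory buffer (latent features, labels)
                          batch_feats, batch_labels,    # new images from the input batch
                          eval_feats, eval_labels,      # class-balanced evaluation set from memory
                          n_mem_cand=10, k=3, rng=None):
    """Replace low-value memory samples with higher-value new samples.

    The second candidate set is a random memory subset plus all new images;
    each candidate's KNN-SVs are averaged across the evaluation points, and a
    memory candidate is overwritten when a new image has a higher mean value.
    """
    rng = np.random.default_rng() if rng is None else rng
    mem_idx = rng.choice(len(mem_labels), size=min(n_mem_cand, len(mem_labels)), replace=False)
    cand_feats = np.concatenate([mem_feats[mem_idx], batch_feats])
    cand_labels = np.concatenate([mem_labels[mem_idx], batch_labels])
    mean_sv = knn_shapley_values(cand_feats, cand_labels, eval_feats, eval_labels, k).mean(axis=0)
    mem_sv, new_sv = mean_sv[:len(mem_idx)], mean_sv[len(mem_idx):]
    # Pair the lowest-valued memory candidates with the highest-valued new images.
    for m, b in zip(np.argsort(mem_sv), np.argsort(new_sv)[::-1]):
        if new_sv[b] > mem_sv[m]:
            mem_feats[mem_idx[m]] = batch_feats[b]
            mem_labels[mem_idx[m]] = batch_labels[b]
    return mem_feats, mem_labels
```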