US 12,008,474 B2
Automatic thresholds for neural network pruning and retraining
Zhengping Ji, Temple City, CA (US); John Wakefield Brothers, Calistoga, CA (US); Ilia Ovsiannikov, Studio City, CA (US); and Eunsoo Shim, Princeton Junction, NJ (US)
Assigned to SAMSUNG ELECTRONICS CO., LTD., (KR)
Filed by Samsung Electronics Co., Ltd., Suwon-si (KR)
Filed on Sep. 9, 2020, as Appl. No. 17/016,363.
Application 17/016,363 is a continuation of application No. 15/488,430, filed on Apr. 14, 2017, granted, now Pat. No. 10,832,135.
Claims priority of provisional application 62/457,806, filed on Feb. 10, 2017.
Prior Publication US 2020/0410357 A1, Dec. 31, 2020
Int. Cl. G06N 20/10 (2019.01); G06N 3/08 (2023.01); G06N 3/082 (2023.01); G06N 20/20 (2019.01)
CPC G06N 3/082 (2013.01) 20 Claims
OG exemplary drawing
 
1. A method, comprising:
pruning, by a processor, a layer of a neural network using a first threshold, the neural network comprising multiple layers, the pruning comprising configuring weights of the layer based on the first threshold;
determining, by the processor, a pruning error of the layer based on a percentage of the configured weights of the layer with respect to an initial number of weights of the layer; and
repeatedly configuring, by the processor, the weights of the layer until the pruning error of the layer equals a pruning error allowance for the layer, wherein the repeatedly configuring the weights of the layer allows a percentage of the number of weights remaining after the repeatedly configuring the weights of the layer with respect to the initial number of weights of the layer to be determined, and wherein each iteration of the repeatedly configuring the weights of the layer uses a different first threshold based on the pruning error.