US 11,922,302 B2
Hyper-parameter optimization method for spiking neural network and the processing apparatus thereof
Seok Hoon Jeon, Seoul (KR); Byung Soo Kim, Yongin-si (KR); Hee Tak Kim, Seoul (KR); and Tae Ho Hwang, Yongin-si (KR)
Assigned to Korea Electronics Technology Institute, Seongnam-si (KR)
Filed by Korea Electronics Technology Institute, Seongnam-si (KR)
Filed on Nov. 14, 2019, as Appl. No. 16/684,087.
Claims priority of application No. 10-2019-0144181 (KR), filed on Nov. 12, 2019.
Prior Publication US 2021/0142162 A1, May 13, 2021
Int. Cl. G06N 3/08 (2023.01); G06N 3/049 (2023.01); G06N 3/082 (2023.01); G06N 3/084 (2023.01)
CPC G06N 3/08 (2013.01) [G06N 3/049 (2013.01); G06N 3/082 (2013.01); G06N 3/084 (2013.01)] 10 Claims
OG exemplary drawing
 
1. A spiking neural network (SNN) system comprising an SNN accelerator and a computer-implemented hyperparameter optimizer for optimizing hyperparameters of the SNN accelerator, the hyperparameters comprising two or more of a spike train length, a number of neurons in a spiking neural network, a generation frequency of input spikes, a synapse weight learning rate, a threshold learning rate, an initial threshold value, or a membrane time constant,
wherein the SNN accelerator comprises a neuromorphic chip configured to perform learning and make an inference,
wherein the hyperparameter optimizer is connected to the SNN accelerator and comprises:
a processor configured to:
receive hardware performance elements from the SNN accelerator connected to the hyperparameter optimizer, the hardware performance elements being indicative of hardware constraints of the SNN accelerator;
define hyperparameter-specific allowable ranges for the hyperparameters based on the hardware performance elements;
receive training data comprising a plurality of pieces of data;
process the training data and compute a statistical value including at least one of a number of the pieces of data, label-specific distribution characteristics, a minimum data value, a maximum data value, a variance, or a standard deviation regarding the training data;
generate hyperparameter-specific objective functions based on the computed statistical value;
perform regression on the hyperparameter-specific objective functions;
select hyperparameters using the hyperparameter-specific objective functions according to certain selection rules and further based on the hyperparameter-specific allowable ranges; and
transfer the selected hyperparameters to the SNN accelerator, and
wherein the neuromorphic chip is configured to perform on-chip learning using the selected hyperparameters.
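Claim 1 above recites a pipeline in which the optimizer bounds each hyperparameter by the accelerator's hardware constraints, computes statistics of the training data, fits hyperparameter-specific objective functions by regression, and selects values within the allowable ranges before transferring them to the neuromorphic chip. The following Python sketch illustrates one possible reading of that flow; it is not the patented implementation. The class and key names (HyperparameterOptimizerSketch, max_neurons, max_spike_train_length), the quadratic surrogate used for the regression step, and the grid-based selection rule are all assumptions made here for illustration and are not specified in the claim.

```python
# Illustrative sketch of the claimed hyperparameter-selection flow.
# All names, ranges, and the surrogate model are assumptions, not the
# patent's disclosed implementation.
import numpy as np


class HyperparameterOptimizerSketch:
    def __init__(self, hw_elements):
        # Hardware performance elements reported by the SNN accelerator,
        # e.g. maximum neuron count and spike-train length (assumed keys).
        self.hw = hw_elements
        self.ranges = self._define_allowable_ranges()

    def _define_allowable_ranges(self):
        # Hyperparameter-specific allowable ranges bounded by the hardware
        # constraints of the accelerator (illustrative bounds only).
        return {
            "spike_train_length": (16, self.hw["max_spike_train_length"]),
            "num_neurons": (64, self.hw["max_neurons"]),
            "weight_learning_rate": (1e-4, 1e-1),
        }

    @staticmethod
    def data_statistics(train_data):
        # Statistical values of the training data named in the claim:
        # count, minimum, maximum, variance, standard deviation.
        x = np.asarray(train_data, dtype=float)
        return {
            "count": x.shape[0],
            "min": float(x.min()),
            "max": float(x.max()),
            "var": float(x.var()),
            "std": float(x.std()),
        }

    def select(self, evaluate, n_samples=8):
        """Fit a per-hyperparameter surrogate by regression and pick the
        best value inside the allowable range. `evaluate` is an assumed
        callback that runs the accelerator and returns a score."""
        selected = {}
        for name, (lo, hi) in self.ranges.items():
            samples = np.linspace(lo, hi, n_samples)
            scores = np.array([evaluate(name, v) for v in samples])
            # Regression on the hyperparameter-specific objective function;
            # a quadratic polynomial is used here purely as a placeholder.
            coeffs = np.polyfit(samples, scores, deg=2)
            grid = np.linspace(lo, hi, 256)
            selected[name] = float(grid[np.argmax(np.polyval(coeffs, grid))])
        return selected


if __name__ == "__main__":
    hw = {"max_spike_train_length": 256, "max_neurons": 1024}  # assumed report
    opt = HyperparameterOptimizerSketch(hw)
    stats = opt.data_statistics(np.random.default_rng(0).normal(size=1000))
    # Toy objective standing in for accelerator feedback; not from the patent.
    best = opt.select(lambda name, v: -(v - 0.6 * sum(opt.ranges[name])) ** 2)
    print(stats, best)
```

The quadratic fit is only the simplest stand-in for the claimed regression on the objective functions; a Gaussian-process surrogate or another Bayesian-optimization model would serve the same role. The selected values would then be transferred to the accelerator for on-chip learning, as the claim describes.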