CPC G06N 3/049 (2013.01) [G06F 17/10 (2013.01); G06F 17/14 (2013.01); G06F 17/142 (2013.01); G06F 17/156 (2013.01); G06F 17/18 (2013.01); G06N 3/044 (2023.01); G06N 3/084 (2013.01); G06N 20/00 (2019.01); G06N 20/10 (2019.01)] | 23 Claims
1. A special-purpose physical computer apparatus configured for conducting machine learning pre-processing by performing a feature encoding data structure transformation of an input data set by decomposing time features of the input data set into a computer implemented data structure as transformed inputs into a machine learning data model architecture, the special-purpose physical computer apparatus residing in a data center and coupled to a message bus such that the input data set can be received across a network interface and the transformed inputs can be provided through the network interface to the machine learning data model architecture, the special-purpose physical computer apparatus comprising a computer processor operating in conjunction with computer memory, input and output interfaces, and network interfaces, the computer processor configured to:
receive the input data set and extract an initial set of machine learning input vectors;
generate, based on the input data set and the extracted initial set of machine learning input vectors, a decomposed set of feature components representative of at least one or more periodic functions such that the decomposed set of feature components represent time through the at least one or more periodic functions and the one or more periodic functions include at least frequency and phase-shift learnable parameters, the one or more periodic functions including at least one of sine functions, cosine functions, and sawtooth functions;
provide the decomposed set of feature components into the machine learning data model architecture, the machine learning data model architecture including neural networks, the machine learning data model architecture including a first neural network activation layer that accepts the one or more periodic functions as input, wherein the first neural network activation layer comprises a plurality of neurons, each neuron with an activation function, the plurality of neurons taking in the decomposed set of feature components representing time, wherein the first neural network activation layer outputs an attribute vector comprising attribute components that represent a plurality of distinct temporal attributes having both a periodic and a non-periodic part; and
update, as the machine learning data model architecture iterates on a set of training data, the frequency and phase-shift learnable parameters of the decomposed set of feature components based on a reward or penalty function such that weightings associated with the frequency and phase-shift learnable parameters shift over training to reflect periodicity of the set of training data and to achieve at least one of improved accuracy, faster convergence, and a reduced total number of processing cycles; and
replace, during inference usage of the machine learning data model architecture, the input data set with the decomposed set of feature components representing a transformed variant of the input data set;
wherein the decomposed set of feature components representative of the at least one or more periodic functions is a vector representation t2v(τ) having k sinusoids, provided in accordance with a relation:
$$
t2v(\tau)[i] = \begin{cases} \omega_i \tau + \phi_i, & \text{if } i = 0 \\ \sin(\omega_i \tau + \phi_i), & \text{if } 1 \leq i \leq k \end{cases}
$$

wherein t2v(τ) is a vector of size k+1, t2v(τ)[i] is the ith element of t2v(τ), ωi is the frequency learnable parameter, τ is a time feature, and φi is the phase-shift learnable parameter.
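By way of illustration, the claimed t2v(τ) relation can be written directly in code. A minimal sketch in Python with JAX, assuming sine as the periodic function for the k sinusoids (consistent with "k sinusoids" in the claim) and treating the frequency and phase-shift parameters as ordinary learnable arrays; the identifiers t2v, omega, and phi are illustrative rather than taken from the claim:

```python
import jax.numpy as jnp

def t2v(tau, omega, phi):
    """Compute the vector representation t2v(tau) of size k+1.

    tau   -- scalar time feature (the claimed tau)
    omega -- array of shape (k+1,): frequency learnable parameters omega_i
    phi   -- array of shape (k+1,): phase-shift learnable parameters phi_i

    Element 0 is the non-periodic (linear) part; elements 1..k are the
    k sinusoids, so the output carries both a periodic and a
    non-periodic part, as recited in the claim.
    """
    linear = omega[0] * tau + phi[0]               # i = 0
    periodic = jnp.sin(omega[1:] * tau + phi[1:])  # 1 <= i <= k
    return jnp.concatenate([jnp.atleast_1d(linear), periodic])
```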
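The claimed update step corresponds, in conventional training terms, to gradient-based optimization in which the reward or penalty function is a differentiable loss. A sketch under those assumptions, using mean-squared error as the penalty and plain gradient descent (the claim fixes neither choice); the linear readout w, the value of k, and all hyperparameters are hypothetical:

```python
import jax
import jax.numpy as jnp

def t2v(tau, omega, phi):
    # Claimed relation, as in the previous sketch (sine assumed for the sinusoids).
    linear = omega[0] * tau + phi[0]
    periodic = jnp.sin(omega[1:] * tau + phi[1:])
    return jnp.concatenate([jnp.atleast_1d(linear), periodic])

def penalty(params, taus, targets):
    # Penalty function: mean-squared error of a hypothetical linear
    # readout over the decomposed set of feature components.
    feats = jax.vmap(lambda t: t2v(t, params["omega"], params["phi"]))(taus)
    preds = feats @ params["w"]
    return jnp.mean((preds - targets) ** 2)

k = 16
key = jax.random.PRNGKey(0)
params = {
    "omega": jax.random.normal(key, (k + 1,)),  # frequency learnable parameters
    "phi":   jnp.zeros(k + 1),                  # phase-shift learnable parameters
    "w":     jnp.zeros(k + 1),                  # illustrative downstream weights
}

taus = jnp.linspace(0.0, 10.0, 200)   # toy time features
targets = jnp.sin(2.0 * taus)         # toy training signal with known periodicity

# One training iteration: gradients of the penalty shift the frequency and
# phase-shift parameters so that, over many iterations, they come to
# reflect the periodicity of the training data, as recited in the claim.
grads = jax.grad(penalty)(params, taus, targets)
params = jax.tree_util.tree_map(lambda p, g: p - 1e-2 * g, params, grads)
```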
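Finally, the claimed replace step swaps the raw time feature for its decomposed encoding at inference time; a brief usage sketch continuing directly from the training sketch above (same t2v and params):

```python
# Inference usage: the raw time feature is replaced by its t2v encoding
# (the "transformed variant") before it enters the downstream model.
tau_new = jnp.array(3.5)
encoded = t2v(tau_new, params["omega"], params["phi"])
prediction = encoded @ params["w"]
```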