| CPC G06N 20/00 (2019.01) [G06F 17/18 (2013.01); G06F 11/3452 (2013.01)] | 20 Claims |

1. A computer-implemented method for detecting signal following in a machine learning model, the method comprising:
calculating an average standard deviation of measured values of time series signals in a set of two or more time series signals;
training the machine learning model to predict values of the time series signals, by:
feeding an observation interval of the time series signals that represents a normal operating state into the machine learning model for the training, and
configuring the machine learning model to generate, for individual signals of the time series signals, an output signal value that would be expected in the normal operating state based on input signal values and correlations among the time series signals other than the individual signals;
predicting values of each of the time series signals with the trained machine learning model;
generating a time series set of residuals between the predicted values and the measured values for each of the time series signals;
calculating an average standard deviation of the sets of residuals;
determining that signal following is present in the trained machine learning model when a ratio of the average standard deviation of the measured values to the average standard deviation of the sets of residuals exceeds a threshold; and
presenting an alert indicating the presence of signal following in the trained machine learning model.
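The detection steps of the claim (generating a time series set of residuals, averaging per-signal standard deviations, and applying a ratio threshold) can be sketched as follows. This is an illustrative reading, not the patented implementation: the function name `detect_signal_following` and the threshold value of 10 are assumptions (the claim leaves the threshold unspecified), and the trained machine learning model is represented abstractly by the predicted values it emits.

```python
import numpy as np

def detect_signal_following(measured, predicted, threshold=10.0):
    """Flag signal following via the claimed ratio test (illustrative sketch).

    measured, predicted: arrays of shape (n_samples, n_signals).
    Returns (is_following, ratio), where ratio is the average per-signal
    standard deviation of the measured values divided by the average
    per-signal standard deviation of the residuals.
    """
    residuals = measured - predicted                      # time series set of residuals
    avg_std_measured = np.std(measured, axis=0).mean()    # average std dev of measured values
    avg_std_residuals = np.std(residuals, axis=0).mean()  # average std dev of residuals
    ratio = avg_std_measured / avg_std_residuals
    return ratio > threshold, ratio
```

Intuitively, a model exhibiting signal following merely echoes its inputs, so its residuals collapse toward zero and the ratio becomes very large; a genuinely predictive model's residuals retain the measurement noise floor, keeping the ratio moderate.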