US 11,755,451 B2
Method and apparatus for tuning adjustable parameters in computing environment
Stefano Doni, Milan (IT); Giovanni Paolo Gibilisco, Milan (IT); and Stefano Cereda, Milan (IT)
Assigned to AKAMAS S.P.A., Milan (IT)
Filed by AKAMAS S.R.L., Milan (IT)
Filed on Mar. 13, 2020, as Appl. No. 16/818,263.
Claims priority of application No. 102019000003667 (IT), filed on Mar. 13, 2019.
Prior Publication US 2020/0293835 A1, Sep. 17, 2020
Int. Cl. G06F 11/34 (2006.01); G06N 20/00 (2019.01); G06F 11/22 (2006.01); G06F 11/36 (2006.01); G06F 18/2415 (2023.01); G06F 18/214 (2023.01); G06N 7/01 (2023.01)
CPC G06F 11/3442 (2013.01) [G06F 11/2289 (2013.01); G06F 11/3409 (2013.01); G06F 11/3428 (2013.01); G06F 11/3684 (2013.01); G06F 18/2148 (2023.01); G06F 18/24155 (2023.01); G06N 7/01 (2023.01); G06N 20/00 (2019.01)] 19 Claims
OG exemplary drawing
 
1. A computer-implemented method carried out on an information technology framework including at least one processor where workflows, performance metrics and a System Under Test having a set of tunable parameters are defined; an optimizer module, driven by said at least one processor and implementing a machine learning model, to generate candidate configurations of said System Under Test; a configurator module, driven by said at least one processor, to at least apply said candidate configurations to said System Under Test; and a load generator module, driven by said at least one processor, to inject a test workload into said System Under Test to reach a work regime; the at least one processor being configured to gather performance metrics from said System Under Test under said injected test workload, the method comprising:
identifying a set of tunable parameters representing a candidate configuration for said System Under Test, using the at least one processor through said optimizer module, and applying said candidate configuration to said System Under Test using said configurator module;
running a performance test on said System Under Test and collecting performance metrics using said at least one processor to determine a performance indicator; and
supplying said performance metrics to said machine learning model of the optimizer module to generate an optimized candidate configuration,
wherein said machine learning model uses Bayesian Optimization with Gaussian Processes as a surrogate model that provides, for a candidate set of parameters, both an expected value of said performance indicator and a prediction uncertainty thereof, which are used by said optimizer module to build an Acquisition Function from which a candidate configuration is derived, and by said load generator module to build said test workload, and
said test workload is computed through said machine learning model,
wherein an output of said machine learning model is further submitted to outlier detection to discard individual performance metrics which are affected by noise in the information technology framework.
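The optimization loop recited in claim 1 can be illustrated with a minimal sketch: a Gaussian Process surrogate yields, for each candidate parameter value, an expected performance indicator and a prediction uncertainty; an Expected Improvement acquisition function combines the two to pick the next candidate configuration; the performance test is run; and observations far from the surrogate's prediction are discarded as outliers. Everything here is an assumption for illustration, not the patent's implementation: a single one-dimensional tunable parameter, a synthetic latency-like indicator `run_performance_test`, fixed kernel hyperparameters, and a z-score-style outlier rule.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf_kernel(a, b, length=1.0, var=1.0):
    """Squared-exponential covariance between 1-D parameter vectors a and b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-4):
    """GP posterior mean and standard deviation at x_query (the surrogate's
    expected value and prediction uncertainty for each candidate)."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(x_query, x_query)) - np.sum(v ** 2, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    """Acquisition function for minimization: trades off the surrogate's
    expected value (mu) against its uncertainty (sigma)."""
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))
    pdf = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return (best - mu) * cdf + sigma * pdf

rng = np.random.default_rng(0)

def run_performance_test(x):
    """Hypothetical performance indicator (lower is better), e.g. latency,
    measured under the injected test workload; stands in for the SUT."""
    return (x - 2.0) ** 2 + 0.05 * rng.normal()

grid = np.linspace(0.0, 5.0, 101)      # candidate values of the tunable parameter
xs = np.array([0.5, 4.5])              # initial configurations
ys = np.array([run_performance_test(x) for x in xs])

for _ in range(10):
    mu, sigma = gp_posterior(xs, ys, grid)
    ei = expected_improvement(mu, sigma, ys.min())
    x_next = grid[np.argmax(ei)]       # acquisition-driven candidate configuration
    xs = np.append(xs, x_next)
    ys = np.append(ys, run_performance_test(x_next))
    # Outlier detection (illustrative rule): discard observations whose
    # residual against the surrogate's prediction is implausibly large.
    mu_t, sigma_t = gp_posterior(xs, ys, xs)
    keep = np.abs(ys - mu_t) < 3.0 * sigma_t + 0.5
    xs, ys = xs[keep], ys[keep]

best_x = xs[np.argmin(ys)]             # best configuration found so far
```

The loop mirrors the claimed roles: `gp_posterior` plays the surrogate model, `expected_improvement` the Acquisition Function, `run_performance_test` the performance test on the System Under Test, and the residual filter the outlier-detection step applied to the model's output.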