US 12,079,719 B1
Lifelong machine learning (LML) model for patient subpopulation identification using real-world healthcare data
Guanhao Wei, Wayne, PA (US); Yunlong Wang, Malvern, PA (US); Li Zhou, Yardley, PA (US); Lynn Lu, San Diego, CA (US); Emily Zhao, Wayne, PA (US); Lishan Feng, Philadelphia, PA (US); Fan Zhang, King of Prussia, PA (US); Frank Jing, Beijing (CN); and Yilian Yuan, North Wales, PA (US)
Assigned to IQVIA Inc., Durham, NC (US)
Filed by IQVIA Inc., Danbury, CT (US)
Filed on Aug. 26, 2020, as Appl. No. 17/003,127.
Int. Cl. G06N 3/08 (2023.01); G06N 3/044 (2023.01); G06N 3/045 (2023.01); G06N 3/047 (2023.01); G06N 3/048 (2023.01); G06N 5/01 (2023.01); G06N 20/20 (2019.01); G16H 10/60 (2018.01); G16H 40/20 (2018.01); G16H 50/70 (2018.01); G16H 70/20 (2018.01); G16H 70/40 (2018.01); G16H 70/60 (2018.01)
CPC G06N 3/08 (2013.01) [G06N 3/044 (2023.01); G06N 3/045 (2023.01); G06N 3/047 (2023.01); G06N 3/048 (2023.01); G06N 5/01 (2023.01); G06N 20/20 (2019.01); G16H 10/60 (2018.01); G16H 40/20 (2018.01); G16H 50/70 (2018.01); G16H 70/20 (2018.01); G16H 70/40 (2018.01); G16H 70/60 (2018.01)] 19 Claims
OG exemplary drawing
 
1. A computing device configured for implementing a machine learning model using a Bayesian neural network, comprising:
one or more processors; and
one or more hardware-based non-transitory computer-readable memory devices storing instructions which, when executed by the one or more processors, cause the computing device to:
collect current healthcare data from one or more remote data structures;
prepare a plurality of non-sequential input features from the collected current healthcare data;
provide the prepared non-sequential input features to a wide component of the machine learning model, wherein the wide component implements a first plurality of rectified linear unit (ReLU) activation functions arranged in a first multi-level neural network;
obtain sequential features from the current healthcare data;
provide the sequential features to a set of long short-term memory (LSTM) models that are instantiated in a deep component of the machine learning model, wherein the deep component implements a second plurality of ReLU activation functions arranged in a second multi-level neural network;
combine the wide component, the deep component, and a prior component of the machine learning model in a wide and deep neural network that provides a posterior structure of the machine learning model, the prior component comprising one or more machine learning and/or deep learning models trained on historical healthcare data; and
provide the posterior structure to the one or more models of the prior component for use in connection with subsequent healthcare data.
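The claimed architecture can be pictured as follows. This is an illustrative sketch only, not the patented implementation: a wide component (a two-level ReLU network over non-sequential features) and a deep component (an LSTM summarizing sequential features, followed by a second two-level ReLU network) whose outputs are concatenated into a single membership probability. All dimensions, weight initializations, and the function names (`wide`, `deep`, `predict`) are assumptions chosen for the sketch; the prior/posterior Bayesian structure of the claim is not modeled here.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Wide component: first multi-level ReLU network over non-sequential features.
W1 = rng.standard_normal((8, 16)) * 0.1
W2 = rng.standard_normal((16, 16)) * 0.1

def wide(x_wide):
    return relu(relu(x_wide @ W1) @ W2)

# Minimal LSTM cell used by the deep component (hidden size H, input size D).
H, D = 16, 4
Wx = rng.standard_normal((D, 4 * H)) * 0.1
Wh = rng.standard_normal((H, 4 * H)) * 0.1

def lstm(seq):
    h, c = np.zeros(H), np.zeros(H)
    for x_t in seq:                      # iterate over time steps
        z = x_t @ Wx + h @ Wh
        i, f, g, o = np.split(z, 4)      # input, forget, cell, output gates
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
    return h                             # final hidden state summarizes the sequence

# Deep component: LSTM summary followed by a second multi-level ReLU network.
V1 = rng.standard_normal((H, 16)) * 0.1
V2 = rng.standard_normal((16, 16)) * 0.1

def deep(x_seq):
    return relu(relu(lstm(x_seq) @ V1) @ V2)

# Combine wide and deep outputs into one subpopulation-membership probability.
w_out = rng.standard_normal(32) * 0.1

def predict(x_wide, x_seq):
    return sigmoid(np.concatenate([wide(x_wide), deep(x_seq)]) @ w_out)

p = predict(rng.standard_normal(8), rng.standard_normal((5, D)))
print(0.0 < p < 1.0)  # True: sigmoid output is always a valid probability
```

Concatenating the two component outputs before the final sigmoid is the standard way wide-and-deep models jointly score memorized (non-sequential) and generalized (sequential) signals; the claim additionally feeds the resulting posterior structure back to the prior component's models for use with subsequent data, which is outside the scope of this sketch.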