US 11,694,137 B2
Re-training a model for abnormality detection in medical scans based on a re-contrasted training set
Li Yao, San Francisco, CA (US); Jordan Prosky, San Francisco, CA (US); Eric C. Poblenz, Palo Alto, CA (US); Kevin Lyman, Fords, NJ (US); Ben Covington, Berkeley, CA (US); and Anthony Upton, Malvern (AU)
Assigned to Enlitic, Inc., Fort Collins, CO (US)
Filed by Enlitic, Inc., San Francisco, CA (US)
Filed on Mar. 25, 2022, as Appl. No. 17/656,526.
Application 17/656,526 is a continuation of application No. 16/360,682, filed on Mar. 21, 2019, granted, now Pat. No. 11,322,233.
Claims priority of provisional application 62/770,334, filed on Nov. 21, 2018.
Prior Publication US 2022/0215918 A1, Jul. 7, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. G06Q 10/0631 (2023.01); G16H 10/60 (2018.01); G16H 30/40 (2018.01); G16H 15/00 (2018.01); G06T 5/00 (2006.01); G06T 5/50 (2006.01); G06T 7/00 (2017.01); G06T 11/00 (2006.01); G06N 5/04 (2023.01); G16H 30/20 (2018.01); G06N 20/00 (2019.01); G06F 9/54 (2006.01); G06T 7/187 (2017.01); G06T 7/11 (2017.01); G06F 3/0482 (2013.01); G06T 3/40 (2006.01); A61B 5/00 (2006.01); G16H 50/20 (2018.01); G06F 21/62 (2013.01); G06Q 20/14 (2012.01); G16H 40/20 (2018.01); G06F 3/0484 (2022.01); G16H 10/20 (2018.01); G06N 5/045 (2023.01); G06T 7/10 (2017.01); G06T 11/20 (2006.01); G06F 16/245 (2019.01); G06T 7/44 (2017.01); G06N 20/20 (2019.01); H04L 67/12 (2022.01); H04L 67/01 (2022.01); G06V 10/82 (2022.01); G06F 18/40 (2023.01); G06F 18/214 (2023.01); G06F 18/21 (2023.01); G06F 18/2115 (2023.01); G06F 18/2415 (2023.01); G06V 10/25 (2022.01); G06V 30/19 (2022.01); G06V 10/764 (2022.01); G06V 40/16 (2022.01); G06V 10/22 (2022.01); G16H 50/70 (2018.01); G06T 7/70 (2017.01); G16H 50/30 (2018.01); A61B 5/055 (2006.01); A61B 6/03 (2006.01); A61B 8/00 (2006.01); A61B 6/00 (2006.01); G06Q 50/22 (2018.01); G06F 40/295 (2020.01); G06F 18/24 (2023.01); G06F 18/2111 (2023.01); G06V 30/194 (2022.01)
CPC G06Q 10/06315 (2013.01) [A61B 5/7264 (2013.01); G06F 3/0482 (2013.01); G06F 3/0484 (2013.01); G06F 9/542 (2013.01); G06F 16/245 (2019.01); G06F 18/214 (2023.01); G06F 18/217 (2023.01); G06F 18/2115 (2023.01); G06F 18/2415 (2023.01); G06F 18/41 (2023.01); G06F 21/6254 (2013.01); G06N 5/04 (2013.01); G06N 5/045 (2013.01); G06N 20/00 (2019.01); G06N 20/20 (2019.01); G06Q 20/14 (2013.01); G06T 3/40 (2013.01); G06T 5/002 (2013.01); G06T 5/008 (2013.01); G06T 5/50 (2013.01); G06T 7/0012 (2013.01); G06T 7/0014 (2013.01); G06T 7/10 (2017.01); G06T 7/11 (2017.01); G06T 7/187 (2017.01); G06T 7/44 (2017.01); G06T 7/97 (2017.01); G06T 11/001 (2013.01); G06T 11/006 (2013.01); G06T 11/206 (2013.01); G06V 10/225 (2022.01); G06V 10/25 (2022.01); G06V 10/764 (2022.01); G06V 10/82 (2022.01); G06V 30/19173 (2022.01); G06V 40/171 (2022.01); G16H 10/20 (2018.01); G16H 10/60 (2018.01); G16H 15/00 (2018.01); G16H 30/20 (2018.01); G16H 30/40 (2018.01); G16H 40/20 (2018.01); G16H 50/20 (2018.01); H04L 67/01 (2022.05); H04L 67/12 (2013.01); A61B 5/055 (2013.01); A61B 6/032 (2013.01); A61B 6/5217 (2013.01); A61B 8/4416 (2013.01); G06F 18/2111 (2023.01); G06F 18/24 (2023.01); G06F 40/295 (2020.01); G06Q 50/22 (2013.01); G06T 7/70 (2017.01); G06T 2200/24 (2013.01); G06T 2207/10048 (2013.01); G06T 2207/10081 (2013.01); G06T 2207/10088 (2013.01); G06T 2207/10116 (2013.01); G06T 2207/10132 (2013.01); G06T 2207/20076 (2013.01); G06T 2207/20081 (2013.01); G06T 2207/20084 (2013.01); G06T 2207/30004 (2013.01); G06T 2207/30008 (2013.01); G06T 2207/30016 (2013.01); G06T 2207/30061 (2013.01); G06V 30/194 (2022.01); G06V 2201/03 (2022.01); G16H 50/30 (2018.01); G16H 50/70 (2018.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising:
generating first contrast significance data for a first computer vision model, wherein the first computer vision model was generated from a first training set of medical scans;
identifying first significant contrast parameters based on the first contrast significance data;
generating a first re-contrasted training set based on performing a first intensity transformation function on the first training set of medical scans, wherein the first intensity transformation function utilizes the first significant contrast parameters;
generating a first re-trained model from the first re-contrasted training set, wherein the first re-contrasted training set is associated with corresponding output labels based on abnormality data for the first training set of medical scans;
generating re-contrasted image data of a new medical scan based on performing the first intensity transformation function on the new medical scan;
generating inference data indicating at least one abnormality detected in the new medical scan based on utilizing the first re-trained model on the re-contrasted image data; and
transmitting the inference data for display.
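
The following Python sketch illustrates one way the pipeline recited in claim 1 could be realized. It is an illustrative assumption, not the patented implementation: the window-and-rescale intensity transformation, the accuracy-based significance score, and the toy logistic-regression "model" (apply_window, contrast_significance, train_model, model_predict) are hypothetical stand-ins for the claimed intensity transformation function, contrast significance data, and computer vision model.

import numpy as np

def _sigmoid(z):
    # numerically stable sigmoid
    return 0.5 * (1.0 + np.tanh(0.5 * z))

def apply_window(scans, center, width):
    # Simple intensity windowing: clip to [center - width/2, center + width/2]
    # and rescale to [0, 1]; a stand-in for the claimed intensity transformation function.
    lo, hi = center - width / 2.0, center + width / 2.0
    return (np.clip(scans, lo, hi) - lo) / (hi - lo)

def train_model(scans, labels, steps=500, lr=0.5):
    # Toy logistic regression on mean scan intensity -- a placeholder for the
    # "computer vision model" (e.g., a convolutional network) of the claim.
    x = scans.reshape(len(scans), -1).mean(axis=1)
    mu, sd = x.mean(), x.std() + 1e-8          # keep normalization stats with the model
    xs = (x - mu) / sd
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = _sigmoid(w * xs + b)
        g = p - labels                          # gradient of the log loss
        w -= lr * np.mean(g * xs)
        b -= lr * np.mean(g)
    return {"w": w, "b": b, "mu": mu, "sd": sd}

def model_predict(model, scans):
    x = scans.reshape(len(scans), -1).mean(axis=1)
    xs = (x - model["mu"]) / model["sd"]
    return _sigmoid(model["w"] * xs + model["b"])

def contrast_significance(model, scans, labels, window_widths):
    # Score candidate contrast windows by how well the fixed first model
    # separates the labels after each window is applied -- a hypothetical
    # proxy for the claimed "contrast significance data".
    scores = {}
    for width in window_widths:
        windowed = apply_window(scans, center=scans.mean(), width=width)
        preds = model_predict(model, windowed)
        scores[width] = np.mean((preds > 0.5) == labels)
    return scores

# End-to-end walk-through of claim 1 on synthetic data.
rng = np.random.default_rng(0)
scans = rng.normal(0.5, 0.15, size=(64, 8, 8))                        # fake training scans
labels = (scans.reshape(64, -1).mean(axis=1) > 0.5).astype(float)     # fake abnormality labels

first_model = train_model(scans, labels)                              # first computer vision model
significance = contrast_significance(first_model, scans, labels,
                                      window_widths=[0.2, 0.5, 1.0])  # contrast significance data
best_width = max(significance, key=significance.get)                  # significant contrast parameters

recontrasted = apply_window(scans, center=scans.mean(), width=best_width)  # re-contrasted training set
retrained_model = train_model(recontrasted, labels)                   # first re-trained model

new_scan = rng.normal(0.6, 0.15, size=(1, 8, 8))                      # new medical scan
new_recontrasted = apply_window(new_scan, center=scans.mean(), width=best_width)
inference = model_predict(retrained_model, new_recontrasted)          # inference data
print(f"abnormality probability for the new scan: {inference[0]:.2f}")

In this sketch the same contrast parameters selected from the training set are reused on the new scan before inference, mirroring the claim's requirement that the re-trained model be applied to image data re-contrasted with the same first intensity transformation function.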