CPC G06V 10/82 (2022.01) [A61B 8/02 (2013.01); A61B 8/06 (2013.01); A61B 8/065 (2013.01); A61B 8/085 (2013.01); A61B 8/4427 (2013.01); A61B 8/46 (2013.01); A61B 8/52 (2013.01); A61B 8/5207 (2013.01); A61B 8/5223 (2013.01); G06F 18/24133 (2023.01); G06T 7/0012 (2013.01); G06T 7/0014 (2013.01); G06T 7/70 (2017.01); G06T 11/60 (2013.01); G06T 19/006 (2013.01); G06V 10/454 (2022.01); G06V 30/194 (2022.01); G06V 30/19173 (2022.01); G06V 40/67 (2022.01); A61B 8/0833 (2013.01); A61B 8/0883 (2013.01); A61B 8/4263 (2013.01); A61B 8/463 (2013.01); A61B 8/5215 (2013.01); A61B 2034/2065 (2016.02); A61B 2090/365 (2016.02); A61B 2090/378 (2016.02); A61B 2090/3937 (2016.02); G06T 2207/10132 (2013.01); G06T 2207/20081 (2013.01); G06T 2207/20084 (2013.01); G06T 2207/20221 (2013.01); G06T 2207/30048 (2013.01); G06T 2207/30061 (2013.01); G06T 2210/41 (2013.01); G06V 2201/03 (2022.01)]
15 Claims
1. An apparatus, comprising:
at least one processor configured to:
obtain an ultrasound image of a subject;
identify at least one anatomical feature of the subject in the ultrasound image using an automated image processing technique;
identify a value of an ejection fraction of the subject using the at least one anatomical feature in the ultrasound image; and
form a composite image including the ultrasound image and the value of the ejection fraction,
wherein the at least one processor is further configured to obtain the ultrasound image by guiding an operator of an ultrasound device to capture the ultrasound image of the subject,
wherein guiding the operator of the ultrasound device comprises providing the ultrasound image as an input to a first multi-layer neural network, and
wherein the at least one processor is further configured to identify the at least one anatomical feature of the subject by providing the ultrasound image as an input to a second multi-layer neural network that is different from the first multi-layer neural network.
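The claim above recites a pipeline in which a first multi-layer neural network guides the operator to capture a usable ultrasound image, a second, distinct network identifies an anatomical feature in that image, an ejection-fraction value is computed, and a composite image is formed. A minimal sketch of that data flow follows; the function names, the stand-in "networks," and the sample measurements are all hypothetical placeholders, not the patented implementation. The only domain fact assumed is the standard ejection-fraction formula, EF = (EDV − ESV) / EDV.

```python
# Hedged sketch of the claimed two-network pipeline. All identifiers here are
# illustrative placeholders; real systems would use trained models, not these
# stand-ins. An "image" is modeled as a flat list of pixel intensities.

def guidance_network(image):
    # Stand-in for the FIRST multi-layer neural network of the claim: maps an
    # ultrasound frame to an operator instruction (hypothetical heuristic).
    return "hold steady" if sum(image) / len(image) > 0.3 else "reposition probe"

def feature_network(image):
    # Stand-in for the SECOND, different network: identifies an anatomical
    # feature, here as the indices of bright pixels (hypothetical).
    return [i for i, px in enumerate(image) if px > 0.5]

def ejection_fraction(edv, esv):
    # Standard formula: (end-diastolic volume - end-systolic volume) / EDV.
    return (edv - esv) / edv

def compose(image, ef):
    # Form a composite "image": the frame annotated with the EF value.
    return {"frame": image, "label": "EF {:.0%}".format(ef)}

def pipeline(frames, edv, esv):
    # Capture loop: use the guidance network until a frame is acceptable,
    # then run feature identification, compute EF, and compose the output.
    for frame in frames:
        if guidance_network(frame) == "hold steady":
            feature = feature_network(frame)
            ef = ejection_fraction(edv, esv)
            return compose(frame, ef), feature
    return None, None
```

As a usage illustration, feeding the pipeline one low-signal frame followed by one acceptable frame, with sample volumes EDV = 120 mL and ESV = 50 mL, yields a composite labeled with the corresponding ejection fraction:

```python
composite, feature = pipeline([[0.1, 0.1], [0.2, 0.9]], edv=120, esv=50)
# composite["label"] is "EF 58%"; feature is [1] (the bright pixel's index)
```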