US 11,861,887 B2
Augmented reality interface for assisting a user to operate an ultrasound device
Matthew de Jonge, Brooklyn, NY (US); Robert Schneider, Killingworth, CT (US); David Elgena, Orlando, FL (US); Alex Rothberg, New York, NY (US); Jonathan M. Rothberg, Miami Beach, FL (US); Michal Sofka, Princeton, NJ (US); Tomer Gafner, Forest Hills, NY (US); Karl Thiele, St. Petersburg, FL (US); and Abraham Neben, Guilford, CT (US)
Assigned to BFLY OPERATIONS, INC., Burlington, MA (US)
Filed by BFLY Operations, Inc., Guilford, CT (US)
Filed on Sep. 7, 2021, as Appl. No. 17/468,633.
Application 17/468,633 is a continuation of application No. 16/889,944, filed on Jun. 2, 2020, granted, now Pat. No. 11,185,307.
Application 16/889,944 is a continuation of application No. 15/626,771, filed on Jun. 19, 2017, granted, now Pat. No. 10,702,242.
Claims priority of provisional application 62/463,094, filed on Feb. 24, 2017.
Claims priority of provisional application 62/453,696, filed on Feb. 2, 2017.
Claims priority of provisional application 62/445,195, filed on Jan. 11, 2017.
Claims priority of provisional application 62/434,980, filed on Dec. 15, 2016.
Claims priority of provisional application 62/384,144, filed on Sep. 6, 2016.
Claims priority of provisional application 62/384,187, filed on Sep. 6, 2016.
Claims priority of provisional application 62/352,382, filed on Jun. 20, 2016.
Prior Publication US 2022/0167945 A1, Jun. 2, 2022
Int. Cl. G06V 10/82 (2022.01); A61B 8/08 (2006.01); G06V 10/44 (2022.01); G06V 40/60 (2022.01); G06F 18/2413 (2023.01); G06V 30/19 (2022.01); G06V 30/194 (2022.01); A61B 8/06 (2006.01); A61B 8/00 (2006.01); G06T 7/70 (2017.01); G06T 19/00 (2011.01); A61B 8/02 (2006.01); G06T 7/00 (2017.01); G06T 11/60 (2006.01); A61B 90/00 (2016.01); A61B 34/20 (2016.01)
CPC G06V 10/82 (2022.01) [A61B 8/02 (2013.01); A61B 8/06 (2013.01); A61B 8/065 (2013.01); A61B 8/085 (2013.01); A61B 8/4427 (2013.01); A61B 8/46 (2013.01); A61B 8/52 (2013.01); A61B 8/5207 (2013.01); A61B 8/5223 (2013.01); G06F 18/24133 (2023.01); G06T 7/0012 (2013.01); G06T 7/0014 (2013.01); G06T 7/70 (2017.01); G06T 11/60 (2013.01); G06T 19/006 (2013.01); G06V 10/454 (2022.01); G06V 30/194 (2022.01); G06V 30/19173 (2022.01); G06V 40/67 (2022.01); A61B 8/0833 (2013.01); A61B 8/0883 (2013.01); A61B 8/4263 (2013.01); A61B 8/463 (2013.01); A61B 8/5215 (2013.01); A61B 2034/2065 (2016.02); A61B 2090/365 (2016.02); A61B 2090/378 (2016.02); A61B 2090/3937 (2016.02); G06T 2207/10132 (2013.01); G06T 2207/20081 (2013.01); G06T 2207/20084 (2013.01); G06T 2207/20221 (2013.01); G06T 2207/30048 (2013.01); G06T 2207/30061 (2013.01); G06T 2210/41 (2013.01); G06V 2201/03 (2022.01)] 15 Claims
OG exemplary drawing
 
1. An apparatus, comprising:
at least one processor configured to:
obtain an ultrasound image of a subject;
identify at least one anatomical feature of the subject in the ultrasound image using an automated image processing technique;
identify a value of an ejection fraction of the subject using the at least one anatomical feature in the ultrasound image; and
form a composite image including the ultrasound image and the value of the ejection fraction,
wherein the at least one processor is further configured to obtain the ultrasound image by guiding an operator of an ultrasound device to capture the ultrasound image of the subject,
wherein guiding the operator of the ultrasound device comprises providing the ultrasound image as an input to a first multi-layer neural network, and
wherein the at least one processor is further configured to identify the at least one anatomical feature of the subject by providing the ultrasound image as an input to a second multi-layer neural network that is different from the first multi-layer neural network.
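For illustration only, the following Python sketch mirrors the structure recited in claim 1: a first multi-layer network that guides the operator, a second, different network that identifies an anatomical feature, an ejection fraction computed from that feature, and a composite image combining the ultrasound frame with the value. The function names, the left-ventricle mask, and the area-based volume estimate are assumptions for the sketch, not the patented implementation.

```python
# Hypothetical sketch of the claim-1 pipeline; all model behavior is stubbed.
import numpy as np


def guidance_net(frame: np.ndarray) -> str:
    """First multi-layer network (stub): maps an ultrasound frame to an
    operator instruction such as a probe-movement direction."""
    return "HOLD" if frame.mean() > 0.5 else "MOVE_MEDIALLY"


def segmentation_net(frame: np.ndarray) -> np.ndarray:
    """Second, different multi-layer network (stub): labels an assumed
    left-ventricle region in the frame and returns a binary mask."""
    h, w = frame.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[h // 4: 3 * h // 4, w // 4: 3 * w // 4] = True  # placeholder region
    return mask


def lv_volume(mask: np.ndarray) -> float:
    """Toy volume estimate from mask area; real systems use geometric
    models such as the method of disks."""
    return float(mask.sum()) ** 1.5


def ejection_fraction(ed_mask: np.ndarray, es_mask: np.ndarray) -> float:
    """EF = (EDV - ESV) / EDV from end-diastolic and end-systolic masks."""
    edv, esv = lv_volume(ed_mask), lv_volume(es_mask)
    return (edv - esv) / edv


def composite_image(frame: np.ndarray, mask: np.ndarray, ef: float) -> dict:
    """Form a composite of the ultrasound image, the identified anatomical
    feature, and the ejection fraction value."""
    overlay = frame.copy()
    overlay[mask] = 1.0  # highlight the identified feature
    return {"image": overlay, "label": f"EF: {ef:.0%}"}


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ed_frame, es_frame = rng.random((64, 64)), rng.random((64, 64))
    print(guidance_net(ed_frame))                       # operator guidance step
    ed_mask = segmentation_net(ed_frame)
    es_mask = segmentation_net(es_frame) & (rng.random((64, 64)) > 0.3)
    ef = ejection_fraction(ed_mask, es_mask)
    print(composite_image(ed_frame, ed_mask, ef)["label"])
```

The separation of `guidance_net` and `segmentation_net` is the point of the sketch: the claim requires two distinct multi-layer neural networks, one used while capturing the image and one used afterward to identify the anatomical feature from which the ejection fraction is derived.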