US 11,928,859 B2
Automated image analysis for diagnosing a medical condition
Daniel Nouri, Burlington, MA (US); Alex Rothberg, Burlington, MA (US); Matthew de Jonge, Burlington, MA (US); Jimmy Jia, Burlington, MA (US); Jonathan M. Rothberg, Burlington, MA (US); Michal Sofka, Burlington, MA (US); David Elgena, Burlington, MA (US); Mark Michalski, Burlington, MA (US); Tomer Gafner, Burlington, MA (US); and Abraham Neben, Burlington, MA (US)
Assigned to BFLY OPERATIONS, INC., Burlington, MA (US)
Filed by BFLY OPERATIONS, INC., Burlington, MA (US)
Filed on Oct. 27, 2022, as Appl. No. 17/975,268.
Application 17/975,268 is a continuation of application No. 15/626,954, filed on Jun. 19, 2017, granted, now Pat. No. 11,540,808.
Claims priority of provisional application 62/463,094, filed on Feb. 24, 2017.
Claims priority of provisional application 62/453,696, filed on Feb. 2, 2017.
Claims priority of provisional application 62/445,195, filed on Jan. 11, 2017.
Claims priority of provisional application 62/434,980, filed on Dec. 15, 2016.
Claims priority of provisional application 62/384,187, filed on Sep. 6, 2016.
Claims priority of provisional application 62/384,144, filed on Sep. 6, 2016.
Claims priority of provisional application 62/352,382, filed on Jun. 20, 2016.
Prior Publication US 2023/0117915 A1, Apr. 20, 2023
Int. Cl. A61B 5/00 (2006.01); A61B 8/00 (2006.01); A61B 8/02 (2006.01); A61B 8/06 (2006.01); A61B 8/08 (2006.01); G06F 18/2413 (2023.01); G06T 7/00 (2017.01); G06T 7/70 (2017.01); G06T 11/60 (2006.01); G06T 19/00 (2011.01); G06V 10/44 (2022.01); G06V 10/82 (2022.01); G06V 30/19 (2022.01); G06V 30/194 (2022.01); G06V 40/60 (2022.01); A61B 34/20 (2016.01); A61B 90/00 (2016.01)
CPC G06V 10/82 (2022.01) [A61B 8/02 (2013.01); A61B 8/06 (2013.01); A61B 8/065 (2013.01); A61B 8/085 (2013.01); A61B 8/4427 (2013.01); A61B 8/46 (2013.01); A61B 8/52 (2013.01); A61B 8/5207 (2013.01); A61B 8/5223 (2013.01); G06F 18/24133 (2023.01); G06T 7/0012 (2013.01); G06T 7/0014 (2013.01); G06T 7/70 (2017.01); G06T 11/60 (2013.01); G06T 19/006 (2013.01); G06V 10/454 (2022.01); G06V 30/19173 (2022.01); G06V 30/194 (2022.01); G06V 40/67 (2022.01); A61B 8/0833 (2013.01); A61B 8/0883 (2013.01); A61B 8/4263 (2013.01); A61B 8/463 (2013.01); A61B 8/5215 (2013.01); A61B 2034/2065 (2016.02); A61B 2090/365 (2016.02); A61B 2090/378 (2016.02); A61B 2090/3937 (2016.02); G06T 2207/10132 (2013.01); G06T 2207/20081 (2013.01); G06T 2207/20084 (2013.01); G06T 2207/20221 (2013.01); G06T 2207/30048 (2013.01); G06T 2207/30061 (2013.01); G06T 2210/41 (2013.01); G06V 2201/03 (2022.01)] 20 Claims
OG exemplary drawing
 
1. An ultrasound system, comprising:
an ultrasound device; and
a handheld computing device in operative communication with the ultrasound device, the handheld computing device comprising a display screen and configured to:
obtain an electronic medical record of a subject;
identify an organ associated with medical information contained in the electronic medical record;
identify a target anatomical view of the identified organ of the subject to be imaged by the ultrasound device;
display, on the display screen, a coarse instruction for capturing ultrasound data representing the target anatomical view, the coarse instruction comprising:
a graphical image of the subject; and
a graphical indication of where an operator should place the ultrasound device in order to capture the ultrasound data representing the target anatomical view, wherein the indication is superimposed on the graphical image of the subject; and
display, on the display screen, a fine instruction for capturing the ultrasound data representing the target anatomical view, the fine instruction comprising an instruction selected from the group consisting of translating, rotating, and tilting the ultrasound device in order to capture the ultrasound data representing the target anatomical view.
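The claimed workflow proceeds in stages: obtain the subject's electronic medical record, identify an associated organ, select a target anatomical view, display a coarse placement instruction (a probe marker superimposed on an image of the subject), and then display a fine instruction (translate, rotate, or tilt). The sketch below is a minimal, hypothetical illustration of that pipeline; all names (`CONDITION_TO_ORGAN`, `fine_instruction`, the marker coordinates, the alignment-score thresholds) are invented for illustration and do not come from the patent.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical lookup tables mapping EMR conditions to organs and standard views.
CONDITION_TO_ORGAN = {"congestive heart failure": "heart"}
ORGAN_TO_TARGET_VIEW = {"heart": "parasternal long axis (PLAX)"}


class FineAdjustment(Enum):
    # The claim recites a fine instruction selected from this group.
    TRANSLATE = "translate"
    ROTATE = "rotate"
    TILT = "tilt"


@dataclass
class CoarseInstruction:
    subject_image: str             # graphical image of the subject (placeholder path)
    marker_position: tuple         # (x, y) where the probe marker is superimposed


@dataclass
class FineInstruction:
    adjustment: FineAdjustment


def identify_organ(emr: dict) -> str:
    """Identify an organ associated with medical information in the EMR."""
    return CONDITION_TO_ORGAN[emr["condition"].lower()]


def target_view(organ: str) -> str:
    """Select the target anatomical view of the identified organ."""
    return ORGAN_TO_TARGET_VIEW[organ]


def coarse_instruction(organ: str) -> CoarseInstruction:
    # Hypothetical fixed marker position for the organ's standard probe placement.
    positions = {"heart": (120, 80)}
    return CoarseInstruction(subject_image="subject_torso.png",
                             marker_position=positions[organ])


def fine_instruction(alignment_score: float) -> FineInstruction:
    # Toy rule standing in for the patent's guidance logic: tilt when nearly
    # aligned, rotate when moderately off, otherwise translate the device.
    if alignment_score > 0.8:
        return FineInstruction(FineAdjustment.TILT)
    if alignment_score > 0.5:
        return FineInstruction(FineAdjustment.ROTATE)
    return FineInstruction(FineAdjustment.TRANSLATE)
```

For example, an EMR noting congestive heart failure would yield the heart as the organ, the PLAX view as the target, a coarse marker over the chest, and a fine adjustment based on how close the current view is to the target.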