US 12,295,784 B2
System and method for augmented reality data interaction for ultrasound imaging
Jonathan Silva, St. Louis, MO (US); Christopher Andrews, Chesterfield, MO (US); Jennifer Silva, St. Louis, MO (US); and Zahid Iqbal, Kirkwood, MO (US)
Assigned to Washington University, St. Louis, MO (US)
Appl. No. 17/799,826
Filed by Washington University, St. Louis, MO (US)
PCT Filed Apr. 13, 2021, PCT No. PCT/US2021/027064
§ 371(c)(1), (2) Date Aug. 15, 2022,
PCT Pub. No. WO2021/211570, PCT Pub. Date Oct. 21, 2021.
Claims priority of provisional application 63/008,997, filed on Apr. 13, 2020.
Prior Publication US 2023/0065505 A1, Mar. 2, 2023
Int. Cl. A61B 8/00 (2006.01); A61B 34/20 (2016.01)
CPC A61B 8/42 (2013.01) [A61B 34/20 (2016.02); A61B 8/4416 (2013.01)] 24 Claims
OG exemplary drawing
 
1. A mixed reality (MR) visualization system, comprising:
an MR device comprising a holographic display configured to display a holographic image to an operator;
a hand-held ultrasound imaging device configured to obtain a real-time ultrasound image of a subject's anatomy; and
a computing device communicatively coupled to the MR device and the hand-held ultrasound imaging device, the computing device comprising a non-volatile memory and a processor, wherein the computing device is configured to:
receive the real-time ultrasound image;
determine a real-time 3D position and orientation of the hand-held ultrasound imaging device;
generate a modified real-time ultrasound image by modifying the real-time ultrasound image to correspond to the real-time 3D position and orientation of the hand-held ultrasound imaging device; and
transmit the modified real-time ultrasound image to the MR device for display as the holographic image positioned at a predetermined location relative to the hand-held ultrasound imaging device.
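The claimed pipeline tracks the hand-held probe's real-time 3D position and orientation, transforms the ultrasound image to correspond to that pose, and anchors the hologram at a predetermined location relative to the device. A minimal sketch of that geometry is below. This is not the patent's implementation: all function names, the 4x4 row-major pose convention, the 5 cm forward offset, and the quad dimensions are assumptions chosen for illustration.

```python
# Illustrative sketch (not the patented implementation): positioning a 2D
# ultrasound frame as a holographic quad at a fixed offset relative to a
# tracked probe. Pose is a 4x4 row-major rigid transform (rotation +
# translation); corners are homogeneous [x, y, z, 1] vectors in meters.

def mat_vec(m, v):
    """Multiply a 4x4 row-major matrix by a homogeneous 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def image_plane_corners(width_m, height_m, offset_m=0.05):
    """Corners of the image quad in the probe's local frame, centered on
    the probe axis at a predetermined offset (hypothetical 5 cm) ahead
    of the transducer along +Z."""
    w, h = width_m / 2, height_m / 2
    return [
        [-w, -h, offset_m, 1.0],
        [ w, -h, offset_m, 1.0],
        [ w,  h, offset_m, 1.0],
        [-w,  h, offset_m, 1.0],
    ]

def place_hologram(probe_pose, width_m=0.06, height_m=0.08):
    """Map the image quad into world space with the probe's real-time
    pose, so the displayed hologram follows the hand-held device."""
    return [mat_vec(probe_pose, c)
            for c in image_plane_corners(width_m, height_m)]

# Identity pose: probe at the world origin, so the quad sits 5 cm
# along +Z; a translated pose shifts the quad with the probe.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
corners = place_hologram(identity)
```

In a full system, `probe_pose` would come from the MR device's tracking of the imaging probe each frame, and the real-time ultrasound frame would be texture-mapped onto the resulting quad for display on the holographic display.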