US 12,483,815 B2
Sensor systems and methods for characterizing health conditions
Nelson L. Jumbe, Mountain View, CA (US); Andreas Schuh, Mountain View, CA (US); Peter Rexelius, Mountain View, CA (US); Michael Morimoto, Mountain View, CA (US); Dimosthenis Katsis, Mountain View, CA (US); Nikola Knezevic, Mountain View, CA (US); Steve Krawczyk, Mountain View, CA (US); Kevin Hammond, Mountain View, CA (US); Krzysztof Krawiec, Mountain View, CA (US); and Gregory A. Kirkos, Mountain View, CA (US)
Assigned to LEVEL 42 AI, Mountain View, CA (US)
Filed by LEVEL 42 AI, Mountain View, CA (US)
Filed on Dec. 9, 2021, as Appl. No. 17/546,168.
Application 17/546,168 is a continuation of application No. 17/096,806, filed on Nov. 12, 2020, granted, now 11,240,579.
Claims priority of provisional application 63/075,056, filed on Sep. 4, 2020.
Claims priority of provisional application 63/075,059, filed on Sep. 4, 2020.
Claims priority of provisional application 63/067,179, filed on Aug. 18, 2020.
Claims priority of provisional application 63/022,362, filed on May 8, 2020.
Claims priority of provisional application 63/022,336, filed on May 8, 2020.
Prior Publication US 2022/0103922 A1, Mar. 31, 2022
Int. Cl. H04R 1/04 (2006.01); A61B 5/00 (2006.01); A61B 5/0531 (2021.01); A61B 5/277 (2021.01); A61B 5/318 (2021.01); A61B 7/04 (2006.01); A61B 8/00 (2006.01); G01P 1/00 (2006.01); G01P 15/08 (2006.01); G10L 25/66 (2013.01); H04R 1/46 (2006.01); H04R 9/02 (2006.01); H04R 9/04 (2006.01); H04R 9/08 (2006.01)
CPC H04R 1/04 (2013.01) [A61B 5/0002 (2013.01); A61B 5/0531 (2013.01); A61B 5/277 (2021.01); A61B 5/318 (2021.01); A61B 5/412 (2013.01); A61B 5/7267 (2013.01); A61B 7/04 (2013.01); A61B 8/488 (2013.01); G01P 1/00 (2013.01); G01P 15/08 (2013.01); G10L 25/66 (2013.01); H04R 1/46 (2013.01); H04R 9/025 (2013.01); H04R 9/045 (2013.01); H04R 9/08 (2013.01); A61B 2560/0214 (2013.01); A61B 2560/0252 (2013.01); A61B 2560/0257 (2013.01); A61B 2560/0431 (2013.01); A61B 2560/0443 (2013.01); A61B 2562/0204 (2013.01); A61B 2562/0219 (2013.01)] 19 Claims
OG exemplary drawing
 
1. A method comprising:
receiving vibroacoustic data corresponding to a first training set of subjects having a bodily condition and a second training set of subjects having an absence of the bodily condition, wherein the vibroacoustic data was recorded by sensing devices, and wherein each of the sensing devices comprises a vibroacoustic sensor module comprising a voice coil component, a magnet component, a connector, and a diaphragm;
segmenting the vibroacoustic data in the time domain into overlapping time windows;
splitting the overlapping time windows in the frequency domain into frequency ranges;
extracting feature sequences from the split windows;
training a machine learning model, using the feature sequences, to compute a biosignature corresponding to the bodily condition;
determining, by the trained machine learning model, and based on the biosignature, a bodily condition of a subject not part of the first or second training set; and
outputting an indication of the bodily condition of the subject.
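The claimed method amounts to a signal-processing and classification pipeline: window the vibroacoustic recordings in time, split each window into frequency ranges, extract features, train a model, and apply it to a new subject. The sketch below illustrates that pipeline on synthetic data only; the sampling rate, window length, overlap, band edges, per-band log-energy features, helper names (segment, band_features, featurize), and the logistic-regression classifier are all illustrative assumptions, since the claim does not specify a particular model, feature set, or parameters.

```python
# Minimal sketch of the claimed pipeline on synthetic data.
# All numeric parameters and the classifier choice are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

FS = 4000          # assumed sampling rate (Hz) of the vibroacoustic sensor
WIN = 1024         # window length in samples
HOP = 512          # 50% overlap between consecutive time windows
BANDS = [(20, 200), (200, 800), (800, 2000)]  # assumed frequency ranges (Hz)

def segment(signal, win=WIN, hop=HOP):
    """Segment a 1-D recording into overlapping time windows."""
    n = 1 + max(0, (len(signal) - win) // hop)
    return np.stack([signal[i * hop : i * hop + win] for i in range(n)])

def band_features(windows, fs=FS, bands=BANDS):
    """Split each window into frequency ranges; extract per-band log energy."""
    spectra = np.abs(np.fft.rfft(windows, axis=1)) ** 2
    freqs = np.fft.rfftfreq(windows.shape[1], d=1.0 / fs)
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(spectra[:, mask].sum(axis=1) + 1e-12))
    return np.column_stack(feats)   # shape: (num_windows, num_bands)

def featurize(recording):
    """Collapse a recording's window-level feature sequence to one vector."""
    return band_features(segment(recording)).mean(axis=0)

# Synthetic stand-ins for the two training sets (condition present / absent).
rng = np.random.default_rng(0)
pos = [rng.normal(0, 1.0, FS * 5) for _ in range(20)]   # bodily condition
neg = [rng.normal(0, 0.5, FS * 5) for _ in range(20)]   # condition absent

X = np.array([featurize(r) for r in pos + neg])
y = np.array([1] * len(pos) + [0] * len(neg))

# The trained weights play the role of the learned "biosignature".
model = LogisticRegression().fit(X, y)

# Apply the trained model to a subject outside the training sets and
# output an indication of the bodily condition.
new_subject = rng.normal(0, 1.0, FS * 5)
label = model.predict([featurize(new_subject)])[0]
print("condition present" if label else "condition absent")
```

The overlapping windows preserve transient vibroacoustic events that a non-overlapping segmentation could split, and per-band energies are one simple stand-in for the claim's "feature sequences"; any windowed time-frequency features and any trainable classifier would fit the same structure.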