US 12,251,617 B2
Technology adapted to facilitate user-specific calibration of instrumented mouthguard devices, including calibration methods for instrumented mouthguard devices
Michael Vegar, Queenscliff (AU); and Ben Nizette, Queenscliff (AU)
Assigned to HitIQ Limited, South Melbourne (AU)
Appl. No. 17/906,047
Filed by HitIQ Limited, South Melbourne (AU)
PCT Filed Mar. 10, 2021, PCT No. PCT/AU2021/050209
§ 371(c)(1), (2) Date Sep. 9, 2022,
PCT Pub. No. WO2021/179041, PCT Pub. Date Sep. 16, 2021.
Claims priority of application No. 2020900733 (AU), filed on Mar. 10, 2020.
Prior Publication US 2023/0107952 A1, Apr. 6, 2023
Int. Cl. A63B 71/08 (2006.01)
CPC A63B 71/085 (2013.01) [A63B 2220/40 (2013.01); A63B 2220/53 (2013.01); A63B 2220/803 (2013.01)] 25 Claims
OG exemplary drawing
 
1. A method for customizing an instrumented mouthguard device for a subject, which is performable on an instrumented mouthguard formed via the following steps:
performing a 3D scan of the subject's mouthguard fitting region;
based on the 3D scan of the subject's mouthguard fitting region, forming a mouthguard body from resilient plastics material;
mounting a plurality of motion sensor components to the formed mouthguard body; and
performing additional construction steps thereby to form the instrumented mouthguard;
the method including:
attaching the instrumented mouthguard to a moving unit that is configured to move the instrumented mouthguard in a predefined manner;
capturing data from the plurality of motion sensor components during the movement of the instrumented mouthguard in the predefined manner;
processing the captured data thereby to identify relative orientation and position of the plurality of motion sensor components;
performing a 3D scan of the subject's head;
based on the 3D scan of the subject's head, estimating relative locations of the subject's jaw and brain; and
based on (A) the estimated relative locations of the subject's jaw and brain; and (B) the relative orientation and position of the plurality of motion sensor components, defining transforms for data received from the motion sensor components that approximate accelerations at a defined point on the subject's head.
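
Illustrative sketch (not part of the patent text): the claim leaves open how the data captured during the predefined motion are processed to identify the relative orientation of the sensors. One standard approach, assumed here purely for illustration, uses the fact that gyroscopes rigidly mounted to the same body observe the same angular velocity expressed in their own sensor frames, so the rotation between two sensor frames can be recovered by least squares (Wahba's problem, solved with the Kabsch/SVD method). The Python function below and its name are assumptions, not the patented implementation.

    import numpy as np

    def relative_orientation(omega_a, omega_b):
        # Estimate the rotation R such that omega_a[i] ~= R @ omega_b[i].
        # omega_a, omega_b: (N, 3) angular-velocity samples from two sensors
        # rigidly mounted to the same mouthguard body, recorded while the
        # moving unit drives the mouthguard through the predefined motion.
        H = omega_b.T @ omega_a                  # 3x3 cross-covariance of the two streams
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection solution
        return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T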
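
Illustrative sketch (not part of the patent text): the final step, defining transforms so that sensor data approximate accelerations at a defined point on the head, can be related to the standard rigid-body relation a_P = a_S + alpha x r + omega x (omega x r), where r is the lever arm from a sensor to the point of interest (for example, an estimate of the brain centre derived from the head scan). The function, variable names, and numerical values below are assumptions for illustration only.

    import numpy as np

    def acceleration_at_point(a_sensor, omega, alpha, r):
        # Translate the linear acceleration measured at a rigidly mounted sensor
        # to another point on the same rigid body (head plus fitted mouthguard).
        #   a_sensor: (3,) linear acceleration at the sensor location, m/s^2
        #   omega:    (3,) angular velocity of the body, rad/s
        #   alpha:    (3,) angular acceleration (e.g. filtered derivative of omega), rad/s^2
        #   r:        (3,) lever arm from the sensor to the point of interest, m
        # All quantities must be expressed in the same body-fixed frame.
        return a_sensor + np.cross(alpha, r) + np.cross(omega, np.cross(omega, r))

    # Example with made-up numbers: an impact measured at the teeth translated
    # roughly 7 cm to an assumed brain-centre location.
    a_head = acceleration_at_point(
        a_sensor=np.array([490.0, 0.0, 0.0]),
        omega=np.array([0.0, 20.0, 0.0]),
        alpha=np.array([0.0, 1500.0, 0.0]),
        r=np.array([0.0, 0.0, 0.07]),
    )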