US 12,442,654 B2
Intelligent electronic footwear and logic for navigation assistance by automated tactile, audio, and visual feedback
Christopher Andon, Portland, OR (US); and Hien Tommy Pham, Beaverton, OR (US)
Assigned to NIKE, Inc., Beaverton, OR (US)
Filed by NIKE, Inc., Beaverton, OR (US)
Filed on Jan. 24, 2024, as Appl. No. 18/421,794.
Application 18/421,794 is a continuation of application No. 18/311,450, filed on May 3, 2023.
Application 18/311,450 is a continuation of application No. 17/404,292, filed on Aug. 17, 2021, granted, now 11,678,713, issued on Jun. 20, 2023.
Application 17/404,292 is a continuation of application No. 16/874,944, filed on May 15, 2020, granted, now 11,122,852, issued on Sep. 21, 2021.
Application 16/874,944 is a continuation-in-part of application No. 16/809,749, filed on Mar. 5, 2020, granted, now 11,058,166, issued on Jul. 13, 2021.
Application 16/874,944 is a continuation-in-part of application No. 16/561,324, filed on Sep. 5, 2019, granted, now 10,674,783, issued on Jun. 9, 2020.
Application 16/809,749 is a continuation of application No. 16/414,353, filed on May 16, 2019, granted, now 10,681,954, issued on Jun. 16, 2020.
Application 16/414,353 is a division of application No. 16/114,648, filed on Aug. 28, 2018, granted, now 10,334,906, issued on Jul. 2, 2019.
Application 16/561,324 is a continuation of application No. 16/220,403, filed on Dec. 14, 2018, granted, now 10,441,020, issued on Oct. 15, 2019.
Application 16/220,403 is a continuation of application No. 16/114,632, filed on Aug. 28, 2018, granted, now 10,178,890, issued on Jan. 15, 2019.
Claims priority of provisional application 62/678,796, filed on May 31, 2018.
Prior Publication US 2024/0160245 A1, May 16, 2024
Int. Cl. G06F 1/16 (2006.01); A43B 3/34 (2022.01); A43B 3/36 (2022.01); A43B 3/44 (2022.01); A43B 3/48 (2022.01); A43B 3/50 (2022.01); A43C 19/00 (2006.01); B60Q 1/14 (2006.01); B60Q 1/26 (2006.01); B60Q 5/00 (2006.01); B60Q 9/00 (2006.01); G01C 21/36 (2006.01); G01S 5/00 (2006.01); G01S 19/19 (2010.01); G08B 7/06 (2006.01); G08B 21/02 (2006.01); G08G 1/005 (2006.01); G08G 1/16 (2006.01); H04W 4/021 (2018.01); H04W 4/80 (2018.01); A43C 11/16 (2006.01); B60Q 1/50 (2006.01); G01S 19/49 (2010.01)
CPC G01C 21/3652 (2013.01) [A43B 3/34 (2022.01); A43B 3/36 (2022.01); A43B 3/44 (2022.01); A43B 3/48 (2022.01); A43B 3/50 (2022.01); A43C 19/00 (2013.01); B60Q 1/14 (2013.01); B60Q 1/2673 (2013.01); B60Q 5/006 (2013.01); B60Q 9/008 (2013.01); G01S 5/0027 (2013.01); G01S 5/0072 (2013.01); G01S 19/19 (2013.01); G06F 1/163 (2013.01); G08B 7/06 (2013.01); G08B 21/02 (2013.01); G08G 1/005 (2013.01); G08G 1/162 (2013.01); G08G 1/166 (2013.01); H04W 4/021 (2013.01); H04W 4/80 (2018.02); A43C 11/165 (2013.01); B60Q 1/525 (2013.01); B60Q 2300/05 (2013.01); G01S 19/49 (2013.01)] 25 Claims
OG exemplary drawing
 
1. An intelligent electronic shoe (IES) system, comprising:
a shoe structure configured to receive and support thereon a foot of a user;
a fastening mechanism attached to the shoe structure and configured to secure the foot of the user to the shoe structure;
an alert system attached to the shoe structure and configured to generate visible, audible, and/or haptic outputs perceptible by the user, the alert system including an electric motor configured to selectively tension and untension the fastening mechanism;
a wireless communications device attached to the shoe structure and configured to wirelessly communicate with a remote computing node; and
a controller communicatively connected to the alert system and the wireless communications device, the controller being programmed to:
receive user location data indicative of a user location of the user;
determine if the user location is within a predetermined location relative to a building associated with the remote computing node;
receive, from the remote computing node via the wireless communications device after determining the user location, a series of steps corresponding to instructions for performing a physical activity; and
command the alert system to output one or more of the visible, audible, and/or haptic outputs for each step in the series of steps to thereby instruct the user to perform the physical activity, including commanding the electric motor to generate one or more user-perceptible tactile cues for one or more of the steps.
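The following Python sketch illustrates one possible reading of the controller logic recited in claim 1. It is not the patented implementation: the distance-threshold geofence, the Step fields, and the callback names (fetch_steps, command_alert, pulse_lace_motor) are hypothetical stand-ins for the wireless communications device, alert system, and lace-tensioning electric motor described above, and the "predetermined location relative to a building" is assumed here to be a simple radius around the building's coordinates.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt
from typing import Callable, List

EARTH_RADIUS_M = 6_371_000.0


def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two latitude/longitude points."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))


@dataclass
class Step:
    text: str               # instruction for one step of the physical activity
    visible: bool = False    # drive a visible output (e.g., shoe-mounted lights)
    audible: bool = False    # drive an audible output (e.g., a tone)
    haptic: bool = False     # drive a tactile cue via the lace-tensioning motor


@dataclass
class IESController:
    building_lat: float
    building_lon: float
    geofence_radius_m: float
    fetch_steps: Callable[[], List[Step]]     # wireless request to the remote computing node
    command_alert: Callable[[Step], None]     # alert system: visible/audible outputs
    pulse_lace_motor: Callable[[], None]      # electric motor: tension then untension the laces

    def on_location_update(self, user_lat: float, user_lon: float) -> None:
        # 1. Determine whether the user location is within the predetermined
        #    location relative to the building (here: a fixed radius).
        distance = haversine_m(user_lat, user_lon, self.building_lat, self.building_lon)
        if distance > self.geofence_radius_m:
            return

        # 2. Receive the series of steps from the remote computing node
        #    via the wireless communications device.
        steps = self.fetch_steps()

        # 3. Command the alert system to output a cue for each step,
        #    using the lace motor for the user-perceptible tactile cues.
        for step in steps:
            self.command_alert(step)
            if step.haptic:
                self.pulse_lace_motor()
```

In this sketch, each location update inside the geofence triggers one fetch of the step series and one pass over the cues; a production controller would additionally handle pairing with the remote computing node, debouncing repeated geofence entries, and sequencing the visible, audible, and haptic channels of the alert system.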