US 12,270,676 B2
Navigation methods and apparatus for the visually impaired
Benjamin Kimia, Providence, RI (US)
Assigned to Brown University, Providence, RI (US)
Filed by Brown University, Providence, RI (US)
Filed on Aug. 21, 2023, as Appl. No. 18/453,134.
Application 18/453,134 is a continuation of application No. 17/465,745, filed on Sep. 2, 2021, abandoned.
Application 17/465,745 is a continuation of application No. 15/697,966, filed on Sep. 7, 2017, granted, now 11,112,261, issued on Sep. 7, 2021.
Application 15/697,966 is a continuation of application No. 14/707,163, filed on May 8, 2015, abandoned.
Claims priority of provisional application 61/990,638, filed on May 8, 2014.
Prior Publication US 2023/0392944 A1, Dec. 7, 2023
Int. Cl. G06T 7/80 (2017.01); G01C 21/20 (2006.01); G01C 21/36 (2006.01); G06F 3/00 (2006.01); G06F 3/01 (2006.01); G06V 20/20 (2022.01)
CPC G01C 21/3652 (2013.01) [G01C 21/206 (2013.01); G01C 21/3602 (2013.01); G01C 21/3629 (2013.01); G06F 3/005 (2013.01); G06F 3/016 (2013.01); G06T 7/80 (2017.01); G06V 20/20 (2022.01); G06T 2207/10016 (2013.01); G06T 2207/10021 (2013.01); G06T 2207/10024 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/30244 (2013.01)] 5 Claims
OG exemplary drawing
 
1. A system for guiding a visually impaired user comprising:
a remote database for storing a map, said map comprising an undirected graph of nodes and edges wherein each node comprises:
location data corresponding to a location;
previously acquired image data associated with said location; and
annotations associated with said location;
an image acquisition system configured to be worn by a user, said image acquisition system comprising:
two or more forward-looking cameras for capturing, in real time, forward image data from the front of the user; and
two or more side-looking cameras for capturing, in real time, peripheral image data from either side of the user;
an odometry module for determining motion of the user based on inertial data and visual data;
a processor configured to retrieve said map based upon the user's location and issue auditory and non-auditory guidance to the user to traverse a path defined by said map based upon comparison of said forward image data and said peripheral image data to said previously acquired image data and the motion of the user, said processor further configured to be in communication with said remote database for remotely accessing said map and for sending current location data to said remote database, the processor further configured to execute a method comprising:
performing feature detection on the continuously captured forward image data and peripheral image data to obtain a plurality of features, wherein feature detection includes identifying curves and edges in, and evaluating metadata of, the continuously captured image data;
comparing the plurality of features from the continuously captured image data with stored path data from said map to determine the user's location;
automatically selecting a traversable path from the stored path data corresponding to the user's location and a user-selected destination location;
dynamically updating said traversable path based on the continuously captured image data, wherein an origin node corresponds to the user's current location and a destination node corresponds to the user's target location, both locations being continuously captured and dynamically updated;
presenting a sequence of commands indicating the direction in which the user needs to move along the traversable path from the current location to the target location; and
dynamically updating the sequence of commands based on the user's progress as the user follows the commands from the current location toward the target location, the updating based on performing feature detection on the continuously captured image data to identify curves and edges of moving objects within the traversable path, and comparing those curves and edges with the curves and edges of obstacles or non-moving objects within the traversable path;
a plurality of haptic devices for rendering real-time non-auditory navigation guidance; and
an audio interface for providing real-time auditory navigation guidance.
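The claim's map is an undirected graph whose nodes carry location data, previously acquired image data, and annotations. A minimal Python sketch of that data structure follows; the class names, field names, and coordinate convention are illustrative assumptions, not the patent's disclosed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class MapNode:
    # Location data corresponding to a physical location
    # ((latitude, longitude) is an assumed convention).
    location: tuple
    # Previously acquired image data associated with the location;
    # file paths are used here purely for illustration.
    images: list = field(default_factory=list)
    # Free-form annotations associated with the location.
    annotations: dict = field(default_factory=dict)

class NavigationMap:
    """Undirected graph of nodes and edges, per the claim."""
    def __init__(self):
        self.nodes = {}       # node_id -> MapNode
        self.adjacency = {}   # node_id -> {neighbor_id: edge_length_m}

    def add_node(self, node_id, node):
        self.nodes[node_id] = node
        self.adjacency.setdefault(node_id, {})

    def add_edge(self, a, b, length_m):
        # Undirected graph: record the edge in both directions.
        self.adjacency[a][b] = length_m
        self.adjacency[b][a] = length_m

# Example: two nodes joined by a 12 m walkable edge.
nav_map = NavigationMap()
nav_map.add_node("lobby", MapNode(location=(41.8268, -71.4025),
                                  images=["lobby_cam0.png"],
                                  annotations={"label": "main lobby"}))
nav_map.add_node("hall", MapNode(location=(41.8269, -71.4026)))
nav_map.add_edge("lobby", "hall", length_m=12.0)
```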
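The claimed odometry module determines the user's motion from inertial and visual data. One common way to fuse the two is a complementary filter that trusts gyroscope integration over short intervals and lets a visual heading estimate correct slow drift; the sketch below assumes that approach and is not drawn from the specification.

```python
class ComplementaryOdometry:
    """Fuses a gyro-integrated heading with a visual heading estimate.

    alpha close to 1.0 favors the inertial path; the visual estimate
    corrects slow drift. All names and the 0.98 default are illustrative.
    """
    def __init__(self, alpha=0.98):
        self.alpha = alpha
        self.heading_rad = 0.0

    def update(self, gyro_rate_rad_s, dt_s, visual_heading_rad=None):
        # Integrate the gyroscope for a short-term heading prediction.
        predicted = self.heading_rad + gyro_rate_rad_s * dt_s
        if visual_heading_rad is None:
            self.heading_rad = predicted
        else:
            # Blend: mostly inertial, nudged toward the visual estimate.
            self.heading_rad = (self.alpha * predicted
                                + (1.0 - self.alpha) * visual_heading_rad)
        return self.heading_rad
```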
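Feature detection and comparison against stored imagery can be realized with an off-the-shelf keypoint pipeline; the claim names curves, edges, and metadata but no specific algorithm. The sketch below uses OpenCV's ORB detector with brute-force Hamming matching (an assumption, not the patent's method) to score a live frame against each node's stored images from the NavigationMap sketched above; load_image is a hypothetical caller-supplied loader.

```python
import cv2

def match_score(live_frame, stored_frame):
    """Similarity score between a live frame and a stored image.

    ORB keypoints with cross-checked Hamming matching; the distance
    threshold of 50 is an illustrative choice.
    """
    orb = cv2.ORB_create(nfeatures=500)
    _, des1 = orb.detectAndCompute(live_frame, None)
    _, des2 = orb.detectAndCompute(stored_frame, None)
    if des1 is None or des2 is None:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    # Keep only reasonably close descriptor matches.
    good = [m for m in matches if m.distance < 50]
    return float(len(good))

def localize(live_frame, nav_map, load_image):
    """Pick the map node whose stored imagery best matches the live frame."""
    best_id, best_score = None, 0.0
    for node_id, node in nav_map.nodes.items():
        for path in node.images:
            score = match_score(live_frame, load_image(path))
            if score > best_score:
                best_id, best_score = node_id, score
    return best_id
```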
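Automatically selecting a traversable path between the node at the user's location and the destination node reduces to shortest-path search on the undirected graph. A standard Dijkstra over the NavigationMap adjacency, assuming edge weights are walkable distances in meters:

```python
import heapq

def shortest_path(nav_map, origin_id, dest_id):
    """Dijkstra over NavigationMap.adjacency; returns a node-id list."""
    dist = {origin_id: 0.0}
    prev = {}
    heap = [(0.0, origin_id)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == dest_id:
            break
        for v, length in nav_map.adjacency.get(u, {}).items():
            nd = d + length
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    if dest_id not in dist:
        return None  # no traversable path exists
    # Walk the predecessor chain back to the origin.
    path = [dest_id]
    while path[-1] != origin_id:
        path.append(prev[path[-1]])
    return path[::-1]
```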
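The sequence of directional commands can be regenerated each time localization places the user at a new node, which also covers the claim's dynamic updating as the user progresses. In the sketch below, the bearing math uses a flat-earth approximation and the 30-degree turn threshold is an arbitrary illustrative choice; update_guidance ties together the localize and shortest_path sketches above.

```python
import math

def bearing_deg(a, b):
    """Approximate compass bearing from location a to b (flat-earth assumption)."""
    dlat = b[0] - a[0]
    dlon = b[1] - a[1]
    return math.degrees(math.atan2(dlon, dlat)) % 360.0

def commands_for(nav_map, path):
    """Turn-by-turn commands for a node-id path."""
    cmds = []
    for i in range(1, len(path) - 1):
        prev_loc = nav_map.nodes[path[i - 1]].location
        here = nav_map.nodes[path[i]].location
        next_loc = nav_map.nodes[path[i + 1]].location
        # Signed turn angle in (-180, 180]: positive is a right turn.
        turn = (bearing_deg(here, next_loc)
                - bearing_deg(prev_loc, here) + 540.0) % 360.0 - 180.0
        if turn > 30.0:
            cmds.append(("right", path[i]))
        elif turn < -30.0:
            cmds.append(("left", path[i]))
        else:
            cmds.append(("continue", path[i]))
    cmds.append(("arrive", path[-1]))
    return cmds

def update_guidance(nav_map, live_frame, dest_id, load_image):
    """Relocalize, replan, and regenerate commands as the user progresses."""
    node_id = localize(live_frame, nav_map, load_image)
    path = shortest_path(nav_map, node_id, dest_id)
    return commands_for(nav_map, path) if path else []
```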