US 11,898,848 B2
Visual navigation for mobile devices operable in differing environmental lighting conditions
Michael Dooley, Pasadena, CA (US); Nikolai Romanov, Oak Park, CA (US); and James Philip Case, San Francisco, CA (US)
Assigned to Labrador Systems, Inc., Oak Park, CA (US)
Filed by Labrador Systems, Inc., Oak Park, CA (US)
Filed on Mar. 24, 2022, as Appl. No. 17/703,096.
Application 17/703,096 is a continuation of application No. 17/257,472, granted, now 11,287,262, previously published as PCT/US2019/041846, filed on Jul. 15, 2019.
Claims priority of provisional application 62/697,520, filed on Jul. 13, 2018.
Prior Publication US 2022/0214172 A1, Jul. 7, 2022
Int. Cl. G01C 21/20 (2006.01); H04N 13/254 (2018.01); H04N 13/239 (2018.01); H04N 23/72 (2023.01); H04N 23/90 (2023.01)
CPC G01C 21/206 (2013.01) [H04N 13/239 (2018.05); H04N 13/254 (2018.05); H04N 23/72 (2023.01); H04N 23/90 (2023.01)] 15 Claims
OG exemplary drawing
 
1. A method for rapidly training a mobile robot configured for autonomous operation to navigate an unstructured residential environment, the method comprising:
identifying a set of desired robot destinations within the unstructured residential environment, wherein the set of desired robot destinations comprises at least four desired robot destinations;
guiding the mobile robot along paths between at least a minimum number of different pairs of desired robot destinations of the set of desired robot destinations to enable establishment of full connectivity between each different pair of desired robot destinations;
receiving operator input signals from an operator interface associated with the mobile robot, wherein the operator input signals are configured to start and stop recording of defined routes during the guiding of the mobile robot along the paths, wherein the defined routes identify routes to be used by the mobile robot during subsequent positioning and navigation operations;
recording images of surroundings experienced by the mobile robot during the guiding of the mobile robot along the paths; and
creating a visual SLAM map of at least a portion of the unstructured residential environment from the recorded images.
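The guiding step of claim 1 rests on a graph observation worth making explicit: the recorded routes are edges over the destination set, and "full connectivity between each different pair" only requires that the route graph be connected, so as few as N-1 recorded routes (a spanning tree) suffice for N destinations rather than all N(N-1)/2 direct pairs. The Python sketch below models the operator's start/stop route recording and that connectivity check; every name in it (RouteRecorder, start_recording, and so on) is hypothetical, invented for illustration rather than drawn from the patent or any robot SDK.

```python
class RouteRecorder:
    """Toy model of the claimed training phase: an operator starts and
    stops recording of defined routes while guiding the robot, and the
    recorder reports when the destination set is fully connected."""

    def __init__(self, destinations):
        # Claim 1 recites a set of at least four desired destinations.
        if len(destinations) < 4:
            raise ValueError("need at least four desired robot destinations")
        self.destinations = set(destinations)
        self.routes = {}      # (start, end) -> frames recorded on that route
        self._active = None   # (start, frames) while a route is being recorded

    def start_recording(self, start):
        """Operator input signal: begin recording a defined route."""
        self._active = (start, [])

    def observe(self, frame):
        """Record an image of the surroundings while the robot is guided."""
        if self._active is not None:
            self._active[1].append(frame)

    def stop_recording(self, end):
        """Operator input signal: end the defined route at a destination."""
        start, frames = self._active
        self.routes[(start, end)] = frames
        self._active = None

    def fully_connected(self):
        """True once every pair of destinations is reachable over the
        recorded routes (possibly indirectly, via other destinations)."""
        parent = {d: d for d in self.destinations}

        def find(d):                      # union-find with path halving
            while parent[d] != d:
                parent[d] = parent[parent[d]]
                d = parent[d]
            return d

        for a, b in self.routes:          # merge the endpoints of each route
            parent[find(a)] = find(b)
        return len({find(d) for d in self.destinations}) == 1


# Four destinations become fully connected after only three recorded
# routes (a spanning tree), not all six destination pairs.
rec = RouteRecorder({"kitchen", "bedroom", "bathroom", "front door"})
for a, b in [("kitchen", "bedroom"),
             ("bedroom", "bathroom"),
             ("bathroom", "front door")]:
    rec.start_recording(a)
    rec.observe(f"frames captured while guided from {a} to {b}")
    rec.stop_recording(b)
print(rec.fully_connected())  # True
```

The union-find bookkeeping keeps the check near linear in the number of recorded routes, and the same structure can tell an operator which destination remains unreached and therefore which pair of destinations to guide the robot between next.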
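The final step, creating a visual SLAM map from the recorded images, is not elaborated in this claim. As a rough orientation only, the sketch below shows one common ingredient of such maps: selecting keyframes from the recorded image stream and storing their visual features. It uses OpenCV's ORB detector and brute-force matcher; the function name and the match-count keyframe threshold are assumptions made for illustration, and a real visual SLAM pipeline would additionally estimate camera poses, triangulate landmarks, and optimize the map.

```python
import cv2


def build_keyframe_map(frames, min_matches=40):
    """Minimal keyframe map sketch: keep a frame whenever it no longer
    matches the previous keyframe well, and store its ORB keypoints and
    descriptors. This is only the feature front end of a SLAM system."""
    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    keyframes = []                         # list of (keypoints, descriptors)
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        kps, desc = orb.detectAndCompute(gray, None)
        if desc is None:                   # frame too dark or featureless
            continue
        if not keyframes:
            keyframes.append((kps, desc))
            continue
        matches = matcher.match(keyframes[-1][1], desc)
        if len(matches) < min_matches:     # view changed enough: new keyframe
            keyframes.append((kps, desc))
    return keyframes
```

The fixed match-count threshold here is a stand-in for the richer keyframe-selection criteria used in production visual SLAM systems; the `desc is None` guard also hints at why the patent's title emphasizes operation under differing environmental lighting conditions, since dark or washed-out frames yield few usable features.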