US 12,433,463 B2
Mapping an environment around an autonomous vacuum
Navneet Dalal, Atherton, CA (US); Seungho Yang, Mountain View, CA (US); Gavin Li, Menlo Park, CA (US); and Mehul Nariyawala, Los Altos, CA (US)
Assigned to MATIC ROBOTS, INC., Mountain View, CA (US)
Filed by MATIC ROBOTS, INC., Mountain View, CA (US)
Filed on Feb. 9, 2021, as Appl. No. 17/172,022.
Claims priority of provisional application 63/121,842, filed on Dec. 4, 2020.
Claims priority of provisional application 62/972,563, filed on Feb. 10, 2020.
Prior Publication US 2021/0244254 A1, Aug. 12, 2021
Int. Cl. A47L 7/00 (2006.01); A46B 9/00 (2006.01); A46B 13/00 (2006.01); A46D 1/00 (2006.01); A47L 5/30 (2006.01); A47L 5/34 (2006.01); A47L 9/04 (2006.01); A47L 9/14 (2006.01); A47L 9/28 (2006.01); A47L 11/20 (2006.01); A47L 11/40 (2006.01); B01D 46/00 (2022.01); B01D 46/02 (2006.01); G01C 21/00 (2006.01); G05D 1/00 (2024.01); G05D 1/225 (2024.01); G05D 1/646 (2024.01)
CPC A47L 9/2847 (2013.01) [A46B 9/005 (2013.01); A46B 13/006 (2013.01); A46D 1/0207 (2013.01); A47L 5/30 (2013.01); A47L 5/34 (2013.01); A47L 7/0004 (2013.01); A47L 7/0009 (2013.01); A47L 7/0023 (2013.01); A47L 9/0477 (2013.01); A47L 9/1409 (2013.01); A47L 9/1427 (2013.01); A47L 9/281 (2013.01); A47L 9/2826 (2013.01); A47L 11/201 (2013.01); A47L 11/4052 (2013.01); A47L 11/4061 (2013.01); B01D 46/0036 (2013.01); B01D 46/02 (2013.01); G01C 21/383 (2020.08); G05D 1/0044 (2013.01); G05D 1/0212 (2013.01); G05D 1/225 (2024.01); G05D 1/646 (2024.01); A46B 2200/3033 (2013.01); A47L 9/2852 (2013.01); A47L 9/2857 (2013.01); A47L 2201/00 (2013.01); A47L 2201/04 (2013.01); A47L 2201/06 (2013.01); B01D 2279/55 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A non-transitory computer-readable storage medium storing instructions that, when executed, cause a computer processor to:
receive, via a sensor system, visual data about a home environment;
determine a ground plane of the home environment in the visual data;
detect objects within the home environment based on the determined ground plane;
for each object:
segment a three-dimensional representation of the object out of the visual data,
determine an object type of the object based on visual data of the object,
wherein the object type is a classification of different objects found in the home environment,
apply a descriptor tag to the object indicating the object type of the object,
determine whether the object is static or dynamic,
responsive to determining the object is static, map the three-dimensional representation to a long-term level of a map of the home environment, and
responsive to determining the object is dynamic, map the three-dimensional representation to an intermediate level of the map of the home environment;
wherein a first set of objects having a first descriptor tag indicating a first object type is mapped to the long-term level and a second set of objects having the first descriptor tag indicating the first object type is mapped to the intermediate level;
localize a current position of an autonomous vacuum in the home environment based on objects mapped to the long-term level of the map of the home environment while excluding objects mapped to the intermediate level of the map;
determine navigation instructions to navigate the autonomous vacuum along a path from the current position of the autonomous vacuum to a target position where a mess is located, the path being based on the current position of the autonomous vacuum and objects detected within the home environment in the long-term level of the map and the intermediate level of the map; and
instruct an actuator assembly to move the autonomous vacuum to vacuum dirt in the home environment using the map of the home environment and the navigation instructions.
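The claim recites functional steps without fixing any particular algorithm. The Python sketches below show one plausible reading of each step; every function, parameter, and threshold in them is an illustrative assumption, not language from the patent. First, determining a ground plane: a minimal RANSAC plane fit, assuming the sensor system's visual data has been converted to an N x 3 point cloud in metres.

```python
# Hypothetical sketch: ground-plane estimation by RANSAC plane fitting.
# Assumes `points` is an (N, 3) point cloud derived from the sensor system.
import numpy as np

def fit_ground_plane(points: np.ndarray, iters: int = 200, tol: float = 0.02):
    """Return (normal, d) of the plane normal.x + d = 0 with the most inliers."""
    best_inliers, best_plane = 0, None
    rng = np.random.default_rng(0)
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:  # skip degenerate (collinear) samples
            continue
        normal /= norm
        d = -normal @ sample[0]
        inliers = int(np.sum(np.abs(points @ normal + d) < tol))
        if inliers > best_inliers:
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane
```

With the plane in hand, points lying more than a few centimetres above it become candidate object points, which is one way object detection could be "based on the determined ground plane" as claimed.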
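Next, the static-or-dynamic determination. The patent does not say how this is decided; one simple reading is to track each object's centroid across mapping passes and call the object dynamic if it moves beyond a tolerance. The tracking scheme and threshold below are assumptions for the sketch.

```python
# Hypothetical sketch: classify an object as static if its centroid has not
# drifted more than `move_tol` metres across the observations seen so far.
import numpy as np

def is_static(centroids: list[np.ndarray], move_tol: float = 0.10) -> bool:
    pts = np.stack(centroids)  # shape (num_observations, 3)
    return float(np.linalg.norm(pts.max(axis=0) - pts.min(axis=0))) < move_tol
```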
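The two-level map itself can be as simple as two object lists, with the static/dynamic flag deciding the level. Note how two objects sharing the same descriptor tag can land on different levels, matching the claim's first and second sets of objects of the first object type. Class and field names are invented for the sketch.

```python
# Hypothetical sketch: a map with a long-term level (static objects) and an
# intermediate level (dynamic objects), keyed off the static/dynamic flag.
from dataclasses import dataclass, field

@dataclass
class MappedObject:
    descriptor_tag: str        # the object type, e.g. "chair"
    representation: object     # the segmented 3-D representation
    is_static: bool

@dataclass
class EnvironmentMap:
    long_term: list = field(default_factory=list)
    intermediate: list = field(default_factory=list)

    def add(self, obj: MappedObject) -> None:
        (self.long_term if obj.is_static else self.intermediate).append(obj)

m = EnvironmentMap()
m.add(MappedObject("chair", None, is_static=True))   # dining chair: long-term
m.add(MappedObject("chair", None, is_static=False))  # desk chair: intermediate
```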
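Localization in the claim uses long-term-level objects only. Assuming observed landmarks have already been matched to long-term map objects (the matching step is omitted here), the robot pose follows from a closed-form rigid alignment (Kabsch/Umeyama); intermediate-level objects simply never enter the correspondence set.

```python
# Hypothetical sketch: recover the robot pose from matched 2-D landmark
# positions, map frame vs. robot frame, using only long-term-level objects.
import numpy as np

def localize(map_pts: np.ndarray, robot_pts: np.ndarray):
    """Return (R, t) such that map_pts ~= robot_pts @ R.T + t."""
    mu_m, mu_r = map_pts.mean(axis=0), robot_pts.mean(axis=0)
    H = (robot_pts - mu_r).T @ (map_pts - mu_m)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_m - R @ mu_r       # t is the robot's position in the map frame
    return R, t
```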
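Finally, path planning draws obstacles from both map levels, per the claim. Rasterizing all mapped objects into an occupancy grid and running A* from the localized cell to the mess cell is one conventional approach; the grid, 4-connectivity, and unit step cost are simplifying assumptions.

```python
# Hypothetical sketch: A* over an occupancy grid built from objects on BOTH
# the long-term and intermediate levels. grid[y][x] truthy means blocked.
import heapq

def astar(grid, start, goal):
    """Return a list of (x, y) cells from start to goal, or None if blocked."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    frontier = [(h(start), 0, start, None)]
    came, seen = {}, set()
    while frontier:
        _, g, cur, parent = heapq.heappop(frontier)
        if cur in seen:
            continue
        seen.add(cur)
        came[cur] = parent
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) \
                    and not grid[ny][nx] and nxt not in seen:
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None
```

The resulting cell path would then be handed to the actuator assembly as the navigation instructions, closing the loop with the final claimed step.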