US 11,899,463 B1
Obstacle recognition method for autonomous robots
Ali Ebrahimi Afrouzi, Henderson, NV (US); Soroush Mehrnia, Helsingborg (SE); and Lukas Robinson, York (CA)
Assigned to AI Incorporated, Toronto (CA)
Filed by Ali Ebrahimi Afrouzi, Henderson, NV (US); Soroush Mehrnia, Helsingborg (SE); and Lukas Robinson, York (CA)
Filed on Aug. 5, 2022, as Appl. No. 17/882,498.
Application 17/882,498 is a continuation of application No. 17/679,215, filed on Feb. 24, 2022, granted, now 11,449,063.
Application 17/679,215 is a continuation of application No. 16/995,480, filed on Aug. 17, 2020, granted, now 11,467,587.
Application 16/995,480 is a continuation of application No. 16/832,180, filed on Mar. 27, 2020, granted, now 10,788,836, issued on Sep. 29, 2020.
Application 16/832,180 is a continuation-in-part of application No. 16/570,242, filed on Sep. 13, 2019, granted, now 10,969,791, issued on Apr. 6, 2021.
Application 16/570,242 is a continuation of application No. 15/442,992, filed on Feb. 27, 2017, granted, now 10,452,071, issued on Oct. 22, 2019.
Claims priority of provisional application 62/986,946, filed on Mar. 9, 2020.
Claims priority of provisional application 62/952,384, filed on Dec. 22, 2019.
Claims priority of provisional application 62/952,376, filed on Dec. 22, 2019.
Claims priority of provisional application 62/942,237, filed on Dec. 2, 2019.
Claims priority of provisional application 62/933,882, filed on Nov. 11, 2019.
Claims priority of provisional application 62/914,190, filed on Oct. 11, 2019.
Claims priority of provisional application 62/301,449, filed on Feb. 29, 2016.
Int. Cl. G05D 1/02 (2020.01); B25J 9/16 (2006.01); A47L 11/40 (2006.01); G06V 10/141 (2022.01); H04W 12/50 (2021.01); G06F 3/16 (2006.01); G06V 10/70 (2022.01); G06V 20/58 (2022.01); G06N 5/04 (2023.01); G06V 20/64 (2022.01)
CPC G05D 1/0214 (2013.01) [A47L 11/4011 (2013.01); B25J 9/1676 (2013.01); B25J 9/1697 (2013.01); G05D 1/0225 (2013.01); G05D 1/0238 (2013.01); G05D 1/0246 (2013.01); G06F 3/167 (2013.01); G06N 5/04 (2013.01); G06V 10/141 (2022.01); G06V 10/70 (2022.01); G06V 20/58 (2022.01); G06V 20/64 (2022.01); H04W 12/50 (2021.01); A47L 2201/024 (2013.01); A47L 2201/04 (2013.01); A47L 2201/06 (2013.01); G05D 2201/0203 (2013.01); G05D 2201/0215 (2013.01)] 119 Claims
OG exemplary drawing
 
1. A robot, comprising:
a plurality of sensors;
a processor;
an image sensor;
at least one cleaning tool for performing one of vacuuming and mopping; and
a tangible, non-transitory, machine readable medium storing instructions that when executed by the processor effectuate operations comprising:
capturing, with the image sensor, images of a workspace as the robot moves within the workspace;
identifying, with the processor, at least one characteristic of an object captured in the images of the workspace;
determining, with the processor, an object type of the object based on an object dictionary of different object types, wherein the different object types comprise at least a cord, clothing garments, a shoe, earphones, and pet bodily waste; and
instructing, with the processor, the robot to execute at least one action based on the object type of the object, wherein the at least one action comprises avoiding the object or cleaning around the object.
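Claim 1 recites a pipeline of image capture, characteristic identification, object-type determination against an object dictionary, and action selection. The following is a minimal illustrative sketch of that control flow, not the patented implementation: the labels, the `classify` stub, the confidence threshold, and the dictionary-to-action mapping are all hypothetical choices made only to show how a recognized type could be mapped to "avoid" or "clean around".

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    AVOID = auto()          # steer around the object entirely
    CLEAN_AROUND = auto()   # keep cleaning but maintain a margin


# Hypothetical "object dictionary": recognized object type -> action.
OBJECT_DICTIONARY = {
    "cord": Action.AVOID,
    "clothing_garment": Action.CLEAN_AROUND,
    "shoe": Action.CLEAN_AROUND,
    "earphones": Action.AVOID,
    "pet_bodily_waste": Action.AVOID,
}


@dataclass
class Detection:
    object_type: str   # label assigned from the object's identified characteristics
    confidence: float  # classifier confidence in [0, 1]


def classify(image) -> list[Detection]:
    """Stand-in for the processor's characteristic identification and
    object-type determination; a real robot would run a trained model here."""
    raise NotImplementedError


def select_action(detection: Detection, threshold: float = 0.6) -> Action | None:
    """Look the recognized type up in the object dictionary and return the
    corresponding action, or None when the type is unknown or low-confidence."""
    if detection.confidence < threshold:
        return None
    return OBJECT_DICTIONARY.get(detection.object_type)
```

In this reading, a navigation or cleaning planner would call `select_action` for each detection in each captured image and either route around the object or continue cleaning while keeping clear of it; how the margin and avoidance path are computed is outside the scope of this sketch.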