US 12,226,067 B2
Robot cleaner and control method thereof
Ahhyun Kim, Suwon-si (KR); Nooree Na, Suwon-si (KR); Miyoung Lee, Suwon-si (KR); and Hyeokjin Choi, Suwon-si (KR)
Assigned to SAMSUNG ELECTRONICS CO., LTD., Suwon-si (KR)
Filed by SAMSUNG ELECTRONICS CO., LTD., Suwon-si (KR)
Filed on May 31, 2022, as Appl. No. 17/828,697.
Application 17/828,697 is a continuation of application No. PCT/KR2022/002940, filed on Mar. 2, 2022.
Claims priority of application No. 10-2021-0029247 (KR), filed on Mar. 5, 2021; and application No. 10-2021-0093064 (KR), filed on Jul. 15, 2021.
Prior Publication US 2022/0291693 A1, Sep. 15, 2022
Int. Cl. A47L 9/00 (2006.01); A47L 11/40 (2006.01); G01C 21/00 (2006.01)
CPC A47L 9/009 (2013.01) [A47L 11/4011 (2013.01); G01C 21/383 (2020.08)] 12 Claims
OG exemplary drawing
 
1. A robot cleaner comprising:
a driving device;
a sensor;
a camera;
a memory; and
a processor configured to:
control the driving device to drive the robot cleaner;
generate a map of a space where the robot cleaner is located based on information obtained through the sensor while driving the robot cleaner in the space;
store the generated map in the memory;
generate a normal line for a Voronoi graph generated based on the map;
classify the map into a plurality of regions in consideration of a length of the normal line and an area of a closed space divided in the map based on the normal line;
based on the area of the closed space divided in the map being larger than a predetermined size based on a normal line within a predetermined range, identify a corresponding closed space as one area, and identify a gate connecting the identified area and another area;
divide the space on the generated map into a plurality of rooms;
detect an obstacle around the robot cleaner based on the map, a position and a rotation angle of the robot cleaner on the map, and a distance from a surrounding object to the robot cleaner obtained through the sensor;
obtain an image through the camera;
recognize an object and a position of the recognized object in the space on the generated map based on the information obtained through the sensor;
store information about the recognized object and the position of the recognized object on the map; and
generate a room name based on at least one of a type of the recognized object located at each of the plurality of rooms,
wherein the processor is further configured to:
based on a division command to divide one room of the plurality of rooms being input on the map, divide the one room on the map to correspond to the input division command, and generate a room name for each of the divided rooms generated through the division;
based on a merge command to merge two or more rooms of the plurality of rooms to one merged room being input on the map, merge the two or more rooms of the plurality of rooms on the map to correspond to the input merge command, and generate a room name for the one merged room generated through the merging;
based on an area of each room generated through the division command, designate the name of the room before the division to a room having a widest area, and, for each remaining room, generate a room name based on information about an object located at that room;
based on the merge command to merge two or more rooms of the plurality of rooms to one merged room being input on the map, identify the room name of a room having a widest area, and generate the identified room name of the room having a widest area as the room name for the one merged room.
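The claim's Voronoi-based segmentation steps (short normal lines marking gates, closed spaces above a size threshold becoming rooms) can be sketched on an occupancy grid. Everything below is an illustrative assumption, not the patent's implementation: the 0/1 grid representation, the one-cell-wide-passage test standing in for a short normal line across the free space, and the `min_area` threshold are all hypothetical choices.

```python
from collections import deque

def segment_rooms(grid, min_area=4):
    """Split a 0/1 occupancy grid (1 = wall) into rooms and gates.

    Illustrative analogue of the claim: a free cell pinched between
    two walls (a one-cell-wide passage, i.e. a short "normal line"
    across the free space) is treated as a gate; the remaining free
    space is flood-filled, and each closed region whose area meets
    `min_area` is kept as a room.
    """
    h, w = len(grid), len(grid[0])

    def is_gate(y, x):
        # Walls directly above and below, or directly left and right.
        vert = grid[y - 1][x] == 1 and grid[y + 1][x] == 1
        horiz = grid[y][x - 1] == 1 and grid[y][x + 1] == 1
        return vert or horiz

    gates = {(y, x) for y in range(1, h - 1) for x in range(1, w - 1)
             if grid[y][x] == 0 and is_gate(y, x)}

    # Flood-fill the free space with gates removed; each connected
    # component that is large enough becomes one room.
    seen, rooms = set(gates), []
    for y in range(h):
        for x in range(w):
            if grid[y][x] == 0 and (y, x) not in seen:
                comp, queue = [], deque([(y, x)])
                seen.add((y, x))
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and grid[ny][nx] == 0 and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                if len(comp) >= min_area:
                    rooms.append(sorted(comp))
    return rooms, gates
```

On a map of two rectangular rooms joined by a one-cell doorway, the doorway is reported as the gate and the two enclosed regions come back as separate rooms.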
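The rule of generating a room name from the type of recognized object located in the room could look like the sketch below. The object-to-name mapping and the "most frequently detected object wins" tie-break are hypothetical; the patent does not specify them.

```python
from collections import Counter

# Hypothetical mapping from a detected object type to a room name;
# these labels are illustrative, not taken from the patent.
OBJECT_TO_ROOM = {
    "bed": "Bedroom",
    "refrigerator": "Kitchen",
    "sofa": "Living room",
    "toilet": "Bathroom",
}

def name_room(detected_objects, fallback="Room"):
    """Pick a room name from the types of objects recognized in the
    room, preferring the most frequently detected known object."""
    counts = Counter(o for o in detected_objects if o in OBJECT_TO_ROOM)
    if not counts:
        return fallback
    most_common, _ = counts.most_common(1)[0]
    return OBJECT_TO_ROOM[most_common]
```

A room where a bed is recognized twice and a lamp once would be named "Bedroom"; a room with no mapped objects falls back to the generic label.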
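The last two limitations state concrete naming rules: after a division, the widest resulting room keeps the pre-division name and the rest are named from their objects; after a merge, the merged room takes the name of the widest constituent. A minimal sketch, assuming area is a scalar and object-based naming is supplied as a callable (e.g. a function like `name_room` above):

```python
def names_after_division(original_name, divided_rooms, namer):
    """Apply the claim's division rule: the widest divided room keeps
    the pre-division name; each remaining room is named from the
    objects located in it.

    `divided_rooms` maps a room id to (area, detected_objects);
    `namer` turns an object list into a room name.
    """
    widest = max(divided_rooms, key=lambda rid: divided_rooms[rid][0])
    return {rid: original_name if rid == widest
            else namer(divided_rooms[rid][1])
            for rid in divided_rooms}

def name_after_merge(merged_rooms):
    """Apply the claim's merge rule: the merged room takes the name
    of the constituent room with the widest area.

    `merged_rooms` maps a room name to its area.
    """
    return max(merged_rooms, key=merged_rooms.get)
```

The room ids, the `(area, objects)` tuple layout, and the `namer` callable are illustrative assumptions about how the map data might be held in memory.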