US 12,306,626 B2
Mobile robot cleaning system
Kenrick E. Drew, Northborough, MA (US); Philip Wasserman, Somerville, MA (US); and Christopher V. Jones, Woburn, MA (US)
Assigned to iRobot Corporation, Bedford, MA (US)
Filed by iRobot Corporation, Bedford, MA (US)
Filed on May 2, 2023, as Appl. No. 18/310,933.
Application 18/310,933 is a continuation of application No. 16/508,705, filed on Jul. 11, 2019, granted, now Pat. No. 11,669,086.
Claims priority of provisional application 62/698,004, filed on Jul. 13, 2018.
Prior Publication US 2023/0280743 A1, Sep. 7, 2023
Int. Cl. G05D 1/00 (2024.01); A01D 34/00 (2006.01); A01D 101/00 (2006.01); A47L 9/00 (2006.01); A47L 9/28 (2006.01); A47L 11/24 (2006.01); A47L 11/28 (2006.01); G06F 3/0484 (2022.01); G06T 7/00 (2017.01); G06T 7/73 (2017.01)
CPC G05D 1/0016 (2013.01) [A47L 9/009 (2013.01); A47L 9/2857 (2013.01); A47L 11/24 (2013.01); A47L 11/28 (2013.01); G05D 1/0038 (2013.01); G05D 1/0044 (2013.01); G05D 1/0231 (2013.01); G06F 3/0484 (2013.01); G06T 7/74 (2017.01); G06T 7/97 (2017.01); A01D 34/008 (2013.01); A01D 2101/00 (2013.01); A47L 2201/04 (2013.01); G05D 1/0219 (2013.01); G06T 2200/24 (2013.01)] 15 Claims
OG exemplary drawing
 
1. A method comprising:
presenting, on a display of a mobile computing device, an image representing signals captured by a camera of the mobile computing device;
receiving, at the mobile computing device, an input from a user of the mobile computing device, wherein the input identifies a point within the presented image corresponding to a point in a real-world environment;
determining first coordinates for the point in the real-world environment, wherein the first coordinates are based on a first coordinate system;
converting the first coordinates for the point in the real-world environment to second coordinates for the point in the real-world environment based on a mapping between the first coordinate system and a second coordinate system, wherein the second coordinates are based on the second coordinate system, and wherein the mapping is established by matching features identified in one or more images captured by the camera of the mobile computing device with corresponding features derived from one or more images captured by a camera of a mobile cleaning robot; and
transmitting a command including the second coordinates to the mobile cleaning robot to perform a cleaning operation at the point in the real-world environment.
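The mapping recited in the claim is established by matching features between an image from the phone's camera and an image from the robot's camera. The sketch below is one illustrative way to do this with OpenCV's ORB detector and RANSAC; it is not the patented implementation, and it simplifies by treating both coordinate systems as 2D floor-plane frames (a deployed system would back-project matches into each device's 3D frame). The image file names are placeholders.

```python
import cv2
import numpy as np

def estimate_mapping(phone_img, robot_img):
    """Estimate a 2D similarity transform (the 'mapping' of the claim)
    from ORB feature matches between a phone frame and a robot frame.
    Returns a 2x3 matrix M such that robot_xy ~ M @ [phone_x, phone_y, 1]."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_phone, des_phone = orb.detectAndCompute(phone_img, None)
    kp_robot, des_robot = orb.detectAndCompute(robot_img, None)
    if des_phone is None or des_robot is None:
        raise ValueError("too few features to establish a mapping")

    # Brute-force Hamming matching with cross-checking, the usual
    # pairing for binary ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_phone, des_robot),
                     key=lambda m: m.distance)

    src = np.float32([kp_phone[m.queryIdx].pt for m in matches])
    dst = np.float32([kp_robot[m.trainIdx].pt for m in matches])

    # RANSAC rejects outlier matches; M encodes rotation, uniform
    # scale, and translation between the two frames.
    M, _inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if M is None:
        raise ValueError("mapping estimation failed")
    return M

# Placeholder inputs: one frame from each camera.
# M = estimate_mapping(cv2.imread("phone_frame.png"),
#                      cv2.imread("robot_frame.png"))
```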
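With the mapping in hand, the remaining claimed steps are converting the user-selected point into the robot's coordinate system and transmitting a command carrying those coordinates. The sketch below assumes the 2x3 matrix from the previous example; the `clean_spot` action name, JSON schema, and `robot.local` endpoint are all hypothetical stand-ins for whatever protocol the robot actually speaks.

```python
import json
import urllib.request

import numpy as np

def convert_point(M, phone_xy):
    """Convert first-coordinate-system coordinates to
    second-coordinate-system coordinates using the 2x3 mapping M."""
    x, y = phone_xy
    qx, qy = M @ np.array([x, y, 1.0])
    return float(qx), float(qy)

def send_clean_command(endpoint, xy):
    """Transmit a command including the second coordinates to the
    robot. Endpoint URL and JSON schema are illustrative only."""
    body = json.dumps({"action": "clean_spot",
                       "x": xy[0], "y": xy[1]}).encode("utf-8")
    req = urllib.request.Request(
        endpoint, data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example: convert a tapped point and dispatch the cleaning command.
# robot_xy = convert_point(M, tapped_point)
# send_clean_command("http://robot.local/api/command", robot_xy)
```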