US 11,878,421 B2
Robot navigation and robot-IoT interactive task planning using augmented reality
Yuanzhi Cao, West Lafayette, IN (US); Karthik Ramani, West Lafayette, IN (US); and Zhuangying Xu, Redmond, WA (US)
Assigned to PURDUE RESEARCH FOUNDATION, West Lafayette, IN (US)
Appl. No. 17/050,309
Filed by Purdue Research Foundation, West Lafayette, IN (US)
PCT Filed Apr. 23, 2019, PCT No. PCT/US2019/028797
§ 371(c)(1), (2) Date Oct. 23, 2020.
PCT Pub. No. WO2019/209878, PCT Pub. Date Oct. 31, 2019.
Claims priority of provisional application 62/661,082, filed on Apr. 23, 2018.
Prior Publication US 2021/0078172 A1, Mar. 18, 2021
Int. Cl. B25J 9/16 (2006.01); G05D 1/00 (2006.01); B25J 13/08 (2006.01); B25J 19/02 (2006.01); G05D 1/02 (2020.01); G06F 3/04847 (2022.01); G06F 3/0486 (2013.01)
CPC B25J 9/1661 (2013.01) [B25J 9/1664 (2013.01); B25J 9/1689 (2013.01); B25J 13/089 (2013.01); B25J 19/023 (2013.01); G05D 1/0038 (2013.01); G05D 1/0044 (2013.01); G05D 1/0251 (2013.01); G05D 1/0274 (2013.01); G06F 3/0486 (2013.01); G06F 3/04847 (2013.01)] 30 Claims
OG exemplary drawing
 
1. A method for authoring tasks for execution by a programmable mobile robot in a physical environment, the method comprising:
at an application running on a mobile device with augmented reality simultaneous localization and mapping (AR-SLAM) capabilities, the mobile device comprising a processor, a memory, a network interface for communications via one or more wired or wireless computer networks, a camera for recording images captured within a field-of-view (FOV) of a display of the mobile device, and one or more sensors for estimating motion of the camera in the physical environment:
generating a dynamic simultaneous localization and mapping (SLAM) map comprising spatial information for an augmented reality (AR) scene that comprises a digital representation of at least part of the physical environment in a 3-dimensional coordinate system;
displaying the AR scene in the display of the mobile device;
receiving input defining a pathway through which the programmable mobile robot is to navigate in the physical environment, wherein the pathway is defined based on recording spatial movements of the mobile device in the physical environment using one or more of the sensors on the mobile device;
generating a task sequence comprising routing instructions to be executed by the programmable mobile robot for navigation in the physical environment based on the input defining the pathway through which the robot is to navigate in the physical environment;
transferring the task sequence from the mobile device to the programmable mobile robot, wherein the programmable mobile robot is adapted to execute the routing instructions to generate control signals for controlling robot navigation in the physical environment, wherein the physical environment includes a distribution of one or more Internet of Things (IoT) devices, and wherein the one or more IoT devices are spatially registered within the dynamic SLAM map and serve as landmarks for the programmable mobile robot to navigate in the AR scene; and
docking with the one or more IoT devices using functions of the mobile device camera, wherein functions of the one or more IoT devices are configured to be edited in-situ in the AR scene.
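The recited method records the mobile device's spatial movements, reduces them to a navigable pathway, and packages that pathway as a task sequence of routing instructions for transfer to the robot. The following is a minimal Python sketch of that pipeline, not the patented implementation: the pose representation, the `min_step` downsampling threshold, and all class and function names (`RoutingInstruction`, `TaskSequence`, `record_pathway`, `build_task_sequence`) are hypothetical illustrations, and the AR-SLAM pose stream is assumed to be supplied by the device.

```python
# Hypothetical sketch of the claimed pathway-authoring flow; names and the
# downsampling heuristic are illustrative assumptions, not the patent's method.
from dataclasses import dataclass, field
from typing import List, Tuple

# A device pose in the AR scene's 3-D coordinate system, as would be reported
# by an AR-SLAM framework on the mobile device (position only, for brevity).
Pose = Tuple[float, float, float]


@dataclass
class RoutingInstruction:
    """One step of the task sequence: navigate toward a recorded waypoint."""
    waypoint: Pose


@dataclass
class TaskSequence:
    """Ordered routing instructions to be transferred to the robot."""
    instructions: List[RoutingInstruction] = field(default_factory=list)


def _dist(a: Pose, b: Pose) -> float:
    """Euclidean distance between two poses."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5


def record_pathway(device_poses: List[Pose], min_step: float = 0.10) -> List[Pose]:
    """Downsample a stream of recorded device poses into waypoints spaced at
    least `min_step` (metres, assumed) apart, defining the robot's pathway."""
    waypoints: List[Pose] = []
    for pose in device_poses:
        if not waypoints or _dist(waypoints[-1], pose) >= min_step:
            waypoints.append(pose)
    return waypoints


def build_task_sequence(waypoints: List[Pose]) -> TaskSequence:
    """Generate the task sequence of routing instructions from the pathway."""
    return TaskSequence([RoutingInstruction(w) for w in waypoints])
```

For example, a recorded walk along the x-axis with poses 5 cm apart would be thinned to waypoints roughly 10 cm apart before being serialized and transferred to the robot over the device's network interface; spatially registered IoT devices in the SLAM map could then serve as landmarks for correcting drift along those waypoints.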