US 11,989,835 B2
Augmented reality overlay
Matthew Amacker, Santa Clara, CA (US); Arshan Poursohi, Berkeley, CA (US); and Allison Thackston, San Jose, CA (US)
Assigned to Toyota Research Institute, Inc., Los Altos, CA (US)
Filed by Toyota Research Institute, Inc., Los Altos, CA (US)
Filed on Jul. 18, 2018, as Appl. No. 16/038,248.
Claims priority of provisional application 62/563,366, filed on Sep. 26, 2017.
Prior Publication US 2019/0096134 A1, Mar. 28, 2019
Int. Cl. G06T 19/00 (2011.01); B25J 9/16 (2006.01); G05D 1/00 (2006.01); G06F 3/04815 (2022.01); G06V 20/10 (2022.01); G06V 20/20 (2022.01); G06V 20/64 (2022.01)
CPC G06T 19/006 (2013.01) [B25J 9/1697 (2013.01); G05D 1/0038 (2013.01); G06F 3/04815 (2013.01); G06V 20/10 (2022.01); G06V 20/20 (2022.01); G06V 20/64 (2022.01); G06V 2201/12 (2022.01)] 21 Claims
OG exemplary drawing
 
1. A computing device configured to display a virtual representation of an environment of a robot on a display device, the computing device comprising:
a memory; and
a processor coupled to the memory, wherein the processor is configured to:
receive, with respect to an object within the environment, data from a visual detection sensor and a spatial detection sensor of another robot that was previously in the environment of the robot;
generate point cloud data to denote coordinates of the object based on the data received from the other robot that was previously in the environment of the robot;
generate, for display on the display device, a virtual representation of the object by visually imposing the data received from the spatial detection sensor upon the data received from the visual detection sensor, wherein the virtual representation of the object for display comprises a plurality of point cloud indicators generated based on the point cloud data;
receive input data selecting the virtual representation of the object;
output instructions for the robot to interact with the object within the environment of the robot based upon the input data selecting the virtual representation of the object; and
display a navigation path of the robot generated based on mapping data of the environment received from the other robot.
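The exemplary claim above recites an overlay pipeline: point cloud coordinates derived from a spatial detection sensor are visually imposed on imagery from a visual detection sensor, a user selection of the resulting virtual representation is mapped back to an object, and an interaction instruction is output to the robot. The Python sketch below illustrates one way such a projection-and-selection loop could be structured. It is a minimal illustration under assumed conventions (a pinhole camera model and hypothetical names such as CAMERA_INTRINSICS, project_points, overlay_point_cloud, and on_select); it is not the patented implementation.

import numpy as np

# Assumed pinhole intrinsics for the visual detection sensor
# (fx, fy in pixels; cx, cy at the image center). Illustrative values only.
CAMERA_INTRINSICS = np.array([[525.0,   0.0, 320.0],
                              [  0.0, 525.0, 240.0],
                              [  0.0,   0.0,   1.0]])

def project_points(points_xyz, intrinsics):
    """Project 3-D point cloud coordinates (spatial detection sensor,
    camera frame, z > 0) onto the 2-D image plane of the visual sensor."""
    uv = (intrinsics @ points_xyz.T).T      # homogeneous image coordinates
    return uv[:, :2] / uv[:, 2:3]           # perspective division -> (u, v)

def overlay_point_cloud(image, points_xyz):
    """Visually impose point cloud indicators on the camera image,
    standing in for the claim's 'plurality of point cloud indicators'."""
    out = image.copy()
    for u, v in project_points(points_xyz, CAMERA_INTRINSICS).astype(int):
        if 0 <= v < out.shape[0] and 0 <= u < out.shape[1]:
            out[v, u] = (0, 255, 0)         # mark each projected point
    return out

def on_select(click_uv, points_xyz, object_ids):
    """Map a display selection back to the nearest projected object and
    emit a command dict, a hypothetical stand-in for 'output instructions
    for the robot to interact with the object'."""
    uv = project_points(points_xyz, CAMERA_INTRINSICS)
    nearest = int(np.argmin(np.linalg.norm(uv - np.asarray(click_uv), axis=1)))
    return {"command": "interact", "object": object_ids[nearest]}

# Toy example: impose two indicators on a blank frame, then map a click
# at the image center to the nearest object.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
cloud = np.array([[0.1, 0.0, 1.0], [0.0, 0.1, 1.2]])
rendered = overlay_point_cloud(frame, cloud)
print(on_select((320, 240), cloud, ["cup", "book"]))

The toy example condenses the claim's flow: the overlay step corresponds to generating the virtual representation for display, and the selection handler corresponds to receiving input data and outputting an interaction instruction. Path planning for the claim's navigation-path display is omitted, as the claim attributes the underlying mapping data to the other robot rather than to this pipeline.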