US 11,866,258 B2
User interface for mission generation of area-based operation by autonomous robots in a facility context
Josip Cesic, Zagreb (HR); Tomislav Haus, Donja Stubica (HR); and Kruno Lenac, Zagreb (HR)
Assigned to GIDEON BROTHERS D.O.O., Osijek (HR)
Filed by Gideon Brothers d.o.o., Osijek (HR)
Filed on Dec. 30, 2020, as Appl. No. 17/138,435.
Claims priority of provisional application 63/111,480, filed on Nov. 9, 2020.
Claims priority of provisional application 63/093,682, filed on Oct. 19, 2020.
Prior Publication US 2022/0121353 A1, Apr. 21, 2022
Int. Cl. G06F 3/0486 (2013.01); B65G 1/06 (2006.01); B66F 9/06 (2006.01); G05D 1/02 (2020.01); G05D 1/00 (2006.01); B65G 1/137 (2006.01); G06V 20/10 (2022.01); G06V 30/194 (2022.01); G06V 20/64 (2022.01); G06F 18/21 (2023.01); G05B 19/418 (2006.01); G06F 3/04847 (2022.01)
CPC B65G 1/06 (2013.01) [B65G 1/1375 (2013.01); B66F 9/063 (2013.01); G05B 19/41895 (2013.01); G05D 1/0044 (2013.01); G05D 1/0212 (2013.01); G05D 1/0214 (2013.01); G05D 1/0231 (2013.01); G05D 1/0297 (2013.01); G06F 3/0486 (2013.01); G06F 18/21 (2023.01); G06V 20/10 (2022.01); G06V 20/64 (2022.01); G06V 30/194 (2022.01); B65G 2203/0233 (2013.01); B65G 2203/041 (2013.01); G05D 2201/0216 (2013.01); G06F 3/04847 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A non-transitory computer readable medium comprising memory with instructions encoded thereon that, when executed, cause one or more processors to perform operations, the instructions comprising instructions to:
generate for display to a remote operator a user interface comprising a map of a warehouse facility, the map comprising visual representations of a source area comprising a plurality of pallets, a plurality of candidate robots, and a plurality of candidate destination areas that are configured to accept one or more of the plurality of pallets;
receive, via the user interface, a selection of a visual representation of a candidate robot of the plurality of candidate robots;
detect a plurality of gestures comprising a drag-and-drop gesture within the user interface of the visual representation of the candidate robot being dragged-and-dropped to a visual representation of a candidate destination area of the plurality of candidate destination areas, and the plurality of gestures comprising a drag gesture defining a target orientation; and
responsive to detecting the plurality of gestures, generate a mission, wherein the mission causes the candidate robot to autonomously load a pallet of the plurality of pallets from the source area onto a fork of the candidate robot, to transport the pallet to the candidate destination area, and to unload the pallet from the fork of the candidate robot to a zone within the candidate destination area, the pallet unloaded so as to have the target orientation defined by the drag gesture when resting within the candidate destination area after being unloaded.
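For readers less familiar with how a drag-and-drop-to-mission flow of this kind might be wired up on the front end, the sketch below illustrates one possible way a dropped robot icon and an orientation drag could be turned into a mission request. This is not the patented implementation: every name here (MissionRequest, buildMission, orientationFromDrag, the specific IDs) is hypothetical, and the claimed system may structure the data and gesture handling very differently.

```typescript
// Hypothetical types for the mission-generation flow recited in claim 1.
// None of these names come from the patent; they are illustrative only.

interface Point { x: number; y: number; }

interface MissionRequest {
  robotId: string;              // the candidate robot chosen by the operator
  sourceAreaId: string;         // source area holding the pallets to be moved
  destinationAreaId: string;    // destination area the robot icon was dropped on
  targetOrientationRad: number; // pallet orientation derived from the drag gesture
}

// Derive a target orientation from the final drag gesture: the angle of the
// vector traced by the operator while dragging within the destination area.
function orientationFromDrag(start: Point, end: Point): number {
  return Math.atan2(end.y - start.y, end.x - start.x);
}

// Called once the robot's visual representation has been dropped onto a
// destination area and the operator has finished the orientation drag.
function buildMission(
  robotId: string,
  sourceAreaId: string,
  destinationAreaId: string,
  dragStart: Point,
  dragEnd: Point,
): MissionRequest {
  return {
    robotId,
    sourceAreaId,
    destinationAreaId,
    targetOrientationRad: orientationFromDrag(dragStart, dragEnd),
  };
}

// Example: the operator drags robot "R-07" onto destination area "DST-3"
// and sketches a rightward drag, yielding a target orientation of 0 rad.
const mission = buildMission(
  "R-07",
  "SRC-1",
  "DST-3",
  { x: 120, y: 80 },
  { x: 180, y: 80 },
);
console.log(mission);
```

In a system like the one described by the claim, a request of this shape would presumably be handed to a fleet-management backend, which would translate it into the autonomous pick, transport, and place sequence the claim recites (load a pallet from the source area onto the fork, transport it, and unload it in the destination area at the target orientation); that backend logic is outside the scope of this sketch.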