US 12,440,987 B2
Processing systems and methods for providing processing of a variety of objects
Thomas Wagner, Concord, MA (US); Kevin Ahearn, Nebo, NC (US); Benjamin Cohen, White Plains, NY (US); Michael Dawson-Haggerty, Pittsburgh, PA (US); Christopher Geyer, Arlington, MA (US); Thomas Koletschka, Somerville, MA (US); Kyle Maroney, Saunderstown, RI (US); Matthew T. Mason, Atlanta, GA (US); Gene Temple Price, Somerville, MA (US); Joseph Romano, Arlington, MA (US); Daniel Carlton Smith, Wexford, PA (US); Siddhartha Srinivasa, Seattle, WA (US); Prasanna Velagapudi, Pittsburgh, PA (US); and Thomas Allen, Reading, MA (US)
Assigned to Berkshire Grey Operating Company, Inc., Bedford, MA (US)
Filed by Berkshire Grey Operating Company, Inc., Bedford, MA (US)
Filed on Jul. 2, 2024, as Appl. No. 18/761,968.
Application 18/761,968 is a continuation of application No. 17/843,666, filed on Jun. 17, 2022, granted, now 12,059,810.
Application 17/843,666 is a continuation of application No. 16/775,798, filed on Jan. 29, 2020, granted, now 11,420,329, issued on Aug. 23, 2022.
Application 16/775,798 is a continuation of application No. 15/348,498, filed on Nov. 10, 2016, granted, now 10,625,432, issued on Apr. 21, 2020.
Claims priority of provisional application 62/277,234, filed on Jan. 11, 2016.
Claims priority of provisional application 62/255,069, filed on Nov. 13, 2015.
Prior Publication US 2024/0359328 A1, Oct. 31, 2024
Int. Cl. B25J 9/16 (2006.01); B07C 3/18 (2006.01); B07C 5/36 (2006.01); B25J 9/00 (2006.01); B25J 19/04 (2006.01); G05B 19/418 (2006.01)
CPC B25J 9/1669 (2013.01) [B07C 3/18 (2013.01); B07C 5/36 (2013.01); B25J 9/0093 (2013.01); B25J 9/1612 (2013.01); B25J 9/1664 (2013.01); B25J 9/1687 (2013.01); B25J 9/1697 (2013.01); B25J 19/04 (2013.01); G05B 19/4183 (2013.01); G05B 2219/32037 (2013.01); G05B 2219/39106 (2013.01); G05B 2219/39295 (2013.01); G05B 2219/39476 (2013.01); G05B 2219/39484 (2013.01); G05B 2219/39504 (2013.01); G05B 2219/39548 (2013.01); G05B 2219/40053 (2013.01); G05B 2219/40078 (2013.01); G05B 2219/40116 (2013.01); G05B 2219/40538 (2013.01); G05B 2219/45045 (2013.01); Y02P 90/02 (2015.11)] 27 Claims
OG exemplary drawing
 
1. An object processing system comprising:
a programmable motion device including an end-effector;
a perception unit for capturing real-time image data of the plurality of objects at an input area;
an interactive display system including a touch screen input display for displaying the real-time image data and through which machine learning grasp input data regarding a plurality of objects is received; and
a control system accessing the machine learning grasp input data and for providing object grasp information regarding a grasp location for grasping the object responsive to the machine learning grasp input data regarding a plurality of objects.
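
A minimal, illustrative sketch of the data flow recited in claim 1 follows; all class, attribute, and method names are hypothetical and are not drawn from the patent. A perception unit captures real-time image data, touch-screen input supplies the machine learning grasp input data, and a control system returns a grasp location responsive to that data.

# Illustrative sketch only; names are hypothetical, not from the patent.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class GraspAnnotation:
    object_id: str
    pixel_xy: Tuple[int, int]          # location touched on the displayed image

@dataclass
class ControlSystem:
    learned_grasps: List[GraspAnnotation] = field(default_factory=list)

    def add_grasp_input(self, annotation: GraspAnnotation) -> None:
        # Accumulate operator touch input as machine learning grasp input data.
        self.learned_grasps.append(annotation)

    def grasp_location(self, object_id: str) -> Tuple[int, int]:
        # Provide a grasp location responsive to the accumulated input data;
        # here simply the most recent annotation for the requested object.
        for annotation in reversed(self.learned_grasps):
            if annotation.object_id == object_id:
                return annotation.pixel_xy
        raise KeyError(object_id)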
 
10. An object processing system comprising:
a programmable motion device including an end-effector;
a perception unit for capturing real-time image data of the plurality of objects at an input area;
an interactive display system that includes a touch screen input display for displaying the real-time image data; and
a control system for providing object grasp information regarding a plurality of grasp locations for grasping the object with the end-effector, the plurality of object grasp locations being derived from a plurality of machine learning grasp input data regarding a plurality of objects, the machine learning grasp input data including data received via the interactive display system that includes the touch screen input display.
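
As an illustrative counterpart to claim 10, again with hypothetical names, the control system may derive a plurality of candidate grasp locations for an object from the accumulated machine learning grasp input data, including touch-screen annotations; the sketch below is an assumption about one possible realization, not the patented implementation.

from typing import List, Tuple

def candidate_grasp_locations(
    grasp_inputs: List[Tuple[str, Tuple[int, int]]],
    object_id: str,
) -> List[Tuple[int, int]]:
    # Each entry pairs an object identifier with an annotated grasp location
    # (e.g., received via the touch screen). Returning every matching location
    # gives the end-effector fallback grasps if the first attempt fails.
    return [xy for oid, xy in grasp_inputs if oid == object_id]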
 
19. A method of processing objects received at an input area, said method comprising:
providing a programmable motion device with an end-effector;
obtaining first grasp input information for a selected object of a plurality of objects in a container at an input area responsive to machine learning grasp input data;
using the end-effector to move the selected object of the plurality of objects in the container at the input area without grasping the object; and
obtaining second grasp input information for the selected object of the plurality of objects in the container at the input area responsive to machine learning grasp input data.
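
A hedged sketch of the claim-19 method follows; the device, perception, and model objects are placeholders for components the claim does not specify. The input area is imaged, a first grasp input is obtained, the selected object is moved without being grasped, and a second grasp input is then obtained from the updated scene.

from typing import Tuple

Grasp = Tuple[float, float, float]   # x, y, approach angle (illustrative only)

def process_selected_object(device, perception, model, object_id: str) -> Grasp:
    image = perception.capture()                     # real-time image of the input area
    first = model.grasp_for(image, object_id)        # first grasp input information
    device.move_without_grasping(object_id, first)   # perturb the selected object
    image = perception.capture()                     # re-image the container
    second = model.grasp_for(image, object_id)       # second grasp input information
    return second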