US 11,054,811 B2
Systems and methods for line balancing
Prasad Narasimha Akella, Palo Alto, CA (US); Krishnendu Chaudhury, Saratoga, CA (US); Sameer Gupta, Palo Alto, CA (US); and Ananth Uggirala, Mountain View, CA (US)
Assigned to Drishti Technologies, Inc., Palo Alto, CA (US)
Filed by Drishti Technologies, Inc., Palo Alto, CA (US)
Filed on Nov. 5, 2018, as Appl. No. 16/181,112.
Claims priority of provisional application 62/581,541, filed on Nov. 3, 2017.
Prior Publication US 2019/0137979 A1, May 9, 2019
Int. Cl. G05B 19/418 (2006.01); G06Q 10/06 (2012.01); G06F 16/9035 (2019.01); G06F 16/904 (2019.01); G06F 16/2455 (2019.01); G06F 30/20 (2020.01); G06F 30/23 (2020.01); G06N 3/04 (2006.01); G06N 3/08 (2006.01); G06F 16/23 (2019.01); G06F 11/07 (2006.01); G06N 20/00 (2019.01); G06K 9/00 (2006.01); G06F 16/22 (2019.01); G06N 3/00 (2006.01); G06F 9/48 (2006.01); G06F 16/901 (2019.01); G06N 7/00 (2006.01); G06F 9/448 (2018.01); G06T 19/00 (2011.01); G09B 19/00 (2006.01); G06F 111/10 (2020.01); G06F 111/20 (2020.01); G06K 9/62 (2006.01); G01M 99/00 (2011.01); G06Q 50/26 (2012.01); B25J 9/16 (2006.01); G05B 19/423 (2006.01); G16H 10/60 (2018.01); G06Q 10/08 (2012.01)
CPC G05B 19/41835 (2013.01) [G05B 19/4183 (2013.01); G06F 9/4498 (2018.02); G06F 9/4881 (2013.01); G06F 11/079 (2013.01); G06F 11/0721 (2013.01); G06F 16/2228 (2019.01); G06F 16/2365 (2019.01); G06F 16/24568 (2019.01); G06F 16/904 (2019.01); G06F 16/9024 (2019.01); G06F 16/9035 (2019.01); G06F 30/20 (2020.01); G06F 30/23 (2020.01); G06K 9/00335 (2013.01); G06N 3/008 (2013.01); G06N 3/04 (2013.01); G06N 3/0445 (2013.01); G06N 3/0454 (2013.01); G06N 3/08 (2013.01); G06N 3/084 (2013.01); G06N 7/005 (2013.01); G06N 20/00 (2019.01); G06Q 10/06 (2013.01); G06Q 10/06316 (2013.01); G06Q 10/06393 (2013.01); G06Q 10/06395 (2013.01); G06Q 10/06398 (2013.01); G06Q 10/063112 (2013.01); G06T 19/006 (2013.01); G09B 19/00 (2013.01); B25J 9/1664 (2013.01); B25J 9/1697 (2013.01); G01M 99/005 (2013.01); G05B 19/41865 (2013.01); G05B 19/423 (2013.01); G05B 2219/32056 (2013.01); G05B 2219/36442 (2013.01); G06F 2111/10 (2020.01); G06F 2111/20 (2020.01); G06K 9/6262 (2013.01); G06N 3/006 (2013.01); G06Q 10/083 (2013.01); G06Q 50/26 (2013.01); G16H 10/60 (2018.01)] 24 Claims
OG exemplary drawing
 
1. A method comprising:
receiving one or more sensor streams including one or more video frame streams with an engine;
utilizing the engine to identify one or more actions from the one or more video frame streams that are performed on a product at a first station of a plurality of stations;
utilizing the engine to identify one or more actions from the one or more video frame streams that are performed on the product at a second station of the plurality of stations;
storing in a data structure the received one or more sensor streams, the identified one or more actions performed on the product at the first station, and the identified one or more actions performed on the product at the second station, wherein the identified one or more actions performed on the product at each of the first and second stations are mapped to the one or more sensor streams;
utilizing the engine to characterize each of the identified one or more actions performed on the product at each of the first and second stations to produce determined characterizations thereof, wherein the determined characterizations include time taken to perform the one or more actions performed at each of the first and second stations; and
based on one or more of the determined characterizations, automatically producing a recommendation, either dynamically or post-facto, to move at least one of the identified one or more actions performed on the product at the second station to the first station to reduce cycle time.
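The following is a minimal illustrative sketch, not part of the patent, of the line-balancing logic described in claim 1: actions identified at two stations are characterized by the time taken to perform them, and a recommendation is produced to move an action from the second station to the first when doing so reduces the bottleneck cycle time. All names, durations, and frame ranges below are hypothetical assumptions for illustration, and the action-recognition engine itself is assumed to have already labeled the actions from the video frame streams.

    # Hypothetical sketch of the claimed line-balancing recommendation step (Python).
    # The engine, stations, actions, and numbers below are assumptions, not the patented implementation.
    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class Action:
        name: str                    # label assigned by the action-recognition engine
        station: str                 # station at which the action was observed
        duration_s: float            # time taken to perform the action (a determined characterization)
        frame_range: Tuple[int, int] # indices into the stored video frame stream (the sensor-stream mapping)

    @dataclass
    class StationLog:
        station: str
        actions: List[Action] = field(default_factory=list)

        def cycle_time(self) -> float:
            # Total time taken to perform the identified actions at this station.
            return sum(a.duration_s for a in self.actions)

    def recommend_rebalance(first: StationLog, second: StationLog) -> Optional[Tuple[Action, float]]:
        """Suggest moving one action from the second station to the first station
        when doing so lowers the line's bottleneck cycle time."""
        if second.cycle_time() <= first.cycle_time():
            return None  # the second station is not the bottleneck; nothing to move
        best_action = None
        best_bottleneck = max(first.cycle_time(), second.cycle_time())
        for a in second.actions:
            new_first = first.cycle_time() + a.duration_s
            new_second = second.cycle_time() - a.duration_s
            bottleneck = max(new_first, new_second)
            if bottleneck < best_bottleneck:
                best_action, best_bottleneck = a, bottleneck
        return (best_action, best_bottleneck) if best_action else None

    # Hypothetical action logs as the engine might store them in the data structure.
    station_1 = StationLog("station 1", [Action("pick part", "station 1", 3.0, (0, 90)),
                                         Action("insert screw", "station 1", 2.0, (90, 150))])
    station_2 = StationLog("station 2", [Action("attach cover", "station 2", 7.0, (150, 360)),
                                         Action("apply label", "station 2", 5.0, (360, 510))])

    result = recommend_rebalance(station_1, station_2)
    if result:
        action, new_ct = result
        print(f"Recommend moving '{action.name}' to station 1; bottleneck cycle time drops to {new_ct:.1f}s")

With the hypothetical durations above, the second station (12 s) is the bottleneck; moving "apply label" to the first station reduces the bottleneck cycle time to 10 s, which is the kind of dynamic or post-facto recommendation the claim recites.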