US 12,276,969 B2
Automated work chart systems and methods
Ananth Uggirala, Mountain View, CA (US); Yash Raj Chhabra, New Delhi (IN); Zakaria Ibrahim Assoul, Oakland, CA (US); Krishnendu Chaudhury, Saratoga, CA (US); and Prasad Narasimha Akella, Palo Alto, CA (US)
Assigned to R4N63R CAPITAL LLC, Wilmington, DE (US)
Filed by R4N63R Capital LLC, Wilmington, DE (US)
Filed on Nov. 5, 2018, as Appl. No. 16/181,173.
Claims priority of provisional application 62/581,541, filed on Nov. 3, 2017.
Prior Publication US 2019/0138971 A1, May 9, 2019
This patent is subject to a terminal disclaimer.
Int. Cl. G05B 19/418 (2006.01); G06F 9/448 (2018.01); G06F 9/48 (2006.01); G06F 11/07 (2006.01); G06F 11/34 (2006.01); G06F 16/22 (2019.01); G06F 16/23 (2019.01); G06F 16/2455 (2019.01); G06F 16/901 (2019.01); G06F 16/9035 (2019.01); G06F 16/904 (2019.01); G06F 30/20 (2020.01); G06F 30/23 (2020.01); G06F 30/27 (2020.01); G06N 3/008 (2023.01); G06N 3/04 (2023.01); G06N 3/044 (2023.01); G06N 3/045 (2023.01); G06N 3/08 (2023.01); G06N 3/084 (2023.01); G06N 7/01 (2023.01); G06N 20/00 (2019.01); G06Q 10/06 (2023.01); G06Q 10/0631 (2023.01); G06Q 10/0639 (2023.01); G06T 19/00 (2011.01); G06V 10/25 (2022.01); G06V 10/44 (2022.01); G06V 10/82 (2022.01); G06V 20/52 (2022.01); G06V 40/20 (2022.01); G09B 19/00 (2006.01); B25J 9/16 (2006.01); G01M 99/00 (2011.01); G05B 19/423 (2006.01); G05B 23/02 (2006.01); G06F 18/21 (2023.01); G06F 111/10 (2020.01); G06F 111/20 (2020.01); G06N 3/006 (2023.01); G06Q 10/083 (2023.01); G06Q 50/26 (2012.01); G16H 10/60 (2018.01)
CPC G05B 19/4183 (2013.01) [G05B 19/41835 (2013.01); G06F 9/4498 (2018.02); G06F 9/4881 (2013.01); G06F 11/0721 (2013.01); G06F 11/079 (2013.01); G06F 11/3452 (2013.01); G06F 16/2228 (2019.01); G06F 16/2365 (2019.01); G06F 16/24568 (2019.01); G06F 16/9024 (2019.01); G06F 16/9035 (2019.01); G06F 16/904 (2019.01); G06F 30/20 (2020.01); G06F 30/23 (2020.01); G06F 30/27 (2020.01); G06N 3/008 (2013.01); G06N 3/04 (2013.01); G06N 3/044 (2023.01); G06N 3/045 (2023.01); G06N 3/08 (2013.01); G06N 3/084 (2013.01); G06N 7/01 (2023.01); G06N 20/00 (2019.01); G06Q 10/06 (2013.01); G06Q 10/063112 (2013.01); G06Q 10/06316 (2013.01); G06Q 10/06393 (2013.01); G06Q 10/06395 (2013.01); G06Q 10/06398 (2013.01); G06T 19/006 (2013.01); G06V 10/25 (2022.01); G06V 10/454 (2022.01); G06V 10/82 (2022.01); G06V 20/52 (2022.01); G06V 40/20 (2022.01); G09B 19/00 (2013.01); B25J 9/1664 (2013.01); B25J 9/1697 (2013.01); G01M 99/005 (2013.01); G05B 19/41865 (2013.01); G05B 19/423 (2013.01); G05B 23/0224 (2013.01); G05B 2219/32056 (2013.01); G05B 2219/36442 (2013.01); G06F 18/217 (2023.01); G06F 2111/10 (2020.01); G06F 2111/20 (2020.01); G06N 3/006 (2013.01); G06Q 10/083 (2013.01); G06Q 50/26 (2013.01); G16H 10/60 (2018.01)] 19 Claims
[OG exemplary drawing not reproduced]
 
1. A method of creating work charts comprising:
receiving one or more given indicators or criteria including a selection of cycles within a given time period;
accessing one or more given data sets based on the one or more given indicators or criteria, wherein the one or more given data sets include one or more indicators of at least one of one or more cycles, one or more processes, one or more actions, one or more sequences, one or more objects, and one or more parameters of a manufacturing operation of a product determined by convolution neural network deep learning using a computing device executing a machine learning engine over each of a plurality of video frame sensor streams from a plurality of manufacturing stations across an assembly line, wherein the one or more given indicators of the at least one of one or more cycles, one or more processes, one or more actions, one or more sequences, one or more objects, and one or more parameters of the manufacturing operation are indexed to corresponding portions of the plurality of video frame sensor streams, wherein the determination by the convolution neural network deep learning comprises:
performing, with a frame feature extractor, a two-dimensional convolution operation on video frames of the plurality of video frame sensor streams to generate a two-dimensional array of feature vectors;
combining neighboring feature vectors in the two-dimensional array of feature vectors and determining a dynamic region of interest, with a region of interest detector, in a set of the neighboring feature vectors, wherein the region of interest detector and the frame feature extractor share layers of a convolution neural network; and
extracting a feature vector from an area within the dynamic region of interest to analyze with the convolution neural network while discarding remaining feature vectors of the two-dimensional array outside of the dynamic region of interest;
determining, by the computing device executing the machine learning engine, a representative data set including the at least one of one or more cycles, one or more processes, one or more actions, one or more sequences, one or more objects, and one or more parameters, and quantitative data, judgement data, and inference data statistically derived from the one or more given data sets for the selection of cycles within the given time period;
creating, by the computing device executing the machine learning engine, a work chart from the representative data set, wherein the work chart includes a plurality of work elements and dependencies between the plurality of work elements and associated time for performing the plurality of work elements and associated time between the plurality of work elements for the selection of cycles within the given time period, and wherein the plurality of work elements are indexed to the at least one of one or more cycles, one or more processes, one or more actions, one or more sequences, one or more objects, and one or more parameters of the manufacturing operation of the representative data set and to the corresponding portions of the plurality of video frame sensor streams; and
adjusting the manufacturing operation of the product to gain efficiencies of one or more of the plurality of work elements based on the dependencies between the plurality of work elements and the associated time for performing the plurality of work elements and the associated time between the plurality of work elements in the work chart.
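The determination recited in the body of claim 1 describes a frame feature extractor and a region-of-interest detector that share convolutional layers, with only the feature vectors inside the dynamic region of interest retained for further analysis. The following is a minimal sketch of that shared-backbone arrangement, assuming PyTorch (the patent names no framework); all class, function, and variable names here are illustrative assumptions, not terms from the patent.

```python
# Illustrative sketch (not the patented implementation) of a frame feature
# extractor and region-of-interest detector sharing convolutional layers.
# All names are hypothetical. Assumes PyTorch.
import torch
import torch.nn as nn

class SharedBackbone(nn.Module):
    """Convolutional layers shared by the feature extractor and ROI detector."""
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )

    def forward(self, frames):
        # frames: (N, 3, H, W) -> (N, 64, H/4, W/4), a 2-D array of feature vectors
        return self.layers(frames)

class RoiDetector(nn.Module):
    """Combines neighboring feature vectors (3x3 convolution) and scores each
    cell of the feature grid to locate the dynamic region of interest."""
    def __init__(self, channels=64):
        super().__init__()
        self.combine = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.score = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feature_grid):
        pooled = torch.relu(self.combine(feature_grid))
        return self.score(pooled).squeeze(1)  # (N, H', W') per-cell scores

def extract_roi_features(frames, backbone, detector, k=16):
    """Keep the k highest-scoring cells (the dynamic ROI) and discard the
    remaining feature vectors of the two-dimensional array."""
    grid = backbone(frames)                   # (N, C, H', W')
    scores = detector(grid)                   # (N, H', W')
    n, c, h, w = grid.shape
    flat_scores = scores.reshape(n, -1)       # (N, H'*W')
    flat_grid = grid.reshape(n, c, -1)        # (N, C, H'*W')
    top = flat_scores.topk(k, dim=1).indices  # indices of ROI cells
    idx = top.unsqueeze(1).expand(-1, c, -1)  # (N, C, k)
    return flat_grid.gather(2, idx)           # ROI feature vectors only

frames = torch.randn(2, 3, 128, 128)          # two dummy video frames
roi = extract_roi_features(frames, SharedBackbone(), RoiDetector())
print(roi.shape)                              # torch.Size([2, 64, 16])
```

Because the two heads read the same backbone output, one forward pass over each video frame serves both feature extraction and ROI detection, which is the apparent point of the claim's shared-layers limitation.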
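The work chart of the claim carries four kinds of information: work elements, dependencies between them, the associated time for performing each element, the associated time between elements, and an index from each element back to the corresponding portions of the video frame sensor streams. A hypothetical data layout capturing those fields, with all names invented for illustration, might look like this:

```python
# Hypothetical structure for the claimed work chart. Field names are
# illustrative assumptions, not taken from the patent.
from dataclasses import dataclass, field

@dataclass
class VideoSpan:
    station_id: str       # which manufacturing station's stream
    start_frame: int
    end_frame: int

@dataclass
class WorkElement:
    element_id: str
    duration_s: float     # associated time for performing the element
    video_spans: list[VideoSpan] = field(default_factory=list)  # index into streams

@dataclass
class Dependency:
    upstream: str         # element that must finish first
    downstream: str       # element that waits on it
    gap_s: float          # associated time between the two elements

@dataclass
class WorkChart:
    elements: dict[str, WorkElement]
    dependencies: list[Dependency]
```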
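The final step adjusts the manufacturing operation based on the dependencies and the associated times in the chart. The patent does not state an algorithm for this, but one plausible use of such a chart is critical-path analysis: treating the chart as a directed acyclic graph, the longest chain of element durations plus inter-element gaps bounds the cycle time, so elements on that chain are the ones worth shortening. A sketch building on the hypothetical WorkChart above:

```python
# Illustrative critical-path analysis over the hypothetical WorkChart;
# assumes the dependency graph is acyclic. Not the patent's stated method.
def critical_path(chart: WorkChart) -> tuple[float, list[str]]:
    succ = {e: [] for e in chart.elements}
    indeg = {e: 0 for e in chart.elements}
    for d in chart.dependencies:
        succ[d.upstream].append((d.downstream, d.gap_s))
        indeg[d.downstream] += 1
    # Kahn's algorithm for a topological order.
    order, queue = [], [e for e, n in indeg.items() if n == 0]
    while queue:
        e = queue.pop()
        order.append(e)
        for nxt, _ in succ[e]:
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                queue.append(nxt)
    # Longest earliest-finish time per element, relaxed in topological order.
    finish = {e: chart.elements[e].duration_s for e in order}
    prev = {e: None for e in order}
    for e in order:
        for nxt, gap in succ[e]:
            cand = finish[e] + gap + chart.elements[nxt].duration_s
            if cand > finish[nxt]:
                finish[nxt], prev[nxt] = cand, e
    end = max(finish, key=finish.get)
    path = []
    while end is not None:
        path.append(end)
        end = prev[end]
    return finish[path[0]], path[::-1]
```

Elements on the returned path bound the cycle time for the selected cycles, so under this reading the claim's adjustment step would target those elements first.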