US 12,462,198 B1
Iterative attention-based neural network training and processing
Steven Dennis Flinn, Sugar Land, TX (US); and Naomi Felina Moneypenny, Bellevue, WA (US)
Assigned to STEVEN D. FLINN, Sugar Land, TX (US)
Filed by Steven D. Flinn, Brenham, TX (US)
Filed on Jan. 16, 2025, as Appl. No. 19/024,727.
Application 19/024,727 is a continuation of application No. 18/963,473, filed on Nov. 27, 2024, granted, now 12,327,166.
Application 18/963,473 is a continuation of application No. 18/101,612, filed on Jan. 26, 2023, granted, now 12,223,404.
Application 18/101,612 is a continuation of application No. 16/660,908, filed on Oct. 23, 2019, granted, now 11,593,708, issued on Feb. 28, 2023.
Application 16/660,908 is a continuation of application No. 15/000,011, filed on Jan. 18, 2016, granted, now 10,510,018, issued on Dec. 17, 2019.
Application 15/000,011 is a continuation in part of application No. 14/816,439, filed on Aug. 3, 2015, abandoned.
This patent is subject to a terminal disclaimer.
Int. Cl. G06N 5/048 (2023.01); G06F 40/211 (2020.01); G06F 40/216 (2020.01); G06F 40/30 (2020.01); G06N 3/043 (2023.01); G06N 3/045 (2023.01); G06N 5/045 (2023.01); G06N 20/00 (2019.01); G06N 3/02 (2006.01)
CPC G06N 20/00 (2019.01) [G06F 40/211 (2020.01); G06F 40/216 (2020.01); G06F 40/30 (2020.01); G06N 3/043 (2023.01); G06N 3/045 (2023.01); G06N 5/045 (2013.01); G06N 5/048 (2013.01); G06N 3/02 (2013.01)] 115 Claims
OG exemplary drawing
 
1. A computer-implemented method, comprising:
at a system including one or more processors, and one or more memories in communication with the one or more processors and with one or more programs stored therein,
causing access to a computer-implemented neural network trained on computer code so that the trained computer-implemented neural network is configured for probability generation in connection with a plurality of executable computer code elements for selection therefrom;
causing access to a first plurality of syntactical elements;
causing a first attention to be directed to a representation of a first subset of the first plurality of syntactical elements;
causing a second attention to be directed to a representation of a second subset of the first plurality of syntactical elements, for use by the trained computer-implemented neural network;
causing generation, by application of the trained computer-implemented neural network, of a plurality of probabilities that are each associated with a corresponding subset of the plurality of executable computer code elements, based on the second attention;
causing a selection of one or more of the plurality of executable computer code elements based on the plurality of probabilities; and
causing a communication to be sent to a user.
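For illustration only, the following Python sketch mirrors the flow recited in claim 1: representations of syntactical elements, a first and a second attention, generation of probabilities over candidate executable computer code elements, and selection of one element based on those probabilities. The vocabulary, dimensions, and weights below are hypothetical stand-ins for a trained network, not the patented implementation.

```python
# Illustrative sketch only (not the patented implementation): a toy
# attention step over embeddings of syntactical elements, followed by
# probability generation over candidate executable code elements and
# selection of the most probable element. All names, dimensions, and
# weights are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vocabulary of executable computer code elements.
code_elements = ["return", "if", "for", "x + 1", "print(x)"]

# Embeddings for a first plurality of syntactical elements (e.g. tokens
# of partially written code), split into two subsets that receive a
# first and a second attention.
d = 8                                   # embedding width (assumed)
tokens = rng.normal(size=(6, d))        # six syntactical elements
first_subset, second_subset = tokens[:3], tokens[3:]

def attend(query, keys_values):
    """Scaled dot-product attention of one query over a subset."""
    scores = keys_values @ query / np.sqrt(d)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ keys_values        # attention-weighted summary

# The first attention over the first subset yields a query that the
# second attention applies over the second subset.
query = attend(rng.normal(size=d), first_subset)
context = attend(query, second_subset)

# A random, untrained output projection stands in for the trained
# network's head mapping the attended context to probabilities over
# the executable code elements.
W_out = rng.normal(size=(d, len(code_elements)))
logits = context @ W_out
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# Selection based on the generated probabilities.
selected = code_elements[int(np.argmax(probs))]
print("probabilities:", np.round(probs, 3))
print("selected code element:", selected)
```

Running the sketch prints a probability for each candidate element and the element selected from them; in the claimed method these probabilities would come from a neural network trained on computer code rather than from random weights.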