US 12,079,106 B1
Transformer-based bug fixing
Yaojie Hu, Ames, IA (US); Xingjian Shi, Sunnyvale, CA (US); Qiang Zhou, San Jose, CA (US); and Lee Pike, Portland, OR (US)
Assigned to Amazon Technologies, Inc., Seattle, WA (US)
Filed by Amazon Technologies, Inc., Seattle, WA (US)
Filed on Dec. 8, 2021, as Appl. No. 17/545,770.
Int. Cl. G06F 11/36 (2006.01); G06N 5/04 (2023.01)
CPC G06F 11/3636 (2013.01) [G06N 5/04 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A computer-implemented method comprising:
receiving a request to perform transformer-based bug fixing on code;
performing bug fixing inference on the code by applying a transformer model,
wherein the transformer model includes an encoder to generate an encoder memory from tokens from the code and a decoder to predict a next token in an editing sequence,
wherein the decoder includes a selectable first branch to predict edit actions and inserted words and a selectable second branch that uses a pointer network to predict edit locations based at least partially on the encoder memory generated by the encoder of the transformer model,
wherein the bug fixing inference is performed using a beam search over output of the encoder, and
wherein the beam search uses a finite state machine to determine a branch of the decoder to use; and
reporting out a result of the bug fixing inference, wherein the result includes an indication of a location of a potential edit to be made in the code and the potential edit to be made in the code.
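The decoding scheme in the claim can be illustrated in miniature: at each step of the editing sequence, a finite state machine selects which decoder branch (edit-location pointer vs. edit-action/word) may emit the next token, and a beam search ranks candidate edit sequences by accumulated score. The following is a minimal pure-Python sketch under stated assumptions; the state machine, token names, and scoring function are all hypothetical illustrations, not taken from the patent, and the real system would score tokens with the transformer's branch outputs over the encoder memory.

```python
# Hypothetical sketch of FSM-guided beam search over a two-branch edit
# decoder. All identifiers below are illustrative, not from the patent.

LOCATION, ACTION = "location", "action"  # the two decoder branches

def next_state(state, token):
    # Toy finite state machine: after an edit location is chosen, the
    # decoder switches to the action branch; an "insert" action stays on
    # the action branch (to emit the inserted word in a fuller model),
    # otherwise control returns to picking the next edit location.
    if state == LOCATION:
        return ACTION
    return ACTION if token == "insert" else LOCATION

def beam_search(score_fn, beam_width=2, steps=4):
    """score_fn(state, prefix) -> {token: log_prob} for the active branch."""
    beams = [((), LOCATION, 0.0)]  # (token prefix, FSM state, log prob)
    for _ in range(steps):
        candidates = []
        for prefix, state, log_prob in beams:
            # Only tokens from the branch selected by the FSM are expanded.
            for tok, tok_lp in score_fn(state, prefix).items():
                candidates.append(
                    (prefix + (tok,), next_state(state, tok), log_prob + tok_lp)
                )
        # Keep the highest-scoring hypotheses.
        beams = sorted(candidates, key=lambda b: -b[2])[:beam_width]
    return beams

# Toy scoring function standing in for the transformer's branch heads.
def toy_scores(state, prefix):
    if state == LOCATION:
        return {"loc_3": -0.1, "loc_7": -0.5}  # pointer over code tokens
    return {"delete": -0.2, "insert": -0.3}    # edit actions

best_prefix, best_state, best_lp = beam_search(toy_scores)[0]
```

With these toy scores the search alternates location and action tokens as the FSM dictates, yielding an edit sequence like `("loc_3", "delete", "loc_3", "delete")`: each entry pairs a predicted edit location with the edit to apply there, mirroring the claim's reported result of a location and a potential edit.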