US 12,437,183 B2
Subtask storage for streaming convolutions in neural network processor
Sayyed Karen Khatamifard, Bellevue, WA (US); Chenfan Sun, Shoreline, WA (US); Alon Yaakov, Raanana (IL); Husam Khashiboun, Peqiin (IL); Jeffrey D Marker, Pleasant View, UT (US); Saman Naderiparizi, Seattle, WA (US); Ramana V Rachakonda, Austin, TX (US); and Rohit K Gupta, Saratoga, CA (US)
Assigned to APPLE INC., Cupertino, CA (US)
Filed by Apple Inc., Cupertino, CA (US)
Filed on Jun. 6, 2022, as Appl. No. 17/833,476.
Prior Publication US 2023/0394276 A1, Dec. 7, 2023
Int. Cl. G06F 9/50 (2006.01); G06F 9/48 (2006.01); G06F 9/54 (2006.01); G06N 3/04 (2023.01)
CPC G06N 3/04 (2013.01) [G06F 9/4881 (2013.01); G06F 9/5016 (2013.01); G06F 9/5038 (2013.01); G06F 9/544 (2013.01); G06F 2209/5017 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A neural processor circuit, comprising:
a neural engine circuit configured to perform a plurality of convolution operations of a plurality of layers in a streaming mode; and
a neural task manager configured to:
obtain a plurality of task descriptors and a plurality of subtask descriptors, each of the plurality of task descriptors identifying a respective set of the plurality of convolution operations of a respective layer of the plurality of layers, each of the plurality of subtask descriptors identifying a corresponding task descriptor of the plurality of task descriptors and a subset of the plurality of convolution operations on a portion of a layer of the plurality of layers identified by the corresponding task descriptor, and
configure the neural engine circuit for execution of the subset of the plurality of convolution operations using the corresponding task descriptor,
wherein, in the streaming mode, the neural engine circuit is configured to perform the subset of the plurality of convolution operations to generate output data that correspond to input data of another subset of the plurality of convolution operations identified by another subtask descriptor of the plurality of subtask descriptors.
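The claimed descriptor scheme can be illustrated with a minimal sketch. All names here (`TaskDescriptor`, `SubtaskDescriptor`, `run_streaming`, the per-layer `weight`) are hypothetical, and the "convolution" is a placeholder elementwise multiply; this is not Apple's implementation, only a model of the relationship the claim describes: each task descriptor identifies one layer's operations, each subtask descriptor points at a task and a portion (tile) of that layer, and in streaming mode a subtask's output tile is consumed as the input tile of a subtask on the next layer.

```python
from dataclasses import dataclass

@dataclass
class TaskDescriptor:
    # One per layer: identifies that layer's set of convolution operations.
    layer: int
    weight: int  # hypothetical stand-in for the layer's kernel parameters

@dataclass
class SubtaskDescriptor:
    # Identifies the corresponding task descriptor and the portion (tile)
    # of that layer covered by this subtask's convolutions.
    task_id: int
    tile: slice

def run_streaming(tasks, subtasks, data):
    """Hypothetical task-manager loop: for each subtask descriptor, configure
    the engine from the corresponding task descriptor, run the subtask, and
    write its output where the next layer's subtask reads its input."""
    buf = list(data)
    for st in subtasks:
        t = tasks[st.task_id]
        # Placeholder "convolution": scale the tile by the layer weight.
        buf[st.tile] = [x * t.weight for x in buf[st.tile]]
    return buf

# Streaming (interleaved) order: layer 1 consumes each tile as soon as
# layer 0 has produced it, rather than waiting for the whole layer.
tasks = {0: TaskDescriptor(layer=0, weight=2), 1: TaskDescriptor(layer=1, weight=3)}
subtasks = [
    SubtaskDescriptor(0, slice(0, 2)), SubtaskDescriptor(1, slice(0, 2)),
    SubtaskDescriptor(0, slice(2, 4)), SubtaskDescriptor(1, slice(2, 4)),
]
result = run_streaming(tasks, subtasks, [1, 2, 3, 4])
# Every element passes through both layers (x * 2 * 3), tile by tile.
```

The interleaving in `subtasks` is the point of the claim: because subtask descriptors carve each layer into portions, the task manager can alternate layers over the same buffer region instead of materializing a full layer's output at once.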