CPC G06N 3/084 (2013.01) [G06F 18/213 (2023.01); G06F 18/24 (2023.01); G06F 18/24137 (2023.01); G06N 3/045 (2023.01); G06V 10/454 (2022.01); G06V 10/7715 (2022.01); G06V 10/82 (2022.01); G06V 30/18057 (2022.01); G06V 30/19127 (2022.01); G06V 30/19173 (2022.01); G06V 30/10 (2022.01)]. 20 Claims.
1. A device, comprising:
a processor; and
a memory,
the processor and the memory being configured as a neural network comprising:
at least one layer comprising an input and an output, the layer configured to receive an input feature map at the input and output an output feature map at the output, at least one of the input feature map and the output feature map having been quantized by a unitary quantizing operation to reduce a number of bits of at least one value of the feature map from a first predetermined number of bits to a second predetermined number of bits that is less than the first predetermined number of bits without changing a dimension of the feature map.
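For illustration only, the "unitary quantizing operation" recited in claim 1 (reducing the bit width of each feature-map value, e.g. from 32 bits to 8 bits, without changing the feature map's dimensions) can be sketched as a symmetric per-tensor quantizer in NumPy. The function name, parameters, and the choice of symmetric scaling are assumptions for this sketch, not limitations taken from the patent:

```python
import numpy as np

def unitary_quantize(fmap: np.ndarray, out_bits: int = 8):
    """Illustrative sketch: map each 32-bit float value of a feature map
    to an out_bits-wide signed integer. The operation is elementwise, so
    the feature map's shape (its dimensions) is unchanged; only the
    number of bits per value drops."""
    qmax = 2 ** (out_bits - 1) - 1                 # e.g. 127 for 8 bits
    max_abs = float(np.max(np.abs(fmap)))
    scale = max_abs / qmax if max_abs > 0 else 1.0  # avoid divide-by-zero
    q = np.clip(np.round(fmap / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

# Hypothetical feature map: batch 1, 16 channels, 8x8 spatial grid.
fmap = np.random.randn(1, 16, 8, 8).astype(np.float32)
q, scale = unitary_quantize(fmap)
assert q.shape == fmap.shape   # dimensions preserved
assert q.dtype == np.int8      # 32 bits per value reduced to 8 bits
```

The quantized tensor `q` together with the scalar `scale` allows an approximate reconstruction (`q * scale`) at the layer's input or output, which is the sense in which the bit reduction leaves the feature map's dimension intact.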