US 12,175,365 B2
Learning apparatus, method, and non-transitory computer readable medium
Atsushi Yaguchi, Tokyo (JP); Shuhei Nitta, Tokyo (JP); Yukinobu Sakata, Kawasaki (JP); and Akiyuki Tanizawa, Kawasaki (JP)
Assigned to KABUSHIKI KAISHA TOSHIBA, Tokyo (JP)
Filed by KABUSHIKI KAISHA TOSHIBA, Tokyo (JP)
Filed on Feb. 26, 2021, as Appl. No. 17/186,924.
Claims priority of application No. 2020-155384 (JP), filed on Sep. 16, 2020.
Prior Publication US 2022/0083856 A1, Mar. 17, 2022
Int. Cl. G06N 3/08 (2023.01); G06N 3/045 (2023.01)
CPC G06N 3/08 (2013.01) [G06N 3/045 (2023.01)] 18 Claims
OG exemplary drawing
 
1. A learning apparatus comprising:
processing circuitry configured to:
set one or more second training conditions based on a first training condition relating to a first trained model, and
train one or more neural networks in accordance with the one or more second training conditions, and generate one or more second trained models which execute a task identical to a task executed by the first trained model; and
a display configured to display a graph showing an inference performance and a calculation cost of each of the one or more second trained models,
wherein the processing circuitry is further configured to:
select, from among the one or more second trained models, a second trained model having an inference performance and/or calculation cost corresponding to an inference performance and/or calculation cost designated by a user through the graph;
set a third training condition based on the first training condition and the second training condition relating to the selected second trained model;
train a neural network in accordance with the third training condition and generate a third trained model; and
output the third trained model.
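To make the claimed workflow concrete, the following is a minimal, hypothetical Python sketch of the steps recited in claim 1: deriving second training conditions from a first training condition, training candidate models, presenting their performance/cost trade-off, selecting a candidate according to a user-designated point, and setting a third training condition from the first and the selected second condition. All names (TrainingCondition, train_model, the use of width multipliers, FLOPs, and accuracy) are illustrative assumptions, not terms defined by the patent, and the placeholder trainer stands in for any real training procedure.

```python
# Hypothetical sketch of the claimed training-condition workflow.
# Every identifier here is illustrative; the patent does not specify
# how training conditions, performance, or cost are represented.
from dataclasses import dataclass, replace
from typing import List


@dataclass
class TrainingCondition:
    """Bundle of hyperparameters treated as a 'training condition'."""
    learning_rate: float
    epochs: int
    width_multiplier: float  # controls model size, hence calculation cost


@dataclass
class TrainedModel:
    condition: TrainingCondition
    accuracy: float   # stands in for "inference performance"
    flops: float      # stands in for "calculation cost"


def train_model(cond: TrainingCondition) -> TrainedModel:
    """Placeholder trainer: pretends wider models cost more but score higher."""
    accuracy = min(0.99, 0.70 + 0.25 * cond.width_multiplier)
    flops = 1e9 * cond.width_multiplier
    return TrainedModel(cond, accuracy, flops)


def set_second_conditions(first: TrainingCondition) -> List[TrainingCondition]:
    """Derive candidate (second) conditions from the first condition."""
    return [replace(first, width_multiplier=w) for w in (0.25, 0.5, 0.75, 1.0)]


def select_by_user_point(models: List[TrainedModel], target_flops: float) -> TrainedModel:
    """Pick the candidate whose calculation cost is closest to the point
    the user designated on the performance/cost graph."""
    return min(models, key=lambda m: abs(m.flops - target_flops))


if __name__ == "__main__":
    first_condition = TrainingCondition(learning_rate=1e-3, epochs=100, width_multiplier=1.0)

    # Train candidate models under the second conditions; printing stands in
    # for the display that shows the performance/cost graph.
    candidates = [train_model(c) for c in set_second_conditions(first_condition)]
    for m in candidates:
        print(f"accuracy={m.accuracy:.2f}  flops={m.flops:.2e}")

    # Suppose the user designated a point near 0.5 GFLOPs on the graph.
    chosen = select_by_user_point(candidates, target_flops=0.5e9)

    # Third condition: derived from the first condition and the chosen second
    # condition, e.g. keep the chosen width but train longer.
    third_condition = replace(chosen.condition, epochs=first_condition.epochs * 2)
    final_model = train_model(third_condition)
    print("output model:", final_model)
```

In this sketch the second conditions vary only a single knob (model width) so that the candidates trace out a performance/cost curve; the claim itself places no such restriction, and the rule used to form the third condition (doubling the epoch count) is likewise just one example of combining the first and selected second conditions.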