US 12,236,203 B2
Translation method, model training method, electronic devices and storage mediums
Ruiqing Zhang, Beijing (CN); Xiyang Wang, Beijing (CN); Hui Liu, Beijing (CN); Zhongjun He, Beijing (CN); Zhi Li, Beijing (CN); and Hua Wu, Beijing (CN)
Assigned to BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD., Beijing (CN)
Filed by BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD., Beijing (CN)
Filed on Sep. 23, 2022, as Appl. No. 17/951,216.
Claims priority of application No. 202111353074.X (CN), filed on Nov. 16, 2021.
Prior Publication US 2023/0153543 A1, May 18, 2023
Int. Cl. G06F 40/51 (2020.01); G06F 40/42 (2020.01)
CPC G06F 40/51 (2020.01) [G06F 40/42 (2020.01)] 18 Claims
OG exemplary drawing
 
1. A translation method, comprising:
acquiring, based on a to-be-translated specified sentence and a pre-trained weighting model, a weight for each of at least two pre-trained translation models for translating the specified sentence; and
translating the specified sentence using the at least two translation models based on the weight for each translation model for translating the specified sentence, comprising:
acquiring target words for locations generated in a process of translating the specified sentence by the at least two translation models, based on the weight for each translation model for translating the specified sentence, comprising:
acquiring several candidate words predicted for each location in the process of translating the specified sentence by each of the at least two translation models, and probabilities corresponding to the candidate words;
calculating an inference probability of each candidate word for each location, based on the probability corresponding to each candidate word predicted during the translation by each of the at least two translation models and the weight for each translation model for translating the specified sentence; and
determining the target word at each location based on the inference probabilities of the candidate words for each location.
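The decoding scheme recited in claim 1 can be summarized as a weighted ensemble: a weighting model assigns a sentence-specific weight to each translation model, each model proposes candidate words with probabilities at each decoding location, and the candidate with the highest weight-combined ("inference") probability is selected as the target word. The Python sketch below illustrates this idea only; the function names, data structures, and toy probability tables are hypothetical stand-ins and are not taken from the patented implementation.

```python
# Illustrative sketch of the weighted multi-model decoding described in claim 1.
# All names and numbers here are assumptions for demonstration purposes.

def weighting_model(sentence, num_models=2):
    """Toy weighting model: returns one weight per translation model for this
    particular source sentence. A real weighting model would be a trained
    network conditioned on the sentence; here we return fixed example weights."""
    return [0.6, 0.4][:num_models]

# Each toy "translation model" maps a decoding location to its candidate
# words and their predicted probabilities for the example sentence.
MODEL_A = {0: {"the": 0.7, "a": 0.3}, 1: {"cat": 0.6, "dog": 0.4}}
MODEL_B = {0: {"the": 0.5, "a": 0.5}, 1: {"dog": 0.55, "cat": 0.45}}
TRANSLATION_MODELS = [MODEL_A, MODEL_B]

def translate(sentence, models, weights):
    """Greedy decoding: at each location, combine each model's candidate-word
    probabilities with the sentence-level model weights into an inference
    probability, then pick the candidate with the highest score."""
    num_locations = min(len(m) for m in models)  # toy stopping rule
    target_words = []
    for loc in range(num_locations):
        inference_prob = {}
        for model, weight in zip(models, weights):
            for word, prob in model[loc].items():
                # Weighted combination of this model's probability for the word.
                inference_prob[word] = inference_prob.get(word, 0.0) + weight * prob
        target_words.append(max(inference_prob, key=inference_prob.get))
    return target_words

if __name__ == "__main__":
    sentence = "example source sentence"
    weights = weighting_model(sentence, num_models=len(TRANSLATION_MODELS))
    print(translate(sentence, TRANSLATION_MODELS, weights))  # -> ['the', 'cat']
```

In this toy run, location 0 scores "the" at 0.6*0.7 + 0.4*0.5 = 0.62 versus "a" at 0.38, and location 1 scores "cat" at 0.54 versus "dog" at 0.46, so the selected target words are "the cat"; the sentence-level weights let the stronger model dominate without discarding the other model's candidates.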