US 12,002,487 B2
Information processing apparatus and information processing method for selecting a character response to a user based on emotion and intimacy
Hideo Nagasaka, Tokyo (JP)
Assigned to SONY GROUP CORPORATION, Tokyo (JP)
Appl. No. 17/430,755
Filed by SONY GROUP CORPORATION, Tokyo (JP)
PCT Filed Feb. 22, 2019, PCT No. PCT/JP2019/006855
§ 371(c)(1), (2) Date Aug. 13, 2021,
PCT Pub. No. WO2020/170441, PCT Pub. Date Aug. 27, 2020.
Prior Publication US 2022/0165293 A1, May 26, 2022
Int. Cl. G10L 25/63 (2013.01); G06T 7/20 (2017.01); G06T 13/40 (2011.01); G10L 13/00 (2006.01); G10L 15/18 (2013.01); G10L 21/06 (2013.01)
CPC G10L 25/63 (2013.01) [G06T 7/20 (2013.01); G06T 13/40 (2013.01); G10L 13/00 (2013.01); G10L 15/1815 (2013.01); G10L 21/06 (2013.01)] 12 Claims
OG exemplary drawing
 
1. An information processing apparatus, comprising:
circuitry configured to:
determine, based on a result of analysis of a first utterance sentence of a plurality of utterance sentences, a first emotion corresponding to the first utterance sentence of a character included in a scenario;
select a response attitude pattern from a plurality of response attitude patterns, based on each of the determined first emotion and an intimacy between the character and a user, wherein each of the plurality of response attitude patterns is mapped with corresponding emotion levels and intimacy levels;
select, based on a content of the first utterance sentence and the determined first emotion, a first motion of a plurality of motions for the character that is synchronized with the first utterance sentence;
adjust a movement speed of the selected first motion based on the selected response attitude pattern; and
add, to the scenario, a description for adjustment of presentation of the selected first motion to match a voice output timing of the first utterance sentence.
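 
The claim describes a pipeline: determine an emotion from an utterance, map the emotion and the user-character intimacy to a response attitude pattern, select a motion, scale its movement speed by the pattern, and record a description in the scenario so the motion lines up with voice output timing. The sketch below is a minimal, hypothetical illustration of that flow in Python; all names (EmotionLevel, IntimacyLevel, PATTERN_TABLE, build_scenario_entry, the speed factors, and the keyword-based emotion stub) are assumptions for illustration and are not defined in the patent.

```python
from dataclasses import dataclass, field
from enum import Enum


# Hypothetical levels; the claim only requires that patterns be mapped
# with "corresponding emotion levels and intimacy levels".
class EmotionLevel(Enum):
    CALM = 0
    HAPPY = 1
    ANGRY = 2


class IntimacyLevel(Enum):
    LOW = 0
    HIGH = 1


@dataclass
class ResponseAttitudePattern:
    name: str
    speed_factor: float  # multiplier applied to the motion's movement speed


# Mapping of (emotion level, intimacy level) -> response attitude pattern.
PATTERN_TABLE = {
    (EmotionLevel.CALM, IntimacyLevel.LOW): ResponseAttitudePattern("reserved", 0.8),
    (EmotionLevel.CALM, IntimacyLevel.HIGH): ResponseAttitudePattern("relaxed", 1.0),
    (EmotionLevel.HAPPY, IntimacyLevel.HIGH): ResponseAttitudePattern("friendly", 1.2),
    (EmotionLevel.ANGRY, IntimacyLevel.LOW): ResponseAttitudePattern("formal", 0.9),
}


@dataclass
class Scenario:
    descriptions: list = field(default_factory=list)


def determine_emotion(utterance: str) -> EmotionLevel:
    """Stand-in for the utterance-sentence analysis in the first claim step."""
    return EmotionLevel.HAPPY if "!" in utterance else EmotionLevel.CALM


def select_pattern(emotion: EmotionLevel, intimacy: IntimacyLevel) -> ResponseAttitudePattern:
    """Pick a response attitude pattern from the emotion/intimacy mapping."""
    return PATTERN_TABLE.get((emotion, intimacy), ResponseAttitudePattern("neutral", 1.0))


def select_motion(utterance: str, emotion: EmotionLevel) -> str:
    """Choose a motion keyed on utterance content and the determined emotion."""
    return "nod_enthusiastically" if emotion is EmotionLevel.HAPPY else "nod"


def build_scenario_entry(utterance: str, intimacy: IntimacyLevel, scenario: Scenario) -> None:
    emotion = determine_emotion(utterance)
    pattern = select_pattern(emotion, intimacy)
    motion = select_motion(utterance, emotion)
    speed = 1.0 * pattern.speed_factor  # adjust movement speed per the selected pattern
    # Add a description so presentation of the motion matches the
    # voice output timing of the utterance sentence.
    scenario.descriptions.append(
        {
            "utterance": utterance,
            "motion": motion,
            "speed": speed,
            "sync": "voice_output_timing",
        }
    )


if __name__ == "__main__":
    scenario = Scenario()
    build_scenario_entry("Nice to see you again!", IntimacyLevel.HIGH, scenario)
    print(scenario.descriptions)
```

In this reading, the response attitude pattern acts purely as a modifier (here, a speed factor) layered over a motion chosen independently from content and emotion, which is one plausible way to reconcile the separate "select" and "adjust" steps of the claim.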