US 11,858,118 B2
Robot, server, and human-machine interaction method
Wenhua Sun, Shenzhen (CN); Jiali Fu, Shenzhen (CN); Heng Liao, Shenzhen (CN); and Huimin Zhang, Shenzhen (CN)
Assigned to HUAWEI TECHNOLOGIES CO., LTD., Guangdong (CN)
Filed by HUAWEI TECHNOLOGIES CO., LTD., Guangdong (CN)
Filed on Jun. 28, 2019, as Appl. No. 16/457,676.
Application 16/457,676 is a continuation of application No. PCT/CN2017/119107, filed on Dec. 27, 2017.
Claims priority of application No. 201611267452.1 (CN), filed on Dec. 31, 2016.
Prior Publication US 2019/0337157 A1, Nov. 7, 2019
Int. Cl. B25J 11/00 (2006.01); G06N 20/00 (2019.01); G09B 5/06 (2006.01); G10L 25/63 (2013.01); G06V 40/16 (2022.01)
CPC B25J 11/001 (2013.01) [G06N 20/00 (2019.01); G06V 40/176 (2022.01); G09B 5/065 (2013.01); G10L 25/63 (2013.01)] 14 Claims
OG exemplary drawing
 
1. A human-machine interaction method, executed by a companion robot, comprising:
detecting and collecting, by the companion robot, sensing information of a companion object and emotion information of a target object when the target object interacts with the companion object, wherein each of the emotion information and the sensing information comprises at least one of view information or voice information, and the companion object is a caregiver or a guardian of the target object;
extracting, by the companion robot, an emotion feature quantity based on the emotion information;
determining, by the companion robot, based on the emotion feature quantity, an emotional pattern used by the target object to interact with the companion object;
determining, by the companion robot, based on the emotional pattern, a degree of interest of the target object in the companion object;
extracting, by the companion robot, behavioral data of the companion object from the sensing information based on the degree of interest;
screening, by the companion robot, the behavioral data to extract a behavioral key feature; and
generating, by the companion robot, simulated object data by using the behavioral key feature, wherein the simulated object data is used by a robot to simulate the companion object and is used to describe the companion object;
wherein the behavioral data comprises a body action, the behavioral key feature comprises a body key point or a body action unit, and the behavioral key feature is generated through statistical learning or machine learning; or
wherein the behavioral data comprises an expression, the behavioral key feature comprises a partial face key point or a facial action unit, and the behavioral key feature is generated through pre-specification or machine learning; or
wherein the behavioral data comprises a tone, the behavioral key feature comprises an acoustic signal feature in voice input of the companion object, and the behavioral key feature is generated through pre-specification or machine learning.
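
Claim 1 above describes a linear pipeline: collect emotion and sensing information, extract an emotion feature quantity, map it to an emotional pattern and a degree of interest, then screen the companion object's behavioral data down to key features used to generate simulated object data. The Python sketch below illustrates one possible reading of that pipeline under stated assumptions; every helper name, feature choice, and threshold (smile_intensity, voice_arousal, the 0.6/0.3 cut-offs, the subsampling of key points) is a hypothetical illustration and not the patented implementation.

```python
# Hypothetical sketch of the claimed pipeline; names and thresholds are illustrative only.
from dataclasses import dataclass
from typing import List


@dataclass
class EmotionInfo:
    smile_intensity: float   # derived from view (image) information, 0..1
    voice_arousal: float     # derived from voice information, 0..1


@dataclass
class SensingInfo:
    body_keypoints: List[float]   # e.g. flattened joint positions of the companion object
    voice_pitch: float            # acoustic signal feature of the companion object's voice


def extract_emotion_feature(emotion: EmotionInfo) -> float:
    """Collapse the emotion information into a single emotion feature quantity."""
    return 0.5 * emotion.smile_intensity + 0.5 * emotion.voice_arousal


def determine_emotional_pattern(feature: float) -> str:
    """Map the emotion feature quantity to a coarse emotional pattern."""
    if feature > 0.6:
        return "positive"
    if feature < 0.3:
        return "negative"
    return "neutral"


def degree_of_interest(pattern: str) -> float:
    """Translate the emotional pattern into a degree of interest in the companion object."""
    return {"positive": 1.0, "neutral": 0.5, "negative": 0.1}[pattern]


def extract_behavioral_key_features(sensing: SensingInfo, interest: float) -> List[float]:
    """Keep behavioral data only when interest is high enough, then screen it down
    to behavioral key features (here: subsampled body key points plus voice pitch)."""
    if interest < 0.5:
        return []
    return sensing.body_keypoints[::2] + [sensing.voice_pitch]


def generate_simulated_object_data(key_features: List[float]) -> dict:
    """Package the behavioral key features so a robot could simulate the companion object."""
    return {"companion_model": key_features}


if __name__ == "__main__":
    emotion = EmotionInfo(smile_intensity=0.8, voice_arousal=0.7)
    sensing = SensingInfo(body_keypoints=[0.1, 0.4, 0.2, 0.9], voice_pitch=180.0)

    pattern = determine_emotional_pattern(extract_emotion_feature(emotion))
    interest = degree_of_interest(pattern)
    model = generate_simulated_object_data(extract_behavioral_key_features(sensing, interest))
    print(pattern, interest, model)
```

In the claim the key features may instead be learned (statistical learning, machine learning, or pre-specification); the fixed rules above merely stand in for that learned mapping to keep the sketch self-contained.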