US 12,443,285 B2
Human-machine interaction method and human-machine interaction apparatus
Shuaihua Peng, Shanghai (CN); and Hao Wu, Shanghai (CN)
Assigned to Shenzhen Yinwang Intelligent Technologies Co., Ltd.
Filed by SHENZHEN YINWANG INTELLIGENT TECHNOLOGIES CO., LTD., Shenzhen (CN)
Filed on Jun. 30, 2023, as Appl. No. 18/345,631.
Application 18/345,631 is a continuation of application No. PCT/CN2021/070188, filed on Jan. 4, 2021.
Prior Publication US 2023/0350498 A1, Nov. 2, 2023
Int. Cl. G06F 3/01 (2006.01); B60K 35/00 (2024.01); B60K 35/28 (2024.01); B60K 35/29 (2024.01); B60W 60/00 (2020.01); G06F 3/03 (2006.01)
CPC G06F 3/017 (2013.01) [B60K 35/00 (2013.01); B60W 60/00253 (2020.02); G06F 3/0304 (2013.01); B60K 35/28 (2024.01); B60K 35/29 (2024.01); B60K 2360/176 (2024.01); B60K 2360/191 (2024.01)] 20 Claims
OG exemplary drawing
 
1. A human-machine interaction method, comprising:
obtaining motion track information of a mobile terminal, wherein the motion track information is obtained by using a motion sensor of the mobile terminal;
in response to determining that a predefined operation is performed on the mobile terminal, obtaining first gesture action information of a user, wherein the first gesture action information is obtained by using an optical sensor of an object device that interacts with the user, wherein the first gesture action information comprises gesture action form information and gesture action time information, and the motion track information comprises motion track form information and motion track time information;
determining whether a similarity between a first form of a motion of the mobile terminal and a second form of a gesture exists by processing the gesture action form information and the motion track form information using a machine learning model;
determining whether a consistency between a first time of the motion of the mobile terminal and a second time of the gesture exists by comparing a preset threshold with a difference between the gesture action time information and the motion track time information;
determining that the first gesture action information matches the motion track information in response to determining that the similarity exists and the consistency exists; and
executing first control when the first gesture action information matches the motion track information, wherein the first control comprises control executed according to a control instruction corresponding to the first gesture action information.
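The first two steps of claim 1 obtain motion track information from the mobile terminal's motion sensor and, once a predefined operation on the terminal is detected, gesture action information from the object device's optical sensor. The sketch below illustrates one way such a predefined operation might be recognized from raw accelerometer samples; the shake heuristic, sample format, and thresholds are illustrative assumptions and not the patented implementation.

```python
from typing import Iterable, Tuple
import math

# Hypothetical accelerometer samples: (timestamp_s, ax, ay, az) in m/s^2.
Sample = Tuple[float, float, float, float]

def is_predefined_operation(samples: Iterable[Sample],
                            magnitude_threshold: float = 18.0,
                            min_peaks: int = 3) -> bool:
    """Treat a burst of high-magnitude acceleration peaks as the predefined
    operation (for example, the user shaking the mobile terminal)."""
    peaks = 0
    for _, ax, ay, az in samples:
        if math.sqrt(ax * ax + ay * ay + az * az) > magnitude_threshold:
            peaks += 1
    return peaks >= min_peaks
```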
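The matching steps of claim 1 combine a form-similarity test (via a machine learning model) with a time-consistency test (via a preset threshold). The following minimal sketch uses cosine similarity as a stand-in for the model's score and plain timestamp comparison for the consistency check; the feature layout, threshold values, and scoring function are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Sequence
import math

@dataclass
class TrackInfo:
    form: Sequence[float]   # motion track form information (feature vector)
    time: float             # motion track time information (seconds)

@dataclass
class GestureInfo:
    form: Sequence[float]   # gesture action form information (feature vector)
    time: float             # gesture action time information (seconds)

def form_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    """Cosine similarity stands in for the machine learning model's output."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def matches(track: TrackInfo, gesture: GestureInfo,
            sim_threshold: float = 0.8, time_threshold: float = 0.5) -> bool:
    """The gesture matches the motion track when the forms are similar enough
    and the time difference is within the preset threshold."""
    similar = form_similarity(track.form, gesture.form) >= sim_threshold
    consistent = abs(track.time - gesture.time) <= time_threshold
    return similar and consistent

# Example: a wrist flick captured both by the terminal's motion sensor and by
# the object device's optical sensor at nearly the same moment.
if __name__ == "__main__":
    track = TrackInfo(form=[0.9, 0.1, 0.3], time=12.40)
    gesture = GestureInfo(form=[0.85, 0.15, 0.28], time=12.55)
    if matches(track, gesture):
        print("gesture action information matches the motion track information")
```

When `matches` returns `True`, the caller would proceed to execute the first control, i.e., the control instruction corresponding to the recognized gesture.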