US 12,455,367 B2
Ego motion estimation based on consecutive time frames input to machine learning model
Hyunwoong Cho, Seoul (KR); and Sungdo Choi, Suwon-si (KR)
Assigned to Samsung Electronics Co., Ltd., Suwon-si (KR)
Filed by Samsung Electronics Co., Ltd., Suwon-si (KR)
Filed on Mar. 13, 2019, as Appl. No. 16/351,689.
Claims priority of application No. 10-2018-0073709 (KR), filed on Jun. 27, 2018.
Prior Publication US 2020/0003886 A1, Jan. 2, 2020
Int. Cl. G01S 13/58 (2006.01); G01S 13/60 (2006.01); G01S 13/62 (2006.01); G06N 3/02 (2006.01); G06N 3/045 (2023.01); G06N 3/084 (2023.01); G06N 20/00 (2019.01); G06N 20/20 (2019.01)
CPC G01S 13/588 (2013.01) [G01S 13/58 (2013.01); G01S 13/589 (2013.01); G01S 13/60 (2013.01); G01S 13/62 (2013.01); G06N 3/02 (2013.01); G06N 3/045 (2023.01); G06N 3/084 (2013.01); G06N 20/00 (2019.01); G06N 20/20 (2019.01)] 25 Claims
OG exemplary drawing
 
1. A processor-implemented ego motion estimation method of an electronic apparatus, the method comprising:
generating input data based on radar sensing data, collected by one or more radar sensors, for each of a plurality of time frames;
for each of the plurality of time frames, estimating ego motion information of the time frame by executing a machine learning motion recognition model using the input data of the time frame and the input data of a previous time frame of the plurality of time frames as inputs, the machine learning motion recognition model generating extracted feature data of the time frame and generating estimated respective ego motion information of the time frame dependent on the extracted feature data of the time frame; and
recognizing a motion of an object, exterior to the electronic apparatus, based on the estimated respective ego motion information of the plurality of time frames,
wherein, for each of the plurality of time frames, the estimated respective ego motion information of the time frame includes information of a position and/or a pose of the electronic apparatus relative to the object.
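For illustration only, and not part of the issued claim: below is a minimal PyTorch sketch of the pipeline the claim describes, assuming the radar input data is rendered as 2-channel range-Doppler maps. The class name, tensor shapes, network layers, and the (dx, dy, dtheta) ego motion parameterization are all assumptions for the sketch, not the patented implementation.

```python
# Minimal sketch of the claimed pipeline in PyTorch. All names, tensor
# shapes, and the (dx, dy, dtheta) output parameterization are
# hypothetical illustrations, not the patented implementation.
import torch
import torch.nn as nn

class RadarEgoMotionNet(nn.Module):
    """Hypothetical motion recognition model: takes the input data of a
    time frame and of the previous time frame, extracts feature data,
    and regresses ego motion information."""
    def __init__(self, in_channels: int = 2, feat_dim: int = 64):
        super().__init__()
        # Feature extractor over the stacked (previous, current) frames.
        self.features = nn.Sequential(
            nn.Conv2d(2 * in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, feat_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Regression head: ego motion as planar translation plus heading
        # change (dx, dy, dtheta) -- one possible position/pose encoding.
        self.head = nn.Linear(feat_dim, 3)

    def forward(self, prev_frame: torch.Tensor, curr_frame: torch.Tensor) -> torch.Tensor:
        x = torch.cat([prev_frame, curr_frame], dim=1)  # consecutive frames as joint input
        feat = self.features(x).flatten(1)              # extracted feature data of the frame
        return self.head(feat)                          # estimated ego motion information

# Usage: input data generated from radar sensing data per time frame,
# here assumed to be 2-channel 64x64 range-Doppler maps.
model = RadarEgoMotionNet()
prev_frame = torch.randn(1, 2, 64, 64)
curr_frame = torch.randn(1, 2, 64, 64)
ego_motion = model(prev_frame, curr_frame)  # shape (1, 3)
# Per-frame estimates like this can be accumulated across the plurality
# of time frames to help recognize motion of an exterior object.
```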