CPC G06F 3/042 (2013.01); 6 Claims

1. A method for recognizing a touch of an object by a server, the method comprising the steps of:
(a) extracting, on the basis of sensing information about a plurality of objects sensed through a LiDAR sensor, coordinates of the plurality of objects for a plurality of frames including a first frame and a second frame that is a previous frame continuous with the first frame;
(b) calculating a first distance between coordinates of a plurality of first objects extracted from the first frame;
(c) performing clustering by comparing the first distance with a preset first distance threshold value;
(d) setting a first cluster generated by performing clustering as a first touch object of the first frame and calculating first coordinates of the first touch object;
(e) generating a touch object list by using the first touch object, the first coordinates of the first touch object, and the number of the plurality of first objects included in the first touch object;
(f) removing the first touch object from the touch object list based on the number of the plurality of first objects being equal to or less than a preset number;
(g) determining a state of the first touch object by using the first touch object and the first coordinates;
(h) converting the first coordinates to real coordinates to determine whether the first coordinates are located in a preset effective area; and
(i) displaying the first touch object at the real coordinates if it is determined that the first touch object is located in the effective area,
wherein the step (g) comprises:
(g-1) determining whether a second touch object exists in the second frame,
(g-2) based on the second touch object existing in the second frame, calculating a second distance between the first coordinates of the first touch object in the first frame and second coordinates of the second touch object in the second frame,
(g-3) based on at least one of a result of determining whether the second touch object exists in the second frame and the calculated second distance, determining a state of the first touch object as one of a first enter state, a second enter state, a stay state, and a move state,
wherein the first enter state indicates that no sensing information exists in the second frame and a new touch object is recognized in the first frame,
wherein the second enter state indicates that the first touch object is located in a different area from the second touch object and a new touch object corresponding to the first touch object is recognized in the first frame,
wherein the stay state indicates that a touch object for which sensing information is present continues to exist on a same area,
wherein the move state indicates that the first touch object has moved from an area on which the second touch object is located to an area on which the first touch object is located,
wherein the plurality of frames include a third frame that is a next frame continuous with the first frame, and
wherein the step (g) further comprises the steps of:
determining whether a third touch object exists in the third frame that is the next frame continuous with the first frame;
based on the third touch object not existing in the third frame, determining the state of the first touch object as an exit state, wherein the exit state indicates that the first touch object no longer exists in the third frame,
based on the third touch object existing in the third frame, calculating a third distance between the first coordinates of the first touch object and third coordinates of the third touch object; and
determining the state of the first touch object by comparing the third distance with a preset third threshold value.
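The clustering portion of the claim, steps (a) through (f), can be illustrated by the following minimal Python sketch. It is not the patented implementation: the union-find grouping, the dictionary layout of a touch object, and the threshold values (DISTANCE_THRESHOLD, MIN_POINTS) are all assumptions chosen for illustration of distance-based clustering of per-frame LiDAR coordinates, centroid calculation, and removal of clusters with too few member objects.

```python
# Hypothetical sketch of claim 1, steps (a)-(f): cluster one frame's
# LiDAR-detected coordinates by pairwise distance, take each cluster as a
# touch object located at its centroid, and drop clusters that contain a
# number of objects equal to or less than a preset number.
from itertools import combinations

DISTANCE_THRESHOLD = 0.05   # preset first distance threshold (assumed units)
MIN_POINTS = 3              # preset minimum object count per touch object

def cluster_frame(points):
    """points: list of (x, y) coordinates extracted from one LiDAR frame."""
    # Union-find grouping: merge any two points closer than the threshold.
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i, j in combinations(range(len(points)), 2):
        dx = points[i][0] - points[j][0]
        dy = points[i][1] - points[j][1]
        if (dx * dx + dy * dy) ** 0.5 <= DISTANCE_THRESHOLD:
            parent[find(i)] = find(j)

    clusters = {}
    for i in range(len(points)):
        clusters.setdefault(find(i), []).append(points[i])

    # Each surviving cluster becomes one entry of the touch object list.
    touch_objects = []
    for members in clusters.values():
        if len(members) <= MIN_POINTS:      # step (f): remove small clusters
            continue
        cx = sum(p[0] for p in members) / len(members)
        cy = sum(p[1] for p in members) / len(members)
        touch_objects.append({"coords": (cx, cy), "count": len(members)})
    return touch_objects
```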
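The state determination of step (g) and its wherein clauses can likewise be sketched as a small decision routine. One plausible combination of the second-frame and third-frame comparisons is shown; the enum names, the second distance threshold, and the stay tolerance are hypothetical, not values recited in the claim.

```python
# Hypothetical sketch of claim 1, step (g): assign one of the recited states
# by comparing the touch object of the current (first) frame against the
# previous (second) frame and the next (third) frame.
from enum import Enum, auto
from math import dist

class TouchState(Enum):
    FIRST_ENTER = auto()   # no sensing information in the previous frame
    SECOND_ENTER = auto()  # previous touch exists but in a different area
    STAY = auto()          # touch continues to exist on the same area
    MOVE = auto()          # touch moved from the previous area
    EXIT = auto()          # touch no longer exists in the next frame

SECOND_DISTANCE_THRESHOLD = 0.10  # hypothetical threshold for "different area"
STAY_TOLERANCE = 0.01             # hypothetical distance treated as "same area"

def determine_state(first_coords, second_coords, third_coords):
    """Each argument is an (x, y) tuple for the touch object in the current,
    previous, and next frames, or None when no touch object exists there."""
    if third_coords is None:
        return TouchState.EXIT                  # no touch object in next frame
    if second_coords is None:
        return TouchState.FIRST_ENTER           # nothing sensed previously
    second_distance = dist(first_coords, second_coords)
    if second_distance > SECOND_DISTANCE_THRESHOLD:
        return TouchState.SECOND_ENTER          # different area: new touch
    if second_distance <= STAY_TOLERANCE:
        return TouchState.STAY                  # remains on the same area
    return TouchState.MOVE                      # moved within the same touch
```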
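Finally, steps (h) and (i) (converting to real coordinates, checking the preset effective area, and displaying) could look like the sketch below, reusing the touch object dictionaries from the first sketch. The affine scale/offset mapping and the area bounds are assumed values for illustration only.

```python
# Hypothetical sketch of claim 1, steps (h)-(i): convert the touch coordinates
# from the sensing plane to real (display) coordinates and display the touch
# object only when it lies inside the preset effective area.
SCALE_X, SCALE_Y = 1920 / 2.0, 1080 / 1.2    # assumed sensing-to-pixel scale
OFFSET_X, OFFSET_Y = 0.0, 0.0                # assumed origin offset
EFFECTIVE_AREA = (0, 0, 1920, 1080)          # (x_min, y_min, x_max, y_max)

def to_real_coordinates(coords):
    x, y = coords
    return (x * SCALE_X + OFFSET_X, y * SCALE_Y + OFFSET_Y)

def display_if_effective(touch_object):
    rx, ry = to_real_coordinates(touch_object["coords"])
    x_min, y_min, x_max, y_max = EFFECTIVE_AREA
    if x_min <= rx <= x_max and y_min <= ry <= y_max:
        # Step (i): display the touch object at the converted real coordinates.
        print(f"touch at real coordinates ({rx:.1f}, {ry:.1f})")
        return True
    return False
```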