US 12,259,733 B2
Spatial blind spot monitoring systems and related methods of use
Ajay Vishnu, Gurgaon (IN); Arijit Saha, Kolkata (IN); and Rohit Verma, Gurgaon (IN)
Assigned to ANRAM HOLDINGS, Mississauga (CA)
Filed by Anram Holdings, Oakville (CA)
Filed on Oct. 25, 2021, as Appl. No. 17/510,188.
Claims priority of application No. 202011046475 (IN), filed on Oct. 24, 2020.
Prior Publication US 2022/0129006 A1, Apr. 28, 2022
Int. Cl. G05D 1/00 (2024.01); G01S 17/87 (2020.01); G01S 17/89 (2020.01); G01S 17/931 (2020.01)
CPC G05D 1/0255 (2013.01) [G01S 17/87 (2013.01); G01S 17/89 (2013.01); G01S 17/931 (2020.01); G05D 1/0088 (2013.01); G05D 1/0219 (2013.01); G05D 1/0223 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method of controlling a robot for autonomous navigation, the method comprising:
receiving, by a controller, a set of point values defining light detection and ranging (LIDAR) data from a LIDAR sensor having a two-dimensional (2D) field of view, the point values being distance values received based on the LIDAR sensor performing a scan of an environment, wherein the scan is performed in an omnidirectional plane comprising the 2D field of view;
receiving, by the controller, a sensor value from an ultrasonic sensor having a three-dimensional (3D) field of view, the sensor value being a measure of distance to an object and received based on the ultrasonic sensor scanning the environment, wherein the 3D field of view excludes the omnidirectional plane;
resolving, by the controller, an observable field of view for the LIDAR sensor, wherein the observable field of view is resolved based on a sensor distance between the LIDAR sensor and the ultrasonic sensor, the received sensor value, and the 3D field of view of the ultrasonic sensor;
comparing, by the controller, the sensor value with one or more point values in the set, the one or more point values corresponding to a portion of the omnidirectional plane, wherein the portion extends along the observable field of view;
determining, by the controller, whether the object is located in a region outside the 2D field of view of the LIDAR sensor based on the comparison, wherein the object is determined to be located in the region outside the 2D field of view based on the sensor value being less than the one or more point values in the set;
modifying, by the controller, the LIDAR data based on the object being located in the region outside the 2D field of view, the LIDAR data being modified by replacing the one or more point values in the set with the sensor value, wherein the modified LIDAR data indicates the object being detected by the LIDAR sensor despite the object being located outside the 2D field of view; and
generating, by the controller, a control signal based on the modified LIDAR data, wherein the control signal triggers a motor for manipulating an orientation of the robot towards a path away from the object.
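
The sketch below is a minimal, illustrative reading of the fusion steps recited in claim 1, not the patentee's implementation. It assumes a 360-degree planar LIDAR scan stored as an array of ranges indexed by bearing and a single ultrasonic sensor with a conical 3D field of view mounted at a known offset from the LIDAR; all function names, parameters, and the simplified geometry (cone axis pointing outward along the line joining the two sensors) are assumptions introduced here for illustration.

```python
"""Illustrative sketch of the blind-spot fusion recited in claim 1.

Assumptions (not from the patent): the LIDAR scan is an array of N range
bins covering 360 degrees, the ultrasonic cone points outward along the
line joining the two sensors, and the "observable field of view" is the
angular sector of the LIDAR plane covered by the cone's projection.
"""
import math
import numpy as np


def fuse_blind_spot(
    lidar_ranges: np.ndarray,      # shape (N,), metres; bin i covers bearing 360*i/N deg
    ultra_range: float,            # ultrasonic distance reading, metres
    sensor_offset: float,          # mounting distance between LIDAR and ultrasonic, metres
    ultra_bearing_deg: float,      # bearing of the ultrasonic axis in the LIDAR frame, deg
    ultra_half_angle_deg: float,   # half-angle of the ultrasonic cone, deg
) -> np.ndarray:
    """Return a copy of lidar_ranges with blind-spot detections merged in."""
    n = lidar_ranges.size
    fused = lidar_ranges.copy()

    # Translate the ultrasonic reading into the LIDAR frame (simplified:
    # the planar distance from the LIDAR is the mounting offset plus the reading).
    range_in_lidar_frame = sensor_offset + ultra_range

    # "Observable field of view": angular half-width, as seen from the LIDAR,
    # of the ultrasonic cone's footprint at the measured range.
    half_width_deg = math.degrees(
        math.atan2(ultra_range * math.tan(math.radians(ultra_half_angle_deg)),
                   range_in_lidar_frame)
    )

    # LIDAR bins whose bearings fall inside that sector (wrapping 0..360 deg).
    bearings = np.arange(n) * (360.0 / n)
    diff = (bearings - ultra_bearing_deg + 180.0) % 360.0 - 180.0
    sector = np.abs(diff) <= half_width_deg

    # Compare and replace: only where the ultrasonic sensor reports something
    # closer than the LIDAR does in that sector are the point values
    # overwritten, i.e. the "object outside the 2D field of view" case.
    closer = sector & (range_in_lidar_frame < fused)
    fused[closer] = range_in_lidar_frame
    return fused
```

In the claim's final step, the fused scan would be handed to whatever obstacle-avoidance or path-planning routine already consumes raw LIDAR data, which then issues the motor command steering the robot away from the detected object; that control logic is outside this sketch.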