US 11,710,271 B2
Three-dimensional data creation method, three-dimensional data transmission method, three-dimensional data creation device, and three-dimensional data transmission device
Toru Matsunobu, Osaka (JP); Takahiro Nishi, Nara (JP); Tadamasa Toma, Osaka (JP); Toshiyasu Sugio, Osaka (JP); Satoshi Yoshikawa, Hyogo (JP); and Tatsuya Koyama, Kyoto (JP)
Assigned to PANASONIC INTELLECTUAL PROPERTY CORPORATION OF AMERICA, Torrance, CA (US)
Filed by Panasonic Intellectual Property Corporation of America, Torrance, CA (US)
Filed on Sep. 15, 2020, as Appl. No. 17/021,188.
Application 17/021,188 is a continuation of application No. 16/243,764, filed on Jan. 9, 2019, granted, now Pat. No. 10,810,786.
Application 16/243,764 is a continuation of application No. PCT/JP2017/019115, filed on May 23, 2017.
Claims priority of provisional application 62/364,009, filed on Jul. 19, 2016.
Prior Publication US 2020/0410745 A1, Dec. 31, 2020
This patent is subject to a terminal disclaimer.
Int. Cl. G06T 15/08 (2011.01); G06T 7/50 (2017.01); G06T 9/00 (2006.01); G06T 17/00 (2006.01); G08G 1/16 (2006.01); H04L 67/12 (2022.01)
CPC G06T 15/08 (2013.01) [G06T 7/50 (2017.01); G06T 9/00 (2013.01); G06T 17/00 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/20221 (2013.01); G06T 2207/30252 (2013.01); G06T 2210/08 (2013.01); G06T 2210/56 (2013.01); H04L 67/12 (2013.01)] 5 Claims
OG exemplary drawing
 
1. A three-dimensional data generation method performed by a first movable object, the method comprising:
generating first three-dimensional data using a sensor provided on the first movable object, the first three-dimensional data including first three-dimensional points each representing a three-dimensional position;
receiving second three-dimensional data via a network, the second three-dimensional data including second three-dimensional points each representing a three-dimensional position, the second three-dimensional data including three-dimensional points included in an occlusion region of the sensor;
merging the first three-dimensional data with the second three-dimensional data to generate third three-dimensional data, the third three-dimensional data including third three-dimensional points each representing a three-dimensional position;
detecting a traveling speed of a second movable object in a vicinity of the first movable object, using the third three-dimensional data; and
receiving information indicating a traveling speed of a third movable object included in the occlusion region of the sensor, via the network.
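The claimed method above generates a local point cloud from an on-board sensor, receives a second point cloud over a network that covers the sensor's occlusion region, merges the two, and uses the merged data to detect the traveling speed of a nearby object. The following Python sketch illustrates one possible reading of that flow; it is not part of the patent text or the patented implementation, and all names (merge_point_clouds, estimate_speed, the voxel_size parameter, and the sample values) are hypothetical.

    # Hypothetical sketch of the claimed flow; not the patented implementation.
    # Point clouds are represented as (N, 3) NumPy arrays of XYZ positions.
    import numpy as np

    def merge_point_clouds(first: np.ndarray, second: np.ndarray,
                           voxel_size: float = 0.2) -> np.ndarray:
        """Merge locally sensed points (first) with points received over the
        network (second), which include points in the sensor's occlusion
        region. Duplicates are removed by voxel-grid deduplication."""
        merged = np.vstack([first, second])
        # Quantize positions to a voxel grid and keep one point per occupied voxel.
        keys = np.floor(merged / voxel_size).astype(np.int64)
        _, unique_idx = np.unique(keys, axis=0, return_index=True)
        return merged[np.sort(unique_idx)]

    def estimate_speed(centroid_prev: np.ndarray, centroid_curr: np.ndarray,
                       dt: float) -> float:
        """Estimate a nearby object's traveling speed (m/s) from the
        displacement of its point-cluster centroid between two merged
        frames captured dt seconds apart."""
        return float(np.linalg.norm(centroid_curr - centroid_prev) / dt)

    # Illustrative usage: merge one frame, then estimate speed from two
    # observations of the same object 0.1 s apart (values are made up).
    local_pts = np.random.rand(1000, 3) * 50.0          # from the on-board sensor
    remote_pts = np.random.rand(500, 3) * 50.0 + 40.0   # covers the occlusion region
    third_data = merge_point_clouds(local_pts, remote_pts)
    speed = estimate_speed(np.array([10.0, 2.0, 0.0]),
                           np.array([11.2, 2.0, 0.0]), dt=0.1)
    print(third_data.shape, f"{speed:.1f} m/s")

In this sketch the merged array plays the role of the third three-dimensional data, and the speed of an object hidden in the occlusion region would instead be received over the network, as in the final claim step, rather than computed locally.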