US 12,093,459 B2
Information processing apparatus, information processing method, and non-transitory computer readable media storing a program
Mamoru Miyawaki, Tokyo (JP); Hiroyuki Shinoda, Tokyo (JP); Takaaki Kamigaki, Tokyo (JP); Mitsuru Ito, Tokyo (JP); Tao Morisaki, Tokyo (JP); Shun Suzuki, Tokyo (JP); Atsushi Matsubayashi, Tokyo (JP); Ryoya Onishi, Tokyo (JP); Yasutoshi Makino, Tokyo (JP); and Seki Inoue, Tokyo (JP)
Assigned to THE UNIVERSITY OF TOKYO, (JP)
Filed by THE UNIVERSITY OF TOKYO, Tokyo (JP)
Filed on Dec. 8, 2022, as Appl. No. 18/077,393.
Claims priority of application No. 2022-156628 (JP), filed on Sep. 29, 2022.
Prior Publication US 2024/0111366 A1, Apr. 4, 2024
Int. Cl. G06F 3/01 (2006.01); A63F 13/285 (2014.01); G06F 3/04817 (2022.01); G06F 3/0484 (2022.01); G06F 3/16 (2006.01); G06T 7/20 (2017.01); G06T 7/40 (2017.01); G06T 7/70 (2017.01); G06T 11/00 (2006.01); G08B 6/00 (2006.01)
CPC G06F 3/016 (2013.01) [G06F 3/011 (2013.01); G06F 3/165 (2013.01); G06F 3/167 (2013.01); G06T 7/20 (2013.01); G06T 7/70 (2017.01); G06T 11/00 (2013.01); A63F 13/285 (2014.09); A63F 2300/8082 (2013.01); G06F 3/04817 (2013.01); G06F 3/0484 (2013.01); G06T 7/40 (2013.01); G06T 2207/30196 (2013.01); G08B 6/00 (2013.01)] 19 Claims
OG exemplary drawing
 
1. An information processing apparatus comprising a processor configured to execute a program so as to:
as an information-acquiring unit, acquire viewpoint information, motion information, an object image representing an object, and object information relating to the object, wherein:
the viewpoint information includes information indicating a field of view of a user,
the motion information is information indicating a motion of a target irradiated with the ultrasound, and
the object information includes at least one of a coordinate, a shape, a surface condition, solidity, a temperature, a mass, and a friction coefficient of the object;
as an image-processing unit, process the object image based on the object information, the viewpoint information, and the motion information so as to update a position, a direction, or the shape of the object according to physical interaction;
as a position-processing unit, calculate a position when superimposing a background image and the updated object image, wherein:
the background image is an image according to the field of view of the user;
as a visual-information-generating unit, generate visual information superimposing the background image and the updated object image based on the position calculated by the position-processing unit, and output the generated visual information to a display device; and
as a tactile-information-generating unit, generate tactile information for an ultrasound generator irradiating the user with the ultrasound corresponding to the object based on the position calculated by the position-processing unit, the motion information, and the object information, and output the generated tactile information to the ultrasound generator,
wherein the processor is configured to execute the program so as to:
determine a point of load against the object based on an operation performed on the visual information; and
as the tactile-information-generating unit, generate the tactile information for the ultrasound generator irradiating the user with the ultrasound corresponding to the object based on the physical interaction at the point of load.
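The claim describes a data flow through four processing units: acquire viewpoint/motion/object information, update the object image according to physical interaction, calculate a superimposition position relative to the user's field of view, and derive both visual and tactile (ultrasound) outputs from that position. The following is a rough, non-authoritative sketch of that flow; every function name and the simple vector arithmetic are illustrative assumptions, not the patented method.

```python
def process_object_image(obj_pos, motion_vec):
    """Image-processing unit: update the object's position according to
    the tracked motion of the irradiation target (physical interaction)."""
    return tuple(p + m for p, m in zip(obj_pos, motion_vec))


def calculate_overlay_position(view_origin, obj_pos):
    """Position-processing unit: position at which the object image is
    superimposed on the background image, relative to the user's view."""
    return tuple(o - v for o, v in zip(obj_pos, view_origin))


def generate_tactile_info(overlay_pos, motion_vec, stiffness):
    """Tactile-information-generating unit: ultrasound focal point and an
    intensity scaled by object stiffness and motion speed (illustrative)."""
    speed = sum(m * m for m in motion_vec) ** 0.5
    return {"focus": overlay_pos, "intensity": stiffness * speed}


# Illustrative end-to-end pass through the pipeline.
updated = process_object_image((0.0, 0.0, 0.0), (1.0, 2.0, 3.0))
overlay = calculate_overlay_position((0.0, 0.0, 1.0), updated)
tactile = generate_tactile_info(overlay, (3.0, 4.0, 0.0), stiffness=2.0)
```

In this sketch the same calculated position (`overlay`) feeds both the visual compositing step and the tactile generation, mirroring the claim's requirement that the visual-information-generating unit and the tactile-information-generating unit both operate on the position calculated by the position-processing unit.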