CPC A63F 13/57 (2014.09) [G06F 3/017 (2013.01); G06F 18/22 (2023.01); G06V 20/20 (2022.01); G06V 40/23 (2022.01)] | 19 Claims |
1. A method for sharing movement data, the method comprising:
receiving sensor data from one or more sensors associated with a user during an interactive session, wherein the sensor data is captured during a movement by the user in a real-world environment;
analyzing the sensor data to generate metadata characterizing the movement as a series of sub-movements;
receiving user input specifying that one or more specified sub-movements of the series of sub-movements are associated with a specified audio-visual effect in a virtual environment of the interactive session;
rendering a corresponding movement by a virtual character within the virtual environment based on the sensor data;
capturing video of the corresponding movement within the virtual environment, wherein the captured video is associated with the sensor data;
providing, to a recipient device of a recipient designated by the user, a movement profile that includes the captured video and the specified audio-visual effect associated with the specified sub-movements characterized by the metadata generated from the sensor data; and
verifying that the recipient is performing the specified sub-movements by comparing data captured of the recipient during playback of the captured video to the sensor data associated with the captured video within the movement profile, wherein the specified audio-visual effect is triggered in the virtual environment based on the verification.
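The claim recites segmenting a movement into sub-movements and later verifying a recipient's performance against the stored sensor data, but does not disclose a specific algorithm for either step. As a rough, non-authoritative sketch, one could imagine a threshold-based segmenter over a motion-energy signal and a resample-and-compare verifier; all function names, thresholds, and the representation of "sensor data" as a 1-D sample sequence are hypothetical assumptions, not taken from the claim.

```python
def segment_sub_movements(samples, threshold=0.5, min_len=3):
    """Hypothetical segmenter: split a 1-D motion-energy signal into
    sub-movements, where a sub-movement is a maximal run of samples
    whose energy exceeds `threshold`; runs shorter than `min_len`
    are discarded. Returns (start, end) index pairs, end exclusive."""
    segments = []
    start = None
    for i, v in enumerate(samples):
        if v > threshold and start is None:
            start = i                       # run begins
        elif v <= threshold and start is not None:
            if i - start >= min_len:
                segments.append((start, i)) # run ends, keep if long enough
            start = None
    if start is not None and len(samples) - start >= min_len:
        segments.append((start, len(samples)))
    return segments


def verify_sub_movement(reference, candidate, tolerance=0.2):
    """Hypothetical verifier: linearly resample the recipient's
    `candidate` samples to the length of the stored `reference`
    sensor data, then accept when the mean absolute deviation is
    within `tolerance`."""
    n, m = len(reference), len(candidate)
    resampled = []
    for i in range(n):
        t = i * (m - 1) / (n - 1) if n > 1 else 0.0
        lo = int(t)
        hi = min(lo + 1, m - 1)
        frac = t - lo
        resampled.append(candidate[lo] * (1 - frac) + candidate[hi] * frac)
    deviation = sum(abs(r, ) if False else abs(r - c) for r, c in zip(reference, resampled)) / n
    return deviation <= tolerance
```

For example, `segment_sub_movements([0, 0, 1, 1, 1, 1, 0, 0, 1, 1, 1, 0])` yields two sub-movements, `[(2, 6), (8, 11)]`, and a recipient trace identical to the reference verifies while a widely divergent one does not. A production system would more plausibly use pose-sequence features and an elastic matcher such as dynamic time warping rather than this fixed-length comparison.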