CPC G06T 19/006 (2013.01) [B60L 3/12 (2013.01); B60L 15/2036 (2013.01); G01C 22/025 (2013.01); G02B 27/017 (2013.01); G01C 21/365 (2013.01); G02B 2027/0178 (2013.01); G06F 3/04842 (2013.01)]

20 Claims

1. A method of providing an interactive personal mobility system, performed by one or more processors, comprising:
determining an initial pose of a wearable augmented reality device by visual-inertial odometry performed on images and inertial measurement unit (IMU) data generated by the wearable augmented reality device moving through a real-world environment;
receiving, by the one or more processors, motion sensor data transmitted from a personal mobility system;
performing sensor fusion on the motion sensor data received from the personal mobility system and data for the initial pose, thereby generating an updated pose;
displaying a particular virtual object on the wearable augmented reality device based on the updated pose, the display of the particular virtual object being fixed relative to a frame of reference of the real-world environment;
determining a relative position and orientation of the personal mobility system and the wearable augmented reality device based in part on data transmitted from the personal mobility system;
displaying a further virtual object on the wearable augmented reality device, the display of the further virtual object being fixed relative to a frame of reference of the personal mobility system;
detecting interaction of a user's body part with the further virtual object; and
in response to detecting the interaction, altering a performance characteristic of the personal mobility system.
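Claim 1 recites a pipeline of pose estimation, sensor fusion, dual-frame rendering, and interaction-driven control. The sketch below is only an illustration of how such steps could fit together, not the implementation disclosed in the specification: it assumes a planar pose model and hypothetical helper names (se2, fuse_pose, pose_in_device_frame, hand_touches), and it stands in simple dead-reckoning for whatever sensor-fusion method the patent actually describes.

```python
"""Illustrative sketch of the claimed steps under simplified, assumed conditions."""

import numpy as np


def se2(x, y, theta):
    """Homogeneous 2-D rigid transform (a planar model is enough for a ground vehicle sketch)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])


def fuse_pose(T_world_device, pms_speed, pms_yaw_rate, dt):
    """Update the device pose between VIO estimates by dead-reckoning with the
    personal mobility system's motion sensors (one possible fusion strategy)."""
    delta = se2(pms_speed * dt, 0.0, pms_yaw_rate * dt)  # incremental motion from PMS sensors
    return T_world_device @ delta


def pose_in_device_frame(T_world_device, T_world_object):
    """Where a virtual object appears relative to the headset, whether the object
    is anchored to the real-world frame or to the PMS frame."""
    return np.linalg.inv(T_world_device) @ T_world_object


def hand_touches(hand_xy, T_world_object, radius=0.15):
    """Crude interaction test: hand within `radius` metres of the object anchor."""
    return np.linalg.norm(hand_xy - T_world_object[:2, 2]) < radius


# --- one toy update cycle ---------------------------------------------------
T_world_device = se2(0.0, 0.0, 0.0)    # initial pose from visual-inertial odometry
T_world_pms = se2(0.3, -0.4, 0.0)      # relative position/orientation of the PMS
T_pms_button = se2(0.1, 0.2, 0.0)      # "further virtual object" fixed to the PMS (e.g. handlebars)
T_world_marker = se2(5.0, 0.0, 0.0)    # "particular virtual object" fixed to the real world

T_world_device = fuse_pose(T_world_device, pms_speed=2.0, pms_yaw_rate=0.1, dt=0.02)
render_marker = pose_in_device_frame(T_world_device, T_world_marker)               # world-locked
render_button = pose_in_device_frame(T_world_device, T_world_pms @ T_pms_button)   # PMS-locked

if hand_touches(np.array([0.39, -0.19]), T_world_pms @ T_pms_button):
    print("limit PMS top speed")  # stand-in for altering a performance characteristic
```

The distinction the claim turns on is the two anchoring frames: the world-locked object stays put as the rider moves, while the PMS-locked object travels with the vehicle, which is what the final transform composition (T_world_pms @ T_pms_button) is meant to illustrate.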