CPC G06T 19/006 (2013.01) [G02B 27/0172 (2013.01); G06T 7/73 (2017.01); G06V 40/10 (2022.01); G10L 15/26 (2013.01); H04N 13/111 (2018.05); H04N 13/207 (2018.05); H04N 13/332 (2018.05); G02B 2027/0138 (2013.01); G02B 2027/0178 (2013.01); G06T 2207/30196 (2013.01); H04N 2213/008 (2013.01)] | 15 Claims |
1. Eyewear, comprising:
a frame;
an optical member supported by the frame;
a display coupled to the optical member;
a microphone supported by the frame; and
a processor configured to:
determine a position of the eyewear as an (x, y, z) coordinate in a shared environment with respect to one or more objects within the shared environment via a simultaneous localization and mapping (SLAM) algorithm, wherein the shared environment is a physical environment;
detect a remote eyewear device of a person in the shared environment;
receive an input from the remote eyewear device indicative of a position of the remote eyewear device as an (x, y, z) coordinate in the shared environment;
receive an input from the remote eyewear device indicative of an identity of the person;
receive an input from the remote eyewear device that is text indicative of speech of the person in a second language;
translate the text in the second language to text in a first language that is different from the second language;
display the text in the first language proximate the person of the remote eyewear device on the display;
transcribe speech of a user of the eyewear to text in the first language using the microphone;
transmit text indicative of the transcribed speech of the user to the remote eyewear device; and
transmit a position of the eyewear as an (x, y, z) coordinate in the shared environment to the remote eyewear device.
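The claimed exchange can be sketched as a simple message protocol between two devices. The sketch below is purely illustrative: all class and function names are hypothetical, the phrase table stands in for a real translation service, and the fixed coordinates stand in for poses a real device would obtain via SLAM.

```python
from dataclasses import dataclass

@dataclass
class EyewearMessage:
    position: tuple   # (x, y, z) coordinate in the shared environment
    identity: str     # identity of the sending device's wearer
    text: str         # transcribed speech, in the sender's language
    language: str     # language code of `text`

# Toy phrase table standing in for a real translation model (assumption).
PHRASES = {("es", "en"): {"hola": "hello"}}

def translate(text: str, src: str, dst: str) -> str:
    """Translate `text` from `src` to `dst`; pass through if unknown."""
    if src == dst:
        return text
    return PHRASES.get((src, dst), {}).get(text, text)

class Eyewear:
    def __init__(self, identity, language, position):
        self.identity = identity
        self.language = language   # the wearer's "first language"
        self.position = position   # would come from SLAM on a real device
        self.display = []          # (anchor_position, caption) pairs

    def transcribe(self, speech: str) -> EyewearMessage:
        # Stand-in for microphone capture plus speech-to-text; the
        # outgoing message carries position, identity, and text.
        return EyewearMessage(self.position, self.identity,
                              speech, self.language)

    def receive(self, msg: EyewearMessage):
        # Translate the remote wearer's text into the local first
        # language and anchor the caption at the remote device's
        # (x, y, z) position, i.e. proximate that person.
        caption = translate(msg.text, msg.language, self.language)
        self.display.append((msg.position, f"{msg.identity}: {caption}"))

# Usage: a Spanish speaker at (2, 0, 1) greets an English speaker at origin.
alice = Eyewear("Alice", "en", (0.0, 0.0, 0.0))
bob = Eyewear("Bob", "es", (2.0, 0.0, 1.0))
alice.receive(bob.transcribe("hola"))
print(alice.display)  # [((2.0, 0.0, 1.0), 'Bob: hello')]
```

Because each message bundles position with text, the receiving device can place the translated caption at the sender's location in the shared environment rather than at a fixed screen position.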