US 12,485,547 B2
Head mounted display for remote operation of machinery
Westin Sykes, Gower, MO (US); Timothy J. Mourlam, Kansas City, KS (US); Aaron Beck, Kansas City, MO (US); and William Naber, Saint Joseph, MO (US)
Assigned to Altec Industries, Inc., Birmingham, AL (US)
Filed by Altec Industries, Inc., Birmingham, AL (US)
Filed on Mar. 6, 2024, as Appl. No. 18/597,033.
Application 18/597,033 is a continuation of application No. 16/860,176, filed on Apr. 28, 2020, granted, now Pat. No. 11,945,123.
Prior Publication US 2024/0208066 A1, Jun. 27, 2024
Int. Cl. B25J 9/16 (2006.01); B25J 9/04 (2006.01); B25J 13/00 (2006.01); G02B 27/01 (2006.01); H04N 7/22 (2006.01)
CPC B25J 9/1689 (2013.01) [B25J 9/04 (2013.01); B25J 9/1697 (2013.01); B25J 13/006 (2013.01); G02B 27/0101 (2013.01); G02B 27/0172 (2013.01); H04N 7/22 (2013.01); G02B 2027/0138 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A system configured to be coupled to an end of a boom assembly and disposed in a remote operating environment for remote operation, the system comprising:
a frame portion;
a controller;
one or more robotic arms coupled to the frame portion;
a camera-supporting robotic arm coupled to the frame portion, the camera-supporting robotic arm comprising a plurality of pivotable joints, wherein the camera-supporting robotic arm is independently moveable from the one or more robotic arms;
a remote capture device fixed to a distal end of the camera-supporting robotic arm and operable to capture real-time sensory information including video data from the remote operating environment, the remote capture device comprising one or more cameras operable to capture the video data;
one or more microphones operable to capture audio data from the remote operating environment; and
a communication portion operable to transmit a signal comprising the real-time sensory information from the remote capture device to an operator and to receive a signal comprising control inputs from the operator to the controller, the control inputs including instructions for controlling motion of at least one of the one or more robotic arms and the camera-supporting robotic arm,
wherein a portion of the real-time sensory information is selected for presentation to the operator based at least in part on a viewing angle of the operator.