US 11,787,050 B1
Artificial intelligence-actuated robot
Adrian L. Kaehler, Boulder Creek, CA (US); Christopher M. Cianci, San Jose, CA (US); Kyle J. Martin, Glendale, CA (US); Carolyn Wales, Menlo Park, CA (US); and Jeffrey S. Kranski, Portland, OR (US)
Assigned to Sanctuary Cognitive Systems Corporation, Vancouver (CA)
Filed by SANCTUARY COGNITIVE SYSTEMS CORPORATION, Vancouver (CA)
Filed on Jul. 1, 2020, as Appl. No. 16/918,999.
Application 16/918,999 is a continuation-in-part of application No. PCT/US2019/068204, filed on Dec. 22, 2019.
Application PCT/US2019/068204 is a continuation-in-part of application No. 16/237,721, filed on Jan. 1, 2019, granted, now Pat. No. 11,312,012.
Claims priority of provisional application 62/854,071, filed on May 29, 2019.
Int. Cl. B25J 9/16 (2006.01); B25J 9/06 (2006.01); B25J 9/12 (2006.01); B25J 13/08 (2006.01); B25J 9/10 (2006.01); B25J 15/10 (2006.01)
CPC B25J 9/1664 (2013.01) [B25J 9/06 (2013.01); B25J 9/1075 (2013.01); B25J 9/126 (2013.01); B25J 9/161 (2013.01); B25J 13/085 (2013.01); B25J 13/089 (2013.01); B25J 15/10 (2013.01)] 38 Claims
OG exemplary drawing
 
1. A robot, comprising:
a robot structural frame;
a kinematic chain including a root joint connected to the structural frame, the kinematic chain comprising a plurality of joints and links arranged downstream from the root joint and at least one end effector;
a plurality of actuators, wherein the actuators in the plurality of actuators produce actuator data indicating current actuator states and respond to actuator command data to drive the actuators;
a plurality of tendons connected to a corresponding plurality of actuation points on the kinematic chain and to actuators in the plurality of actuators, the plurality of tendons arranged to translate actuator position and force to actuation points on the kinematic chain, wherein actuators in the plurality of actuators are positioned on the structural frame or on the kinematic chain upstream from respective actuation points in the plurality of actuation points;
a sensor configured to generate sensor data indicating a position of the at least one end effector and of an object; and
a controller in communication with the plurality of actuators and the sensor to operate the kinematic chain, the controller including a trained neural network in a feedback loop receiving feedback data derived from or including the actuator data and the sensor data as feedback input, the trained neural network trained to generate actuator command data to cause the robot to execute a task to manipulate the object in response to the feedback data.
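The controller element of claim 1 describes a closed-loop architecture: a trained neural network receives feedback data derived from the actuator data and the sensor data (end-effector and object positions) and generates actuator command data that drives the tendon actuators. The Python sketch below is a minimal, hypothetical illustration of such a loop; it is not taken from the patent, and the names used here (TrainedPolicy, read_actuator_states, read_sensor, apply_commands) are placeholder assumptions rather than any real robot API. A single linear layer stands in for the trained network.

```python
import numpy as np


class TrainedPolicy:
    """Hypothetical stand-in for the trained neural network of claim 1.

    Maps feedback data (actuator states concatenated with sensor data)
    to actuator command data for the tendon-driven kinematic chain.
    """

    def __init__(self, weights: np.ndarray, bias: np.ndarray):
        self.weights = weights  # learned parameters (placeholder linear layer)
        self.bias = bias

    def __call__(self, feedback: np.ndarray) -> np.ndarray:
        # One linear layer stands in for the trained network's inference step.
        return self.weights @ feedback + self.bias


def control_loop(policy, read_actuator_states, read_sensor, apply_commands,
                 steps=1000):
    """Feedback loop: sense -> infer -> actuate, repeated each control tick.

    `read_actuator_states`, `read_sensor`, and `apply_commands` are
    hypothetical hardware-interface callables assumed to exchange
    NumPy arrays with the robot.
    """
    for _ in range(steps):
        actuator_data = read_actuator_states()   # current actuator states
        sensor_data = read_sensor()              # end-effector and object positions
        feedback = np.concatenate([actuator_data, sensor_data])
        command_data = policy(feedback)          # actuator command data
        apply_commands(command_data)             # drive the actuators / tendons
```

In the claimed robot, the resulting commands would drive actuators positioned on the structural frame or upstream on the kinematic chain, with tendons translating actuator position and force to the actuation points; here that hardware interface is abstracted behind the three callables.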