| CPC G06N 5/048 (2013.01) [G06N 20/00 (2019.01); G06F 21/62 (2013.01); G06F 21/629 (2013.01)] | 35 Claims |

1. A computing system for implementing an on-device machine learning platform, comprising:
one or more processors; and
one or more non-transitory computer-readable media that store instructions that are executable to cause the computing system to perform operations, the operations comprising:
determining, using a context provider that performs client permission control, a mapping that indicates a respective permission status of a client relative to respective context data, wherein the mapping comprises a first permission status of the client relative to first context data, wherein the first permission status indicates that the client has permission to obtain inferences from the on-device machine learning platform that are based on the first context data;
receiving, from the client via an application programming interface (API), an API call that requests that an inference be generated using a machine-learned model executed by the on-device machine learning platform on the basis of input data received from the client and according to one or more configuration options specified by the client, wherein a configuration option of the one or more configuration options identifies the first context data to be used to generate the inference;
determining, based on the mapping, that the client has permission to obtain inferences from the on-device machine learning platform that are based on the first context data;
obtaining the first context data, wherein the first context data is not provided to the client;
based on determining that the client has permission to obtain inferences that are based on the first context data, generating, using the machine-learned model, at least one inference based on the input data and the first context data; and
providing, using the API, the at least one inference to the client.
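
The claimed flow — a permission mapping held by a context provider, an API call naming a context-data source, a permission check, and inference over input plus context that is never returned to the client — can be sketched as follows. All names (`ContextProvider`, `OnDevicePlatform`, the context keys, the client identifiers) are illustrative assumptions, not terms from the patent, and the model is replaced by a stand-in function:

```python
# Hypothetical sketch of the permission-gated on-device inference flow.
# Names and data are illustrative only; they do not appear in the claim.
from dataclasses import dataclass, field


@dataclass
class ContextProvider:
    # The "mapping": client id -> set of context-data keys for which the
    # client has permission to obtain inferences (first permission status).
    permissions: dict = field(default_factory=dict)

    def may_use(self, client_id: str, context_key: str) -> bool:
        return context_key in self.permissions.get(client_id, set())

    def get_context(self, context_key: str):
        # Context data stays inside the platform; it is consumed by the
        # model but never provided to the client itself.
        return {"device_location": "home"}.get(context_key)


@dataclass
class OnDevicePlatform:
    provider: ContextProvider

    def infer(self, client_id: str, input_data: str, config: dict) -> str:
        # Configuration option identifying the first context data.
        context_key = config["context"]
        if not self.provider.may_use(client_id, context_key):
            raise PermissionError("client lacks permission for this context")
        context = self.provider.get_context(context_key)
        # Stand-in for the machine-learned model executed on device.
        return f"inference({input_data}, {context})"


platform = OnDevicePlatform(ContextProvider({"app_a": {"device_location"}}))
print(platform.infer("app_a", "query", {"context": "device_location"}))
```

An unauthorized client (e.g. one absent from the mapping) receives a `PermissionError` instead of an inference, mirroring the claim's gating of inference generation on the permission determination.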