US 12,423,463 B2
Data processing for release while protecting individual privacy
Xin Yang, Los Angeles, CA (US); Yuanshun Yao, Los Angeles, CA (US); Tianyi Liu, Los Angeles, CA (US); Jiankai Sun, Los Angeles, CA (US); Chong Wang, Los Angeles, CA (US); and Ruihan Wu, Los Angeles, CA (US)
Assigned to LEMON INC., Grand Cayman (KY)
Filed by Lemon Inc., Grand Cayman (KY)
Filed on Nov. 24, 2021, as Appl. No. 17/535,398.
Prior Publication US 2023/0161899 A1, May 25, 2023
Int. Cl. G06F 21/62 (2013.01)
CPC G06F 21/6245 (2013.01) 15 Claims
OG exemplary drawing
 
1. A method of releasing data while protecting individual privacy, comprising:
compressing a dataset to a lower-dimensional space by a differential privacy model utilizing a first random matrix, wherein there are a plurality of datasets owned by a plurality of data providers, wherein the dataset is owned by a respective data provider among the plurality of data providers, wherein the first random matrix is shared among the plurality of data providers, and wherein each of the plurality of data providers includes the differential privacy model for processing its respective dataset;
adding noise by applying a random Gaussian matrix to the compressed dataset to obtain a processed dataset, wherein the random Gaussian matrix comprises elements sampled from a Gaussian distribution with mean 0 and variance 4dσ²_{ε,δ}, chosen to implement (ε, δ)-differential privacy, wherein the processed dataset ensures data privacy protection; and
releasing the processed dataset while protecting individual privacy of the dataset from exposure, wherein a party receiving the processed dataset is unable to infer attributes or the identity of any individual data sample.
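The two claimed steps (random-projection compression with a matrix shared among providers, followed by calibrated Gaussian noise) can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name, shapes, and the `sigma` parameter (a stand-in for the claim's variance term 4dσ²_{ε,δ}) are assumptions for the example.

```python
import numpy as np

def release_with_privacy(dataset, shared_projection, sigma):
    """Illustrative two-step release under (eps, delta)-DP assumptions.

    dataset:           (n, D) array, one row per individual sample
    shared_projection: (D, d) random matrix, identical across all providers
    sigma:             Gaussian noise scale (stand-in for the claimed variance)
    """
    # Step 1: compress each sample to a lower-dimensional space.
    compressed = dataset @ shared_projection            # shape (n, d)
    # Step 2: add Gaussian noise before release.
    noise = np.random.normal(0.0, sigma, compressed.shape)
    return compressed + noise                           # processed dataset

# Usage: each provider applies the same shared projection to its own data.
rng = np.random.default_rng(0)
D, d = 100, 10
projection = rng.normal(0.0, 1.0 / np.sqrt(d), size=(D, d))  # shared matrix
data = rng.normal(size=(50, D))                              # one provider's dataset
released = release_with_privacy(data, projection, sigma=2.0)
print(released.shape)
```

Because every provider uses the same projection matrix, the released low-dimensional datasets remain mutually comparable, while the added noise prevents a recipient from inverting the projection to recover any individual sample.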