CPC G06F 21/6218 (2013.01) [G06F 16/24568 (2019.01); G06F 21/6245 (2013.01)]; 9 Claims

1. A processor-implemented method comprising steps of:
receiving, via an input/output interface, one or more sensitivity parameters of structured data, an epsilon value, a plurality of differential privacy techniques, and a privacy budget selected by a user;
fetching, via one or more hardware processors, the structured data from a predefined database to generate production data for differential privacy, wherein the structured data is labelled and in a tabular form;
profiling, via the one or more hardware processors, the production data based on a type and a nature of the structured data, wherein the type of the structured data includes a numerical form, a categorical form, a binary form, and a text form, and the nature of the structured data includes continuous, discrete, integer, and Boolean;
creating, via the one or more hardware processors, based on the profiling of the production data, staging data for analytical purposes;
selecting, via the one or more hardware processors, at least one differential privacy technique from the plurality of differential privacy techniques based on one or more sensitive data fields of the staging data, the epsilon value, the sensitivity parameters of the data, and the privacy budget, wherein the plurality of differential privacy techniques comprises a Laplace classic, a Laplace bounded, an exponential, and a random toss;
applying, via the one or more hardware processors, the selected at least one differential privacy technique iteratively on the staging data with an incremental addition of the epsilon value at each iteration, wherein the incremental addition of the epsilon value at each iteration is based on a privacy loss at each iteration being less than the privacy budget, wherein each of the selected at least one differential privacy technique generates privacy enabled structured data, and wherein the epsilon value and the sensitivity parameters of the data are used as levers to control a degree of noise added to the sensitive data fields by the selected at least one differential privacy technique; and
enabling, via the one or more hardware processors, the user to select the differential privacy enabled structured data based on one or more results of applying the selected differential privacy technique, wherein the one or more results include a set of privacy metrics providing information on privacy strength and similarity tolerance of each query in an interactive way.
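The selecting step above names four technique families that correspond to well-known differential privacy mechanisms. The Python sketch below is purely illustrative and forms no part of the claims: it shows one conventional reading of a classic Laplace mechanism, a bounded Laplace variant (interpreted here as clamping the noised value to known domain bounds), an exponential mechanism for categorical fields, and a randomized-response style random toss for Boolean fields. The function names, signatures, and NumPy usage are assumptions made for illustration.

```python
import numpy as np

def laplace_classic(value, sensitivity, epsilon, rng=None):
    """Classic Laplace mechanism: add noise drawn from Laplace(0, sensitivity / epsilon)."""
    if rng is None:
        rng = np.random.default_rng()
    return value + rng.laplace(0.0, sensitivity / epsilon)

def laplace_bounded(value, sensitivity, epsilon, lower, upper, rng=None):
    """Bounded variant (illustrative reading): add Laplace noise, then clamp the
    result back into the known domain [lower, upper]."""
    if rng is None:
        rng = np.random.default_rng()
    noised = value + rng.laplace(0.0, sensitivity / epsilon)
    return float(min(max(noised, lower), upper))

def exponential_mechanism(candidates, utility, sensitivity, epsilon, rng=None):
    """Exponential mechanism for categorical fields: sample a candidate with
    probability proportional to exp(epsilon * utility / (2 * sensitivity))."""
    if rng is None:
        rng = np.random.default_rng()
    scores = np.array([utility(c) for c in candidates], dtype=float)
    # Subtract the max score before exponentiating for numerical stability.
    weights = np.exp(epsilon * (scores - scores.max()) / (2.0 * sensitivity))
    probs = weights / weights.sum()
    return candidates[rng.choice(len(candidates), p=probs)]

def random_toss(value, epsilon, rng=None):
    """Randomized-response style 'random toss' for a Boolean field: report the
    true value with probability e^epsilon / (e^epsilon + 1), otherwise flip it."""
    if rng is None:
        rng = np.random.default_rng()
    keep = rng.random() < np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    return value if keep else (not value)
```

Whichever mechanism is selected, the epsilon value and the sensitivity parameter jointly set the noise scale, which is the "lever" role the claim assigns to them.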
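The applying step iterates the chosen mechanism while incrementing the epsilon value and keeping the accumulated privacy loss within the privacy budget. A minimal sketch of that loop, under a sequential-composition assumption (per-iteration losses add up) and reusing laplace_classic from the sketch above, might look as follows; the starting epsilon, the step size, and the choice of a single numeric column are illustrative parameters, not details taken from the claims.

```python
def apply_iteratively(values, sensitivity, privacy_budget,
                      epsilon_start=0.1, epsilon_step=0.1, rng=None):
    """Release noised copies of a numeric column, growing epsilon each round and
    stopping before the cumulative privacy loss would exceed the budget."""
    if rng is None:
        rng = np.random.default_rng()
    releases, spent, epsilon = [], 0.0, epsilon_start
    while spent + epsilon <= privacy_budget:
        noised = [laplace_classic(v, sensitivity, epsilon, rng) for v in values]
        releases.append({"epsilon": epsilon, "data": noised})
        spent += epsilon           # sequential composition: losses accumulate
        epsilon += epsilon_step    # incremental addition of epsilon per iteration
    return releases
```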
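The claims describe a set of privacy metrics covering privacy strength and similarity tolerance but do not define them. One hypothetical way to report them for each release: use the epsilon spent as a proxy for privacy strength (smaller epsilon, stronger privacy) and a range-normalised mean absolute deviation between the original and noised values as a similarity score. The metric definitions below are assumptions for illustration only.

```python
def privacy_metrics(original, release):
    """Hypothetical per-release metrics: epsilon spent (privacy strength proxy)
    and similarity as 1 minus the mean absolute deviation over the data range."""
    orig = np.asarray(original, dtype=float)
    noised = np.asarray(release["data"], dtype=float)
    data_range = float(orig.max() - orig.min()) or 1.0
    similarity = 1.0 - float(np.mean(np.abs(orig - noised))) / data_range
    return {"epsilon_spent": release["epsilon"], "similarity": similarity}

# Example: generate candidate releases for an illustrative 'age' column and let
# the user compare their privacy / similarity trade-offs before selecting one.
ages = [34.0, 45.0, 29.0, 61.0, 52.0]
for release in apply_iteratively(ages, sensitivity=1.0, privacy_budget=0.5):
    print(privacy_metrics(ages, release))
```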