Commit 118c0fc: introduction blog update
1 parent ac347d2

1 file changed, 10 insertions(+), 3 deletions(-)


docs/introduction.rst

@@ -18,6 +18,9 @@ This is where “Differential Privacy” comes into the picture, a smarter way t
 .. figure:: https://user-images.githubusercontent.com/19529592/91377299-b58fbf80-e83c-11ea-9b56-a068ea3155c6.png
    :alt: my-picture1
    :align: center
+   :figclass: align-center
+
+(Privacy Preserving AI (Andrew Trask) | MIT Deep Learning Series)
 
 Why is Differential Privacy so important?
 =========================================
@@ -35,15 +38,17 @@ Hence, this process is prone to risk and is considered as fundamentally wrong. N
 .. figure:: https://user-images.githubusercontent.com/19529592/91381064-14a50280-e844-11ea-9dd0-1af088c3924d.png
    :alt: netflix
    :align: center
+   :figclass: align-center
 
-< a click from Secure AI Course>
+Image Credits: Secure and Private AI (Udacity)
 
 
 Despite the fact that the dataset was anonymized (no username or movie name was released), two researchers at the University of Texas released a `paper <https://www.cs.utexas.edu/~shmat/shmat_oak08netflix.pdf>`_ showing how they de-anonymized a large chunk of the dataset.
 
 .. figure:: https://user-images.githubusercontent.com/19529592/91381399-ef64c400-e844-11ea-8535-0180f37962de.png
    :alt: research
    :align: center
+   :figclass: align-center
 
 They scraped the IMDB website and, by statistical analysis on these two datasets, were able to identify the movie names and also the individual names. Ten years down the line they published yet another `paper <https://www.cs.princeton.edu/~arvindn/publications/de-anonymization-retrospective.pdf>`_ reviewing de-anonymization of datasets in the present world. There are other instances too where such attacks have led to the leakage of private information.

@@ -70,8 +75,9 @@ In local differential privacy the random noise is applied at the start of the pr
 .. figure:: https://user-images.githubusercontent.com/19529592/91381482-1e7b3580-e845-11ea-9419-cd6bdbbd9dbf.png
    :alt: local
    :align: center
+   :figclass: align-center
 
-(from google images)
+Image Credit: Google Images
 
 Global Differential Privacy
 ---------------------------
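The local model touched by the hunk above adds noise at the start of the process, on each user's own device, before any data reaches a collector. A minimal sketch of that idea using classic two-coin randomized response; the function names here are illustrative and not part of any library's API:

```python
import random

def randomized_response(truth: bool) -> bool:
    """Two-coin randomized response, the textbook local-DP mechanism.

    First coin heads: report the true answer.
    First coin tails: report the outcome of a second coin flip.
    Each user randomizes their own answer, so the collector never
    sees a trustworthy individual response.
    """
    if random.random() < 0.5:        # first coin: heads -> be honest
        return truth
    return random.random() < 0.5     # tails -> random yes/no

def estimate_true_rate(reports) -> float:
    """Unbiased estimate of the true 'yes' rate from noisy reports.

    E[reported yes] = 0.5 * true_rate + 0.25, so invert that line.
    """
    mean = sum(reports) / len(reports)
    return 2 * mean - 0.5
```

Any single answer is deniable (it may have come from the second coin), yet the aggregate rate is still recoverable from enough reports.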
@@ -80,6 +86,7 @@ In Global differential privacy the random noise is applied at the global level i
 .. figure:: https://user-images.githubusercontent.com/19529592/91381550-4ec2d400-e845-11ea-8f63-b7a3adb3fde8.png
    :alt: global
    :align: center
+   :figclass: align-center
 
 Image Credits: Google Images
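In the global model referenced by the hunk above, a trusted curator holds the raw data and adds noise once, to the result of a query over the whole dataset. A minimal sketch using the Laplace mechanism for a counting query; the names are illustrative, not any library's actual API:

```python
import random

def laplace_noise(scale: float) -> float:
    # A Laplace(0, scale) sample is the difference of two
    # independent exponentials with mean `scale`.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(data, predicate, epsilon: float) -> float:
    """Epsilon-DP count (global model).

    A counting query has sensitivity 1: adding or removing one person
    changes the count by at most 1, so Laplace noise with scale
    1/epsilon gives epsilon-differential privacy.
    """
    true_count = sum(1 for x in data if predicate(x))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller epsilon means larger noise and stronger privacy. Unlike the local model, the curator sees the raw records, which is why this setting needs a trusted party.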

@@ -109,7 +116,7 @@ Differential Privacy ensures privacy of all sorts of data which can be used by a
 
 SOME OTHER LIBRARIES FOR DP
 
-* `OpenDp <https://github.com/opendifferentialprivacy>`_ by Harvard University
+* `OpenDp <https://github.com/opendifferentialprivacy>`_ by Harvard University and Microsoft
 * `Diffprivlib <https://github.com/IBM/differential-privacy-library>`_ by IBM
 * Google’s Differential Privacy `Library <https://github.com/google/differential-privacy>`_.
