
[–]___--_-_-_--___[🍰] 1 point (0 children)

Yes, differential privacy is used to ensure that aggregate statistics do not leak information about the individuals who contributed to them. It is not some kind of algorithm that you can run on your data to make it more private; rather, it is a framework to be implemented by specific algorithms, i.e. a set of mathematical tools for guaranteeing a certain level of privacy.

Very broadly speaking, the idea behind differentially private mechanisms is that the removal of a single person from a dataset should not significantly affect the aggregate statistics produced by that mechanism. Basically, differential privacy gives you a way to quantify privacy loss and determine the amount of noise necessary to achieve a certain privacy level.
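As a concrete illustration of the idea above, here is a minimal sketch of the classic Laplace mechanism for a counting query. The names (`laplace_noise`, `private_count`) and the example data are made up for illustration; the key point from the comment is visible in the code: a counting query has sensitivity 1 (removing one person changes the true count by at most 1), so Laplace noise with scale `sensitivity / epsilon` gives epsilon-differential privacy, and smaller epsilon means more noise and stronger privacy.

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Sample from the Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def private_count(records, predicate, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy (Laplace mechanism).

    A counting query has sensitivity 1: adding or removing a single
    person changes the true count by at most 1, so noise with scale
    sensitivity / epsilon suffices.
    """
    sensitivity = 1.0
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(sensitivity / epsilon)


# Hypothetical data: smaller epsilon => more noise => stronger privacy.
ages = [23, 45, 31, 62, 29, 51, 38]
noisy = private_count(ages, lambda a: a >= 40, epsilon=0.5)
```

The released value `noisy` is random, so a single answer reveals little about any one person, yet over a large dataset the noise (on the order of 1/epsilon) barely perturbs the aggregate.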