The US National Institute of Standards and Technology (NIST) has published its finalised Guidelines for Evaluating ‘Differential Privacy’ Guarantees to De-Identify Data (NIST Special Publication 800-226). Differential privacy works by adding random “noise” to data in a way that obscures the identities of individuals while keeping the database useful overall as a source of statistical information. Noise applied in the wrong way, however, can jeopardise privacy or render the data less useful. The guidelines aim to help practitioners of all backgrounds better understand how to reason about differentially private software solutions. They organise the factors to consider into a “differential privacy pyramid” and identify several “privacy hazards” – common pitfalls that arise as the mathematical framework of differential privacy is realised in practice. A brief illustration of the noise-adding idea appears below.
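To make the noise-adding idea concrete, the sketch below shows the classic Laplace mechanism applied to a simple counting query. It is not taken from the NIST guidelines themselves; the function name, the epsilon values, and the sample data are illustrative assumptions only.

```python
import numpy as np

def laplace_count(data, predicate, epsilon):
    """Return a differentially private count of records matching `predicate`.

    Adding or removing one record changes the true count by at most 1
    (sensitivity = 1), so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy for this single query.
    """
    true_count = sum(1 for record in data if predicate(record))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Illustrative data: ages of individuals in a hypothetical database.
ages = [23, 35, 41, 29, 52, 67, 38, 44]

# Smaller epsilon means more noise and stronger privacy, but a less accurate answer.
for epsilon in (0.1, 1.0, 10.0):
    noisy = laplace_count(ages, lambda age: age >= 40, epsilon)
    print(f"epsilon={epsilon}: noisy count of ages >= 40 is roughly {noisy:.1f}")
```

Running the sketch illustrates the trade-off the guidelines address: at small epsilon the answer is heavily obscured and individual records are well protected, while at large epsilon the answer is nearly exact but offers weaker privacy.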
Disclaimer
The Legal Wire takes all necessary precautions to ensure that the materials, information, and documents on its website, including but not limited to articles, newsletters, reports, and blogs (“Materials”), are accurate and complete. Nevertheless, these Materials are intended solely for general informational purposes and do not constitute legal advice. They may not necessarily reflect the current laws or regulations. The Materials should not be interpreted as legal advice on any specific matter. Furthermore, the content and interpretation of the Materials and the laws discussed within are subject to change.