Data Anonymisation and Quantifying Risk Competition

17 mins 11 secs,  250.46 MB,  MPEG-4 Video  640x360,  29.97 fps,  44100 Hz,  1.94 Mbits/sec
About this item
Description: Kikuchi, H (Meiji University)
Monday 5th December 2016 - 14:00 to 14:20
Created: 2016-12-13 12:46
Collection: Newton Gateway to Mathematics
Publisher: Isaac Newton Institute
Copyright: Kikuchi, H
Language: eng (English)
Abstract: One of the main difficulties is to design and formalize realistic adversary models, taking into account the background knowledge of the adversary and their inference capabilities. Many privacy models currently exist in the literature, such as k-anonymity and its extensions (e.g. l-diversity), as well as differential privacy. However, these models are not necessarily comparable, and what appears to be the optimal anonymization method under one model is not necessarily the best one under another. To assess the privacy risks of publishing a particular anonymized dataset, it is necessary to evaluate the risk of data anonymized from a common dataset. The main objective of the competition is precisely to investigate the strengths and limits of existing anonymization methods, from both theoretical and practical perspectives. More precisely, given a common dataset containing personal data and a history of online retail payments, participants in the competition attempt to anonymize the dataset so that its records cannot be re-identified, without losing data utility. They are also encouraged to try to re-identify the datasets anonymized by the other participants. With pre-defined utility functions and re-identification algorithms, the security and the utility of an anonymized dataset are automatically evaluated as the maximum re-identification probability and the mean average error between the anonymized data and the original dataset, respectively. Throughout the competition, we aim to gain an in-depth understanding of how to quantify the privacy level provided by a particular anonymization method, as well as the achievable trade-off between the privacy and utility of the resulting data. The outcomes of the meeting will greatly benefit the privacy community.
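
The two evaluation measures mentioned in the abstract can be illustrated with a short sketch. The Python code below is not the competition's actual scoring pipeline; it assumes purely numeric records in a fixed order, uses mean absolute error as a stand-in for the pre-defined utility functions, and uses a single nearest-record linkage attack as a stand-in for the pre-defined re-identification algorithms, whose best result would give the maximum re-identification probability.

# Minimal sketch of the two scores, under the assumptions stated above.
from typing import List

def mean_absolute_error(original: List[List[float]], anonymized: List[List[float]]) -> float:
    """Utility loss: average absolute difference over all numeric cells."""
    total, count = 0.0, 0
    for orig_row, anon_row in zip(original, anonymized):
        for o, a in zip(orig_row, anon_row):
            total += abs(o - a)
            count += 1
    return total / count if count else 0.0

def reidentification_rate(original: List[List[float]], anonymized: List[List[float]]) -> float:
    """Security measure: fraction of anonymized records whose nearest original
    record (Euclidean distance) is their true source record. A competition would
    run several attack algorithms and report the maximum rate achieved."""
    hits = 0
    for anon_idx, anon_row in enumerate(anonymized):
        best_idx, best_dist = -1, float("inf")
        for orig_idx, orig_row in enumerate(original):
            dist = sum((o - a) ** 2 for o, a in zip(orig_row, anon_row)) ** 0.5
            if dist < best_dist:
                best_idx, best_dist = orig_idx, dist
        if best_idx == anon_idx:  # the attacker's guess matches the true record
            hits += 1
    return hits / len(anonymized) if anonymized else 0.0

if __name__ == "__main__":
    # Toy records (age, spend); the "anonymization" is coarse rounding.
    original = [[35.0, 52000.0], [29.0, 41000.0], [47.0, 68000.0]]
    anonymized = [[40.0, 50000.0], [30.0, 40000.0], [50.0, 70000.0]]
    print("utility loss (MAE):", mean_absolute_error(original, anonymized))
    print("re-identification rate:", reidentification_rate(original, anonymized))

Stronger anonymization (heavier rounding, suppression, or noise) would lower the re-identification rate but raise the utility loss, which is exactly the trade-off the competition is designed to quantify.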
Available Formats
Format           Quality    Bitrate            Size
MPEG-4 Video *   640x360    1.94 Mbits/sec     250.46 MB
WebM             640x360    760.94 kbits/sec   95.86 MB
iPod Video       480x270    522.67 kbits/sec   65.78 MB
MP3              44100 Hz   249.9 kbits/sec    31.48 MB
Auto (allows the browser to choose a format it supports)