
dc.contributor.author: Abowd, John M.
dc.date.accessioned: 2018-02-02T21:56:41Z
dc.date.available: 2018-02-02T21:56:41Z
dc.date.issued: 2018-02-01
dc.identifier.other: http://amstat.commpartners.com/se/Meetings/Playback.aspx?meeting.id=683000
dc.identifier.uri: https://hdl.handle.net/1813/55761
dc.description: The opinions expressed in this talk are my own and not necessarily those of the U.S. Census Bureau.
dc.description.abstract: Webinar for Privacy Day 2018, sponsored by the ASA Committee on Privacy and Confidentiality. For statistical agencies, the Big Bang event in disclosure avoidance occurred in 2003, when Irit Dinur and Kobbi Nissim, two well-known cryptographers, turned their attention to the properties of safe systems for publishing data from confidential sources. Their paradigm-shifting message was a very strong result showing that most of the confidentiality protection systems used by statistical agencies around the world, collectively known as statistical disclosure limitation, were not designed to defend against a database reconstruction attack. Such an attack recreates increasingly accurate record-level images of the confidential data as an agency publishes more and more accurate statistics from the same database. Why are we still talking about this theorem fifteen years later? What is required to modernize our disclosure limitation systems? The answer is recognizing that the database reconstruction theorem identified a real constraint on agency publication systems: there is only a finite amount of information in any confidential database. We can't repeal that constraint, but neither does ignoring it serve the public-good mission of statistical agencies, which is to publish data suitable for their intended uses. The hard work is incorporating the required privacy-loss budget constraint into the decision-making processes of statistical agencies. This means balancing the interests of data accuracy and privacy loss. A leading example is the tension between the need for accurate redistricting data to enforce the Voting Rights Act and the protection of the sensitive racial and ethnic information in the detailed data required for this activity. Wrestling with this tradeoff stares down the database reconstruction theorem and uses the formal privacy results it inspired to specify the technologies. Specifying the decision framework for selecting a point on that technology frontier has proven much more challenging. We still have a lot of work to do.
dc.description.sponsorship: Parts of this talk were supported by the National Science Foundation, the Sloan Foundation, and the Census Bureau (before and after my appointment started).
dc.language.iso: en_US
dc.rights: Attribution-NonCommercial-ShareAlike 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by-nc-sa/4.0/
dc.subject: privacy
dc.subject: confidentiality
dc.subject: Census Bureau
dc.subject: differential privacy
dc.title: What Is a Privacy-Loss Budget and How Is It Used to Design Privacy Protection for a Confidential Database?
dc.type: presentation
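To make the privacy-loss budget concept from the title and abstract concrete, here is a minimal sketch (not from the talk itself) of the Laplace mechanism from differential privacy: a total budget epsilon is split across several count queries, and each release adds noise scaled to its share of the budget. All counts and parameter values below are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale):
    """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, epsilon, sensitivity=1.0):
    """Laplace mechanism: adding or removing one record changes a count
    by at most `sensitivity`, so noise with scale sensitivity/epsilon
    gives epsilon-differential privacy for this single release."""
    return true_count + laplace_noise(sensitivity / epsilon)

# Basic composition: k releases at epsilon/k each spend a total
# privacy-loss budget of epsilon.  A smaller per-query epsilon means
# noisier, less accurate published statistics.
total_epsilon = 1.0
true_counts = [120, 45, 300]  # hypothetical confidential counts
eps_per_query = total_epsilon / len(true_counts)
released = [private_count(c, eps_per_query) for c in true_counts]
```

The accuracy-versus-privacy tradeoff discussed in the abstract is visible here: every additional published statistic either consumes more of the total budget or forces the per-query epsilon (and hence accuracy) down.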





Except where otherwise noted, this item's license is described as Attribution-NonCommercial-ShareAlike 4.0 International
