Press Release: CACR Examines Data Protection, Personal Privacy in New Report

The Center for Applied Cybersecurity Research issued a press release today about a new report released by the Center, Data Use and Impact Global Workshop.

The formal report can be found here.

Below is a portion of the press release. The full release can be found here.

——-

A report issued today (Dec. 4) from Indiana University’s Center for Applied Cybersecurity Research (CACR) paves the way for updated privacy protection by focusing on the uses—both positive and negative—of personal information.
The Data Use and Impact report is based on a global workshop hosted by CACR in London this past May. Participants included 25 senior representatives from industry, government, academia, and advocacy in Australia, Canada, France, Israel, Italy, Mexico, New Zealand, the United Kingdom, and the United States. The workshop and report were funded by The Privacy Projects, a nonprofit organization dedicated to improving current privacy policies, practices, and technologies through research, collaboration, and education.

Fred H. Cate, CACR director and co-convener of the CACR workshop and the Microsoft summit, said that the sheer volume of data being generated today creates overwhelming challenges for creating and enforcing provisions to govern its use. For the average consumer, the implications of how their personal information is used may not be immediately clear. “When was the last time anyone scrolled through the 50-plus screen pages of Apple’s terms and conditions when registering their new iPhone or bothered to read the privacy policies of social media sites?” Cate asked.

Those are but two common scenarios consumers face, and as the ubiquity of data—be it geolocational, biomedical, financial, or otherwise—grows rapidly, those with access to it face an array of issues governing its use.

“Only by focusing on how personal data are used and the potential harms and benefits likely to result from those uses can we assure that data are used responsibly, that individual privacy is protected, and that data users are accountable stewards of the data they possess,” Cate said.

The Data Use and Impact report offers ten conclusions, including:

  • The current focus of most data protection regimes on notice and consent at time of data collection is not working. Privacy notices are too complex; many privacy policies don’t provide meaningful terms, choices, or restrictions on data use; individuals do not read them; there is a lack of clarity as to what constitutes a harm or risk of harm in connection with personal data; and our preoccupation with notice and choice focuses too much attention on the “bureaucracy of privacy,” rather than on meaningful privacy protection.
  • Existing data protection systems should evolve to focus more on use of data and less on notice and choice, so that data users would evaluate the appropriateness of an intended use of personal data not by focusing primarily on the terms under which the data were originally collected, but rather on the likely risks (both benefits and harms) associated with the proposed use of the data. A greater focus on use would “make privacy relevant again” and enhance both privacy protection (including enforcement) and individual trust.
  • The real benefit to individuals and society doesn’t come just from a greater focus on uses of data, but rather from assessing the risks—good and bad—of proposed uses. As a result, a critical component of the evolution toward a more use-focused data protection system is the development of a simple, transparent approach to risk assessment.
  • The goal of a risk management approach focused on data uses is to reduce or eliminate the harm that personal information can cause to individuals. Accomplishing this, however, requires a clearer understanding of what constitutes “harm” in the privacy context.
  • The evolution to a more use-based approach is particularly important with the advent of “big data” and the analytical tools that have accompanied it because personal data may have substantial valuable uses that were wholly unanticipated when the data were collected. In fact, the analysis of big data doesn’t always start with a question or hypothesis, but rather may reveal insights that were never anticipated. As a result, data protection that is based on a notice specifying intended uses of data and consent for collection based on that notice can result in blocking socially valuable uses of data, lead to meaninglessly broad notices, or require exceptions to the terms under which the individual consented.
  • Implementing a more use-focused data protection system must be achieved through evolution, rather than revolution. This evolution actually recaptures the early focus of data protection in Europe and the United States on risk assessment to prevent harm, rather than on protecting individual privacy rights. Laws implementing use-based risk analysis should specify objectives and outcomes, rather than telling information users how to achieve them.
