Putting Privacy First in Big Data Technologies

As big data has evolved beyond buzzword status to become a substantive force helping businesses, governments and individuals gain valuable insights, the issue of data privacy has moved to the forefront. While the quest for insight has rarely been stronger or more necessary, concern over protecting the privacy interests of individuals has given many parties pause.

Looking at health care as just one example, there is an obvious fine line between the benefits big data can bring and the need to protect people’s privacy.

Today, big data is driving efforts behind the sequencing of the human genome, which hopefully will lead to breakthroughs in cancer treatment and personalized medicine. This data, and the technology behind it, is responsible for the falling cost of genome sequencing — from $95.3 million in 2001 to $6,600 in 2012, according to the U.S. government’s National Human Genome Research Institute. The time it takes for genome sequencing has dropped dramatically as well, from a matter of months to days.

However, recent research by the Institute for Health Technology Transformation finds that potential leakage or theft of personally identifiable information poses an important threat to patient privacy. We need to chart a course to gain the benefits from the use of this data, while also providing lasting and robust privacy protections. We can do this by increasing our focus on the appropriate and accountable use of the data.

Realizing privacy is an asset

Similar data privacy challenges exist across industries. As the use of big data to drive business outcomes increases, companies are wrestling with data privacy issues. The prevailing view: Data privacy is a liability, and therefore companies must be on the privacy defensive. But data privacy can and should be an asset, where it becomes a selling point for customers and other stakeholders.

Realizing this asset is not easy, as it requires organizational commitment, resources and the ability to demonstrate a dedication to accountability. Technology innovation creates both challenges and opportunities for realizing this goal. The proliferation of digital devices makes privacy controls and settings more available, but also more complex. Technology is becoming more connected and more personal. More information relating to individuals is available for analysis and use. And while there is more information about us out there, a smaller percentage of it came directly from us. Much of this personal data comes from government databases, generally available information on the Internet, and information provided by our friends, family and colleagues on social networking websites and applications. While the ability to consent to data collection and use is still fundamentally important, there are many situations where consent is infeasible or impossible. This is why we need to complement consent with a focus on appropriate and accountable use.

We will need to capture these new concepts in legislation and regulation. The current privacy regulatory environment is tremendously complex. There are literally hundreds of pieces of legislation to comply with. In the U.S., there is state and federal legislation as well as industry-specific guidelines. In Europe, all 28 members of the European Union have their own privacy statutes, and those laws have greatly varying requirements. In Asia, some countries are making privacy violations criminal offenses, which could lead to prison time for company executives.

We need to drive for greater interoperability of these requirements, while still respecting diverse cultures, economies and histories. Here are some recommendations for creating a foundation for data privacy, especially with respect to collecting and using big data:

Think about the individual

Much of the data collected by new technologies does not immediately identify an individual; it relates only to a computer, sensor, or some other device. However, as this information is combined with other data fields, it may then relate to an identifiable individual. Companies should construct their policies and practices with an eye toward what the individual would reasonably expect based on the context of their interaction with the technology.
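To make that combination risk concrete, here is a minimal, hypothetical sketch in Python. The datasets, field names and records are invented for illustration: neither dataset names a person on its own, yet a join on ordinary attributes such as ZIP code, birth year and sex links a device to an identifiable individual.

```python
# Hypothetical illustration: neither dataset alone identifies a person,
# but combining them ties a device's data to a named individual.

# Telemetry keyed only by a device identifier (no name, no email).
device_events = [
    {"device_id": "a91f", "home_zip": "97124", "birth_year": 1978, "sex": "F"},
]

# A separate, generally available record set (e.g., a public roster).
public_records = [
    {"name": "Jane Doe", "zip": "97124", "birth_year": 1978, "sex": "F"},
]

def reidentify(events, records):
    """Link 'anonymous' device data to named records via shared quasi-identifiers."""
    matches = []
    for e in events:
        for r in records:
            if (e["home_zip"], e["birth_year"], e["sex"]) == (r["zip"], r["birth_year"], r["sex"]):
                matches.append((e["device_id"], r["name"]))
    return matches

print(reidentify(device_events, public_records))  # [('a91f', 'Jane Doe')]
```

The point is not the code itself but the design question it raises: which fields, in combination with data available elsewhere, would let someone re-identify the individual, and is that within what the individual would reasonably expect?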

Focus on appropriate use of data

Tremendous societal value can come from new uses of existing data. Going back to health care as an example, using location data for health research may yield significant gains in the fight against infectious disease outbreaks. Companies need a process to determine which uses of data are appropriate and which are inappropriate. A helpful tool is to think about the individual, and ask whether the use is compatible with the context of how the data was first collected.

Let’s explore advertising as an example. The industry claims that people love targeted online advertising because they like being engaged in a more relevant manner. From the data we’ve seen, people in the U.S. aren’t overly concerned with targeted advertising. What they are concerned with, however, is what targeting inferences get shared with third-party organizations, which can then use that information for purposes other than advertising. Increasingly, organizations will need to demonstrate to individuals that information relating to them will not be used in inappropriate ways.

Consider privacy from the outset

Over the past 14 years, Intel has practiced Privacy by Design — asking data privacy questions up front in the design of new technologies, new business processes and new ideas. When Intel develops a new technology, there is usually an opportunity to create unique identifiers within the product. The Privacy by Design process asks developers whether they really need to make an identifier unique, how long it must stay static, and whether it can become dynamic after a short period so it is less likely to identify the device. When combined with other privacy protection efforts (such as transparency about the identifier, protection of the identifier from remote access by software, and logs of when it was accessed), this process can improve data privacy for individuals without sacrificing business goals.
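One way an identifier can "become dynamic" (purely a hypothetical sketch, not a description of any Intel product mechanism) is to derive a short-lived value from a secret held on the device, so that what the device exposes rotates on a schedule rather than staying static:

```python
# Hypothetical sketch: rotate a device identifier periodically instead of
# keeping it static, so it is less useful for long-term tracking.
# Not a description of any actual product mechanism.

import hmac
import hashlib
import time

def rotating_device_id(device_secret: bytes, rotation_period_s: int = 24 * 3600) -> str:
    """Derive a short-lived identifier from a device-held secret and the current epoch.

    The secret never leaves the device; only the derived value is exposed,
    and it changes every rotation period, limiting long-term linkability.
    """
    epoch = int(time.time()) // rotation_period_s
    digest = hmac.new(device_secret, str(epoch).encode(), hashlib.sha256).hexdigest()
    return digest[:16]

# Example: the exposed identifier is stable within a period, different across periods.
secret = b"per-device secret provisioned at manufacture"  # illustrative value only
print(rotating_device_id(secret))
```

A shorter rotation period reduces linkability further, at the cost of making the identifier less useful for longer-running legitimate functions, which is exactly the kind of trade-off a Privacy by Design review is meant to surface.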

Balance privacy with more privacy

Right now, there is much debate over the balance between data privacy and national security, as well as discussions about balancing privacy with functionality and usability. However, obtaining security, functionality and usability does not need to mean giving up privacy. More national security doesn’t have to mean less privacy; less privacy doesn’t always mean more national security. Our goal should be to have as much privacy and national security as possible. As Ken Mortensen, a friend of mine and the chief privacy officer of CVS Caremark, has said, we need to think more in terms of legal balances, like the scales of justice.

If we want to do more with data, particularly in today’s big-data environment, we can balance these interests not by letting up on privacy, but by putting more privacy control on the other side of the scale. How? By taking into consideration other elements of the Organization for Economic Cooperation and Development’s Fair Information Practices, such as increasing the accountability of the organization with robust transparency, oversight, access rights, the ability to contest whether certain data should be stored/used, and improved security safeguards.

How meaningful are privacy policies?

The first principle to consider is the Individual Participation Principle, which says that individuals have the right to obtain information about what data an organization has that relates to them, and to challenge that data. In an environment where consent is not always possible or feasible, the ability to understand what data an organization has, and to be able to challenge that data is critical. The concept that individuals should have some ability to have data erased, rectified, completed or amended will provide individuals with more practical protections. For example, victims of domestic violence should have the ability to challenge the publishing of their location or address data. Individuals need to have the ability to contest information that is misleading because it is old, incomplete, wrong or because it will have a disproportionate impact on them.

Use the data like you said you would

The second principle is “purpose limitation,” or the idea that data collected for one purpose should not be used in ways other than those described in the original notice. In part because of big data, this principle is going through tremendous change, led by Europeans. The Article 29 Working Party, a collection of European regulators, recently wrote a paper on the definition of reasonable purpose limitation. Recognizing the realities of big data, they expanded the definition to say, “Data that have already been gathered may also be genuinely useful for other purposes, not initially specified. Therefore, there is also a value in allowing, within carefully balanced limits, some degree of additional use.” The Working Party noted that these additional uses are fine as long as they are “not incompatible” with the designated purpose. This is useful guidance, as many important societal uses, such as health care research, should satisfy the test, as long as the organization incorporates measures to guard against additional disclosure or inappropriate use.

Be accountable

The third principle is accountability. The more that organizations want to use data in new and innovative ways, the more they should demonstrate they have effective policies and processes to protect privacy. The Accountability Project, led by Marty Abrams, president of the Centre for Information Policy Leadership, is helping to define the elements of accountability and to show how organizations can apply them in different contexts. This focus on how privacy can be protected in practice is long overdue, and it can deliver significant benefits by better protecting individuals’ data.

Privacy as business opportunity

Individuals are clearly saying they want to be able to trust their use of digital devices and their participation in the digital economy. Making certain that the reasonable expectations of these individuals are met is not just a legal obligation or an ethical imperative; it is also good business. Intel’s vision is to create and extend computing technology to connect and enrich the lives of every person on earth. Privacy is a fundamental prerequisite of our success in driving toward that goal.

David A. Hoffman is director of security policy and global privacy officer at Intel Corporation, in which capacity he heads the organization that oversees Intel’s privacy compliance activities, legal support for privacy and security, and all external privacy and security engagements. Reach him @hoffprivacy.

Update: This article has been revised from a previous version.


