One answer is that it’s bleak.
A saving grace, however, is that current trends may lead courts to apply traditional privacy laws more broadly even as data protection (DP) laws falter.
The Current Mess
In the U.S., there is a difference between privacy and data protection. Privacy values have long existed, such as in federal and state constitutions, common law torts, and a range of laws aimed at privacy as traditionally conceived, typically with regard to the right or value of avoiding disclosure of secret or intimate information about a human being.1 Essentially, that kind of privacy protects us from third parties butting into what is our business, not theirs. Traditional privacy laws, however, tend not to work well with non-private data (such as a name in the proverbial public telephone book).
Non-private data has become the province of DP laws. This started with the increase in data created by the Internet and its electronic format and connectivity, and then was further influenced by the data explosion from social media and open governmental datasets, etc. This enabled the current focus on big data and artificial intelligence, concepts that involve analysis of massive, diverse datasets to see what patterns emerge and what they might mean.2
In the early stages of DP, regulators such as the Federal Trade Commission (FTC) started bringing enforcement actions in the name of privacy, even though much of the data was not private. This created cognitive dissonance in the U.S. for businesses subject to the FTC’s jurisdiction.
- Businesses could not understand how non-private data that had been commonly collected and used for years was suddenly deemed protected, and they searched in vain for a new law expressly saying so.
- Instead, businesses were allegedly required to locate, comb through, and collate regulatory blogs, newsletters, guidance, etc. to determine their new compliance requirements.
- What they were able to find was not uniform and kept changing. For example, FTC staff views of protectable “personally identifying information” morphed to include non-personal information, and later to include data reasonably linkable to an individual or their devices, all in nonobvious announcements of the type not used for changes in law.
- Regulators barely mentioned First Amendment concepts that constitutionally protect the free flow of information, even though data can be a building block of free speech and knowledge advancement.3
Piled on top of the U.S. cognitive dissonance was (and is) the attempt by U.S. businesses to learn and comply with non-uniform DP laws globally, all in a context that increasingly includes regulatory infighting and creates a specter of compliance futility. This is further indication that some of these laws seem more aimed at trade competition and lucrative fines for regulators than feasible DP.
The current result for many businesses is “privacy fatigue.” They feel like a guinea pig on a wheel: it cannot run fast enough. The result for data subjects is as lamentable, but from the opposite perspective. Individuals seem to be experiencing “privacy futility” as their opt-outs fail to work or last, and the uneven patchwork of compliance allows their data to escape into the data sphere.
The bleak prospects of DP and the rise of big data analytics might, ironically, result in adapted applications of traditional privacy laws. It is hard to assert that use of a name from the public phone book violates privacy. It is much easier to assert that a detailed data profile on an individual can cross a privacy line even if each bit of data is not, itself, necessarily private. In a sense, big data and artificial intelligence have the ability to create more than the sum of their parts.
Profiles created or used through deception have the same capacity to cross lines, such as lines that might invoke privacy torts protecting against intruding on seclusion4 or casting an individual in a false light. The FTC has already indicated its views on how big data uses can run afoul of consumer protection, anti-discrimination, and fair lending or employment laws.5 One question is whether, and to what extent, such uses might increasingly run afoul of true privacy laws.
The point that “privacy is dead, get over it”6 may have seemed true in the early days of the Internet, when regulators pretended non-private data was private. Such data was not really private in the first place, however, hence the development of DP. But big data and data profiling uses will test the legal foundations of DP, and some DP laws will fall to the First Amendment. At the same time, some uses of big data may cross privacy lines and give privacy a fresh chance to rise. In short, privacy might not be dead after all, even if DP laws falter.
1 For more information about this legal foundation, see Chapter 12 of Towle and Nimmer, The Law of Electronic Commercial Transactions (2003–2017).
2 The results are not based on statistical sampling and include “garbage in, garbage out” concepts, i.e., results may or may not be relevant, misleading, or brilliant. “Ice cream does not cause summer” is a phrase sometimes used to illustrate the point: big data will reveal a pattern of ice cream references tied to summer, but that does not mean that ice cream causes or is necessary to summer.
3 See Sorrell v. IMS Health Inc., 131 S. Ct. 2653, 180 L. Ed. 2d 544 (2011).
4 See e.g., by analogy, In re Google Inc. Cookie Placement Consumer Privacy Litigation, 806 F.3d 125 (3d Cir. 2015). A company may commit intrusion upon seclusion by collecting information using duplicitous tactics—in the case, Google and several other advertising companies devised ways to evade cookie-blocking options in Safari’s browser while touting publicly that they respected their users’ choices about whether to take advantage of cookie-blocking technology.
5 See “Big Data a Tool for Inclusion or Exclusion? Understanding the Issues” (January 2016) at https://www.ftc.gov/system/files/documents/reports/big-data-tool-inclusion-or-exclusion-understandingissues/160106big-data-rpt.pdf.
6 Typically attributed to Sun Microsystems’ CEO, Scott McNealy, in 1999: “You have zero privacy anyway. Get over it.” See also http://www.computerworld.com/article/2585627/security0/mcnealy-calls-for-smart-cards-to-help-security.html.