A Statutory “Right of Compensation”

The following was excerpted and modified from written testimony submitted to the New York Senate.

According to the World Economic Forum (“WEF”), “personal data represents an emerging asset class, potentially every bit as valuable as other assets such as traded goods, gold or oil.”  Rethinking Personal Data:  Strengthening Trust, at 7, World Economic Forum Report (May 2012).  A subsequent paper from the WEF suggests “claims of personal data being ‘a new asset class’ are the strongest for . . . the inferred data [Institutions] possess about individuals on the basis that they invested the time, energy and resources in creating it.” Rethinking Personal Data: A New Lens for Strengthening Trust, at 17, World Economic Forum Report (May 2014). 

Surveillance advertising prevents the privacy rights underlying these assets from being easily managed; it is not up to consumers whether they will be fed an ad based on previous website visits or purchases – it will simply happen.  Indeed, according to a survey of 1,000 persons conducted by Ipsos Public Affairs and released by Microsoft in January 2013, forty-five percent of respondents felt they had little or no control over the personal information companies gather about them while they are browsing the Web or using online services.  Even before the prevalence of online surveillance advertising, consumers faced privacy diminution from widespread tracking efforts.  See Steve Bibas, A Contractual Approach to Data Privacy, 17 Harv. J. Law & Public Policy 591 (Spring 1994) (“Although the ready availability of information helps us to trust others and coordinate actions, it also lessens our privacy. George Orwell presciently expressed our fear of losing all privacy to an omniscient Big Brother. Computers today track our telephone calls, credit-card spending, plane flights, educational and employment records, medical histories, and more.  Someone with free access to this information could piece together a coherent picture of our actions.”). 

Current law, however, does not prevent someone from collecting publicly available information to create a comprehensive consumer profile – nor should there be any such law.  Similarly, there should not be a right to opt out of having publicly recorded information sold or shared.  The same rules, however, should not apply to personal information that is not collected solely from public sources.

This does not mean companies should have the unfettered right to use personal data. To address the market disparity between owners of personal data “assets” and the companies that monetize this data, Alastair Mactaggart launched a California ballot initiative in 2017 that led to the enactment of the California Consumer Privacy Act (CCPA).  Now that CCPA has come online – albeit in a weakened form due to the amendments signed into the law – Mr. Mactaggart is pushing a new ballot initiative that presumably will strengthen CCPA.

While the EU, by way of its General Data Protection Regulation (GDPR), ultimately looks to protect the EU and its residents from countries with weaker data processing safeguards, the path begun by CCPA – and continuing with Mactaggart’s latest ballot initiative, the California Privacy Rights and Enforcement Act of 2020 (Mactaggart 2020 Ballot Initiative) – seeks to protect individuals on a more fundamental level by giving them novel statutory rights.  For example, so long as companies comply with GDPR’s processing requirements and have a legal basis to conduct such processing, data subjects are left without any real financial recourse.  Under GDPR, data subjects may be provided with specific rights, including the rights to erasure, restriction of processing, portability, access and correction, yet these rights are in no way transferable or of any value apart from GDPR’s regulatory scheme.

CCPA apparently has as its core mission bringing transparency to the use of consumer data.  Before the enactment of CCPA, California was like all other states in that its residents could not readily learn what personal information a business had collected about them, how such information was used, and how to prevent such use from taking place in the future.  Although GDPR contains similar protections, the fact that GDPR precludes processing of data unless certain requirements are met represents a fundamental difference between the US and EU approaches.  Under the EU approach, subject to certain exceptions, a company can continue processing data as long as GDPR’s data processing regime is followed.  Under the CCPA approach, consumers have a say in whether processing takes place – no matter the level of compliance – as long as the financial triggers for the use of the data exist.

State legislatures should study the Mactaggart 2020 Ballot Initiative given its correction of defects and weaknesses found in CCPA while still pushing forward CCPA’s consumer-first mandate.  This does not mean, however, that the Mactaggart 2020 Ballot Initiative is without flaws.  The suggestion that a browser-based solution can easily address the “Do Not Sell” requirement is flawed given the 12-month wait period before a company can again request consent after a “Do Not Sell” request.  Taking into consideration VPN usage and the transient nature of a browser’s tracking tools, a viable future solution will require a much more comprehensive technology framework – all without requiring users to create a new account.  See Cal. Civ. Code § 1798.135(a)(1)–(2) (precluding companies from requiring consumers to create a new account as a means of exercising their right to bar sales of data).
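To make the point concrete, consider a minimal sketch of how a business might honor a browser-based opt-out alongside a durable, account-based record.  The registry, identifiers, helper names, and hypothetical browser signal below are assumptions made purely for illustration – CCPA does not prescribe any of this – but the sketch shows why a signal that lives only in a transient browser, or behind a VPN, is hard to reconcile with a 12-month re-consent window.

```python
from datetime import datetime, timedelta

# Illustrative sketch only: the registry, identifiers, and window below are
# assumptions used to show the mechanics, not anything mandated by CCPA.

REQUEST_CONSENT_WAIT = timedelta(days=365)  # 12-month wait before asking again

# A durable, server-side record of "Do Not Sell" requests keyed to a known
# identity.  A record kept only in a cookie would disappear when cookies are
# cleared, the browser changes, or a VPN alters the apparent location.
opt_out_registry: dict[str, datetime] = {}

def record_do_not_sell(user_id: str, when: datetime) -> None:
    """Persist a consumer's 'Do Not Sell' request against a durable identifier."""
    opt_out_registry[user_id] = when

def sale_permitted(user_id: str, browser_opt_out_signal: bool) -> bool:
    """Honor either the durable record or a (hypothetical) browser opt-out signal."""
    return user_id not in opt_out_registry and not browser_opt_out_signal

def may_request_consent_again(user_id: str, now: datetime) -> bool:
    """A business may ask the consumer to re-authorize sales only after 12 months."""
    opted_out_at = opt_out_registry.get(user_id)
    return opted_out_at is None or (now - opted_out_at) >= REQUEST_CONSENT_WAIT
```

The design point is simply that the 12-month clock must attach to something more durable than a cookie or an apparent IP address, which is why a browser toggle alone cannot carry the requirement.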

Moreover, the CCPA’s Sisyphean task of creating a “Do Not Sell My Personal Information” link that is both “clear and conspicuous” and yet found on every page that collects personal information is not rectified in the Mactaggart 2020 Ballot Initiative.  Most compliant sites have simply added a link to every footer for site visitors with a California IP address – even one generated by a VPN – or have directed those visitors to a CCPA-specific page.  A footer link is hardly “clear and conspicuous,” yet that link, coupled with a linked page containing a “Do Not Sell My Personal Information” button, is now the regulatory-accepted compliance tool for this requirement.

CCPA recognizes that the consumer data problem relates to the vast amounts of information in the hands of a few data merchants.  There are also Congressional efforts focused on combating large bad actors by instilling more transparency in the data collection process.  Specifically, U.S. Sens. Mark R. Warner (D-VA), Josh Hawley (R-MO) and Richard Blumenthal (D-CT) have introduced “the Augmenting Compatibility and Competition by Enabling Service Switching (ACCESS) Act, bipartisan legislation that will encourage market-based competition to dominant social media platforms by requiring the largest companies to make user data portable – and their services interoperable – with other platforms, and to allow users to designate a trusted third-party service to manage their privacy and account settings, if they so choose.”  The bill expressly focuses on only those technology platforms with over 100 million monthly active users.  As stated in the press release, “Sens. Warner and Hawley have partnered on the DASHBOARD Act, legislation to require data harvesting companies such as social media platforms to disclose how they are monetizing consumer data, as well as the Do Not Track Act, which would allow users to opt out of non-essential data collection, modeled after the Federal Trade Commission’s (FTC) ‘Do Not Call’ list.”

As with CCPA, the Mactaggart 2020 Ballot Initiative, and other state initiatives, federal initiatives generally focus on transparency – helping consumers understand what data is collected and how; control – allowing consumers to reject usage of their personal data; and accountability – providing adequate data security and compliance coupled with consumer consents.  Ultimately, these are the three pillars of any privacy law worth enacting.  There is one more pillar, however, that has not gotten much attention yet is equally important.

A Right of Compensation

Lacking in current federal and state privacy laws and bills is the statutory pronouncement that consumers have an actual property right derived from their personal data.  Providing for a specific statutory property right that is fixed and delineated in a privacy law will make transparency, control and accountability much easier to enforce. 

The privacy community has long toyed with ascribing property rights to personal data.  See Julie E. Cohen, Examined Lives: Informational Privacy and the Subject as Object, 52 Stan. L. Rev. 1373, 1379 (2000) (“One answer to the question “Why ownership?” then, is that it seems we simply cannot help ourselves. Property talk is just how we talk about matters of great importance. In particular, it is how we talk about the allocation of rights in things, and personally-identified information seems “thingified” (or detached from self) in ways that other sorts of private matters—intimate privacy, for example—are not. On this view, the “propertization” of the informational privacy debate is a matter of course; it merely testifies to the enormous power of property thinking in shaping the rules and patterns by which we live.”).

Some have voiced preemptive opposition to any data ownership approach.  See generally Sarah Jeong, We don’t allow people to sell their kidneys. We shouldn’t let them sell the details of their lives, either, The New York Times (July 5, 2019) (“Legally vesting ownership in data isn’t a new idea. It’s often been kicked around as a way to strengthen privacy. But the entire analogy of owning data, like owning a house or a car, falls apart with a little scrutiny. A property right is alienable — once you sell your house, it’s gone. But the most fundamental human rights are inalienable, often because the rights become meaningless once they are alienable. What’s the point of life and liberty if you can sell them?”); Mark MacCarthy, Privacy Is Not A Property Right In Personal Information, Forbes (November 2, 2018) (“Some commentators new to the privacy debate are quick to offer what they think is a clever idea: assign property rights over personal information to the user and let the marketplace decide what happens next. Whether this idea is meritorious has big implications for how we think about things like data portability and consent.  Turns out it’s wrong.”). 

As referenced in Mark MacCarthy’s opinion piece, the notion of personal data as property conflicts with the reality that, for example, medical information can potentially be owned simultaneously by patients, medical schools, pharmacies, doctors, pharmaceutical companies, EMR software vendors, advertising companies and Internet service providers.  Opponents of data property rights also wonder how ownership rights – and the associated payments or veto power – would be allocated among the constituent owners.

It can also be argued that personal data continually percolates uncontrolled around the world and constitutes a “social good” that can never be owned by individuals.  For example, it is the underpinning of a good deal of medical research that ends up curing disease.  Information concerning a consumer’s interactions with others presumably also allows the other participants in those interactions to claim ownership of the related inferred data.  Moreover, it is easy to argue that the First Amendment should bar the creation of a data property regime given that it might stifle speech between parties.

The “social good” argument is likely the one with the strongest appeal.  For example, on November 11, 2019 The Wall Street Journal exposed Google’s “Project Nightingale” and Google’s resulting access to the health information maintained by Ascension – one of the nation’s leading health systems.  In a November 11, 2019 blog post, Google explained that the arrangement was meant to support Ascension “with technology that helps them to deliver better care to patients across the United States.”  What is noticeably absent from the blog post is whether Google will also obtain access to patient medical records in a deidentified or other manner.  This is noteworthy given that, last year, researchers at Google announced a way to predict a person’s blood pressure, age, and smoking status simply from an image of their retina.  In order to do so, however, Google first had to analyze retinal images from 284,335 patients.  Given that health research is obviously a “social good,” the use or sale of deidentified protected health information (PHI) has long been an accepted use of medical data.  Oregon’s failed Senate Bill 703 would have been the first law in the nation to require specific consent for the sale of deidentified PHI – data that is currently sold each year for billions of dollars.

No matter how they are ultimately couched, all of the paternalistic arguments against individuals having property rights in their data still miss the mark.  First, the fact that a privacy right may be perceived as “inalienable” – as it is under the California Constitution – does not mean there cannot be transferable “compensation units” derived from such rights.  Indeed, certain inalienable rights, e.g., the right to freedom and the right to property, are routinely suspended during trial and after conviction based on the voluntary commission of a crime.  This unfortunately happens every day throughout the country.  There is no reason a person should be precluded from voluntarily transforming certain ascribed rights into fungible ownership interests for a set duration and upon a specific set of circumstances.  No one currently corrals persons living on the streets claiming they need to assert their right to privacy, despite the fact that outdoor sleeping is obviously a knowing waiver of a right to privacy.  Similarly, persons every day voluntarily join affinity clubs to obtain rewards while trading away unknown personal data in an unknown surveillance arrangement.  Such conduct certainly does not mean the inalienable “right to privacy” was shredded and destroyed by such individuals.

The fact that multiple parties may claim ownership rights in the same personal data also does not negate the fact that an ownership regime can viably exist – only that it will require careful coordination and adequate technology to implement.  Moreover, any argument based on the First Amendment also misses the mark, in the same way that no one has a First Amendment right to produce a copyright-protected play without proper consent from the writer.

Providing consumers with the ability and “statutory right to trade one’s personal data” – even if the fair market value of such data might actually be quite minimal – is the specific ownership right that should be statutorily created.  Noted academics long ago suspected this might be the correct path to take.  See Julie E. Cohen, Examined Lives: Informational Privacy and the Subject as Object, 52 Stan. L. Rev. 1373, 1391 (2000) (“A successful data privacy regime is precisely one that guarantees individuals the right to trade their personal information for perceived benefits, and that places the lowest transaction cost barriers in the way of consensual trades. If individuals choose to trade their personal data away without placing restrictions on secondary or tertiary uses, surely it is their business. On this view, choice rather than ownership is (or should be) the engine of privacy policy. What matters most is that personal data is owned at the end of the day in the manner the parties have agreed.”) (emphasis added); id. at 1383 (“A relational approach to personally-identified data might, but need not, assign “ownership” or control of exchange based on possession.”); Richard A. Posner, The Right of Privacy, 12 Ga. L. Rev. 393, 394 (Spring 1977) (“People invariably possess information, including facts about themselves and contents of communications, that they will incur costs to conceal. Sometimes such information is of value to others: that is, others will incur costs to discover it. Thus we have two economic goods, “privacy” and “prying.” . . . An alternative [economic analysis of privacy] is to regard privacy and prying as intermediate rather than final goods, instrumental rather than ultimate values. Under this approach, people are assumed not to desire or value privacy or prying in themselves but to use these goods as inputs into the production of income or some other broad measure of utility or welfare.”) (emphasis added).

Current efforts at creating a statutory privacy regime can actually be considered precursors to a statutory “transactional property” approach.  Under CCPA:  “A business may offer financial incentives, including payments to consumers as compensation, for the collection of personal information, the sale of personal information, or the deletion of personal information.”  Cal. Civ. Code § 1798.125(b)(1).   Indeed, the healthcare privacy regime of HIPAA long understood the possibility PHI might be sold by a covered entity.  See 45 CFR § 164.508(a)(4)(i) (“Notwithstanding any provision of this subpart, other than the transition provisions in § 164.532, a covered entity must obtain an authorization for any disclosure of protected health information which is a sale of protected health information, as defined in § 164.501 of this subpart. (ii) Such authorization must state that the disclosure will result in remuneration to the covered entity.”).  Moreover, HIPAA even anticipates state statutes having greater protections.  See 45 CFR § 160.203 (There is an express exemption under HIPAA for State law when that “State law relates to the privacy of health information and is more stringent than a standard, requirement, or implementation specification adopted” under HIPAA).

A transactional property approach empowers consumers without placing unnecessary barriers on the “social good” use of data – indeed, monetization is already the trigger for certain of CCPA’s consumer rights.  Consumers could either choose to accept certain new statutory protections, e.g., the right to delete, or lease their data based on an economic model that would allow for the transparency needed to determine whether the data is even able to be sold.  If data is not actually salable, consumers should be limited in how they can prevent companies from using their data given the countervailing social good inherent in the free exchange of consumer data.  If there is no existing viable market for the consumer data in question, there should not be any associated requirement that a company pay any set amount for such data or be precluded from using such data in a deidentified format.  In other words, the burdens claimed by opponents of a property approach would be mitigated – consumers would only be given a piece of the pie and not the whole pie, and any purported “veto power” would never really come into existence.  Moreover, a regulatory framework that allows market dynamics to dictate the applicability of protections afforded to consumers is likely the fairest approach for consumers and data merchants alike.
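The decision rule just described can be sketched in a few lines.  The threshold, field names, and outcomes below are purely illustrative assumptions – nothing of the sort appears in CCPA or in any pending bill – but they show how market dynamics, rather than a blanket veto, could dictate which obligations attach.

```python
from dataclasses import dataclass

# Illustrative sketch only: the threshold, the notion of a "viable market," and
# the outcomes below are assumptions used to show how market dynamics could
# dictate which protections attach; none of this appears in CCPA or any bill.

@dataclass
class DataUseRequest:
    fair_market_value: float  # estimated per-consumer value of the data, in dollars
    deidentified: bool        # whether the data would be used only in deidentified form

VIABLE_MARKET_FLOOR = 0.01    # hypothetical floor below which no real market exists

def required_treatment(request: DataUseRequest) -> str:
    """Return the obligation that would attach under a transactional-property regime."""
    if request.fair_market_value >= VIABLE_MARKET_FLOOR:
        # A viable market exists: the Right of Compensation is triggered, and the
        # consumer may lease the data or elect statutory protections instead.
        return "compensation-or-statutory-election"
    if request.deidentified:
        # No viable market and deidentified use: the "social good" use proceeds
        # without a payment obligation.
        return "deidentified-use-permitted"
    # No market and identified data: the usual transparency and control rules apply.
    return "transparency-and-control-only"

# Example: data with no real resale market, to be used only in deidentified form.
print(required_treatment(DataUseRequest(fair_market_value=0.0, deidentified=True)))
```

Under a rule of this shape, the compensation obligation only ever arises where a real market for the consumer’s data already exists.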

Similar to the way the Mactaggart 2020 Ballot Initiative proposes the creation of a new California agency – namely the California Privacy Protection Agency (CPPA), which would cost $10 million to implement – it is suggested that a public benefit corporation ensure the necessary framework gets implemented.  In other words, unlike in California, where the CPPA would only buttress the enforcement and regulatory work done by the California Attorney General’s Office, a public Data Protection Corporation (DPC) would coordinate with the private sector to ensure the requirements of a privacy law are viable and can come to life.  Simply put, the creation of the DPC would ensure that the current compliance problems visited on companies subject to CCPA are not repeated.  There is analogous precedent for the creation of the DPC found in the environmental arena.

No one can dispute that one primary purpose of an environmental law is either to prevent potential toxins from infiltrating land, water and air or to remove and properly dispose of pollutants already released.  Addressing improperly used consumer data similarly needs a massive cleanup effort and can take a page from how environmental concerns were previously addressed in New York.  To that end, in 1970 the New York State Environmental Facilities Corporation (EFC) was created by the New York State Environmental Facilities Corporation Act.

As a public benefit corporation of the State, EFC is a corporate entity separate and apart from the State.  State law empowers the EFC to provide financing for certain environmental projects as well as “render technical advice and assistance to private entities, state agencies and local government units on sewage treatment and collection, pollution control, recycling, hazardous waste abatement, solid waste disposal and other related subjects.”  Indeed, as stated by the EFC on its website, its mission is to provide “expert technical assistance for environmental projects in New York State. . . . We promote innovative environmental technologies and practices in all of our programs.”

Similarly, the DPC would provide technical assistance in conformance with the enacting law’s mandate to protect consumer data.  At a basic level, there is never a need to grant access to all data, for all purposes, to all companies interested in consumer data.  Whether by evaluating current zero-knowledge proof solutions – where a verifier has “zero knowledge of” information unnecessary for an actual verification – or by determining the feasibility of certain self-sovereign identity solutions, the DPC can ultimately provide the necessary “secret sauce” for a successful privacy law.  Statutory efforts to legislate on privacy will forever be hamstrung if implementation technology remains an afterthought that will presumably sort itself out after a law is passed.  The goal of the DPC would be to ensure there are adequate technical means available to execute on the legislation passed – not to pick technology sides or inadvertently delay private sector efforts at technology development.
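The selective-access idea can be made concrete with a toy sketch.  The hash-commitment scheme below is an assumption used purely for illustration – a simplified stand-in for real zero-knowledge proof and self-sovereign identity tooling, not a production protocol – but it shows how a verifier can confirm a single attribute (here, state of residence) without ever being handed the rest of a consumer’s profile.

```python
import hashlib
import secrets

# Toy selective-disclosure sketch (illustrative assumption only; real
# zero-knowledge proof and self-sovereign identity systems are far more robust).

def commit(value: str) -> tuple[str, str]:
    """Commit to an attribute with a random salt; only the digest is published."""
    salt = secrets.token_hex(16)
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return digest, salt

def verify(digest: str, salt: str, claimed_value: str) -> bool:
    """Check a revealed attribute against its previously published commitment."""
    return hashlib.sha256((salt + claimed_value).encode()).hexdigest() == digest

# The consumer commits to an entire profile up front...
profile = {"state": "NY", "birth_year": "1980", "email": "person@example.com"}
commitments = {field: commit(value) for field, value in profile.items()}

# ...the verifier is given only the digests (e.g., via a trusted issuer)...
public_digests = {field: digest for field, (digest, _salt) in commitments.items()}

# ...and the consumer later discloses just the "state" field and its salt.
state_digest, state_salt = commitments["state"]
print(verify(public_digests["state"], state_salt, "NY"))  # True: residence confirmed, nothing else revealed
```

Real deployments would rely on vetted cryptographic protocols and credential formats; the DPC’s role would be to evaluate which of those are mature enough to satisfy a statute’s requirements.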

To sum up, the four major components needed in a successful privacy law should begin with the creation of a statutory “Right of Compensation” and end with the means to effectuate such a right: 

  1. Creation of a “transactional property right” in consumer data giving rise to a new Right of Compensation;
  2. Development of a compliance framework that would only apply to companies maintaining significant amounts of consumer data;
  3. Insertion of rights and obligations that focus on the three established privacy pillars of transparency, control and accountability; and
  4. Creation of a “Data Protection Corporation” – a public corporation largely tasked with ensuring that what is statutorily required is feasible from both a technological and market perspective.