All posts by Paul E. Paray

Wyndham Settles with FTC

Ending its epic battle, Wyndham entered into a settlement agreement with the FTC.  Under the terms of the Stipulated Order that was filed on December 9, 2015 with Judge Salas, Wyndham will establish a “comprehensive information security program designed to protect cardholder data – including payment card numbers, names and expiration dates.”  In addition, the company is required to “conduct annual information security audits and maintain safeguards in connections to its franchisees’ servers.”

These safeguards have a shelf-life of 20 years – common for FTC stipulated agreements involving data breaches.  What is noteworthy and distinct from other settlements, however, is that no money changes hands – Wyndham pays no fines, investigative costs, or any amount for that matter.  This overall result, especially in light of the Third Circuit ruling, can only be considered a solid victory for Wyndham.

Franchise operators also scored somewhat of a victory, given that the FTC finally provides some guidance as to what it considers a reasonable security program for franchise operators.  First, the FTC alerts companies that conforming to the most current Payment Card Industry Data Security Standard (PCI DSS) when certifying a security program is a step in the right direction toward implementing a satisfactory program.  Indeed, the settlement specifically defines its terms as per PCI DSS Version 3.1.  Not surprisingly, the second aspect of a suitable program requires the implementation of a risk-based approach to threat assessment.  As set forth in Section I.C of the Stipulated Order, Wyndham’s program must include “the design and implementation of reasonable safeguards to control the risks identified through risk assessment (including any risks emanating from the Wyndham-branded Hotels), and regular testing or monitoring of the effectiveness of the safeguards’ key controls, systems, and procedure.”

The agreed-upon requirements also apply to any “entity that that Defendant has any ownership interest in or controls directly or indirectly that may affect compliance obligations arising under this Order.”  The overall compliance features of the Stipulated Order mimic the discovery process available under the Federal Rules of Civil Procedure and will certainly be tested over the twenty-year term.  Such future testing, coupled with potential new breaches, may lead to future stipulated orders.  For the moment, however, Wyndham should be relieved by the results of its FTC skirmish – as well as happy with the work done by its counsel.

California Rakes in $25 Million from Comcast

On September 17, 2015, a California judge approved a final stipulated judgment between media giant Comcast and the California Public Utilities Commission.  Paragraph 17 of the Complaint, filed the same day, shows Comcast was not exactly accused of heinous conduct:  “for varying periods of time between July 2010 to December 2012, and for many customers the entire period, approximately 75,000 Comcast residential subscribers in California who had paid Comcast the monthly fee for a non-published or non-listed phone number nevertheless had their subscriber listing information published on Ecolisting, and (in some cases) in phone books, and/or made available by a directory assistance provider.”

In other words, Comcast customers who paid to avoid potentially being listed on sites such as whitepages.com were inadvertently deprived of that purchased service.  Specifically, because “the ‘privacy flag’ was not attached to the listings of approximately 75,000 non-published/non-listed subscribers, Neustar provided those listings to Comcast’s vendor, Microsoft FAST, who then published them for Comcast on the Ecolisting website.”  Complaint at ¶ 15.

No financial data was exposed.  No transaction or business data was exposed.  No medical data was exposed.  No emails or passwords were compromised.  Indeed, the only information exposed was the very same information that could be obtained by anyone doing a few sophisticated Google searches – names, addresses, and phone numbers.   For most people, such information exists online independently of any Comcast action or inaction.   In other words, whether or not Comcast properly withheld such information would not likely prevent someone from finding it online.

As part of the settlement, Comcast must pay $25 million in penalties and investigative costs to the California Department of Justice and the California Public Utilities Commission.  The 75,000 customers whose information was “compromised” ended up with refunds plus an additional $100 in restitution credited to their Comcast bills.

And, as part of the stipulated judgment, Comcast also agreed to a permanent injunction that requires the company to strengthen the restrictions it places on its vendors’ use of personal information about customers.  The injunction also requires Comcast to provide a new disclosure form to all customers that explains the ways in which it uses unlisted phone numbers and other personal information.  Such restrictions and added duties have little to do with the actual transgression in question — they represent added gimmes obtained by the California AG’s office given the leverage it had over Comcast.

This case is yet another wake-up call to companies maintaining or processing large amounts of customer data.  Even though the Comcast settlement is somewhat unique given the nature of the information as well as the “unlisting service” provided, other companies also safeguard what may otherwise be publicly available information.  When there are assurances made that such information will be safeguarded, does that automatically elevate the value of the information?

The larger question is how a transgression with no ostensible harm can mushroom into a $25 million payment to governmental agencies.  Until a General Counsel can answer that question with certainty, the only course of action is to treat all customer data equally and ensure that the reasonable precautions undertaken to safeguard such information match or exceed what is considered state-of-the-art for that company’s industry sector.

Third Circuit Affirms Judge Salas in FTC v. Wyndham

In a 47-page ruling, the United States Court of Appeals for the Third Circuit today affirmed an April 7, 2014 ruling of Judge Esther Salas against Wyndham Worldwide.  In affirming the district court, the Third Circuit left intact Judge Salas’s holding that the FTC has the power to regulate “unfair trade practices” based on Wyndham’s allegedly deficient data security.

The Third Circuit recast Wyndham’s argument and ultimately rejected what was potentially viable on appeal as “[t]oo little and too late.”  As recognized by the Court:

Wyndham repeatedly argued there is no FTC interpretation of § 45(a) or (n) to which the federal courts must defer in this case, and, as a result, the courts must interpret the meaning of the statute as it applies to Wyndham’s conduct in the first instance. Thus, Wyndham cannot argue it was entitled to know with ascertainable certainty the cybersecurity standards by which the FTC expected it to conform. Instead, the company can only claim that it lacked fair notice of the meaning of the statute itself – a theory it did not meaningfully raise and that we strongly suspect would be unpersuasive under the facts of this case.

In what was a sua sponte rejection of Wyndham’s “implied” argument that it was not provided with sufficient statutory notice of the century-old Federal Trade Commission Act, the Court of Appeals recognized:

Moreover, Wyndham is entitled to a relatively low level of statutory notice for several reasons. Subsection 45(a) does not implicate any constitutional rights here. [citation omitted] It is a civil rather than criminal statute. [citation omitted] And statutes regulating economic activity receive a “less strict” test because their “subject matter is often more narrow, and because businesses, which face economic demands to plan behavior carefully, can be expected to consult relevant legislation in advance of action.” [citation omitted]

In other words, one of Wyndham’s arguments deemed potentially viable, i.e., that it should not be held to a standard never actually put forth by the FTC in any prior ruling, will likely be rejected on summary judgment.    According to the Court, the relevant standard “considers a number of relevant factors, including the probability and expected size of reasonably unavoidable harms to consumers given a certain level of cybersecurity and the costs to consumers that would arise from investment in stronger cybersecurity.”  It is this applicable standard that the Court found Wyndham should have been on notice of prior to the FTC Complaint being filed against it.

In a section of the opinion that may come back to haunt Wyndham, as well as future victims of a major data incident, the Court was quite blunt in its assessment as to whether this statutory standard was potentially satisfied.  Id. at 41 (“Wyndham’s as-applied challenge is even weaker given it was hacked not one or two, but three, times. At least after the second attack, it should have been painfully clear to Wyndham that a court could find its conduct failed the cost-benefit analysis. That said, we leave for another day whether Wyndham’s alleged cybersecurity practices do in fact fail, an issue the parties did not brief. We merely note that certainly after the second time Wyndham was hacked, it was on notice of the possibility that a court could find that its practices fail the cost-benefit analysis.”).

The import of this decision obviously reaches well beyond the Third Circuit.  With the Third Circuit now the only appellate court to have affirmed the FTC’s authority to enforce what it considers applicable cybersecurity standards (“standards” that no other governmental body uses as aggressively as the FTC), the FTC will have even greater leverage in future settlement agreements.  Given the scorched-earth tactics taken during this litigation, it is possible Wyndham will ask the United States Supreme Court to weigh in.  There is certainly an argument to be made that Wyndham’s time and money would be better spent mending fences with the FTC.

UPDATE:   On the heels of this victory, the FTC announced on August 28, 2015 that it would hold a free “PrivacyCon” conference on January 14, 2016 at its Constitution Center offices.  According to the event description, PrivacyCon will “bring together a diverse group of stakeholders, including whitehat researchers, academics, industry representatives, consumer advocates, and a range of government regulators, to discuss the latest research and trends related to consumer privacy and data security.”  Given that there is a call for “presentations seeking original research on new vulnerabilities and how they might be exploited to harm consumers,” hopefully the attendee list for this free event does not have too many “John Smiths” listed.

NJDC Affirms FTC Regulatory Power Regarding Data Security Practices

Judge Esther Salas of the United States District Court for the District of New Jersey ruled today that a Section 5 action brought by the FTC was sustainable against Wyndham Worldwide Corporation (“Wyndham Worldwide”) as well as various corporate affiliates primarily involved in the franchise side of its business.  This decision re-affirmed the FTC’s power to regulate “unfair trade practices” based on companies’ failed data security.  Judge Salas denied a motion to dismiss an FTC action based on the alleged violation of both the deception and unfairness prongs of Section 5(a) “in connection with Defendants’ failure to maintain reasonable and appropriate data security for consumers’ sensitive personal information.”  Wyndham Worldwide also sought to dismiss the action on the ground that the consumer representations made by some corporate affiliates were not intended to apply to all corporate affiliates.

In what Wyndham Worldwide considered a matter of first impression, the Court rejected Wyndham Worldwide’s position that the FTC does not have authority to bring an unfairness claim involving lax data security.  Another allegedly unique aspect of this case turns on the fact that the corporate affiliate that initially sustained the data incident and made most of the representations in question (Wyndham Hotels and Resorts, LLC) was able to implicate its corporate parent.

This decision is a rare judicial affirmation of the FTC’s broad power to assert itself in the data protection activities of companies. Typically, the FTC simply obtains consent as a byproduct of a settlement agreement.  Hacked companies routinely acknowledge the FTC’s power in this regard.

Although this decision merely resolves a motion to dismiss, with liability issues left unresolved, privacy practitioners who visit with the FTC should review Judge Salas’s opinion and continue to track this matter.  Given the hard public positions taken by Wyndham and the FTC, this case may very well end up in the Third Circuit or even the Supreme Court – eventually leading to an appellate court potentially defining the exact contours of the FTC’s authority to regulate hacked companies.

New Jersey State AG Enters into COPPA Consent Order

Capitalizing on its federal grant of authority, the New Jersey state Attorney General’s Office recently resolved claims brought under the Children’s Online Privacy Protection Act (COPPA) and the state Consumer Fraud Act against Dokogeo, the California-based maker of the Dokobots app.  According to the Consent Order filed on November 13, 2013, the Dokobots app is a geo-location scavenger hunt game that encourages users to visit new locations and gather “photos and notes from the people they meet.” One major attribute of the app is its geo-tracking of users.  A product review of the app describes it as blending “playtime, learning, exploration, and creativity in a curiously enticing way.” The State’s position was that the app was directed to children by virtue of its use of animated characters and a “child-themed” storyline.

The Consent Order alleges that the app collects information, “including e-mail address, photographs, and geolocation information,” deemed personal information under COPPA, yet did not provide any neutral age-screening registration process to restrict its users to those over the age of 13. Moreover, there was no terms of use agreement, and the app’s privacy policy did not disclose that the app is restricted to users over the age of 13. Pursuant to the Consent Order, Dokogeo removed all photographs of children and location information from its website and agreed to more clearly disclose the information it collects.  As of November 24, 2013, the Dokobots site merely had a static home page – presumably because it is still in the process of implementing the terms of the Order.

The Consent Order also provides for a suspended fine of $25,000, which will be enforced only if Dokogeo fails to meet the terms of the Order.

This is the second such settlement reached by the New Jersey state AG’s office.  In July 2012, authorities announced a similar settlement against Los Angeles-based 24 x 7 Digital, LLC requiring the destruction of all children’s data that had previously been collected and transmitted to third parties.  That action was commenced by way of Complaint filed in June 2012.

It is not unusual for state AGs to commence COPPA actions against out-of-state companies.  In fact, a state AG action under COPPA was brought years ago by the Texas AG against a Brooklyn-based company for improperly collecting personal information such as names, ages, and home addresses from children.  What is interesting about the Dokogeo case, however, is that the underlying statute requires that “the attorney general of a State has reason to believe that an interest of the residents of that State has been or is threatened or adversely affected. . . .” 15 USC § 6504 (emphasis added). Other than merely reciting the statute, no actual finding was made or referenced by the New Jersey AG’s office regarding the impact to New Jersey residents.  In fact, Dokogeo defended by arguing the app was intended for adults, and there was no discussion by either side regarding New Jersey users.

App developers are well advised to appreciate two basic lessons from Dokogeo. If an app appears to target children, developers should comply with COPPA – especially given FTC guidelines involving the collection of geo-data and the use of photographs.  And, if they do not comply, they should be prepared to defend against those state AGs who are not averse to spending state dollars pursuing an enforcement action.

Plaintiffs’ Bar Hit Hard by Recent CMIA Decision

Insurers providing privacy liability coverage were collectively breathing a sigh of relief last week given a decision from the California Court of Appeal.  Interpreting the California Confidentiality of Medical Information Act (CMIA), the court in Regents of the Univ. of Cal. v. Superior Court of Los Angeles County, No. B249148 (Cal. Ct. App. October 15, 2013), significantly limited the ability of plaintiffs to obtain nominal statutory damages of $1,000 per patient under CMIA.  For the past several years, CMIA was pretty much the best game in town when it came to data breach litigation.  Although the statute was enacted in 2008, it was only over the past several years that plaintiffs’ counsel successfully used CMIA to obtain settlements previously unattainable post-breach.  The CMIA “statutory damages” bonanza reaped by class counsel was significant – the prospect of such damages allowed counsel to overcome Article III and other “lack of injury” arguments, potentially allowed for class certification even with an otherwise uneven plaintiff pool, and created an early incentive to settle on the part of a defendant – and its insurer – given the potential size of an award.

It is no surprise CMIA was the bane of a good number of network security and privacy insurers – it led to settlements that would not have otherwise occurred.  The Regents decision is noteworthy because it is the first appellate decision to address the availability of CMIA statutory damages, and it rejected the notion that mere negligence coupled with disclosure could trigger those damages.  This is a major departure from how the law was interpreted by the lower courts and instantly dried up a good part of the statutory damages manna enjoyed by plaintiffs’ counsel.

The facts of the case would provide a nice law school hypothetical – a doctor’s home is burglarized and his encrypted external drive is stolen – and, just for good measure, he cannot locate the note card containing the drive’s password.  Was there unauthorized access to the stolen information?  A CMIA private right of action allowing for statutory damages turns on whether “negligence results in unauthorized or wrongful access to the information.”  It is easy to assume the theft will result in unauthorized access when the password may have been stolen along with the hard drive – especially when the stolen drive is never found.

After reviewing the statute’s legislative history and related laws, the Court of Appeal strictly construed the statute to allow for nominal, or statutory, damages of $1,000 – but only when there was actual “unauthorized or wrongful access to the information.”  Given that the class plaintiff was unable to allege her information was improperly viewed or otherwise accessed, the superior court was ordered to dismiss the case.

In effect, the Court of Appeal significantly neutered CMIA by requiring actual improper access to a patient’s medical information.  In the most likely breach scenarios, ID theft and “actual access” go hand in hand.  Armed with evidence of potential or actual ID theft, most plaintiffs’ counsel would withstand some level of motion practice – with or without CMIA.  In other words, the benefit derived from CMIA’s nominal damages provision may have dwindled to some potential commonality assistance during a class certification motion.

Although it remains to be seen whether insurers will lower healthcare privacy premiums due to this one decision, one thing is certain – claims adjusters will have “a little” extra free time on their hands.

Is New Jersey Seeking to Become the New California When It Comes to Privacy?

By way of a recent opinion of the New Jersey Supreme Court, New Jersey became the first state to establish a constitutional right to privacy in cell-phone location information, thereby precluding law enforcement from retrieving such information without a warrant or exigent circumstances.  See State v. Earls, No. A-53-11, slip op. (N.J. July 18, 2013) (unanimous opinion).

Recognizing that its decision “creates a new rule of law that would disrupt the administration of justice if applied retroactively,” the Court limited its ruling to the subject defendant and prospective cases only.  Interestingly, the Court did not even make a passing reference to a 2011 New Jersey appellate decision that previously ruled no privacy tort existed for the surreptitious use of a location tracking device on a car.  The Earls case is the first appellate case to build on the United States Supreme Court’s GPS decision in United States v. Jones or to address in great detail the proliferation and use of location-based information.

The Court in Earls recognized that “[w]ith increasing accuracy, cell phones can now trace our daily movements and disclose not only where individuals are located at a point in time but also which shops, doctors, religious services, and political events they go to, and with whom they choose to associate.”  Not surprisingly, the Court also realized “that cell-phone location information can be a powerful tool to fight crime.”

Relying on the New Jersey Constitution, however, the Court reasoned that individuals expect that information provided to a third party in order to procure services will be used by the recipient – in this case a telephone company – only to provide the services in question.  In addition to this affirmative expectation of privacy, there is a concomitant expectation that this information will not be handed over to the government.

New Jersey’s landmark decision comes on the heels of one state legislator’s proposal of an amendment to the New Jersey Constitution stating that “people have a right to privacy from government intrusion, unless the government follows the due process of law.”  In addition to a proposed Constitutional amendment, Assemblywoman Handlin is also the sponsor of six bills and another resolution that address a person’s right to privacy as well as the freedom of the press:

A-4305:  prohibits the improper release of photographs or videos captured by security cameras or other recording devices operated by public entities.

A-4306:  prohibits a governmental entity from obtaining a biometric identifier of an individual without that individual’s consent. The bill does not prohibit any law enforcement agency from obtaining biometric identifiers of someone who has been placed under arrest. A “biometric identifier” is a retina or iris scan, fingerprint, voiceprint or DNA.

A-4307:  makes it a crime of the third degree to knowingly obtain or disclose personally identifiable health information in violation of the federal health privacy rule.

A-4308:  increases the penalties for the unlawful disclosure or use of taxpayer information by State tax officials, providing enhanced deterrence against violations of taxpayer confidentiality.

A-4309:  requires a Superior Court judge to approve the installation of any video camera by a public entity.

A-4310:  requires an administrative agency to include a privacy impact statement when adopting, amending, or repealing a rule.

ACR-201:  requests the President and Congress enact a federal shield law for journalists. A shield law would grant journalists notice and an opportunity to be heard in federal court in order to challenge a federal subpoena seeking phone records or other information identifying a source. Federal bills S.987 and H.R.1962, both titled the “Free Flow of Information Act of 2013,” were introduced in May 2013. The bills would establish the federal shield law.

Sandwiched between these privacy-protective efforts is a bill aimed at safeguarding the social media accounts of employees.  Before it was conditionally vetoed by Governor Christie in May 2013, New Jersey was teetering on passing the most onerous law in the country regarding employee social media protections – allowing for a private right of action and seeking to bar employers from even asking whether an employee has a social media account.  As it currently stands, the bill – if it is ever finally signed by the Governor – will still be among the stronger such laws.

Despite its recent efforts, New Jersey still has a great deal of heavy lifting to do before it can catch up with the land of SB 1386 – California already has a constitutionally guaranteed right to privacy, over seventy privacy-related laws on the books, and multiple regulatory agencies set up to enforce these laws.  It is no surprise that Attorney General Kamala Harris’s recent report opens with the words:  “California has the strongest consumer privacy laws in the country.”

Privacy and Civil Liberties Oversight Board will conduct a public hearing on July 9, 2013

As announced in a recently published public notice, the Privacy and Civil Liberties Oversight Board (“the Board”) will conduct a public hearing on July 9, 2013.  According to this notice, “invited experts, academics and advocacy organizations” will discuss “surveillance programs operated pursuant to Section 215 of the USA PATRIOT Act and Section 702 of Foreign Intelligence Surveillance Act.”  Members of the public are invited to participate.   The Washington, D.C. location of the event has yet to be determined.

By way of background, the Board consists of five members appointed by the President and confirmed by the Senate.   The Board was first created in 2004 within the executive branch but became an independent agency several years later.  Until the current NSA surveillance leak, the Board had been quiet, to say the least.  Apparently, there is no record of the Board members meeting more than once, and President Obama met with Board members only days ago for the first time.  Notwithstanding the Board’s relative inexperience working as a team, this hearing will be of great interest, if for no other reason than that the recently unveiled NSA surveillance programs may be only the tip of the proverbial iceberg.

As reported in the UK press – by the very same paper that broke and explored the Snowden leak – “Britain’s spy agency GCHQ has secretly gained access to the network of cables which carry the world’s phone calls and internet traffic and has started to process vast streams of sensitive personal information which it is sharing with its American partner, the National Security Agency (NSA).”  It would be nice if the American press were interested in taking a ride on this UK investigative bandwagon.  Maybe after July 9, 2013, it finally will.

Update:  July 15, 2013

Here’s the transcript of this very lively public workshop.

California’s Right to Know Law Put on Hold

As reported by the LA Times, “a powerful coalition of technology companies and business lobbies that included Facebook, Inc., Google, Inc., the California Chamber of Commerce, insurers, bankers and cable television companies as well as direct marketers and data brokers” was able to stop a California bill aimed at giving consumers greater insight into the use of their personal data.

First introduced in February by Assemblywoman Bonnie Lowenthal (D-Long Beach), the proposed Right to Know Law (AB 1291) would have implemented major revisions to existing law and created new rights for consumers.  Specifically, the proposed law would require

any business that has a customer’s personal information, as defined, to provide at no charge, within 30 days of the customer’s specified request, a copy of that information to the customer as well as the names and contact information for all 3rd parties with which the business has shared the information during the previous 12 months, regardless of any business relationship with the customer.

This new level of transparency might have helped soothe consumer concerns.  According to a 2012 USC Dornsife/Los Angeles Times poll, “82 percent of Californians said they are ‘very concerned’ or ‘somewhat concerned’ about Internet and smartphone companies collecting their personal information.”   On the other hand, providing a full and accurate accounting of who had access to a consumer’s data – even to only the small percentage of consumers who would actually take the time to request it – would have been a major undertaking for a wide range of companies.  It is not surprising that the companies that fought so hard to pull the plug on this bill represent a very diverse coalition of businesses.

Even if this bill does not get revived in a new form sometime in the future, the prospect of what it might have brought to the table should serve as a wake-up call to those businesses deep into online behavioral advertising.  It may be time to better understand just who has access to what information – and it may not eventually matter whether that information belongs to a current client or consumer or whether it was anonymized.  As usual, staying in front of the regulatory curve remains a sound business practice.

Financial Correlation of Privacy Rights

In Letting Down Our Guard With Web Privacy, published on March 30, 2013, the author details ongoing research being conducted by Alessandro Acquisti, a behavioral economist at Carnegie Mellon University.  Mr. Acquisti’s research is cutting edge when it comes to online behavioral advertising (OBA) and associated consumer behavior.  Indeed, he is the academic who famously announced in 2011 that one might be able to discover portions of someone’s Social Security number simply by virtue of a posted photograph.   His research often distills to one major premise – consumers may not always act in their best interests when it comes to online privacy decisions.

It appears consumers and merchants alike may be missing out on fully cultivating a very valuable commodity.  According to the World Economic Forum, “personal data represents an emerging asset class, potentially every bit as valuable as other assets such as traded goods, gold or oil.”  Rethinking Personal Data:  Strengthening Trust, at 7, World Economic Forum Report (May 2012).  Before this asset class can ever be completely exploited and fully commercialized, however, its constituent value components must be correlated by all in the privacy food chain.

Over three decades ago, it was recognized that the three pillars of privacy – the very foundation of personal data – secrecy, anonymity, and solitude, were distinct yet interrelated.  See Gavison, Ruth, Privacy and the Limits of Law, 89 The Yale Law Journal 421, 428-429 (1980) (“A loss of privacy occurs as others obtain information about an individual, pay attention to him, or gain access to him. These three elements of secrecy, anonymity, and solitude are distinct and independent, but interrelated, and the complex concept of privacy is richer than any definition centered around only one of them.”).

Current OBA has made these three privacy pillars difficult for consumers to value, manage, and isolate when online – it is generally not up to consumers whether they will be fed an ad based on previous website visits or purchases; it will just happen.  Indeed, according to a survey of 1,000 persons conducted by Ipsos Public Affairs and released by Microsoft in January 2013, forty-five percent of respondents felt they had little or no control over the personal information companies gather about them while they are browsing the Web or using online services.  This view may not be unfounded given that data routinely gathered online, e.g., operating system, browser, IP address, persistent cookies, last used server, can be used to divulge the activity of individual devices.
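The way those routine data points can single out a device is easy to illustrate.  The following is a minimal sketch, assuming a hypothetical tracker that simply concatenates and hashes the attributes listed above into one identifier; real ad networks rely on many more signals and far more sophisticated probabilistic matching, so this is only meant to show why a handful of seemingly innocuous values can be enough to link activity to a single device.

```python
import hashlib

def device_fingerprint(operating_system: str, browser: str, ip_address: str,
                       cookie_id: str, last_server: str) -> str:
    """Combine routinely collected attributes into one stable identifier.

    Purely illustrative: real trackers use many more signals (fonts, screen
    size, plugins) and more robust matching logic than a simple hash.
    """
    raw = "|".join([operating_system, browser, ip_address, cookie_id, last_server])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

# Two visits presenting the same attribute values yield the same identifier,
# which is all that is needed to tie browsing activity back to one device.
visit_1 = device_fingerprint("Windows 8", "Firefox 21", "203.0.113.7", "abc123", "cdn-east-2")
visit_2 = device_fingerprint("Windows 8", "Firefox 21", "203.0.113.7", "abc123", "cdn-east-2")
assert visit_1 == visit_2
```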

The privacy trade-offs being researched by Mr. Acquisti and others offer insight into the true value of these data constituents.  Consumers who try to “shut off” or render anonymous access to their device’s data or settings would not only likely fail in their attempt at being anonymized, they would also lose access to most social media and other websites requiring browsers to accept cookies, as well as to product offers that are presumably of interest.  Indeed, this coordinated tracking of consumers is not even unique to the Internet.   See generally Bibas, Stephanos, A Contractual Approach to Data Privacy, 17 Harv. J. Law & Public Policy 591 (Spring 1994) (“Although the ready availability of information helps us to trust others and coordinate actions, it also lessens our privacy. George Orwell presciently expressed our fear of losing all privacy to an omniscient Big Brother.  Computers today track our telephone calls, credit-card spending, plane flights, educational and employment records, medical histories, and more.  Someone with free access to this information could piece together a coherent picture of our actions.”).  There are even companies that bridge the gap between offline and online activities by taking in-store point of sale purchases and converting such data to an anonymous online cookie ID that will eventually be used online by clients.  Such use of in-store data is generally permissible under a retailer’s loyalty program.
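How such an offline-to-online “bridge” might work can be sketched in a few lines, assuming a purely hypothetical onboarding vendor that applies a keyed hash to a loyalty-program e-mail address.  The key, function name, and matching logic below are illustrative assumptions rather than a description of any actual company’s process; the point is simply that a deterministic hash lets in-store purchases be joined to an online cookie profile without the raw identifier itself being shared.

```python
import hashlib
import hmac

# Hypothetical secret held by the data-onboarding vendor; key management and
# record matching are far more involved in practice than this sketch suggests.
ONBOARDING_KEY = b"example-secret-key"

def loyalty_to_cookie_id(loyalty_email: str) -> str:
    """Convert an in-store loyalty identifier into a pseudonymous online ID.

    Illustrative only: a deterministic keyed hash lets a vendor match offline
    purchases to an online cookie without passing along the raw e-mail, though
    the result is pseudonymous rather than truly anonymous.
    """
    normalized = loyalty_email.strip().lower()
    return hmac.new(ONBOARDING_KEY, normalized.encode("utf-8"), hashlib.sha256).hexdigest()

# The same shopper yields the same ID at the register and in the browser,
# allowing in-store purchase history to be attached to an online ad profile.
offline_id = loyalty_to_cookie_id("Shopper@Example.com")
online_id = loyalty_to_cookie_id(" shopper@example.com ")
assert offline_id == online_id
```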

Current law does not generally prevent someone from collecting public information to create consumer profiles – nor is there a right to opt out of having your public record information sold or shared.  And, when one wants to self-determine whether data will be disclosed or whether he or she will be “untraceable,” “anonymous,” or “left alone,” there may not always exist an easy way to prevent these rights from being exploited – and there is certainly no way to obtain a direct financial gain in return for relinquishing such privacy rights.  Instead, there has generally been a “privacy for services” marketing/advertising arrangement that has been accepted by consumers – an arrangement which, in fact, has helped pay for and fuel the growth of the commercial Internet.

The current OBA ecosystem does not posit a “loss of privacy” so much as it offers a bartering system in which one party merely feels the value of what is being bartered away while the other party quantifies and monetizes it through cascading transactions.  In other words, it is not a financial transaction for the consumer.  Those who are able to find an entertaining online video or locate a product using a search engine do not really mind that an ad will be served to them while visiting some other website, given they feel this loss of privacy is worth the value of the services being provided.

Ironically, the interactive advertising industry itself may believe it is collecting too much sensitive consumer data.  According to a study conducted by the Ponemon Institute, 67 percent of responding online advertisers believe “limiting sensitive data collection for OBA purposes is key to improving consumer privacy and control when browsing or shopping online.” Leading Practices in Behavioral Advertising & Consumer Privacy:  A Study of Internet Marketers & Advertisers, at 2, The Ponemon Institute (February 2012).

As recognized by privacy researchers, “[e]mpirical evidence on the behavioral effects of privacy is rather scarce.”  Regner, Tobias; Riener, Gerhard, Voluntary Payments, Privacy and Social Pressure On The Internet: A Natural Field Experiment, DICE Discussion Paper, No. 82 (December 2012) at 6.  Although “some consumers are willing to pay a premium to purchase from privacy protective websites,” there is no measure of what that premium should be or how widespread a factor it is for consumers as a whole.  Id. at 7.

More often than not, consumers have been “often willing to provide personal information for small or no rewards.”  Losses, Gains, and Hyperbolic Discounting: An Experimental Approach to Information Security Attitudes and Behavior, presented by Alessandro Acquisti and Jens Grossklags at the 2nd Annual Workshop on Economics and Information Security, College Park, Maryland, May 2003, at 4.

This does not mean researchers have not tried to quantify a “privacy valuation” model.  In 2002, a Jupiter Research study found 82% of online shoppers willing to give personal data to new shopping sites in exchange for the chance to win $100.  Cf. Tsai, Janice; Egelman, Serge; Cranor, Lorrie; Acquisti, Alessandro, The Effect of Online Privacy Information on Purchasing Behavior: An Experimental Study, Information Systems Research (February 2010) at 22 (describing survey results which conclude that “people will tend to purchase from merchants that offer more privacy protection and even pay a premium to purchase from such merchants.”); Beresford, Alastair; Kübler, Dorothea; Preibusch, Sören, Unwillingness To Pay For Privacy: A Field Experiment, 117 Economics Letters 25 (2010) (“Thus, participants predominantly chose the firm with the lower price and the more sensitive data requirement, indicating that they are willing to provide information about their monthly income and date of birth for a 1 Euro discount.”).

In his 1994 paper, A Contractual Approach to Data Privacy, Stephanos Bibas suggests that individual contracts may provide the best solution to the privacy compensation dilemma:  “In the hands of the contracting parties, however, flexibility allows people to control their lives and efficiently tailor the law to meet their needs. Flexibility is the market’s forte; the pricing mechanism is extremely sensitive to variations in valuation and quickly adjusts to them.”  Bibas, 17 Harv. J. Law & Public Policy 591 (Spring 1994).   Mr. Bibas, however, recognized the limits of privacy transactions that rely only on static privacy trades.  In other words, a model that might be effective is one that customizes the financial rewards to consumers based on a continuous exchange of information between the consumer and merchant.

One problem most consumers face when using commonly marketed solutions meant to safeguard their privacy is that those solutions fail to also create an acceptable value proposition for merchants.  As well, those recently formed companies promising a private web experience will not be able – nor should they even try – to stop firms from using OBA to reach consumers.  For the foreseeable future, OBA will continue to drive the Internet and “pay” for a much richer and more rewarding consumer experience than would otherwise exist.  It may one day be determined, however, that an even more effective way to satisfy all constituent needs of the OBA ecosystem (consumer, merchant, publisher, agency, etc.) is to find a means to directly correlate privacy rights, consumer data, and a merchant’s revenue.