Category Archives: Behavioral Advertising

Carpenter may prod monetization of consumer data property rights

On November 29, 2017, the United States Supreme Court heard oral argument in U.S. v. Carpenter – a case involving robbery suspects who were convicted using cellphone tracking data obtained without a probable cause warrant.  Subpoenas and warrants available under the Stored Communications Act (“SCA”) allow for access to such records without any probable cause showing.    As previously pointed out, the ACLU is looking to push the Supreme Court into making a technology-forward decision by stressing how data collection methods have improved since the 2011 arrest of Carpenter.

According to Law360, Justice Samuel Alito said at the hour-long oral argument:  “I agree with [Carpenter] that this new technology is raising very serious privacy concerns, but I need to know how much of existing precedent you want us to overrule or declare obsolete.”  Justice Alito was referencing the third-party doctrine, which offers no added protection to material freely given to third parties because such material is generally provided without any expectation of privacy.

At oral argument, Law360 reports Carpenter’s counsel Nathan Wessler of the ACLU said that the bank records and dialed phone numbers found in third-party doctrine cases were “more limited” and freely given to a business as opposed to cellphone location records, which many users don’t understand can “chart a minute-by-minute account of a person’s locations and movements and associations.”

Law360 also reported that Justice Sonia Sotomayor raised doubt that the third-party doctrine found in prior precedent was applicable, given there are instances when sensitive data freely given to third parties – such as medical records – still requires consent.  According to Law360, Justice Neil Gorsuch said:  “It seems like your whole argument boils down to if we get it from a third party we’re OK, regardless of property interest.”   And, finally, according to SCOTUSblog, Justice Stephen Breyer recognized at oral argument: “This is an open box. We know not where we go.”

Despite the third-party doctrine, it seems the Court is leaning towards carving out constitutional exceptions to the SCA based on data-gathering technologies that may give rise to an expectation of privacy.   As they often do, the Justices will likely come up with a result that respects stare decisis while meshing with new technological capabilities far removed from earlier cases.   As recognized by Justice Sotomayor in the U.S. v. Jones case of 2012, “it may be necessary to reconsider the premise that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties.  This approach is ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks.”

To that end, the most interesting aspect of this case involving robberies in Detroit will be how far the decision goes in helping define property rights for consumers of digital services.  In a nod to Justice Breyer’s Pandora’s Box allusion, this decision might eventually give rise to a newfound consumer awareness mandating a change in how consumer data is used by companies.  In other words, property rights acknowledged in this case may help prod consumers into seeking compensation for their consumer data property rights – something the tech amici might not have envisioned when filing their brief in U.S. v. Carpenter.

CA lawmakers do not pass AB 375 – The California Broadband Internet Privacy Act

Succumbing to the pressure of heavy lobbying, the proposed California Broadband Internet Privacy Act was shelved early this morning by the California Senate.

If enacted, the law would have, beginning in 2019, barred ISPs from monetizing consumer browsing data without first obtaining consumer consent.  In essence, large ISPs such as AT&T and Verizon would have been barred from refusing to provide service, or limiting service, if customers did not waive their privacy rights.  It would also have barred them from charging customers a penalty or offering discounts in exchange for waiving privacy rights.

By way of background, the FCC earlier this year pulled back those Obama-era regulations that impacted ISPs – regulations that completely ignored the data collection practices of companies such as Google and Facebook, which were not subject to FCC regulation.  The “net neutrality” red herring previously used by lobbyists to protect those tech companies held that ISPs deserve different treatment because they curtail broadband usage for certain segments of society – alleging that ISPs closed off the Internet for many in poorer rural communities.

Current FCC policy, however, maintains rules that protect the “openness” of the Internet.  And, the stated intent of the FCC’s May 2017 pullback was a desire to implement a “light-touch regulatory framework” that immediately leveled the field, allowed the FTC to continue enforcing privacy infractions, and ultimately deferred for a later date the exact parameters of any federal consumer privacy consent law.

Currently, the privacy infractions of companies like Google and Facebook are policed by the FTC and not the FCC, so having the FTC also focus on ISPs is perfectly natural within our current regulatory scheme.  After all, the only reason large ISPs such as AT&T, Cox and Verizon even came under the FCC’s purview was because they are also telecom and cable operators.  To use this FCC front door to regulate the backdoor Internet businesses of telecom and cable operators was always forced and unnatural.  Indeed, this very public dispute between ISPs and website owners was itself a subterfuge.  Not surprisingly, AB 375 was also opposed by Google and Facebook because “expanded privacy regulations could indirectly affect the websites’ own ability to gather and monetize user data.”

As accurately stated by a libertarian blog:  “By framing this as a dispute between ISPs and websites — instead of accurately presenting it as a struggle between Internet users and anyone who would mine and sell their data, the powers that be (including lawmakers, bureaucrats, corporations, and the media) have muddied the waters to conceal a simple fact: This is actually a struggle between those who value their privacy and those who would profit by violating it.”

Perhaps fearing the demise of AB 375, a California ballot initiative proposed on September 1, 2017 would allow California consumers to know what personal information businesses are collecting from them as well as how that information is used.  As it stands, consumers obviously have no clue who ultimately processes, uses or outright purchases their data.  The California Consumer Privacy Act of 2018 will be placed on the November 2018 statewide ballot if it obtains 365,880 valid voter signatures.  This ballot initiative goes further than AB 375 given it would apply to any business that collects and deals in data for commercial purposes and not just ISPs.

The apparent premise behind this ballot initiative is that there is no longer any such thing as anonymous data – it only takes about 10 visited URLs in total to uniquely identify someone, and there certainly is no difference between what a Google or an AT&T ultimately does with consumer data.  As it stands, relatively few use a Firefox browser set to its highest privacy setting or a Privacy Badger extension to keep Google scripts from running Google Analytics.  Even fewer users forgo Google in favor of the donation-funded DuckDuckGo search tool, which allows users to browse the web without having their searches stored.
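The re-identification claim above can be illustrated with a toy sketch (the site lists and user names are hypothetical, invented for illustration; a real study would use actual browsing histories): even a short list of visited sites, hashed in an order-independent way, acts as a stable pseudo-identifier.

```python
import hashlib

def history_fingerprint(urls):
    # Order-independent hash of a browsing history: the same set of
    # visited sites always yields the same identifier, so an "anonymous"
    # history can be matched across datasets.
    canonical = "|".join(sorted(set(urls)))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical ten-site histories for two different users;
# Bob's differs from Alice's by a single site.
alice = ["news.example", "mail.example", "bank.example", "blog.example",
         "shop.example", "maps.example", "video.example", "chat.example",
         "forum.example", "wiki.example"]
bob = alice[:9] + ["recipes.example"]
```

Shuffling the same ten sites leaves the fingerprint unchanged, while a single differing site produces a completely different one – which is why “anonymized” browsing data is so readily linkable.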

As suggested years ago:  “It may one day be determined, however, that an even more effective means to satisfy all constituent needs of the [online behavioral advertising] ecosystem (consumer, merchant, publisher, agency, etc.) will be to find a means to directly correlate between privacy rights, consumer data, and a merchant’s revenue.”

Until such direct financial correlation takes place – with the ensuing compensation to consumers – the true value of consumer data will never be known.  Very likely, companies that continue pilfering something consumers do not properly value will never do as well as companies that actually pay for what they want.

Given that any mass consumer education necessary to prod these issues forward will rely on online tools provided by the companies with the most to gain or lose, the only immediately viable solution necessarily requires agreement from the likes of Google and Facebook.  Unfortunately, given current circumstances, there simply is no financial incentive for these companies to rock a very lucrative boat.





AGs move against Google’s latest cy pres settlement

Without tackling the underlying merits of the case, the Attorneys General of Alaska, Arizona, Arkansas, Louisiana, Mississippi, Missouri, Nevada, Oklahoma, Rhode Island, Tennessee, and Wisconsin asked the Third Circuit to reverse approval of a $5.5 million settlement involving consumer privacy claims against Google.   Relying on Fed. R. Civ. P. 23(e)’s prohibition against unfair settlements, the AGs argued in their July 5, 2017 brief that the proposed cy pres settlement fund would be unfair given consumers would not receive a dime from the settlement.

In their brief, the AGs point out that because “class members extinguish their claims in exchange for settlement funds, the funds belong to class members.”  Brief at 5.  And, simply giving these proceeds to various privacy rights groups chosen by Google and class counsel would be unfair to the actual class members.

The underlying multidistrict lawsuit – which was previously before the Third Circuit (In re: Google Inc. Cookie Placement Consumer Privacy Litigation), was filed in 2012 and alleges that Google deliberately circumvented default privacy settings used to prevent advertisers from tracking the browsing activities of persons using Safari and Internet Explorer.

Google is no stranger to cy pres funds pegged at $5.5 million.  In August 2016, Google settled a privacy suit by paying $5.5 million into a cy pres fund benefiting some of the same privacy groups looking to benefit from this latest settlement.  And, years earlier, Google and Quantcast settled yet another privacy matter by way of a cy pres fund.

A cy pres fund provides the best of both worlds for defendants such as Google – it allows resolution of costly disputes while funding non-profit organizations that ultimately help their cause.  Moreover, defendants have willing partners in class counsel, given it really does not matter whether an unnamed class plaintiff sees compensation so long as the settlement is approved and counsel’s fees are paid.  Hopefully, the United States Court of Appeals for the Third Circuit issues a well-reasoned opinion that guides courts around the country on this very troublesome practice.

FTC settles major IoT privacy case with smart TV maker VIZIO

On February 6, 2017, smart TV maker VIZIO entered into a stipulated Order granting injunctive relief and a monetary judgment to the FTC and New Jersey Division of Consumer Affairs.  The FTC brought its claims pursuant to Section 13(b) of the Federal Trade Commission Act, 15 U.S.C. § 53(b), and the New Jersey DCA brought claims pursuant to the New Jersey Consumer Fraud Act, N.J. Stat. Ann. § 56:8-1 et seq.  VIZIO and a subsidiary will pay $2.2 million to settle claims that the companies improperly tracked consumers’ viewing habits and sold this information without compensating viewers.  According to the Complaint filed the same day as the stipulated Order, VIZIO and its subsidiary have, since February 2014, continuously collected viewing data on a “second-by-second” basis without any notice to the consumer.  Complaint at ¶ 14.  This action comes on the heels of the FTC’s smart TV workshop this past December.

Pursuant to the Order, all viewing data obtained by VIZIO prior to March 1, 2016 must be destroyed.  As for obtaining future viewing data, VIZIO must first prominently disclose to the consumer, separate and apart from any “privacy policy” or “terms of use” page: “(1) the types of Viewing Data that will be collected and used, (2) the types of Viewing Data that will be shared with third parties; (3) the identity or specific categories of such third parties; and (4) all purposes for Defendants’ sharing of such information.”  And, VIZIO will be able to collect such information only after the consumer affirmatively consents to such collection.
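The Order’s consent mechanics can be sketched as a simple gate (a hypothetical sketch – the class, field, and disclosure names below are invented for illustration, not drawn from any VIZIO code): collection is permitted only after all four disclosures have been made and the viewer has affirmatively opted in.

```python
from dataclasses import dataclass, field

# The four disclosures the Order requires before any collection
REQUIRED_DISCLOSURES = {
    "types_of_viewing_data_collected",
    "types_of_viewing_data_shared",
    "third_party_identities_or_categories",
    "purposes_of_sharing",
}

@dataclass
class ViewerConsent:
    disclosures_shown: set = field(default_factory=set)
    opted_in: bool = False

def may_collect_viewing_data(consent: ViewerConsent) -> bool:
    # Affirmative consent only counts after every required disclosure
    # has been shown; absent either, collection stays barred.
    return REQUIRED_DISCLOSURES <= consent.disclosures_shown and consent.opted_in
```

The key design point is the default: a fresh `ViewerConsent()` denies collection, mirroring the Order’s opt-in (rather than opt-out) structure.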

It is not entirely clear what incentive currently exists for consumers to voluntarily provide their viewing data to VIZIO given their initial smart TV purchases exist apart from any potential future relationship with VIZIO.  In other words, VIZIO really has nothing new to offer for this viewing data – it can only offer something on behalf of those who buy or broker the data.  Accordingly, VIZIO may in the future act as a new conduit for commercials.  It has already been suggested that Netflix could make billions by bringing ads to its streaming offerings.

It has been reported that over half of US households use an internet-enabled television.  The VIZIO settlement with the FTC and New Jersey DCA does a great job of highlighting the peril of collecting IoT data such as TV viewing data without proper consent.  Samsung and LG faced similar pressure in 2015 but that was far from a clarion call given the lack of any hefty fine.

The VIZIO resolution may actually be more similar to the major shift brought on after CardSystems was breached over a decade ago.  CardSystems had no excuse for insecurely maintaining track 2 data for its potential marketing purposes, so that breach definitely helped promulgate the PCI data security standard.  Similarly, the VIZIO settlement may lead to more safeguards regarding the use of IoT data.  Rather than Visa or Mastercard waiting in the wings to enforce compliance, we would have the FTC and state regulatory bodies.  Nevertheless, such efforts will still have to garner consumer support given the backdoor of affirmative consent that remains even after the VIZIO resolution.  In other words, there may still have to be something in it for the consumer.

As previously suggested, it may finally be time for consumers to just be paid cash for their consumer data.

Google pays $5.5 million to cy pres fund

On August 29, 2016, Google resolved yet another privacy suit – this one for $5.5 million with again nothing going to consumers.  Instead, the money will go to privacy groups agreed upon by Google and class counsel. Specifically, the list of proposed recipients of this latest cy pres fund include:

  1. Berkeley Center for Law & Technology;
  2. Berkman Center for Internet & Society at Harvard University;
  3. Center for Democracy & Technology (Privacy and Data Project);
  4. Public Counsel;
  5. Privacy Rights Clearinghouse; and
  6. Center for Internet & Society at Stanford University (Consumer Privacy Project).

As previously discussed, the cy pres method of settling privacy class actions is sought after by tech companies given such a mechanism more easily helps fund non-profit partners – organizations that more often than not push for the very policies advocated by defendants.  Given that class counsel look to resolve cases as soon as possible, the settling defendant obviously dictates the cy pres recipients.   More than likely, this latest Google settlement will obtain the necessary court approval.

Hopefully, courts in the future will take a harder look at this settlement method given the lack of direct benefit to those most impacted.

Data Privacy Day 2016 — Time to Get Paid for Your Personal Data?

Despite an active website and Twitter feed, most folks do not realize that January 28th was chosen as a “birthday” celebration for statutory privacy rights because the first statutory privacy scheme came into being on 28 January 1981, when the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data was passed by the Council of Europe.  As pointed out four years ago, the purpose of this convention was to secure for residents respect for “rights and fundamental freedoms, and in particular his right to privacy, with regard to automatic processing of personal data relating to him.”  The day used to be heavily sponsored by Microsoft and Intel without much focus on how personal data is used for online behavioral marketing.  Perhaps spurred on by articles such as a recent one describing how Facebook values its users, the value of personal data is certainly more front and center on Data Privacy Day 2016.

As recognized today by an author writing about Data Privacy Day 2016, “you’re a walking, talking data source.”  The author goes on to discuss a project from the Harvard Data Privacy Lab springing from the fact that “the average person has no idea just how much personal data is bought and sold.”

Data Privacy Lab director Latanya Sweeney, a former chief technology officer for the Federal Trade Commission, helped launch the project, titled “All the Places Personal Data Goes,” to illustrate the path personal info takes from one place to another.  According to the article, the Lab gathers “information on data buyers and sellers and make[s] it available to journalists and others.  The project will also soon host a data-visualization competition to bring the issue to life.”  It is no surprise that the Knight Foundation, created by publishing icons John S. and James L. Knight, awarded the Lab’s project $440,000 to expand its efforts.

It’s very possible that after consumers read in the press exactly how valuable their personal data is to so many different companies they just might want in on the action.  The first company that helps make that a reality would certainly benefit consumers — as well as data buyers and sellers.


New Jersey State AG Enters into COPPA Consent Order

Capitalizing on its federal grant of authority, the New Jersey state Attorney General’s Office recently resolved claims it brought against Dokogeo, the California-based maker of the Dokobots app, that were based on the Children’s Online Privacy Protection Act (COPPA) and state Consumer Fraud Act.  According to the Consent Order filed on November 13, 2013, the Dokobots app is a geo-location scavenger hunt game that encourages users to visit new locations and gather “photos and notes from the people they meet.” One major attribute of the app is its geo-tracking of users.  A product review of the app describes it as blending “playtime, learning, exploration, and creativity in a curiously enticing way.” The State’s position was that the app was directed to children by virtue of its use of animated characters and “child-themed” storyline.

The Consent Order alleges that the app collects information, “including e-mail address, photographs, and geolocation information,” deemed personal information under COPPA, yet did not provide any neutral age-screening registration process to restrict its users to those over the age of 13. Moreover, there was no terms of use agreement, and the privacy policy did not disclose that the app was restricted to users over the age of 13. Pursuant to the Consent Order, Dokogeo removed all photographs of children and location information from its website and agreed to more clearly disclose the information it collects.  As of November 24, 2013, the Dokobots site merely had a static home page – presumably because it is still in the process of implementing the terms of the Order.
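A “neutral” age screen simply asks for a date of birth without signaling a qualifying answer, then blocks (or routes to parental consent) users under 13 rather than inviting them to re-enter a different date. A minimal sketch of the age arithmetic (the function name is invented for illustration; the 13-year cutoff reflects COPPA’s general rule, not language from the Consent Order):

```python
from datetime import date

def passes_neutral_age_screen(birth: date, today: date) -> bool:
    # Compute age from a neutrally requested date of birth; the tuple
    # comparison subtracts a year if this year's birthday hasn't passed.
    age = today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))
    return age >= 13
```

In practice the screen should also avoid displaying the cutoff and should remember a failed attempt, so a child cannot simply back up and enter an older date.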

The Consent Order also provides for a suspended fine of $25,000 which will only be enforced if Dokogeo fails to meet the terms of the Order.

This is the second such settlement reached by the New Jersey state AG’s office.  In July 2012, authorities announced a similar settlement against Los Angeles-based 24 x 7 Digital, LLC requiring the destruction of all children’s data that had previously been collected and transmitted to third parties.  That action was commenced by way of Complaint filed in June 2012.

It is not unusual for state AGs to commence COPPA actions against out-of-state companies.  In fact, a state AG action under COPPA was brought years ago by the Texas AG against a Brooklyn-based company for improperly collecting personal information such as names, ages, and home addresses from children.  What is interesting about the Dokogeo case, however, is that the underlying statute requires that “the attorney general of a State has reason to believe that an interest of the residents of that State has been or is threatened or adversely affected. . . .” 15 USC § 6504 (emphasis added). Other than merely reciting the statute, no actual finding was made or referenced by the New Jersey AG’s office regarding the impact to New Jersey residents.  In fact, Dokogeo defended by arguing the app was intended for adults, and there was no discussion by either side regarding New Jersey users.

App developers are well advised to appreciate two basic lessons from Dokogeo. If an app appears to target children, developers should comply with COPPA — especially given FTC guidelines involving the collection of geo-data and use of photographs.   And, if they do not comply, they should be prepared to defend against those state AGs who are not averse to spending state dollars pursuing an enforcement action.

California’s Right to Know Law Put on Hold

As reported by the LA Times, “a powerful coalition of technology companies and business lobbies that included Facebook, Inc., Google, Inc., the California Chamber of Commerce, insurers, bankers and cable television companies as well as direct marketers and data brokers” were able to stop a California bill aimed at giving consumers greater insight as to the use of their personal data.

First introduced in February by Assemblywoman Bonnie Lowenthal (D-Long Beach), the proposed Right to Know Law (AB 1291) would have implemented major revisions to existing law and created new rights for consumers.  Specifically, the proposed law would require

any business that has a customer’s personal information, as defined, to provide at no charge, within 30 days of the customer’s specified request, a copy of that information to the customer as well as the names and contact information for all 3rd parties with which the business has shared the information during the previous 12 months, regardless of any business relationship with the customer.
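Mechanically, the accounting the bill describes reduces to filtering a sharing log to the trailing 12 months and deduplicating recipients. A hypothetical sketch (the log schema and field names are invented for illustration):

```python
from datetime import date, timedelta

def sharing_accounting(sharing_log, request_date):
    # Names and contact info for every third party the customer's data
    # was shared with in the 12 months preceding the request.
    window_start = request_date - timedelta(days=365)
    return sorted({
        (e["third_party"], e["contact"])
        for e in sharing_log
        if window_start <= e["shared_on"] <= request_date
    })

# Hypothetical log entries for a single customer
log = [
    {"third_party": "AdCo", "contact": "privacy@adco.example",
     "shared_on": date(2013, 5, 1)},
    {"third_party": "BrokerCo", "contact": "legal@brokerco.example",
     "shared_on": date(2011, 1, 1)},
]
```

The hard part for businesses would not have been this filter but maintaining a per-customer sharing log at all – which is precisely the undertaking the opposing coalition objected to.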

This new level of transparency might have helped soothe consumer concerns.  According to a 2012 USC Dornsife/Los Angeles Times poll, 82 percent of Californians said they are “very concerned” or “somewhat concerned” about Internet and smartphone companies collecting their personal information.   On the other hand, providing a full and accurate accounting of who had access to a consumer’s data – even to only the small percentage of consumers who would actually take the time to request it – would have been a major undertaking for a wide range of companies.  It is not surprising that the companies that fought so hard to pull the plug on this bill represent a very diverse coalition of businesses.

Even if this bill does not get revived in a new form sometime in the future, the prospect of what it might have brought to the table should serve as a wake-up call to those businesses deep into online behavioral advertising.  It may be time to better understand just who has access to what information – and it may not eventually matter whether that information belongs to a current client or consumer or whether it was anonymized.  As usual, staying in front of the regulatory curve remains a sound business practice.

Financial Correlation of Privacy Rights

In Letting Down Our Guard With Web Privacy, published on March 30, 2013, the author details ongoing research being conducted by Alessandro Acquisti, a behavioral economist at Carnegie Mellon University.  Mr. Acquisti’s research is cutting edge when it comes to online behavioral advertising (OBA) and associated consumer behavior.  Indeed, he’s the academic who famously announced in 2011 that one might be able to discover portions of someone’s Social Security number simply by virtue of a posted photograph.   His research often distills to one major premise – consumers may not always act in their best interests when it comes to online privacy decisions.

It appears consumers and merchants alike may be missing out on fully cultivating a very valuable commodity.  According to the World Economic Forum, “personal data represents an emerging asset class, potentially every bit as valuable as other assets such as traded goods, gold or oil.”  Rethinking Personal Data:  Strengthening Trust, at 7, World Economic Forum Report (May 2012).  Before this asset class can ever be completely exploited and fully commercialized, however, its constituent value components must be correlated by all in the privacy food chain.

Over three decades ago, it was recognized that the three pillars of privacy – the very foundation of personal data – secrecy, anonymity, and solitude, were distinct yet interrelated.  See Gavison, Ruth, Privacy and the Limits of Law, 89 The Yale Law Journal 421, 428-429 (1980) (“A loss of privacy occurs as others obtain information about an individual, pay attention to him, or gain access to him. These three elements of secrecy, anonymity, and solitude are distinct and independent, but interrelated, and the complex concept of privacy is richer than any definition centered around only one of them.”).

Current OBA has made these three privacy pillars confusing for consumers to value, manage, and isolate when online – it is generally not up to consumers whether they will be fed an ad based on previous website visits or purchases; it will just happen.  Indeed, according to a survey of 1,000 persons conducted by Ipsos Public Affairs and released by Microsoft in January 2013, forty-five percent of respondents felt they had little or no control over the personal information companies gather about them while they are browsing the Web or using online services.  This view may not be unfounded given that data routinely gathered online, e.g., operating system, browser, IP address, persistent cookies, last used server, can be used to divulge the activity of individual devices.

The privacy trade-offs being researched by Mr. Acquisti and others offer insight into the true value of these data constituents.  Consumers who try to “shut off” or render anonymous access to their device’s data or settings would not only likely fail in their attempt at anonymity, they would also lose out on access to most social media and other websites requiring browsers to accept cookies, as well as product offers that presumably are of interest.  Indeed, this coordinated tracking of consumers is not even unique to the Internet.   See generally Bibas, Steve, A Contractual Approach to Data Privacy, 17 Harv. J. Law & Public Policy 591 (Spring 1994) (“Although the ready availability of information helps us to trust others and coordinate actions, it also lessens our privacy. George Orwell presciently expressed our fear of losing all privacy to an omniscient Big Brother.  Computers today track our telephone calls, credit-card spending, plane flights, educational and employment records, medical histories, and more.  Someone with free access to this information could piece together a coherent picture of our actions.”).  There are even companies that bridge the gap between offline and online activities by taking in-store point-of-sale purchases and converting such data to an anonymous online cookie ID that will eventually be used online by clients.  Such use of in-store data is generally permissible under a retailer’s loyalty program.

Current law does not generally prevent someone from collecting public information to create consumer profiles – nor is there a right to opt out of having your public record information sold or shared.  And, when one wants to self-determine whether data will be disclosed or whether he or she will be “untraceable,” “anonymous,” or “left alone,” there may not always exist the ability to easily prevent these rights from being exploited – there is certainly no way to obtain a direct financial gain in return for the relinquishment of such privacy rights.  Instead, there has generally been a “privacy for services” marketing/advertising arrangement that has been accepted by consumers – which, in fact, has helped pay for and fuel the growth of the commercial Internet.

The current OBA ecosystem does not posit a “loss of privacy” so much as it offers a bartering system in which one party merely feels the value of what is being bartered away while the other party actually quantifies that value through cascading, monetizing transactions.  In other words, it is not a financial transaction.  Those who are able to find an entertaining online video or locate a product online using a search engine don’t really mind that an ad will be served to them while visiting some other website given they feel this loss of privacy is worth the value of the services being provided.

Ironically, the interactive advertising industry itself may believe it is collecting too much sensitive consumer data.  According to a study conducted by the Ponemon Institute, 67 percent of responding online advertisers believe “limiting sensitive data collection for OBA purposes is key to improving consumer privacy and control when browsing or shopping online.” Leading Practices in Behavioral Advertising & Consumer Privacy:  A Study of Internet Marketers & Advertisers, at 2, The Ponemon Institute (February 2012).

As recognized by privacy researchers, “[e]mpirical evidence on the behavioral effects of privacy is rather scarce.”  Regner, Tobias; Riener, Gerhard, Voluntary Payments, Privacy and Social Pressure On The Internet: A Natural Field Experiment, DICE Discussion Paper, No. 82 (December 2012) at 6.  Although “some consumers are willing to pay a premium to purchase from privacy protective websites”; there is no measure of what that premium should be or how widespread a factor it is for consumers as a whole.  Id. at 7.

More often than not, consumers have been “often willing to provide personal information for small or no rewards.”  Losses, Gains, and Hyperbolic Discounting: An Experimental Approach to Information Security Attitudes and Behavior, presented by Alessandro Acquisti and Jens Grossklags at the 2nd Annual Workshop on Economics and Information Security, College Park, Maryland, May 2003, at 4.

This does not mean researchers have not tried to quantify a “privacy valuation” model.  In 2002, a Jupiter Research study found 82% of online shoppers willing to give personal data to new shopping sites in exchange for the chance to win $100.  Cf. Tsai, Janice; Egelman, Serge; Cranor, Lorrie; Acquisti, Alessandro, The Effect of Online Privacy Information on Purchasing Behavior: An Experimental Study, Information Systems Research (February 2010) at 22 (describing survey results which conclude that “people will tend to purchase from merchants that offer more privacy protection and even pay a premium to purchase from such merchants.”); Beresford, Alastair; Kübler, Dorothea; Preibusch, Sören, Unwillingness To Pay For Privacy: A Field Experiment, 117 Economics Letters 25 (2012) (“Thus, participants predominantly chose the firm with the lower price and the more sensitive data requirement, indicating that they are willing to provide information about their monthly income and date of birth for a 1 Euro discount.”).

In his 1994 paper, A Contractual Approach to Data Privacy, Steve Bibas suggests that individual contracts may provide the best solution to the privacy compensation dilemma:  “In the hands of the contracting parties, however, flexibility allows people to control their lives and efficiently tailor the law to meet their needs. Flexibility is the market’s forte; the pricing mechanism is extremely sensitive to variations in valuation and quickly adjusts to them.”  Bibas, 17 Harv. J. Law & Public Policy 591 (Spring 1994).   Mr. Bibas, however, recognized the limitations in what could be accomplished with privacy transactions that relied only on static privacy trades.  In other words, a model that might be effective is one that customizes financial rewards to consumers based on a continuous exchange of information between the consumer and merchant.

One problem most consumers face when using commonly marketed privacy-safeguard solutions is that those solutions fail to also create an acceptable value proposition for merchants.  As well, those recently formed companies promising a private web experience will not be able – nor should they even try – to curtail firms from using OBA to reach consumers.  For the foreseeable future, OBA will continue to drive the Internet and “pay” for a much richer and more rewarding consumer experience than would otherwise exist.  It may one day be determined, however, that an even more effective means to satisfy all constituent needs of the OBA ecosystem (consumer, merchant, publisher, agency, etc.) will be to directly link privacy rights, consumer data, and merchant revenue.

The Privacy Tug of War

According to the World Economic Forum, “personal data represents an emerging asset class, potentially every bit as valuable as other assets such as traded goods, gold or oil.”  Given the inherent value of this new asset class, it’s no surprise there has been an ongoing tug of war regarding how consumers should be compensated for access to their personal data.

In a March 2003 Wired article titled, “Who’s Winning Privacy Tug of War?“, the author suggests that “[c]onsumers appear to have become weary of the advertising bombardment, no matter how targeted to their tastes those ads may be.”  And, the “tit-for-tat tactic on the Web” that requires users to provide certain personal information in exchange for product or other information may be far from a perfect marketing model given these marketing preference databases “are polluted with lies.”

Fast forward a decade or so and companies are still trying to figure out the Privacy Tug of War rules of engagement.  On September 19, 2012, UK think tank Demos released a report it considered “the most in-depth research to date on the public’s attitudes toward the sharing of information.”   Not surprisingly, Demos found that in order to maximize the potential value of customer data, there needs to be “a certain level of trust established and a fair value exchange.”   The firm found that only 19 percent of those surveyed understand the value of their data and the benefits of sharing it.

The surveys, workshops and other research tools referenced in the Demos report all point towards a “crisis of confidence” which may “lead to people sharing less information and data, which would have detrimental results for individuals, companies and the economy.”   Demos offers up a possible solution to this potential crisis:

The solution is to ensure individuals have more control over what, when and how they share information. Privacy is not easily defined. It is a negotiated concept that changes with technology and culture. It needs continually updating as circumstances and values change, which in turn requires democratic deliberation and a dialogue between the parties involved.

It is hard to have any meaningful deliberations when no one is charting a clear path to victory in the Privacy Tug of War — nor is there any consensus regarding whether it is preferable to even have such a path.   Some on the privacy circuit have suggested we must create better privacy metrics and offer tools to use those metrics to measure whether a company’s privacy protections are “satisfactory”.   Consumers right now can rely on sites such as Clickwrapped to score the online privacy policies of major online brands.   Certification services such as TRUSTe provide insight regarding the online privacy standards of thousands of websites.   If they don’t like what they see, consumers can always “opt out” and use services such as that of start-up Safe Shepherd to remove “your family’s personal info from websites that sell it.”

Unfortunately, no commercially available privacy safeguard, testing service or certification can ever move fast enough to address technological advances that erode consumer privacy given such advances will always launch unabated — and undetected — for a period of time.  Not unlike Moore’s Law regarding the doubling of transistor computing power every two years, it appears that consumer privacy diminishes in some direct proportion to new technological advances.  Consumer privacy expectations should be calibrated accordingly.   Unlike with Moore’s Law, however, there is no uniform technology, product, or privacy metric that can be benchmarked the way computing power is benchmarked in the computer industry.

This does not mean we are powerless to follow technology trends and quantify an associated privacy impact.  For example, the Philip Dick/Steven Spielberg Minority Report vision of the future — where public iris scanning offers up customized advertisements to people walking around a mall — has already taken root in at least one issued iris-scanning patent, jointly owned by the federal government and a start-up looking to serve ads using facial recognition techniques.  In direct reaction to EU criticism of Facebook’s own facial recognition initiative, Facebook temporarily suspended its “tag-suggest” feature.  This automatic facial recognition system recognized and suggested names for people appearing in photographs uploaded to Facebook – without first obtaining the consent of those so recognized and tagged.

Closely monitoring technological advances that may impact privacy rights — whether the body diagnostics of Mc10 and ingested medical sensors from Proteus, the latest in Big Data analytics, or a new EHR system that seamlessly ties such innovations together — becomes the necessary first step towards understanding how to partake in the Privacy Tug of War.

Unlike in the PC industry, where progress is tied to Moore’s Law, our government — with its effectively unbounded funding — is an active participant in developing privacy-curtailing technological advances.  For example, the FBI is currently undergoing a billion-dollar upgrade creating its Next Generation Identification Program, which will deploy the latest in facial recognition technologies.   As recognized by CMU Professor Alessandro Acquisti, this “combination of face recognition, social networks data and data mining can significantly undermine our current notions and expectations of privacy and anonymity.”

Not surprisingly, there has been some push back on such government initiatives.    For example, on September 25, 2012, the ACLU filed suit against several government agencies under the Freedom of Information Act seeking records on their use and funding of automatic license plate readers (ALPRs).  According to the Complaint, “ALPRs are cameras mounted on stationary objects (e.g., telephone poles and the underside of bridges) or on patrol cars [and] photograph the license plate of each vehicle that passes, capturing information on up to thousands of cars per minute.”   The ACLU suggests that ALPRs “pose a serious threat to innocent Americans’ privacy.”

The imminent unleashing of unmanned aircraft systems – commonly known as “drones” – sets in motion another technological advance that should raise serious concerns for just about anyone.  Signed by President Obama in February 2012, the FAA Modernization and Reform Act of 2012, among other things, requires that the Federal Aviation Administration accelerate the use of drone flights:

Not later than 270 days after the date of enactment of this Act, the Secretary of Transportation, in consultation with representatives of the aviation industry, Federal agencies that employ unmanned aircraft systems technology in the national airspace system, and the unmanned aircraft systems industry, shall develop a comprehensive plan to safely accelerate the integration of civil unmanned aircraft systems into the national airspace system.

As recognized by the Government Accountability Office in a September 14, 2012 Report, even though “[m]any [privacy] stakeholders believe that there should be federal regulations” to protect the privacy of individuals from drone usage, “it is not clear what entity should be responsible for addressing privacy concerns across the federal government.”

This is not an insignificant failing given that, according to this same report, commercial and government drone expenditures could top $89.1 billion over the next decade ($28.5 billion for R&D and $60.6 billion for procurement).  Interestingly, the required comprehensive plan to accelerate integration of civil drones into our national airspace system will be due on November 10, 2012 – right after elections.   According to an Associated Press-National Constitution Center poll, 36 percent of those polled say they “strongly oppose” or “somewhat oppose” police use of drones.   This somewhat muted response is likely driven by the fact that most polled do not understand the capabilities of these drones and just how pervasive they will become in the coming years.

The technology advance that may have the greatest impact on privacy rights does not take to the skies but is actually found in most pockets and purses.   The same survey referenced above found that 43 percent of those polled (the highest percentage) primarily use a mobile device alone rather than a landline or a combination of mobile device and landline — with 34 percent of those polled not even having a landline in their home.   Not surprisingly, companies have been aggressively tapping into the Big Data treasure trove available from mobile device usage.   Some politicians have taken notice and are already drawing lines in the digital sand.

Under the Mobile Device Privacy Act introduced by Congressman Edward J. Markey, anyone who sells a mobile service, device, or app must inform customers if their product contains monitoring software — with statutory penalties ranging from $1,000 per unintentional violation to $3,000 per intentional violation.   This new bill addresses only a single transgression of the personal-data orgy now being enjoyed by so many different companies up and down the mobile device communication and tech food chain.   As evidenced by the current patent landscape — including an issued Google patent that involves serving ads based on a mobile device’s environmental sounds — and the now well-known GPS capabilities of mobile devices, the privacy Battle of Midway will likely be fought around mobile devices.  Companies with a stake in the Privacy Tug of War — as well as the professionals who advise them — will only be adequately prepared if they recognize that this battle may ultimately have no clear winners or losers — only willing participants.