Recent HIPAA Settlements Are Wake-Up Calls

On March 16, 2016, the Office for Civil Rights (“OCR”) announced its $1.55 million Resolution Agreement and Corrective Action Plan with North Memorial Health Care of Minnesota.  North Memorial  agreed to settle charges that it potentially violated the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy and Security Rules by failing to implement a business associate agreement with a major contractor and failing to institute an organization-wide risk analysis to address the risks and vulnerabilities to its patient information.

OCR initiated its investigation of North Memorial following receipt of a report on September 27, 2011, which indicated that “an unencrypted, password-protected laptop was stolen from a business associate’s workforce member’s locked vehicle, impacting the electronic protected health information (ePHI) of 9,497 individuals.”

The investigation indicated that North Memorial gave its business associate, Accretive, access to North Memorial’s hospital database, which stored the ePHI of 289,904 patients. OCR further determined that North Memorial failed to complete a risk analysis to address all of the potential risks and vulnerabilities to the ePHI that it maintained, accessed, or transmitted across its entire IT infrastructure – “including but not limited to all applications, software, databases, servers, workstations, mobile devices and electronic media, network administration and security devices, and associated business processes.”

In addition to the $1,550,000 payment, North Memorial is required to develop “an organization-wide risk analysis and risk management plan, as required under the Security Rule.”  North Memorial will also train appropriate workforce members on “all policies and procedures newly developed or revised pursuant to this corrective action plan.”

In what has by now become typical fashion, OCR announced another settlement immediately after the North Memorial settlement.

On March 17, 2016, OCR announced its $3.9 million HIPAA settlement with Feinstein Institute for Medical Research, a biomedical research institute.  Feinstein settled potential HIPAA violations by agreeing to undertake a substantial corrective action plan.  OCR’s investigation began after Feinstein filed a report indicating that on September 2, 2012, a laptop computer containing the ePHI of approximately 13,000 patients and research participants was stolen from an employee’s car. The ePHI stored on the laptop included “names of research participants, dates of birth, addresses, social security numbers, diagnoses, laboratory results, medications, and medical information relating to potential participation in a research study.”

OCR’s investigation discovered that Feinstein’s security management process was “limited in scope, incomplete, and insufficient to address potential risks and vulnerabilities to the confidentiality, integrity, and availability of ePHI held by the entity.” Further, Feinstein lacked “policies and procedures for authorizing access to ePHI by its workforce members, failed to implement safeguards to restrict access to unauthorized users, and lacked policies and procedures to govern the receipt and removal of laptops that contained ePHI into and out of its facilities.”

The Feinstein and North Memorial settlements are obvious wake-up calls.

First, OCR apparently has no problem whatsoever finding that research institutions are covered entities, even though such organizations may not squarely fit into the provider, health plan, or clearinghouse bucket for all of their activities.  See 45 C.F.R. § 160.103.   As OCR Director Jocelyn Samuels stated in the press release, “For individuals to trust in the research process and for patients to trust in those institutions, they must have some assurance that their information is kept private and secure.”

Second, it is far preferable to hire legal counsel and spend several thousand dollars on a good business associate agreement, and perhaps $20,000 on a comprehensive risk analysis, than to pay $1.55 million to settle with OCR.

And finally, train employees on the proper handling of laptops and make sure your laptops are encrypted in case they are ever lost or stolen.  In both cases, the actual trigger leading to these seven-figure settlements was a breach report sent to OCR because of a laptop stolen from a car.

Apple Ordered to Disable Auto-Erase Feature

Pursuant to the All Writs Act, 28 U.S.C. § 1651, Magistrate Judge Sheri Pym ordered on February 16, 2016, that Apple assist in the investigation of the San Bernardino shooting by disabling a feature that would auto-erase the data on one of the shooter’s phones after ten failed password attempts.  According to the government’s ex parte application, unless the auto-erase feature is disabled, “iOS will instantly, irrecoverably, and without warning erase the encryption keys necessary for accessing stored data.”  Declaration of Christopher Pluhar at 5.  Although the media is reporting that Apple is being forced to unlock the phone, that is not the case.

Even though Judge Pym ordered that Apple provide “reasonable technical assistance to assist law enforcement agents in obtaining access to the data on the SUBJECT DEVICE,” what was actually ordered would only allow investigators to determine the phone’s password by “brute force” without the possibility of any data deletion.  The software Apple was ordered to provide would have “a unique identifier of the phone so that the SIF [software image file] would only load and execute on the SUBJECT DEVICE.”  In other words, the software purportedly could not be used by the authorities on other devices.

The Order concludes with the following:  “To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order.”  Because the Order also anticipates that Apple will be compensated (it requires that Apple “shall advise the government of the reasonable cost of providing this service”), the Judge may be anticipating objections based on the expense or effort related to the request, and not the significant precedent set by this equitable relief.

In an open letter to customers, Apple’s CEO has publicly challenged the request and framed the stakes as follows:

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

According to Mr. Cook, “[o]nce created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”

Judge Pym gave Apple five days to respond to the Order, so a major battle is about to unfold — with various privacy groups already sharpening their amicus pencils.  Given that it was prior U.S. government mass surveillance that caused the Court of Justice of the European Union to strike down the EU Safe Harbor framework for data transfers from the EU, this Order obviously holds more significance than the brute-force unlocking of a single phone.  Depending on how this case unfolds, it will have a direct impact far removed from Judge Pym’s courtroom and may even cause the EU-US Privacy Shield to fail before it really gets off the ground.

Replacement Safe Harbor Reached

As first reported today by Bloomberg BNA, the U.S. and the European Union have reached agreement on a new data transfer framework to replace the invalidated Safe Harbor program.  The European Commission press release provides details of this agreement and the new EU-US Privacy Shield:

The EU-US Privacy Shield reflects the requirements set out by the European Court of Justice in its ruling on 6 October 2015, which declared the old Safe Harbour framework invalid. The new arrangement will provide stronger obligations on companies in the U.S. to protect the personal data of Europeans and stronger monitoring and enforcement by the U.S. Department of Commerce and Federal Trade Commission (FTC), including through increased cooperation with European Data Protection Authorities. The new arrangement includes commitments by the U.S. that possibilities under U.S. law for public authorities to access personal data transferred under the new arrangement will be subject to clear conditions, limitations and oversight, preventing generalised access. Europeans will have the possibility to raise any enquiry or complaint in this context with a dedicated new Ombudsperson.

By way of background, the U.S./EU Safe Harbor Program allowed U.S. companies to transfer EU citizens’ data to the U.S. if they self-certified to the U.S. Department of Commerce their compliance with privacy principles similar to those contained in the EU Data Protection Directive.  The program was invalidated by the Court of Justice of the European Union based on the claim that the U.S. government’s surveillance programs, given the lack of adequate restrictions on that data gathering, necessarily showed a lack of compliance.  The result was widespread confusion among multi-nationals, given that the invalidation of Safe Harbor affected thousands of U.S. companies certified in the program as well as the many more companies relying on those certifications to transfer personal data to them.

One key takeaway of this agreement is that it continues to place enforcement power with the FTC regarding the “robust obligations on how personal data is processed and individual rights are guaranteed.”  Moreover, according to the Press Release, the U.S. “has ruled out indiscriminate mass surveillance on the personal data transferred to the US under the new arrangement.”  And, with regard to complaints about possible access by national intelligence authorities, the new Ombudsperson will take charge.

It remains to be seen whether the NSA is on board or whether this agreement was merely a temporary fix so that all sides could save face.  What is certain, however, is that multi-nationals should take very little comfort that this fix is permanent and will be left unchanged after its inevitable challenge in the court system.

Data Privacy Day 2016 — Time to Get Paid for Your Personal Data?

Despite an active website and Twitter feed, most folks do not realize that January 28th was chosen as a “birthday” celebration for statutory privacy rights because the first statutory privacy scheme came into being on 28 January 1981, when the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data was opened for signature by the Council of Europe.  As pointed out four years ago, the purpose of this convention was to secure for residents respect for “rights and fundamental freedoms, and in particular his right to privacy, with regard to automatic processing of personal data relating to him.”  The day used to be heavily sponsored by Microsoft and Intel, without much focus on how personal data is used for online behavioral marketing.  Perhaps spurred on by articles such as a recent one describing how Facebook values its users, the value of personal data is certainly more front and center on Data Privacy Day 2016.

As recognized today by an author writing about Data Privacy Day 2016, “you’re a walking, talking data source.”  The author goes on to discuss a project from the Harvard Data Privacy Lab springing from the fact that “the average person has no idea just how much personal data is bought and sold.”

Data Privacy Lab director Latanya Sweeney, a former chief technology officer for the Federal Trade Commission, helped launch the project, titled “All the Places Personal Data Goes,” to illustrate the path personal information takes from one place to another.  According to the article, the Lab gathers “information on data buyers and sellers and make it available to journalists and others.  The project will also soon host a data-visualization competition to bring the issue to life.”  It is no surprise that the Knight Foundation, created by publishing icons John S. and James L. Knight, awarded the Lab’s project $440,000 to expand its efforts.

It’s very possible that after consumers read in the press exactly how valuable their personal data is to so many different companies they just might want in on the action.  The first company that helps make that a reality would certainly benefit consumers — as well as data buyers and sellers.

Wyndham Settles with FTC

Ending its epic battle, Wyndham entered into a settlement agreement with the FTC.  Under the terms of the Stipulated Order filed on December 9, 2015 with Judge Salas, Wyndham will establish a “comprehensive information security program designed to protect cardholder data – including payment card numbers, names and expiration dates.”  In addition, the company is required to “conduct annual information security audits and maintain safeguards in connections to its franchisees’ servers.”

These safeguards have a shelf-life of 20 years — common for FTC stipulated agreements involving data breaches.  What is noteworthy and distinct from other settlements, however, is that no money changes hands — Wyndham pays no fines, investigative costs, or any amount for that matter.   This overall result, especially in light of the Third Circuit ruling, can only be considered a solid victory for Wyndham.

Franchise operators also scored somewhat of a victory, given that the FTC finally provides some guidance as to what it considers a reasonable security program for such operators.  First, the FTC signals that companies conforming to the most current Payment Card Industry Data Security Standard (PCI DSS) when certifying their security programs are moving in the right direction toward implementing a satisfactory program.  Indeed, the settlement specifically defines its terms per PCI DSS Version 3.1.  Not surprisingly, the second aspect of a suitable program requires the implementation of a risk-based approach to threat assessment.  As set forth in Section I.C of the Stipulated Order, Wyndham’s program must include “the design and implementation of reasonable safeguards to control the risks identified through risk assessment (including any risks emanating from the Wyndham-branded Hotels), and regular testing or monitoring of the effectiveness of the safeguards’ key controls, systems, and procedure.”

The agreed-upon requirements also apply to any “entity that Defendant has any ownership interest in or controls directly or indirectly that may affect compliance obligations arising under this Order.”  And the overall compliance features of the Stipulated Order mimic the discovery process available under the Federal Rules of Civil Procedure and will certainly be tested over the twenty-year term.  Such future testing, coupled with potential new breaches, may lead to future stipulated Orders.  For the moment, however, Wyndham should be relieved by the results of its FTC skirmish — as well as happy with the work done by its counsel.

California Rakes in $25 Million from Comcast

On September 17, 2015, a California Judge approved a final stipulated judgment between media giant Comcast and the California Public Utilities Commission.  In Paragraph 17 of the Complaint filed the same day, Comcast was not exactly accused of heinous conduct:  “for varying periods of time between July 2010 to December 2012, and for many customers the entire period, approximately 75,000 Comcast residential subscribers in California who had paid Comcast the monthly fee for a non-published or non-listed phone number nevertheless had their subscriber listing information published on Ecolisting, and (in some cases) in phone books, and/or made available by a directory assistance provider.”

In other words, Comcast customers who paid to avoid potentially being listed on sites such as whitepages.com were inadvertently deprived of that purchased service.  Specifically, because “the ‘privacy flag’ was not attached to the listings of approximately 75,000 non-published/non-listed subscribers, Neustar provided those listings to Comcast’s vendor, Microsoft FAST, who then published them for Comcast on the Ecolisting website.”  Complaint at ¶ 15.

No financial data was exposed.  No transaction or business data was exposed.  No medical data was exposed.  No emails or passwords were compromised.  Indeed, the only information exposed was the very same information that could be obtained by anyone doing a few sophisticated Google searches – names, addresses, and phone numbers.   For most people, such information exists online independently of any Comcast action or inaction.   In other words, even if Comcast had properly withheld the information, that would not likely have prevented someone from finding it online.

As part of the settlement, Comcast must pay $25 million in penalties and investigative costs to the California Department of Justice and the California Public Utilities Commission.   The 75,000 customers who were “compromised” ended up with refunds and an additional $100 in restitution credited to their Comcast bills.

And, as part of the stipulated judgment, Comcast also agreed to a permanent injunction that requires the company to strengthen the restrictions it places on its vendors’ use of personal information about customers.  The injunction also requires Comcast to provide a new disclosure form to all customers that explains the ways in which it uses unlisted phone numbers and other personal information.  Such restrictions and added duties have little to do with the actual transgression in question — they represent added gimmes obtained by the California AG’s office given the leverage it had over Comcast.

This case is yet another wake-up call to companies maintaining or processing large amounts of customer data.  Even though the Comcast settlement is somewhat unique given the nature of the information as well as the “unlisting service” provided, other companies also safeguard what may otherwise be publicly available information.  When there are assurances made that such information will be safeguarded, does that automatically elevate the value of the information?

The larger question is how a transgression with no ostensible harm can mushroom into a $25 million payment to a governmental agency.  Until a General Counsel can answer that question with definite certainty, the only course of action is to treat all customer data equally and ensure that the reasonable precautions undertaken to safeguard such information match or exceed what is considered state-of-the-art for that company’s industry sector.

Third Circuit Affirms Judge Salas in FTC v. Wyndham

In a 47-page ruling, the United States Court of Appeals for the Third Circuit affirmed today an April 7, 2014 ruling of Judge Esther Salas against Wyndham Worldwide.  In affirming the district court ruling, the Third Circuit left intact Judge Salas’s decision that the FTC has power to regulate “unfair trade practices” based on the alleged failed data security of Wyndham.

The Third Circuit recast Wyndham’s argument and ultimately rejected what was potentially viable on appeal as “[t]oo little and too late.”  As recognized by the Court:

Wyndham repeatedly argued there is no FTC interpretation of § 45(a) or (n) to which the federal courts must defer in this case, and, as a result, the courts must interpret the meaning of the statute as it applies to Wyndham’s conduct in the first instance. Thus, Wyndham cannot argue it was entitled to know with ascertainable certainty the cybersecurity standards by which the FTC expected it to conform. Instead, the company can only claim that it lacked fair notice of the meaning of the statute itself – a theory it did not meaningfully raise and that we strongly suspect would be unpersuasive under the facts of this case.

In what was a sua sponte rejection of Wyndham’s “implied” argument that it was not provided with sufficient statutory notice of the century-old Federal Trade Commission Act, the Court of Appeals recognized:

Moreover, Wyndham is entitled to a relatively low level of statutory notice for several reasons. Subsection 45(a) does not implicate any constitutional rights here. [citation omitted] It is a civil rather than criminal statute. [citation omitted] And statutes regulating economic activity receive a “less strict” test because their “subject matter is often more narrow, and because businesses, which face economic demands to plan behavior carefully, can be expected to consult relevant legislation in advance of action.” [citation omitted]

In other words, one of Wyndham’s arguments deemed potentially viable, i.e., that it should not be held to a standard never actually put forth by the FTC in any prior ruling, will likely be rejected on summary judgment.    According to the Court, the relevant standard “considers a number of relevant factors, including the probability and expected size of reasonably unavoidable harms to consumers given a certain level of cybersecurity and the costs to consumers that would arise from investment in stronger cybersecurity.”  It is this applicable standard that the Court found Wyndham should have been on notice of prior to the FTC Complaint being filed against it.
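
To make the balancing concrete, the standard the Court describes can be loosely glossed in the style of the familiar Learned Hand negligence calculus.  The shorthand below is our own illustration, not notation drawn from the opinion:

B < P × L

where B is the burden (cost) of investing in stronger cybersecurity, including costs ultimately passed on to consumers, P is the probability of a breach at the existing level of security, and L is the expected size of the reasonably unavoidable harm to consumers.  On this gloss, a company risks an unfairness finding when the burden of additional safeguards is less than the expected consumer harm those safeguards would avoid.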

In a section of the opinion that may come back to haunt Wyndham, as well as future victims of a major data incident, the Court was quite blunt in its assessment of whether this statutory standard was potentially satisfied.  Id. at 41 (“Wyndham’s as-applied challenge is even weaker given it was hacked not one or two, but three, times. At least after the second attack, it should have been painfully clear to Wyndham that a court could find its conduct failed the cost-benefit analysis. That said, we leave for another day whether Wyndham’s alleged cybersecurity practices do in fact fail, an issue the parties did not brief. We merely note that certainly after the second time Wyndham was hacked, it was on notice of the possibility that a court could find that its practices fail the cost-benefit analysis.”).

The import of this decision obviously reaches well beyond the Third Circuit.  Now that an appellate court has affirmed the FTC’s authority to enforce what it considers applicable cybersecurity standards (“standards” that no other governmental body uses as aggressively as the FTC), the agency will have even greater leverage in future settlement agreements.  Given the scorched-earth tactics taken during this litigation, it is possible Wyndham will ask the United States Supreme Court to weigh in.   There is certainly an argument to be made that Wyndham’s time and money would be better spent mending fences with the FTC.

UPDATE:   On the heels of this victory, the FTC announced on August 28, 2015 that it would hold a free “PrivacyCon” conference on January 14, 2016 at its Constitution Center offices.  According to the event description, PrivacyCon will “bring together a diverse group of stakeholders, including whitehat researchers, academics, industry representatives, consumer advocates, and a range of government regulators, to discuss the latest research and trends related to consumer privacy and data security.”  Given that there is a call for “presentations seeking original research on new vulnerabilities and how they might be exploited to harm consumers,” hopefully the attendee list for this free event does not have too many “John Smiths” on it.

NJDC Affirms FTC Regulatory Power Regarding Data Security Practices

Judge Esther Salas of the United States District Court for the District of New Jersey ruled today that a Section 5 action brought by the FTC was sustainable against Wyndham Worldwide Corporation (“Wyndham Worldwide”) as well as various corporate affiliates primarily involved in the franchise side of its business.  This decision re-affirmed the FTC’s power to regulate “unfair trade practices” based on the failed data security of companies.   Judge Salas denied a motion to dismiss an FTC action based on the alleged violation of both the deception and unfairness prongs of Section 5(a) “in connection with Defendants’ failure to maintain reasonable and appropriate data security for consumers’ sensitive personal information.”  Wyndham Worldwide also sought dismissal on the ground that the consumer representations made by some corporate affiliates were not intended to apply to all corporate affiliates.

In what Wyndham Worldwide considered a matter of first impression, the Court rejected Wyndham Worldwide’s position that the FTC does not have authority to bring an unfairness claim involving lax data security.  Another allegedly unique aspect of this case turns on the fact that the corporate affiliate that initially sustained the data incident and also made most of the representations in question (Wyndham Hotels and Resorts, LLC) was able to implicate its corporate parent.

This decision is a rare judicial affirmation of the FTC’s broad power to assert itself in the data protection activities of companies. Typically, the FTC simply obtains consent as a byproduct of a settlement agreement.  Hacked companies routinely acknowledge the FTC’s power in this regard.

Although this decision merely resolves a motion to dismiss, leaving liability issues unresolved, privacy practitioners who visit with the FTC should review Judge Salas’ opinion and continue to track this matter.  Given the hard public positions taken by Wyndham and the FTC,  this case may very well end up in the Third Circuit or even the Supreme Court — eventually leading to an appellate court potentially defining the exact contours of the FTC’s authority to regulate hacked companies.

New Jersey State AG Enters into COPPA Consent Order

Capitalizing on its federal grant of authority, the New Jersey state Attorney General’s Office recently resolved claims it brought against Dokogeo, the California-based maker of the Dokobots app, under the Children’s Online Privacy Protection Act (COPPA) and the state Consumer Fraud Act.  According to the Consent Order filed on November 13, 2013, the Dokobots app is a geo-location scavenger hunt game that encourages users to visit new locations and gather “photos and notes from the people they meet.” One major attribute of the app is its geo-tracking of users.  A product review of the app describes it as blending “playtime, learning, exploration, and creativity in a curiously enticing way.” The State’s position was that the app was directed to children by virtue of its use of animated characters and a “child-themed” storyline.

The Consent Order alleges that the app collects information, “including e-mail address, photographs, and geolocation information,” deemed personal information under COPPA, yet did not provide any neutral age-screen registration process to restrict its users to those over the age of 13. Moreover, there was no terms of use agreement, and the privacy policy did not disclose that the app is restricted to users over the age of 13. Pursuant to the Consent Order, Dokogeo removed all photographs of children and location information from its website and agreed to more clearly disclose the information it collects.  As of November 24, 2013, the Dokobots site merely had a static home page – presumably because it is still in the process of implementing the terms of the Order.

The Consent Order also provides for a suspended fine of $25,000, which will be enforced only if Dokogeo fails to meet the terms of the Order.

This is the second such settlement reached by the New Jersey state AG’s office.  In July 2012, authorities announced a similar settlement with Los Angeles-based 24 x 7 Digital, LLC requiring the destruction of all children’s data that had previously been collected and transmitted to third parties.  That action was commenced by way of a Complaint filed in June 2012.

It is not unusual for state AGs to commence COPPA actions against out-of-state companies.  In fact, a state AG action under COPPA was brought years ago by the Texas AG against a Brooklyn-based company for improperly collecting personal information such as names, ages, and home addresses from children.  What is interesting about the Dokogeo case, however, is that the underlying statute requires that “the attorney general of a State has reason to believe that an interest of the residents of that State has been or is threatened or adversely affected. . . .” 15 USC § 6504 (emphasis added). Other than merely reciting the statute, no actual finding was made or referenced by the New Jersey AG’s office regarding the impact on New Jersey residents.  In fact, Dokogeo defended by arguing the app was intended for adults, and there was no discussion by either side regarding New Jersey users.

App developers are well advised to appreciate two basic lessons from Dokogeo. If an app appears to target children, developers should comply with COPPA — especially given FTC guidelines involving the collection of geo-data and use of photographs.   And, if they do not comply, they should be prepared to defend against those state AGs who are not averse to spending state dollars pursuing an enforcement action.

Plaintiffs’ Bar Hit Hard by Recent CMIA Decision

Insurers providing privacy liability coverage were collectively breathing a sigh of relief last week given a decision from the California Court of Appeal.  Interpreting the California Confidentiality of Medical Information Act (CMIA), the court in Regents of the Univ. of Cal. v. Superior Court of Los Angeles County, No. B249148 (Cal. Ct. App. October 15, 2013), significantly limited the ability of plaintiffs to obtain nominal statutory damages of $1,000 per patient under CMIA.  For the past several years, CMIA was pretty much the best game in town when it came to data breach litigation.  Although the statute has long been on the books, CMIA was only over the past several years successfully used by plaintiffs’ counsel to obtain settlements previously unattainable post-breach.  The CMIA “statutory damages” bonanza reaped by class counsel was significant – the prospect of such damages allowed counsel to overcome Article III and other “lack of injury” arguments, potentially allowed for class certification even with an otherwise uneven plaintiff pool, and created an early incentive to settle on the part of a defendant – and its insurer – given the potential size of an award.

It is no surprise CMIA was the bane of a good number of network security and privacy insurers – it led to settlements that would not have otherwise occurred.  The Regents decision is noteworthy because it is the first appellate decision to address the availability of CMIA statutory damages, and it rejected the notion that mere negligence coupled with disclosure could trigger statutory damages.  This is a major departure from how the law was interpreted by the lower courts and instantly dried up a good part of the statutory damages manna enjoyed by plaintiffs’ counsel.

The facts of the case would provide a nice law school hypothetical – a doctor’s home is burglarized and his encrypted external drive is stolen – and, just for good measure, he cannot locate the note card containing the drive’s password.   Was there unauthorized access to the stolen information?  A CMIA private right of action allowing for statutory damages turns on whether “negligence results in unauthorized or wrongful access to the information.”   It is easy to assume that when someone may also have stolen the password kept near a stolen hard drive, the theft will result in unauthorized access – especially when the stolen drive is never recovered.

After reviewing the statute’s legislative history and related laws, the Court of Appeal strictly construed the statute to allow for nominal, or statutory, damages of $1,000 – but only when there was actual “unauthorized or wrongful access to the information.”  Given that the class plaintiff was unable to allege her information was improperly viewed or otherwise accessed, the superior court was directed to dismiss the case.

In effect, the Court of Appeal significantly neutered CMIA by requiring actual improper access to a patient’s medical information.  In the most likely breach scenarios, ID theft and “actual access” can go hand in hand.  Armed with evidence of potential or actual ID theft, most plaintiffs’ counsel would withstand some level of motion practice – with or without CMIA.  In other words, the benefits derived from CMIA’s availability of nominal damages may have dwindled to some potential commonality assistance during a class certification motion.

Although it remains to be seen whether insurers will lower healthcare privacy premiums due to this one decision, one thing is certain – claims adjusters will have “a little” extra free time on their hands.