Privacy Risks in Big Data: What are the Legal Aspects?
Concern over privacy was big before Big Data, but now it’s bigger.
The Snowden revelations and the 2013 theft of customer data at Target have merely served to heat a pot that was already boiling.
According to Ira Rubenstein (2013),
“Finding the right balance between privacy risks and big data rewards may very well be the biggest public policy challenge of our time.”
The Privacy Act of 1974 incorporated some “fair information practices,” but its scope was limited to the government and some of its contractors.
Know Your Piece of the PII
Know this abbreviation: Personally Identifiable Information (PII). For health PII, that means account numbers, phone numbers, dates of medical events, email and home addresses, and biometric identifiers, among other identifiers.
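As a rough illustration, scanning free text for a couple of these identifier types takes only a few lines. The regexes and field names below are hypothetical simplifications for the sketch; the HIPAA Safe Harbor de-identification method actually enumerates 18 identifier categories, and real detection tools go far beyond pattern matching.

```python
import re

# Hypothetical patterns for two of the PII fields named above.
# Real de-identification must cover all 18 Safe Harbor categories.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def find_pii(text):
    """Return (field, match) pairs found in free text."""
    hits = []
    for field, pattern in PII_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((field, match))
    return hits
```

A scan like this is a tripwire, not a compliance program: it flags obvious leaks in logs or staging data before they reach a Big Data repository.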
There is a vague assumption that somehow there is a U.S. constitutional protection for privacy, but in fact there is none. While courts have determined that there are certain provisions that imply a level of privacy that is protected by the Constitution, informational privacy is only indirectly protected.
That said, constitutions in several states expressly grant individuals some privacy rights, though these mainly entail transactions between individuals and state governments, rather than between individuals and businesses. There are some protections for physician-patient communications, though exceptions are provided for governmental reporting.
On the other hand, personal health information does have legal protection, and those in possession of health data are subject to HIPAA rules even if they are not health care providers themselves. The U.S. Office for Civil Rights "Final Rule" extended HIPAA/HITECH liability to those involved in medical billing, coding, clearinghouses, and health care plans.
Who Cares? 67% Do.
People say they care about keeping health information private. In one survey, 67% of respondents said they were concerned about the privacy of their medical records. The percentage rises when respondents are asked to consider the possibility of a nationwide system of medical records. (In a practical sense, that question is a proxy for asking people to respond to Big Data health care.) Additional provisions protect the subjects of medical, psychological, and drug experiments.
IT professionals should avoid pronouncements like Scott McNealy’s infamous 1999 utterance: “You already have zero privacy anyway. Get over it.”
HIPAA, HITECH High Wire Acts
HIPAA (Health Insurance Portability and Accountability Act) is the acronym most people remember, but HITECH, the 2009 Health Information Technology for Economic and Clinical Health Act, supplied most of the teeth behind the privacy provisions.
Fines for HIPAA violations are stepwise. An unknowing, accidental breach incurs a $100 fine, while neglecting reasonable safeguards draws a $1,000 fine. The maximum civil penalty is $25,000 per person for identical violations. For knowingly disclosing PII health information for commercial advantage, the criminal penalty is far more severe: up to a $250,000 fine, up to a 10 year prison sentence, or both.
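The stepwise civil structure can be sketched as a simple lookup. The figures are the ones quoted in this section, used purely for illustration; actual penalty calculations are fact-specific, and none of this is legal advice.

```python
# Civil penalty tiers as quoted above (illustrative figures only).
TIERS = {
    "accidental": 100,    # unknowing violation
    "neglect": 1_000,     # failure to apply reasonable safeguards
}
CAP = 25_000              # per person, for identical violations

def civil_penalty(tier: str, violation_count: int) -> int:
    """Total fine for repeated identical violations, capped per the statute."""
    return min(TIERS[tier] * violation_count, CAP)
```

Note how fast the cap is reached: fifty neglected-safeguard violations already hit the $25,000 ceiling, and criminal exposure for commercial disclosure sits on top of that.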
How much health information will make it into Big Data repositories? Plenty, according to Big Data proponents. Whether it belongs there or not, it is often at the top of everyone’s Big Data list.
Big Data analysts, for example, are often subcontractors to the parties who provide the data; they are not responsible for creating it. Under HIPAA, a breach at the analysts' cloud facility, to give a plausible example, is the problem of the data user, not the data provider. In other words, there's no dodging liability by saying, "It's their data."
Stiff Penalties are Levied for Offsite Unencrypted Storage of PII
Some Hypothetical Big Data Breaches
It takes little imagination to concoct a few hypothetical Big Data breach scenarios, presented here with their nearest real-life penalties.
- A QlikView pivot-table display containing Accountable Care Organization data is left publicly accessible. (Small Data real-world version: a five-physician cardiac surgery practice paid a $100,000 penalty in 2012. This was a case that cried out for enforcement: the practice posted appointment schedules on an unsecured web calendar.)
- Offsite backup media are unencrypted and still contain personally identifiable information. (Small Data real-world version: the APDerm breach in 2013, for which $150,000 was assessed.)
- A Big Data ETL staging area (the flavor that Syncsort's Big Data software is designed to eliminate) is accidentally left open to unsecured FTP. (Small Data real-world version: managed care organization WellPoint was assessed $1.7M for making health care information available over the internet.)
- A multitenant cloud provider experiences a tenancy failure, with data potentially leaking into adjacent tenant storage. (Small Data real-world version: AvMed paid $3.5M after a breach, even though class action members could not prove actual damage.)
- An employee of a "Big Data company" steals 500,000 names for identity theft. (There is usually a two-year mandatory minimum penalty for federal violations, and many states have identity theft statutes.)
- For further reading, HHS, the enforcer for HIPAA and HITECH, publishes some of its settlements, which it calls "resolution agreements." These will further whet the appetite for punishment.
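One cheap, partial safeguard against the unencrypted-backup scenario above is an entropy check: well-encrypted data looks like uniformly random bytes, while plaintext does not. The heuristic below is a sketch, not a guarantee; compressed archives also score high, and it says nothing about whether keys were managed properly.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte; properly encrypted data approaches 8.0."""
    if not data:
        return 0.0
    total = len(data)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(data).values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    # Heuristic only: compressed files also score high, and short
    # samples are noisy. This cannot prove the backup was encrypted.
    return shannon_entropy(data) >= threshold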
Future Tenants? Cloud Providers after Multitenancy Breaches
A Contrarian Benefit Analysis
Jules Polonetsky, Co-Chair and Director of the Future of Privacy Forum, and Omer Tene, a Senior Fellow at the Forum and an Affiliate Scholar at the Stanford Center for Internet and Society, authored a piece on privacy law for the Stanford Law Review in which they argue that legal frameworks for privacy give too little consideration to cost-benefit tradeoff analysis. Specifically, they believe that the FTC and the EU, both charged with aspects of privacy protection, are legally obligated to assess potential benefits to consumers (and other parties).
They point out that giving up some personal data can benefit "a proximate class," such as others in a geographic region who share exposure to the flu, slow internet service, or patterns of energy consumption or flooding. Many aspects of the Internet of Things (IoT) fit this description.
“. . . Privacy professionals need to have at their disposal tools to assess, prioritize, and to the extent possible, quantify a project’s rewards.”
The Penalty Phase
So – you're a startup with a Big Data business plan. What should you worry about? There are the usual suspects: SLAs, contracts, professional liability insurance. Then start the worry list by searching Google's latest privacy troubles.
Disclaimer: This list is by no means complete.
- Early in 2014, Google was fined $203K by France and $1.2M by Spain. "Watchdogs" from the EU Article 29 Data Protection Working Party, who advise the European Commission, complained that EU individuals were not notified about how and why Google used their PII.
- If you host data from European users that includes their IP addresses (routinely collected by most web servers), that's PII.
- If your startup plans to collect money by credit card over the web, look at Nevada’s state law NRS-603A, Minnesota’s Plastic Card Security Act, and Washington state’s H.B. 1149. They have incorporated all or parts of PCI DSS.
- Make clear who owns any social network data you collect. PhoneDog.com sued a former employee, who had built a 17K-follower Twitter account, for $350,000 http://bit.ly/1fJcyjv.
- Notification is probably mandatory. Since 2003, 45 states have enacted statutes requiring businesses to give out the bad news of a breach. http://bit.ly/1fJcyjv
- Review the Stored Communications Act 18 USC 2701-11. In the absence of a company policy regarding company internet investigations and expectations of privacy, employer control over employee internet data may be limited. http://bit.ly/1fJcyjv
- For multinational corporations, worry about cross-border data transfers. For example, in November 2009, the European Network and Information Security Agency issued a report on cloud computing warning that companies remain responsible under U.K. law for safeguarding their customers’ information even if those data are stored by a service provider elsewhere.
- Contracts with cloud providers must address requirements for security, retention, response to breaches, and obligation to provide e-discovery in the case of lawsuits or government investigations.
- Failure to comply with eDiscovery can be costly. In late 2013, U.S. District Court Judge David Herndon imposed close to $1 million in punitive damages on defendants Boehringer Ingelheim International GMBH ("BII") and Boehringer Ingelheim Pharmaceuticals, Inc. ("BIPI") for failing to adequately comply with the court's discovery orders in In re Pradaxa Products Liability Litigation.
- In all jurisdictions the Computer Fraud and Abuse Act (CFAA), 18 U.S.C. 1030, the federal computer crime statute, applies to former employees who steal data from the company computer, but in two federal circuits it does not apply when the theft occurs during employment.
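On the IP-address point in the list above, a common mitigation is to truncate addresses before storage, zeroing the host bits, as several analytics products do. Whether truncation amounts to sufficient de-identification under EU rules is a legal question, not a technical one; the sketch below uses Python's standard ipaddress module, and the prefix lengths (/24 for IPv4, /48 for IPv6) are illustrative assumptions, not a recommendation.

```python
import ipaddress

def anonymize_ip(addr: str) -> str:
    """Zero the host bits: keep /24 for IPv4, /48 for IPv6 (illustrative)."""
    ip = ipaddress.ip_address(addr)
    prefix = 24 if ip.version == 4 else 48
    network = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(network.network_address)
```

Truncating at ingest, before logs ever land in a Big Data repository, is far easier to defend than trying to scrub stored data after a regulator comes calling.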
Ignorance of the law is no excuse. We learned that in Civics 101. If you are a Big Data / Big Data Analytics provider, you now have one less excuse for failing to address compliance, governance and security. The health record you're protecting may be some ETL'd version of your own.