Patrick Walsh
Originally published at blog.ironcorelabs.com.

Evaluating Newly Proposed Privacy Regulations

New privacy laws are popping up like mushrooms here in the U.S., which we love to see. But it can be hard to tell whether these proposed laws are meaningful or not. There are many things to look for, such as requirements to be transparent about how data is used, which are becoming common. But there are two things in particular that I watch for when evaluating these laws:

  1. A “duty of care”
  2. A “private right of action”

Criterion 1: “Duty of Care”

What is it?

Duty of Care: “The responsibility of a person or organization to take all reasonable measures necessary to prevent activities that could result in harm to other individuals and/or their property.” — Legal Dictionary

Isn’t this a given? If someone leaks our personal data and we’re harmed because of it, isn’t that person liable? In short, probably not.

In 2015, a Pennsylvania court refused to allow people harmed by the theft of their confidential information to sue for negligence. Legal reporting from that time said the court declined to create a “duty of care” to protect confidential personal data, and no such duty had been explicitly created elsewhere.

Some countries are doing a better job of this. For example, the E.U.’s GDPR creates an implicit duty of care for anyone handling the personal information of people in the E.U., even for companies that reside outside of the E.U.

In the U.S., we do have some protections. There is a patchwork of state and federal laws that cover some information some of the time.

For example, under HIPAA, health care providers have to keep patient data private. But if your coworkers glean that same information, they can repeat and share it without any HIPAA protections. Other laws protect student data, children’s data, financial data, criminal justice data, and much more.

But when a user gives personal information — anything ranging from home address to sexual orientation to psychiatric diagnoses — to a tech company, there are often few, if any, protections.

This is one reason why states like California have passed landmark privacy laws, and why states like Colorado are proposing their own.

Criterion 2: “Private Right of Action”

Unfortunately, it may not be enough to create a duty of care. In many states, the new duty of care for personal data is evaluated by a state attorney general who has the sole discretion to determine whether a company failed to meet the standard. And those decisions are subject to budget constraints, politics, and more.

So if your data was carelessly protected (or not protected at all) by your bank or your software vendor, then you don’t have direct recourse unless you live in a place that gives you a private right of action.

The private right of action allows someone to directly sue for damages when their data is stolen due to a company’s negligence.

California is an interesting case study here. Its initial law, the CCPA, was strengthened (arguably) by a ballot measure that amended it, the CPRA. One of the changes was to take enforcement away from the Attorney General of California and move it into the hands of a new enforcement agency.

But that enforcement is just for privacy issues where a company uses, shares, or sells data in a way that is different from what a consumer expects. In the security breach case, where data is stolen due to insufficient precautions, consumers have a private right of action and may directly sue breached companies. We’ve previously blogged about how this changes the economics of a breach substantially.

Without a private right of action, it’s unlikely that most privacy laws will change behavior in any meaningful way beyond producing more legalese and more pop-up terms and conditions agreements.

Colorado’s Proposed Privacy Law

So when we saw that Colorado has a new privacy law making its way through the state legislature, we were naturally curious to see whether it had any teeth.

👍 Per an analysis by Davis Graham & Stubbs, the Colorado bill does propose a duty of care:

Duty-of-care: controllers must employ reasonable security measures to protect personal data against unauthorized acquisition during both storage and use.

In some ways, this was already the case in Colorado under the existing Colorado Consumer Protection Act. Still, under that act, a plaintiff had to show that injury was caused by the loss of their data, which is nearly impossible to do if a company offers identity theft insurance following a breach. Under the new bill, the duty of care exists regardless of provable damages.

So this is a big step forward for Coloradans, particularly since this protection extends to companies that don’t reside in Colorado.

👎 Unfortunately, the Colorado bill does not provide a private right of action, which means that someone whose data is stolen through their service provider’s negligence cannot directly seek redress.

Instead, they have to lobby the Colorado Attorney General to get involved. If the Attorney General takes the case at all, the office would assess the data protection measures used and determine whether an injunction or a civil fine is warranted.

Will the Colorado law drive new and better behavior by those holding personal data?

Without a private right of action, that seems unlikely. It’s a step in the right direction, just not a very big step.

Other States

There are currently 15 states with privacy legislation introduced in 2021, according to the IAPP.

IAPP even breaks down the various bills and some of the common provisions in them. It’s worth checking out.

By our count, five states currently have duty of care and private right of action provisions: Florida, Massachusetts, Minnesota, New York, and Washington.

As more of these laws pass, the privacy patchwork problem will only grow worse. The good news is that most companies will be forced to adhere to the strictest of these laws, so the most aggressive states can shape privacy for the whole country.

The Bottom Line

We’re not politicians or lawyers. Our expertise is in data protection and data privacy from a technical point of view. We’ve seen up close how most companies handle data and the current state of practice in protecting customer data. The problems are probably worst in big companies, even though they likely have good security teams: the more people in a company and the more people with access to the data, the more complex the systems surrounding that data become. Complexity is very much the enemy of security, and the situation is dire.

The industry could move to much more secure models, but the incentives to make those moves are generally lacking.

So we’re looking to the legislators to build strong economic incentives toward better data protection and better behavior. If done well, before long, all the tools developers use to build software will be enriched with data protection features, which will make for a much more secure ecosystem.

Each new law is another rock in a growing rock slide of data privacy protections. Perhaps one day we’ll even have some semblance of control over our data, who has access to it, and what they can do with it.