breaches, with the aim being to serve as a deterrent for future breaches.
Although this approach certainly accomplishes some degree of deterrence
and incentivizes improved security, it is far from enough. Even
organizations with good security are breached. It is hard to defend against
persistent attacks. Eventually, even very security-savvy organizations will
make mistakes. There will always be weak spots in the security armor or
careless blunders. Organizations can certainly do better, but being perfect is
almost impossible, especially for organizations with a large workforce and
many vendors. Pushing organizations to improve can help but there is a
point of diminishing returns. Even the organizations with the best security
programs will stumble.
In this chapter, we propose that improving data security requires seeing
it quite differently. In what we call “holistic data security,” we contend that
data breaches aren’t a series of isolated incidents as they often are assumed
to be. Data breaches are the product of the data ecosystem, which is
perversely structured in ways that not only fail to prevent data breaches
but make it easier for them to occur and heighten the damage they cause.
We contend that the law must dramatically widen its scope. It must move
away from its narrow focus on data breaches. It must become more
involved earlier on. It must apply to the full range of actors that contribute
to the problem. In short, the law must address the structural points where
the system is failing.
STOPPING ALL DATA BREACHES ISN’T THE RIGHT GOAL FOR
THE LAW
What do we want data security law to accomplish? Many might reply:
“Stop all data breaches. The ideal is perfect security. Data should never be
compromised.” This is essentially what the law is proclaiming: “DON’T
GET BREACHED!” By ratcheting up the cost and pain for breaches, the
law is declaring: “If you get breached, you will pay more. It’ll hurt more.
So, do everything possible not to get breached.”
The language and imagery of data security reinforce this view. If you do
a search for the keyword “security,” you will be inundated by thousands of
images of locks and safes. Security is often analogized to locking
something in a big vault or padlocking it in a fortified place. People think
that to have data security, you must put the data in an impenetrable location,
in a castle surrounded by a moat high up on a cliff guarded by thousands of
knights.
Although at first blush the goal of perfect security seems desirable, it is
actually the wrong goal, and it is based on a fundamental misunderstanding
of what data security is about.5 When security is properly understood, we
will see that it is more of an art than a science, a matter of deftly
balancing tradeoffs and opposing goals. These tradeoffs can't be denied if we
want good data security policy. We can’t have perfect security, and we
wouldn’t want it either.
Why We Don’t Want Perfect Security
It seems odd to claim that perfect security isn’t good. Nobody wants
intruders or unauthorized access or data loss or theft. So why shouldn’t we
strive for perfect security?
The reason is that security isn’t the only goal for data systems. Just as
important as keeping the bad actors out is allowing the good actors in. Why
bother keeping personal data if nobody could ever see it or use it?
We want to use and transfer data quickly and easily; we want ease and
convenience. But quick, easy, and convenient are a recipe for security
debacles. If you make it more convenient for good guys to access data, then
you often make it more convenient for bad guys to access it too. Every time
data is stored, there’s a security risk. Every time access to data is granted,
there’s a security risk. Every time data is transferred, there’s a security risk.
Anything involving the Internet is risky. Email is risky. Sharing files is
risky. Nearly everything that is efficient, productive, convenient, or useful is
risky.
A security professional who locks all the data away or makes it very
cumbersome to access would prevent an organization from functioning.
Imagine a patient being wheeled into the ER, with the doctors having to go
through time-consuming steps to get the information they need about the
patient. If data is too locked up or if access to data is too slow and
cumbersome, then people can’t do their work—and the security
professional will soon be out of work.
People may say that they want extreme security, but they also want
convenience. People want these conflicting things, and they don’t like being
told that they can’t have their cake and eat it, too.
Law professor Guido Calabresi poses an interesting hypothetical. He
asks you to imagine that you are the leader of a country. An evil deity
comes to you and asks you whether you will accept a special gift for your
country—a magical machine that would make life much easier and more
convenient. “Of course,” you say. “But there’s a catch,” the evil deity says.
The gift would come at a great cost: 40,000 lives lost every year. Would you
accept the gift?
“Absolutely not!” most people would declare. When people say no,
Calabresi asks: What’s the difference between the gift and the automobile?
Cars make life much more convenient, but car crashes kill about 40,000
people per year.6 Society accepts this “gift” despite the costs. Calabresi
calls this a “tragic choice.”
We accept great costs for convenience, but it is very uncomfortable to
admit it. We could make cars a lot safer by designing them so they couldn't
go faster than 15 mph and by giving them bumpers 10 times bigger. There are many
safeguards that could be implemented, but at the cost of a large sacrifice of
utility.
Security is somewhat like the car. We want convenience and speed, but
these things come at a tragic cost. We can’t ignore these tradeoffs.
Pretending that they don’t exist will result in poorer data security because
the interests on the other side won’t just disappear if ignored. People will
look for workarounds that will often undermine security.
We should thus be honest about the goals of data security law. We don’t
want data security at all costs. We don’t want to do what it takes to stop all
breaches. We must accept a certain amount of risk to access and use data
quickly and conveniently. The key, of course, is just how much risk we
should accept and how much utility we want.
To complicate matters, there is no ideal balance that works for all people
and all organizations. The fact that there is no one-size-fits-all security
balance doesn’t mean that we can’t assess whether a balance is good or bad.