important deterrent to keep companies from engaging in reckless security
practices. But the compromised data holders are just one actor in a much
larger system. Let us step back for a moment and consider the big picture.
There are many actors that contribute to data breaches. They include
software manufacturers that create buggy, vulnerable software; makers of
insecure devices that can readily be hacked; ad networks and websites that
host malicious ads; platforms that don’t sufficiently vet apps; consumer
reporting agencies; government officials that exploit vulnerabilities;
government officials that create vulnerabilities; and organizations that
miseducate people, among others.
As Josephine Wolff aptly notes, the law often focuses on “the first or the
most easily understood point of access—the phishing email, the dictionary
attack, the unprotected wireless network.”21 What is overlooked are “all of
the groups and people who play some role in enabling successful
cybersecurity incidents.”22 Wolff is exactly right. The law needs to expand
its scope to hold more actors accountable for data breaches.
WHAT
Almost every hack seems like the result of a technical failure or individual
blunder.23 But usually those failures or blunders were orchestrated by
criminals taking advantage of a system where nobody wants to accept
blame for a security lapse. The lack of accountability within these systems
causes, or contributes to, a lot of breaches (or makes them more harmful).
When we step back and look at the big picture, security is about
structure. It is about the structure of a data ecosystem where many actors
contribute greatly to the problem and aren’t held accountable. It is about the
internal structure of organizations that has fragmented privacy and security.
It is about the structure of products, services, and even particular security
measures that are designed in ways likely to fail. Unfortunately, the law
often responds with a set of reactionary jabs and band-aids.
WHEN
The law most often jumps in after the breach. But this is the least effective
time for the law to become involved. The multitude of actors that contribute
to a breach often make their contributions long before the data breach
occurs. The law’s temporal focus must change. By the time a breach occurs,
it’s far too late. The other actors have done their damage at a very different
point in the timeline. This is why focusing on the breach has severely
limited the law’s effectiveness.
A better strategy would be to focus on the optimal time to intervene in
the life cycle of a cybersecurity incident. Sometimes that will be before the
incident occurs, such as by regulating design, and sometimes it will be after
a breach occurs but before a risk of harm manifests itself.24
HOW
The current system is deeply flawed because it actually promotes poor
security and worsens the harm caused to victims when a data breach occurs.
Numerous actors play a role in data breaches beyond the organizations that
suffer the breach. The market often fails to create the incentive for good
security, and in many cases, the incentives encourage poor security. The
reason is that all the parties in the data ecosystem have a very
strong incentive to shift the blame (and resulting liability) for a breach onto
others, because they don’t want to end up holding the bill.25 Data security
law right now is like a game of hot potato where no one wants to be stuck
holding the potato when the timer runs out. Time and energy are wasted
passing around the hot potato instead of having many hands make for a
lighter load.
The law should seek to rectify this failure. The law should create rules
that shape the incentives for actors and give them clear guidance on the type
of activity they should be protecting against or encouraging. The law
shouldn’t rigidly dictate policies for all organizations, as each organization
is best positioned to make decisions about establishing its own optimal
security balance. The problem is that organizations often don’t reach a
balance that is good for society.
The law should step in to restrict organizations from establishing a
balance that is bad for society. The law can also push organizations to take
security measures that are more consistent with how humans behave rather
than ones that are in denial of human behavior.
Additionally, the law can incentivize better design. Design is often
delegated to engineers. Policymakers are sometimes wary of being too
paternalistic because they think that they don’t understand technology. But
engineers often design technology in ways that would be secure if used by
robots but are woefully insecure when used by people. Technologies are
often designed in ways that exacerbate people’s tendencies to be insecure
rather than nudge them to be more secure.
Data security law must look beyond the breach. It must become more
holistic about whom it holds responsible and when it becomes involved.
The law can correct for the current market failure in data security; it can
improve the security balance by more accurately allocating the costs of
breaches to all the responsible actors; and it can be more preventative and
less reactive.
5
Responsibility Across the Whole Data Ecosystem
They made it seem so simple. Just download the app, take a photo of
yourself (the racier the better), send it to your friend or lover, and poof . . .
the image or “Snap” disappears after a few seconds. But it was never that
simple, and it definitely was never that safe.1
The popular photo-sharing social media app Snapchat promised
ephemeral communications, but malicious actors had other plans. In the fall
of 2014, hackers intercepted hundreds of thousands of pictures and videos
taken by Snapchat users. After a few days of bragging and bluster, the
hackers posted the photos online.
As a company already under a consent order with the FTC to protect
users’ privacy, Snapchat was quick to proclaim that it did nothing wrong,
promptly issuing a statement that read, “We can confirm that Snapchat’s
servers were never breached and were not the source of these leaks.” The
culprit was an insecure and unauthorized third-party software program
designed to let users store “disappearing” snaps. Snapchat blamed its users:
“Snapchatters were allegedly victimized by their use of third-party apps to
send and receive Snaps, a practice that we expressly prohibit in our Terms
of Use precisely because they compromise our users’ security.”
Snapchat was referring to fine print buried in its terms of use that banned