everything on a checklist. Such safe harbors should be rejected as they can
further entrench the faulty checkbox approach.
Ultimately, what matters is not just whether a particular type of security
measure is implemented but how it is implemented. Far too often, checklists
that look good on paper end up being poor in practice. For example, it’s
easy to check the box for security awareness training but it’s far harder to
implement effective training that really has an impact on behavior.
While lawmakers should seek to discourage checklist behavior, the law
should still provide sufficient guidance to organizations about the basics of
good data security. Lawmakers and regulators should clarify
“reasonableness” standards for data security by providing more concrete
guidance. To be reasonable, the policy choices must amount to sensible risk
management. There can be different degrees to which organizations are
willing to tolerate risk, and the law shouldn’t mandate a one-size-fits-all
approach. Although no one size fits all, there are organizations that engage
in poor risk management, and these organizations should be penalized.
The law should encourage greater integration of privacy and security
Privacy and security go hand-in-hand. Moreover, privacy involves issues
that are often quite muddy, so privacy professionals are typically more
experienced in thinking about muddy issues. A major theme in our
recommendations is for security thinking to become less mechanistic and to
embrace the muddy balancing needed for good risk management. Privacy
can contribute to data security by bringing its experience with this less
formalistic way of thinking.
There are many components of privacy regulation that can strengthen
security, such as data minimization, data mapping, and other requirements.
Good privacy hygiene can reduce the likelihood of breaches as well as their
severity.
The law should require or encourage security by design that accounts
for the human element in systems
Lawmakers should take the design of
information technologies more seriously because of the significant role that
design plays in shaping human behavior. Some technological designs
exacerbate people’s carelessness or weaknesses; these designs make it easy
for people to make a mistake. Some designs help nudge people to avoid
mistakes.
Lawmakers are often reluctant to regulate design for fear of being too
paternalistic. But there are strategies for how the law can influence design
without being too rigid, such as requiring certain defaults rather than
banning features or options. Lawmakers should mandate certain designs
that are widely known to create unwarranted security risks, such as
requiring manufacturers of devices to force users to change default
passwords.
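To make the default-password mandate concrete, the design can be sketched as a small piece of device logic. This is a hypothetical illustration, not any particular manufacturer's firmware: the flag names, the shipped default value, and the minimum-length rule are all assumptions. The key design choice is that a correct factory-default credential grants access only to the password-change step, never to normal operation.

```python
# Hypothetical sketch of "security by design" for default passwords:
# the device refuses normal operation until the factory default
# credential has been replaced.

FACTORY_DEFAULT = "admin"  # assumed shipped default password


class DeviceAuth:
    def __init__(self):
        self.password = FACTORY_DEFAULT
        self.password_changed = False

    def login(self, attempt: str) -> str:
        if attempt != self.password:
            return "denied"
        if not self.password_changed:
            # The nudge: a valid default password opens only the
            # password-change screen, nothing else on the device.
            return "must-change-password"
        return "ok"

    def change_password(self, old: str, new: str) -> bool:
        # Reject reuse of the factory default so a user cannot
        # "change" the password back to the insecure shipped value.
        if old != self.password or new == FACTORY_DEFAULT or len(new) < 8:
            return False
        self.password = new
        self.password_changed = True
        return True
```

The point of the sketch is that the secure behavior is the path of least resistance: the user is not asked to remember a best practice; the design simply makes the insecure state unusable.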
Lawmakers should promote security measures that encourage humans to
behave in ways that increase security. Far too often, the law does the
opposite, rewarding security measures that ignore the human element.
The law should promote a more uniform and less demanding set of
security norms so that people have the right expectations and
knowledge
Critics often marvel at how foolishly people behave regarding
security. So many people seem to do so many unwise, insecure things. In
many cases, it’s not people’s fault. People receive inconsistent and muddled
messages about security. They are taught to avoid clicking links in emails to
reset their passwords, but then legitimate companies ask people to do this
very thing. Lawmakers should help coordinate a more uniform approach to
developing the right understanding about data security in people that
doesn’t place too much of a burden on them.
Lawmakers should ensure that more effective and accurate signals are
sent to people about security. Bad signals can lead to insecure behavior, and
good signals can greatly increase secure behavior.
The law should become more holistic, more strategic, and more cognizant
of the role humans play in the data ecosystem. If data security law is going
to stand any chance in a world of artificial intelligence, smart devices, and
social media, it must move beyond the breach. Doing so will make the law
much more effective and adaptive. With any luck, we will be able to stop
calling every year “The Year of the Data Breach.”
ACKNOWLEDGMENTS
We would like to thank the participants of the 2018 Privacy Law Scholars
Conference workshop of our book proposal and chapter. We also would like
to thank the following people for providing comments, reviewing drafts,
and helping refine the arguments in the book: Annie Anton, Steven
Bellovin, David Choffnes, Dissent Doe, Chris Hart, Chris Hoofnagle,
Edward McNicholas, Paul Schwartz, Peter Swire, Charlotte Tschider,
Christo Wilson, and Josephine Wolff. We are particularly grateful to Kyle
Berner, Jonathan Cleary, Kayvan Farchadi, Katherine Grabar, Johanna
Gunawan, Alissamariah Gutierrez, Ahmed Khalifa, Charlotte Kress, Jay
Mohanka, Alexander Nally, Trevor Schmitt, Alexis Shore, and Julia
Sweeney for their research assistance.
We also want to thank our agent, Susan Schulman, for her great efforts
to further this project and David McBride and Holly Mitchell, our editors,
for their support and flexibility throughout this project.
NOTES
CHAPTER 1
1.
Michael Riley, Benjamin Elgin, Dune Lawrence & Carol Matlack, Missed Alarms and 40
Million Stolen Credit Card Numbers: How Target Blew It, Bloomberg (Mar. 17, 2014),
available at https://www.bloomberg.com/news/articles/2014-03-13/target-missed-warnings-in-
epic-hack-of-credit-card-data (hereafter Riley, Missed Alarms).
2.
Megan Clark, Timeline of Target’s Data Breach and Aftermath: How Cybertheft Snowballed
for the Giant Retailer, International Business Times (May 5, 2014), available
at https://www.ibtimes.com/timeline-targets-data-breach-aftermath-how-cybertheft-snowballed-
giant-retailer-1580056.
3.