around clunky security procedures. When policymakers create rules that don’t factor in people’s inevitable foibles and incentives to create workarounds, they get unintended consequences. We need to think of ways to better account for human behavior in designing security policy.
RETHINKING DESIGN RULES FOR TECHNOLOGIES
Although many experts realize that human fallibility is the bane of data security, information technologies are often not designed with this fact in mind. By designing, we mean creating something according to a plan. Design is a key aspect of good data security policy.49 Design is everywhere, design is power, and design is political.50 Nearly everything about technology results from a design choice, from the interface we see when we use software, to the buttons we push on devices, to the initial settings on platforms. The way humans interact with technology is at the center of so many security vulnerabilities, and design is all about structuring these interactions.
In writing about the pitiful security in most modern technological devices and systems, Bruce Schneier suggests that “Security needs to be engineered into every system, and every component of every system, from the beginning and throughout the development process.”51 We agree. The principle that Schneier is recommending is known as “security by design.”

The law is several steps behind. First, only a few data security laws even try to regulate design. Most notably, Article 25 of the GDPR requires “data protection by design and by default.”52 Although the GDPR isn’t very specific about what these baselines should be, at least it is a start. Beyond the GDPR, few other laws say anything about design. The law must change course and start regulating design, as doing so is crucial for strong data security.53
Second, the law not only must regulate design, but also must push for design that accounts for the human element. Designs that are oblivious to human behavior will fail. Security often focuses on keeping attackers out rather than on mitigating the errors of people who are already legitimately inside.

All design choices have one of two effects (and often both): They can signal information to people or make particular behavior easier or harder by imposing or reducing transaction costs—the expenditures of time, labor, and resources that are necessary to complete a task. Signals and transaction costs shape our mental models of how technologies work and form the constraints that guide our behavior in particular ways.54 Design decisions make certain realities more or less likely. Companies and hackers know that the power to design is the power to exert control over people. This is because people react to signals and constraints implemented through design in predictable ways. People with the right know-how can build or leverage the design of consumer technologies to encourage desired user behavior.55
A concern with regulating design is that it might be too paternalistic. We don’t want policymakers who aren’t versed in technology to make decisions far beyond their expertise. The regulation of design, however, need not focus on the specific details of technologies; policymakers do not need to become backseat engineers. Instead, there are more general design requirements that the law can impose that will not be overly meddlesome but will be quite effective in promoting secure design. Below, we set forth a few proposals for the types of security design requirements that follow our overarching principle: design with the human element in mind.
Changing the Default Settings
Humans are prone to inertia. Most users will not take action, and design must reflect this fact. We can beg and plead with people to do things, but this nagging will fail to work consistently with many people. Bruce Schneier wisely recommends that all “devices need to be secure without much intervention by users.”56 The less work people need to do for security, the better.
One way design can have a significant effect is with default settings—the preselected options within software programs and devices. Default settings have a major impact on human behavior. Many programs, apps, and devices have initial settings that are quite insecure. For example, the default settings for many social media sites are set to maximize sharing. According to a 2015 study from Kaspersky Lab, 28 percent of social media users neglect privacy settings and leave all their posts, photos, and videos accessible to the public.57 Not only is this setting bad for privacy, but it is also bad for security, as it can lead people to expose personal information in ways that might compromise their overall data security. Recall how Christopher Chaney was able to guess the password recovery questions of celebrities: he found their information online.
Certain apps and sites have unexpected defaults that can be very clunky to change. For example, at Venmo, a payment service, user financial transactions are public by default.58 The process for a user to limit access to all their transactions is not very intuitive. The “Default Audience” setting only affects the visibility of the charges and payments initiated by the user. Transactions initiated by friends who have not changed the default privacy setting will still be shared publicly. To obscure all transactions, a user must hunt down and change the inconspicuous “Transactions Involving You” setting.59
The law could require or encourage default settings limiting the sharing of personal information online. If people want to share more, they could certainly change the settings, but many people haven’t thought enough about the privacy and security consequences of those settings. As we discussed in the previous chapter, rules that strengthen privacy often also strengthen security; a rule about default settings typically falls in the privacy category but can have benefits for security.
The law could also require or encourage default settings for the use of two-factor authentication for certain services or the option to automatically update software to more secure versions. If these options are not selected by default (or if the software doesn’t nudge users toward using them), then users are less likely to take advantage of them.
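To make the idea concrete, here is a minimal sketch of what secure-by-default settings might look like in code. The class and field names are hypothetical, not drawn from any real service; the point is simply that every preselected value favors security, so a user who never opens the settings page is still protected.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Hypothetical account settings where the defaults do the work."""
    posts_public: bool = False       # sharing is opt-in, not opt-out
    two_factor_enabled: bool = True  # 2FA stays on unless explicitly disabled
    auto_update: bool = True         # security patches install without user action

# A user who changes nothing gets the secure configuration by inertia alone.
settings = AccountSettings()
```

Because humans are prone to inertia, the default does the work that nagging cannot: the secure choice requires zero effort, and the less secure choice requires a deliberate act.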
Another way that the law can help strengthen security is to address default passwords. One of the most commonly exploited vulnerabilities in devices is the default password.60 Many devices ship with a simple default password, and it is sometimes hard-coded in. Many users never change this password, making it easy for hackers to break into devices. This is how the previously discussed DDoS attack on Krebs was perpetrated.

One solution would be to require device manufacturers to force users to change default passwords when they start using the device. This rule might seem like a small requirement, but it can make an enormous difference.
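A forced-change flow of this kind can be sketched in a few lines. The Device class below and its rules (rejecting the factory password, requiring a minimum length) are illustrative assumptions, not any vendor’s actual firmware; real devices would also persist state and store only hashed credentials.

```python
FACTORY_PASSWORD = "admin"  # the kind of hard-coded default attackers exploit

class Device:
    """Hypothetical device that refuses to operate on its factory password."""

    def __init__(self):
        self.password = FACTORY_PASSWORD
        self.password_changed = False

    def login(self, password: str) -> str:
        if password != self.password:
            return "access denied"
        if not self.password_changed:
            # The device works only after the default is replaced.
            return "change password before continuing"
        return "logged in"

    def set_password(self, old: str, new: str) -> bool:
        # Reject reuse of the factory default and trivially short passwords.
        if old != self.password or new == FACTORY_PASSWORD or len(new) < 8:
            return False
        self.password = new
        self.password_changed = True
        return True
```

A device that ships this way cannot be swept into a Mirai-style botnet by a list of well-known factory credentials alone, because the factory credential stops granting access the moment the device is set up.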