There are some balances that are clearly off kilter. We can evaluate and improve balances, but the key to getting it right is to understand that we are balancing competing priorities.
Data Security Is About Humans, Not Technology
When many people think about data security, they often think of hackers in hoodies furiously typing on computers. But technology is just one part of data security. At its core, data security is about humans. People are the largest component of the data security risk equation, and people are one of the most challenging variables to control.
Technology is often thought of first when it comes to data security. From firewalls to encryption to access controls, there is an array of technologies that can help protect against intruders or improper access to data. At universities, the main place to study data security is within the computer science and engineering programs. Certifications for data security are often tech-heavy, full of lines of code, cables, and user interfaces to navigate.
Data security, however, is not really a war between technologies that attack and technologies that protect. Instead, data security is a struggle with people using technologies. Most data breaches involve human error. It is humans who fail to encrypt or who choose weak encryption. It is humans who fail to patch software. It is humans who put data on portable devices and lose them or fail to keep them in safe places. It is humans who are susceptible to being manipulated, deceived, and defrauded through targeted attacks.7
Data Security Is About Risk Management
An employee at the United Kingdom’s National Health Service (NHS) lost a USB memory device while delivering it from a clinic to the local administrative offices. The device contained the health records of approximately 6,360 people. Fortunately, the device was encrypted. Unfortunately, the employee had stuck a note on the side of the device with the password to decrypt it.
On paper, NHS was also doing the right things regarding security—it was encrypting USB devices. Encryption is a wonderful tool because if a device is lost or stolen, the data is unreadable. Encryption, however, doesn’t work like magic—it can still be thwarted if people select bad passwords or fail to protect their passwords.
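To make that point concrete, here is a minimal sketch of how portable-device encryption is commonly set up: the encryption key is derived from a user-chosen password with a key-derivation function. The NHS’s actual configuration is not described here, and the password and parameters below are purely illustrative. However strong the cipher, anyone who can read the password off a sticky note can derive the same key.

```python
# Minimal sketch, assuming a typical password-based setup (not the NHS's
# actual configuration): the encryption key is stretched from a password
# using a key-derivation function such as PBKDF2.
import hashlib
import secrets

def derive_key(password: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """Stretch a human-chosen password into a 256-bit key for the cipher."""
    return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)

salt = secrets.token_bytes(16)         # stored in the clear next to the ciphertext
key = derive_key("Winter2024!", salt)  # hypothetical password

# The cipher that uses `key` (e.g., AES-256) may be practically unbreakable,
# but anyone who reads the password off a note stuck to the device can rerun
# this derivation and decrypt everything.
print(key.hex())
```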
The NHS employee likely knew better than to paste his password to the device. Why, then, did such a ridiculous blunder happen? Even when people know better, they still do careless things. They recognize a suspicious link or attachment, yet they still click on it. They know they are not supposed to write down passwords on sticky notes and attach them to computers or devices, yet they do so anyway. Why are people so careless?
People are careless because good security is often cumbersome and inconvenient. One of the basic tendencies of human nature is that the more inconvenient something is, the less people will do it. The law, as well as security officials, often neglects to account for this reality.
This is why security policies and measures can look fine on paper but fail in practice. Suppose a law mandates encryption for personal data on portable devices. An organization follows the law. So far, so good. The organization has checked the box on a checklist of best security practices. The organization might even require a complex password for the device—check! In a training video shown to new employees, a sentence is uttered about not writing down passwords—check! On paper, it all looks quite good. All boxes are checked. And yet, it can fail, as we learned from the NHS case.
The organization or the security team typically doesn’t take the blame. The blame goes to the person who unwisely wrote down the password and stuck it to the device. Blaming the employee, however, is one reason why data security so often fails. The employee may have been foolish and careless, but he should have been considered a known variable. His behavior was foreseeable.
The problem is that there are Hobson’s choices with so much of data security. For example, with passwords: if you make the password easy, the employee will remember it, but then the password can be more readily cracked. Make the password longer and more complex, and the employee can’t remember it. The employee will struggle to figure out what to do. If the only advice the employee is given is “don’t write the password down,” this isn’t helpful. The employee needs to find a way to remember the password, but it’s too difficult to remember. The employee will inevitably write it down. Who wouldn’t? Employers often advise employees to use unique passwords for different accounts and devices, so the employee probably found it necessary to put the password near the device.8
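A rough back-of-the-envelope comparison shows why this tradeoff is not as hopeless as it looks. The character-set and wordlist sizes below are illustrative assumptions, not figures from the text: a short “complex” password that is hard to remember can carry less guessing resistance than a longer passphrase built from ordinary words, which is one commonly cited way to soften the dilemma.

```python
# Illustrative sketch of the memorability/strength tradeoff; the 95-symbol
# character set and 7,776-word list are assumptions made for the arithmetic.
import math

def charset_entropy(length: int, alphabet_size: int) -> float:
    """Bits of entropy for a randomly chosen password of `length` symbols."""
    return length * math.log2(alphabet_size)

def wordlist_entropy(words: int, list_size: int = 7776) -> float:
    """Bits of entropy for a passphrase of `words` random words from a list."""
    return words * math.log2(list_size)

print(f"8-char complex password: {charset_entropy(8, 95):.1f} bits")  # ~52.6
print(f"5-word passphrase:       {wordlist_entropy(5):.1f} bits")     # ~64.6
```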
How could this problem have been averted? Telling people not to write down passwords is unrealistic. People won’t remember them. They must write them down somewhere. There must be a better way to make it easy for the employee to remember the password for the device. If organizations were to help the employee do this rather than demand a more difficult, inconvenient, or impossible task, then there is a much better chance the employee won’t resort to a workaround.
Far too often, security advice is given in training to make the optics look good for the organization. Organizations can always claim that they told employees the right things. Training becomes a waste dump for intricate security advice that only the most assiduous people will follow. On paper, an organization can point to a training program that says all the right things. It looks good to show to regulators. “We told our employees not to write down passwords,” the company can explain to regulators after the breach. “Our foolish employee didn’t listen, so it’s not our fault.” Unfortunately, in this context, the focus on making everything look good on paper is terrible in practice.
Figure 4.1

People often think of data security as a set of clear choices as opposed to privacy, which is seen as a set of muddy policy issues. Data security, however, is actually quite muddy itself—it involves difficult policy decisions about risks and tradeoffs.
Managing human behavior is immensely challenging. People are hard to control. They need to be educated. They need to care. But people forget. They have lapses in judgment. They don’t always have enough incentive to learn what they are supposed to learn or do what they are supposed to do. One choice is to impose more controls on people—make it harder for