have created a tool to exploit the vulnerability, which was exposed in a leak online and used by hackers to carry out the WannaCry attack.55

Brad Smith, President of Microsoft, took the unusual step of castigating government officials for exploiting the vulnerability rather than reporting it:

Finally, this attack provides yet another example of why the stockpiling of vulnerabilities by governments is such a problem. This is an emerging pattern in 2017. We have seen vulnerabilities stored by the CIA show up on WikiLeaks, and now this vulnerability stolen from the NSA has affected customers around the world. Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage.56
Miseducators

Miseducators undermine security when their actions teach people the wrong things. They help hackers by training people to engage in the behaviors that hackers can readily exploit.
TRAINING PEOPLE TO FALL FOR HACKER TRICKS

Whenever there’s a big data breach caused by a person who clicked on a suspicious link in an email, security experts roll their eyes. “People are just fools,” they might mutter to themselves. It’s so easy to blame people for doing foolish things, but security would be improved if we started blaming others, such as the organizations that teach people to do foolish things. In security, fools aren’t born—they are made.
A key security tip is never to click on links in emails asking users to log in. Many companies, however, send emails asking people to click on a link to log in. When companies send emails that are identical to the kind of phishing emails that hackers send, people are taught that legitimate companies send emails like this. In effect, people are being trained to fall for hacker tricks.
After suffering a data breach, the firm Evernote alerted its 50 million users with an email notifying them that it had reset their passwords. The email from Evernote offered some good security wisdom: “Never click on ‘reset password’ requests in emails—instead go directly to the service.” Ironically, in the very same email, Evernote included a password-reset link. The link didn’t even go to Evernote’s website. Instead, it went to “links.evernote.mkt5371.com.” The sender’s email address was:

This email was indistinguishable from a phishing scam. Indeed, it practically screamed I am a phishing email!57
UNILATERAL AUTHENTICATION

It is commonplace for authentication to be unilateral. We must authenticate ourselves to organizations, but it’s a one-way street. They don’t authenticate themselves to us. Their failure to authenticate themselves to us contributes to a great deal of fraud. We have become accustomed to readily trusting company websites, phone calls, emails, texts, and other communications. Companies have trained us to trust them because it is cheaper and more convenient for them this way than if they had to authenticate themselves to us. But we really shouldn’t trust them without authentication.
For example, credit card companies often call or email people to inform them about potential fraud on their cards. At first blush, this seems good—people are being informed about fraud. The problem is that a fraudster could readily be making the call or sending the email. The fraudster could ask for people’s personal information, passwords, PINs, and other sensitive information by pretending to be the card company.
Organizations often expect us to just trust them whenever they call us or email us. People shouldn’t be asked to give their trust so readily without a way to verify that the calls or emails are indeed coming from the companies. Companies take steps to ensure that when consumers contact them, the consumers are who they say they are. But companies take no steps to verify that they themselves are who they say they are.
When we interact with organizations, authentication should be bilateral—companies should be developing means to authenticate themselves to us. We would then know to expect a company to authenticate itself properly, and this would teach us how to distinguish imposters from actual representatives of the organizations.
Instead, organizations constantly call and email people and expect people to trust them. Fraudsters exploit this trust. This is how organizations train and prime people to be easy prey for fraudsters.
Organizations shouldn’t be contacting people and asking for any personal information unless they can convincingly verify their identity so that people can distinguish them from imposters. Credit card companies that call or email should ask people to reach back out to them at the number on their cards or to go directly to their websites without clicking on links in the email. So should any company that emails people—no company should be encouraging people to click on email links.
The barrage of emails and calls that people receive from organizations asking them to click this or that or to provide personal data is teaching people how to be sitting ducks for fraudsters. The cumulative effect creates a huge public harm—it weakens security for everyone, it undermines efforts to teach people good security practices, and it all but ensures that fraudsters will find plenty of people who will fall for their schemes.
AGAINST DIGITAL TECHNOLOGY EXCEPTIONALISM

A common theme throughout our discussion of the actors who contribute to data breaches is “digital technology exceptionalism”—treating digital technology as different from other things.
The law is so enamored of or flummoxed by the Internet, algorithms, and artificial intelligence that it often treats them as completely exceptional. The world has never seen anything that reduces the barriers to surveillance and communication like the Internet.58 People are spied on, lied to, defrauded, manipulated, harassed, blackmailed, humiliated, and locked out. Yet the law is reluctant to hold organizations responsible.
Why are platforms not held more responsible for the products and services sold on them? Why is software treated so differently from other products? One reason is that digital things seem less tangible than physical ones. If a company makes a defective ladder that breaks, there’s a physicality to the product, the defect, and the injury. The digital world feels intangible, less real than the world of flesh and blood and bricks and steel. But code can kill. It can harm people in similar ways to physical items.59
These days, more things are dependent upon software. At Black Hat and Def Con, two popular tech security events, researchers demonstrated how they could hack into pacemakers and insulin pumps. The researchers stated that they could reprogram a pacemaker to issue a shock or deny a shock.60 Andrea Matwyshyn has written about the inherent vulnerabilities to our