Maximizing Data Minimization
The idea that companies should only be able to collect and retain data that
is adequate, relevant, and necessary is a bulwark against data abuse and the
essence of privacy because it either prevents data from being created in the
first place or compels its destruction. It also demonstrates how privacy and
security must work together to achieve their separate goals.
Security can focus on how to retain data and how to protect its integrity.
It can ensure that only authorized people can see data and that information
doesn’t get improperly accessed or leaked. Privacy focuses on difficult
substantive questions such as how long the data is retained, how it can be
used, and specifically who is authorized to see and change it. Privacy also determines when data should be destroyed, which is often based on regulatory requirements. Security plays a role in ensuring that the data is properly destroyed.
Lawmakers should embrace data minimization with the same zeal they
embrace data security rules and for the same reasons. Although privacy and
data security have slightly different functions, they work in tandem and
roughly overlap to achieve the same goals.
Data Mapping
Privacy requirements such as data mapping provide awareness about
potential security vulnerabilities. Data mapping shows what data is being
collected and maintained, the purposes for having this data, the
whereabouts of this data, and other key information. Without good data
mapping, personal data is often forgotten. When this occurs, data can fall outside the security bubble or be improperly accessed without the access being readily detected.
Data mapping is useful for both privacy and security. Keeping track of
data ensures that it remains within the security bubble and has the proper
security controls. There should be data stewards with accountability for
each repository of data. Security can set controls to make sure that those
who should have access do and that those who shouldn’t have access don’t,
but it is often in the realm of privacy where the determination of who
should have access is made.
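To make the idea concrete, here is a minimal sketch of what a single data-mapping entry might record. The field names, roles, and dates are hypothetical and are chosen only to illustrate the kinds of information described above: what data is held, why, where it lives, who is accountable for it, who may access it, and when it should be destroyed.

```python
# Illustrative sketch only; field names and values are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class DataInventoryEntry:
    category: str                 # what personal data is held (e.g., customer email addresses)
    purpose: str                  # why the organization holds it
    location: str                 # where it lives (system, database, vendor)
    steward: str                  # the person accountable for this repository
    authorized_roles: list[str]   # who may access it (a privacy determination)
    retention_until: date         # when it should be destroyed (a privacy determination)

entry = DataInventoryEntry(
    category="customer email addresses",
    purpose="order confirmations and support",
    location="crm-prod database",
    steward="privacy-steward@example.com",
    authorized_roles=["support", "billing"],
    retention_until=date(2026, 1, 1),
)
```

An inventory of such entries gives privacy teams a basis for deciding access and retention, and gives security teams a complete picture of what must stay inside the security bubble.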
Recently, privacy laws have been the main driver behind organizations
engaging in data mapping. Laws such as the California Consumer Privacy
Act (CCPA) require that businesses provide people with the specific
personal data collected about them.87 Even more helpful than individuals
knowing the specific data business have about them is the byproduct of
businesses being compelled to respond to individual requests to know. To
be able to respond, businesses are forced to have a better understanding and
inventory of the data they possess. The CCPA doesn’t directly require data
mapping, but the practice becomes necessary to carry out the CPPA’s
obligation to respond to individual demands to know about their data.
More privacy laws should require data mapping, ideally directly rather
than indirectly like the CCPA. Laws should require that organizations
ensure that all personal data is accounted for and have a person assigned to
be accountable for it.
In addition to improving data minimization and data mapping rules, lawmakers could improve data security by fortifying existing privacy-preserving rules around concepts such as deidentification and rules against manipulation. Understanding the security benefits of good privacy practices could generate broader legislative support for
privacy regulation. Companies would also benefit from learning not to
undermine their efforts to promote security by having poor privacy
practices.
8
Designing Security for Humans, the Weakest Link
On the afternoon of July 15, 2020, some of the most famous and powerful
people in the world appeared to suddenly become quite generous. On
Twitter, Joe Biden, Barack Obama, Kanye West, Bill Gates, and Elon Musk
posted messages such as the one below:
I am giving back to the community. All Bitcoin sent to the address below will be sent back
doubled! If you send $1000, I will send back $2000. Only doing this for 30 minutes.1
Bitcoin is a cryptocurrency that facilitates hard-to-trace transactions. This
was a scam, but it worked quite well. The New York Times estimated that
the fraudsters raked in more than $180,000.2
How could this have happened? How could one of the most important
communication platforms in the world have been so publicly compromised?
Investigations of the incident point to a series of failures culminating in a
telephone call between a hacker pretending to be from Twitter’s IT
department and an employee who helped the hacker by providing access to
the company’s customer service portal. Once the hacker tricked the
employee into helping bypass Twitter’s two-factor authentication
protections, it was off to the races. The hackers accessed 130 accounts in a
matter of hours.3
Humans make all kinds of terrible data security decisions. They click on
dubious links. They lose their laptops or leave them unattended. They
publish working login credentials to their clients’ systems on public
repositories like GitHub. They ignore warnings in browsers that important
security certificates have expired. They re-use passwords. The list goes on
and on.
Most data breaches are facilitated by preventable low-tech blunders.
Many studies show that human error plays a role in an overwhelming
number of breaches.4 Studies and news articles often loosely say that many
data breaches are “caused” by human error, but the more accurate description is that many data breaches “involve” human error. Data breaches often don’t result from a single action; they are caused by a combination of factors. Human error often plays a key role.5 Hackers may
cause a breach, but they break in because humans fall for phishing schemes
or fail to patch software. Even when human error isn’t a major factor in a
hack, humans might fail to encrypt data, thus enabling the hackers to access
data they otherwise wouldn’t have been able to decipher. Human error often
involves the failure to prevent breaches that could readily have been
stopped. For example, the previously discussed Target breach could have
been prevented had humans paid attention to the blinking red lights.
Statistics on the causes of breaches are all over the place, but there is one
thing that can be said with a high degree of certainty: In most data
breaches, human error has played a significant role in enabling or failing to
prevent the breach. Humans are the largest component of the data security