protecting their customers’ privacy when, in fact, they have done quite little.22
Based on his interviews with technologists, Waldman observes that many technologists believe privacy merely involves providing users with notice about the company’s privacy practices. Others think privacy is synonymous with encryption, which in this context is driven more by a desire to secure company data than to safeguard against consumer privacy risks. As Waldman also notes, “Few engineers remembered meeting with lawyers or privacy professionals one-on-one to discuss integrating privacy considerations into their work. Many found it difficult to design with user needs in mind; therefore, engineer-only design teams not only minimized the importance of privacy, but also missed how their designs impacted consumers.”23 This kind of organizational schism has led to a mentality around privacy and data security that ends up limiting the effectiveness of both domains.
One of the problems with separating data security and privacy is that people working in these areas cannot learn from each other. This means they often repeat the same mistakes or miss out on different ways of thinking about problems. People can get a little myopic, thinking that their little patch of responsibilities is the cosmos. This kind of narrow thinking also leads to a breakdown in cooperation where privacy interventions could help improve data security and vice versa.
Waldman’s interviews with technologists reveal that the companies they work for often do very little to prioritize privacy by design. As Waldman observes, “Privacy professionals or other personnel trained in privacy rarely met with engineers and programmers, even during weeks of intense design work.” Even at companies that had “privacy teams that were supposed to ‘insinuate’ themselves into design, high turnover, a laissez-faire attitude, and corporate silos kept privacy mostly orthogonal to design.”24
Further, Waldman’s work reveals that privacy is often deprioritized while other values take precedence. The mandate often comes from the top, where executives want engineers to prioritize “speed, agility, [and] functionality.”25 Waldman noted that “[i]nterviewees used words and phrases like ‘hands off,’ ‘absent,’ ‘uninvolved,’ and ‘not really a factor,’ to describe their employers’ approach to privacy.” Privacy is akin to security’s distant cousin, whom everyone forgets to invite to the party. Even when privacy is at the party, it is relegated to the small children’s table off to the side.
Beyond a lack of privacy protection, the schism between privacy and data security has resulted in organizations viewing data security mainly as an IT issue. Certainly, many components of good data security involve IT, such as encryption, firewalls, and access controls. But many more security issues involve a human dimension. Many security decisions involve human behavior, such as how to deal with cognitive limitations, carelessness, cheating, denial, ignorance, gullibility, and misconduct: security’s seven deadly sins. Security decisions also involve policy, such as managing the tradeoff between security on the one side, and ease, convenience, and ready accessibility on the other.
We have heard people call the security side “hard” or “left-brained” and the privacy side “soft” or “right-brained.” IT technologists are often not well trained in addressing complex issues involving people and values; they are trained mostly in “hard” technological problems and solutions. They know how computer systems and code operate, but they often aren’t sufficiently trained in how to respond to human behavior or how to think through challenging policy choices. Privacy professionals, in contrast, receive a heavier dose of training in so-called soft issues such as human behavior, values, law, and policy. We aren’t fond of the terms “hard” and “soft” or “left-brained” and “right-brained,” but we agree that there is certainly a distinction between the kinds of training IT and privacy professionals receive. The key difference is that privacy draws more from the humanities while data security is more steeped in engineering. For effective data security, however, both types of thinking are essential.
Privacy is (or at least should be) about much more than just effectuating people’s personal preferences about who should have their data. Privacy is about trust, power, dignity, and the collective autonomy to set the preconditions of human flourishing.26 In a broader sense, privacy is about all the rules that govern our personal information.27 Data security policy similarly cannot escape a web of value-laden decisions, because it, too, requires tradeoffs guided by ethics and normative considerations.
A Schism in the Law
The schism between security and privacy also exists in the law, especially in U.S. law. Broadly speaking, the law began with a more unified view of privacy and security, but after the ChoicePoint breach, data security law spun off into a more separate domain.
In the early laws, from the 1970s through 2000, data security evolved alongside and within privacy laws and frameworks. Data security is one of the original Fair Information Practice Principles (FIPPs), the principles proposed to address concerns about the rise of computer databases of personal information.28 The FIPPs arose in a 1973 report by the U.S. Department of Health, Education, and Welfare (HEW) called Records, Computers and the Rights of Citizens.29 The HEW report was prompted by concerns about the computerization of records, and the committee that drafted the report was charged with recommending legal and policy responses. The primary recommendation of the report was to enact a code of fair information practices to regulate all repositories of personal data. Data security was one of the main recommendations in the report: “Any organization creating, maintaining, using, or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take reasonable precautions to prevent misuse of the data.”30
The FIPPs have become the backbone of privacy laws around the world. In 1980, the OECD Privacy Guidelines included the “Security Safeguards Principle,” which stated that “Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data.”31 The OECD Privacy Guidelines have formed the blueprint for the EU’s privacy laws, starting with various member nations’ laws, then the EU Data Protection Directive, and today’s General Data Protection Regulation (GDPR). Laws in the United States and around the world include many of the FIPPs. There are now well over 100 countries with data privacy laws, and most of them