health and physical safety in “The Internet of Bodies.”61 Digital pills we take will communicate through sensors to our smartphones. An artificial pancreas hard-wired into the body will use software to calibrate insulin levels. The law is unprepared to deal with the individual and societal security and privacy risks of these devices. Not only have lawmakers failed to sort out the policy and legal problems of the Internet of Things generally, but they must also appropriately address what Matwyshyn calls the “‘legacy code’ problem of software liability more generally.”62 We must stop treating digital technologies as completely distinct from other products and services.
Holding the technology industry responsible can make a big difference. Given how many devices are being hooked up to the Internet and the profound dangers that poor security can create, it’s time we demand more from the technology industry. The software, apps, devices, and platforms the industry makes are no longer just a source of games and entertainment. The stakes are life and death.
When it comes to digital technologies, the law often creates a buyer-beware world. To enter the world of digital technologies, we must risk life and limb.
Imagine a company that opens a lion theme park where people can play with lions. Most of the lions are gentle, but occasionally there’s a ferocious one that eats people alive. There’s a big sign outside the place that says: “SAFETY IS OUR NUMBER ONE PRIORITY.” And below that, in small print: “Enter at your own risk. We assume no liability for any injuries.”
The law’s response would normally be strict liability. The law would prevent the theme park from waiving liability and would not allow it to expose people to this kind of risk. But if the theme park were a technology company, the law would suddenly change its tune. The park now would be just a “meeting place” where lions “interact” with people—an exciting “social hub” for human–feline friendships. People “socialize” at their own risk. The park would just be “bringing people and lions together.”
“Don’t stifle innovation!” defenders will cry. “If you make the company responsible, then the lions will have to go. People love the lions. Only a few people get eaten, but most people experience great joy playing with the lions. It’s their choice to accept the risk.”
Outside the digital world, the law would never tolerate forcing people to take on so much risk without the theme park being held responsible. But the law often treats digital technology completely differently. It shouldn’t. Incentives work the same in the physical world as they do in the digital world. Because the law deploys those incentives in only one of these worlds, it is much better at spurring the market to produce safe physical products than at getting the market to produce safe digital ones. Legal regimes like negligence law are remarkably flexible and adaptable, even for complex and opaque technologies like artificial intelligence.63 But judges, lawyers, and lawmakers must continually work hard to understand the foreseeable risks of these technologies and where humans are at fault in designing and using them.
We are not arguing for a full strict liability approach or for the law to make technology companies liable for all security risks.64 There will always be security risks with technology, and some risks are worth taking. The law should address unreasonable security risks: those that are not justified by their benefits or that can readily be reduced through common industry standards and best practices.
Far too often, technologies on the market have substandard security because there is hardly any incentive to invest in better security. Because of this lack of incentives, better security is often not engineered into products even when it would have been easy and inexpensive to do so. Companies that make these products should be held accountable. Products that fail to provide a reasonable amount of security should be penalized and restricted.
IT TAKES A VILLAGE
Security professionals will tell you that there is no silver bullet for computer security, short of taking every hard drive in the world and launching it into space.65 It often takes a village to create a data breach. Many actors in the system lurk in the shadows, contributing to the problem yet escaping much notice and never being held responsible. These actors create risks, weaken the security of others, engage in risky activities without internalizing the harm, and build vulnerable software and devices. The law’s failure to hold these actors responsible and to create the right set of incentives for them with carrots and sticks undermines nearly all the good things that various data security laws try to do.
A lot of people are pointing fingers. For example, in her thoughtful analysis of several major breaches, Josephine Wolff contends that it is unproductive to just blame the breached organization, even when that organization is partly at fault.66 Wolff discusses the breach of credit card data at TJX Companies, which owns the T.J. Maxx and Marshalls retail chains. Fraudsters were able to intercept and decipher wireless transmissions of data outside of store locations. She concludes: “A closer look at the full timeline of the TJX breach reveals that the episode in fact involved several different technical vulnerabilities and companies, but by singling out one encryption protocol and one organization as fully responsible for the incident the payment card industry was able to effectively shield itself from bearing any of the blame.”67
Finger-pointing and trying to escape blame are common and predictable behaviors.68 This is exactly where the law must become involved. A key function of the law is to determine who should be held responsible. If responsibility is properly allocated, the responsible parties start to change. They start to internalize their costs. They start to reduce the risks.
It is tempting to see bad outcomes as the product of malicious rogue actors. But ascribing bad motives in this context can be counterproductive. Most actors act according to what the system incentivizes. The best way to make them act differently is to change those incentives with carrots and sticks.
Data breach risks are created by a multitude of actors throughout the entire data ecosystem. The breached organizations certainly are responsible for some of the risk, but they are only one piece of the puzzle. To be effective, the law must aim to reduce risk throughout the whole system.
There are several ways data security law might better allocate risk to the right actors. Lawmakers can create rules holding specific and broad groups of actors accountable for data security risks. They can create and enforce data security rules for designers, distributors, amplifiers, facilitators, miseducators, and exploiters. Lawmakers can develop standards for