It was hard to believe, but the student insisted it was true. He had discovered that compact discs from a major record company, Sony BMG, were installing dangerous software on people’s computers, without notice. The graduate student, Alex Halderman (now a professor at the University of Michigan), was a wizard in the lab. As experienced computer security researchers, Alex and I knew what we should do: First, go back to the lab and triple-check everything. Second, warn the public.
But by this point, in 2005, the real second step was to call a lawyer. Security research was increasingly becoming a legal minefield, and we wanted to make sure we wouldn’t run afoul of the Digital Millennium Copyright Act. We weren’t afraid that our research results were wrong. What scared us was having to admit in public that we had done the research at all.
Meanwhile, hundreds of thousands of people were inserting tainted music CDs into their computers and receiving spyware. In fact, the CDs went beyond installing unauthorized software on the user’s computer. They also installed a “rootkit”: they modified the Windows operating system to create an invisible area that ordinary measures could not detect and that, in many cases, even virus checkers could not find. The unwanted CD software installed itself in this invisible area, but the rootkit also provided a safe harbor for any other virus that wanted to exploit it. Needless to say, this was a big security problem for users. Our professional code told us that we had to warn them immediately. But our experience with the law told us to wait.
The law that we feared, the DMCA, was passed in 1998 but has been back in the news lately because it prohibits unlocking cellphones and interferes with access by people with disabilities. But its impact on research has been just as dramatic. Security researchers have long studied consumer technologies, to understand how they work, how they can fail, and how users can protect themselves from malfunctions and security flaws. This research benefits the public by making complex technologies more transparent. At the same time, it teaches the technology community how to design better, safer products in the future. These benefits depend on researchers being free to dissect products and talk about what they find.
We were worried about the part of the DMCA called 17 U.S.C. § 1201(a)(1), which says that “No person shall circumvent a technological measure that effectively controls access to a work protected under [copyright law].” We had to disable the rootkit to detect what it was hiding, and we had to partially disable the software to figure out what it was doing. An angry record company might call either of those steps an act of circumvention, landing us in court. Instead of talking to the public, we talked to our lawyer.
This wasn’t the first time the DMCA had interfered with my security research. Back in 2001, my colleagues and I had had to withdraw a peer-reviewed paper about CD copy protection, because the Recording Industry Association of America and others were threatening legal action, claiming that our paper was a “circumvention technology” in violation of another section of the DMCA. Later we sued for the right to publish these results—and we did publish, four months later. We had won, but we had also learned firsthand about the uncertainty and chaos that legal threats can cause. I was impressed that some of my colleagues had been willing to risk their jobs for our work, but none of us wanted to relive the experience.
Alex had dealt with his own previous DMCA threat, although this one was more comical than frightening. After he revealed that a CD copy protection product from a company called SunnComm could be defeated by holding down the computer’s Shift key while inserting the disc, the company had threatened him with DMCA action. Given the colorful history of the company—it had started corporate life as a booking agency for Elvis impersonators—and the company’s subsequent backtracking from the threat, we weren’t too worried about being sued. Nevertheless, the episode showed that the DMCA had become a go-to strategy for companies facing embarrassing revelations about their products.
What was Congress thinking when it passed this part of the DMCA? The act was meant to update copyright law for the 21st century, to shore up the shaky technologies that tried to stop people from copying music and movies. But the resulting law was too broad, ensnaring legitimate research activities.
The research community saw this problem coming and repeatedly asked Congress to amend the bill that would become the DMCA, to create an effective safe harbor for research. There was a letter to Congress from 50 security researchers (including me), another from the heads of major scientific societies, and a third from the leading professional society for computer scientists. But with so much at stake in the act for so many major interests, our voice wasn’t heard. As they say in Washington, we didn’t have a seat at the table.
Congress did give us a research exemption, but it was so narrowly defined as to be all but useless. (So perhaps we did have a seat—at the kids’ table.) I’ll spare you the details, but basically, there is a 116-word section of the Act titled “Permissible Acts of Encryption Research,” and it appears to have been written without consulting any researchers. There may be someone, somewhere, who has benefited from this exemption, but it fails to protect almost all of the relevant research. It didn’t protect Alex and me, because we were investigating spyware that didn’t rely on the mathematical operations involved in encryption.
We sat on our Sony BMG CD spyware results for almost a full month. In the meantime, another researcher, Mark Russinovich, went public with a detailed technical report on one of the two CD spyware systems. When nobody sued him, we decided to go public.
In the weeks that followed, things happened quickly. Sony BMG recognized that it had overstepped and distributed an uninstaller for the spyware; we discovered that the uninstaller opened further security holes in users’ computers; the record company recalled the affected CDs; and we determined that the CDs had been reporting users’ listening habits back to the record company. Class action suits were filed. The Federal Trade Commission investigated, and the company eventually settled the FTC charges, agreeing to reimburse affected consumers up to $150 for damage to their computers.
We had managed to publish our results, but we were troubled by the incident. Our decision to withhold the news of the rootkit from the public seemed necessary, even in hindsight, but it was contrary to our mission as researchers. It was the last research Alex and I did on copy-protected CDs. Although I have a higher tolerance for lawyers than many of our research colleagues do, I still prefer the laboratory and the classroom to the courtroom. My peers seem to feel similarly—the volume of peer-reviewed research on copy protection technologies fell off about this time and has not recovered.
The good news is that this problem is easily fixed. Congress could amend the DMCA to create a robust safe harbor for legitimate research—not limited to encryption, not tied down with detailed requirements and limitations. There is a growing groundswell to address the DMCA’s ban on unlocking cellphones and its roadblocks to access for the disabled. Bills have been introduced in Congress to legalize cellphone unlocking. While we’re tinkering with the statute, let’s create a safe harbor for the researchers who can be our early warning system against unpleasant surprises in the next generation of technologies.
These days almost everything we do in life is mediated by technology. Too often the systems we rely on are black boxes that we aren’t allowed to adjust, repair, or even to understand. A new generation of students wants to open them up, see how they work, and improve them. These students are the key to our future productivity—not to mention the security of our devices today. What we need is for the law to get out of their way.
This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture.