When I was in college, the main campus library had several computers set up on the first floor for public use, and invariably, whenever I used one, a previous user had not logged out of her Gmail account. So when I tried to load my account, I would instead find myself staring at the entire contents of someone else’s inbox. Of course, I would then log that person out and sign myself in—but those brief moments when I had complete access to another person’s email were terrifying nonetheless. How could people be so careless with something as valuable as their email account? And then, inevitably, after my own session, I would make it halfway across campus and suddenly begin worrying that I might have forgotten to log myself out—the same way you might worry you forgot to turn off the stove, or lock the door before leaving your house—and so I would trek back up to the library and check.
I still fear public computers, a terror that was only reinforced by the July 10 advisory that the Secret Service and the National Cybersecurity and Communications Integration Center issued about keyloggers on hotel business center machines. The advisory, first reported by security researcher Brian Krebs, was directed at the hospitality industry and warned of cases in which people who had registered at hotels with stolen credit cards downloaded keylogging software onto the computers in the hotels’ business centers.
The software would then capture every keystroke entered on those public machines—including the usernames and passwords entered by unsuspecting hotel guests, as well as the content of any emails or documents they wrote on those machines. The log of these keystrokes would be emailed to the person who had installed the malicious program, providing the hacker with a wealth of data on the business center users. “The suspects were able to obtain large amounts of information including other guests’ personally identifiable information (PII), log in credentials to bank, retirement and personal webmail accounts, as well as other sensitive data flowing through the business center’s computers,” according to the advisory.
This, of course, is a far more serious—and nefarious—threat than college students who forget to log out of their Gmail accounts and thereby give strangers access to their email, but both risks stem from a common problem in computer security: our tendency to treat public computers like personal ones and, more broadly, to ignore the physical dimension of cybersecurity.
Krebs points out that while there are ways that hotels can try to make it more difficult for people to download keyloggers on their computers—by restricting users’ ability to install programs, for instance—there’s a limited amount that can be done to improve the security of public computers, especially if they’re to provide any valuable services to users. Or, as Krebs puts it, “if a skilled attacker has physical access to a system, it’s more or less game over for the security of that computer.”
Basic safeguards are still worth taking, if only to restrict the set of potential perpetrators to “skilled attackers.” The advisory noted:
The attacks were not sophisticated, requiring little technical skill, and did not involve the exploit of vulnerabilities in browsers, operating systems or other software. The malicious actors were able to utilize a low-cost, high impact strategy to access a physical system, stealing sensitive data from hotels and subsequently their guest’s [sic] information.
It doesn’t take much skill to find keylogging software online and install it on a public machine. You don’t need to know how computers work, you don’t need to be an expert coder, you just need to be dishonest—and have access to a computer that other people use. This is data theft at its easiest—and perhaps also at its easiest to overlook.
In cybersecurity research, we think a lot about the variety of threats that can flow over networks and the silent, nonphysical ways that computers can be accessed and penetrated—via email, Web pages, and other means. These sorts of crimes present a whole host of new security problems that are worth studying and addressing, because the principles and assumptions of physical security no longer apply to them. The very notion of “access,” in fact, changes radically in this context—and the language we use to talk about cybersecurity breaches, in which attackers successfully “penetrate” machines, or get “inside” computers, reinforces how thoroughly physical ideas have been co-opted and given virtual meanings in this space.
But sometimes we risk forgetting that the lessons and language of physical security still matter and still apply. Yes, you can steal information from a computer halfway across the world—but it’s often much easier, especially for criminals with limited technical expertise, to steal from a computer you can walk right up to—a computer in a hotel’s business center or college library. Even privately owned computers that are left unlocked present a prime target for the technically unskilled criminal, and while people routinely use lock screens on their cellphones, they often don’t take the same degree of precaution with their laptops.
The good news about the physical security elements of cybersecurity threats is that, just as they are relatively easy for nontechnical people to exploit, they are also fairly straightforward for other nontechnical people to defend against. Essentially, you want to make it as difficult as possible for anyone who is not you to ever use your private computer, and you should only use public ones under the assumption that anything you do on them may be captured or accessible to others. Just as you might take basic hygiene steps to avoid germs and bacteria in public bathrooms (or on public keyboards), some simple cyber hygiene measures can help you guard against the digital diseases carried by the outside world. This means always—always, always—locking your computer whenever you walk away from it, not letting other people use it, and not checking your primary email account or bank account—or doing anything else potentially sensitive—in a hotel business center or on any other public computer.
This certainly won’t protect against all cybersecurity threats. It won’t even protect against all of the problems posed by hotel networks, which can be used to install malware on personal computers—or even on public ones. My sophomore year, those same computers in the main campus library that I occasionally (and foolishly) used to check my email were used to send anonymous death threats via email. But at the very least, these sorts of measures will help weed some of the less technically talented from the field of would-be cybercriminals and allow us to continue studying and learning about the novel nature of these digital threats without losing sight of the ways in which they are not entirely new. Cybersecurity and physical security are closely related—increasingly so, as more physical objects are connected to online infrastructure in various ways—and even as computer networks pose some new security challenges, they can still benefit from applying some of the older lessons of physical security.
This article is part of Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.