In 1969, the KGB pulled off one of the cleverest deceptions of the Cold War, slipping forged documents into a leak of otherwise authentic U.S. war plans in an effort to pit America against its NATO allies. More than 50 years later, I was invited to run a training session on disinformation for a U.S. intelligence agency. In my back-and-forth with the official in charge, I proposed a game of spot-the-fake using the 1969 leak. The exercise would test whether the intelligence officers could recognize one of the best forgeries in the history of spycraft. But the official shot down my idea. The leaked material may technically still be classified, he explained, so we weren’t allowed to use it. In fact, although the documents had been sitting in public view since before most of us were born, the officers in the class weren’t even allowed to look at them.
For this I had the Department of Defense to thank. On June 7, 2013, days after the Edward Snowden story broke, the Pentagon issued an immediate security guidance: Employees or contractors “who inadvertently discover potentially classified information in the public domain shall report its existence immediately to their Security Manager,” read the new rule. This reporting requirement was onerous enough on its own to scare federal employees. To make things worse, those who didn’t just stumble across classified documents but looked for them deliberately would be punished: Contractors or employees “who seek out classified information in the public domain, acknowledge its accuracy or existence, or proliferate the information in any way will be subject to sanctions,” stated the notice. (The directive was grounded in a similar 2010 rule issued by the Office of Management and Budget.) Given that the Snowden revelations were being published by media organizations and blasted around the internet, the policy effectively turned mundane activities like watching the news, browsing social media, searching Google, and even reading books into a risky proposition for any federal employee or contractor.
Ten years later, the illogic of this policy has only grown clearer. In the pre-internet age, leaks of classified documents were rare. Today, thanks to the internet, they’re regular events. In the past dozen years, the United States has seen five mega-leaks, each with hundreds, thousands, or even hundreds of thousands of documents spilled into the public domain. The names of the leaks are infamous in national-security circles: Cablegate in 2010, enabled by Chelsea Manning; the Snowden disclosures of 2013; the Shadow Brokers episode of 2016; the release of the Vault 7 files of 2017, stolen by Joshua Schulte; and, most recently, the Discord leaks, courtesy of Jack Teixeira.
These leaks can cause major geopolitical headaches. They can even get people killed. But, like the KGB leak of 1969, they also provide rich troves of material that advance expert understanding of how tradecraft is conducted—by both the U.S. and its adversaries. Snowden exposed how intelligence agencies adjusted their methods to the digital era, how signals-intelligence development evolved, and much more. The Shadow Brokers and Vault 7 exposed how implant platforms are designed and how the NSA does digital counterintelligence. The Discord leaks revealed invaluable knowledge on a range of geopolitical crises. It is simply impossible to understand the story of technical spycraft and computer-network penetrations in the 21st century—a discipline that first emerged in secret—without studying these unauthorized disclosures.
Yet the U.S. government has decreed this literature taboo for precisely the people who would most benefit from viewing it. Last month, on the heels of the Discord leaks, the deputy secretary of Defense reaffirmed the 10-year-old rule, warning that “failure to appropriately safeguard classified information”—even information that’s already public—“is a reportable security incident.”
The authors of this policy meant well, and some elements of the rule make sense. If a government official denies a certain element of a leak, for example, they are implicitly affirming the rest of it. More generally, the government wants to avoid the precedent that a bulk leaker could effectively become a rogue declassification authority. But trying to stop members of our own national-security establishment from even looking at leaked documents that are already being viewed by the public—not to mention by rival nations—tips into the absurd.
This became apparent in late 2016, in the wake of the Shadow Brokers disclosures. The hackers behind that leak, who still have not been identified, released not just documents that could be read but computer code that could be used. Suddenly, hostile actors around the world had access to NSA hacking tools that could be deployed against American commercial and government targets. Network defenders for the Department of Defense faced a dilemma. They needed to scan for incoming hacks—but they technically were not allowed to look at the hacking tools that were already being used by some of the most determined adversaries of the United States, including Russian military intelligence and North Korean cyberoperators. To do their job, they would have to violate the department’s official policy.
The issue of defending against forbidden code wasn’t limited to government employees. It extended to U.S. security companies that employ contractors with security clearances. These contractors are bound by the Department of Defense’s rule even when they’re not performing work for the government. I once asked a U.S. cybersecurity executive how his company handled the banned-documents problem in the context of securing the networks of its own clients. His answer: They would assign U.S. leaks to British analysts and leaked U.K. documents to American analysts.
As a professor of strategic studies and cybersecurity, I’m particularly concerned about the rule’s effect on students. Although the Pentagon guidance does not say so, most of my students at Johns Hopkins assume that reading leaked files will reduce their chances of getting a security clearance down the line. It’s what they pick up at happy hours, at receptions, and on social media from peers and alumni who work in the national-security establishment in and around Washington, D.C. The risk-averse culture in this town treats the leaks as forbidden fruit that should not be tasted. Yes, some of my students genuinely believe that reading the leaks could harm their career prospects. Some therefore avert their eyes not just from primary-source documents but also from social-media feeds that might carry forbidden screenshots. Some go so far as to limit their news consumption and express concern about reading assigned books that use technically still-classified primary sources.
Even fellow scholars of intelligence history and cybersecurity sometimes avoid reading primary-source documents, because they see them as banned knowledge. They’re missing out. I have found that studying leaked files helps me better understand intelligence reports. Just this week, the Five Eyes alliance released an advisory attributing a sophisticated, stealthy implant known as “Snake” to Russia’s Federal Security Service. I trust that report more because I learned from the Snowden leaks how the NSA and its British counterpart built even stealthier implants and refined the means to detect such cyberespionage tools.
The Pentagon’s policy is even more unfortunate as applied to the Discord leaks, because the pool of people who could learn from them is much wider than with past document dumps. Teixeira disclosed finished intelligence reports, not raw slide decks or technical manuals or hacking tools. These reports, which include military assessments about the war in Ukraine and accounts of behind-the-scenes diplomatic maneuvering, are of interest to a broad range of expert readers, not just technical nerds like me. Some of my colleagues, for example, closely follow China and Taiwan, or North Korea’s missile ambitions, or Iran’s covert actions, or the war in Ukraine. They may still hold an active clearance but have no strict “need to know” and hence no access to current intelligence reporting. In the early days after the leaks, when documents were beginning to get passed around, I noticed one intelligence report about Vladimir Putin’s health. I shared a screenshot with a fellow professor at my school who closely follows the Ukraine war. I expected him to be excited to read it. Instead, his response was, “I’m not supposed to be looking at these things.” I quickly deleted the screenshot from our chat. Reporters face a version of this problem too: When they approach experts for insight or a quote, those experts often decline out of fear that they’ll get in trouble for looking at a document that the press already has its hands on.
The regular spillage of secrets brings a larger paradox into sharp relief: Even democratic governments have secrets that they must work hard to protect. Yet, once those secrets are out, open societies must work just as hard to understand and learn from facts that are no longer secret, whether they remain technically classified or not.
Studying the leaks is in the U.S.’s national interest. Attempting to prevent an educated conversation about details and capabilities that are already public isn’t just quixotic—it’s also a wasted opportunity. Once the horse has bolted, you might as well ride it. A secret that has been publicized is no longer a secret. The task for government therefore must shift from protecting the information to making sure that the right lessons are drawn from it. The most pressing task of the U.S. intelligence establishment in this still-young century has been, and will continue to be, to expose and attribute the espionage, subversion, and sabotage of authoritarian spies and their global proxies. Mega-leaks can frustrate those efforts—but they also give the world a glimpse at an impressive set of tools and capabilities. One of the biggest revelations in the era of the mega-leak, ironically, is that the NSA and CIA are generally quite creative and effective at what they do. That’s one fact the government should not want to cover up.