Deconstructing a Secrecy Blunder: A Study in Dysfunction

National Security Archive Electronic Briefing Book No. 342, by John Prados, April 21, 2011.

… The United States Government’s system for releasing classified material into the public domain continues to be riddled with error, ignorance, arbitrary action, and simple inaction, and is often impaired by parochial agency interests that have nothing to do with protecting national security secrets, an analysis of a recently declassified document and its associated materials shows. The Obama administration’s efforts to cope with a backlog of more than 400 million pages of documents long queued up for declassification review (see the Archive’s new 2011 FOIA Audit) will remain hamstrung as long as the underlying system rewards inaction amid mounting and significant costs, both to the American taxpayer and to genuine national security interests …

… There are a number of reasons why the system is troubled. First, authorities frequently confuse source with substance. A national security secret consists of information, not documents per se. Confronted with a request to open a particular document, officials too often approach the task de novo, ignoring previous decisions on the same information, whether it appeared in other source materials or in that very document. This results in a vicious circle in which the same information has to be reviewed again and again, absorbing effort that should be devoted to opening fresh documents. The weakness is compounded by inadequate knowledge across agencies, and even within the same agency, as to what has already been released.

Second, government agencies follow a concept of “equity,” the notion that information belongs to the agency that first put it into a document, and that this agency’s views should therefore be decisive in determining whether the information can safely be declassified. That agency then claims primary responsibility for any declassification action concerning the information. In the case of NSAM-29, the CIA asserted an equity in a National Security Council directive because the paper contained instructions to that agency. While it is understandable that the CIA wants to protect what it considers sensitive information relating to secret operations, here the CIA was asserting ownership of information in NSC and Joint Chiefs of Staff documents, information consisting of higher-level instructions it had not originated, merely because the CIA was to be the designated action agency. Equity is inconsistent with an efficient declassification system. Parceling out (or “referring”) documents to agencies that assert equities subjects the records to laborious, time-consuming transit and coordination procedures, complicating any effort to expedite declassification. Technological innovation makes the concept of equity even more pernicious: when the time comes to declassify computer-age databases, to which many agencies contribute simultaneously, enforcement of equity could induce serious paralysis within the declassification system.

Third, the notion that information can or should be reclassified after it is released is a significant problem. Not only can this re-ignite the vicious circle of multiple reviews of the same document, it also has a deterrent effect: no bureaucrat wants to be responsible for releasing information that may later be deemed necessary to reclassify. This leads to an ultra-cautious approach in which only the most innocuous information makes it past the censors. Under the parallel method called Mandatory Declassification Review, an interagency appeals panel can overrule agency decisions, but this falls short of true accountability because the censors suffer no consequences for their initial refusal to open information. In general the system provides few incentives for declassification activism, and the addition of this dollop of bureaucratic caution is potentially crippling. Such attitudes were very likely involved in what happened to NSAM-29.

Fourth, classification inflation is a huge problem. Over the years secrecy has been imposed on less and less sensitive information, to the degree that the last (Bush) administration determined to treat even “sensitive but unclassified” information as secret. If WikiLeaks demonstrates anything, it is that “secret” has been stamped on much innocuous information simply to keep it off an easily available database. Our presidential directive NSAM-29 was rated “top secret” because it concerned a secret war in Laos. Today that information would very likely be placed above top secret, at the level of “sensitive compartmented information.” Classification inflation plus bureaucratic conservatism adds up to an ossification of openness, and to a burgeoning mountain of paper in high-tech vaults that must be carefully watched, which costs money every day. Secrecy has real costs, in real dollars.

A system based upon these pillars created the morass in which NSAM-29 was caught, one in which repetitive and contradictory classification decisions kept officials running around in circles. Little progress can be made under such circumstances. To address these problems, the Obama administration, in addition to steps such as issuing a new executive order on classified national security information (Executive Order 13526) and advancing uniform standards for secrecy regulations, has created the National Declassification Center, based at the National Archives, to promote the close interagency coordination necessary to rationalize the declassification system. Time will tell, however, whether the NDC is enough of a fix: whether it can raise review standards, reduce the drag of equity on declassification, and close the vicious circle of repetitive reviews to the point where what happened with NSAM-29 does not happen again.

Read the Documents 1–9 and Notes 1–6: …
