PAXsims

Conflict simulation, peacebuilding, and development


Simulating spooks? The CIA, simulations, and analyst recruitment

While many might associate the CIA with dissimulation as much as simulation, the Agency uses serious games and simulations in a number of ways. They are used, for example, in analyst training at CIA University (indeed, one well-known game designer teaches there). They are also sometimes used as an analytical technique, whether directly or through intelligence contractors and outside experts. Some argue they aren’t used enough—one CIA tradecraft primer warns that they are “advanced analytic methods” that “usually require substantial commitments of analyst time and corporate resources.”

A winning paper in the 2007 Director of National Intelligence “Galileo” essay competition (and subsequently published in Studies in Intelligence) suggests that skills in this area are unevenly distributed within the intelligence community, and proposes a “National Security Simulations Center” (somewhat modelled on both the Gaming Department at the Naval War College and the Center for Applied Strategic Learning at National Defense University) to act as a sort of IC center of excellence to “strengthen the accuracy and insight of intelligence analysis, improve IC collaboration, and create a testing ground for new analytic tools and methods.”

Be that as it may, I wanted to flag another area where the CIA’s use of simulations has certainly been expanding dramatically in recent years: specifically, the use of crisis simulations as part of its outreach and recruitment efforts at American college and university campuses. Initially, these exercises seem to have formed part of individual campus recruitment visits. Last year, however, they were expanded to become multi-school competitions. The November 2011 competition at Georgetown University, for example, included teams from twelve colleges and universities in the Washington DC/Virginia/Maryland area. According to a press release by the CIA, by the end of 2011 almost one thousand students across the US had participated in several dozen CIA simulations.

In a typical session:

Each five-person team was presented with the CIA-authored scenario: Printouts containing raw intelligence surrounding a fictitious—but plausible—developing international crisis. They had three hours to sort through the information and prepare a cogent half-page brief outlining the situation and suggesting a course of action for the United States.

Each team was also assigned an Agency mentor to observe and offer advice.

At the end of the simulation, the analysts reviewed the written briefs from all eight teams. The top two teams in each group engaged in a “brief-off” in front of the entire CIA contingent.

Further accounts of these simulations by some of the participating institutions and students can be found at the following links:

h/t Google

simulations and training intelligence analysts

The Committee on Behavioral and Social Science Research to Improve Intelligence Analysis for National Security of the (US) National Research Council recently published a useful and interesting two-volume set of studies on Intelligence Analysis for Tomorrow: Advances from the Behavioral and Social Sciences and Intelligence Analysis: Behavioral and Social Scientific Foundations. In the latter collection of research papers, Steve W. J. Kozlowski (Michigan State University) argues that:

If you need people to acquire declarative knowledge, reading (rereading and memorizing) a book or manual may be sufficient. But if you need deeper comprehension of decision-making strategies and the capability to adapt those strategies, then you need to engage active, mindful, effortful learning. These higher level competencies may require systematic, guided hands-on experience in the work context or a “synthetic world” simulation (Bell and Kozlowski, 2007; Cannon-Bowers and Bowers, 2009). Indeed, one of the key challenges for improving analytic skills in the IC is that timely feedback and evaluation of the accuracy of a forecast is typically lacking (e.g., the time frame is too long, the forecast influenced events, etc.). Because simulation incorporates “ground truth” or an objective solution, it could be used effectively to provide analysts with wide-ranging synthetic experience, exposure to low-frequency events, and opportunities to calibrate forecasts with the provision of timely, accurate, and constructive feedback and evaluation. For example, the Defense Intelligence Agency has recently begun using analytic simulation to enhance analysis and decision skills (Peck, 2008). These initial efforts could be augmented substantially by incorporating explicit instructional models in simulation design (Bell et al., 2008).

Yep, what he said—which is why I’ve always liked Kris Wheaton’s work in this area. There are others who also use simulation methods and games in training intelligence analysts (some of whom read this blog), and it would be nice if we could get them to write about it some time. Hint, hint.

More broadly, the point applies to everyone working in a professional field that requires the sometimes rapid analysis of fragmentary social, political, economic, and/or military trends and information: a description that fits NGOs, aid workers, diplomats, peacekeepers, and others working in fragile and conflict-affected countries.
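Kozlowski’s point about “ground truth” is easy to make concrete. Below is a minimal, hypothetical sketch in Python of the kind of feedback loop a simulation can support: analyst probability forecasts are scored against known outcomes with a Brier score, and binned into a simple calibration table. The data and function names here are invented for illustration, and are not drawn from any actual IC or DIA training system.

```python
# Minimal, hypothetical sketch: scoring analyst probability forecasts
# against simulation "ground truth". All data below is invented for
# illustration; it is not drawn from any actual IC training system.

def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and binary outcomes.
    0.0 is perfect; always forecasting 50% earns 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

def calibration_table(forecasts, outcomes, edges=(0.0, 0.2, 0.4, 0.6, 0.8, 1.0)):
    """Bin forecasts by stated probability and compare each bin's mean
    forecast with the frequency at which the event actually occurred."""
    rows = []
    for lo, hi in zip(edges, edges[1:]):
        in_bin = [(f, o) for f, o in zip(forecasts, outcomes)
                  if lo <= f < hi or (hi == 1.0 and f == 1.0)]
        if in_bin:
            mean_f = sum(f for f, _ in in_bin) / len(in_bin)
            hit_rate = sum(o for _, o in in_bin) / len(in_bin)
            rows.append((lo, hi, len(in_bin), mean_f, hit_rate))
    return rows

# An analyst's probability estimates for events across a batch of
# simulation runs, and whether each event actually occurred (1) or not (0).
forecasts = [0.9, 0.7, 0.3, 0.8, 0.1, 0.6, 0.4, 0.95]
outcomes = [1, 1, 0, 0, 0, 1, 1, 1]

print(f"Brier score: {brier_score(forecasts, outcomes):.3f}")
for lo, hi, n, mean_f, hit_rate in calibration_table(forecasts, outcomes):
    print(f"  {lo:.1f}-{hi:.1f}: n={n}  mean forecast={mean_f:.2f}  observed={hit_rate:.2f}")
```

In a training simulation the outcome of each injected event is known by design, which is precisely what makes this kind of immediate, objective scoring possible; as Kozlowski notes, real-world forecasts rarely receive such clean or timely feedback.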
