I started off the second day of this year’s Military Operations Research Society annual symposium by attending a presentation by Yuna Wong (Marine Corps) on The Search for the Black Herring: MORSS Strategist’s Corner. This wasn’t a gaming presentation per se, but Yuna is certainly well-known in the professional wargame community, and has been particularly active in encouraging the closer integration of social science theory and methods into gaming and strategic analysis. In particular, her talk asked what analytical organizations should do to prepare for future analytical challenges in an era of uncertainty. One of my current research projects looks at the political science of prediction, so I was especially interested to see what she had to say. (For those who might be wondering, a “black herring” occurs when analysts obsessively look for the next “black swan,” only to find “red herrings.”)
Her primary argument was that analytic communities needed to be multidisciplinary, broaden their methodological expertise, and use experts well. Among the challenges to being more multidisciplinary are the existing (US government) human resource system, as well as organizational culture. As an example of the latter she used the professional wargaming community, which tends to rely on internal measures of legitimacy (for example, many years as a boardgamer) that have an exclusionary effect on new and different talent. It is also a risk for organizations to depart from existing practices, and the search and start-up costs of becoming more multidisciplinary may be higher than maintaining the status quo.
She also was critical of the tendency of modern operations research to focus on narrow technical problems and answers. Such approaches may be less useful for addressing issues that are better characterized as a “mess” rather than a “problem.” (For the difference between these and the effective use of judgment-based methods, see NATO’s Code of Best Practices for Judgement-Based Operational Analysis.) The real “black swans,” she suggested, were to be found in the swamp of complex and chaotic environments.
Yuna had a number of useful thoughts too on effective use of outside expertise, addressing issues of identification, recruitment, facilitation, as well as practical issues (such as contracting and clearances).
I agreed with pretty much all that she had to say. However (following on from the work of Phil Tetlock and the Good Judgment Project) I asked whether we needed to pay more attention to cognitive styles. It isn’t just a question of finding people with differing areas of expert knowledge, but of finding people who are not locked into particular paradigms and do not filter everything through a preexisting worldview.
While not a wargaming panel, a great deal of what she had to say was of significant value for analytic gaming. Many wargames, after all, need to address messy problems, challenge conventional wisdom, engage broader expertise in game design and adjudication, and explore uncertain futures.
John Hanley Jr. (formerly ODNI) presented on Gaming and Game Theory: Using Game Theory to Advance Gaming. He argued that understanding game theory helped in both wargame design and analysis. Manual games, he suggested, have their limitations: they are not rigorous analysis; they don’t have fully reproducible results; they are dependent on the quality and characteristics of the gamers; they can be personnel-intensive (and hence expensive) and error-prone; and they aren’t real (and there is a consequent risk of over-learning from them). Many of these limitations can be reduced, however, by continuous gaming and a structure for capturing results. He highlighted this by discussing a series of games over the years at the Naval War College, subsequent analysis of which identified clear clusters of moves, responses, outcomes, and equilibrium strategies for Red and Blue. The data, however, was messy (suggesting that moves and adjudication need to be more clearly delineated and recorded). Capturing manual games in game theoretic form allows for more sophisticated analysis, helping to populate the strategy space and identify dominant strategies and equilibria.
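To give a sense of what that game-theoretic capture might look like in practice, here is a minimal sketch: once a set of manual game records has been coded into a payoff matrix, finding pure-strategy equilibria and weakly dominant strategies becomes mechanical. The payoff values below are invented purely for illustration, not drawn from any Naval War College data.

```python
# Rows = Blue strategies, columns = Red strategies (hypothetical values).
# blue_payoff[i][j] / red_payoff[i][j]: payoffs when Blue plays i, Red plays j.
blue_payoff = [[3, 1],
               [2, 2]]
red_payoff  = [[1, 3],
               [2, 2]]

def pure_nash_equilibria(blue, red):
    """Return (i, j) cells where neither player gains by unilaterally deviating."""
    rows, cols = len(blue), len(blue[0])
    eq = []
    for i in range(rows):
        for j in range(cols):
            # Blue cannot improve by switching rows, holding Red's column fixed
            blue_best = blue[i][j] >= max(blue[r][j] for r in range(rows))
            # Red cannot improve by switching columns, holding Blue's row fixed
            red_best = red[i][j] >= max(red[i][c] for c in range(cols))
            if blue_best and red_best:
                eq.append((i, j))
    return eq

def weakly_dominant_columns(red):
    """Red columns that are at least as good as every alternative in every row."""
    rows, cols = len(red), len(red[0])
    return [j for j in range(cols)
            if all(red[i][j] >= red[i][c] for i in range(rows) for c in range(cols))]

print(pure_nash_equilibria(blue_payoff, red_payoff))  # [(1, 1)]
print(weakly_dominant_columns(red_payoff))            # [1]
```

With real game data the hard part is, as Hanley noted, upstream of this step: delineating and recording moves and adjudication cleanly enough that they can be coded into a matrix at all.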
Daniel Stimpson (George Mason University) talked about Using Operational Patterns to Influence Attacker Decisions on a Transportation Network. The challenge he was addressing was how to anticipate an opponent’s IED attacks on transportation and logistics networks. To date, he suggested, much of the academic (and classified) literature did not adequately address attacker dynamics and the interaction between attacker and defender. Boyd’s OODA loop provided some of the conceptual underpinning for his approach.
He offered a useful discussion of the notion of “randomness,” noting that neither “variety” nor “surprise” was synonymous with randomness, nor was it the same as failure to predict (although prediction is only possible in constrained systems, and totally random systems are inherently unpredictable). Surprise, he noted, derives from a failure to predict—the system itself is not “surprised.” In his model, Blue seemed far more constrained in its tactical choices than Red (whose behaviour seemed to be wholly driven by trial-and-error learning). The presenter noted that there were limits in how complex the model could be, given resource and time constraints.
Douglas Samuelson (InfoLogix and Group W) offered a presentation titled Anybody Else Wanna Negotiate? Representing Negotiations Realistically in Wargames. He argued that negotiations are generally characterized by non-zero transaction costs and multiple representatives with non-identical interests. Because of this, negotiations often have multiple phases: reaching a deal, then selling a deal to constituents or clients. For a deal to last, it needs to keep producing benefits that outlast the negotiator’s involvement. Negotiations with many parties but clear interests are good candidates for mediation. Problems with shifting interests and unclear identities are more difficult to mediate. (He used the examples of North Korea and Israel-Palestine to illustrate his argument, although I was not convinced of his application of the cases.) He suggested that achieving the best possible deal is not always in a party’s interest, given the importance of promoting trust as a basis for continued interaction. He also addressed coalition-building, as well as conditions under which negotiators may wish to prolong talks, or actors may seek to derail negotiations by attacking the negotiators.
Unfortunately the presentation didn’t link this very well to wargame design or facilitation. For the most part it simply identified aspects of negotiation, and suggested at the end that a wargame ought to include these in some way. Some of the audience probed this point, asking questions about how we might best build the many complex aspects of negotiations into a wargame—the tricks of the trade, as it were, for manipulating players into realistic negotiation behaviours.
A key aspect of this, of course, is building an effective narrative that players will internalize, and understanding what player narratives indicate about perceptions and behaviours. Fortunately the next presentation was by Yuna Wong (USMC) and Sara Cobb (George Mason University) on Narrative Analysis in Seminar Gaming. Unfortunately the presentation was classified as For Official Use Only and those of us Canadians with yellow badges had to leave. (This is known in MORS wargaming parlance as being “Brian Train-ed.”)
With this, our cunning plan to use US government thinking about narratives and seminar gaming to assist in the rapid Canadian military seizure of Seattle, Fargo, and Albany as envisaged in Defence Scheme #1 was foiled. Curses!
(I should add the Working Group 30 chairs were very apologetic about this, and as Canadians we left very politely.)
The day ended with an excellent panel discussion on Practices in Wargaming that featured such wargaming luminaries as Peter Perla (CNA), John Hanley Jr., Jeff Appleget (NPS), Hank Brightman (NWC), and Ellie Bartels (Caerus Associates).
Three main topics were addressed. Far too much was said by both the panel and the audience for me to accurately record here, so instead I’ve briefly summarized some of the major points:
Gaming for DoD Analysis
- Ellie Bartels: We need to pay more attention to game design in order to be able to convince clients of the value of gaming methodologies.
- Jeff Appleget: You really need to pay attention to the human element in wargaming. We’ve been too focused on closed-loop combat models.
- Peter Perla: We need to integrate all the tools in the toolkit. Does DoD have any organizational incentive to listen to games and OR analysis that tells unpleasant truths?
- John Hanley Jr.: DoD should be gaming everything that involves the interaction of two or more players. Games are useful for developing concepts, identifying capability and intel needs, etc. However no game results can stand without independent substantiation.
- Hank Brightman: Our greatest challenge as analysts is that we work for senior decision-makers from hard science backgrounds who are most comfortable with quantitative data. We need to look at big problems as a whole, and use games to provide insights (but not answers).
- Audience questions, comments, and discussion:
- Are decision-makers so saturated with analysis that there is only limited capacity for games-based analysis? Can good analysis drive out bad? Who does wargaming best in DoD?
- Withholding information can help assure that players don’t get lost in the tactical weeds, and instead focus on operational and strategic levels.
- Almost all of the key insights of wargames are qualitative. However, there was push-back on this, with some suggesting that quantitative data extraction was also important.
Making a Playable Game
- Ellie Bartels: Players are human beings, and need to be treated as such and feel their contributions are valuable. We need to be parsimonious in our game design.
- Jeff Appleget: I have my students actually design a game for a DoD sponsor. They learn the challenges of deriving clarity from the sponsor. Adequate time for playtesting is important, since there is a design/play/revise iteration that is essential.
- Peter Perla: The closer the game is to familiar functions the easier it is to play. However you need a balance between a simple “talk it through” game, and formalisms that give players an opportunity to discuss how the game models the real world.
- John Hanley Jr: Take the people who are actually responsible and place them in a similar environment, and they really engage with the game.
- Hank Brightman: There are two types of game, experiential and analytical, and they have different requirements. We link the designer and the analyst from the beginning.
- Audience questions, comments, and discussion:
- It is important not to confuse one type of game with another.
- We don’t have a common gaming conceptual language.
- There are lots of folks who design bad games.
- There is more to truth than analysis. Games are created universes that can encourage insight, but they aren’t analysis.
- A game is in the minds of the players, not in the computer or in the table.
The Future of Gaming
- Ellie Bartels: The future of wargaming will depend a lot on who future wargamers and leaders are. Findings tend to be both complex and abstract, and you need analysts and leaders who are comfortable with that. We need to be multidisciplinary–even into the humanities!
- Jeff Appleget: The need for games is higher in an era of higher uncertainty. We need to communicate to senior leaders what games can, and cannot, do.
- Peter Perla: It seems as if gaming is on the rise again—although I’m nervous that it will do its (boom and bust) cycle again. We need to communicate its payoffs and limits. We ought to be able to communicate to future decision-makers with games.
- John Hanley Jr: My expectation is more of the same. My aspiration is that we devote more effort into making sense out of sets of games. We could be using online gaming to explore a larger chunk of the strategy space.
- Hank Brightman: We need to bring in folks from other fields. Future analytical gaming needs to use more analytical triangulation and mixed methods. I think that we’ll have a backlash against the impersonality of some digital gaming and interfaces.
- Audience questions, comments, and discussion:
- What is the impact of having commanders who went through wargaming?
- Distributed gaming on SIPRNet.
- What is the impact of the current gaming generation?
- What is the accountability mechanism for learning from game failures, or insights that turned out not to be very insightful?
- What are gaming worst practices?