Several weeks ago, well-respected wargamer Barney Rubel posted a short article critiquing SecDef Hagel’s call to improve gaming as a means of enhancing innovation. The piece is a thoughtful review of many of the strengths and pitfalls of gaming and is well worth a read. However, I was struck by how many of Rubel’s arguments are old hat within the gaming community. For example, those readers who followed up on Rex’s suggestion to read Crisis Games 27 Years Later (which I also highly recommend) will recognize many of Rubel’s arguments.
While these limitations are important to be aware of, too often writing about gaming either takes the form of glowing praise by outsiders to the field (for an example, see Dave Anthony’s poorly-received Atlantic Council event from this fall) or cautious warnings by experienced gamers like Rubel noting the limits of the field. While professional conferences and publications feature a bit more nuance, they too can often fall into the same patterns.
While both points are important to consider, neither is very useful in thinking about what can be done to make gaming better. However, in the last two years or so I’ve seen more and more work that explores new techniques, and attempts to capture how they did (or did not) improve the ability of games to meet their objectives.
My goal is to use some of my time here at PAXsims highlighting these new techniques. My hope is that doing so will give a better sense of some of the areas where the field has made improvements, and help circulate new practices for feedback.
To start out this discussion, I want to talk about one of the areas I am most excited by – the use of new qualitative techniques that complement quantitative techniques and can be used in tandem to produce richer analytic results.
Wargaming has often been something of a little brother to operations research. As a result, the instinct of gamers (and DoD analysts more broadly) is often to reduce problems to terms as specific, and ideally quantitative, as possible. Many gamers have therefore preferred to look at problems that can be reduced to combat performance tables and other probabilistic ways of generating results. Such a strategy avoids many of the pitfalls of subjectivity raised by Dean Rubel. However, these methods do so by stripping out much of the complexity and indeterminacy that makes problems challenging in the first place, and as a result can greatly limit gaming’s utility.
An alternative to this approach relies on qualitative analysis, which may offer less specific findings but allows complex problems to be approached directly. Good qualitative analytic tools require a structured research design in advance of the game, and careful data collection so that the game can be analyzed after the fact. More and more often, gamers are discovering techniques from these traditions that can contextualize findings to offer rigorous, useful analysis without artificially simplifying problems.
To my (political-scientist-trained) mind, this distinction mirrors the division between quantitative and qualitative approaches in political science. While sometimes treated as antagonistic, the two approaches have been combined in a great deal of excellent recent work. Using “mixed methods” or interdisciplinary approaches can allow one method to buttress the weaknesses of the other. They can also be used iteratively as a way to drive research. (For those interested in more context about how this plays out in political science, I highly recommend this paper by Kai Thaler that discusses the application of a mixed methods approach to the study of political violence.)
I believe that this same approach can also be used to improve the quality of games by effectively leveraging quantitative and qualitative techniques to build better analysis.
One of the most important precepts of mixed methods is that not all techniques are appropriate to all questions. To my mind, questions about what potential causes are linked to specific outcomes are best handled qualitatively. Questions about the size of an effect are best handled through quantitative means. Questions about the categories into which events might be divided (or, to put it another way, how similar or different events are) may be suitable to either approach, depending on what types of quantitative and qualitative data are available.
When determining which approach is most appropriate, I also tend to look at what type of data can be collected and use that as a guide to selecting a method. If you are interested in the impact of an economic policy that would naturally be measured in dollars, or weapons performance that can be measured in rounds fired per minute, quantitative is the way to go.
If, on the other hand, you are interested in the process of decision-making between groups of people, where the output is spoken words or policy decisions, there is often no natural quantitative proxy. In these cases, I think you are often better off using qualitative techniques that can be applied to data like transcripts of dialogue, descriptions of interpersonal behavior, and records of what decisions were made at what points in time.
Mixed-methods approaches to gaming are still in their infancy. In part, this is because the field lacks a strong understanding of what qualitative approaches are available and what problems they are appropriate to in a gaming context. As a result, I believe a necessary step toward realizing the full potential of mixed-methods games is improving our qualitative analytic practices. There will be more on this point from me in the next year!