PAXsims

Conflict simulation, peacebuilding, and development

Category Archives: methodology

Room to game (or, the Battle of Winterfell explained)

 


Course of action wargaming for the Battle of Winterfell. Might the room be responsible for the defenders’ military missteps?

 

The Battle of Winterfell was the final battle of the Great War against the Night King and Army of the Dead. While ultimately successful, the human defenders adopted a notoriously weak defensive strategy, involving poorly-defended ditches, misplaced archers and artillery, and a suicidal frontal cavalry charge.

Scholars and historians have suggested that weak scriptwriting was responsible for this. However, recent scientific research suggests that the real culprit might be the room selected for pre-battle course of action wargaming.

Everyone who has ever conducted a serious game knows that the room matters. How early can you get access? Are the tables big enough? Can they be moved (and are they all the same height)? Will the audiovisual and IT systems work on the day—and what’s your fallback if they don’t? Are there breakout/team/control rooms nearby? If so, will their location enhance gameplay (by fostering the right sorts of interaction and immersion), or undermine it? Where will coffee and lunch be served?

There is also, however, considerable evidence that room quality affects player performance in more fundamental ways. A recent study by M. Nakamura in Simulation & Gaming found that the size and layout of the room had significant effects on how players assessed the gaming experience in their debriefings:

Results from the current study demonstrate that the difference in room condition was influential. In HACONORI, participants felt more satisfaction in the small room than in the large room, while in BLOCK WORK, participants felt less usefulness in the small room than in the large room, but only when asked about the degree of usefulness before being asked about their degree of satisfaction. The effect of room condition seems to trend in the opposite direction in the two gaming sessions. This difference is because the amount of space has a different meaning in HACONORI and BLOCK WORK; for example, in HACONORI, group members can successfully work together by providing quick and responsive communication with each other. The small room must have encouraged such speedy communication. Conversely, in BLOCK WORK, participants can successfully work when they have more personal space since the task is more individualized; however, this may be affected by the order of questions. When participants were asked about the degree of usefulness after being asked about their degree of satisfaction, their attitude tone was fixed and the degree of usefulness was not affected by room condition. When asked about the degree of usefulness before being asked about their degree of satisfaction, they recognized the usefulness of the BLOCK WORK session in the large room more than in the small room.

We should take into consideration the movability of the desks as an essential factor in improving room function as this must have affected the results. In HACONORI, participants felt more satisfaction in the small room than in the large room. This is because the movability of the desks was high in the small room but low in the large room. In other words, the small room functioned well because of the movable desks.

Both studies reflect the powerful effect of room condition, which depends on the game attributes. They also demonstrate that the effect of the debriefing form is not as powerful as the effect of room condition, although as noted above, it is advisable to consider the order of the questions.

Perhaps even more striking are the results of a 2016 study by Joseph Allen et al. in Environmental Health Perspectives on the impact of room ventilation on cognitive performance. They established three experimental room conditions (“Conventional,” “Green,” and “Green+”) with varying concentrations of volatile organic compounds and CO2. The study found that “cognitive scores were 61% higher on the Green building day [and 101% higher on the two Green+ building days than on the Conventional building day].”

In other studies, lighting has also been shown to affect recall, problem solving, and other cognitive tasks (with some gender variation too). Room temperature has demonstrable effects on productivity, with 21-22°C the ideal temperature—although this likely also varies with age, gender, and other factors.

Taken together, the existing research on environmental conditions suggests that wargame participants in an appropriately lit, well-ventilated room will perform complex cognitive tasks roughly three times “better” than those in one that is too hot or cold, poorly lit, and poorly ventilated. I suspect that even my PAXsims colleague Stephen Downes-Martin—who could quite rightly quibble with how I’ve rather breezily aggregated different measures of task performance here—would agree that the room matters a lot.
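That “roughly three times” figure is just back-of-envelope multiplication. A minimal sketch, assuming the three effects are independent and combine multiplicatively: only the ventilation multiplier is taken from the Allen et al. results quoted above; the lighting and temperature multipliers are invented for illustration, not measured values.

```python
# Back-of-envelope aggregation of environmental effects on cognitive
# performance. Only the ventilation multiplier comes from the Allen et al.
# figures quoted above (+101% on Green+ days); the lighting and
# temperature multipliers are illustrative assumptions.
ventilation = 2.01   # Green+ vs. Conventional (Allen et al. 2016)
lighting    = 1.25   # assumed benefit of appropriate lighting
temperature = 1.20   # assumed benefit of a 21-22°C room

combined = ventilation * lighting * temperature
print(f"combined multiplier: {combined:.1f}x")   # combined multiplier: 3.0x
```

The exact answer obviously depends on the assumed lighting and temperature factors, which is precisely the breezy aggregation acknowledged above—but the order of magnitude is robust: a bad room plausibly halves or thirds cognitive performance.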

Back to Winterfell. Course of action wargaming of the battle took place in a cold and dimly-lit chamber of the castle (above). The tallow candles and open braziers used to illuminate the space undoubtedly produced high levels of CO, CO2, and particulate pollution of various toxic sorts. Moreover, few of the participants had bathed in weeks.


Was it the dragon or the room? Use of a well-ventilated war room (with natural lighting and healthy sea air) may have been an important factor in planning the very successful Battle of the Goldroad.

 

By contrast, planning for the very successful Battle of the Goldroad took place in the war room at Dragonstone. Unlike the dark and frozen chamber used at Winterfell, the room here is extremely well ventilated, has natural lighting, and is situated in a much more amenable climate. While many commentators suggest that the deployment of a giant fire-breathing dragon was key to the success of Daenerys Targaryen’s forces, we clearly cannot ignore the contribution made by an appropriate wargaming space during the critical planning phase.


Please take a minute to complete our PAXsims reader survey.

McGrady: Getting the story right about wargaming


At War on the Rocks today, Ed McGrady notes the recent debates about analytical wargaming within the US defence community, and has some thoughts to offer:

There is a debate about wargaming in the Pentagon and it has spilled out into the virtual pages of War on the Rocks. Some say wargaming is broken. Others believe the cycle of research will solve our problems. There is a deeper problem at the root of all of this: There is a widespread misunderstanding of what wargaming is and a reluctance to accept both the power and limitations of wargames.

What we are seeing in the debate about wargaming looks a lot like what wargaming is best at: telling stories. But we have told ourselves several different stories at the same time, and none of these stories really agree with reality….

But failure to understand wargaming — what it is and what it is not — risks screwing up the one tool that enables defense professionals to break out of the stories we have locked ourselves into.

He goes on to question the notion that wargames are analysis:

Wargames do not do this through analysis. Indeed, wargaming is not analysis. “Analytical wargaming” jams the two terms together in a vague way that can mean anything, and often does. To be sure, good wargaming requires analysis: To design a game, one has to understand how things work. But the most important analysis one does for a wargame is about the people and organizations involved, not the systems. For example, defense analysts often find themselves grappling with future force projections and procurement. But the one organization that matters most for future force structure is not included in the assessments: Congress. Wargames can help senior leaders consider things like Congress whereas standard models and analyses cannot.

Wargames can also be the subject of analysis, but tread carefully: Wargames are not experiments unless they have been specifically, and painstakingly, designed as such. They are events: unrepeatable, chaotic, vague, and messy events. Collecting data from them is difficult — they produce “dirty” data, you often miss the best parts, and they cannot be repeated. But if you think that means you can’t learn anything from them, you might as well stop trying to understand real-world conflicts, because everything I have written about wargames in this paragraph is also true for wars.

So, you can analyze wargames, just not the same way you would analyze a set of data from a radar system or a series of ship trials. But in your analysis you have to focus on what wargames can actually tell you, and avoid making conclusions about what they can’t.

He goes on to suggest what we need to do:

First, we need to get our story straight and get it out there. Wargames are the front-end, door-kicking tool of new ideas, dangers, and concepts. In particular, they help you understand how you will get stuff done in the messy, human organizations that we all work in. They are really good at that. We also need to make sure that people understand what wargames are not good at: detailed, technical, complicated analysis that needs to be done to optimize particular aspects of ideas or concepts. They can tell you that the enemy may target your logistics, but they won’t tell you exactly how many short tons you need to offload per day at the port.

Second, we need to push back against the opportunists and charlatans who are colonizing gaming. While these people always show up when areas get hot, they are particularly dangerous in wargaming. Wargames not only provide new ideas and concepts, but also influence the future decision-makers that play in them. About the best we can do is call out bad games when we see them and, as part of our getting the word out about gaming, describe what games to discount when you hear about a bad game.

We can start by saying meetings are not games and speculation is not play.

Third, we need to make sure decision-makers understand that a good game is only the beginning of the journey, not the end. Much more work needs to be done after the game to figure out, through analysis, whether all those fancy concepts and ideas will work. And if we think they just might work, then we need to burn jet fuel and soldier-hours in instrumented and observed exercises to figure out if our forces and equipment can actually execute them. For future systems where we can’t do exercises, this means bringing the actual engineers into the operational picture. One of the best ways to bring the systems developers into the picture is through games.

You can read the full piece here.



The Future of Wargaming working group report


PAXsims is pleased to provide more of the impressive work done at the Connections 2019 (US) professional wargaming conference. Many thanks to Ed McGrady for passing this on for wider distribution.

Also, if you haven’t yet, please take a minute to complete our PAXsims reader survey.


At Connections 2019 we held a working group (WG2) to explore the future of wargaming. We approached the problem several different ways. First, several members of the working group contributed fictional stories describing what gaming might look like in the future. Second, we had baselining briefs on future technologies, including virtual and alternate reality technologies and artificial intelligence. Finally, we did a scenario planning exercise with the working group attendees at the conference. This process resulted in a wide range of different ways to think about, and predict, the future of gaming.

The working group was co-chaired by Mike Ottenberg and Ed McGrady, with stories contributed by Sebastian J. Bae, Michael Bond, Col. Matt Caffrey (Ret.), Dr. Stephen Downes-Martin, Dr. Ed McGrady, and Dr. Jeremy Sepinsky.

Wargaming the Far Future working group report


PAXsims is pleased to present the report of the “Wargaming the Far Future” working group, ably assembled by Stephen Downes-Martin. This 276-page (!) document contains the papers written by the working group, their discussions while they wrote and refined those papers from November 2018 to June 2019, and the discussions at the workshop held during the Connections US Wargaming Conference in August 2019.

Our most potent power projection and warfighting capabilities, developed in response to current and near future threats, are technologically advanced, hugely expensive, and have half-century service lives. The first of these characteristics gives us a temporary and possibly short-lived warfighting edge. The second grants our political leaders short-lived economic and political advantages. The last characteristic locks us into high expenses in maintenance and upgrades for many years in order to justify the initial sunk costs as though they were investments. This combination forces us onto a high-inertia security trajectory that is transparent to our more agile adversaries, providing them with credible information about that trajectory while giving them time to adapt with cheaper counter forces, technologies and strategies.

We must therefore wargame out to service life, the “far future”, to ensure our current and future weapons systems and concepts of operations are well designed for both the near term and the far future. However, a 50-year forecasting horizon is beyond the credibility limit for wargaming. The Working Group and the Workshop explored and documented ways that wargaming can deal with this horizon.

Papers and comments are contributed by Stephen Aguilar-Milan, Sebastian J. Bae, Deon Canyon and Jonathan Cham, Thomas Choinski, John Hanley, William Lademan, Graham Longley-Brown and Jeremy Smith, Brian McCue, Ed McGrady, Robert Mosher, Kristan Wheaton, and of course, Stephen himself.



Defense One Radio on wargames


Defense One Radio has put together a very good 49-minute podcast on contemporary defence wargaming.

This episode we’ll learn why the Pentagon and the U.S. defense establishment are increasingly turning to wargames and simulations; what famous games of the past got right, and wrong; and why we still need experts who strategize almost exclusively in the analog world of plastic chips and toy soldiers and hexagon maps.

Guests include Becca Wasser, Stacie Pettyjohn, Ellie Bartels, Christopher Rice and Mark Herman.

You’ll find it here.



Pournelle: Can the cycle of research save American military strategy?


Phil Pournelle has added his thoughts to the debate over Pentagon wargaming, with a piece in War on the Rocks:

There’s a debate in the Pentagon about wargaming, and it’s heating up. With a recent War on the Rocks article, John Compton, senior analyst and wargame subject-matter expert in the Office of the Secretary of Defense, has put his hat in the ring. Titled “The Obstacles on the Road to Better Analytical Wargaming,” the essay lays out a powerful case that the Defense Department’s wargaming enterprise is broken.

Compton argues that wargamers have ignored Peter Perla’s call to reform the art of wargaming. Many practitioners continue to execute wargames which aren’t wargames (e.g. Bunch of Guys and Gals Sitting Around a Table), and have failed to adapt their types and styles of games to what the customers ask for. He then describes, accurately in my view, how many wargaming practitioners lack “analytic ownership,” and fail to properly construct their games using multiple methods.

While I largely agree with Compton’s criticism, I think he paints with too large a brush. Many in the wargaming community are working for the very reforms he calls for. Others work in fields which don’t directly apply, such as training or education. In some areas, however, he doesn’t go far enough. His article fails to highlight the danger of the status quo, and the real risk that poorly-constructed analysis (not just wargaming) can lead to battlefield losses. The future force is in danger of being designed based on the impetus of services’ prerogatives and history rather than on a proper inquiry, exploration, and evaluation worthy of a joint force. The detachment of wargaming and the other elements of analysis from an integrated approach cuts the military adrift from its analytic moorings just when the nation and its allies need it the most.

Practical advice on matrix games

I have been running Chris Engle matrix games since 1988. With the increase in popularity and use of matrix games, both recreationally and for more serious matters, I felt that I should be prepared to stick my neck out and try to provide some practical advice on how to run the games in order to get the best results.

I have collated my notes into a small booklet, with short comments on the following topics:

  • What are Matrix Games?
  • Academic Underpinning
  • My Version of How to Play a Matrix Game
  • Argument Assessment
  • Diceless Adjudication
  • Notes about arguments
  • Turn Zero
  • Number of Things you can do in an Argument
  • Use of Dice
  • Reasonable Assumptions and Established Facts
  • Turn Length (in game)
  • Game Length
  • End of Turn “Consequence Management”
  • Inter-Turn Negotiations
  • Elections
  • Secret Arguments
  • Measures of Success
  • Killing Arguments
  • Spendable Bonuses and Permanent Bonuses
  • Levels of Protection and Hidden Things
  • Big Projects or Long-Term Plans
  • Number of Actors
  • Writing the Briefs for the Participants
  • Recording the Effects of Arguments
  • The Components (and Characters) Affect the Game
  • Starting Conditions
  • Cue Cards
  • Large-Scale Combat
  • A House Divided
  • Announcements
  • The Order in which Actors make their Arguments
  • Random Events
  • Dealing with Senior Officers, Dominant People and Contentious Arguments
  • Nit-Picking vs Important Clarification
  • Why I like Matrix Games
  • A few Words of Warning

Please bear in mind that this was chiefly written as “notes” to support demonstrations and courses I have run using matrix games, rather than as a guide for someone who has never seen or heard of a matrix game.

The advice also does not cover how such games should be analysed in order to draw out any insights or conclusions. This is an important part of any professional game, but as I primarily use matrix games in an educational context, I haven’t had to do that. On the occasions when I have run games for government departments, they have carried out their own analysis of the games (due to the level of classification), so the booklet doesn’t really cover this area.

More recently I have had the good fortune to be able to experiment with a couple of different game set-ups and mechanics, and I have incorporated them into the guide.

The guide is still a “work in progress”, and probably always will be, but I would like to add more to it in the future, if it is helpful. If anyone has any feedback, please get in touch.

You can download the booklet here.

Matrix games at the Canadian Army Simulation Centre

The following report was prepared for PAXsims by David Banks and Brian Phillips.



Dave Banks of the Canadian Army Simulation Centre facilitates the use of a matrix wargame during the 2019 Civil-Military Interagency Planning Seminar.

For the first time in its ten-year history, a matrix game was employed during the Civil-Military Interagency Planning Seminar (CMIPS), conducted from 18 to 20 June 2019 at Fort Frontenac in Kingston, Ontario. The planning seminar is run annually by the Canadian Army’s Formation Training Group with support from the Canadian Army Simulation Centre (CASC).

 

Background

The intent of CMIPS is to foster understanding among the interagency participants, with the aim of building better relationships in advance of any future interaction in overseas or domestic settings. The CMIPS had approximately 50 participants, with half coming from the Canadian Armed Forces (CAF) and the remainder drawn from other government departments and international and local non-governmental organizations. The participants were broken into balanced groups of military and civilian personnel who then discussed a common scenario by way of a table top exercise (TTX). While this is a proven approach, the event organizer, Steve Taylor, felt that a matrix game could be an interesting improvement to the seminar this year.

Dave Banks and Brian Phillips, Calian Activity Leads (ALs) at CASC, with the support of CASC and the help of the other Calian Activity Leads, designed, developed and conducted a Matrix Game for one syndicate of the CMIPS. Dave Banks served as the Controller for the activity and Brian Phillips served as the Scribe.

This matrix game was intended to:

  • foster cooperation and understanding among the players (primary goal);
  • be a proof of concept for CASC in applying matrix games as a training and education tool; and
  • introduce the players to matrix games.

 

Conduct

The matrix game was held over two days followed by a review on the third day. Specifically:

Day 1 consisted of an introduction to matrix games, a briefing on the specific matrix game set in the Democratic Republic of the Congo (DRC), and a short read-in, and concluded with two hours of play (two turns). During Day 1 the problem faced by the actors was the likely arrival of Ebola in North Kivu province. As much as possible, the participants represented their own, or a similar, agency during the game.

Day 2 consisted of two and a half hours of additional play. During this session a random event card was played that depicted the President of the DRC dying in a plane crash on landing at Goma in North Kivu province. While foul play was not suspected, the death of the president was expected to disrupt the political environment and potentially heighten the risk of violence throughout the DRC and in North Kivu in particular.

 

Differences from Other Matrix Games

While there is no definitive form or format for a matrix game, there were a few features of the CMIPS game that might not be commonly found in other matrix games.

Actor Cards.  The CASC product had fairly detailed Actor cards which included:

  • a brief outline of the nature, purpose and involvement of the Actor in the situation;
  • the Actor’s objectives, both overt and covert (where applicable);
  • the Actor’s limitations (i.e., actions it would never take);
  • any specific special capabilities the Actor possessed (such as the ability to provide air or ground transport, deploy medical teams, etc);
  • the number, type and general location of map counters allocated to the Actor; and
  • a recap of the basic game procedures and concepts.

Further differences included having turns divided into three phases:

  1. Negotiation Phase (10 mins). During this phase the Players had 10 minutes to negotiate any support or cooperation they required amongst themselves.
  2. Argument Phase. Each player in sequence made the argument for their Actor’s action for that turn. Actions were adjudicated using a Pro and Con system and two six-sided dice. Each player had a maximum of five minutes for their argument, which was strictly enforced by the Controller.
  3. Consequence Management (10 mins). During this phase the Scribe read back the Actions for the turn and some of the consequences were articulated including some consequences that the Players were unlikely to have foreseen.
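The Pro and Con adjudication used in the Argument Phase can be sketched in a few lines of Python. The report does not give CASC’s exact modifiers or success threshold, so the convention assumed here (add the number of Pros, subtract the number of Cons, and succeed on a modified 2d6 total of 7 or more) is a common matrix game rule of thumb rather than the rules actually used at CMIPS:

```python
import random

def adjudicate(pros, cons, rng=None):
    """Adjudicate one matrix game argument with two six-sided dice.

    Assumed convention (not the report's exact numbers): roll 2d6,
    add the number of Pros, subtract the number of Cons, and succeed
    on a modified total of 7 or more.
    """
    rng = rng or random.Random()
    roll = rng.randint(1, 6) + rng.randint(1, 6)
    total = roll + pros - cons
    return total >= 7, roll, total

# Example: an argument with three Pros and one Con, seeded for repeatability.
success, roll, total = adjudicate(pros=3, cons=1, rng=random.Random(1))
print(f"rolled {roll}, modified total {total}: "
      f"{'success' if success else 'failure'}")
```

In play, the Controller elicits the Pros and Cons from the whole table before the dice are rolled, so the modifier itself becomes a structured group discussion of the argument’s plausibility.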

 

Results

Overall, the matrix game was very well received by the participants. While the matrix game participants did not go into as much fine detail as some of the other syndicates did in their TTXs, the matrix game was immersive. One civilian participant remarked that the experience of uncertainty going into the first negotiation phase was exactly the same sort of experience that he had getting oriented on a previous humanitarian mission.

 

Key Findings

  • As this was the first matrix game run by ALs from CASC, the three play-testing sessions conducted prior to the event proved to be invaluable. Even for facilitators with significant experience in running TTXs, this specific preparation was instrumental in successfully executing the matrix game at the first attempt. The time invested in deliberate play-testing and game development is well spent.
  • The two-person facilitation team of a Controller and a Scribe worked very well. Both the Controller and Scribe exercised firm control at different times to ensure the game stayed within the admittedly fairly wide arcs established for play. We strongly believe that this firm control is vital to the success of a matrix game: without it there is a risk that the game may degenerate, particularly if there are strong personalities around the table.
  • The key advantage of the matrix game noted by the players over a traditional TTX was the fact that the players had to participate. They could not sit at the table and just observe one or two participants dominate a TTX, rather, they had to make decisions and actively contribute.
  • There is ample reference material readily available for building matrix games, from The Matrix Game Handbook (Curry et al.) to the Matrix Game Construction Kit offered by PAXsims, as well as several online resources. As such it was fairly easy to find useful graphics for game pieces, and ideas for rules, event cards, and game conduct, through a simple web search. Tom Mouat’s website was invaluable and his Practical Advice on Matrix Games v10 was particularly useful.
  • The formal turn-structure of phased turns including, in particular, a Negotiation Phase, directly contributed to achieving the game objective of fostering co-operation and understanding amongst the players. The inclusion of a Negotiation Phase was one of the outputs of the three play-testing sessions.
  • The Consequence Management (CM) Phase was only partially successful, and would benefit from some modification in future. Because CM can function almost like a random event card, it should be implemented with care and forethought: at the end of the turn there should be a slight pause while the Controller and Scribe discuss the consequences and how they want them to proceed. Alternatively, the CM phase could revert to a Situation Update/Summary phase, with the consequences determined by the Controller and Scribe during the Negotiation Phase and briefed at the end of that phase. This will be play-tested prior to the next running of the CMIPS matrix game.

 

Conclusion

The feedback from the CMIPS participants indicates that a matrix game proved to be a worthwhile investment of time and resources. These games take longer to prepare than a traditional TTX but the players’ active participation in the game experience made it a valuable learning event.

Matrix games have been added to the toolset offered by CASC and future serials of the CMIPS will likely continue to use this innovative activity.

 


Authors 

Lieutenant-Colonel (Retired) David Banks served 38 years in the Infantry, both Regular and Reserve. He is a graduate of the Canadian Army Command and Staff College 1990 and is a Distinguished Graduate of the United States Marine Corps Command and Staff College Quantico 1997-98. David has completed a number of overseas operational tours including Afghanistan, and participated in several major domestic operations in Canada. He has worked as an Activity Lead for Calian in support of the Canadian Army Simulation Centre and the Canadian Army Formation Training Group since 2011.

Lieutenant-Colonel (Retired) Brian Phillips spent 27 years in the Regular and Reserve force initially as an Infantry Officer and later as an Intelligence Officer. Brian holds an MA in War Studies (1993) and an MA in Defense Studies (2015) both from the Royal Military College of Canada and he is a graduate of the Canadian Army Command and Staff College in Kingston (2005) and the Joint Command and Staff Programme in Toronto (2015). Brian’s operational experience includes the 1997 Manitoba Floods, Bosnia, Kosovo, the Middle-East, Haiti with the DART in 2010 and Afghanistan twice. He has been employed as an Intelligence Specialist and Activity Lead for Calian in support of the Canadian Army Simulation Centre since 2017.

Lacey: Teaching operational maneuver

The following piece has been contributed by Dr. James Lacey, Professor of Strategic Studies at the Marine Corps War College and author of the recently-released The Washington War: FDR’s Inner Circle and the Politics of Power That Won World War II.



Picture credit: War on the Rocks

TEACHING OPERATIONAL MANEUVER

For well over five decades the U.S. military has ruled the tactical battlefield. While much of this tactical superiority is explained by superior military technology, it mostly reflects the thousands of “sets and reps” tactical leaders receive in training events, the professional military education system (PME), and actual combat. We have highly capable and rapidly adaptive tactical units because, to a degree unequaled in other militaries, U.S. forces really do train as they fight. As such, the battlefield is a familiar place, and given virtually any situation, an American combat leader can instantly reach into his memory to retrieve a similar circumstance from training.

This capacity of “instantaneous pattern recognition” is what keeps leaders from freezing in combat. So, although every training or combat situation has its own unique elements, effective training almost always creates sufficient similarities for experienced leaders to draw upon a stored “mental template” to rapidly build, in their mind’s eye, an accurate picture of the fight, and to immediately start making decisions. It is during home station training, while at training centers, on deployments, and in classrooms that our tactical leaders get the “sets and reps” they require to “see” the battlefield and react rapidly and appropriately, while under the stress of combat.

Unfortunately, none, or precious little, of this level of preparation exists at the operational level and above. While we are fantastic at fighting battalions and regiments/BCTs, the skills necessary to fight a dozen or two dozen BCTs as a coherent whole in a swirling maneuver battlefield have atrophied. If Multi-Domain Battle is going to become a battlefield reality, we must once again teach senior leaders how to fight battles, campaigns, and wars above the BCT level. Further, rising senior leaders need to relearn the art of combining a series of battles into combinations of war-winning campaigns.

Some may argue that PME accomplishes this at the ILE level. Admittedly, there are some small pockets where the rudiments of what is necessary are still being taught, but, for the most part, ILE (and related) institutions no longer teach operational maneuver. Instead they teach the “process.” Told to get ‘Force A’ to ‘Objective X’, an ILE graduate can lay out courses of action, and present a plan to move along ‘Axis Y or Z’ to arrive at the objective. They can also do much of the detailed staff planning necessary to make such a move possible. What they cannot tell you is whether ‘Objective X’ was the right place to assault in the first place.

I first noted that our senior officers had no idea how to ‘think about or conduct’ an operational battle while attending various Service wargames. For instance, in one major game the scenario called for US and NATO troops to retake the Baltics, currently occupied by Russian forces. The solution that a room full of field grade officers arrived at was to send the attacking force straight north from Poland. The predictable enemy response was to launch the 1st Guards Tank Army into the attacking force’s unprotected flanks and rear – obliterating the four NATO divisions.

This only confirms something that has disturbed me ever since I began employing wargames in War College classrooms. To help the students master the mechanics of these complex games I bring in local civilians with years of operational and strategic level wargaming experience… but no military experience. In every case, no matter the time period or the game level (strategic or operational), the war college students are consistently outclassed by civilian hobbyists – it is not even close. This holds true even after the students have played the game a few times and fully understand the game rules and mechanics. Time and again, my students are out-thought by civilians with no military experience or education.

This does not mean that civilian wargamers would be effective on a real battlefield. In truth, few of them could lead a platoon out of a paper bag, and most would seize up if confronted by a real combat situation. Moreover, wargamers lack the experience-based judgement that is a product of years of training and combat experience. When one plays a wargame, every unit has a set of assigned numbers, which typically everyone knows at the start of the game. For instance, unit counters will typically have their strength, speed of movement, and other factors printed on them. So, when a friendly unit runs into an enemy unit, one can quickly calculate relative strengths and, with a glance at the game's combat results table, instantly know the probability of success of any engagement.

In real life things are never that easy. A unit's strength is always a judgement call that must be made by an experienced commander, and this judgement (a mental number) is constantly changing as the battlefield situation evolves. For instance, a battalion commander might mentally consider his best company a "10" on a scale of 1 to 10. But he may assign that same company a "6" after it has been in prolonged combat for 72 hours without rest… and reduce it further to a "4" or lower if it has lost a few key leaders. If he manages to rotate the company out of the line for 48 hours of rest he may, once again, elevate it to a "7," and then make it an "8" after it receives some quality replacements. In combat, commanders are continually assessing their units and judging their relative effectiveness; no one is giving them that number. Moreover, the best commanders are doing the same thing when they judge the relative combat power of their battlefield opponents.
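The combat-results-table mechanic described above can be sketched in a few lines. The odds columns, die-roll outcomes, and strength values below are invented for illustration (AE = attacker eliminated, AR = attacker retreats, NE = no effect, DR = defender retreats, DE = defender eliminated); actual CRTs vary widely from game to game:

```python
import random

# Hypothetical combat results table, keyed by attacker:defender odds
# column, then by a 1d6 roll. All values here are made up for the sketch.
CRT = {
    "1:2": {1: "AE", 2: "AE", 3: "AR", 4: "AR", 5: "NE", 6: "DR"},
    "1:1": {1: "AR", 2: "AR", 3: "NE", 4: "NE", 5: "DR", 6: "DR"},
    "2:1": {1: "AR", 2: "NE", 3: "DR", 4: "DR", 5: "DR", 6: "DE"},
    "3:1": {1: "NE", 2: "DR", 3: "DR", 4: "DE", 5: "DE", 6: "DE"},
}

def odds_column(attack_strength, defense_strength):
    """Round the attacker:defender strength ratio down to a printed column."""
    ratio = attack_strength / defense_strength
    if ratio >= 3:
        return "3:1"
    if ratio >= 2:
        return "2:1"
    if ratio >= 1:
        return "1:1"
    return "1:2"

def resolve(attack_strength, defense_strength, die=None):
    """Resolve one attack: find the odds column, roll 1d6, read the result."""
    column = odds_column(attack_strength, defense_strength)
    roll = die if die is not None else random.randint(1, 6)
    return column, CRT[column][roll]

# A 9-strength attack on a 4-strength defender fights on the 2:1 column.
print(resolve(9, 4, die=5))  # ('2:1', 'DR') — defender retreats
```

The point of the sketch is Lacey's: because the strengths are printed on the counters and the table is public, the whole calculation is mechanical, which is precisely what a real commander's constantly shifting mental estimate is not.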

At the operational level of war, the capacity to make such judgements is the result of years (decades) of accumulated experience. This is why the judgement of wargamers cannot be applied in an actual combat environment. Still, wargames remain the only way to "simulate" war at the operational level and above, short of training maneuvers on a scale no one is willing to pay for. And despite the shortcomings of wargames and civilian wargamers as military leaders, a singular truth remains: at the strategic and operational level, civilian wargamers display a capacity for "instant pattern recognition" that very few field grade officers can match. In most cases, a civilian wargamer requires only a cursory glance at a map and a rudimentary understanding of the game mechanics and objectives to comprehend the entire situation and decide on a course of action. Similarly, I can set up actual operational or strategic situations from World War II (or any past war) on a map and the civilian wargamers will come up with a plan of action in a fraction of the time it takes most professional military officers.

The answer appears simple: our PME system must wed its students' undoubted tactical expertise, leadership abilities, and judgement to the "instant" operational and strategic "pattern recognition" that many civilian wargamers possess. Getting there, however, is not going to be easy, as it means undertaking a major curriculum upheaval within almost every PME institution at the ILE level and above.

For over a decade and a half, field grade PME institutions have been focused on teaching leaders how to integrate an all-of-government approach to fighting COIN conflicts. Given the global situation – almost all of the nation's landpower engaged in two COIN fights – this was undoubtedly the right thing to do. But, while we were fighting in Iraq and Afghanistan, the world refused to sit still. As we raise our sights above the COIN fight we find ourselves confronting two global military powers, each capable of meeting U.S. forces on the battlefield as peer competitors. It took nearly a decade to get the right people within PME to transform our institutions into COIN academies. Unfortunately, our potential peer-level opponents are unlikely to allow us that much time to realign curriculums back toward operational maneuver.

At this level of warfare civilian wargamers have a tremendous intellectual lead over most military professionals, as they typically have thousands more strategic and operational "reps and sets" than the average field grade officer. Our nation has been served well by company, battalion, and brigade level leaders who, having endured thousands of "tactical reps," have repeatedly proven themselves demonstrably superior to their battlefield opponents. After two decades of training and combat experience we can be reasonably sure that a lieutenant colonel confronted with almost any tactical situation (real or simulated) will think quickly, move rapidly, and act decisively, all because he has a stored "mental template" to work from. But, unless they are self-taught, military leaders are given few, if any, "reps and sets" at the operational level. Consequently, when confronted with an operational or strategic level problem, their capacity for rapid and decisive action vanishes.

The second great advantage civilian wargamers have over most military professionals is a deep grounding in history, particularly military history. That this advantage exists is somewhat surprising, as military officers are told from the start of their careers that they need to read widely and deeply into all aspects of military history. Unfortunately, disturbingly few bother to do so.

Almost every wargame hobbyist I have met is a walking encyclopedia of historical knowledge. Sit down to play one in a simulation of the Battle of Gettysburg and you will discover that they not only know the big events of the battle; most of them can also tell you what time and from what direction each of Hill’s and Ewell’s brigades arrived on the first day.  But their knowledge usually goes far deeper than such minutiae. Over numerous discussions, I have discovered that they are almost always well-read on the politics, diplomacy, and economics behind any strategic game or simulation. In fact, when it comes to discussing history the average wargamer of my experience can hold his own with any War College faculty member.

Consequently, when a wargame hobbyist examines a new operational or strategic situation he draws upon a huge reservoir of knowledge to contextualize and understand what he is looking at. In short, he has thousands of "mental templates" in his head that help him make sense of even the most complex situations. Moreover, he has a very good idea of what others have done in similar situations – what worked and what failed. The typical field grade officer, on the other hand, bereft of the opportunity to develop such "mental templates," views every situation he is exposed to (and that is far too few) as something totally new… and every approach as novel.

As we begin to reform and realign PME our first question must be: how do we take tactically proficient, proven leaders and turn them into – to use an old term – maneuverists? There really is only a single answer; it is the same one that made them masters of the tactical battlefield. We must increase the number of operational and strategic "reps and sets" they are exposed to. This is the only way to instill in our future senior leaders the "instant pattern recognition" necessary to make them outstanding operational commanders and strategic thinkers.

There are, regrettably, no quick fixes for this problem, as there is no crash course that will give senior leaders the thousands of operational or strategic "reps" they require. Moreover, while most would agree that a leader's progress toward higher levels of operational and strategic comprehension should start early in their careers, this has always proven a bridge too far. Besides, this is the time when young leaders must focus on the basics of the profession, learning how to lead, and becoming tactical masters. We can, however, certainly do much better in placing more operational maneuver wargaming and simulations at the ILE level, and then use the war colleges to reinforce these initial "reps and sets."

I am not advocating turning the "entire" curriculum over to wargaming/simulations and other forms of experiential learning, but they can and should become the "centerpiece" of operational level and strategic education. As Deputy Secretary of Defense Bob Work and Vice Chairman of the Joint Chiefs Paul Selva have written: "Should we instead think about using wargames that explore joint multidimensional combat operations to pursue our JPME objectives? Building school curriculums around wargaming might help spark innovation and inculcate the entire Joint Force with a better appreciation and understanding of trans-regional, cross-domain, multidimensional combat." Only by placing our future senior commanders within a series of operational and strategic situations can they begin building the "mental templates" and decision-making skills necessary for success on the maneuver battlefields of the 21st century. Time spent on often useless electives would be much better used running a series of operational and strategic exercises (or other experiential learning events) that will teach as well as challenge students at the higher levels of warfare.

The second part of the solution is to finally get serious about teaching military history to future strategic leaders. By this, I mean history writ large, in a program where military history is the focus, but which also includes the political, economic, and diplomatic contexts in which conflicts are conducted. It is no longer sufficient to create a booklist and hope officers read it (most do not). A professional reading program must be instituted and enforced (not talked about) at every level. At its best, such a program would eschew lists of required books in favor of something akin to study guides. For instance, an officer desiring to develop a better understanding of the American Civil War would be able to access a two- or three-page guide listing a number of books he can choose from, depending on his current emphasis of study.

Where would I like us to get to? As a start, I would hope that every field grade officer would have the knowledge to reply to General Bernard Law Montgomery's request for three courses of action to take Arnhem with: "Sir, should we not first consider taking Antwerp?"

If you have no idea what the above analogy references, or you don't know why "Antwerp" is the right answer, then your study of military history is sadly deficient. Get to work on that.

James Lacey

 

Wargaming and its place in PME

WoTRLeeLewis.png

War on the Rocks has just published a piece by Carrie Lee and Bill Lewis of the US Air War College entitled “Wargaming Has a Place, But is No Panacea for Professional Military Education.”

The school year is about to start, and not just for the kids. Senior-level professional military education is about to begin a new academic year, with new classes of students from across the services preparing to embark upon ten months of education that is meant to elevate their thinking from the operational and tactical to the strategic level. In the two years since the release of the National Defense Strategy (and the now-infamous paragraph that declared professional military education to be "stagnant"), a heated debate has emerged on the pages of this website about the best ways to accomplish the mission of professional military education. Suggestions for improvement have spanned the gamut, from teaching students to be good staffers to introducing diversity — both in the faculty and the curriculum — to improving the ways in which we assess strategic competency. Others have pushed back, pointing out that professional military education already is highly responsive to change and warning about the dangers of the "good idea fairy." In April, James Lacey of the Marine War College proposed another solution: All professional military education institutions should include board game wargaming as a part of their curriculum.

While this recommendation may hold appeal with those who are explicitly focused on military history and operational art, Lacey’s proposal is both short-sighted and misses the importance of diversity in professional military education — both between service colleges and in the curriculum itself. There is little doubt that experiential learning can be a valuable part of any education, including professional military education. But it also comes in many forms, all of which have benefits and costs. If the mission of professional military education is to educate the next generation of senior leaders about the strategic level of war and expose them to the tools they will need to succeed at that level, then we must use a variety of methods across the service colleges, rather than defaulting to a series of one-size-fits-all solutions.

They conclude:

In order to best educate and prepare our students for this complex and challenging environment, a variety of tools are necessary, and “one size fits all” solutions may do more harm than good. There are many types of immersive programs that can be employed to achieve a broad range of learning objectives. We should strive to view our curriculum not as a checklist of required activities but instead as a wholistic educational experience.

Lee and Lewis are right, of course, that serious gaming is not some magic educational bullet. It takes time. Not all wargames are fit for educational purpose, even if they work well as hobby or analytical games. Academic schedules are crowded, and you can only do so much. There are many teaching techniques available. There is even evidence that simulations, when used poorly, can do educational damage.

That being said, I'm not sure they really offer a great deal of guidance on what should be used when and in what ways, how this relates to other teaching techniques, and how we might measure the effectiveness of all this.

Jim Lacey, whom the authors critique as a point of departure, was quick to post a response on Facebook (reproduced here with permission):

Well, it is not every day my approach to teaching strategic studies is called "shortsighted" by folks who apparently have no idea what I do. But I suppose it is always an easy out to set up a strawman – no matter how it departs from reality – as a foil to base an article upon.

In any event, it may have helped if you had read my earlier article on the topic.

But in hopes of increasing your understanding of how we educate MCWAR students, please allow me to offer the following. During the course of the year MCWAR students participate in a number of experiential events, including:

  • Conducting several staff rides, including Yorktown, the Overland Campaign, Gettysburg, Antietam, and Normandy. FYI, the students also go on a two-week trip to either Europe or Asia to immerse themselves in current issues.
  • Engaging in multiple simulations (as you describe them). This includes participating in two multi-day geopolitical simulations at Tufts and Georgetown universities. Moreover, we employ a number of in-house simulations across a spectrum of historical, current, and future-related topics.
  • I would dare say we also employ a large number of models (as you describe them) throughout the year.
  • When it comes to wargaming MCWAR employs the entire gamut: seminar games, matrix games, board games, computer assisted games, etc.
  • Engaging in a number of simulations and wargames based on future scenarios against China, Russia, and Iran, which feed directly into ongoing concept development and Title 10 wargames.
  • We also use boardgames, but they remain both a subset of our overall curriculum and a subset of our experiential learning program.

In any event, boardgames are never used in isolation. Let me give one example.

As part of our military history curriculum we examine the Civil War. The structure of that program breaks down as follows:

  1. The students are given a set of readings to finish before they enter the classroom
  2. They are then directed to a website I am developing, where they can listen to lectures from some of the best Civil War historians in the nation.
  3. They are also given CDs so that they can listen to other lectures in their cars
  4. Then, once they have absorbed this material, we conduct our seminar sessions. We only have two seminars at MCWAR…. So I break each of them into two parts and conduct a series of seminars with only 7-8 folks in each (as close to an Oxford tutorial as I can get).
  5. After all of this we conduct a board wargame. I run 3-4 wargames at the same time, so all of the students can fully participate. I have local community volunteers (long-time wargamers) sitting at each game to take care of the game mechanics, so that the students can focus on strategic decisions
  6. Then, when all of that is done, the class goes on their staff rides.

I am always looking for ways to improve, and am hopeful that you can suggest ways I can do so.

In any event, I just wanted to clear the air and correct any misperceptions you and your co-author have as to how MCWAR sets up its curriculum, as well as my approach to teaching and the use of wargames. Of course, much of this could have been easily cleared up with a phone call or an e-mail before you went to print. But, moving on… if there is anything I can do to assist your efforts to increase and enhance the use of modeling, simulations, and wargaming – or any other experiential learning methodology – at the Air War College, please do not hesitate to ask.

Thank you for your time and comments. I look forward to learning more about how the Air Force conducts experiential learning.

This isn’t the first such debate. I’m not sure it should even be a debate, however. Rather, it points to the value of a common-sense “toolkit” approach to serious gaming. Wargames are tools. Sometimes they may be the best tool for the job. Sometimes there are better tools. Sometimes they are a pretty bad fit. Almost always, they need to be used in conjunction with other techniques.

Setting the (wargame) stage

Slide1.jpeg

I delivered a (virtual) presentation today to the Military Operations Research Society wargaming community of practice on the importance of “chrome,” “fluff,” and other finer touches in promoting better game outcomes through enhanced narrative engagement. Having forgotten to set a calendar reminder, I was fifteen minutes late for my own talk, which only served to reinforce the stereotype of absent-minded professors. Apologies to everyone who had to wait!

The full set of PowerPoint slides is available here (pdf). Since the content may not be entirely self-evident from the slides, I’ll also offer a quick summary.

Slide4.jpeg

First, I argued—in keeping with Perla and McGrady’s discussion of “Why wargaming works“—that narrative engagement is a key element of good (war)game design and implementation.

Slide6.jpeg

In addition to their experience-based, qualitative argument, I adduced some quantitative, experimental data that shows that role-playing produces superior forecasting outcomes…

Slide8.jpeg

…and that the way we frame and present games has profound effects on the way players actually play them.

Slide9.jpeg

I also noted a substantial literature on the psychology of conflict and conflict resolution that points to the importance of normative and other non-material factors in shaping conflict and negotiating behaviour.

Slide11.jpeg

In other words, if your games don’t have players feeling angry, or aggrieved, or alienated, or attached to normative and symbolic elements, they’re acting unrealistically. Since the selling point of wargaming is that it places humans in the loop, you need those players playing like real humans, not technocratic, minimaxing robots.

Doing that, I suggested, requires nudging participants into the right mindset. One has to be careful not to overdo it—some participants may recoil at role-play fluff that makes it all look like a LARP or a game of D&D.

What followed was a discussion of some practical considerations and techniques I have used, which was also intended to spark a broader conversation. Specifically, we looked at the following:

  • How player backgrounds and player assignment will influence how readily participants internalize appropriate perspectives.
  • Briefing materials should be designed to subtly promote desired perspectives and biases (without being too obvious about this). Things like flags, maps, placards, and so forth can all be used to make players identify more closely with their role.
  • In repeated games—for example, wargames in an educational setting that might be conducted every year—oral traditions and tales from prior games can make the game setting richer and more authentic (although at the risk of players learning privileged information from previous players). Participants might also contribute background materials, chrome, or fluff that you can use in future games—such as the collection of songs from Brynania that my McGill University students have recorded over the past twenty years.
  • Very explicit objectives and “victory conditions” should often be used sparingly, lest they promote both an unrealistic sense of the rigidity of policy goals and excessively “tick-off-the-objective-boxes” game play.
  • Physical space should be used to subtly shape player interaction, whether to foster interaction, limit it, or even create a sense of isolation and alienation.
  • Coffee breaks and lunch breaks should be designed NOT to pull players out of their scenario headspace. The last thing you want is Blue and Red having a friendly hour over lunch talking about non-game matters in a scenario where they are supposed to distrust or even hate each other.
  • Fog and friction should be promoted not only to model imperfect information and imperfect institutions/capabilities, but also to subtly promote atmospheres of uncertainty, fear, crisis, panic, frustration, and similar emotional states, as appropriate to the actors and scenario.
  • The graphic presentation of game materials should encourage narrative engagement and immersion. Avoid inappropriate fonts and formats, make things look “real,” and be aware that game graphics can very much affect how players (and analysts) perceive the game and its outcomes.

A variety of other issues came up in the Q&A and discussion. Many thanks to everyone who participated—I hope they found it as useful as I did.

Slide18.jpeg

 

 

CNA: After the wargame

cna-logo

In the third part of their wargaming trilogy, the CNA Talks podcast explores data collection and analysis in professional wargames:

In part three of our occasional series on wargaming, CNA’s chief wargame designer Jeremy Sepinsky returns, accompanied by Robin Mays, research analyst for CNA’s Gaming and Integration program, to discuss how they analyze the results of a CNA Wargame. Jeremy starts by describing the “hotwash” discussion that occurs immediately after a wargame concludes, and what insights participants often take away. Throughout this episode, Jeremy and Robin describe the type of information note takers record during a wargame, and how that data gets used in the final analysis. Using examples from actual wargames about logistics, medical evacuation and disaster relief, they explain how analysis reveals insights not readily apparent to those who played the game.

The link above also contains links to Parts 1 and 2.

Also, for those interested in game analysis, be sure to read the results of our DIRE STRAITS experiment on how analysts can influence (or bias) analysis.

Squeezing the Turnip: The Limits of Wargaming

The following piece has been written for PAXsims by Robert C. Rubel.


 

squeezing_blood_out_of_a_turnip.gif

“Measure it with a micrometer, mark it with chalk and cut it with an axe” is an old adage that cautions us that the precision we can achieve in a project is limited by the least precise tool we employ.  We should remember this wisdom any time we use wargaming for research purposes.  Dr. John Hanley, in his dissertation On Wargaming, says that wargaming is a weakly structured tool that is appropriate for examining weakly structured problems; that is, those with high levels of indeterminacy – problems where important aspects, such as the identity of all the variables, are unknown.  Problems with lesser degrees of indeterminacy are more appropriately handled by various kinds of measurement and mathematical analysis.  However, as the tools for simulation and the analysis of textual data become more sophisticated, the danger is that we will attempt to extract from wargaming a precision it is simply not appropriate to seek.

There are three aspects to this issue that we will address here: the inherent ability of wargaming to supply data that can be extrapolated to the real world, the development of “oracular” new gaming systems, and the number of objectives a particular wargame can achieve.

Peter Perla wrote, back in 1990, what remains the standard reference on wargaming, the aptly titled The Art of Wargaming. Of late there has been a lot of discussion online about wargaming as a science, or perhaps more precisely, the application of scientific methodology to wargaming.  There is no doubt that a rigorous, disciplined, and structured approach to designing, executing, and analyzing wargames is a good and needed thing. Too often in the past this has not been the case, and lots of money, time, and effort have been wasted on games that were poorly conceived, designed, and executed.  Worse, decisions of consequence have been influenced by the outcome of such games.  But even the most competently mounted game has its limits.  In this writer’s view, games can indicate possibilities but not predict; judgment is required in handling their results.

It is one thing to use a game to reveal relationships that might not otherwise be detected.  A 2003 Unified Course game at the Naval War College explored how the Services’ future concepts were or were not compatible.  It was designed as a kind of intellectual atom smasher, employing a rather too challenging scenario to see where the concepts failed.  The sub-atomic particle that popped out was that nobody was planning to maintain a SEAD (suppression of enemy air defense) capability that would cover the entry of non-stealth aircraft into defended zones. This was a potentially actionable insight that came out of the game, based on actual elements of future concepts. When games are used this way they are revelatory, not predictive.

Where we run into trouble is when we attempt to infer too much meaning from what game players do or say.  Dr. Stephen Downes-Martin has shown that game player behavior is at least partially a function of their relationships to game umpires, and so the linkage to either present or future reality is broken.  Thus there are limits on the situations where player behavior or verbal / written inputs can be regarded as legitimate output of a game.  There is a difference between having some kind of aha moment via observing player inputs and exchanges, and trying to dig out, statistically, presumed embedded meaning from player responses to questionnaires, interviews or even interactions with umpires or other players.

A first cousin to the attempt to extract too much information from a regular game is the attempt to create some new form of gaming that will be more revelatory or predictive than current practice can achieve.  Most of these are some riff on the Delphi Method, whether a variation of the seminar game or some kind of massively multi-player online game.  I know of none that have justified the claims of their designers and in any case they seem to violate the basic logic Downes-Martin lays out; the problematic connection between game players and the real world. When I was chairman of the Wargaming Department at the Naval War College I challenged my faculty to advance the state of the art of wargaming, but always within the bounds of supportable logic. My mantra was “No BS leaves the building!”

Even if a game is conceived and designed with the above epistemic limitations in mind, there could still be danger that the sponsor will try to burden it with too many objectives.  This was a common problem with the Navy’s Global Wargames in the late 1990s.  Tasked to explore network-centric warfare, the games became overly large and complex, piling on objectives from multiple sponsors, creating a voluminous and chaotic (not to mention expensive) output that was susceptible to interpretation in any way a stakeholder wanted.

The poster child of all this was Millennium Challenge 02, a massive “game” involving over 35,000 “entities” embedded in the supporting computer simulation, many game cells, as well as thousands of instrumented troops, vehicles, ships, and aircraft in the field and at sea.  Not only was the underpinning logic and design flawed – attempting to stack a game on top of field training exercises – but the multiplicity of objectives obfuscated any ability to extract useful information.  As it turned out, the game was sufficiently foggy to spawn suspicion of its intended use in the mind of a key Red player, retired Lieutenant General Paul Van Riper, and his post-game public criticisms destroyed any credibility the game might have had (I observed the game standing behind him as he directed his forces).

Modesty is called for.  While we might approach game design scientifically, and there are certain scientific philosophies upon which game analysis can be founded, gaming itself is not some form of the scientific method, even though rigor and discipline are necessary for success.  An example of a good game was one run at the Naval War College in the spring of 2014 for VADM Hunt, then director of the Navy Staff.  The game was designed around the question “How would fleet operators use the LCS if it had various defined characteristics?”  Actual fleet staff officers were brought in as players and they worked their way through various scenarios.  What made a difference in the game was the effect that arming the LCS with long-range anti-ship missiles had on opposition players.  The insight that VADM Rowden, Commander Surface Force, took away was that distributing offensive power around the fleet complicated an enemy’s planning problem.  One could consider this a blinding flash of the obvious, but in this case it was revelatory in terms of the inherent logic of an operational situation.  Trying to squeeze more detailed insights from the game, such as the combat effectiveness of the LCS, might have fuzzed the game’s focus and prevented the Admiral from gaining the key insight. He translated that insight into the concept of distributed lethality, now codified into the more general doctrine of Distributed Maritime Operations.

In a very real sense, games are blunt instruments, the analogue of the axe in the old saying.  Like the axe, though, they can be very useful.  In this writer’s opinion – informed by many years of gaming – the best games, in terms of their potential for yielding actionable results, are focused on just a couple of objectives.  That said, in my experience, the most valuable insights are sometimes the ones you don’t expect going in.  In fact, some of the most influential games I have seen were essentially fishing expeditions. In 2006 the Naval War College conducted a six-week-long strategy game to support the development of what became the 2007 A Cooperative Strategy for 21st Century Seapower (CS21).  Going in we did not know what we were looking for, but in the end a somewhat unexpected insight emerged (It’s the system, stupid) that ended up underpinning the new strategic document.  “Let’s set up this scenario and see what happens” is an axe-like approach that must not then be measured with a micrometer.


Captain (ret) Robert C. (“Barney”) Rubel served 30 years active duty as a light attack/strike fighter aviator.  Most of his shore duty was connected to professional military education (PME) and particularly the use of wargaming to support it.  As a civilian he worked first as an analyst within the Naval War College Wargaming Department, later becoming its chairman.  In that capacity he transformed the department from a mostly military staff organization to an academic research organization.  From 2006 to 2014 he served as Dean of the Center for Naval Warfare Studies, the research arm of the Naval War College. Over the years he has played in, observed, designed, directed, and analyzed numerous wargames of all types and written a number of articles about wargaming.  For the past four years he has served as an advisor to the Chief of Naval Operations on various issues including fleet design and PME.

 

CNA Talks: Playing a Wargame


CNA’s occasional podcast series discusses how to play a wargame.

In part two of our occasional series on wargaming, CNA’s chief wargame designer Jeremy Sepinsky returns, accompanied by Chris Steinitz, director of CNA’s North Korea program, to discuss what it’s like to play a CNA wargame. Jeremy describes the different players in a wargame, emphasizing the value of people with operational experience who can accurately represent how military leaders would make decisions. Jeremy and Chris lay out the differences between playing Blue team and Red team. They also take us down the “road to war,” describing how the wargaming team lays out the scenario that starts the game. Finally, Chris and Jeremy take us through the players’ decisions and how the results of a turn are adjudicated.

Lin-Greenberg: Drones, escalation, and experimental wargames

 

At War on the Rocks, Erik Lin-Greenberg discusses what a series of experimental wargames reveal about drones and escalation risk. The finding: the loss of unmanned platforms presents less risk of escalation.

I developed an innovative approach to explore these dynamics: the experimental wargame. The method allows observers to compare nearly identical, simultaneous wargames — a set of control games, in which a factor of interest does not appear, and a set of treatment games, in which it does. In my experiment, all participants are exposed to the same aircraft shootdown scenario, but participants in treatment games are told the downed aircraft is a drone while those in control games are told it is manned. This allows policymakers to examine whether drones affect decision-making.

The experimental wargames revealed that the deployment of drones can actually contribute to lower levels of escalation and greater crisis stability than the deployment of manned assets. These findings help explain how drones affect stability by shedding light on escalation dynamics after an initial drone deployment, something that few existing studies on drones have addressed.

My findings build upon existing research on the low barrier to drone deployment by suggesting that, once conflict has begun, states may find drones useful for limiting escalation. Indeed, states can take action using or against drones without risking significant escalation. The results should ease concerns of drone pessimists and offer valuable insights to policymakers about drones’ effects on conflict dynamics. More broadly, experimental wargaming offers a novel approach to generating insights about national security decision-making that can be used to inform military planning and policy development.

You will find a longer and more detailed account of the study here.

This is a good example of using multiple wargames as an experimental method. Above and beyond this, it also shows how wargames can generate questions worthy of further investigation.

More specifically, while the loss of a drone is less escalatory, an actor might be more likely to introduce a drone for this reason—possibly deploying one in a situation where they would not have risked a manned platform. If this is true, however, drones may still prove more escalatory overall. In other words, if the wargame is expanded to include the prior decision to deploy assets in the first place, the actual outcome might have been something like this:

  • Blue scenario 1: Deploy manned platform?
    • No, too risky.
    • No platform deployed.
    • Nothing shot down.
    • Result: No escalation.
  • Blue scenario 2: Deploy drone?
    • Yes, because no pilot at risk.
    • Drone shot down.
    • Result: Minor escalation.

Or consider another situation: perhaps local air defences would have been reluctant to engage a manned aircraft because of the evident risk of escalation, but would happily shoot down a drone. In this case the experimental findings might have been:

  • Red scenario 1: Shoot down aircraft?
    • No, too risky.
    • Nothing shot down.
    • Result: No escalation.
  • Red scenario 2: Shoot down drone?
    • Yes, because no pilot at risk.
    • Drone shot down.
    • Result: Minor escalation.

In fact, if you read the full paper you will see this is exactly what occurred in a scenario involving a shoot-down decision: participants were much more likely to use force against an unmanned drone.

In other words, while the study suggests that drones might reduce the chance of escalation, it also suggests that we need to investigate whether the lower perceived risk of drone-related escalation might cause Blue to undertake more provocative overflights, or might lead Red to undertake more potentially escalatory shoot-downs.

Figure 1 below shows the main experiment: aircraft shoot-downs lead to major escalations, drone shoot-downs to minor escalation.


Figure 1: Experimental results suggest shoot-down of manned aircraft results in greater escalation.

Given the risk of escalation, however, decision-makers might decide against overflight in the first place.

Figure 2 examines a situation where no drones are available. It incorporates the possibility that decision-makers simply refrain from overflight because of the escalation risk, and assigns a (plausible but entirely made-up) probability to this. Moreover, knowing that shooting down a manned aircraft is likely to cause escalation—a tendency noted in Lin-Greenberg’s other experiment—perhaps Red won’t actually open fire. Again, I have assigned a (plausible) probability to this. These numbers are just for the purposes of illustration, but here we note that with manned overflight as the only option there is a 16% chance of escalation.


Figure 2: Considering other decision points. Should Blue even send an aircraft, given risk of escalation? Should Red engage it, given the risks?
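The manned-only arithmetic can be made explicit with a tiny calculation. The branch probabilities below (Blue overflies 80% of the time; Red fires 20% of the time) are my own hypothetical decomposition, chosen only so that the product reproduces the 16% figure used in the text:

```python
# Illustrative manned-only decision tree. Both probabilities are
# hypothetical, chosen to reproduce the 16% escalation figure in the text.
p_overfly = 0.8     # Blue accepts the risk and sends a manned aircraft
p_shootdown = 0.2   # Red accepts the risk and opens fire

# Escalation occurs only when Blue overflies AND Red fires.
p_escalation = p_overfly * p_shootdown
print(f"chance of escalation: {p_escalation:.0%}")  # 16%
```

Any other pair of probabilities multiplying to 0.16 would tell the same story; the point is that the headline risk is a product of two separate decisions.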

In this fuller model, now let us introduce drones (Figure 3). Given that they are less likely to cause escalation, let us assume that (1) Blue is likely to prefer them over a manned ISR platform (as per the earlier findings), (2) Red is more likely to shoot them down, and (3) shooting down a drone causes minor rather than major escalation. Once again, I’ve assigned some plausible probabilities for the purposes of illustration.


Figure 3: Adding drones to the mix.

When we add drones into the mix, the risk of major escalation drops from 16% to 4%, but the risk of some form of escalation actually increases to 60%. Does this mean that drones have limited the risk of escalation, or increased it? Moreover, it is possible that tit-for-tat minor escalation over drone shoot-downs could grow over time into major escalation. If that were the case, drones—rather than limiting conflict—may be a sort of easy-to-use “gateway drug” to more serious problems.
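The drones-available arithmetic can be sketched the same way. The branch probabilities here (how often Blue sends a manned aircraft, a drone, or nothing, and how willing Red is to fire on each) are hypothetical values of my own, chosen only to reproduce the 4% and 60% figures in the text:

```python
# Illustrative drones-available decision tree. All probabilities are
# hypothetical, chosen to reproduce the 4% / 60% figures in the text.
p_manned, p_drone, p_none = 0.2, 0.7, 0.1   # Blue's deployment choice
p_fire_on_manned = 0.2                      # Red hesitant to down a manned aircraft
p_fire_on_drone = 0.8                       # Red far more willing to down a drone

p_major = p_manned * p_fire_on_manned       # manned aircraft shot down
p_minor = p_drone * p_fire_on_drone         # drone shot down
p_any = p_major + p_minor                   # any escalation at all

print(f"major: {p_major:.2f}, minor: {p_minor:.2f}, any: {p_any:.2f}")
# major: 0.04, minor: 0.56, any: 0.60
```

Swapping the cheap-to-lose drone in for the manned platform trades a small chance of major escalation for a much larger chance of minor escalation, which is the crux of the “gateway drug” worry.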

Remember that I’ve essentially invented all of my probabilities to make a methodological point (although I have tried to make them plausible). My point here is not in any way to criticize Lin-Greenberg’s experimental findings—I suspect he is right. It is to say that the two sets of wargame experiments he undertook are useful not only for their immediate findings, but also to the extent that they generate additional questions to be investigated.

 

 
