PAXsims

Conflict simulation, peacebuilding, and development

Tag Archives: ICONS Project

Simulation and gaming miscellany, 2 June 2017


PAXsims is pleased to present a number of items on conflict simulation and serious (and not-so-serious) gaming that may be of interest to our readers. James Sterrett contributed to this latest edition.


At First Person Scholar, Jeremy Antley discusses “Remodeling the Labyrinth: Player-Led Efforts to Update GMT’s War on Terror Wargame.” Specifically, he explores how players proposed and undertook updates of Volko Ruhnke’s wargame Labyrinth: The War on Terror, 2001- (GMT Games, 2010) during and after regional politics in the Middle East were reshaped by the “Arab Spring” of 2011—notably through online discussions at BoardGameGeek and ConsimWorld. He focuses particular attention on the use of event cards as a game mechanism, a representation of history, and a focus of player discussions:

While event cards comprise only a portion of the materials found in Labyrinth, their role as abstracted arbiters of reality sustains and reinforces the simulative model to a degree not matched by other elements.  This, combined with their extra-legal nature, allows designers and players alike to utilize event cards for the purpose of injecting their own augmentative or corrective point of view.  Because wargames emphasize the production of knowledge from play, this means that event cards need to distill their subjects using montage, and ensure that the resulting creations act as epistemic reservoirs in service to the operation of Labyrinth’s model.  Debates over player-created event cards such as ‘Curveball’ and ‘Snowden’ reveal the seriousness behind getting this process right.  That players focused their efforts on crafting new event cards to update perceived deficiencies in Labyrinth’s original model speaks volumes to the expectations held by these players in relation to the wargames they play.  Time spent with a wargame’s simulative model is expected to be productive.  Creating new event cards became, in this regard, not only preferable but also essential if Labyrinth players wanted their games to keep pace with current events.

You’ll find a PAXsims review of the game here. GMT Games subsequently published an expansion/update by Trevor Bender and Volko Ruhnke, Labyrinth: The Awakening, 2010– ? (2016). This is still sitting on my bookshelf, awaiting play—and when I do I’ll certainly post a review to PAXsims.



At the Active Learning in Political Science blog there is some interesting discussion of emotion, engagement, immersion, and empathy in simulations.

Simon Usherwood first raised the issue in the immediate aftermath of the Manchester terrorist attack. This prompted Chad Raymond to note the problem of students acting too rationally and technocratically in a recent South China Sea simulation:

It is very difficult to get students to understand, in a self perspective-altering rather than an I-remember-what-the-book-said way, the emotional and psychological dimensions of political behavior. Simulations, though they are often touted as effective at generating this kind of learning, may not be any better at this than other methods. My own attempts at empirically validating these kinds of outcomes with several different teaching methods have pretty much been failures. How do you get the typical American college student — often an 18- or 19-year old who has never traveled internationally nor has a deep relationship with anyone outside of his or her particular ethnic group or socioeconomic bracket — to temporarily step outside of his or her own feelings and experience what it’s like to be someone else?

Finally, Usherwood returns to the question, and suggests three potential solutions:

First option is to drown the students in detail. Chad’s only given his students a handful of things to think about/work with, so it’s understandable that they focus on these. If you’ve got the time and space, then giving them a whole lot more to handle/juggle makes it much harder for them to act rationally.

Which leads logically to the second option: starving the students. In Chad’s case, that might mean not even giving them what he has done, so that they come to it much more impressionistically and irrationally.

His third option—to “ju-jitsu your way out”—involves building on, modifying, and learning from what others have done:

So if you’re struggling to make your simulation work, why not look around at what others are doing and see if you can get their thing to work for you. If you’re not feeling so sharing-y, then you can also reflect on the other things you do: that’s how I developed the parliament game over its iterations, with its purpose being constructed backwards from what it actually did (which wasn’t what I’d set out to do).

We heartily agree.

On this same subject, this is a great time to recommend—not for the first time—the seminal 2011 Naval War College Review article by Peter Perla and ED McGrady, “Why Wargaming Works.”

We propose the idea that gaming’s transformative power grows out of its particular connections to storytelling; we find in a combination of elements from traditional narrative theory and contemporary neuroscience the germ of our thesis—that gaming, as a story-living experience, engages the human brain, and hence the human being participating in a game, in ways more akin to real-life experience than to reading a novel or watching a video. By creating for its participants a synthetic experience, gaming gives them palpable and powerful insights that help them prepare better for dealing with complex and uncertain situations in the future. We contend that the use of gaming to transform individual participants—in particular, key decision makers—is an important, indeed essential, source of successful organizational and societal adaptation to that uncertain future….



In March the Forage Center for Peacebuilding and Humanitarian Education held its annual “Atlantic Promise” field exercise:

This year’s exercise scenario allowed students to become members of a fictitious humanitarian assistance organization and assist a population in conflict after a Category 4 hurricane. The exercise purposely combines students from different schools to build interpersonal relationships, teamwork, and negotiation skills under stressful situations.

Students from Tulane University were among those who participated, and there’s a short account of their experiences on the university website.

The simulation will next be run in November, retitled “Coastal Hope.”


The latest issue of ICONS News contains, among other items, a link to their latest promotional video.

The video features the sultry narration of PAXsims associate editor Devin Ellis, so it’s obviously not to be missed! The ICONS Project can be found here.



On the subject of newsletters, the Spring 2017 update from the World Peace Game is also available.



Last month the US Naval War College reported on a technology upgrade to its wargaming:

The capstone wargame for international students at U.S. Naval War College (NWC) has undergone a huge improvement this year, replacing oversized game boards and ‘paper ship’ game pieces with a new computer application that uses touch-screen technology and allows multiple player usage.

The 65 students taking the intermediate-level course through the Naval Staff College (NSC) are now using cutting-edge technology in their annual year-end event, which took place May 5.

NSC courses are composed predominantly of international students who were divided into Blue and Gold coalition teams that crowded the wargame floor to compete.

The wargame was introduced as the final event of the academic calendar for the class three years ago. The purpose of the game is to allow students to put the theories of operational planning that they have learned at the college into practical use.

“This game is a culmination of the academics the students have learned during the year with concentration on military planning, communication, cooperation and leadership,” said Jeff Landsman, game director and associate professor in NWC’s Wargaming Department. “We bring all of these concepts into a practical exercise, allowing the students to work in an experiential and knowledge-based setting. Additionally, the new technology lets the wargaming faculty execute a more interactive and efficient game.”

Development of the new computer simulation started after last year’s game. The project really started taking shape in the new year.

“The game was pretty much created from scratch. We customized the [game] grid and a few of the other components,” said Anthony Rocchio, lead program developer for the simulation. “It really came together in the last couple months. The scoring function was started on Tuesday and finished Wednesday, for instance.”

The two coalitions then conducted separate planning sessions and briefed their respective courses of action (COAs) to the Wargaming Department faculty. These COAs were then executed during game play.

“This innovative, current, computer-based simulation more accurately reflects real-world situations that the students could face as they operate in the joint maritime environment,” said Landsman.

The application also allows a simultaneous feed of the results of the game into the two teams’ planning groups so they can plan future moves with better, more updated information.

The new computer-based simulation has many other advantages, according to Landsman.

“The new simulation brings a better understanding of the operational and strategic level complexities, barriers and collaboration when applying national and multinational sea power,” he said. “They get a better look at decision making, leadership, theater complexities, and joint and combined maritime operations. This is a very valuable upgrade for the students.”


Brian Train went to the US Army War College a few weeks ago, and you’ll find his report here.

Fortunately for me, he also showed up at the annual CanGames gaming convention in Ottawa a few days later.


Back in March, Gamasutra featured a discussion of This War of Mine (which James Sterrett reviewed back in 2014 for PAXsims).

The boardgame version of This War of Mine, launched as a Kickstarter project, has just been shipped—and I’m eagerly awaiting mine. I’ll review it as soon as it arrives and I find time to give it a try.


Observant readers will notice this “finding time to play” is becoming a bit of a theme—largely because I have so many game projects on the go. These include the Matrix Game Construction Kit (or MaGCK), together with PAXsims collaborators Tom Fisher and Tom Mouat; the Montreal edition of the July 1 wide-area megagame, Urban Nightmare: State of Chaos (UNSOC); a variety of game-related presentations in a repeat appearance at the UK Defence Science and Technology Laboratory (Dstl) next month; codesigning a South China Sea game with Jim Wallman for Connections UK in September (unlike UNSOC, this one should be zombie-free); and another crisis game for a government client in the fall (again, with Tom Fisher).

So many games, so little time…

ICONS Project seeks researcher/simulation developer


Dear Readers,

I’ve not been posting much lately, in part because I am under an extremely heavy workload. The good news is that the ICONS Project is adding staff, so I should soon have a lot of time back! I encourage anyone in the PAXsims community who is interested in joining the Project to consider applying here and/or passing this notice on to your communities of interest.

The ICONS Project seeks a Researcher and Simulation Developer to support the ICONS Project’s growing portfolio of simulation-based research, education, and professional training programs. A substantial portion of this position’s time will be devoted to supporting projects looking at U.S. strategic planning and decision-making in the field of international relations and security.

The open position will join the Project’s simulation development team, reporting to the director of the Policy & Research program. Duties will include simulation design and writing, simulation maintenance, project management, research and development of instructional materials and tools, and technical support and customer service.

The position will be directly supervised by the ICONS Project Associate Director. The candidate will report to the ICONS Project’s lead simulation developer on overall creative matters, and to the appropriate principal investigator or program head on specific projects. The ICONS Project has a long history of growth and innovation, and we welcome applicants who are looking for an opportunity to shape and expand a position over time.

Best consideration date is October 27. The sooner we fill this role, the sooner I can turn to another “On Methods” posting…

Devin Ellis

The Dance of the Simulation Designer (updated)

Earlier this month I offered some rather critical reflections on a recent Syria crisis simulation held at Brookings, highlighting some of the potential problems of wargaming as a tool of policy analysis, as well as addressing some apparent pitfalls in the Brookings simulation design.

In a subsequent blog post, Natasha Gill then added some thoughts of her own on the issue of designing simulations for think-tank clients.

That in turn led Devin Ellis of the ICONS Project to agree with some of what Natasha had said, but also to disagree with some of the rest. Because he raises some very important points about the inevitable compromises between design ambitions and practical realities, as well as the potential for still designing useful exercises within these constraints, I thought they were worth lifting from the comments section and featuring below as a full blog post. Note to Devin: This doesn’t count as the blog post you promised us—we still hope to collect on that offer!

Update: Natasha offered a response, which I have added below. Any further discussion I’ll leave to the Comments section.

* * *

As someone whose bread is buttered by think-tank simulations, I am following this evolving series of posts with interest. Some of the observations from Rex and Natasha Gill are spot on, and there are things we all wish were better about these high-level exercises. But there are also very real constraints on these, and I find some of the statements above to be pretty sweeping and harsh without a lot of ‘evidence’ to support them. I know there are constraints on what you can put in a blog format, and I am really looking forward to the book when it comes out – but I think there’s some room for debate here.

Rex has brought up Stephen Downes-Martin’s comments and pointed out that the problem may sometimes be that a wargame or simulation might not really be the best tool in the analytic box. It is also true – however – that if a simulation IS a good approach, that is no guarantee the designer and the client won’t face all the same dilemmas incurred by high level participants and a politicized, media-seeking environment. What bothers me a little about Gill’s comments is the sense of expectation that 1) exercises not meeting all her criteria are inherently less valuable than those that do; 2) the major problem with think tank sims is failure to meet the articulated standards of participation; 3) the designer’s wishes will prevail over the reality of the atmosphere.

Stephen gave an excellent talk on living up to our professional integrity this year at Connections, but the truth is it’s also not a one-way street. If we walk away from every simulation or game request where the client can’t or won’t meet every aspect of our ideal design and running, we’re limiting our usefulness to the policy world. Truly. If we’ve decided that a simulation or game is a useful investigative approach to the client’s problem, my responsibility as a designer is to help the client make sure the scope and methodology of the simulation are appropriate both to the questions under investigation AND the resources available.

There are many points I would respond to above (favorably as well to be fair), and I’ll write a full post if need be, but I will here just say a word about the issues with high level participants to offer an example of my reservations. I work on high level think tank sims every year, and the truth is, you will never get participants for the lengths of time envisioned in Gill’s work. Her comments on process are wonderful, and I would give my left arm to have participants at the top level in an isolation environment for a week to run a program – the truth is it rarely ever happens.

On the issue of role sheets, Gill writes:

Paradoxically, the tendency to obtain false or weak outcomes from a simulation is more likely with participants who know the issues well than with novices. The former can, consciously or not, leap over the instructions provided in their role sheets, bringing their own interpretations to the table rather than learning from the simulation process. As a result, the simulation will confirm the assumptions of the participants rather than provide them with new insights.

I think the first sentence is a reach. Sometimes that might be the case, but it’s bold to make that statement categorically – though I am willing to be persuaded by evidence. As for the rest, I raise the following contentions:

  1. Yes, think tanks recruit top-level participants in part to give the event or publication more profile – but that’s not the only reason. From a methodological standpoint the value of having those folks is that they ARE indeed experts. If the purpose of your exercise is to explore possible policy reactions to a crisis (I’m going to take it as a given that no one reading this blog believes the purpose of a well designed and run wargame is to predict the ‘real future’ in a complex policy environment) then the choice of participants is a factor in your design. Real top level experts might not be any better than undergrads at coming up with thoughtful, innovative approaches to a problem (they may be worse at it, as Gill implies) but they are undoubtedly better at depicting the probable behavior of their actual peers in a similar situation. A well run policy exercise acknowledges that. You expect the biases in your game design and you account for them in your debriefs and your analysis of the outcomes. Indeed this can sometimes be enlightening to the folks in the ‘thinking’ world about where they fail to understand which issues are viewed as most relevant to the folks in the ‘action’ world.
  2. Gill’s point about self-confirming, and therefore self-fulfilling, observations from participants who ‘skip over’ their detailed role sheets actually cuts both ways. ‘Garbage in, garbage out’ is not just the garbage the participants bring, but the garbage the scenario writer brings. I am very leery of what seems like an assumption that we, as designers, are always going to have a better take on realistic policy approaches to our hypothetical scenario than the person who has been at the table in real life. By telling my top level participants to obey the objectives or political attitude I have articulated in the role sheet without introducing their own perspectives and experiences, I am turning the simulation into MY self-fulfilling prophecy rather than theirs. I am also – indeed – making the added value of a top level person very limited. I’d do just as well with any reasonably well informed gamer.

In sum, I’ll say it’s a dance: there are sometimes big problems with high level participants – but there are also excellent insights to be gained from them. Gill’s points are well taken, but it is our job to see those issues and work to address them in a way that does not scuttle the whole prospect of doing focused games with those types of people.

Devin Ellis

* * *

Natasha sent in this response to the points that Devin raised:

* * *

Thanks to Devin Ellis for his thoughtful comments on my piece. I appreciate his feedback, and would like to clarify a few issues and pose a question to him.

The Time Factor

My first point is just a clarification: I’m not sure where Ellis got the idea that I would expect high level participants to spend an entire week doing a simulation. It’s true, my own specialty is creating and running extended and in-depth modules (if I’m teaching graduate students the simulation can last the full semester!). But when I work with diplomats or professionals in conflict, the modules are limited to two days.

I realize that’s more than most professionals can spare, and it’s always a struggle to get them to commit the time. But when they do, they usually offer two comments specifically on the importance of the time element: 1) it made all the difference in terms of grasping the ‘logic’ of the role and understanding the multiplicity of variables that each player had to manage; 2) it was key in helping them learn the most vital lesson, which was less about content and more about the experience of living out a worldview, an experience that helped them gain new insights into the interests, incentives, resistances and obstacles faced by various actors.

Who Fulfils Which Prophecy

I agree that facilitators/game developers can project their own prejudices onto a scenario with as much vigor as a participant. But I was certainly not suggesting that the alternative to participants running the show is the facilitator creating a simulation out of his/her own head. I think the best model is when simulations are developed with a great deal of input from outside specialists, on each aspect of the scenario and roles. The answer to the ‘projection’ question is that there must be more rigor in how modules are constructed, rather than faith in the abilities or knowledge-base of the participants.

I am not implying that high level participants don’t have strong abilities or in-depth knowledge: I’m suggesting that a good simulation aims to challenge these in ways that benefit the participants and improve the quality of their policy recommendations.

My Way or the Highway

I can see why it sounded as though I believe each simulation should fit the model I’m outlining (you’re lucky you only heard about one part of that model! Don’t order my book when it comes out…). To clarify, I know each simulation cannot and need not fit one model. But simulations are proliferating like rabbits – in universities, think tanks, peace-making and peace-building training programs. And yet many are developed in an ad hoc manner, and the facilitators who run them are not always specialists in education/teaching or in simulation development. Further, because almost any simulation generates a great deal of enthusiasm, we as facilitators and professors running them don’t always do enough to evaluate the weaknesses of the module.

Consequently, I think it’s worth outlining a best practice model of simulation, which is what I’m trying to do in my book. I accept that many great and rigorous modules will follow a different approach and have different goals and methods. I still think it might be useful to assemble and describe the elements that lead to very strong learning outcomes.

Exploring but Not Predicting

My question to Ellis is this: he writes that the purpose of the exercise is to “explore possible policy reactions to a crisis” but in the same sentence admits that “no one…believes the purpose of a well designed and run wargame is to predict the ‘real future’ in a complex policy environment.”

I think this means that although facilitators and participants are well aware that the details of the future can’t be known, the responses of various players to a crisis might be generally predictable in a simulation, in such a way as to be informative or useful for policy makers.

But upon reflection, if this is what Ellis meant, I’m not sure it makes sense to me. If a simulation can’t predict the future, then how much is it really telling you about possible ‘policy reactions’? And how much of what it does tell you about these is actually useful (rather than merely interesting) to real policy makers? Policies are guided by the choices of human beings; and the motives guiding those choices are likely to be revealed in a simulation if it delves deeply into the realities lived by those human beings – the pressures they face inside themselves, within their own camp and in confrontation with their adversaries. It’s my view that in a crisis simulation it is often the case that many of these elements are caricatured rather than deepened.

Non-specialists versus The Pros

Finally, I’d like to make a point that I realize will sound like my least credible statement. I’m making it nonetheless because I feel I can take cover under the umbrella of those who originally raised it…

I always run a simulation with ‘coaches’ – ‘real’ negotiators, military/security officials, diplomats or analysts who, in addition to helping create the materials, are onsite throughout the simulation to help participants work through the issues. After watching the simulation evolve, the coaches almost always make the same comment: they say they are astonished and disturbed at the non-difference between novices and, well, themselves and other ‘real’ actors. In contrast to Ellis’s point that top level experts are “undoubtedly better than non high-levels at depicting the behavior of their actual peers” (italics added) – the coaches I cite above are unsettled precisely by the opposite: the fact that, given very realistic roles, detailed materials, and elaborate strategies, the non-experts at the table reproduce reality in ways that are striking – in terms of how they discuss and analyze complex issues, how they express the beliefs of various players, and how the dynamics between parties evolve.

I am not suggesting that these participants are able to offer policy recommendations on the same level as specialists. There are of course many differences in the knowledge and wisdom of high level and non-high level participants, and simulations have to work with and around those. But in my experience, the difference in what participants learn from a simulation is not the result of what they bring to the table, but what the table compels them to take from it.

Natasha Gill
