There are times when the right person, who has been in the right places, puts that experience together and says something really important that “speaks truth to power,” as they say. Dr. John Hanley’s piece in the new Naval War College Review is a great example. Much angst has been generated in professional wargaming and operations analysis circles about what has been characterized by some as a “loss of confidence” in the current analytic support to senior leaders in the Services and DoD. Many in the operations analysis community take issue with this characterization, and it may well be a case of a single tool no longer being sufficient rather than a particular tool becoming dulled. In any case, Dr. Hanley, who has spent decades as an expert in the uses of campaign analysis and wargaming, provides an objective assessment of the good, bad, and ugly of these two toolboxes in the context of scientific inquiry: are they OBJECTIVE, RIGOROUS, and USEFUL?
After a brief summary of “How we got here” for both wargaming and campaign analysis (using computer simulation), he takes a penetrating look at how each toolset compares on those three criteria. Given that the title is a bit of a spoiler about his conclusions, it is not surprising that he challenges the traditional notion that computer simulations score highly on those three metrics and wargames do not. He provides well-documented evidence for criticisms of computer simulations, campaign analysis in particular, that many in the wargaming community have felt were stones cast at wargaming by those in a ‘glass house’ on the analysis side. He doesn’t let wargaming off easily, however, suggesting there is much room for improvement through study of complex and chaotic systems, emerging AI techniques, and simpler, more understandable game mechanics and underlying combat models that take better account of human nature.
His prescription is one whose elements have been discussed in several places recently, but which he presents together elegantly and with copious references: campaign analysis is no more “broken” than wargaming or any other tool of operations research; the problem is that we have forgotten the limitations in the “fine print” for each and are not using them TOGETHER in a systematic, scientific manner. He illustrates several cases of past failures resulting from trying to produce “universal answer machines” that gave “the right answer.” He calls it a fool’s errand to look for any tool that provides incontrovertibly what the decision-maker should do: which course of action should be followed, or what weapon to buy and how many. It is even more foolish to try to predict perturbations in future battles arising from “swapping out” one weapon for another while leaving all else constant.
His advice is to move from an analytic agenda aimed at drilling deeply into individual platforms and weapons systems to a more integrated mission-focused one.
If DoD is to overcome its accelerating mismatch between limited budgets and growing challenges, it requires a new analysis paradigm and a culture focused more on national security than on protecting parochial service and program priorities by withholding knowledge and data.
He recommends the following (my cutting and pasting from his much more detailed explanations):
- Asking Essential Questions and Selecting Appropriate Methods. DoD should realize that the principal value of good analysis is in eliminating infeasible or unsuitable courses of action, and that no analyses can provide point solutions to complicated problems. The most appropriate action from pseudoexperimentation, whether war gaming or combat/computer simulation, is exploring the validity of the findings using other techniques. Analysis campaigns involve using a variety of techniques to address important issues. Cycles of research emphasize the interaction among these techniques as progress in one investigation informs others and is in turn informed by them.
- War Gaming and Combat/Campaign Simulation. War gaming and combat/campaign simulation are complementary to each other. Both provide insight to participants on factors governing the contingency under study and issues and data needing further study.
- Fleet/Field Operations Analysis. Games and combat simulation should tie directly to field/fleet exercises experimenting with new concepts, using prototype systems designed to address capability enhancements, and carefully collecting data to inform important areas of ignorance and assumptions used in plans, games, and campaign simulation.
- Intelligence Collection. War games also should be tied to intelligence collection and analysis.
- Campaign Analysis. Rather than using war games or large campaign models that require significant amounts of time to set up, rapid, focused analyses on the eve of war have demonstrated value in anticipating important outcomes.
- Simple versus Large Combat Models. Good analysis derives from understanding those few essential features of the subject under study that govern an outcome. Although using models to understand essential features is valuable, attempting to predict outcomes by adding ever more detail without considering the implications for additional uncertainty is antithetical to analysis. Campaign analyses and manual war games employing simple, focused combat models and rules that are understood and subject to question by all participants can expose the factors that govern success—i.e., those on which commanders and capability developers should focus.
- Complexity Sciences. Advances in complexity sciences raise questions regarding current combat models and present new opportunities for defense analyses.
- History, Cognitive and Social Sciences, and Artificial Intelligence. The cycle of research for war gaming and combat/campaign simulation also extends to studying history and developments in social science, including experimental gaming on human behavior (such as in behavioral economics) and cognitive science studying developments in understanding the brain, etc., to explore human reasoning and dynamics.
- Reviewing Previous Results. A final area of emphasis in a cycle of research is reviewing previous results.
At 40 pages with over 100 footnotes, it starts this new year off with much to chew on!
GO READ IT NOW!!
I look forward to reading the published version. It sounds as if John successfully addressed some of the concerns I had about the earlier draft regarding the marshaling of evidence. If only the DoD machinery were not so rusty and resistant to improvement.
I particularly like the bullet:
The cycle of research for war gaming and combat/campaign simulation also extends to studying history and developments in social science, including experimental gaming on human behavior (such as in behavioral economics) and cognitive science studying developments in understanding the brain, etc., to explore human reasoning and dynamics.
A new link for the published article is