The new Armed Forces Journal has this article
which proposes to fix some of the problems with planning operations involving complex interactions using a more culturally astute Red Teaming process. COLs Benson and Rotkoff lay blame for the problem on the OODA loop:
This notion that there are specific knowable causes that are linked to corresponding effects dominates military thinking and manifests in our drive to gather as much information as possible before acting. This concept was captured by Air Force Col. John Boyd’s decision loop: observe, orient, decide and act. In this OODA Loop, an endless cycle in which each action restarts the observe phase, it is implied that collecting information would allow you to decide independent of acting. Also implied is the notion that you can determine measures of effectiveness against which to observe each action’s movement toward achievement of your goal so you can reorient. The result of this type of thinking is to spend a lot of time narrowing the focus of what we choose to observe in order to better orient and decide. This drives one to try to reduce the noise associated with understanding the problem. We do this by establishing priority information requests or other methods of focused questions aimed at better understanding the core problem so we can control it.
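For reference, the decision cycle described in that passage, in which each action restarts the observe phase, can be sketched in a few lines. This is a toy illustration only: the phase names come from the quote, while the feedback-loop structure and the `choose` rule are assumptions for the sake of the example.

```python
def run_ooda(observations, choose):
    """One pass per observation; each act feeds back into the next
    observe, so the loop is endless in principle (here bounded by
    the input stream)."""
    orientation = []                     # the evolving mental model
    actions = []
    for obs in observations:             # Observe: take in new data
        orientation.append(obs)          # Orient: fold it into the model
        decision = choose(orientation)   # Decide: act on current orientation
        actions.append(decision)         # Act: the action changes the world,
                                         # which restarts the Observe phase
    return actions

# e.g. a pilot always reacting to the strongest cue seen so far:
acts = run_ooda([3, 1, 5], choose=max)
```

Note how deciding never happens apart from the stream of observations and actions, which is the point the blog takes up below.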
OK. Agree with the spirit of this 100%. However, I don’t see where Boyd would agree that “it is implied that collecting information would allow you to decide independent of acting” or that “Also implied is the notion that you can determine measures of effectiveness against which to observe each action’s movement toward achievement of your goal so you can reorient.”
Boyd argued a very empirical basis for the OODA loop: you can’t decide (effectively) in the absence of information. I’m not sure I understand what “decide independent of acting” even means. Given that Boyd derived the OODA loop from the split-second decision-making of a fighter pilot, the notion that it implies “MOEs” seems rather a stretch.
I have brought up before the issue with “map problem” thinking (I’m here and want to get there; what route gets me there most quickly, safely, or efficiently?). That mindset is an artifact not of Boyd and the OODA loop, but of the MDMP. I’m still working on that piece.
It’s far less politically risky to blame a dead guy than an entrenched process… The proposed solution is called “Red Teaming,” but it walks and talks a lot like… a wargame process.
Properly trained, Red Teams provide two critical elements. First, they can look at problems from a decidedly different perspective than the rest of the staff. If everyone else is looking at how to achieve the desired outcome, the Red Team examines how to avoid the worst possible outcome. While the rest of the team looks at the equities of the major players in the operational environment, the Red Team examines the lesser actors, how they would see events and how they might affect things from the periphery. Second, they create space inside the staff for slowing down, questioning common wisdom and giving a charter to an in-house skeptic.
OK, great idea, but what is the process the Red Team uses to “examine how to avoid the worst possible outcome”? Sounds like it may be related to wargaming.
Pre-mortem analysis is among the more useful tools in the Red Team’s box. If the standard military decision-making process is designed to maximize outcome, pre-mortem analysis is designed to minimize risk. As the staff develops a plan to achieve a desired end state, the Red Team envisions the worst-case future. They describe the nightmare scenario in detail and describe the events that led to it, then examine the plan to see how well it heads off the events that would lead to failure. Invariably, this leads the staff to see things it otherwise would not.
Not sure I fully buy the “maximize outcome/minimize risk” dichotomy. I’ve seen a lot of VERY risk-averse staffs and think there is a role for “Red Teaming” in helping such a staff identify the good, not just the bad. If a gaming-related process is one of the tools in the arsenal of this new group of deep thinkers, then why limit them to ‘nightmare scenarios,’ which runs the risk of going down extremely low-probability rabbit holes (WMD, anyone)?
The authors raise several counterarguments and offer some decent responses. One of the most important:
Most importantly, the Red Team has emotional distance from the plan. It is very difficult once invested in developing a plan, based on a mental model of the operational environment, to then abandon this understanding even when faced with overwhelming evidence that it is no longer true. Recall how long the Defense Department prohibited the use of the word “insurgency” to describe actions taking place in Iraq in 2003-2006.
This notion of a group within the staff, separate from the planners, that analyzes plans and reconstructs exercises is the reason the Army and Air Force have Operations Analysis as an MOS. Many high-level (3- and 4-star) staffs have such groups doing much of what the article discusses, though with an emphasis more on Title X aspects than operational ones; the ones I’m familiar with would love to do wargaming and analysis associated with planning. Unfortunately, the tools most of these groups use are “big iron” simulations that are not conducive to adding human decision-making “in stride.”
So this “Red Teaming” concept seems to have some good goals, despite pointing the finger of blame in the wrong place… What tools do they need to achieve these goals? I would propose wargaming, in a variety of guises from traditional “manual” CPX to computer-assisted, interactive testing of the plan against “a thinking adversary.”
Decision-making in the 21st century will take place under conditions of ambiguity and hyperspeed in information: in a word, complexity. Red Teams are not the only element of change required to cope with complexity. Across the force, we need to embrace abductive reasoning instead of inductive reasoning, free ourselves from our belief that quantitative metrics are the only valid way to measure progress, and value the fingerspitzengefuehl of leaders at the edge.
Critical thinkers require education and experience. Commanders require critical thinkers who can challenge assumptions and offer alternative perspectives. Red Teams educated to support decision-making can contribute to more nuanced decisions and serve to inculcate critical thinking across the force.
Designs, better visualization of the battlefield, or other improvements to the commander’s and staffs’ understanding of the operational environment are not substitutes for teams of trained and educated contrarians on the staff.
Gee, echoes of one of my favorite wargaming quotes, from the good LT McCarty Little:
Now the great secret of its power lies in the existence of the enemy, a live, vigorous enemy in the next room waiting feverishly to take advantage of any of our mistakes, ever ready to puncture any visionary schemes, to haul us back down to earth.
Sounds like a Red Team 😉