Managing Risk in the Fight Against IS

25 Oct 2016
As the battle for Mosul rages, the West faces a key turning point in the struggle against the so-called Islamic State (IS). The strategic choices made in the coming months to counter the risk posed by IS are likely to have ramifications that will be felt – good or bad – for years to come. As such, it is an apt time to reflect on the lessons learned from past mistakes, in particular from Operation Iraqi Freedom.
Following the publication of the long-awaited Chilcot report this summer, much public debate has taken place on the lessons to learn from the Iraq war. However, most of this critical discussion has focused on the build-up to the war and the case for war, or on the mistakes made post-invasion that helped prepare the ground for the ensuing insurgency and, ultimately, the events that we are confronted with today. Although all of these are extremely important issues, discussion of perhaps the greatest lesson has been largely absent: the boomerang effect. Viewed through the lens of risk management theory, the boomerang effect makes the consequences of one’s actions harder to anticipate. As Rasmussen argues, its main characteristic is that it makes the risk manager a possible victim of the risk management exercise. The deterred risk generates further risks that, like a boomerang, return to the manager rather than to just any actor involved in the situation. In essence, one becomes the object of one’s own actions, as the consequences cannot be fully managed and can therefore create further and more severe risks. Thus, unlike the much-discussed concept of blowback, the boomerang effect captures how new risks emerge as an unintended consequence for the manager rather than for the situation being managed.
From the comments made by former UK Prime Minister Tony Blair before the war, it is clear that the Iraq invasion was framed as a risk management exercise to prevent the possible future risk that Saddam Hussein posed to regional and global security. This was a relatively new approach to security: proactively tackling issues that were perceived as future threats. Rather than taking action against “clear and present dangers” posing an imminent threat, as had previously been required to legitimise the use of force, this new doctrine envisioned taking action now to prevent possible future threats that might materialise but had not yet manifested themselves as direct threats.
It is important to remember that this evolution in strategic thought took place in the wake of 9/11, with the American administration desperate to protect the US from any future threats, and to deal with them as risks before they materialised. This risk management approach was famously summed up by Donald Rumsfeld when he explained that “[t]here are things we know that we know. There are known unknowns. That is to say there are things that we now know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know”.
What Rumsfeld was articulating is the basic idea behind risk management, going on to argue that “absolute proof cannot be a precondition for action” and that absence of evidence is not evidence of absence. In the post-9/11 world, the argument ran, in which the consequence of inaction was perceived as too terrible to contemplate, it was no longer possible to wait for absolute proof of an impending risk; instead, a proactive approach had to be adopted to protect against possible future threats now. In short, the risk that Saddam Hussein could in future ally with jihadist terrorist groups (even though he was at the time opposed to them) and provide them with WMD was too great to wait for proof: the mere possibility of this happening, it was argued, necessitated pre-emptive action. Thus, the decision to use force was taken to prevent the worst-case scenario regarding Saddam’s regime from ever coming to pass.
As President George W. Bush summed up, “Saddam Hussein is harboring […] the instruments of terror, the instruments of mass death and destruction. And he cannot be trusted. The risk is simply too great that he will use them, or provide them to a terror network […]. We cannot wait for the final proof — the smoking gun — that could come in the form of a mushroom cloud. […] There is no easy or risk-free course of action. Some have argued we should wait — and that’s an option. In my view, it’s the riskiest of all options, because the longer we wait, the stronger and bolder Saddam Hussein will become”.
This smoking-gun metaphor was widely used by US leaders, in conjunction with the spectre of a new attack similar to 9/11, even though both the probability and the magnitude of the possible risk were unclear. It was a rhetorical way of conveying the danger that Hussein was seen to pose to US security without needing actual evidence of these instruments or of WMD. This conditional language further evolved into a call for support for the war, without implying any imminent or specific threat.
The examples of risk language in the case of the Iraq war show a clear transformation to a new form of “what if” decision-making. It is evident from the speeches of the time that the decision to go to war derived not from the perception of an imminent threat posed by Saddam Hussein, but from what was perceived as a future possibility. This focus on a single objective got in the way of paying attention to the boomerang effect and the consequences that removing Iraq’s authoritarian leadership would produce for the United States-led coalition.
The problem with evaluating the success of risk management is that it must be measured in non-events. In Iraq, this means comparing the non-event under inaction with the non-event under action. The “precautionary principle”, which holds that uncertainty is no excuse for inaction in the face of serious risk, is problematic in cases like these, since there is no objective way of measuring qualitative risks. Judged by this criterion, the operation was successful in the sense that Saddam never acquired a full WMD capability. In this view, the policy should be evaluated not against the best-case outcome, but by how well the risk in question was managed.
However, although the Iraq war achieved its risk management objectives, it is widely acknowledged as a failure due to the death and destruction it has caused in Iraq and the spill-over into the region as a whole during the thirteen years of ensuing conflict, not least because, rather than countering the Islamist threat, it served as a major catalyst for amplifying the jihadist cause. These events, as the unintended (though not necessarily unforeseen) consequences of the Iraq war, are its real lessons.
In the case of the Iraq war, the cure was worse than the disease. In their rush to tackle every risk Saddam was perceived to pose (imminent or not) in the wake of 9/11, the American security establishment failed to consider the impact of their actions in removing that risk. As seen in Iraq, a risk management exercise can actually create an even bigger, more imminent threat, one that was not considered a risk in the first place. Hence, through its actions to mitigate the risk it faced, the US government in fact only increased the risks it now faces. It became the object of its own actions – the boomerang effect.
Fifteen years into the war on terror, the boomerang effect is a crucial lesson that the West must learn and factor into its strategic thought in greater depth. The West is continually dealing with the fallout from its previous preventive actions. The invasions of both Iraq and Afghanistan may have achieved the objectives they were conceived for, but in the process they have had unintended consequences that have driven the global jihadist cause and increased the security threat to the West. As the West contemplates its next steps in the ever-escalating Syria conflict, this should be at the forefront of strategists’ minds. It is paramount that the boomerang effect is factored into strategic thinking, to avoid the mistakes of the past and to ensure that actions taken to reduce the risks posed by IS do not make us the object of those actions, facing ever greater risks.