The term ‘Red Team’ has made the transition from its origins within military war-gaming to the lexicon of both government and business. What is less well understood is what a red team is, what it does and does not do, and how and why red-teaming can be helpful to an organisation. In Red Team: How to Succeed by Thinking Like the Enemy, Micah Zenko, a Senior Fellow at the Council on Foreign Relations, takes this well-known yet poorly understood concept and answers these questions in accessible terms, built around the central theme that ‘you cannot grade your own homework.’
Red Team: How to Succeed by Thinking Like the Enemy [Image Credit: Google Books]
The book is broken into six chapters, each dealing with a facet of Zenko’s research into the three core red-team techniques: simulations, vulnerability probes, and alternative analyses. In the first, Zenko distils from his research six best practices in red-teaming: (1) the boss must buy in – if leadership doesn’t support the red team, the exercise is futile; (2) the team must be outside and objective, but have familiarity with and empathy for the organisation it is supporting; (3) fearless sceptics with finesse – while red-teamers need to think differently, their output must still be presented effectively; (4) “have a big bag of tricks” – the red team cannot become predictable; (5) the organisation must be willing to hear bad news and act upon it; (6) red-team just enough and no more – it is stressful, and overdoing it can become disruptive.
Realistically, the reader could choose to put the book down at this point having consumed the key takeaways; this would be a mistake. Although Zenko’s bottom-line-up-front approach cuts to the chase, these derived best practices lack critical context without the reference points and nuance provided in the later chapters. These are critical for the reader to fully understand how, when and why to use red-teaming effectively.
The second chapter covers the origins and military use of red teams and emphasises the benefits and limitations of red-teaming in addressing issues such as the groupthink created within rigid hierarchies. It begins with an examination of how red-team techniques evolved from Army war-gaming in the Cold War into the Red Team University at Fort Leavenworth, a means to address what General Peter Schoomaker, former US Army Chief of Staff, saw as ‘the regimentation and institutionalisation of mediocrity’ in the US Army. Zenko then looks at efforts to integrate red-teaming effectively into the US military, with an emphasis on command commitment and willingness to accept the red team’s results. These include the 2012 red-team review of the US Capstone Concept for Joint Operations, the limited success of the US Marine Corps Commandant in incorporating red teams into USMC command staffs since 2010, and the intentional crippling of the red team during the United States’ 2002 Millennium Challenge exercise. Of these, the organisational denial that followed the Millennium Challenge red team’s ruthless defeat of the Allied forces on the first day of the exercise is particularly informative.
In the third chapter, Zenko examines the US intelligence community to reveal subtle differences in approach and application while again reinforcing the six best practices. In these organisations, he discovered three barriers to optimised performance: (1) a tendency to overestimate the likelihood of high-consequence events in order to minimise backlash should they occur; (2) socialisation between analysts in a team or section that prevents individuals from developing conclusions distinctly different from those of other team members; and (3) the “tyranny of expertise”, which anchors analysts in deeply held views on their field of expertise. Zenko again uses examples to explain the limits and utility of red teams, beginning with the corrupted 1976 CIA ‘Team B’ assessment of the Soviet Union’s nuclear arsenal and the absence of an independent review prior to the 1998 strike on a Sudanese pharmaceutical factory neither owned by Osama bin Laden nor producing VX gas, before detailing the effective red-team estimates that informed the decision to attack the compound in Abbottabad in which bin Laden was killed.
Other government agencies’ approaches to vulnerability probes and simulations for critical infrastructure are presented next. Aspects of the best practices are again reinforced through further examples, the most disturbing of which relates to the United States’ Federal Aviation Administration (FAA) red team established in the 1990s following the Lockerbie bombing. This example highlights that, despite the resourcing and motivation of the red team itself, its efforts become effectively irrelevant in the absence of leadership commitment to its findings. In this case, the FAA leadership’s choice to consistently ignore the red team’s recommendations regarding security at Logan airport is causally linked by Zenko to the success of the 9/11 attacks on the World Trade Center.
In the fifth chapter Zenko looks at red team use in the private sector. The key distinction between this and the previous military/government usage appears to be motivational. When executives in private firms use red teams, they appear to do so as either a means to improve strategies and gain a competitive edge in the market to achieve individual advancement in the company, or as a form of insurance against a failed strategy in which they can highlight that they applied due diligence by engaging a red team. While this self-interest is not surprising given the more direct accountability in the commercial world compared to the government sector, the potential for the red team to be token in nature is increased.
Zenko closes by describing the limitations of red-teaming and the problems that can result if it is planned and executed poorly. In contrast to his best practices he proposes the five worst: (1) the ad hoc devil’s advocate, where someone is appointed to artificially provide a dissenting view; (2) adopting the red team’s findings as policy, rather than using them to inform policy; (3) freelance red teams that set their own scope or are not endorsed by the organisation; (4) shooting the messenger when the red team’s findings do not support a predetermined course of action; and (5) allowing red teams to make, rather than inform, decisions.
Red Team is inherently interesting, easy to read and contains several concepts that make it worthy of consideration as part of a Professional Military Education and Training regime. It clearly demonstrates through multiple examples how an informed and empowered red team can help an organisation overcome the cognitive and organisational biases that constrain not only its decision-making but also its ability to generate courses of action in the first place. Importantly, it also highlights the need for independent and critical thought regarding an organisation’s operations, culture, processes and the core assumptions on which it constructs strategic planning.
For Air Force, red-teaming will be central to the success or failure of Plan JERICHO – without it we risk the institutionalised mediocrity Schoomaker feared. As the Deputy Chief of the Royal Australian Air Force (RAAF) reflected during his speech to the 2016 RAAF Air Power Conference, Air Force needs to harness and develop its iconoclasts – those individuals who have a natural tendency to innovate, to challenge orthodoxy in our processes and decision-making, and who then cannot progress because of our organisational bias against contrarian thinkers. This description closely reflects the core requirement for red-teamers: ‘people who are outsiders and think differently, but haven’t given up on the institution yet.’
WGCDR Jason Begley is an Air Combat Officer in the Royal Australian Air Force. He is also a Sir Richard Williams Foundation Scholar and PhD candidate at the University of New South Wales. The opinions expressed are his alone and do not reflect those of the Royal Australian Air Force, the Australian Defence Force, or the Australian Government.