
When Intelligence Assessments Conflict


By Erik Kleinsmith, American Military University

In a recent Daily Beast article, “Exclusive: 50 Spies Say ISIS Intelligence Was Cooked,” Shane Harris* and Nancy Youssef describe a situation that is all too common in the intelligence community: different assessments conflict or disagree with one another.

While their story deals specifically with a problem occurring in U.S. Central Command (USCENTCOM), the problem of conflicting intelligence assessments happens to every analyst several times throughout a career. The causes may seem unavoidable, but there are remedies to reduce their chances and ways to deal with them when they occur.

Conflicting intelligence estimates show up both internally and externally.

Internal Conflicting Assessments
Conflicting intel assessments can occur within the intelligence apparatus itself. An assessment made at a lower level in an organization can disagree with an assessment made at a higher level. This was the case at USCENTCOM this summer, when analysts claimed their assessments not only disagreed with senior leadership, but were also changed to fit the narrative preferred at higher levels.

[Related Article: The 4 Core Abilities Needed for a Career in Intelligence]

Even different intel organizations working the same problem set can put out assessments that disagree with each other. For example, analysts at the CIA can disagree with analysts at the Defense Intelligence Agency (DIA) or the NSA. Because of the parochial nature of the intelligence community, these types of disagreements happen all the time.

External Conflicting Assessments
Externally, conflicting assessments can come up between an intelligence analyst and a customer who has already made up his or her mind about a given situation. While the primary job of intelligence is to provide information to the customer as a decision maker, this relationship is often hampered by rank and seniority as well as a lack of communication and trust between the two.

Intelligence analysts will routinely find themselves providing information to someone much senior to them, such as a combat commander, police chief, senior investigator, or the president of a company. These decision makers combine intelligence assessments with their own sources of information and experience. If their own opinions conflict with what the more junior intelligence analysts find, these leaders will either disregard that intelligence or push the analysts to argue and articulate their position more effectively.

[Related Article: Intelligence Work Expands Beyond the Core Intelligence Community]

Additionally, decision makers themselves may have been given information or guidance from an even higher authority that forces them toward a particular line of thinking. Unfortunately, this guidance could be politically driven and therefore motivated by a different set of factors than what analysts take into account.

While there are many specific situations in which intel assessments can conflict with each other, the key to minimizing this problem is to understand the common sources of disagreement and to attack these sources directly.

Good Assessments Rely on Trust
Like all forms of effective communication, the relationship between intelligence analysts and their customers is based on an established level of trust. The customer—be they the decision maker or another analyst—must trust that the analyst making a particular assessment is competent, articulate, and honest. They must trust that intel analysts understand the situation and their job. They must also trust that analysts have the ability to communicate verbally and in writing, and have taken steps to eliminate their own biases or agendas from an assessment.

If any of these types of trust are lacking or absent, the integrity of assessments can be compromised. When decision makers rely on their own insights, experience, or other sources of intel over the assessment, it indicates a problem in the analyst–customer relationship.

As intel professionals, we must continually overcome this hurdle of trust with every new customer. I once had a brigade commander who, upon our first meeting, proclaimed that he himself was the best intelligence officer in the brigade. He may as well have declared that he was never going to trust the assessments of his own intel officers. Fortunately, I was the incoming intel officer (or S2) for a battalion commander in his brigade at the time and not on his staff. As every decision maker weighs the intel assessment provided, he or she will also evaluate the analyst for competence, articulation, and honesty.

For conflicting assessments that occur internally, trust is still the basic issue. If your assessment disagrees with another one up the chain, it is commonly perceived as a challenge to the competency of the author of the other assessment. Personality and social factors can contribute to the situation here as well. While it’s important to work through disagreements, simply following the rule that the higher-level assessment is correct will almost guarantee failure.

Tools and Processes to Combat Conflicting Assessments
Structured analytical techniques include red team analysis, devil’s advocacy, team A/team B analysis, and high impact-low probability analysis.

Each of these techniques is used to challenge the current line of thinking or to produce alternative analyses with proper supporting evidence. Each also requires a commitment of resources, time, and personnel that may or may not be available, depending on the organization and situation. After learning the academic version of these techniques (taught in courses INTL401: Critical Analysis and INTL402: Intelligence Analysis), your challenge is to adapt them to your situation and incorporate them into your operations given your level of resources and the time available.
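One structured technique in this family, Analysis of Competing Hypotheses (ACH, mentioned in the comments below), boils down to a simple consistency matrix: rate each piece of evidence as consistent, inconsistent, or neutral against each hypothesis, then rank hypotheses by how much evidence contradicts them. The sketch below is purely illustrative—the hypotheses, evidence items, and function names are invented for this example, not drawn from any official methodology or tool.

```python
# Minimal ACH-style consistency matrix (illustrative placeholders only).
# Ratings: "C" = consistent, "I" = inconsistent, "N" = neutral/not applicable.
evidence = {
    "Increased radio traffic":     {"H1: attack north": "C", "H2: attack south": "I", "H3: feint": "C"},
    "Bridging assets moved south": {"H1: attack north": "I", "H2: attack south": "C", "H3: feint": "C"},
    "Supply dumps unchanged":      {"H1: attack north": "N", "H2: attack south": "I", "H3: feint": "C"},
}

def inconsistency_scores(matrix):
    """ACH ranks hypotheses by counting the evidence inconsistent with each;
    the hypothesis with the fewest inconsistencies is the hardest to refute."""
    scores = {}
    for ratings in matrix.values():
        for hypothesis, rating in ratings.items():
            scores[hypothesis] = scores.get(hypothesis, 0) + (rating == "I")
    return scores

# Rank hypotheses from least to most contradicted by the evidence.
for hypothesis, score in sorted(inconsistency_scores(evidence).items(), key=lambda kv: kv[1]):
    print(f"{hypothesis}: {score} inconsistent item(s)")
```

The point of the matrix is the one ACH makes in practice: analysts should seek to refute hypotheses rather than confirm a favorite, since the surviving hypothesis is the one the evidence fails to contradict.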

Wargaming is another way to reduce conflict in analysis primarily because it involves both the decision maker and the analyst, with the analyst taking the position of the adversary. Want to gain the confidence of your commander? Be a challenge for them in a wargaming session. As one great commander once told me, you’re never allowed to win, but you better make it a challenge for them to beat you.

A final way to reduce conflicting assessments is through the training and education of intel professionals. Even the most seasoned intel analysts can benefit from a brush-up in analytical techniques, critical thinking, and briefing and presentation skills. Keeping yourself and the analysts you work with trained, and always seeking educational opportunities, will allow them to more easily gain and maintain the trust of the decision makers they support.

*Full Disclosure: I’ve worked with Shane Harris extensively on his first book, The Watchers, and consider him a friend of mine.

About the Author: Erik Kleinsmith is the Associate Vice President for Strategic Relationships in Intelligence, National & Homeland Security, and Cyber for American Military University. He is a former Army Intelligence Officer and the former portfolio manager for Intelligence & Security Training at Lockheed Martin. Erik is one of the subjects of a book entitled The Watchers by Shane Harris, which covered his work on a program called Able Danger tracking Al Qaeda prior to 9/11. He currently resides in Virginia with his wife and two children.

Comments


  1. Erik,

Your article calls conflicting intelligence assessments a “problem.” I would counter that not only are they not a problem, but that they are desirable. The bigger problem is groupthink, or in other words, not developing enough possible scenarios.

U.S. Army doctrine says to “identify the full set of courses of action available to the threat/adversary.” The commander must understand “all COAs a threat/adversary commander can use to accomplish his objectives” so that the commander can “plan for all possible contingencies.”

The norm, however, is to develop no more than two COAs. What are the odds that one of the two is what the enemy will ultimately do? In my experience, and as I always told young intelligence officers, you will be wrong more than you are right, but the more possible scenarios you develop, the greater the odds that you will be able to provide the commander with the intelligence he needs to plan and execute a successful operation.

Through the information collection plan, the intelligence officer is able to eliminate COAs that are not being adopted and confirm those that are. But if we fail to consider all of the possible scenarios initially, it is very difficult to employ intelligence collection assets effectively.

    Scott

    1. Scott,

      Your comments read to me as if you are conflating the assessment with analysis.

      To refer back to the doctrine, 2-01.3 lays out the steps of IPB, of which the 4th pertains to COA development. It’s only after the 4 steps have been completed that doctrine has the staff presenting their assessment to the CDR during the MDMP (step 6).

      My use and understanding of the term ‘intelligence assessment’ has always been that it follows along after the analysis and contains the collected results of said analysis.

      Dean

  2. Scott,

    Thanks for your comment. I appreciate your insights as they accurately portray another side of conflicting intel assessments.

I used the term “problem” because conflicting assessments can be both an issue to be worked through and/or a result of dysfunction—it all depends on how well an organization can handle them when they occur. If your organization has a well-tuned command climate and working environment, working through differing assessments can be beneficial. If not, well, we’ve seen too many examples of how it can break the planning process.

Executing a collection plan is another great method to confirm or deny a particular course of action for an enemy. Thanks for including a description of it. As the norm for the tactical environment, running a collection plan has many steps similar to conducting techniques such as Analysis of Competing Hypotheses (ACH). As a battalion and brigade intel officer, I was always required to present three COAs: Most Likely, Most Dangerous, and Possible.

Like structured analytical techniques, running a collection plan requires time and additional resources. The military is well geared for collection planning because it is a doctrinal requirement. For the intel analyst supporting a police investigator, corporate security group, or a similar non-military organization, your comments highlight the importance of thinking tactically no matter where the intel analyst works.

  3. Thanks Mark, I’ve been frustrated by seeing intelligence get the blame for a surprise when, in some cases, it was the decision maker who disregarded the intel supporting them. However, as one commander told me, I was never allowed to say “I told you so,” as that’s not the job of intel. I am an advocate of the Structured Analytic Techniques listed in the article, used to elicit doubt in the prevailing read; I often took these steps and truncated them to best fit the time and resources available in the tactical environment. More often than not, my “Most Dangerous” assessment was used in place of my “Most Likely” read as a result.
