EXPERT PERSPECTIVE — When I was just starting out at CIA, there was an analyst in my group who worked in a particularly methodical way. As she read all the various intelligence reports, she would type on a sheet of paper (and it was a typewriter then) the excerpts that she considered meaningful. She would then cut the paper into strips, so that each strip contained just one excerpt, and filed them in notebooks. When it came time to write an article about a particular issue, she would pull out the relevant strips of paper, organize them into paragraphs, write connecting and transition language and an occasional topic sentence, and, voila! She had an analytic product.
I am not making this up. On occasion, I would walk by this analyst’s cubicle just when she had laid the strips of paper in the optimum order, and I would be sorely tempted to blow on her desk to scatter the strips hither and yon. I never did that, but I did – even as a junior analyst – ask my bosses whether they approved of this approach to analysis. I certainly didn’t. Even early in my career, I appreciated that reality was not a cut-and-paste operation. I remember them shrugging their shoulders and remarking that they couldn’t argue with the productivity. Our analyst was the most prolific member of the team, churning out analytic content at twice the rate of any of the others. But her intelligence reports, accurate in the details, were uninspiring in their insight.
This memory came to mind when I read about the Director of National Intelligence’s ongoing review of how the IC assesses the fighting power of foreign militaries, particularly their “will to fight.” The stubborn effectiveness of Ukraine’s military surprised US policymakers, who had been told by the IC that Russian forces would make short work of its defenses. What were those assessments based on? My hunch is they were based on the available reporting, which probably could account for concrete, objective things such as the quantity and quality of military equipment, and even anticipated tactics. But there were clearly some aspects of the situation that traditional intelligence reports could not account for, no matter how meticulously they were assembled.
What is lacking in established intelligence tradecraft and how can we fill in the gaps? This is the question I’ve been examining throughout my 40+ years as an analyst.
The methods of the Intelligence Community—and indeed of most knowledge organizations—skew in favor of rational cognitive practices. But any number of major human and societal issues escape—to one degree or another—rational cognitive examination.
Emotions—how each individual reacts to a particular event—are an obvious example. But beyond individual emotions, we can also speak of national moods. As was recently argued by Stephen Coleman in the International Journal of Politics, Culture, and Society, “citizens’ attunement to political mood comprises an element of political orientation that exceeds cognitive explanation.”
What we are witnessing in Ukraine is the power of a resolute national mood and will to survive. Moods tend to be about everything and nothing, Coleman argues, and can coalesce quickly. At some point, will we witness the development of a new mood among Russians? Will the Intelligence Community see it coming?
The answer: Not if it only pursues rational, cognitive approaches toward making sense of the world.
Making sense of the world requires more than the collation of intelligence reports—regardless of how detailed they are. It’s time for the IC to expand its thinking repertoire by seriously exploring nonlinear and more impressionistic mental practices.
Daniel Kahneman’s landmark 2011 book, Thinking, Fast and Slow, focused new attention on the problems of cognitive biases and the advantages of careful, deliberate thinking. (Kahneman has since admitted that some of the book’s conclusions were based on experiments that are now known to suffer from the replication problems afflicting social science research.) Even if you haven’t read the book, you’re probably familiar with the concept of System 1/System 2 thinking that Kahneman popularized. System 1 thinking could almost be described as non-thinking. It’s automatic and fast and directs much of what we do as humans. System 2 thinking is logical, well-ordered, and slow. It is what we call rational thinking, and we like to tell ourselves that it’s protected from emotional contamination.
The reaction to the book from many organizations, including the Intelligence Community, was to conclude that knowledge workers need to do more System 2 thinking and less System 1 non-thinking. Quick, intuitive reactions to an issue or event (System 1) are riddled with irrational cognitive biases; we’re better off improving our rational, logical thinking practices. This will result in better analysis and support to policymakers…or will it?
This overcorrection toward System 2 thinking—and the labeling of more intuitive, less-structured practices as unhelpful, error-prone, and perhaps even unprofessional—has, in my view, been a mistake. Our intuitive System 1 minds are in many ways more powerful than our System 2 processing.
System 1 can connect dots and identify patterns that will escape even the most careful reading of the usual sources. Reading reports and categorizing their contents—the default tradecraft of the intelligence community—may be fine when we’re tracking widgets, but these methods can’t keep up with the complexity of modern times. We already know that artificial intelligence and deep machine learning hold considerable promise in making sense of wildly separate and yet subtly interconnected events—they are essentially an imitation of our System 1 processes. But each of us also comes with an amazing piece of standard equipment—the human brain—that can detect patterns and relationships without our conscious involvement. Yes, we can use artificial intelligence to process reams of data—but it’s becoming clear to me that we won’t know what to do with that data unless the Intelligence Community prioritizes the improvement of our intuition.
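To make that claim concrete, here is a minimal sketch of the kind of pattern-surfacing a machine can do: it compares a handful of short report snippets (invented here purely for illustration) and flags the pair that share the most weighted vocabulary, even if no analyst would have filed them together. It is an assumption-laden toy, not a description of any actual IC system.

```python
# Illustrative sketch only: score pairwise similarity between short,
# hypothetical report snippets and surface the most related pair.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

reports = [
    "Column of fuel trucks observed idle outside Kharkiv for third day.",
    "Conscripts in Belgorod complain of missing pay and rations.",
    "Rail traffic to the border depot has dropped sharply this week.",
    "Local volunteers in Kharkiv organize night deliveries of supplies.",
]

# Vectorize the snippets (TF-IDF) and compute pairwise cosine similarity.
vectors = TfidfVectorizer(stop_words="english").fit_transform(reports)
similarity = cosine_similarity(vectors)

# Find the pair of snippets with the strongest shared vocabulary.
pairs = [
    (similarity[i, j], i, j)
    for i in range(len(reports))
    for j in range(i + 1, len(reports))
]
score, i, j = max(pairs)
print(f"Most related pair (score {score:.2f}):")
print(" -", reports[i])
print(" -", reports[j])
```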
The idea that intuition is the much lesser partner of analytic thinking is based on misunderstandings of human thinking processes. When a thought or gut feeling enters our mind unbidden, it is likely the product of behind-the-scenes brain work. Many neurologists now think the brain can best be described as a predicting machine that constantly compares its current perceptions against all its memories. The brain can detect emerging patterns or changes that deserve attention, long before the analytic brain comprehends evidence of a new trend.
As a recent study on creative thinking found, the best results should occur when System 1 and System 2 collaborate on decision making and insight-production. After all, behavioral evidence tells us that the number of confounding factors we confront exceeds all of our System 2 sensemaking abilities; there are more than enough mysteries to go around.
Despite the disregard many have for intuition, some knowledge workers explicitly acknowledge the role of intuition in their work. Historian of science Jimena Canales has written that “stories of scientific discovery often turn on moments of imagination, dreams, and the unreal.” Among the scientists who have pointed to dreams as sources for their discoveries are Dmitri Mendeleev, Alfred Russel Wallace, and James Watson. Workers in fields such as marketing and design, for whom creativity is essential, often employ practices—such as meditation—to better access their intuition. Intuition is particularly essential for dealing with more difficult, wicked problems that cannot be solved through linear thinking.
What can the Intelligence Community do to improve its use of intuitive talents?
First, stop discouraging the application of intuition. I remember a time when intelligence agencies insisted that every analytic judgment be backed by a specific intelligence report. Of course, it’s good practice to base our judgments on sound intelligence, but it’s folly to insist upon this as an absolute rule. We are not absolutely certain that the intelligence reports and other information we receive accurately represent 100% of reality; in fact, we are certain this is not the case, and we can’t reliably estimate what percentage of reality we fail to capture.
To drive home this point, I would often ask groups of analysts to tell me—if the entire room represented all that could be known about Al Qaeda—what part of the room represented what the Intelligence Community actually knew? On more than one occasion, an analyst held up a coffee cup.
We can’t very well tell policymakers that a group of analysts has a hunch that X or Y could happen, at least not without preparatory groundwork. But we can encourage analysts to engage in quiet individual or group reflection on a regular basis to allow new or different ideas to penetrate their consciousness. As Asta Raami, a researcher on “intentional intuition,” notes, any behavior that encourages the quieting of thoughts can be helpful in gaining new insights.
Over the years, the IC has experimented with non-traditional analytic methods that had the potential to incorporate intuition. One technique asked analysts covering political instability to use numerical scoring to keep track of how things were progressing…or not. An individual’s score could reflect not just what she knew analytically but what her intuition might be telling her.
The activity became tedious over time and deteriorated into a box-checking exercise. Prediction markets and crowd-sourcing techniques are other methods that can harness the power of intuition. IARPA (the Intelligence Advanced Research Projects Activity) has sponsored prediction markets, but to my knowledge, their results haven’t often been conveyed to policymakers. Even when the IC experiments with non-traditional analytic methods, it has been reluctant (embarrassed?) to use them to support policymakers directly.
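For readers unfamiliar with how crowd-sourced estimates can be combined, the sketch below shows one standard aggregation technique: averaging analysts’ probability estimates in log-odds space, with optional “extremization.” The numbers and the pooling rule are illustrative assumptions on my part, not a description of how IARPA’s markets or the IC’s instability scoring actually worked.

```python
# Illustrative sketch: pool individual probability estimates by averaging
# them in log-odds space, a common way to aggregate crowd forecasts.
import math

def pool_log_odds(probabilities, extremize=1.0):
    """Combine individual probability estimates via mean log-odds.

    extremize > 1.0 pushes the pooled forecast away from 0.5, a common
    adjustment when forecasters rely on overlapping information.
    """
    logits = [math.log(p / (1 - p)) for p in probabilities]
    pooled = extremize * sum(logits) / len(logits)
    return 1 / (1 + math.exp(-pooled))

# Hypothetical estimates for "unrest escalates within 6 months" from five
# analysts -- some leaning on reporting, some on gut feel.
estimates = [0.55, 0.70, 0.40, 0.65, 0.60]
print(f"Pooled estimate: {pool_log_odds(estimates):.2f}")
print(f"With extremization: {pool_log_odds(estimates, extremize=1.5):.2f}")
```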
One approach that hasn’t been tried is to explicitly incorporate our intuitive faculties into analytic tradecraft. The Intelligence Community could emulate the best practices developed by other organizations to deepen and harvest System 1 thinking. Analysts would be coached on the limitations of intelligence reporting—how it is inherently incomplete and particularly bad on complex issues and questions of human will and determination.
They would also learn how to apply our thinking abilities—System 1 and System 2—to the situations for which each is best suited. And in much the same way that brainstorming has been incorporated into analytic work, meditative practices would become a standard analytic technique. Intuition coaches would emerge as a new role in analytic units—individuals adept at helping analysts discern among their moments of intuition and hunches, identifying those worthy of further examination.
The explicit incorporation of intuition into analytic tradecraft is not about intuition replacing analytic reasoning; it is instead about combining both to achieve better results overall. Individuals with deep domain expertise—such as Nobel Prize winners—often have the most compelling intuitive insights. They receive rather than produce these insights because their minds are trained to recognize the value of “out of the blue” ideas.
While intuition can deliver potential leads that traditional analytic tradecraft can pursue or collect against, the IC will need to fight its inclination to standardize intuitive practices. Intuition is a personal experience; some practices that work for one individual will be pointless for others. I often find new ideas entering my mind during a long walk or just after I wake up. Intuition coaches can help analysts identify what works best for them.
Intuition training leads to precognition.
All of us have had the experience of thoughts entering our minds unbidden. But we rarely ask ourselves where those thoughts came from. In recent years, researchers—influenced in part by discoveries in quantum physics about the uncertain nature of time—have explored whether there is, in fact, a way for the human brain to receive signals from the future. Lately, I have wondered whether information about the future can leak into the present, and whether humans can detect it.
For more than two decades prior to 1995, the IC studied the idea that precognition is possible. Specifically, both DIA and CIA pursued programs in remote viewing where individuals were asked to put their minds in a state in which they felt they could describe distant physical locations, facilities, and even people—not only as they existed at that moment, but how they would look at some future point.
When CIA inherited the program in the early 1990s, the agency asked the American Institutes for Research (AIR) to evaluate its efficacy, and AIR asked psychologist Ray Hyman and statistician Jessica Utts to comb through several years of data. Both reviewers assessed that the remote viewing program’s precognition results were statistically significant. Nevertheless, the CIA decided to kill the program because, according to the official report, it was not clear how to incorporate remote viewing results into standard intelligence reporting.
In the almost thirty years since, the study of precognition has advanced independent of the IC’s level of interest. Precognition has emerged as a statistically significant experimental effect, both in studies of skilled practitioners and in the general population. Scientists are now working to identify what factors influence precognitive performance; it appears that meditation experience, belief in the phenomenon itself, and positive feelings may all have an impact.
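For those who want to see what “statistically significant” means in this context, here is a hedged illustration: an exact binomial test of whether a hit rate in a four-choice guessing task exceeds the 25 percent expected by chance. The trial counts below are invented for the example; they are not data from any of the studies or programs mentioned above.

```python
# Illustrative sketch: one-sided exact binomial test of an above-chance
# hit rate in a hypothetical four-choice guessing experiment.
from math import comb

def binomial_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_trials = 1000          # hypothetical number of four-choice trials
chance_rate = 0.25       # expected hit rate under pure guessing
observed_hits = 280      # hypothetical 28% hit rate

p_value = binomial_tail(observed_hits, n_trials, chance_rate)
print(f"Hit rate: {observed_hits / n_trials:.1%} vs. chance {chance_rate:.0%}")
print(f"One-sided p-value: {p_value:.4f}")
```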
The financial and investment industries, always looking for an edge, have also experimented with precognition, including methods for identifying and training skilled “precogs.”
As you can guess by now, I think that precognition is among the forms of intuition that the IC needs to consider. But unlike the efforts of the past century, the work should, as much as possible, be pursued openly and in collaboration with scientific researchers.
A new emphasis on System 1 thinking will inevitably present us with moments of possible precognition. We no doubt will discover skilled precogs among our analysts, who have probably been using these skills all along, perhaps not knowingly. But we will need to use our System 2 analytic minds to distinguish noise from true signals and to develop protocols to explore these signals with rigor.
Many will scoff at these ideas, and indeed, the concept of precognition remains controversial within the scientific community. There is a strong bias in the intelligence and scientific communities that all reality is materially based and that speculation about non-material, non-rational phenomena is delusional and, even worse, a type of con job. But I’ve yet to see any scientific proof that all reality is materially based—only assertions, conjecture, and wishful thinking. And the more I’ve read about quantum physics, the nature of time, consciousness, and the mysteries of the mind, the more I’ve come to appreciate the awesome potential of human cognition.
Our people have always been our greatest resource; the time has come to make use of all our minds have to offer.