There’s a fascinating segment in the BBC Radio 4 programme ‘All In The Mind’ (at about 13:20) which deals with our inability to remove discredited information from our analytical process.
It turns out that, because of the way our brains have evolved, it’s difficult for us to disregard information which we initially believe to be true but which is subsequently shown to be incorrect. This isn’t a case of dealing with ambiguity, where something may or may not be true – in experiments, subjects were told a story and then told that specific parts of the narrative were untrue. When asked to deduce further information about the narrative, the subjects still relied on the erroneous details, even though they ‘knew’ they were wrong.
To make matters worse, it appears that the more times erroneous information is repeated, the more deeply embedded it becomes in our minds. So the more we repeat the initial untruth in an effort to correct it, the more likely we are simply to reinforce it.
Does this matter? I think it does, in at least two ways.
- With regard to information operations (going back to Roger’s post on ‘Rumint’), clearly if you can make your rumour initially plausible, it probably doesn’t matter too much if it is subsequently discredited. We kind of knew this already – “Mud sticks” and Hitler’s “Big Lie”.
- In terms of the grading of intelligence reporting. The reliability of information is rarely certain, and we frequently find ourselves revising the reliability grade that we attribute both to sources and to specific pieces of information. We understand that “the value of this information can go down as well as up”, but it transpires we’re not very good at processing the effect of those changes on our analysis (there’s a small sketch of this after the list).
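To make that second point concrete, here’s a minimal Python sketch of what tracking a grade that “can go down as well as up” might look like. It loosely follows the NATO ‘Admiralty’ scheme (source reliability A–F, information credibility 1–6); the class and field names are my own invention for illustration, not any real tooling.

```python
from dataclasses import dataclass, field

# Grades loosely follow the NATO 'Admiralty' scheme:
#   source reliability A-F (A = completely reliable, F = cannot be judged)
#   information credibility 1-6 (1 = confirmed, 6 = cannot be judged)
RELIABILITY = "ABCDEF"
CREDIBILITY = "123456"

@dataclass
class ReportedItem:
    text: str
    grade: str                          # e.g. "B2"
    history: list = field(default_factory=list)

    def regrade(self, new_grade: str, reason: str) -> None:
        """Record a change of grade - the value can go down as well as up."""
        assert new_grade[0] in RELIABILITY and new_grade[1] in CREDIBILITY
        self.history.append((self.grade, new_grade, reason))
        self.grade = new_grade

item = ReportedItem("Convoy reported moving north", "B2")
item.regrade("E5", "source subsequently discredited")
print(item.grade)    # E5
print(item.history)  # [('B2', 'E5', 'source subsequently discredited')]
```

The interesting part isn’t the code, it’s the history: the downgrade is recorded, but nothing here – or in our heads, apparently – automatically re-examines the analysis that was built on the old B2 grade. That’s exactly the gap the research describes.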
I suspect that some people reading this will be shaking their heads in disbelief, knowing that they would never fall into this trap. And they’re probably right – I personally believe that some people have strong internal information-handling processes, underpinned by robust logic, and others don’t. I also believe that almost all people can learn good analytical processes and internalise them – it is, after all, a fairly fundamental life skill to make deductions based on a combination of information and assumption.
So how do we deal with the problem? Something that I find useful when I’m really struggling to make sense of an issue is to write down the key facts, assumptions and inferences which underpin my analysis. Not only does this force you into following a logical process, but it also highlights where gaps in knowledge exist and where potentially faulty assumptions are being relied upon. Even better, once it’s written down, it becomes available for critique by other analysts.
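As an illustration, here’s a minimal Python sketch of that discipline made explicit – all of the names are hypothetical, and this is a toy, not a proposed tool. Each inference records the facts and assumptions it rests on, so that discrediting any one of them immediately flags the conclusions that need revisiting:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    label: str
    kind: str            # "fact", "assumption" or "inference"
    discredited: bool = False
    depends_on: list = field(default_factory=list)

def tainted(item: Item) -> bool:
    """An item is tainted if it, or anything it rests on, is discredited."""
    return item.discredited or any(tainted(dep) for dep in item.depends_on)

f1 = Item("Factory output has doubled", "fact")
a1 = Item("Output figures are honestly reported", "assumption")
i1 = Item("Demand for raw materials will rise", "inference",
          depends_on=[f1, a1])

a1.discredited = True   # the assumption turns out to be faulty
for item in (f1, a1, i1):
    print(f"{item.kind:10} {item.label}: "
          f"{'REVISIT' if tainted(item) else 'ok'}")
```

The design point is simply that once the dependencies are written down, a discredited fact or assumption can’t quietly go on propping up a conclusion – which is precisely the failure mode described above.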
I’m not suggesting that everybody does this all the time, as I don’t think it’s practical. It may be a useful exercise, however, both in training analysts and ‘operationally’, when the intelligence picture is becoming particularly clouded and confused.