It’s not the information, it’s not the overload, it’s not even trying to be rational

Published by Tony Quinlan

An interesting article from the New York Times surfaced yesterday – “In New Military, Data Overload Can Be Deadly”. It paints a picture of a difficult situation in Afghanistan where 23 Afghan civilians died as a result of a mistaken attack by US helicopters. In some ways, it reminds me of the description of the USS Vincennes incident that Gary Klein refers to in Chapter 6 of the excellent Sources of Power – another situation where information was misinterpreted in the heat of conflict.

What intrigues me is the interpretation of what lay at the root of the problem – and what would improve the situation:

“Information overload — an accurate description,” said one senior military officer, who was briefed on the inquiry and spoke on the condition of anonymity because the case might yet result in a court martial. The deaths would have been prevented, he said, “if we had just slowed things down and thought deliberately.”

Later in the article, there’s a recognition that it’s not just overload – it’s being able to distinguish signal (useful, meaningful information) from noise (the rest of the stream). And often – as in this case – the signals are easier to spot in retrospect.

Research shows that the kind of intense multitasking required in such situations can make it hard to tell good information from bad. The military faces a balancing act: how to help soldiers exploit masses of data without succumbing to overload.

The idea of “thinking deliberately” also seems to me to be optimistic – in fast-moving environments we are thrown even more onto pattern-matching than information-processing, as Gary Klein points out. His analysis of the USS Vincennes incident is that it was more about using mental simulation to evaluate and rule out possible explanations – this looks to be a similar situation.

One of the difficulties seems to lie in presenting the raw information itself, rather than a combination of the patterns within it and the anomalies against those patterns. One of the approaches discussed back at the November event was for security and IED detection – capture quantities of data in advance of any event, then re-signify after an event to determine patterns that can later be used to trigger “anticipatory awareness”. Dealing with patterns in the data, rather than the raw data itself, should go some way towards getting past cognitive biases. (They can’t – and indeed often shouldn’t – be dispensed with altogether.)
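To make that loop concrete, here’s a minimal sketch of what capture-then-re-signify might look like. It’s purely illustrative – the observation records, signifier tags, time window and alert threshold are all my own hypothetical inventions, not anything from the systems discussed at the event:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class Observation:
    timestamp: float        # seconds since start of capture
    signifiers: frozenset   # tags attached to this observation

def resignify(archive, incident_time, window=600.0):
    """After an incident, revisit the archived observations that preceded
    it and count which signifier combinations recur. Summing these
    Counters across several incidents would build up the pattern library."""
    precursors = [obs for obs in archive
                  if incident_time - window <= obs.timestamp < incident_time]
    return Counter(obs.signifiers for obs in precursors)

def anticipatory_alert(observation, patterns, threshold=3):
    """Flag a live observation whose signifier combination has recurred
    ahead of past incidents - matching on patterns in the metadata,
    not pushing the raw feed at an operator."""
    return patterns[observation.signifiers] >= threshold
```

The point of the toy is the sequence: archive first, signify after the event, then match live traffic against the accumulated patterns rather than asking an operator to process the raw stream in real time.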

Across the military, the data flow has surged; since the attacks of 9/11, the amount of intelligence gathered by remotely piloted drones and other surveillance technologies has risen 1,600 percent. On the ground, troops increasingly use hand-held devices to communicate, get directions and set bombing coordinates. And the screens in jets can be so packed with data that some pilots call them “drool buckets” because, they say, they can get lost staring into them.

Which seems to call for a change in the way the information is presented to them – à la Klein’s project on information presentation described in the book, or the sort of approaches we’re talking about with SenseMaker™ these days.

As the technology allows soldiers to pull in more information, it strains their brains. [emphasis mine] And military researchers say the stress of combat makes matters worse.

But that assumes you’re working in a world of simply presenting dry information. From what we know about the brain, giving soldiers “fragments” – bits of others’ experiences, past mistakes and more – allows them to put the pieces together and make them relevant to their current experience. Part of the problem seems to be presenting them with information that is just that – dry information.

There is a further element that seems to me to be present here – the assumption that the information holds its own meaning, that it can be analysed to produce the answers by itself. It’s the same reasoning that encouraged the development of software I saw last year that took the outputs from focus groups and presented back a representation of which concepts participants were addressing. It’s too easy to miss the important (weak) signals in a piece of text or a photograph, just as it is to over-interpret meaning in trivialities. Better by far to allow lots of people to interpret what they regard as significant, then allow patterns to emerge from the metadata. If the emergent pattern points to specific information, that is the moment to look at the raw data.
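As a toy illustration of that sequence – interpret first, aggregate the metadata, and only go back to the raw material once a pattern emerges – here’s a hypothetical sketch. The fragments, analyst names and significance threshold are all invented for the example:

```python
from collections import Counter, defaultdict

# Raw fragments stay untouched; only the interpreters' metadata is aggregated.
fragments = {
    "f1": "Photo of a parked truck near the checkpoint",
    "f2": "Report of an unusual market closure",
    "f3": "Garbled radio chatter",
}

# interpreter -> {fragment_id: signifiers they attached}
interpretations = {
    "analyst_a": {"f2": {"disruption", "unusual"}},
    "analyst_b": {"f1": {"vehicle"}, "f2": {"unusual"}},
    "analyst_c": {"f2": {"disruption"}},
}

votes = Counter()                        # how many interpreters marked each fragment
signifier_counts = defaultdict(Counter)  # which signifiers cluster on it
for tags in interpretations.values():
    for frag_id, signifiers in tags.items():
        votes[frag_id] += 1
        signifier_counts[frag_id].update(signifiers)

# Only when a fragment stands out in the aggregate do we return to the raw data.
SIGNIFICANCE = 2  # hypothetical threshold
for frag_id, n in votes.most_common():
    if n >= SIGNIFICANCE:
        print(frag_id, dict(signifier_counts[frag_id]), "->", fragments[frag_id])
```

No single interpretation is treated as authoritative; significance is whatever multiple independent interpreters converge on, which is the point of working from the metadata rather than analysing the raw data directly.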

(It’s worth bearing in mind here that the New York Times account is itself a re-interpretation of events by the journalists writing the article and any sub-editors/editors intervening in the process, so the actual military investigators’ report may read rather differently.)