Through a glass darkly
A timeline over at Pajamas Media details how the elation over the supposed discovery of 12 survivors in a West Virginia coal mine turned by stages into anger and then retrospection. A few snippets:
Anklebiting Pundits: In what is nothing short of miraculous, the 12 trapped West Virginia miners have been found alive. -- 04 January 2006 00:22:49 EST
Tommyville: I, Like many americans watched with great elation as the news channels reported that despite the odds, 12 miners were found alive from the trapped mine in West Virginia..and now, several hours later it appears those reports are untrue. Only one has survived. Here are some questions to watch in the following days: If the officials present knew the rumor was false only 20 minutes or so later, why did they wait hours to notify the families? -- 04 January 2006 03:36:00 EST
JCL's Super Fun Happy Blog: Well, at first the mass media was saying 12 of the miners in the West Virgina mine explosions were found alive. Ooops. Turns out 12 of the 13 miners are dead and there was an erroneous report. See, this is why live news channels like CNN and the like is a joke. It's not about the stories, the events or the people involved but all about who is the first to break the story for ratings and advertising. It's not about reporting or journalism anymore. The so called media is a joke in general terms. I know some of you will be thinking "you're not a reporter, so what do you know?" -- 04 January 2006 04:01:00 EST
I'll reproduce a comment I made over at Tigerhawk's forum on journalistic ethics, which asked 'how can we make the news better?', because it eerily touches on this very matter.
In some ways these proposals remind me of merge replication, where you are trying to insert new records into a table at the same time that others may have modified it, having regard to each data source's reliability and integrity. The example of flagging an editorial when it makes a 180-degree policy turn is a case in point. What is attempted here is to keep the knowledge base coherent, to ensure that it is not corrupted or infested with unrecognized internal contradictions. If contradictions must be endured, then the idea is: flag them. One of the key problems, of course, is the structure of the knowledge itself. It's freeform text, and as such it's hard to tell, even from the point of view of the newspaper itself, just what it is saying in a compressed way. Guarding such an amorphous store against contradiction is hard.
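The merge-replication analogy can be made concrete with a minimal sketch. All the names here (`Record`, `merge`, the reliability scores) are hypothetical and purely illustrative, not any real replication API: incoming records are merged into a knowledge base, a contradiction is resolved in favor of the more reliable source, and the conflict is flagged rather than silently discarded.

```python
# Hypothetical sketch of reliability-weighted merge replication.
# When an incoming record contradicts an existing one, the version
# from the more reliable source wins, and the conflict is flagged
# instead of being silently lost.
from dataclasses import dataclass

@dataclass
class Record:
    key: str            # what the record is about
    value: str          # the claim being made
    source: str         # who reported it
    reliability: float  # 0.0 (rumor) .. 1.0 (confirmed)

def merge(base: dict, incoming: Record) -> list:
    """Merge one incoming record; return any flagged conflicts."""
    flags = []
    existing = base.get(incoming.key)
    if existing and existing.value != incoming.value:
        # Contradiction: keep the more reliable claim, but flag it.
        flags.append((existing, incoming))
        if incoming.reliability > existing.reliability:
            base[incoming.key] = incoming
    else:
        base[incoming.key] = incoming
    return flags

kb = {}
merge(kb, Record("miners", "12 alive", "early report", 0.3))
conflicts = merge(kb, Record("miners", "1 alive", "official statement", 0.9))
# The later, better-sourced record replaces the rumor, and the
# contradiction survives as a flag rather than being erased.
```

The point of the flag list is the one made above: if contradictions must be endured, at least they are recorded rather than quietly overwritten.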
Maybe there's a better chance of maintaining data integrity, at least as to source, by making reporters go through the sensible checklist that Tigerhawk provides. My only suggestion is that a way be found to ensure collateral confirmation. Many stories are actually single-point sourced at the start of their lives. Example: an Italian hostage kidnapped in Gaza. The hope is that with the passage of hours and the descent of more journalists, the initial story can be confirmed. But when a story doesn't last long enough to be additionally confirmed, it simply flashes past and enters the record, sitting there forever in the archives, an event resting on the slenderest of supports. Sometimes, in taking a backfix, we belatedly realize that an old story probably wasn't true, since it has been contradicted or at least made implausible by subsequent developments. One example that comes to mind is some of the New Orleans disaster stories. There's no way to go back and revise the archival story, and it would be Orwellian to try. But if we were to think of a state of knowledge as consisting of the present estimate of the true narrative of an event, it would not be wrong to discount a past error. Historians do this all the time. They don't revise the source, but they discount some of it, like the story, widely believed then, that the IJN battleship Kongo was sunk in 1942. If you were writing history today, you would not claim this.
There's no reason to suppose that the media consciously spread an incorrect story about the survival of the miners. They made a decision to publish under conditions of imperfect information and they made an honest mistake.
A monograph about decision making under uncertainty describes how there is a state of true knowledge about the world and a belief model that we form about it. "A common distinction between knowledge and belief is that knowledge is simply that which is true in the world, while beliefs are an agent’s understanding about the world. Therefore, an agent can believe something that is false, but it cannot know that something is, for example, true, when in fact that thing is not true." The paper goes on to argue that the best way to build a belief system out of the available information is to go through a process of detection, measurement and interpretation. But in the end you can never know that you know. Worse, in many situations the requirement for action is front-loaded: the most important actions occur in the initial moments, at a time when our information is most imperfect.
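The belief-versus-knowledge distinction can be sketched in a toy Bayesian update. The numbers and the scenario are entirely illustrative assumptions, not drawn from the monograph: the world has a true state, but the agent holds only a probability, revised report by report, and an early misleading report can push belief firmly toward a falsehood.

```python
# Toy sketch of belief formation under uncertainty: the agent's
# belief is a probability updated by Bayes' rule as noisy reports
# arrive. All likelihood values are illustrative assumptions.

def update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Bayes' rule: posterior belief after observing one report."""
    numerator = likelihood_if_true * prior
    return numerator / (numerator + likelihood_if_false * (1 - prior))

belief = 0.5  # agent starts undecided about "the miners are alive"

# A garbled early report, somewhat more likely to be heard if the
# miners are alive than not: belief rises, though the claim is false.
belief = update(belief, likelihood_if_true=0.8, likelihood_if_false=0.4)

# An authoritative correction, very unlikely if the miners were alive:
# belief collapses back toward the truth.
belief = update(belief, likelihood_if_true=0.05, likelihood_if_false=0.9)
```

Note that between the two updates the agent genuinely believes something false, exactly the gap between belief and knowledge that the monograph describes.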
One safeguard is to conduct sanity checks, which helps, but not as much as one would hope. The Principle of Limited Reduction "states that relying on analytical behavior to reduce complexity introduces new sources of uncertainty and this requires experimental countermeasures. Correspondingly, relying on experimental behavior to reduce uncertainty introduces new sources of complexity requiring analytical countermeasures." In other words, sanity checks may themselves introduce error, because in confirming measurements against our belief systems, or our belief systems against measurements, we are still left with the possibility of error.
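A minimal sketch of that failure mode, under entirely hypothetical names: a sanity check that accepts only reports consistent with the current belief keeps the record coherent, but when the belief itself is wrong, the check filters out the true correction.

```python
# Illustrative sketch: a sanity check that enforces consistency with
# current belief becomes its own source of error once the belief is
# mistaken. Names and data are hypothetical.

def sanity_check(report: str, believed: str) -> bool:
    """Accept a report only if it does not contradict current belief."""
    return report == believed

belief = "12 miners alive"  # the early, mistaken consensus
reports = ["12 miners alive", "only 1 survivor", "only 1 survivor"]
accepted = [r for r in reports if sanity_check(r, belief)]
# The check preserves coherence but discards the accurate correction.
```

This is the Principle of Limited Reduction in miniature: the analytical countermeasure (the check) trades one kind of error for another.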
Tough, but there's no escape from it. Imagine a hypothetical situation in which an Air Defense officer receives a report on September 11, 2001 that a hijacked airliner is heading for New York City. The information, in its initial state, is single-sourced, but the aircraft in question will be over Manhattan in fifteen minutes. What should he do?