Tuesday, June 24, 2008

How Dan Rather could have kept his job

It's been four years since the flawed "60 Minutes" report on George Bush's National Guard service and Dan Rather's subsequent ignominious separation from CBS. But the incident still holds important lessons for newspeople today. Namely: Rather and his team tripped up not because they weren't doing good journalism, but because they were using the "old world" rules in a "new world" game. And it bit them in the proverbial patootie.

In the old world, investigative reporting went something like this: Get a hint of something juicy, spend months tracking it down, all the while holding your cards close to your chest to prevent competitors from finding out what you're up to and, dang it all, scooping you on your own story. Next, get your ducks in a row, double-check your piece, and then release it to the world in one fell swoop. Make a spectacular splash, and don't forget to take a victory lap so your journalism peers can genuflect in the face of your total awesomeness.

Because let's admit it, part of what motivates journalists to do kick-butt investigative reporting is to reap kudos from their peers. Don't get me wrong. Many reporters absolutely do care about the subjects they investigate. And for some, exposing wrongdoing is, in fact, their only motivation. But for many others, part of what keeps them slogging through the long, lonely, and distinctly unglamorous days of investigative reporting is the vision of the recognition they will receive down the road. And not just reporters. What editor doesn't, after all, take into account "Pulitzer potential" when deciding whether to greenlight a story that will take one (or two) of their warm bodies out of the daily grind for the indeterminate future?

But here's the problem with playing the old game in the new world: Get one thing wrong, and you'll get skewered. Maybe even lose your job.

This didn't use to be a problem. Not because investigative reporters didn't make mistakes. They almost certainly did. But the difference is that reporters enjoyed what some today might call a special advantage: They owned the printing presses. Mechanisms for bringing their errors to light in any meaningful way simply didn't exist. No public commons existed for readers to get together and start bringing collective weight to things that seemed off. Like, "Hey, that thing you said about that boat was wrong. It doesn't use the XYZ motor. It uses an ABC motor." And, "Yeah, you're right. My dad had one of those when I was a kid. I remember him complaining that he wanted the XYZ motor, but it couldn't fit." And, "Indeed, I worked in a boat shop, and our customers used to ask for the XYZ motor, and we had to tell them they couldn't get it."

With the Web, all that changes. With the Web, everyone enjoys the special advantage. Errors no longer escape scrutiny. Just the opposite. Mistakes get analyzed and magnified six ways to Sunday. In the new world, reporters simply can't get away with the things they got away with before.

To be clear, I'm not suggesting that the "60 Minutes" team harbored doubts about the documents that cast a shadow on Bush's National Guard service. They purportedly consulted with independent experts to verify the documents' authenticity. All well and good. That's standard journalistic practice. And in the old world, that was probably good enough. In the new world, however, it's no longer sufficient.

The difference is in the confidence rate. Let's say the experts' assessments give you an 80% confidence rate that what you have is kosher. The experts' credentials are impeccable, and they say they are reasonably sure the documents are legit. But with the amount of information at their disposal, they can't be 100% sure, only 80%. In the old world, 80% might have been as good as you could get. There wasn't an efficient mechanism to reach 100% confidence. So 80% was good enough, and you legitimately went out the door with that.

In the new world, however, you often can get to 100%. How? By putting the material out to the world and letting the crowd respond.

At the 2006 Nieman Conference on Narrative Journalism, new media thought leader Dan Gillmor suggested that "60 Minutes" could have saved itself buckets of headaches if, sometime during its investigation, it had simply posted portions of the documents online and asked for help verifying their authenticity. Given the number of totally random, yet holding-a-piece-of-the-proverbial-elephant regular folks who weighed in after the fact (typewriter enthusiasts, a guy who was a Navy clerk in the early '70s, a former Air Force personnel specialist), it seems reasonable to believe that CBS could have determined the documents were, in fact, suspect--long before its report went on air.

So this is the point: In the old world, an 80% confidence rate was perhaps as high as you could get, and you could get away with going out the door with it. In the new world, however, you often can get to 100%. We didn't have the tools before. We do now. So a journalist today has no excuse for not using the tools at their disposal to achieve that 100% confidence rate.

"But what about our scoops?!" some will protest. "If we post to the world what we're working on, we'll lose our scoops."

Hmm. I see what you're saying. Here are some thoughts:

-- It's possible that Rather and crew could have gotten the help they needed without tipping their hand. Simply post a portion of the documents and see what people say.

-- It's more likely, however, that they would have had to be more forthcoming. Online audiences only get involved when they feel invested in the person/site asking for their help. It's hard to feel curious about, much less invested in, something if a reporter/website only offers a blind shred of it. It's much easier to feel invested if there's a big hullabaloo going on.

So yes, in leveraging the new tools to try to get to a 100% confidence rate about the veracity of their stories, reporters are more likely than not going to have to tip their hands about what they're working on. This is a bad thing only if your goal is to get scoops rather than to bring the truth to light. And given that prioritizing a scoop over getting to the truth can bite you in the butt, as it did "60 Minutes," it's increasingly going to be in every reporter's interest to go for accuracy, scoops be damned.

The good news is that scoops aren't going to mean as much in the future as they did in the past. Newspaper scoops were important in two-newspaper towns. When, as a reader, you're getting your news only once a day, you want to subscribe to the newspaper that's not a day behind. Online, however, all sites get everything more or less immediately. Maybe there's a lag of a few hours between when one site breaks something and others confirm it. But a few hours doesn't mean much to readers. (Most are not spending their waking hours waiting for the next big news item. They do have lives....) Scoops are not going to drive reader preferences. Much more important are going to be things like accuracy, how well the site helps the reader understand the significance of the things it covers, how engaging the writing is, and how easy the site is to use. Scoops? Pshaw.

Photo courtesy of scriptingnews. Creative Commons license
