By Eric Frederick, NC Local newsletter editor
Deborah Dwyer is a former reporter and communications professional in her home state of Tennessee who’s now a Ph.D. candidate at UNC-Chapel Hill. Her newlywed husband is in Durham, but she’s temporarily living in Columbia, MO, as a 2020-2021 Reynolds Journalism Institute fellow, studying the “ethics and practicalities of unpublishing” — the focus of her research at UNC.
Her mission and passion are to create tools and collect best practices to make unpublishing — removing old factual content, by request, to restore the subject’s reputation — more manageable and fair.
In the digital age, crime reporting means that some people who make minor mistakes, or have their charges dropped or reduced, or who redeem themselves, or who might be found innocent, can still be forever “guilty by Google,” as Dwyer puts it.
But there are many arguments against unpublishing: Factual reporting is an accurate reflection of history. Information that was true when it was reported should not be removed or altered. Doing so can erode trust with the audience, and arbitrarily alter our only record of past events. And there are alternatives, such as addendums that update and/or clarify; writing and linking to a follow-up story that updates the reporting; or removing the story from a search engine’s cache but preserving it.
I talked with Dwyer to find out more about the challenges and possible solutions. Her key points are distilled in my NC Local newsletter of Feb. 10, 2021; a fuller account of our conversation, lightly edited for clarity and length, is below.
It seems almost no one actually has a process for considering a request to unpublish factual content — or any consistent standards.
Three of the questions that Dwyer says are seldom answered:
- “How does the public even know to submit these?”
- “What are the protocols for decisions to be made?”
- “How are these things communicated to the audience?”
In addition, there’s no record to build on.
“Not a soul, hardly, is tracking these things,” Dwyer said. “Out of 109 editors that I surveyed back in 2017, four of them had any kind of official tracking system.
“That is a major problem for two reasons. One is, how in the world are we ever going to identify best practices, and some basic standards, if we don’t even know what we’re dealing with? Two — major ethical issue, right? Because we don’t even have anything that holds us accountable for the previous decisions we’ve made. We have nothing to go back to determine: Are decisions being made equitably?”
The first big industry report on unpublishing, she said, was by Kathy English, then the public editor of the Toronto Star, who was commissioned by APME in 2009 to do a survey — and reported back with recommended guidelines, questions to consider, even a suggested script for replying to requests. Dwyer said she used English’s work as a base when she began her research a few years ago. She found that there still were no consistent standards — that nearly all decisions to unpublish were made “off the cuff.”
Newsrooms don’t have time to “think about the long-term implications of these things, or the nuances,” she said. “They were dealing with the fire at the moment — the one person screaming at them on the phone. And so I just felt like somebody had to go do this work.”
The most promising solution: an ounce of prevention.
Coverage decisions are the best prophylactic, Dwyer said.
“I think part of this … points to a historical shift as to what we call news,” she said. “What I’ve learned over the four years of studying this is, as with everything, unpublishing is a symptom of a larger problem. And part of that problem is what we choose to cover… We’ve got a direct feed from law enforcement, so we’re just feeding that stuff online, and, you know, it’s easy content. … There are (cases) that we set ourselves up for when we publish the trivial.”
And it’s not just crime. Dwyer recalled discussing with an education reporter a story she wrote long ago about a third-grader living in poverty, who requested years later, as an adult, that the story be removed.
“I innocently asked her: Why did you use his name in the first place? … She stopped in her tracks, and she was like: Because we always do. … That’s a perfect example of where we’ve sometimes pulled these traditional news practices into the digital age without really questioning them a whole lot, and now they have much bigger consequences.”
If a story is about an issue (such as poverty), and not specifically about the people involved, does it degrade the reporting to not identify the people? The key question: Could you justify your decision later?
“You hope the goal with unpublishing is, you can justify the coverage when somebody calls, and therefore you never have to unpublish, right?”
“Maybe you update the article, with an editor’s note. Maybe, in the case of an expungement, it’s removed. But that would be a very last resort — and a lot of times you would not need to, because we pay more careful consideration on the front end.”
Along with her research, Dwyer is working with the Missourian, a community newsroom managed by professional editors and staffed by University of Missouri students, and television station KOMU on prepublication policies.
“They already had a pretty decent one that said we don’t use mug shots unless there is a danger to the public and we’re putting it out because there’s an APB on this guy, you know — the police have given it to us, and we’re searching for him, he’s armed and dangerous, whatever.
“And so I asked them: What happens with that content, then, when that person is no longer a danger, or if they find out that he’s not even the guy they need? And they had never thought about, oh, well part of making that promise to the public that that’s our policy also means that, yeah, it probably ought to be removed, right? And so a lot of times it’s just thinking through the natural conclusion of some of this stuff.
“But again, I 100% empathize with newsrooms because they don’t have time to do it. And so they’re all doing the best they can. But what is going to happen if we have these very different approaches? Some people will say, we’re not going to cover minor crime, period, anymore. Well, some people would argue there’s a watchdog function on law enforcement and the criminal justice system that you might lose by doing that. Some people say we won’t cover damage to property, only human-on-human crime. Other people say we’re not going to cover anything now that we’re not going to commit to seeing it through to its conclusion in the court.”
I asked what she would say to a reporter or editor who insisted that not identifying all of the players in a story raises issues of transparency and accountability.
“What I would say is, we already have journalistic principles that we can stand on to justify that,” she said. “We have our ethical guidelines that talk about balancing the risk of harm to the public’s right to know. …
“We often will justify (anonymous sourcing) by saying something like, due to the sensitive nature of the content, we’re not sharing his name, but that’s a perfect example of where you could remind your audience that we have a process in place where, you know, three editors have to know who the person is.”
Another issue is whether sources actually realize that their words and accounts will “live forever,” and whether reporters are trained to make people aware of that.
“Are we teaching people how to obtain informed consent, in the digital age? That’s another prepublication practice that I’m looking at. How do you ensure, without scaring somebody off to not talk to you, that people … are fully informed as to what’s going to happen with that?” — Dwyer
“The truth of the matter is, we can’t unpublish this stuff totally,” partly because of all the different ways it’s published, including multiple apps. But “in an age where we cannot afford to lose one iota of credibility with the public, words matter — and we have to make sure that we’re not overpromising.”
I asked her to talk about how unpublishing is an equity issue.
“The equity issues start before unpublishing even begins,” she said, “because there’s been empirical evidence to show that we have inequalities in our criminal justice system, and therefore it’s going to be reflected in media coverage. A disproportionate amount of people of color are arrested … so that’s the systemic part.
“Let’s look at the unpublishing end of it … Who has the knowledge and the resources to even request that something be unpublished? I always go to the trope of, the white attorney who knows the publisher is likely not going to have a lot of difficulty getting his DUI taken down, right? Whereas a black kid from the other side of town … and maybe he’s on the wrong side of the digital divide. Maybe he … doesn’t even know that that’s up there, you know?
“How do we ensure, especially for the same crimes, that we’re being equitable? For newsrooms who choose to unpublish, I really think they’re going to have to start thinking about how to make it more equitable than just offering the opportunity to people who raise their hand. And that means searching for technological solutions.”
She mentioned that Google News Initiative is funding the unpublishing project at Cleveland.com, in a quest for tools that will help. She also mentioned the possibility of using tags — for instance, tagging stories about marijuana arrests so their coverage could be addressed in bulk if the jurisdiction legalized marijuana. Or figuring out how to get a “technological line” to expungements of convictions that would automatically delete the original coverage, and even to get newsrooms to support efforts to educate people about their right to seek expungement — “there are a ton of people who don’t know how,” she said.
So what can we all agree on?
There’s not a single model that will work with the laws in every jurisdiction, of course, but there are what Dwyer calls “fundamental values.” She’s working to create an annotated PDF of the Australian Broadcasting Corporation guidelines — “one of the most thoughtful policies that I’ve ever seen” — to help newsrooms with the “questions you ought to consider.”
She’ll share policies developed by newsrooms she is working with, and others that she’ll solicit from newsrooms and advocacy groups, plus ongoing research. She’s thinking of working with state press associations “to start trying to build some basic standards.”
I asked: If an editor wants to start working on this issue, what do you tell them?
“Go read Kathy English’s report from 2009, because there are really good questions to consider in there. … My first question is, do you archive every bit of your online news coverage?” (RJI is finding that no one is doing that, she said.) “So my question is, how are you going to argue that you’re ‘defending the first draft of history’ by refusing to unpublish?”
The second point is to consider transparency — letting the public know your policy. Dwyer says she has found this to be a sticking point when talking with newsroom leaders. “One page is all I was asking for,” she said, “with about five bullet points on it, that newsrooms would keep up to date on their website to tell people … we unpublish, and here’s the types of things we unpublish. … I haven’t had one taker. The big argument, of course, is, oh, the minute we start publicizing it … we’re gonna get more requests.”
And there are other challenges with no clear answer.
“I’ve had editors who have told me that their digital ad directors have knocked on their door and said, hey, we’ve got somebody who’s getting ready to sign a $10,000-a-month contract with us, but they said, hey, y’all wrote two bad articles about our restaurants a couple of years ago so take those down and then we’ll … you know. What are we gonna do about the new path to editorial influence and power? Everybody’s talking about the individuals right now. What happens when it’s a corporation? What happens when it’s the mayor?
“I don’t believe that things for public figures should be removed. But that begs the question: What about people who get things removed and then go into public office? We don’t know the future value of information, and that is one of the biggest hesitancies I have. … I use an example of a woman who’s got a new baby and she’s looking for a nanny online, and they’ve had a domestic issue with children scrubbed. My question is, did the editor who made that decision to let that person clean up their digital reputation … did we serve the public in that case?”
I asked her for the bedrock points to consider.
“Start with prepublication. Look at your reporting practices that can easily lead to unnecessary unpublishing requests: where are we identifying people, or using content that is of minimum value, potentially, to the public, yet is gonna be highly susceptible to unpublishing later on? And that includes things like informed consent, use of children’s names, all of that.
“Track, track, track… I would say keep a record from the get-go. So you can actually go back and see what kind of requests you’re getting for accountability purposes, and to make sure that your guidelines are as consistent as possible.
“Form a committee. Nobody ought to be playing God; not one editor in an editor’s office ought to be doing this on their own. You’ve got to unpublish by consensus.
“Sit down and determine your philosophy and then stick by it, and put the protocols in place to do it in a fair way. And that’s likely going to mean exploring tech; that’s likely going to mean investing some resources, and being a little more vulnerable with your audiences.”
Be mindful of “issues of race and power and inequity.”
Be transparent. “And that includes things like editor’s notes when you change something. We’ve had long questions about leaving up the links that just say ‘this content has been removed under our unpublishing policy’ or something like that, so it’s not just a broken link.”
How can news people help in this?
“Being able to talk about this more openly is really, really important. Helping brainstorm standards. Use this website (unpublishingthenews.com, which will be live soon) … connect with me with questions. … I want this research to be evaluated, to actually be instituting some change in newsrooms, to give newsrooms the tools they need.”