“For the great bulk of our readers, and—yes—to respect the wellbeing of our staff too, we need to take a more proactive stance on what kind of material appears on the Guardian,” an editorial reads.
On Friday, the publication’s executive editor for audience Mary Hamilton outlined exactly how it currently monitors its comment section, which receives over 50,000 posts every day, noting that its editors were committed to encouraging readers to participate in discussions. She clarified, however, that various articles demand various kinds of moderation—for example, a crossword puzzle doesn’t need the same type of vigilance that an article about rape does.
Hamilton continued:
We are going to be implementing policies and procedures to protect our staff from the impact of abuse and harassment online, as well as from the impact of repeatedly being exposed to traumatic images. We are also changing the process for new commenters, so that they see our community guidelines and are welcomed to the Guardian’s commenting community. On that point, we are reviewing those community standards to make them much clearer and simpler to understand, and to place a greater emphasis on respect.
The Guardian is also examining its current moderation protocol and is testing various ways for writers to be involved in conversations with readers. It will reportedly begin publishing results from an analysis of its own comment section sometime this week, but for now has put out a call for readers to chime in about what they look for in comments sections and what’s off-putting.
“We are not like the 4chan message boards, where anyone can say almost anything without consequences,” Hamilton continued. “Just as Facebook, Twitter, Metafilter and many others provide spaces for different kinds of communities to gather, we want to create spaces on the Guardian for particular conversations and particular groups to speak—with each other and with us.”
Image via The Guardian.