Less Hate, More Speech: An Experiment In Comment Moderation

Photo: Lorenza Sargenti

As part of the International Journalism Festival 2016 (IJF16), three Romanian journalists unveiled the findings of a year-long comment moderation experiment at Palazzo Sorbello in Perugia on 6 April 2016.

The moderation experiment, called Less Hate, More Speech, was carried out between April 2015 and March 2016 by Roxana Bodea, Executive Director of the Median Research Centre, Marina Popescu of the Central European University, and journalist/author Catalin Tolontan – who was absent from the unveiling due to work commitments. During that period, the three focused their attention on four Romanian websites, with tolo.ro and gsp.ro as the main subjects of their investigation.

Teaming up with the four websites to examine how online comments are worded – not only by the average reader, but by journalists themselves – the three worked with two types of professionals:

i. Journalists, who wanted to learn how to maintain good relationships with their audience and put that engagement to positive use, even when faced with online aggression;

ii. Researchers, who set out to identify the conditions under which this could happen – in particular, how journalists could interact with their followers online without losing them;

“[Journalists] didn’t want to give that up [the relationship with the public] like many other publications did around the world, you know; [they] just closed down the comments section,” said Bodea.

According to Popescu, Romania suffers from deep political polarization, which made it hard to find an unbiased platform to collaborate with. Research has shown that ethnic prejudice in Romania takes more extreme forms than in much of the rest of Europe, and that discriminatory statements voiced within the country are more likely to go unpunished. Romanian standards of acceptability around racism and bias are quite low, and many people see nothing wrong with making biased comments about others. The three journalists leading the investigation were frequently surprised at how inventive online users could be in finding new ways to be racist and/or abusive.

Journalists’ Opinions

Working in close contact with some of the participating journalists led Bodea, Popescu and Tolontan to record the following opinions:

i. About 72% of them believed in having a comment section online, as long as it was moderated;

ii. 67% of the journalists also believed that moderation could increase readership and online engagement, but did not believe it possible to devise a moderation procedure capable of separating hate speech and prejudice from other aggressive comments;

iii. 84% of the journalists believed that comments were often very vile;

However, the experiment showed that only 20% of the comments the journalists considered vile actually needed moderation.

SIDECO

With sponsorship from the EEA Research Programme and Norway Grants, the three journalists built a moderation system known as SIDECO. Using a constantly updated dictionary of bad words, the system moderates what users post on the focus websites, with moderators intervening almost immediately to edit or approve posts. Most importantly, moderators must not disrupt conversation threads, so they were given some rules:

i. Mere disapproval/resentment should not warrant moderation;

ii. Moderators were warned against “reading the minds” of online users, “because that’s when you tend to overmoderate,” said Bodea. “Moderators should basically look at what is written there, and not think more about what the commenter tried to say”;

iii. Commenters were to be allowed to have debates;

iv. Moderators were not to strip the passion from comments: this was considered especially important on the focus sites dealing with hot topics such as sports;

Hate speech, incitement to violence and discrimination were the main focus of moderation, and the team used online masks such as asterisks and small fonts, alongside preview warning messages and tags, to show that a certain comment had been moderated.
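The asterisk-masking approach described above can be sketched as a minimal dictionary-based filter. This is an illustrative sketch only – the word list, function name and masking rule here are assumptions, not SIDECO's actual implementation:

```python
import re

# Illustrative blocklist; SIDECO's real dictionary is far larger
# and, per the article, constantly updated.
BAD_WORDS = {"vileword", "slur"}

def mask_comment(text, words=BAD_WORDS):
    """Replace each blocklisted word with asterisks, keeping the first
    letter so readers can see the comment was masked, not deleted."""
    def mask(match):
        word = match.group(0)
        return word[0] + "*" * (len(word) - 1)

    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, words)) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub(mask, text)

print(mask_comment("You vileword!"))   # You v*******!
print(mask_comment("A fair comment"))  # A fair comment
```

Masking in place, rather than deleting, reflects the rules above: the conversation thread stays intact and commenters can see that moderation, not censorship of the whole post, has occurred.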

Challenges Faced

In the year-long research period, the three journalists faced several challenges including:

• Convincing people that racial caricatures/jokes could be offensive even when the author genuinely meant no harm;

• Working with newsroom writers and journalists whose articles could trigger difficult ripple effects – e.g. images placed out of context;

To conclude, over one year the team managed to significantly change the pattern of vile language use on the websites they worked with and, while the number of commenters fell, the overall number of users on those websites increased. They also found that when the journalists/authors of particular articles intervened in the comments section, commenters became more civil. The team is now working on getting more journalists to engage actively with online users, especially when discussion in the comments section gets particularly heated.

The comment moderation project is expected to keep running, and Popescu said the team hopes to achieve even more positive results in another year’s time.