At a time when social messaging enables hate speech to spread outside the public limelight, journalists are needed to tell society what is happening.
This was the message of UNESCO’s Guy Berger, speaking at the Press Freedom Conference organized by the European Publishers Association in Gdansk, Poland, this week.
Berger is director for strategy and policy in the Communication and Information Sector at UNESCO. He was attending the conference “It all starts with words”, held in the historic building of the Polish trade union movement Solidarnosci.
The UNESCO director noted that encrypted and private social messaging among small groups is increasingly catching up with public social media as a major vector of hate speech, thereby posing new challenges to society and the news media.
Berger referenced the UN Secretary-General’s plan of action on hate speech, and highlighted that, as a UNESCO study shows, journalists themselves are increasingly among the targets of hate-mongers.
Technology can help draw attention to suspected hate speech, especially where it appears on the public side of social media and in the comments sections of news sites, Berger acknowledged.
But he warned about automated moderation of this content.
“While technology actors can be made to act on potentially problematic content especially in the public sphere, this can also pose a big risk to press freedom and to legitimate expression and debate,” he observed.
The director cautioned especially against using algorithms to computationally downplay or delete speech and to cancel certain accounts. Even Artificial Intelligence, he stated, cannot identify the key nuances involved in assessing what constitutes hate speech.
“For example, the technology is not able to use the guidance of the Rabat Plan of Action of the UN Office of the High Commissioner for Human Rights in terms of assessing possible restrictions in terms of major contextual issues like the significance of the speaker, reach of the message and its likely impact,” said Berger.
Other problems with automated action against suspected content, especially in the absence of effective remedies for mistakes, he said, were: collateral harm to legitimate news reporting on hatred; the delegation of censorship to private companies with their own interests; and the precedent of prior censorship.
News media should help to highlight these risks, while at the same time avoiding the trap of their own reports serving to normalize or even fan social hatred.
Addressing the journalists in attendance, Berger said: “Your direct influence on hateful attitudes is less powerful than your role in political agenda-setting and catalyzing political action”.
This role could help spur governments to speak out against hatred, deal practically with issues of immigration and integration, and promote Media and Information Literacy to ensure public resilience to incitement to violence, hostility and discrimination.
And, while technology solutions can try to identify hate speech in public space, “it is the role of journalism to investigate orchestrated hate in private social messaging,” said Berger.
To protect that role, journalists should observe World Press Freedom Day (3 May) and the International Day to End Impunity for Crimes Against Journalists (2 November), he encouraged.