GENEVA - The heads of many of the world’s biggest social media platforms were urged on Friday to change their business models and become more accountable in the battle against rising hate speech online.
In a detailed statement, more than two dozen UN-appointed independent human rights experts - including representatives from three different working groups and multiple Special Rapporteurs - called out chief executives by name, saying that the companies they lead “must urgently address posts and activities that advocate hatred, and constitute incitement to discrimination, in line with international standards for freedom of expression.”

Time to change

They said the new tech billionaire owner of Twitter, Elon Musk, Meta’s Mark Zuckerberg, Sundar Pichai, who heads Google’s parent company Alphabet, Apple’s Tim Cook, “and CEOs of other social media platforms”, should “centre human rights, racial justice, accountability, transparency, corporate social responsibility and ethics, in their business model.”

They reminded the CEOs that being accountable as businesses for racial justice and human rights "is a core social responsibility", advising that "respecting human rights is in the long-term interest of these companies, and their shareholders."

They underlined that the International Convention on the Elimination of Racial Discrimination, the International Covenant on Civil and Political Rights, and the UN’s Guiding Principles on Business and Human Rights provide a clear path forward on how this can be done.

Step up against hate

“We urge all CEOs and leaders of social media to fully assume their responsibility to respect human rights and address racial hatred.”

As evidence of the corporate failure to get a grip on hate speech, the Human Rights Council-appointed independent experts pointed to a “sharp increase in the use of the racist ‘N’ word on Twitter”, following its recent acquisition by Tesla boss Elon Musk.

This showed the urgent need for social media companies to be more accountable "over the expression of hatred towards people of African descent", they argued.

Soon after Mr. Musk took over, the Network Contagion Research Institute of Rutgers University in the US, highlighted that the use of the N-word on the platform increased by almost 500 per cent within a 12-hour period, compared to the previous average, the experts said.

Uphold human rights

“Although Twitter advised this was based on a trolling campaign and that there is no place for hatred, the expression of hatred against people of African descent is deeply concerning and merits an urgent response centred on human rights.”

They added that hate speech, “advocacy of national, racial and religious hatred that constitutes incitement to discrimination and violence, as well as racism on social media, are not just a concern for Twitter but also for other social media giants such as Meta”, the company formerly known as Facebook.

The experts said although some companies claimed not to allow hate speech, there was a clear gap between stated policies, and enforcement.

Rampant disinformation

"This is particularly salient in the approval of inflammatory ads, electoral disinformation on Facebook, and content promoting conspiracy theories. Research from Global Witness and SumOfUs recently revealed how Meta is unable to block certain advertisements", the experts stated.

Meta “took a significant step with the establishment of an oversight board in 2020”, in response to complaints, they said, noting that the “group of experts from diverse areas of expertise is in place to ‘promote free expression by making principled, independent decisions regarding content on Facebook and Instagram and by issuing recommendations on the relevant Facebook Company Content policy’”.

Long-term oversight

The experts acknowledged that the board had been well funded, received around two million appeals regarding content, and made a number of recommendations and decisions.

“However, the effectiveness of the Oversight Board can only be seen over a long-time horizon and will require continued commitment at the highest levels” to reviewing and modifying tools to combat racial hatred online, the experts said.

“There is a risk of arbitrariness and profit interests getting in the way of how social media platforms monitor and regulate themselves”, they added.

Free speech, not a ‘free pass’

They pointed out that High Commissioner Volker Türk, who heads up OHCHR, had recently penned an open letter to Twitter CEO Elon Musk, emphasizing that free speech did not mean "a free pass to spread harmful disinformation that results in real world harms".

“As he underlined, human rights law is clear – freedom of expression stops at hatred that incites discrimination, hostility or violence. We see too often that the spread of hatred and hate speech against people of African descent, and other groups, not only undermines their rights but creates major fissures in societies. These are increasingly difficult to overcome and a source of various forms of destabilisation within countries.”

‘Race-based traumatic stress’

The independent experts said that allowing and tolerating incitement to hatred, and the expression or advocacy of hatred against people of African descent and other marginalized groups, "not only encourages the perpetrators, but also constitutes a continuous source of chronic race-based traumatic stress and trauma."

The presence of racial hatred further undermines the confidence of those affected in using social media and in seeking justice.

“It is especially alarming” considering that so many youngsters “live a significant part of their lives” online, they added.

Social media at a crossroads

"Content moderation can only address a part of what happens in cyberspace but does not take into account the intended and unintended effects in society. There are deeper issues about advocacy of racial hatred, lack of accountability for abuses, and an absence of efforts to promote tolerance.

“If addressed, these can be strong determining factors in building a positive future both online and offline.”

Acknowledging the power for good that social media represents if put to positive use, the experts said that it has "a major role to prevent further rifts, so that racial justice and human rights can be upheld, to build less racist, less divisive, more tolerant, just and equitable societies."

Special Rapporteurs and independent experts are appointed by the Geneva-based UN Human Rights Council, and form part of its so-called Special Procedures to examine and report back on a specific human rights theme or a country situation. The positions are honorary and the experts are not paid for their work.