World Federation of Advertisers calls for platforms to act on ‘dangerous and hateful content’

The World Federation of Advertisers (WFA) has issued an unambiguous call for digital media platforms to do more to stop the spread of harmful content in the wake of the recent mass shooting in New Zealand, which was first broadcast on, and later shared through, social channels.

The statement positions the call for action as a moral issue for advertisers, not one of brand safety.

“All these platforms are funded by advertisers, and as such, those that make them profitable have a moral responsibility to consider more than just the effectiveness and efficiency they provide for brand messages,” said the statement, issued in the middle of Global Marketer Week in Lisbon, Portugal. The WFA represents 90% of global marketing communications spend, about US$900 billion per year.

The statement specifically references “dangerous and hateful content,” and does not mention privacy or other concerns about the possible negative effects of social media. A request by The Message for more information about how the WFA defines “dangerous and hateful” content was not immediately answered.

The WFA statement does not recommend a course of action, but does call for its members and all brands worldwide, “in their capacity as the funders of the online advertising system—to put pressure on platforms to do more to prevent their services and algorithms from being hijacked by those with malicious intent.”

Newly elected WFA president Raja Rajamannar, chief marketing and communications officer at Mastercard, said the social platforms are becoming more powerful in terms of shaping culture and mobilizing online communities. “This means brands and platforms must assume a higher level of responsibility to ensure these online environments are forces for good, not conflict or violence,” he said. “That begins with acknowledging flaws and quickly investing in lasting solutions. To drive change we need less debate and more action.”

The WFA statement comes one week after the Association of New Zealand Advertisers (ANZA) and the Commercial Communications Council (Comms Council) issued a joint statement in the wake of the March 15 shooting, calling on the social media companies to do more to stop the spread of hateful content.

“The events in Christchurch raise the question, if the site owners can target consumers with advertising in microseconds, why can’t the same technology be applied to prevent this kind of content being streamed live?” said their statement.

The WFA statement also specifically mentions the recently spotlighted problem of child predators posting comments on YouTube videos of children, as well as stories about “the glorification of self-harm and suicide content on Instagram.”

“This is not an issue of brand safety, this is a moral question to hold social media platforms to account—in the same way we do for traditional media,” said ANZA chief executive Lindsay Mouat in the WFA statement.

Just hours ahead of the WFA issuing its statement, Facebook said it was cracking down on white nationalism on its site. “Today we’re announcing a ban on praise, support and representation of white nationalism and separatism on Facebook and Instagram, which we’ll start enforcing next week,” reads a post on the Facebook Newsroom site.

After the shooting in Christchurch, a number of Facebook critics pointed out that Facebook’s efforts to block ISIS content from its platform were stronger than those for white supremacist content.

Law professor Hannah Bloch-Wehba told Wired she sees no reason the tech platforms should be better at removing or blocking ISIS content than right-wing extremist content. “We just haven’t seen comparable pressure for platforms to go after white violence,” she said.

The WFA makes clear that real change cannot come from New Zealand alone but requires a global response. It cited member research in which 47% of respondents identified “improving the online advertising ecosystem” as the single biggest issue of 2019.

“Marketers must reflect on the extent and terms on which they fund these platforms,” said Stephan Loerke, CEO of the WFA. “WFA is committed to working with the platforms in a constructive manner in order to find solutions to these grave problems.”

Ron Lund, president and CEO of the Association of Canadian Advertisers (ACA) and a WFA board member, was supposed to be in Lisbon this week but an illness prevented him from traveling. He told The Message Wednesday night that the ACA supports the statement, although it will seek clarification about the specifics.

“But the principles around [social media platforms] have to do better, that for sure we are behind,” he said. “We haven’t gone to the board for official endorsement, but I don’t think there will be a problem.

“We just have to do better. And I think [platforms] recognize that,” he said.

David Brown