Google, Facebook and Twitter Are 'Grooming' New Terrorists, Says Yvette Cooper

Social media algorithms that promote posts to users based on their interests are helping to drive the cycle of radicalization, an influential MP has claimed.

"All three of your organizations use your algorithms to encourage people who are interested in one particular thing to then follow something else," Yvette Cooper, the chair of the Home Affairs Select Committee and a former shadow home secretary told representatives from Google, Facebook and Twitter in a parliamentary hearing on Tuesday (December 19.)

"Isn't the real truth that your algorithms... are doing that grooming and that radicalisation?"

All three companies stressed that they were actively addressing all the issues Cooper raised, but accepted there was more work to be done.

Social media companies are doing more to tackle extremism. Creative Commons

"Clearly this becomes a problem, when you have content where you don't want people to end up in a bubble of hate," said Nicklas Berild Lundblad, Google's Vice President of Public Policy for Europe, Middle East and Africa.

Lundblad said the company was looking at how machine-learning technology could be used to identify potentially harmful videos and restrict their features so that they did not appear in users' lists of recommendations.

The hearing, held as part of the committee's ongoing inquiry into hate crime and its consequences, also saw Cooper question whether tech bosses were paying enough attention to far-right extremism as well as the threat from Islamists.

She pointed to one "propaganda" video from the proscribed British far-right group National Action.

The video, Cooper said, had now been removed from YouTube entirely, but in the eight months after it was first taken down it had been repeatedly re-uploaded from different accounts, despite her raising it with senior YouTube and Google staff during that time.

Lundblad admitted this specific case was "disappointing" but said the company was improving its response times in removing videos.

Cooper added that she had also found the National Action video on Twitter and Facebook.

"Do you guys not share?" Cooper asked, pointing out that the tech companies had referred to the Global Internet Forum, in which they pool information relating to extremist content. "Why do you not apply this co-operation to far right extremism?"

But Simon Milner, Facebook's Director of Public Policy, said that this information sharing initiative currently only covered content relating to the Islamic State militant group (ISIS) and Al Qaeda.

"They are the most extreme purveyors of this kind of viral approach to distributing their propaganda," Milner said. "That's why we've addressed them first and foremost. It doesn't mean we're going to stop there."

Elsewhere in the hearing, Cooper chastised the tech bosses for failing to remove offensive posts long after she, other members of the committee, or her staff had reported them to the companies.

Examples Cooper highlighted included:

  • Two tweets containing extreme anti-Semitic abuse, including one referring to a "filthy jew" getting "bitch slapped," both of which remained live on Twitter at the time of the hearing.
  • "Violent threats" directed at Theresa May, at a former Prime Minister, and "very racist abuse" aimed at the black Labour MP Diane Abbott, all of which were reported to Twitter anonymously by Cooper's staff but remained on the platform.

"I'm kind of wondering what it is we have to do," Cooper said of the anti-Semitic abuse, "If even when we raise it in a forum like this nothing happens, it is very hard for us to believe that enough is being done."

Sinead McSweeney, Twitter's Vice President of Public Policy and Communications for Europe, the Middle East and Africa, said that she had seen "more action" on harmful and offensive content in her last year at Twitter than in her previous four years with the company.

She said she could identify 25 separate changes made to address these issues, and added that 95 percent of terrorist content was now being removed at Twitter's own initiative.

But, McSweeney said, "bystander reports," where people who are not the victim report abuse, had historically been given a lower priority. That was changing, she said.

She added that Twitter had a dedicated team working with parliamentarians to tackle abuse. But, she said, "If you're cleaning a street, you can clean a street every morning; you can't guarantee it is still going to be clean at 10am."