19 Dec, 2017 15:45

Facebook considers blocking far-right Britain First after Twitter suspends leaders’ accounts

Facebook is considering whether to ban far-right group Britain First from its platform. It comes just days after Twitter blocked the accounts of its leaders Paul Golding and Jayda Fransen in a clampdown on hate speech.

Neither can post messages and all of their previous messages have gone – with their group’s official page meeting the same fate. Fransen must be devastated, as her following had doubled in recent weeks after three retweets by US President Donald Trump.

Facebook has now claimed it will carry out a review of the Britain First profile page as the Home Affairs Committee grilled the social network, Google and Twitter on what they were doing to combat hate speech.

Facebook’s Director of Public Policy Simon Milner said the social-media network is “very cautious” about removing political speech and that Britain First was until recently still registered as a political party. “Clearly there are issues with the pages but we are very cautious about political speech,” he said.

However, MPs said not enough was being done. Labour MP Yvette Cooper, chairwoman of the committee, insisted that the steps taken by Facebook and Twitter were still inadequate.

Cooper said the social platforms were among the “richest companies in the world” and were responsible enough to do more. She accused YouTube, in particular, of big failures.
She claimed she had reported a video on the sharing and streaming site repeatedly, but that the clip deemed racist by Google remained on the platform for more than half a year.

“It took eight months of the chair of the select committee raising it with the most senior people in your organization to get this down,” Cooper said. “Even when we raise it and nothing happens, it is hard to believe that enough is being done.”

As a result, Google said it would produce a transparency report but Facebook and Twitter did not commit to doing the same.

Google’s Vice President of Public Policy Dr Nicklas Lundblad said the web giant is improving its machine learning so that material likely to be banned is flagged by machines rather than humans. But Google was not out of the firing line.

Cooper said that when she searched for a video posted by the banned terrorist group National Action, while calling for it to be taken down, she was recommended other videos by the same group.

“Is it not simply that you are actively recommending racist material into people’s timelines? Your algorithms are doing the job of grooming and radicalising,” the Labour MP said.

Facebook’s Simon Milner said: “Our focus has been on global terrorist organizations. One of the issues with this is that content from videos like this can be used by news organizations to highlight their activities. With this material, context really matters.”

“There is a chance that we are taking down important journalism.”
