15 Nov, 2016 02:52

Facebook to crack down on spread of misinformation

Facebook founder and CEO Mark Zuckerberg has come under fire for defending his company’s role in the spread of fake news during the 2016 election. Now he plans to use a machine-learning algorithm to curb the further distribution of false reporting.

The 2016 presidential election opened a Pandora’s box of ethical quandaries, but in the days following the election of Donald Trump, the focus has shifted to the media’s role – particularly that of social media. Both Twitter and Facebook were instrumental in the election, as 44 percent of US adults get their news from social media, according to the Pew Research Center.

Facebook’s Trending Topics section has been home to false news stories generated by legitimate-looking articles from fictional outlets such as the Denver Guardian, a newspaper that has never existed, and End the Fed News.

Facebook even apologized for unintentionally promoting a false story that Megyn Kelly had been fired from Fox News. Other high-profile stories Facebook promoted claimed that an FBI agent involved in the Clinton investigation had been murdered and that the pope had endorsed Trump. None of this was true, yet the stories cycled through the trending news and spread to the masses as real.

Zuckerberg downplayed Facebook’s role in the election at a news conference on Thursday. He told reporters it was “a pretty crazy idea” that Facebook could influence the result. He followed up with a post Saturday, saying, “on Facebook, more than 99 percent of what people see is authentic” and that it was “extremely unlikely hoaxes changed the outcome of this election.”

“We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here,” he said. “We have made progress, and we will continue to work on this to improve further.”

It is too early to tell how this will affect Facebook users, particularly conservatives. Many of them felt wronged when it was revealed that Facebook’s Trending News section was not generated by an algorithm but was selected by human “curators,” who were accused of using their editorial decisions to suppress conservative trends.

While Facebook denied any political bias in its Trending News section, it did fire its entire curatorial team following the fiasco.

But the question of why Facebook dragged its feet in stopping the spread of misinformation is a troubling one, particularly after Gizmodo published an article claiming that, according to two sources with direct knowledge of the company’s decision-making, Facebook had “the tools to shut down fake news” but chose not to use them out of “fear about upsetting conservatives after Trending Topics.”

This puts Facebook in the uneasy position of being an arbiter of truth. As a privately owned company, its content decisions do not implicate the First Amendment the way government censorship would, and users could simply go elsewhere to share their views. The real question comes from the very nature of news.

While some conspiracy theories are the product of delusion, others turn out to be true after further investigation. The National Enquirer is a tabloid often held in low regard for the substance of its claims, but in 2007 it broke the news that former presidential candidate John Edwards was having an affair with a staffer. Blocking specific sources with bad reputations could therefore block actual news as well.

Zuckerberg acknowledged this conundrum by addressing the reverse issue: reliable news sources getting things wrong.

“While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted,” he wrote.

Facebook’s current system of having employees review stories flagged by users poses problems as well. Nothing stops users from flagging news they simply disagree with, meaning manpower would go to verifying a factual story that was perhaps just shared with the wrong group of people.

Instead, Facebook is looking to rely on a machine-learning algorithm, as it does with its clickbait detector. The social media giant’s next moves will be tricky, but Zuckerberg has faith that the company will figure it out.
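Facebook has not disclosed how its clickbait detector works, but a common approach to this kind of problem is supervised text classification. The sketch below, written in Python with scikit-learn, shows the general idea under that assumption; the headlines, labels, and scoring step are purely illustrative, not Facebook’s actual data or model.

# A minimal sketch of a headline classifier, assuming a standard
# supervised text-classification setup (TF-IDF features plus
# logistic regression). The labeled examples are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled headlines: 1 = hoax/clickbait, 0 = legitimate news.
headlines = [
    "FBI agent in Clinton probe found dead in murder-suicide",
    "Pope Francis shocks world, endorses Donald Trump for president",
    "Senate passes spending bill ahead of government funding deadline",
    "Fed leaves interest rates unchanged at November meeting",
]
labels = [1, 1, 0, 0]

# Convert each headline into word and bigram frequency features,
# then learn which features correlate with the hoax label.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
model.fit(headlines, labels)

# Score a new headline. The output is a probability, not a verdict;
# in practice a review threshold and human oversight would sit on top.
prob = model.predict_proba(["You won't believe what this senator said next"])[0][1]
print(f"estimated hoax probability: {prob:.2f}")

Note that a classifier like this learns stylistic patterns, not truth, which is exactly why a low-reputation source that breaks a real story, like the Enquirer example above, remains a hard case.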
