FACEBOOK FAKE NEWS & THE US ELECTIONS

FAKE POSTS: Social network Facebook has been accused of condoning fake posts that favoured the victory of Republican candidate Donald Trump in the US elections.

By John Naughton

Well, the election is over; now we’re knee-deep in post mortems. Every mainstream publication and every corner of the blogosphere is full of autopsies. Many of these investigations have an anguished “How could this have happened?” tone. American students in a university department adjacent to mine have decorated the trees outside with hundreds of distraught but determinedly forward-looking messages. “Love WILL Conquer!” says one. “Knowledge not Ignorance,” says another.

I don’t propose to add to this genre. If you want an informed, dispassionate analysis of the campaign that has given Trump the keys to the kingdom, look no further than an essay by Professor Charlie Beckett of the London School of Economics on that institution’s Polis blog. It’s worth reading in full, but for those who are pressed for time, the gist is: “Trump had the better politics. Tactically, strategically, personally, policy-wise. He won partly because the Democrats and Hillary Clinton got most of that wrong, but mainly because he did best what you are supposed to do in an election: convince people to vote for you. They (and he) knew what they were doing.”

Unpalatable? Yes. But I think accurate, provided you accept that by “politics” Beckett means running for political office, not anything with a moral overtone. But such a detached analysis hasn’t stopped people looking for scapegoats and simpler explanations. And their baleful glare has fallen upon the internet generally and social media in particular. “For election day influence, Twitter ruled social media,” fumed the New York Times. “Donald Trump won Twitter, and that was a giveaway that he might win the presidency,” claimed Business Insider. And “Donald Trump won because of Facebook,” wrote Max Read in New York magazine.

Twitter was castigated mainly because it was Trump’s favoured channel and his tweeting provided a masterclass in how to exploit it. Facebook was in the dock, though, for a different reason: it was claimed that fake news stories that had spread virally on the service had inflicted real damage on the Clinton campaign. Among these were stories that the pope had endorsed Trump, that Hillary Clinton had bought illegal arms worth US$137m and that the Clintons had purchased a $200m house in the Maldives. (There was probably worse stuff, but I didn’t have the stomach to do the necessary trawl.)

In the end, the finger-pointing got to Facebook’s boss, Mark Zuckerberg, who was moved to offer a pained response. “After the election,” he wrote, “many people are asking whether fake news contributed to the result and what our responsibility is to prevent fake news from spreading. These are very important questions and I care deeply about getting them right.” He went on to point out that “of all the content on Facebook, more than 99 per cent of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”

Nevertheless, Zuckerberg says that he doesn’t want fake news on Facebook, but it turns out that getting rid of it is very difficult because “identifying the ‘truth’ is complicated”. Philosophers worldwide will agree with that proposition. But you don’t need to have a Nobel prize to check whether the pope did indeed endorse Trump or whether Clinton conducted the supposed purchases of arms or a Maldives house.

Zuckerberg’s problem is that he doesn’t want to engage in that kind of fact-checking, because that would be a tacit acknowledgement that Facebook is a publisher rather than just a technology company and therefore has some editorial responsibilities. And what he omits to mention is that Facebook has a conflict of interest in these matters. It makes its vast living, remember, from monitoring and monetising the data trails of its users. The more something is “shared” on the network, the more lucrative it is for Facebook.

Just to put some numbers behind that assertion, research by BuzzFeed journalists discovered that “top fake election news stories generated more total engagement on Facebook than top election stories from 19 major news outlets combined”. The study found that over the last three months of the election campaign, the 20 top-performing false election stories from hoax sites and hyper-partisan blogs generated 8,711,000 shares, reactions and comments on Facebook, whereas the 20 best-performing election stories from 19 major news websites generated a total of 7,367,000 shares, reactions and comments. In other words, if you run a social networking site, fake news is good for business, even if it’s bad for democracy.

Courtesy: The Guardian
