NEW YORK - Facebook CEO Mark Zuckerberg says the idea that fake news spread on Facebook influenced the outcome of the U.S. election is "crazy."
Still, six in 10 Americans say they get at least some news from social media, mostly Facebook, according to the Pew Research Center. While much of this news comes from established outlets, whether CNN or BuzzFeed News, misinformation spreads on Facebook just as information does: shared by users, recommended by software and amplified by both.
Sources of spurious information have ranged from articles produced by "content farms" for the sole purpose of getting clicks to "hyperpartisan" sites on both sides of the political spectrum that churn out stories that are misleading at best.
Case in point: "FBI AGENT SUSPECTED IN HILLARY EMAIL LEAKS FOUND DEAD IN APPARENT MURDER-SUICIDE," a fabricated headline from a fake news site calling itself the Denver Guardian, was shared thousands of times in the days leading up to the election.
Is it possible that voters were swayed for or against a candidate, much like those same people might buy a product after seeing an ad on Facebook?
Zuckerberg says voters deserve more credit.
During an interview Thursday with "The Facebook Effect" author David Kirkpatrick, Zuckerberg said the idea that people voted the way they did because of bogus information on Facebook shows a "profound lack of empathy" for supporters of Donald Trump.
"Voters make decisions based on their lived experience," he said.
Given the acerbic political contest from which the country just emerged, in which countless longtime friends, even family members, were unfriended, many are left to wonder whether an alternative American history would be written today were it not for Facebook, Twitter and the like.
This, after all, was the first truly social media election, playing out on Twitter and Facebook as much as, or more than, it did on major networks, in living rooms and around watercoolers.
But isn't social media just a reflection of our world as it exists? Has Facebook become an easy scapegoat when the answer is far more complex?
While Pew found that many believe political discussions on social media to be "uniquely angry and disrespectful," a comparable number have the same impression of face-to-face conversations, whether the discussion involves Democrats, Republicans or members of another party.
When it comes to Facebook users, Zuckerberg said, almost everyone has friends on the "other side." Even if 90 per cent of your friends are Democrats, for example, 10 per cent will be Republican. Still, that's not a very big number, and the idea of a "filter bubble," the notion that social media allows people to surround themselves only with the people and ideas they agree with, has been a hot topic this election cycle.
"By far the biggest filter in the system is not that the content isn't there, that you don't have friends who support the other candidate or that are of another religion," Zuckerberg said. "But it's that you just don't click on it. You actually tune it out when you see it. I don't know what to do about that."
A DIFFICULT LINE
Facebook has long denied that it's a publisher or a media company, or that it acts remotely like either. Its cheery slogan — to make the world more "open and connected" — seemingly invites a broad range of viewpoints, diverse, lively discussion and the free flow of information, rather than censorship.
But it could also make clamping down on fake news difficult. At a time when everyone seems entitled, not just to their own opinions, but to their own facts, one person's misleading headline might be another person's heartfelt truth.
"We take misinformation on Facebook very seriously," Adam Mosseri, the executive in charge of Facebook's news feed, said in a statement to the tech blog TechCrunch this week. "We value authentic communication, and hear consistently from those who use Facebook that they prefer not to see misinformation."
Facebook acknowledges that it has more work to do, and it seems to be putting a lot of faith in the power of data, artificial intelligence and algorithms as the solution.
Over the summer, Facebook fired the small group of journalists in charge of its "trending" items and replaced them with an algorithm. The catalyst appeared to be a report in a tech blog, based on an anonymous source, that the editors routinely suppressed conservative viewpoints.
Subsequently, fake stories ahead of the election began to trend.