Did Facebook's Dark Side of Fake News Push Trump into the White House?

Posted in Other | 21-Nov-16 | Source: Globalo News

How much, and in what ways, do social networks influence people's decisions to vote?


Facebook CEO Mark Zuckerberg called at the Asia-Pacific Summit for a more open and connected world.

But what are the costs involved in our global connectivity?

Social networks have been shaping many aspects of people's lives, and political decisions and actions are no exception.

Ever since Obama's successful use of the internet to engage young voters in 2008, the internet and social media have been used in different ways to accomplish political goals.

The outbreak of the Arab Spring in 2010 showed how civil society can connect through social media to confront social injustice and demand political rights. With the motto “We are the 99%”, the Occupy Wall Street movement crossed borders during the financial crisis, calling people to the streets against austerity measures. While we can see this influence of social networks as leading to more democratic and open societies, many have been warning about the dark side of this story.

Not even one week after Trump was elected, Facebook's CEO Mark Zuckerberg spoke publicly to express his views on the role of his company in the elections. His Facebook post of November 19, 2016, came after a series of criticisms over the lack of action in controlling the quality of content shared on social networks.

Here are Mark Zuckerberg's main arguments:

“A lot of you have asked what we’re doing about misinformation, so I wanted to give an update.

The bottom line is: we take misinformation seriously. Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We’ve been working on this problem for a long time and we take this responsibility seriously. We’ve made significant progress, but there is more work to be done.
Historically, we have relied on our community to help us understand what is fake and what is not. Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others — like people sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it’s much less likely to spread.
The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.
While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap. Normally we wouldn’t share specifics about our work in progress, but given the importance of these issues and the amount of interest in this topic, I want to outline some of the projects we already have underway.”

According to Zuckerberg “more than 99% of what people see is authentic”, and, of what people are exposed to, “only a very small amount is fake news and hoaxes”.

While he is right that “identifying the truth is complicated” when it comes to shared content, he seems to underestimate the volume of fake news and its impact on users' behavior.

Everyone benefits from fake news. But some more than others.

Being used by both Democrats and Republicans, “hoaxes and fake news were extremely unlikely to have changed the outcome of this election in one direction or the other”, according to Mr. Zuckerberg. However, such a strategy benefitted Mr. Trump's campaign more, given the clear lack of truth in many of his statements online and during television debates.

According to a lie tracker developed by Daniel Dale of the Toronto Star, Trump not only lied in many statements, but actually centered his entire campaign on false claims. “I think his dishonesty is a central fact about his campaign. It’s not secondary,” Dale said. “It is critically important that people understand that this is not normal and people understand that much of what he’s saying isn’t true.”


If indeed we are entering an age where truth is not the most important characteristic of politicians during campaigns, many others may follow Mr. Trump's strategy both offline and online.

That is not just Facebook's fault

If you think the internet is a more democratic space than the political systems we have, you had better think twice.

According to research from Incapsula, 61% of internet traffic in 2013 came from bots, an increase over previous years.

On Twitter, a bot is a program used to produce automated posts or to automatically follow Twitter users.

Twitterbots can post @replies or retweets in response to tweets that include a certain word or phrase, boosting a topic on the microblog or flooding the network with a specific message.
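As a rough illustration of the boosting behavior described above (not tied to any real Twitter API), such a bot can be sketched in a few lines of Python; the `FakeClient`, the trigger phrase, and the reply template are purely hypothetical stand-ins:

```python
import re
from dataclasses import dataclass

@dataclass
class Tweet:
    author: str
    text: str

# Hypothetical stand-in for a real Twitter client; an actual bot
# would call the Twitter API here instead of collecting strings.
class FakeClient:
    def __init__(self):
        self.posted = []

    def post(self, text):
        self.posted.append(text)

def run_bot(client, stream, trigger_phrase, reply_template):
    """Reply to every tweet mentioning the trigger phrase,
    flooding the network with one scripted message."""
    pattern = re.compile(re.escape(trigger_phrase), re.IGNORECASE)
    for tweet in stream:
        if pattern.search(tweet.text):
            client.post(reply_template.format(author=tweet.author))

client = FakeClient()
stream = [
    Tweet("alice", "I think Candidate X has a point"),
    Tweet("bob", "Lunch was great today"),
    Tweet("carol", "candidate x is trending?"),
]
run_bot(client, stream, "Candidate X", "@{author} Candidate X 2016! #vote")
print(client.posted)  # two scripted @replies, boosting the candidate's name
```

A fleet of accounts running this loop is enough to push a phrase into trending topics, which is exactly the amplification problem discussed below.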


The use of bots for political purposes is common during election cycles, and some governments also use them to control the type of information people are exposed to. Boosting the name of a candidate so that it appears as a trending topic may draw positive or negative attention to him, depending on the interests behind the bots' creators. Flooding the internet with positive messages may also divert attention from political scandals; this is especially the case in autocracies, to prevent people from getting information. In that sense, social networks need to acknowledge that they will become more important for democracies and autocracies alike as the world gets more connected, and they should therefore be held more accountable for the consequences.

“Facebook” is in a relationship with “Elections”

Facebook has been experimenting on voters' behavior in recent years. When its users logged into their accounts on US Election Day in 2010, they encountered a message reminding them to vote. The experiment consisted of dividing users into three groups.

The first group received an ‘informational message’ encouraging them to vote, with a link to information on local polling places and a clickable ‘I voted’ button. The second group received a ‘social message’ with the same elements, but one that also showed the profile pictures of randomly selected Facebook friends who had clicked the button. The third group of users was assigned to a control group that received no message.


The results showed that those who got the informational message voted at the same rate as those who saw no message at all. However, those who saw the social message were 2% more likely to click the ‘I voted’ button, 0.3% more likely to seek information about a polling place, and 0.4% more likely to head to the polls than either other group. Although it did not support any party and tested only the impact on turnout, this experiment illustrates the fact that, although still marginally, social media influences voters' real-life decisions.
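The three-group design can be sketched as a toy simulation. The baseline turnout and the uplift for the social-message group below are illustrative assumptions loosely mirroring the reported effect sizes, not the study's actual data:

```python
import random

random.seed(42)

GROUPS = ["informational", "social", "control"]

# Illustrative assumptions: a 30% baseline turnout, with the social
# message adding roughly 0.4 percentage points, as reported above.
BASE_TURNOUT = 0.30
SOCIAL_UPLIFT = 0.004

def assign(users):
    """Randomly assign each user to one of the three message groups."""
    return {user: random.choice(GROUPS) for user in users}

def simulate_turnout(assignment):
    """Simulate whether each user votes, given their group."""
    counts = {g: [0, 0] for g in GROUPS}  # group -> [voters, total]
    for user, group in assignment.items():
        p = BASE_TURNOUT + (SOCIAL_UPLIFT if group == "social" else 0.0)
        counts[group][0] += random.random() < p
        counts[group][1] += 1
    return {g: voters / total for g, (voters, total) in counts.items()}

users = [f"user{i}" for i in range(300_000)]
rates = simulate_turnout(assign(users))
for group in GROUPS:
    print(f"{group:13s} turnout: {rates[group]:.3%}")
```

Note how small the per-user nudge is; at the scale of tens of millions of users, even a 0.4-point shift translates into hundreds of thousands of additional votes.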

Let's connect everyone! At what cost?

Mr. Zuckerberg called at the Asia-Pacific Summit last week for a more open and connected world. He is correct in advocating “digital rights” for everyone, but acknowledging the costs that connectivity brings to our social and political systems offline should be one of Facebook's central concerns in the future. In the quest to provide reliable information online, social networks should keep in mind that transparency is a central issue. People need to understand why they are exposed to the content that appears in their timelines. The algorithms that rule social networks are still a matter of discussion, and none of these companies seems willing to address them.

Although not going into much detail, Mr. Zuckerberg pointed out his seven-point plan to tackle misinformation online:

  1. Implementing new technology to detect false information
  2. Users can easily report fake news
  3. Recognized fact-checking by third party organizations
  4. Warnings to alert readers about articles that have been flagged as potentially fake
  5. Setting a higher bar for related links that appear below articles
  6. Disrupting fake news economics: including better “ad farm detection”
  7. Working with journalists for input and fact-checking systems

What impacts should we expect in the future?

We must keep in mind that the reasons that drive our decisions are not drastically changed by the use of Facebook and Twitter. We are highly influenced by our ideologies, our life experiences, the political system we live in and rules that govern us.

However, democracy evolves with us, and with social networks and digital tools increasingly being a part of our lives and important sources of information, we need to hold them accountable for possible negative impacts they may bring to our democratic values.

Democracy is about exchanging points of view, and the better informed we are, the better the outcomes of our policies will be. If social media cuts off dialogue by putting users into ideological bubbles and does not take any action against hoaxes, democracy is weakened.