Web Puppeteering with Fake News: How It Spreads and Where It Will Take Us

Jonathan Albright of Elon University; Charlie Beckett, Director of Polis; and Renee Kaplan, Head of Audience Engagement at the Financial Times, talked in depth about how fake news spreads across the Internet.

Fabricated news has become a major issue over the past year: it exposes users to tracking, plays on a wide range of their emotions and sometimes manipulates individuals into taking extreme or excessive actions.

Beckett opened the conversation about fake news by stressing that it is important to look not only at how such content is created, but also at how it spreads, who the people sharing it are, and what mechanisms make this type of news viable.

Much of this viral content consists of ‘often very partisan’ political messages that seek to provoke particular reactions and manipulate people.

Introducing Albright’s research, Beckett pointed out that ‘some amazing mapping of the spread of fake news’ has been done and likened its effect to that of the Hubble telescope, ‘where you can peer into these distant galaxies and see the vastness in the way the planets move.’

Albright then noted that there had been a lot of talk about the roles of Facebook and Twitter during the election campaign. His own research, though, ‘kicked off’ when he began looking into the process that led to Brexit after the EU referendum in the UK.

His aim was to identify the components that make the web systems behind fake news work, and to trace the connections between those systems and various social media platforms.

“What I did in my research [is that] I started looking around at how data was being used to target people. This is something that is not new.” One of the main questions Albright worked on was not about the effects of the campaign messages but about their sources, so he looked at the types of channels supplying information to the public. When he examined the traffic sources, he noticed that a lot of the traffic came from Facebook and Twitter.

He also mentioned that much of the audience nowadays is ‘more focused on trends, so the specific kinds of individuals and consumers can be targeted through [those] trends.’ Recognising the public’s preferences has also become much easier.

Albright also stated that fake news ‘falls into the broader category of programmatic advertising.’ One of the main triggers used to distribute such content, and the one that often makes it effective, is human emotion. Fake news sites track users for more precise targeting, and all of the sites that track people ‘heavily rely on Facebook’s likes.’

“When you look at the sentiment around some of these real-time trends [one of them was the election debates in the United States], all the candidates are viewed as negative. And there was a lot of outrage about all of the primary frontrunners in the US elections. Donald Trump had far more negative sentiment than any of the other candidates.”

So the strategy regarding emotions is to ‘get’ people to a point at which ‘they are susceptible to certain messages, and emotional enough to spread them.’

Regarding the terms used to describe fake news and the big data used for targeting during elections, Albright noted: “I think the better term is actionable data. It’s data that is being used specifically to reach specific people.”

In Albright’s opinion, capturing real-time sentiment is very important in the spread of fake news, because it gives whoever is spreading it the opportunity to manipulate people by tapping into emotions such as anger, fear and happiness. There is targeting based entirely on emotion: researchers have repeatedly found that anger is the ‘softest spot’ for successful triggering.
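As a rough illustration of what capturing real-time sentiment can look like in practice, here is a minimal Python sketch that scores a few invented posts with the off-the-shelf VADER analyser from NLTK and flags the strongly negative ones. The sample posts, the threshold and the tooling are assumptions made for illustration, not Albright’s actual method.

```python
# Minimal sketch: flagging strongly negative posts as candidates for
# emotion-based targeting. Assumes NLTK is installed and the VADER lexicon
# has been downloaded via nltk.download('vader_lexicon').
from nltk.sentiment.vader import SentimentIntensityAnalyzer

# Hypothetical sample of posts pulled from a trending topic.
posts = [
    "This debate was an absolute disgrace.",
    "Great news for the economy today!",
    "I can't believe they are getting away with this.",
]

analyzer = SentimentIntensityAnalyzer()
for post in posts:
    scores = analyzer.polarity_scores(post)  # returns neg/neu/pos/compound scores
    if scores["compound"] <= -0.5:           # assumed threshold for "angry enough"
        print(f"Strongly negative, likely to be targeted: {post!r}")
```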

He also said that a completely different picture emerges when you map the processes happening on the Web: “It involves relationships, connections that will take you to the next layer.” He added that circulation is very important, and that online relationships reveal a lot.

Companies that have programmatic access to Facebook’s API are powerful distributors of such content. They can easily spread ‘hundreds, if not thousands, of customized messages at the same time and see the results.’
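To make the scale of this concrete, the hypothetical sketch below generates a batch of customised variants of a single political message for different audience segments. No real platform API is called, and the segment names, the message template and the payload format are all invented for illustration.

```python
# Hypothetical sketch: generating many customised variants of one message
# for different audience segments, the kind of batch a programmatic
# advertising pipeline could push out and measure. Nothing is sent anywhere.
from itertools import product

template = "People like you in {region} are fed up with {issue}. Share if you agree!"
regions = ["Ohio", "Florida", "Wisconsin"]            # invented segments
issues = ["immigration", "trade deals", "the media"]  # invented hot-button topics

payloads = [
    {"audience": f"{region} / {issue}",
     "message": template.format(region=region, issue=issue)}
    for region, issue in product(regions, issues)
]

print(f"{len(payloads)} customised messages ready for A/B testing")
```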

Later in his talk, Albright mentioned so-called ‘cyborgs,’ ‘automated accounts or semi-automated accounts,’ which these days seem to mention Trump a lot: “He is the center of the universe for at least mentions.”

Kaplan, for her part, expressed a general concern about the mechanisms used to spread fake news and pointed to the professional values that are being violated along the way.

“A lot of information being distributed through this architecture is tendentious or biased or untrue,” stated Kaplan, adding that its effectiveness is impressive: “Marvelous but disturbing.”

“It’s breaking all the rules. It’s breaking the rules of fair play. It’s theoretically breaking the quite official rules allegedly of some of these platforms. It’s essentially breaking the rules of any kind of code of journalism.”

Because there is no governing body for the Internet, it is quite difficult to envisage measures to combat fake news and similar phenomena. Individuals can and should hold the platforms distributing viral content accountable ‘for inflated metrics,’ she said.

The so-called traditional media are, as Kaplan admitted, constrained by ethical rules and ‘duties of accuracy’ when it comes to putting information on the Internet. One such rule, she noted, is the one often broken on YouTube: ‘stuffing’ in too many keywords, which ‘probably have nothing to do with a piece of content,’ purely as a search-optimisation strategy.

That is behavior sometimes adopted even by ‘legitimate’ media professionals, Kaplan stated: “We have kind of a tabloid economy of media which are unbelievably powerful SEO machines, that essentially are commissioning around all of the fairly basic rules in order to rank in Google.”

Summing up the discussion, she reached a rather pessimistic conclusion: since media organisations like the Financial Times ‘play by the very traditional rules of accuracy and truth,’ their future does not look bright.

‘We are bound to lose,’ Kaplan concluded, but she made it clear that professional values will still be treasured and maintained despite the demands of the web.