
The Fake News Information Disorder – Your Weekend Long Reads

By Steve Vosloo on December 16, 2017


We live, according to the Economist, in a post-truth era. Especially in politics, it is a time marked by “a reliance on assertions that ‘feel true’ but have no basis in fact.” In 2016, post-truth was the Oxford Dictionaries’ Word of the Year.

Untruths have always been with us, but the internet is the medium that changed everything. The scale at which “alternative facts”, untruths and blatant lies can be created and spread – by people and algorithms – means that, for the first time ever, they can threaten democracy and social cohesion globally.

For those of us who have long believed in the power of the internet to break down barriers between people and cultures, foster dialogue, and sharpen truth through increased transparency and access to information, post-truth’s most dangerous weapon, “fake news”, is a bitter pill to swallow.

While fake news has been around since the late 19th century, it is now a headline phenomenon and Collins’ Word of the Year for 2017. What happened to the grand internet dream of the democratisation of knowledge?

All of us have a duty to engage with these complex issues, to understand them, take a position, and reclaim the dream. Most importantly, we need to constantly question whether the digital tools we built, and continue to build, are part of the problem.

The Birth of a Word

It is useful to go back only a year and a half to remind ourselves how fake news became a household word. WIRED’s article traces the birth – and, it claims, the death – of the term. How did it die? It quickly became so diluted in meaning, so claimed by those shouting the loudest, that it is now meaningless in many ways.

Fake News, or Information Disorder?

In an attempt to bring structure to the discussions, the Council of Europe produced a report on what it calls information disorder. The authors refrain from using the term fake news for two reasons. First, they believe it is “woefully inadequate” to describe a very complex issue; second, it has been appropriated by politicians to slam any news or organisation they find disagreeable, thus becoming a mechanism for repression – what the New York Times calls “a cudgel for strongmen”.

The authors introduce a new conceptual framework for examining information disorder, identifying three different types:

  • Mis-information is when false information is shared, but no harm is meant. (According to Open University research, misinformation is rife among refugee populations.)
  • Dis-information is when false information is knowingly shared to cause harm.
  • Mal-information is when genuine information is shared to cause harm, often by moving information designed to stay private into the public sphere.

The report concludes with excellent recommendations for technology companies, as well as a range of other stakeholders. If the report is too long for you, be sure to at least read the recommendations.

Fight It With Software

Tom Wheeler at the Brookings Institution offers a history of information sharing, control and news curation. He laments that today the “algorithms that decide our news feed are programmed to prioritize user attention over truth to optimize for engagement, which means optimizing for outrage, anger and awe.” But, he proposes, “it was software algorithms that put us in this situation, and it is software algorithms that can get us out of it.”

The idea is “public interest algorithms” that interface with social network platforms to track, at an aggregate level, information sources, spread and influence. Such software could help public interest groups monitor social media in the same way they already monitor broadcast media.
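Wheeler does not specify how such public interest algorithms would work, but the core of the idea – aggregating which sources are being shared, how often and how quickly – can be sketched in a few lines. The Python sketch below is purely illustrative and rests on my own assumptions: the post records, field names (url, time, shares) and metrics are hypothetical placeholders for whatever data a platform might actually expose, not anything drawn from Wheeler’s proposal.

```python
from collections import Counter
from datetime import datetime

# Hypothetical post records, e.g. exported from a platform API.
# The schema (url, time, shares) is an illustrative assumption.
posts = [
    {"url": "http://example.com/story-1", "time": "2017-12-01T08:00", "shares": 120},
    {"url": "http://example.com/story-1", "time": "2017-12-01T14:00", "shares": 950},
    {"url": "http://example.com/story-2", "time": "2017-12-02T09:30", "shares": 40},
]

def aggregate_spread(posts):
    """Track, per source URL, how many posts cite it, its total reach,
    and when it first appeared -- the kind of aggregate view a public
    interest monitor might publish."""
    mentions = Counter()   # how many posts point at each source
    reach = Counter()      # total shares accumulated by each source
    first_seen = {}        # earliest timestamp each source was posted
    for post in posts:
        url = post["url"]
        mentions[url] += 1
        reach[url] += post["shares"]
        seen = datetime.fromisoformat(post["time"])
        first_seen[url] = min(first_seen.get(url, seen), seen)
    return mentions, reach, first_seen

mentions, reach, first_seen = aggregate_spread(posts)
for url in mentions:
    print(f"{url}: {mentions[url]} posts, reach {reach[url]}, first seen {first_seen[url]}")
```

Real monitoring would of course require the kind of platform access Wheeler describes, plus far richer signals than simple share counts, but even this aggregate view hints at how spread and influence could be made visible to public interest groups.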

Fight It With Education

While I believe in the idea of software as part of the solution, the Wheeler article seems to miss a key point: information spread is a dance between algorithms and people. Every like, share and comment by you and me feeds the beast. Without us, the algorithm starves.

We need to change the way we behave online; media and information literacy are crucial to this. There are many excellent resources for teens, adults and teachers to help us all be more circumspect online. I like the Five Key Questions That Can Change the World (from 2005!).

Want To Understand It Better? Fake Some

Finally, long before fake news became popular, in 2008, Professor T. Mills Kelly got his students at George Mason University to create fake Wikipedia pages to teach them the fallibility of the internet. At Google’s Newsgeist unconference last month, a similar exercise involved strategising a fake news campaign aimed at discrediting a certain US politician.

Both instances force us to get into the minds of fakesters and to consider how the internet can be used to spread falsehoods. While creating fake Wikipedia pages doesn’t help the internet’s information pollution problem, the heart of these exercises is useful – perhaps they should be part of media literacy curricula?

Thanks to Guy Berger for suggesting some of these articles.

Filed Under: Thought Leadership

Written by
Steve Vosloo is passionate about using technology in education. He's worked at UNESCO, Pearson South Africa, Stanford University, and the Shuttleworth Foundation on the use of mobile phones for literacy development, how technology can better serve low-skilled users, and the role of digital media for youth. All opinions expressed in this post are his own.

One Comment to “The Fake News Information Disorder – Your Weekend Long Reads”

  1. Ed Gaible says:

    Great post, Steve. Fake News (and fake-fake news) is the issue that defines the moment, at least in terms of information. (Other moment-defining issues are perhaps more directly related to tribalism.) One point and one example follow:
    – Point: To the extent that education systems respond to their irrelevance by promoting “functionalist” curricula such as STEM, TVET and even STEAM, we are witnessing those systems willfully re-structuring so as not to equip our youth to address the ethical, philosophical and evidentiary issues that these people, currently students, will confront as they become adults.
    – Example: How do we assess the trade-offs of information and violence, especially in relation to “fake news”? Doxxing “fascists” is vastly preferable to, and I suspect more effective than, committing violence against them (à la Antifa). But the risks of false accusation and/or publishing fake news are high. The risks of falsely applied violence are also high. Choosing the best approach likely requires evidentiary skill and ethical contemplation.