The rapid spread of misinformation and disinformation (mis/disinformation) online has emerged as a pressing public issue of the 21st century, affecting not only those who access online networks but also those offline.
Because children are active digital users, mis/disinformation is very much a part of their lives. In addition, mis/disinformation among parents, caregivers and educators can have a negative effect on children, even if the children themselves are not directly exposed to it. Children can be targets and objects of mis/disinformation, spreaders or creators of it, and opponents who actively seek to counter falsehoods.
The new rapid analysis report “Digital Misinformation / Disinformation and Children” by the UNICEF Office of Global Insight and Policy outlines what we know about children and mis/disinformation, and the challenges facing policymakers, technology companies, educators, parents and caregivers in countering it.
10 Things to Know: Children and Mis/disinformation
Misinformation is false or misleading information that is unwittingly shared, while disinformation is deliberately created and distributed with an intent to deceive or harm. Together they range from satire and parody to dangerous conspiracy theories. Here are 10 things you need to know about how they affect children.
1. Mis/disinformation online is a pressing public issue.
The rapid spread of mis/disinformation online affects everyone, online and offline. Because children are active digital users, mis/disinformation is very much a part of their lives. Mis/disinformation on social media spreads farther, faster and deeper than truthful information. Hot-button and divisive issues, such as immigration, gender politics and equality, and vaccination, are common subjects.
2. There can be real-world consequences.
Mis/disinformation has been used to incite violence and crime against ethnic minorities, resulting in the deaths and displacement of children. It has also lowered child COVID-19 vaccination rates, undermined trust in journalism and science, and drowned out marginalized voices.
3. Algorithms are key to mis/disinformation.
Algorithms drive personalized news feeds and curate search results, content, and friend recommendations by tracking user behaviour. Algorithms sometimes promote misleading, sensationalist and conspiratorial content over factual information and can be key vectors in amplifying the spread of mis/disinformation.
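To make the mechanism concrete, here is a minimal, hypothetical sketch of engagement-based ranking. It is not any platform’s actual system; all names, fields and weights are invented for illustration. It shows how scoring content by engagement signals alone, without regard to accuracy, can push sensational posts above factual ones.

```python
# Hypothetical illustration: a minimal engagement-based feed ranker.
# Names, fields and weights are invented for this sketch; no real
# platform's algorithm is shown here.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    clicks: int       # observed engagement signals
    shares: int
    accuracy: float   # 0.0-1.0, e.g. from fact-checkers (often unavailable)

def engagement_score(post: Post) -> float:
    # Ranks purely on engagement; accuracy never enters the score.
    # Shares are weighted higher because they spread content further.
    return post.clicks + 2.0 * post.shares

posts = [
    Post("Measured report on vaccine safety data", clicks=120, shares=10, accuracy=0.95),
    Post("SHOCKING: what they don't want you to know!", clicks=400, shares=90, accuracy=0.10),
]

# The sensational, low-accuracy post ranks first because the
# objective optimizes engagement, not truthfulness.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):6.1f}  {post.title}")
```

Real ranking systems are far more complex, but the core tension is the same: optimizing for engagement tends to reward content that provokes, regardless of whether it is true.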
4. Children are vulnerable to the risks.
Because of their evolving capacities, children cannot always distinguish between reliable and unreliable information. As a result, not only can they be harmed by mis/disinformation, but they may also spread it among their peers. Even very young children, or those without access to social media networks, may be exposed to mis/disinformation through their interactions with peers, parents, caregivers and educators.
5. Children can challenge mis/disinformation.
Children can be targets and objects of mis/disinformation, but they can also actively counter its flow. They can contribute to online fact-checking and myth-busting initiatives, such as efforts against COVID-19 misinformation in Nepal. UNICEF Montenegro’s Let’s Choose What We Watch programme has given young people opportunities to practise their media literacy and journalism skills and so improve the quality of reporting on child rights.
6. Education is important.
Equipping children with critical reading and thinking skills can help them determine the veracity of information. Because mis/disinformation moves easily between online and offline contexts, it is important to develop these skills in children even in non-digital settings.
7. Collective action is required to protect children.
Policymakers, civil society organizations, technology companies, parents, caregivers and educators must work together to protect children from the harms of mis/disinformation. At present, efforts to slow the spread of mis/disinformation are uncoordinated, and there is little reliable data on the scale of the problem.
8. We need child rights-based regulations.
UNICEF recommends that policymakers devise regulation that protects children from harmful mis/disinformation while enabling them to safely access diverse content. Regulation should focus on requiring procedures for classifying content and on ensuring transparency and accountability. Striking a balance between rights-based online protection and freedom of expression is a significant policy challenge.
9. Technology companies can help.
Technology companies are key actors in combating mis/disinformation. UNICEF recommends that they fully implement their self-declared policies and invest more in both human and technical approaches. They should be transparent about mis/disinformation on their platforms and about how they are combating it, and should prioritize meaningful connections and a plurality of ideas for children in the design of digital systems.
10. Civil society should provide policy guidance.
Civil society, including academia and international organizations, should conduct research on the impact of mis/disinformation on children and the efficacy of counter-measures, so that their findings can inform advocacy and policy responses.
A lightly edited synopsis of the Digital Misinformation / Disinformation and Children report and its “10 things you need to know”, by Philip N. Howard, Lisa-Maria Neudert and Nayana Prakash, Oxford Internet Institute, University of Oxford, and Steve Vosloo, UNICEF.