Building Resilience to Disinformation

Key learning from training sessions

Between January and April 2026, the WLGA and Wales Safer Communities Network delivered a series of learning sessions for local authority officers, elected members and partners on disinformation, misinformation and wider information threats.

Funded by Welsh Government and the WLGA, the sessions were designed to help public sector professionals better understand how harmful information spreads, why it resonates, and how proportionate, values‑led responses can reduce community harm and protect trust in public institutions.

The training combined research‑led insight with practical frameworks, recognising that disinformation is not only a digital challenge, but a community, governance and cohesion issue.

Understanding misinformation, disinformation and malinformation

Three distinct terms are often lumped together as “misinformation”:

  • Misinformation is false or misleading information shared without intent to cause harm.
  • Disinformation is false information that is deliberately created or amplified to deceive or manipulate.
  • Malinformation is genuine information that is shared out of context, manipulated or used in a way intended to cause harm, for example selectively cropped images or the misuse of private information.

While intention varies, all three can contribute to information vacuums, undermine confidence, fuel division and fracture relationships between communities and public bodies if left unaddressed, particularly during high‑profile or emotionally charged events.

Why information threats spread, and why facts alone are not enough

The sessions explored how modern information environments create fertile conditions for harmful narratives. Social media platforms, encrypted messaging apps and online fringe spaces can enable rapid amplification, particularly during high‑profile incidents or periods of uncertainty, when information gaps are quickly filled by speculation.

Participants also examined how psychology and neuroscience shape online behaviour. Concepts such as instant gratification, emotional arousal, confirmation bias, identity fusion and group belonging were used to explain why certain narratives gain traction, and why simply presenting facts is often not enough to change minds. Corrective information can be ineffective, or even provoke a backlash, if it is delivered without empathy, context or trust. Persuasion is rarely about winning arguments: creating space for doubt, reflection and curiosity is often more effective than challenging beliefs head‑on.

Why this matters

Disinformation is not an abstract or purely online issue. The sessions emphasised its real‑world consequences, and highlighted how information threats can contribute to:

  • Increased community tensions and hate incidents
  • Erosion of trust in public institutions and local leadership
  • Pressure on elected members and officers, including harassment or intimidation
  • Misunderstandings that place additional strain on partnership working
  • The normalisation of hateful or discriminatory narratives

A proportionate framework for responding: Ignore, Inform, Refute, Escalate

A key practical tool shared across the sessions was the Ignore, Inform, Refute, Escalate framework, which supports proportionate and timely decision‑making when responding to harmful information:

  • Ignore: Low‑visibility or fringe content may be monitored without engagement to avoid amplification.
  • Inform: Where misunderstandings are growing, accurate information can be shared proactively without repeating false claims.
  • Refute: Clearly false or harmful narratives may require public correction using evidence and clear context.
  • Escalate: Coordinated, malicious or safeguarding‑related activity may require referral to platform teams, legal routes or external partners.

Participants were encouraged to recognise that more than one of these responses may be appropriate as a situation develops, and that having clear internal escalation routes and shared frameworks helps avoid delay and confusion.

Communication, trust and professional curiosity

The sessions strongly reinforced the importance of trust‑building communication. Effective responses to disinformation were shown to rely on:

  • Filling information gaps proactively and transparently
  • Being honest about uncertainty and explaining why some information is not yet available
  • Using accessible language, FAQs and consistent messaging
  • Showing empathy and curiosity rather than confrontation

Approaches such as professional curiosity, active listening and Socratic questioning were explored as ways to keep people engaged, reduce defensiveness and create space for reflection, particularly where beliefs are emotionally or identity‑driven.

Building resilience, not just reacting

Rather than focusing solely on reactive responses, the training highlighted the value of prevention and resilience‑building. This includes strengthening critical thinking skills, supporting media literacy, encouraging reflection on online behaviours, and helping staff and elected members remain mindful of their digital footprint.

Disinformation and information threats will continue to evolve. Building resilience—individually, organisationally and within communities—remains a shared responsibility.