Charlie outlines the negative impacts of disinformation and gives an insight into the DEPICT framework, which categorises the different types of disinformation.
“Fere libenter homines id quod volunt credunt.” – Men willingly believe what they wish.
Julius Caesar – Commentaries on the Gallic War
I came across the DEPICT framework in an issue of Philippe Borremans’ ‘Wag the Dog’ newsletter and, being a great fan of mnemonics and frameworks, I thought I should learn more about it and whether it could be a useful tool for dealing with disinformation.
Disinformation as a Risk
As business continuity practitioners and members of society, I am sure we are all aware of disinformation and its impact. In the World Economic Forum’s Risk Perception Survey 2025–2026, misinformation and disinformation ranks as the number 2 risk in the short term (2 years) and number 4 over the long term (10 years). Social media penetration, the end of deference, distrust in politicians and other leaders, and societal polarisation have led us to live in a society where disinformation can flourish.
Disinformation Impact – Southport Attack
Disinformation, as we know, can have real-world consequences, including injuries and even attempted murder. In the 2024 Southport stabbings, disinformation that the attacker was a recent asylum seeker very quickly led to disorder. In Southport, a crowd gathered outside a mosque and the situation quickly turned violent: bricks and bottles were thrown, a police vehicle was set on fire, and officers were injured trying to control the crowd. Further unrest occurred when the Holiday Inn Express in Tamworth, which was housing asylum seekers, was attacked and set on fire. The police were slow to make it known that the attacker, Axel Rudakubana, was not an asylum seeker and had been born in Wales; by the time this information was made available, the unrest had already taken place.
Impact on Organisation
Disinformation can have a devastating impact on organisations, as false claims about their products or services can severely affect sales, lead to boycotts, or cause consumers to lose faith in the product. Responding is difficult for organisations because disinformation spreads very quickly on social media: it can travel rapidly, anyone can shape the narrative, and, with the decline in trust in institutions, scientists and authority figures find it difficult to debunk it.
The DEPICT Framework
The DEPICT framework is a way of looking at the different ways disinformation is used. If we understand the techniques behind disinformation, we can better identify it and then disregard or respond to it.
| Pillar | Explanation | Real Life Example | How it Appears in a Crisis or Cyber Incident |
|---|---|---|---|
| Discrediting | This is about attacking the source rather than the message. Instead of debating facts, it tries to make people distrust experts, organisations, or media. Over time, it erodes confidence so people stop believing anything. It creates confusion where people feel no source can be trusted. | During the COVID-19 pandemic, some groups labelled scientists and organisations like the World Health Organization as corrupt or lying. | Attackers or commentators may claim the organisation is ‘lying’ about the true impact of an incident, hiding the truth or failing to be honest about its impact on consumers: ‘putting profit before safety’. This undermines official updates and makes stakeholder communication much harder to manage. |
| Emotional Manipulation | This uses strong emotions like fear, anger, or outrage to get people to react quickly. When emotions are high, people are less likely to stop and think. It is designed to trigger sharing before checking. The stronger the emotion, the faster the content spreads. It taps into deeply personal concerns where people feel they must act quickly. | Ongoing debates in the USA linking childhood vaccines to autism often rely on highly emotional stories and messaging. Despite extensive scientific evidence showing no link, emotionally charged narratives continue to circulate and influence behaviour. No parent wants to put their child at risk, so this is a very persuasive message. | During an incident, similar tactics can exaggerate harm, for example claims that ‘everyone’s personal data is exposed’ or ‘systems are completely unsafe’. This creates fear and panic, driving rapid sharing before facts are confirmed. |
| Polarisation | This creates an ‘us versus them’ divide. It pushes people into groups and makes disagreement feel like conflict. The more divided people are, the easier it is to influence and mobilise them. It turns complex issues into identity battles. | Divisive messaging during the 2016 United States Presidential Election framed issues as ‘patriots vs traitors’. | Incidents can become politicised or framed as ‘company vs customers’ or ‘management vs staff’. This damages trust internally and externally and can escalate reputational harm. |
| Impersonation | This involves pretending to be a trusted person or organisation. It could be fake accounts, copied branding, or even deepfakes. The aim is to exploit trust so that false information looks credible. It becomes harder for people to tell what is real. | Fake accounts posing as BBC News have spread false breaking news. | Attackers may send phishing emails pretending to be the organisation during the incident, or create fake statements from executives. This can worsen the attack and confuse response efforts. |
| Conspiracy | This builds a hidden story about secret groups controlling events. These narratives are hard to disprove because they explain away any opposing evidence. They often grow stronger in times of uncertainty. They create a closed belief system that resists facts. | The QAnon narrative claimed a secret global network controlled events. | Rumours may emerge that the incident is a cover-up, an insider job, or a deliberate act. This can distract from response activity and damage long-term reputation. |
| Trolling | This is about provoking reactions and creating noise. It uses insults, baiting, and disruption to derail conversations. The goal is not truth, but attention and emotional response. It often overwhelms genuine discussion and pushes people away. | When Elon Musk posted provocative comments about the UK government on X, including suggesting there was a “civil war” in Britain, replies quickly filled with insults, sarcasm, and extreme views. | Social media channels can be flooded with hostile or sarcastic comments during an incident. This overwhelms communications teams and makes it harder to engage with genuine stakeholders. |
Inoculating People Against Disinformation
Academics Stephan Lewandowsky and Sander van der Linden, in their paper ‘Countering Misinformation and Fake News Through Inoculation and Prebunking’ [1], put forward the idea that if people understand how disinformation is constructed and used, they can more easily recognise and disbelieve it. Like a vaccination, this knowledge helps inoculate people against disinformation, so they are less likely to be affected by it. I am inclined to think that young people who have grown up in the world of social media will become, or already are, good at identifying fake news, disinformation and deepfakes. Those most affected are people who did not grow up in the age of social media and who may be more inclined to believe disinformation, having been raised in a world where you could be more trusting of authority, of what was portrayed on the news, and of the validity of what you saw on video.
Conclusion
Understanding disinformation and how it is constructed goes a long way towards identifying it and then not reacting to it. Disinformation works best when it plays into existing prejudices and attitudes and provokes an emotional response. As practitioners, we should share information on recognising disinformation, but I also think we should exercise our response to it. Run an exercise in which your most important product or service is the subject of a fast-moving disinformation campaign, and test how you would react. As this is one of the major risks worldwide, it would be remiss of us not to do something to mitigate it for our organisations.
References
[1] Lewandowsky, S. and van der Linden, S. (2021) ‘Countering Misinformation and Fake News Through Inoculation and Prebunking’, European Review of Social Psychology, 32(2), pp. 348–384.



