<div class="brand-block">
<img src="https://eudem.alphabetformation.org/wp-content/uploads/2024/11/cropped-EU-DEM-LOGO-e1731073848663-1.png"
alt="EU DEM project logo"
class="brand-logo">
<img src="https://eudem.alphabetformation.org/wp-content/uploads/2024/11/EN_Co-fundedbytheEU_RGB_NEG-1024x228.png"
alt="Co-funded by the European Union emblem"
class="eu-logo">
<p class="brand-disclaimer">
This interactive online course is part of the <strong>EU DEM – EU Fact Checker Network</strong> project,
co-funded by the European Union.
</p>
<h4>Module 1 · The Decision Lab</h4>
<p>
Welcome to <strong>Truth vs. Lies: The Decision Lab</strong>.
</p>
<p>
Every day, we navigate an information ecosystem shaped by speed, emotion, and constant distraction.
Today, you will explore how misinformation spreads — and how your decisions influence that process.
</p>
<p>
A headline appears on your feed:
</p>
<p>
<strong>“Major Public Figure Exposed in International Scandal — Experts ‘Shocked’!”</strong>
</p>
<p>
Without context, without confirmation, without sources.
</p>
<p>
Before acting, ask yourself:<br>
<em>What shapes your first reaction? Emotion? Curiosity? Outrage? Habit?</em>
</p>
<p>How do you respond?</p>
[[Engage immediately with the post]]
[[Pause and analyse its credibility]]
[[Save it for later without interacting]]
You click, react, or share — a split-second decision.
This choice is common: the comic (page 9) illustrates how millions unintentionally amplify false content simply because reacting is easier than reflecting.
Your engagement boosts the visibility of the post. The platform interprets this as “relevance” and pushes it to more users.
Within minutes, the story mutates, splintering into dozens of comments, interpretations, and emotional reactions.
You feel a slight uncertainty. Something doesn’t seem entirely credible.
What triggered your engagement?
[[It felt urgent or alarming]]
[[Everyone else seemed to be reacting]]
[[I assumed a headline that confident must be true]]
[[Return to Start]]
You resist the impulse to react — an increasingly rare act in fast-paced digital environments.
You begin by dissecting the post:
• Who published it?
• Is the language neutral or exaggerated?
• Does the image match the claim?
• Are there reliable sources?
This careful pause aligns with the comic’s final teaching (pages 10–11):
*Misinformation weakens when we slow down and verify.*
What’s your first verification step?
[[Investigate the source’s identity]]
[[Examine the image via reverse search]]
[[Check whether independent outlets report the same story]]
You choose not to act immediately — a neutral but disengaged stance.
Hours later, the post resurfaces in your mind. But the context has shifted:
comments have intensified, interpretations multiplied, and emotional bias has deepened.
This delay reveals a key vulnerability discussed in the comic (pages 7–8):
*Our memory fills gaps with assumptions, making misinformation feel more plausible even without evidence.*
Faced with this uncertainty, what do you do?
[[Investigate it now with a clear mind]]
[[Decide it’s not worth checking — information overload]]
[[Return to Start]]
Sensationalism is crafted to create urgency — a psychological shortcut that bypasses critical thinking.
The comic (page 7) highlights how emotion-driven content manipulates instinctive reactions.
Ask yourself:
Why did the urgency override your caution?
[[Because the post suggested risk or danger]]
[[Because dramatic headlines feel credible]]
[[Return to Start]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[No – crowd behaviour influenced me]]
[[Maybe – it still looked plausible]]
[[Return to Start]]
[[Continue to Module 2: The Anatomy of Manipulation]]
You trusted the post because the headline sounded confident and authoritative.
This is a common cognitive shortcut:
when something is written with certainty, our brains assume it must be true — even without evidence.
Misleading headlines intentionally use:
• decisive language
• absolute claims
• certainty framing (“shocking truth”, “experts agree”)
• emotional tone
These features bypass careful reading.
What now?
[[Because the post suggested risk or danger]]
[[Because dramatic headlines feel credible]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
[[Start]]
You examine the account behind the headline.
No clear author. No verifiable background.
Just a logo, generic name, and a series of posts built around outrage and emotional cues.
This mirrors a key distinction from the comic (page 5):
*Not everything sensational or incorrect is “fake news”, but anonymity + emotional framing is a warning sign.*
You now understand the post is suspicious — but you need more evidence.
Your next step?
[[Analyse the accompanying image]]
[[Search for independent reporting]]
[[Return to Start]]
You run a reverse search.
Result:
The image used in the post originally comes from an unrelated event in **2012**, repurposed to provoke shock.
This technique — using real images in misleading contexts — is reflected in comic pages 7–8, showing how emotional engineering works.
You’ve uncovered a major inconsistency.
How do you proceed?
[[Look for corroborating news sources]]
[[Reflect on why the image felt persuasive]]
[[Return to Start]]
You look for independent confirmation.
Major, credible sources show **no mention** of the event.
Some reliable fact-checkers have even flagged similar claims as fabricated.
You now see the bigger picture:
The post relies on sensationalism, emotional triggers, and a design that mimics legitimate news to appear trustworthy (page 8).
What insight do you take from this?
[[I understand how easily narratives can be engineered]]
[[I want to explore why people fall for such posts]]
[[Return to Start]]
[[Continue to Module 2: The Anatomy of Manipulation]]
You return to the post after some time — this time with a clearer mind and less emotional noise.
With distance, the headline feels slightly exaggerated.
The phrasing seems dramatic, the punctuation too intense, and the author anonymous.
Delayed reflection is powerful: the comic highlights (pp. 7–8) that stepping away helps break emotional influence and cognitive shortcuts.
What do you do now?
[[Analyse the accompanying image]]
[[Search for independent reporting]]
[[Check whether independent outlets report the same story]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
You decide not to investigate the post.
You’re tired, overwhelmed, and already exposed to too much information today.
This reaction is extremely common — information overload makes critical thinking harder.
The comic highlights this (pages 2–3): even well-intentioned people fall for false content when mentally exhausted.
Yet choosing not to check leaves a gap your mind will fill later with guesses, assumptions, or emotions.
Without verification, misinformation becomes harder to resist.
What do you want to do now?
[[Reflect on why information overload affects judgment]]
[[Reconsider and check the post anyway]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
You take a closer look at the image attached to the post.
At first glance it seems convincing — dramatic lighting, a serious expression, a setting that looks official enough.
But when you slow down, details begin to feel “off.”
Questions start to surface:
• Does the image actually relate to the headline?
• Could it be from a different event or year?
• Is it edited, cropped, or used out of context?
• Does it appear on fact-checking sites?
The comic (pages 7–8) shows how powerful visuals can be when paired with misleading claims — they create instant emotional credibility.
What’s your next move?
[[Run a reverse image search]]
[[Check whether independent outlets report the same story]]
[[Look for corroborating news sources]]
[[Reflect on why the image felt persuasive]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
You decide to look for reporting from independent, credible sources.
As you scan through reputable outlets, you notice something important:
• No major news organisations mention the event.
• Fact-checkers have not confirmed anything similar.
• Related stories appear only on low-credibility blogs or anonymous accounts.
Absence of independent reporting is a strong warning sign.
As the comic shows (pages 6–8), misinformation relies on *isolated*, *emotionally charged*, and *unverified* claims.
This step helps break the illusion of truth created by repetition or dramatic framing.
What do you want to do next?
[[Check whether independent outlets report the same story]]
[[Look for corroborating news sources]]
[[Reflect on why repetition creates false credibility]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
You search for other news outlets reporting the same story.
What you find is revealing:
• Higher-quality outlets do **not** mention the event.
• Smaller blogs repeat the claim using similar phrasing — hinting at a single original source.
• Some posts cite each other rather than real evidence.
• No official statements confirm the situation.
Corroboration is one of the strongest tools against misinformation.
When multiple independent sources cannot verify a story, it is likely unreliable or deliberately fabricated.
This step aligns with the comic’s message (pages 6–7):
**misinformation spreads easily when people fail to compare sources.**
What do you want to do next?
[[Check whether independent outlets report the same story]]
[[Reflect on why the image felt persuasive]]
[[Reflect on why repetition creates false credibility]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
Images feel true — even when the story behind them is false.
What made *this* image feel persuasive?
• It matched the emotion of the headline
• It looked “official” or serious
• It triggered empathy, fear, or curiosity
• Your brain filled in missing details automatically
• You assumed the photo and the claim belonged together
This is a well-known effect:
**pictures create instant credibility**, especially when the viewer is rushed, tired, or overwhelmed.
As the comic (pages 7–8) shows, misinformation often uses *real images* paired with *false claims* to create emotional impact.
Now that you recognise this mechanism, how do you want to continue?
[[Look for corroborating news sources]]
[[Check whether independent outlets report the same story]]
[[Reflect on why repetition creates false credibility]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
You reacted because the post implied that something dangerous or urgent was happening.
Fear is one of the strongest psychological triggers.
When your brain senses risk, it prioritises speed over accuracy.
This is why alarming headlines spread so quickly — they are designed to make you react before verifying.
Fake news creators rely heavily on:
• urgent language
• warnings about safety
• threats to health or security
• “act now before it’s too late” framing
These techniques bypass your critical thinking by activating instinctive responses.
What would you like to explore next?
[[Because dramatic headlines feel credible]]
[[Reflect on why urgency affects judgment]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
Headlines written with confidence feel trustworthy — even when the content is weak.
Dramatic phrasing creates the illusion of authority:
• definitive statements
• absolute claims
• emotionally loaded phrasing
• confident tone (“shocking truth”, “experts confirm”)
Your brain mistakes confidence for credibility — a cognitive shortcut frequently exploited in misinformation.
What now?
[[Reflect on why urgency affects judgment]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
Welcome to **Module 2: The Anatomy of Manipulation**.
In Module 1, you explored how your own reactions shape the spread of misinformation.
Module 2 takes you deeper — into the strategies that make misinformation *powerful*, *emotional*, and *highly shareable*.
Here you will uncover:
• How fake news targets your emotions
• How images and visuals create false credibility
• How narratives are engineered to feel “true”
• Why your brain believes things before verifying them
Where would you like to go next?
[[Explore emotional triggers]]
[[Discover how visuals manipulate us]]
[[Understand narrative engineering]]
[[Return to Start]]
You realise that your reaction was shaped by **crowd behaviour**, not by the credibility of the post itself.
When many people appear to react, share, or comment, your brain interprets the content as:
• relevant
• important
• trustworthy
• socially validated
This is the psychology of *social proof*.
Fake news creators exploit this by using:
• coordinated bot activity
• purchased engagement
• manipulated comment sections
• fake accounts
The appearance of popularity is often engineered.
What would you like to explore next?
[[Maybe – it still looked plausible]]
[[Reflect on how social proof shapes judgment]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
You acknowledge that the post *felt* plausible — even without evidence.
This happens because your brain fills missing information with:
• assumptions
• expectations
• prior beliefs
• emotional impressions
When content aligns with what you already suspect or fear, it feels easier to believe — even when it lacks proof.
Where do you want to continue?
[[Reflect on how social proof shapes judgment]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
Fake news is engineered to hit your emotions first — not your logic.
Fear.
Anger.
Shock.
Curiosity.
Moral outrage.
Each of these reactions creates *fast thinking*, which is exactly what misinformation needs to spread.
Which emotional tactic do you want to explore?
[[Fear-based manipulation]]
[[Anger-based manipulation]]
[[Shock and curiosity bait]]
[[Return to Continue to Module 2: The Anatomy of Manipulation]]
Images convince faster than text.
Fake news exploits this by using:
• out-of-context photos
• AI-generated faces
• staged scenes
• misleading charts
Which aspect do you want to explore?
[[Out-of-context photos]]
[[AI-generated visuals]]
[[Deepfakes]]
[[Return to Continue to Module 2: The Anatomy of Manipulation]]
Fake news isn’t random — it is structured.
Narratives use:
• heroes vs villains
• simplified explanations
• emotionally satisfying arcs
• moral certainty
Where would you like to continue?
[[Why fake narratives feel believable]]
[[How repetition builds “truth”]]
[[Return to Continue to Module 2: The Anatomy of Manipulation]]
Fear bypasses rational thinking.
Messages like:
**“Urgent warning!”**
**“Your safety is at risk!”**
…are designed to provoke instinctive compliance, not critical analysis.
What next?
[[Example of fear tactics]]
[[How to resist emotional hijacking]]
[[Return to Explore emotional triggers]]
Anger spreads faster than any other emotion.
Misinformation uses outrage to:
• divide groups
• mobilise users
• create echo chambers
• weaken trust in institutions
What next?
[[How anger increases virality]]
[[Strategies to slow down during anger]]
[[Return to Explore emotional triggers]]
“YOU WON’T BELIEVE WHAT THEY FOUND!”
Shock + curiosity = instant engagement.
Creators rely on:
• exaggerated punctuation
• emotional language
• partial information
• mystery framing (“hidden truth”)
Next steps:
[[Discover how visuals manipulate us]]
[[Return to Explore emotional triggers]]
[[Continue to Module 2: The Anatomy of Manipulation]]
Fear-based misinformation often uses:
• doctored statistics
• unverified medical claims
• warnings of imminent danger
• dramatic colours and symbols
• authoritative-sounding language
Where to continue?
[[How to resist emotional hijacking]]
[[Return to Explore emotional triggers]]
The key is *interrupting* the emotional reaction.
Try this:
• Write down the emotion you felt
• Take a 5-second pause
• Ask: “Who benefits if I share this?”
• Check at least 2 independent sources
When emotions slow down, logic speeds up.
[[Return to Explore emotional triggers]]
[[Return to Continue to Module 2: The Anatomy of Manipulation]]
[[Explore emotional triggers]]
Studies show that angry posts are shared more often — even when users know the claim might be false.
Why?
Because anger creates:
• urgency
• identity reinforcement
• “in-group vs. out-group” dynamics
Where to go?
[[Strategies to slow down during anger]]
[[Return to Explore emotional triggers]]
When angry:
• Don’t share — save instead
• Re-read later
• Ask: “Is this designed to manipulate me?”
• Look for emotional exaggeration
Anger is predictable — and so is its exploitation.
[[Return to Explore emotional triggers]]
[[Return to Continue to Module 2: The Anatomy of Manipulation]]
A real image + a false caption = extremely persuasive misinformation.
People assume “seeing is believing”, even when the interpretation is fabricated.
Continue with:
[[How to verify images]]
[[AI-generated visuals]]
[[Return to Discover how visuals manipulate us]]
AI can produce realistic images of:
• people who don’t exist
• events that never happened
• documents that were never written
These visuals bypass intuition because they “feel real”.
Continue with:
[[How to spot AI images]]
[[Deepfakes]]
[[Return to Discover how visuals manipulate us]]
Deepfakes combine:
• face mapping
• voice cloning
• video reenactment
They create synthetic realities capable of massive deception.
Next:
[[How to protect yourself from deepfakes]]
[[Return to Discover how visuals manipulate us]]
To verify a photo:
1. Reverse-search it
2. Check when it first appeared
3. Compare multiple versions
4. Look for metadata
5. Identify manipulated elements
Knowing how to investigate visuals is a powerful antidote.
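If you are comfortable with a little code, the short sketch below shows one way to carry out step 4 (looking for metadata). It is only an illustration, assuming Python 3 with the Pillow library installed; the file name is a placeholder, not part of the course material.

```python
# A minimal sketch of step 4 ("look for metadata"), assuming Python 3
# with the Pillow library installed (pip install Pillow).
# "suspicious_post.jpg" is a placeholder file name.
from PIL import Image
from PIL.ExifTags import TAGS

def print_exif(path):
    """Print whatever EXIF metadata the image still carries."""
    exif = Image.open(path).getexif()
    if not exif:
        print("No EXIF metadata found (it may have been stripped).")
        return
    for tag_id, value in exif.items():
        name = TAGS.get(tag_id, tag_id)  # translate numeric tag IDs into readable names
        print(f"{name}: {value}")

print_exif("suspicious_post.jpg")
```

Keep in mind that most social platforms strip metadata on upload, so an empty result is common and is not, by itself, proof of manipulation.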
[[Return to Discover how visuals manipulate us]]
[[Return to Continue to Module 2: The Anatomy of Manipulation]]
[[Discover how visuals manipulate us]]
Look for:
• strange hands or teeth
• unnatural reflections
• inconsistent shadows
• overly smooth skin
• mismatched earrings or hair
AI leaves fingerprints — once you know them.
[[Return to Discover how visuals manipulate us]]
[[Return to Continue to Module 2: The Anatomy of Manipulation]]
Tips:
• Look for unusual blinking patterns
• Compare mouth movement to speech
• Check lighting consistency
• Verify source reliability
• Consult fact-checkers
Awareness reduces vulnerability.
[[Return to Discover how visuals manipulate us]]
[[Return to Continue to Module 2: The Anatomy of Manipulation]]
Narratives stick when they:
• confirm our worldview
• reinforce identity
• offer someone to blame
• feel emotionally coherent
This is engineered manipulation.
[[How repetition builds “truth”]]
[[Return to Understand narrative engineering]]
The Illusory Truth Effect:
“The more you hear something, the more true it feels.”
Misinformation spreads in waves to manufacture belief.
Next:
[[Module 2 Conclusion]]
[[Return to Understand narrative engineering]]
[[Understand narrative engineering]]
You’ve explored the deeper mechanisms behind misinformation:
• emotional triggers
• visual manipulation
• narrative engineering
These strategies shape opinions, reinforce bias, and distort public understanding.
You are now ready for Module 3: **Building Digital Immunity**.
[[Return to Start]]
[[Proceed to Module 3: Building Digital Immunity]]
When your brain is overloaded, it seeks shortcuts.
This means:
• trusting confident-sounding headlines
• relying on memory rather than verification
• avoiding effortful thinking
• acting emotionally rather than analytically
Misinformation thrives in moments of fatigue and distraction.
What next?
[[Reconsider and check the post anyway]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
You give yourself a moment to pause — and choose to verify the post after all.
Distance improves clarity.
With fewer distractions, the headline looks exaggerated and the claim feels less plausible.
Where do you want to begin your late-stage verification?
[[Analyse the accompanying image]]
[[Search for independent reporting]]
[[Check whether independent outlets report the same story]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
Urgency pushes you into “fast thinking mode.”
In this mode:
• emotions override logic
• caution is reduced
• verification feels unnecessary
• instinct dominates over analysis
Misinformation thrives when people feel rushed or pressured.
Understanding how urgency shapes your reactions is a key step toward resisting manipulation.
Where next?
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
Social proof is powerful because it acts as a mental shortcut.
Instead of evaluating a post yourself, your brain asks:
“Do others believe this? Should I?”
This shortcut is efficient — but dangerous when the “crowd” is artificially manufactured.
Understanding this mechanism helps you resist emotionally contagious misinformation.
What now?
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
You decide to verify the image before believing the headline.
A reverse image search reveals something important:
• The photo appears in earlier articles from **2012**, unrelated to the event described
• It has been reused on several blogs with different captions
• None of the original posts mention the claim in the headline
• The image has been cropped in a way that removes key context
This is a classic technique:
**real images + false claims = powerful misinformation.**
Reverse searching exposes the mismatch and weakens the story’s emotional impact — exactly as the comic illustrates (pages 7–8).
What would you like to do next?
[[Check whether independent outlets report the same story]]
[[Look for corroborating news sources]]
[[Reflect on why the image felt persuasive]]
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
Repeated exposure makes information feel familiar — and familiarity feels like truth.
This is called the **Illusory Truth Effect**.
Even if a claim is false, seeing it many times across different platforms or accounts can make it seem convincing.
Disinformation campaigns exploit this by:
• posting the same claim from many fake accounts
• repeating dramatic wording
• recycling images from unrelated events
Recognising this pattern weakens its power.
Where would you like to go next?
[[Continue to Module 2: The Anatomy of Manipulation]]
[[Return to Start]]
Welcome to **Module 3: Building Digital Immunity**.
After analysing your reactions (Module 1)
and uncovering manipulation strategies (Module 2),
you now learn how to **defend yourself** from misinformation.
Digital immunity is built through:
• verification tools
• emotional self-awareness
• critical habits
• cross-checking methods
• slow thinking in fast environments
Where would you like to begin?
[[Learn essential verification tools]]
[[Master the “slow thinking” technique]]
[[Recognise manipulation patterns]]
[[Help others resist misinformation]]
[[Return to Start]]
Verification doesn’t require advanced expertise — just the right habits and a few tools.
Useful tools include:
• reverse image search platforms
• fact-checking websites
• credibility-rating extensions
• source-tracking functions
• cross-platform keyword searches
Verification works because it creates **friction** in a fast environment.
What would you like to explore?
[[How to verify images effectively]]
[[How to verify claims and quotes]]
[[How to verify accounts and sources]]
[[Return to Module 3 – Building Digital Immunity]]
Misinformation thrives on speed.
Slow thinking disrupts that cycle.
Slow thinking means:
• pausing before reacting
• questioning emotional triggers
• checking for source credibility
• postponing judgment until evidence appears
This simple habit dramatically reduces the spread of false content.
Explore:
[[How emotions influence thinking]]
[[How to pause effectively online]]
[[Return to Module 3 – Building Digital Immunity]]
Misinformation relies on predictable patterns:
• emotional exaggeration
• false urgency
• scapegoating
• fabricated authority
• dramatic punctuation
• altered images
• repetition across unreliable accounts
Seeing the pattern breaks its effect.
Continue with:
[[Emotional manipulation tactics]]
[[Narrative manipulation tactics]]
[[Visual manipulation tactics]]
[[Return to Module 3 – Building Digital Immunity]]
One informed person can interrupt the spread of misinformation.
You can help others by:
• asking gentle, non-judgmental questions
• providing verified sources
• sharing fact-checking tools
• explaining how emotional manipulation works
• modelling good verification habits
Change spreads through empathy, not confrontation.
Next:
[[How to talk to someone who shared misinformation]]
[[How to promote verification in your community]]
[[Return to Module 3 – Building Digital Immunity]]
To verify images, you can:
1. Run a reverse image search
2. Compare versions across platforms
3. Check metadata where available
4. Inspect lighting, shadows, reflections
5. Identify cropping or staging
Photos often feel true because they evoke emotion — not because they provide evidence.
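If you want to see what step 2 can look like in practice, here is a minimal sketch that compares two copies of an image using a perceptual hash. It assumes Python 3 with the Pillow and imagehash libraries installed; the file names and the distance threshold are illustrative assumptions, not fixed rules.

```python
# A minimal sketch of step 2 ("compare versions across platforms"),
# assuming Python 3 with Pillow and imagehash installed
# (pip install Pillow imagehash). File names are placeholders.
from PIL import Image
import imagehash

earlier = imagehash.phash(Image.open("earlier_version.jpg"))
viral = imagehash.phash(Image.open("viral_post_version.jpg"))

# Perceptual hashes stay similar after resizing or recompression,
# so a small Hamming distance suggests both files show the same picture.
distance = earlier - viral
print(f"Hash distance: {distance}")
print("Likely the same underlying image" if distance <= 10 else "Probably different images")
```

A small distance tells you the viral copy is probably a recycled version of the earlier photo; it does not tell you which caption, if any, is accurate.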
Continue with:
[[How to verify claims and quotes]]
[[How to verify accounts and sources]]
[[Return to Learn essential verification tools]]
[[Return to Module 3 – Building Digital Immunity]]
To verify text claims:
• Check if the statement appears in reputable outlets
• Look for official statements or data
• Track the original source
• Check if the same wording appears on conspiracy or spam sites
• Compare multiple versions of the claim
Misinformation often spreads by stripping context or inventing attribution.
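For readers who like automation, the sketch below shows one way to cross-check a claim against published fact-checks. It assumes Python 3 with the requests library and an API key for Google's Fact Check Tools claim search service; the endpoint and field names follow the publicly documented v1alpha1 API, but treat them as assumptions and confirm them against the current documentation.

```python
# A minimal sketch of cross-checking a claim against published fact-checks,
# assuming Python 3 with requests and a Google Fact Check Tools API key.
# Endpoint and field names are based on the documented v1alpha1 API; verify them.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder

def search_fact_checks(claim_text):
    resp = requests.get(
        "https://factchecktools.googleapis.com/v1alpha1/claims:search",
        params={"query": claim_text, "languageCode": "en", "key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    for claim in resp.json().get("claims", []):
        for review in claim.get("claimReview", []):
            publisher = review.get("publisher", {}).get("name", "unknown publisher")
            print(f"{publisher}: {review.get('textualRating')} -> {review.get('url')}")

search_fact_checks("Major public figure exposed in international scandal")
```

An empty result does not prove a claim true or false; it simply means no indexed fact-check matched, which is itself a reason to keep digging.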
Next:
[[How to verify accounts and sources]]
[[Return to Learn essential verification tools]]
[[Return to Module 3 – Building Digital Immunity]]
Unreliable accounts often share traits:
• no clear identity
• recent creation date
• repetitive emotional language
• no citations or hyperlinks
• disproportionate posting activity
• anonymous or AI-generated profile photos
Reliable sources show:
• transparency
• consistency
• verifiability
• accountability
Where next?
[[Learn essential verification tools]]
[[Return to Module 3 – Building Digital Immunity]]
[[Proceed to Module 3: Building Digital Immunity]]
[[Learn essential verification tools]]
Your brain prioritises speed when emotions run high.
Fear → urgency
Anger → certainty
Sadness → lowered scepticism
Curiosity → impulsive clicking
Understanding this helps you resist engineered manipulation.
Next:
[[How to pause effectively online]]
[[Return to Master the “slow thinking” technique]]
[[Return to Module 3 – Building Digital Immunity]]
Pausing is a skill.
Try:
• saving the post before reacting
• checking the date, source, and image context
• asking yourself: “What emotion is being triggered?”
• comparing with at least one independent outlet
Even a 5-second pause is enough to break manipulation.
Where next?
[[How emotions influence thinking]]
[[Return to Master the “slow thinking” technique]]
[[Return to Module 3 – Building Digital Immunity]]
[[Master the “slow thinking” technique]]
Emotional manipulation includes:
• outrage traps
• moral shock headlines
• sympathy exploitation
• fear amplification
• “protect your family now!” framing
These tactics override rational thought.
Next:
[[Narrative manipulation tactics]]
[[Visual manipulation tactics]]
[[Return to Recognise manipulation patterns]]
Narratives often include:
• heroes and villains
• simple explanations of complex problems
• confident, absolute statements
• secret or forbidden knowledge
• repeated phrases to create familiarity
These create “truth illusions.”
Continue:
[[Visual manipulation tactics]]
[[Return to Recognise manipulation patterns]]
Visual tactics include:
• AI-generated images
• deepfakes
• out-of-context photos
• misleading charts
• staged or cropped visuals
• emotionally charged imagery
Images bypass reasoning and go straight to emotion.
Where next?
[[Emotional manipulation tactics]]
[[Narrative manipulation tactics]]
[[Return to Recognise manipulation patterns]]
[[Return to Module 3 – Building Digital Immunity]]
[[Module 3 Conclusion]]
[[Recognise manipulation patterns]]
The goal is not to “win” but to build awareness.
Helpful approaches:
• Ask: “Where did this come from?”
• Show how you verified it
• Emphasise shared values (“we both care about truth”)
• Avoid shaming — it triggers defensiveness
Dialogue builds resilience far better than confrontation.
Where next?
[[How to promote verification in your community]]
[[Return to Help others resist misinformation]]
[[Return to Module 3 – Building Digital Immunity]]
Communities become stronger when verification becomes a habit.
Try:
• sharing simple checklist graphics
• teaching youth how to verify images
• organising small awareness sessions
• modelling critical thinking in conversations
• using humour and storytelling like the comic
A culture of verification reduces the influence of misinformation.
Next:
[[Help others resist misinformation]]
[[Return to Module 3 – Building Digital Immunity]]
[[Help others resist misinformation]]
You have completed **Module 3: Building Digital Immunity**.
You now understand:
• how to verify images, claims, and sources
• how to slow down emotional reactions
• how to recognise manipulation techniques
• how to help others resist misinformation
You are now equipped with a full toolkit for navigating digital spaces responsibly — completing the learning arc that began in Module 1 with the comic.
Would you like to continue exploring?
[[Return to Start]]
[[Revisit Module 2]]
[[Revisit Module 1]]
[[Start]]
[[Continue to Module 2: The Anatomy of Manipulation]]