Research · ALC Theory · Pedagogy

You Need Both a Vaccine and a Relationship

Inoculation theory teaches resistance to manipulation. Domestication theory explains how people negotiate with technology. Neither talks to the other. ALC bridges them into a two-phase pedagogy that actually works.

10 min read · Based on Roozenbeek & van der Linden (2021) and Hynes & Richardson (2009)

Two Brilliant Ideas That Don't Know Each Other Exist

In psychology, there's a powerful idea: you can vaccinate people against manipulation. Show them weakened versions of persuasive tactics before they encounter the real thing, and they develop resistance. It's called inoculation theory, and it works. Across demographics. Across political lines. Effects last three months or more with booster shots.

In sociology, there's an equally powerful idea: people don't just adopt technology — they domesticate it. They negotiate with it. They find a place for it in their lives, their routines, their values. It's called domestication theory, and it explains why the same technology gets used completely differently by different households.

Both ideas are brilliant. Both have decades of evidence. And neither one talks to the other.

This is a problem, because teaching people to navigate algorithmic systems requires both. A vaccine without a relationship gives you initial defense but no ongoing negotiation. A relationship without a vaccine gives you familiarity but no protection. You need both — and the bridge between them is communication theory.

The Vaccine: Inoculation Theory

William McGuire proposed inoculation theory in 1961, borrowing directly from immunology. The logic: if you expose someone to a weakened form of a persuasive attack before they encounter the real thing, they build cognitive antibodies.

For decades, this remained a niche theory. Then Roozenbeek and van der Linden at Cambridge made the scalability breakthrough: instead of inoculating against specific claims (“climate change is a hoax”), you inoculate against techniques (emotional manipulation, false dichotomies, scapegoating, conspiracy reasoning). Teach someone to recognize the method, and they can spot it regardless of the topic.

Five Manipulation Techniques (Roozenbeek & van der Linden)

1. Emotionally manipulative language — fear, outrage, moral panic
2. Incoherence / conspiracy reasoning — connecting unrelated dots
3. False dichotomies — forcing binary choices on complex issues
4. Scapegoating — blaming a group for systemic problems
5. Ad hominem attacks — targeting the person, not the argument
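To make the difference between claim-based and technique-based detection concrete, here is a deliberately crude sketch (my own illustration, not anything from the inoculation literature): a flagger keyed to rhetorical cue phrases rather than to any specific factual claim. The cue lists are invented for the example; real inoculation games build this recognition in humans, not in keyword filters.

```python
# Toy technique-based flagger: it knows nothing about any topic,
# only about rhetorical patterns. That topic-independence is what
# makes technique-level inoculation transfer across subjects.
TECHNIQUE_CUES = {
    "emotional manipulation": ["terrifying", "outrage", "destroying us"],
    "false dichotomy": ["either", "only two choices"],
    "scapegoating": ["they are to blame", "because of them"],
}

def flag_techniques(text: str) -> list[str]:
    """Return the names of techniques whose cue phrases appear in the text."""
    lowered = text.lower()
    return [name for name, cues in TECHNIQUE_CUES.items()
            if any(cue in lowered for cue in cues)]

print(flag_techniques("It's terrifying: either we act now or lose everything."))
# → ['emotional manipulation', 'false dichotomy']
```

Swap in a sentence about any other topic and the same cues fire — the flagger, like an inoculated reader, is looking at the method, not the claim.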

The most powerful finding? Role reversal is the most effective delivery mechanism. Games like Bad News and Harmony Square put players in the role of producing disinformation. You don't learn to spot manipulation by studying it — you learn by doing it. Once you've built a fake news empire, you can see the techniques everywhere.

This is enormously powerful. It's also incomplete.

What the Vaccine Misses

Inoculation theory treats algorithmic influence like a message you can fact-check. But algorithms aren't messages — they're environments. You can't inoculate someone against their entire information ecosystem being curated. The filter bubble isn't a single lie you can prebunk. It's the air you breathe.

Inoculation gives you a one-time defense. But algorithmic systems change. TikTok's recommendation engine today isn't what it was six months ago. Instagram's ranking signals shift quarterly. Your vaccine expires — and there's no booster program for algorithmic fluency.

What you need after the vaccine is an ongoing relationship with the technology. Not just resistance, but negotiation. Not just defense, but dialogue.

The Relationship: Domestication Theory

Roger Silverstone developed domestication theory in the 1990s to explain something the Technology Acceptance Model (TAM) couldn't: why the same technology gets incorporated into life so differently by different people.

TAM says adoption is about perceived usefulness and ease of use. Domestication theory says adoption is about cultural negotiation — fitting technology into your values, routines, identity, and household dynamics. It identifies four phases:

1. Appropriation

Acquisition. The technology crosses from market commodity to personal possession. You download the app. You create the account.

2. Objectification

Placement. Where does this technology fit in your life? Your home screen. Your morning routine. Your sense of who you are.

3. Incorporation

Active use. The technology becomes part of your temporal routines, your tasks, your social practices. You check it without thinking.

4. Conversion

Public display. Your private use becomes part of your public identity. You reference it in conversation. You recommend it. You define yourself through it.

Crucially, domestication theory says these phases are never complete. Technology must be continuously negotiated. The relationship is always being renegotiated, always requiring effort — especially when the technology itself is changing.

But here's what domestication theory misses in the algorithmic age: algorithms resist domestication. Simpson et al. (2022) showed this empirically: LGBTQ+ TikTok users could never fully tame their For You Page. They'd train it, enjoy a brief window of alignment, then watch it drift back toward majority-patterned content. Domestication assumes the technology is relatively stable once you negotiate with it. Algorithms are constantly re-negotiating back.

The Missing Fifth Phase: Reflexive Maintenance

Silverstone's four phases assume a technology that, once domesticated, stays roughly where you put it. Your TV doesn't rearrange your living room. Your telephone doesn't change who it connects you to.

Algorithms do. They update, they retrain, they shift their optimization targets. The technology you domesticated last month isn't quite the same technology this month. This means algorithmic domestication requires something beyond Silverstone's framework: a fifth phase of reflexive maintenance.

Reflexive maintenance is the ongoing, conscious process of re-domesticating algorithmic systems as they change. It's not a one-time negotiation — it's a standing dialogue. It requires recognizing when the algorithm has shifted, understanding how it shifted, and communicating your preferences anew. This is a communicative competence, not just a technical skill.

And this is where the two theories need each other.

The Bridge: Communication Theory

Neither inoculation theory nor domestication theory uses communication theory. Inoculation comes from social psychology — it treats persuasion as stimulus→response. Domestication comes from media sociology — it treats technology adoption as cultural negotiation. Both are right. Neither is complete.

Application Layer Communication bridges them by recognizing that both inoculation and domestication are fundamentally communicative processes. Resisting manipulation is a communicative skill (recognizing persuasive techniques in the “language” algorithms speak). Domesticating technology is a communicative practice (negotiating meaning and use through ongoing interaction). The shared foundation is communicative competence at the application layer.

Two-Phase ALC Pedagogy

đź’‰

Phase 1: Communicative Inoculation

Initial defense through role-reversal learning. Don't study algorithms —be the algorithm.

  • • Technique-based, not platform-specific
  • • Active role-playing exercises
  • • Builds pattern recognition
  • • Time-limited but transferable
🤝 Phase 2: Communicative Domestication

Ongoing negotiation through reflexive practice. Not “taming” the algorithm — maintaining dialogue with it.

  • Context-specific, ongoing
  • Reflexive maintenance habits
  • Builds communicative fluency
  • Adaptive to system changes

What This Looks Like in Practice

The role-reversal principle from inoculation research suggests a specific pedagogical design for Phase 1: be the algorithm designer.

Exercise: Recommendation Design

“You're TikTok's algorithm. You have 50 videos in the queue. Your user just watched three cat videos and one political rant. What do you show next? Why? What are you optimizing for?”

Exercise: Ethical Algorithm Design

“You're Spotify's recommendation engine. This user listens to sad music at 11pm every night. Do you keep recommending sad music? Intervene with upbeat tracks? Who decides — you or the user? What are the ethical stakes?”

Exercise: Stratification by Design

“You're LinkedIn's feed algorithm. Two posts were published at the same time: one from a CEO with 50K followers, one from a recent graduate with 200. Both are equally insightful. Who gets amplified? Why? What does this mean for the graduate?”

Once someone has designed recommendation systems — even simple ones — they develop what I call algorithmic communicative fluency. Not just knowing that algorithms exist, but being able to think in the language of algorithmic decision-making. Once you can think like the algorithm, you can negotiate with it, resist it, or route around it.
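A “be the algorithm” exercise can be made concrete in a few lines of code. This is a minimal sketch of my own (not any platform's actual ranking logic): a scorer that weighs topic affinity from watch history against raw popularity. The function names and weights are invented for the exercise.

```python
from collections import Counter

def rank_queue(queue, watch_history, w_match=2.0, w_popularity=0.001):
    """Rank queued videos by affinity to the user's recent topics plus a
    popularity term. The weights ARE the editorial choices: raise w_match
    and the feed narrows; raise w_popularity and it homogenizes."""
    affinity = Counter(v["topic"] for v in watch_history)
    def score(video):
        return w_match * affinity[video["topic"]] + w_popularity * video["views"]
    return sorted(queue, key=score, reverse=True)

# The scenario from the TikTok exercise: three cat videos, one political rant.
history = [{"topic": "cats"}] * 3 + [{"topic": "politics"}]
queue = [
    {"id": 1, "topic": "cats", "views": 10_000},
    {"id": 2, "topic": "politics", "views": 900_000},
    {"id": 3, "topic": "cooking", "views": 50_000},
]
print([v["id"] for v in rank_queue(queue, history)])  # → [2, 3, 1]
```

Notice what these particular weights do: the viral political rant outranks everything, and even an unrelated cooking video beats the user's beloved cats. That surprise — realizing the optimization target quietly overrode the user's revealed preference — is exactly what the exercise is designed to produce.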

Phase 2 then builds on this foundation with ongoing practice: noticing when your feed shifts, testing what signals the algorithm responds to, developing and revising your folk theories, sharing observations with communities. This is reflexive maintenance — the communicative habit of staying in dialogue with the systems that shape your information environment.
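“Noticing when your feed shifts” can itself be practiced as a small exercise. A hypothetical sketch (my own, not a tool from the source): log the topics in your feed each week and compare the two distributions with total variation distance, where 0 means identical and 1 means completely disjoint.

```python
from collections import Counter

def topic_distribution(feed):
    """Convert a list of topic labels into a probability distribution."""
    counts = Counter(feed)
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()}

def feed_drift(last_week, this_week):
    """Total variation distance between two weekly topic distributions.
    A rising value is a cue to re-open the negotiation with the algorithm."""
    p, q = topic_distribution(last_week), topic_distribution(this_week)
    topics = set(p) | set(q)
    return 0.5 * sum(abs(p.get(t, 0) - q.get(t, 0)) for t in topics)

before = ["queer art", "queer art", "music", "news"]
after = ["news", "news", "news", "music"]
print(round(feed_drift(before, after), 2))  # → 0.5
```

The drift Simpson et al. observed — a feed trained toward minority-community content sliding back toward majority-patterned content — would show up here as a steadily climbing number rather than a vague feeling that “the feed got worse.”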

Why Existing Approaches Fall Short

Most AI literacy programs today do one of two things:

âś•

Knowledge transfer

“Here's how recommendation algorithms work.” Passive. Decays quickly. Doesn't build fluency.

âś•

Tool training

“Here's how to use ChatGPT effectively.” Platform-specific. Breaks when the tool changes. Teaches button-pressing, not thinking.

Neither builds communicative competence. Neither gives people the ability to navigate systems they haven't seen yet, or to adapt when familiar systems change. The inoculation-domestication bridge offers something different:

âś“

Technique-based inoculation

Learn the methods of algorithmic influence, not the specifics of any one platform. Transfers across systems. Builds pattern recognition.

âś“

Communicative domestication

Develop ongoing negotiation practices, not one-time knowledge. Builds adaptive fluency. Includes reflexive maintenance for changing systems.

The Organizational Angle

This isn't just an education framework — it's a workforce development model. Organizations deploying AI tools face exactly this problem: employees get a one-time training session (the vaccine, poorly administered) and then are expected to navigate evolving AI systems indefinitely (the relationship, with no support).

The result is what we've documented as the shadow literacy gap: 92% adoption, 36% training. Employees are using tools they were never equipped to negotiate with. Some develop fluency through trial and error. Most don't. The gap stratifies the workforce.

What Two-Phase ALC Training Looks Like

Phase 1 (Inoculation): Role-reversal workshops where employees design AI recommendation systems, write prompts that manipulate outputs, and experience algorithmic decision-making from the inside. Half-day session. Builds immediate pattern recognition.

Phase 2 (Domestication): Ongoing coaching structures — regular check-ins where teams share folk theories about how their AI tools work, test those theories together, and adapt their practices as tools evolve. Monthly cadence. Builds lasting fluency.

The Missing Piece Neither Theory Provides

Despite their power, both inoculation and domestication theory share a blind spot: neither uses communication theory.

Inoculation treats persuasion as stimulus→response (social psychology). Domestication treats adoption as cultural negotiation (media sociology). Both are describing communicative processes without the theoretical language to say so.

This matters because without communication theory, you can't explain why some people develop algorithmic fluency and others don't. It's not just about exposure (inoculation) or cultural context (domestication) — it's about communicative competence. The ability to read algorithmic signals, compose effective inputs, negotiate meaning through interaction, and adapt your communicative strategy as the system evolves.

That's what ALC provides. Not as a replacement for inoculation or domestication, but as the communicative foundation that makes both legible as what they are: two phases of developing fluency in the most consequential communication medium of our time.

Sources: Roozenbeek, J. & van der Linden, S. (2021). “Inoculation Theory and Misinformation.” NATO StratCom Centre of Excellence. Hynes, D. & Richardson, H. (2009). “What Use is Domestication Theory to Information Systems Research?” IGI Global. Silverstone, R. & Hirsch, E. (1992). Consuming Technologies. Simpson, E., Hamann, A. & Semaan, B. (2022). “How to Tame 'Your' Algorithm.” PACM HCI (GROUP 2022). McGuire, W.J. (1961). The effectiveness of supportive and refutational defenses. Sociometry, 24(2).

Ready for Two-Phase ALC Training?

We design inoculation workshops and domestication coaching programs tailored to your team's AI tools and workflows.

Get the free ALC Framework Guide

The same framework we use in our audits — yours free. Learn how to identify application layer literacy gaps in your organization.
