Lesson 3: They’ll Use AI to Engineer Division. We Must Use It to Strengthen Solidarity.
AI vs. Autocracy: Seven Lessons in the New Battle for Democracy
In fear-based systems, division is the fuel.
Autocrats don’t need to be loved.
They need people to distrust each other enough to give up on cooperation.
Now, with AI — especially large language models (LLMs) trained to mimic human tone, logic, and cadence — they can manufacture division at scale, in real time, and with increasingly surgical precision.
This isn’t ideological.
It’s psychological.
And it’s already happening.
What the Autocrats Will Do
AI tools allow regimes and bad actors to:
Generate disinformation campaigns aimed at turning allies against one another
Inject content that appears to come from trusted voices, but is fake
Flood community threads with identity-based attacks to provoke fracture
Deepfake or impersonate opposition figures to break coalitions from within
And we’re not speculating.
In the past few days, news broke that a fake Signal account impersonating Secretary of State and Acting National Security Adviser Marco Rubio had contacted at least two members of Congress and a foreign leader.
The messages were convincing. They were written in Rubio’s voice.
The objective? Not propaganda — division.
To undermine diplomacy. To breed confusion. To test how easily trust between allies could be fractured with AI.
That wasn’t a prank. That was a trial run.
And if the Secretary of State can be spoofed that easily, how do you know the person in your coalition group chat is really who they say they are?
What Democracy Must Do
We don’t fix this by ignoring it.
We fight division with intentional solidarity — built with trust, tools, and vigilance.
That means:
Using AI to detect and map coordinated disinformation campaigns (a minimal sketch of one detection signal follows this list)
Building alert systems to warn movement leaders when impersonation or identity-wedge tactics are being deployed
Training grassroots and national coalitions to expect this kind of infiltration — and remain anchored anyway
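What does “using AI to detect and map coordinated campaigns” actually involve? One of the simplest signals is many accounts pushing near-identical text in a short window. The sketch below is a minimal Python illustration of that one idea, using only the standard library; the accounts, messages, and similarity threshold are hypothetical, and real detection systems layer on timing, account history, and network structure.

```python
# Minimal sketch: flag pairs of near-duplicate messages posted by
# different accounts, a common fingerprint of coordinated campaigns.
# Accounts, messages, and the threshold below are illustrative only.
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical sample data: (account, message_text)
messages = [
    ("acct_a", "Leader X secretly works for the other side. Pass it on."),
    ("acct_b", "Leader X is secretly working for the other side, pass it on!"),
    ("acct_c", "Great turnout at the rally yesterday, thanks everyone."),
    ("acct_d", "leader x secretly works for the other side. pass it on"),
]

def similarity(a: str, b: str) -> float:
    """Rough text similarity in [0, 1] via difflib's sequence matching."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

THRESHOLD = 0.85  # illustrative cutoff for "near-duplicate"

# Compare every pair of messages from different accounts and keep the
# pairs whose text is suspiciously similar.
flagged = [
    (u1, u2, round(similarity(m1, m2), 2))
    for (u1, m1), (u2, m2) in combinations(messages, 2)
    if u1 != u2 and similarity(m1, m2) >= THRESHOLD
]

for u1, u2, score in flagged:
    print(f"Possible coordination: {u1} and {u2} posted near-identical text (similarity {score})")
```

The point of the sketch is the habit it encodes: judge patterns across accounts, not single posts in isolation, which is the same instinct the “watch for patterns” advice below asks of human organizers.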
The next wedge won’t come from the outside.
It will look like it came from inside your own team.
Three Things You Can Do Today
1. Watch for patterns, not just posts.
If a message triggers infighting, ask: Who benefits from this fracture? Then check the source.
2. Build real relationships offline.
Digital trust can be gamed. Personal trust takes longer and holds stronger.
3. Establish comms norms before crisis hits.
Agree in advance: how do we confirm messages? Who speaks for the group? What’s the backup channel? (One simple way to confirm messages is sketched below.)
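To make “agree in advance how we confirm messages” concrete, here is a minimal sketch of one technical backstop: each sensitive message carries a short authentication tag computed from a passphrase the group agreed on in person, offline. The passphrase and messages below are placeholders; this illustrates the principle of pre-agreed, out-of-band verification, not a specific tool recommendation or a replacement for secure messaging apps.

```python
# Minimal sketch: authenticate group messages with a short HMAC tag
# derived from a passphrase agreed offline, in person.
# The passphrase and messages are placeholders, not recommendations.
import hashlib
import hmac

SHARED_PASSPHRASE = b"agreed-in-person-at-the-last-meeting"  # hypothetical

def tag_message(text: str) -> str:
    """Return a short hex tag the sender appends to an outgoing message."""
    return hmac.new(SHARED_PASSPHRASE, text.encode(), hashlib.sha256).hexdigest()[:12]

def verify_message(text: str, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(tag_message(text), tag)

# Sender side: append the tag to the message before sending.
outgoing = "Move the meeting to 7pm at the usual place."
print(outgoing, "| tag:", tag_message(outgoing))

# Receiver side: a message that merely *sounds* like a teammate but
# carries no valid tag fails verification.
print(verify_message(outgoing, tag_message(outgoing)))                 # True
print(verify_message("Cancel everything, trust me.", "deadbeef0000"))  # False
```

Even a low-tech version of the same idea, such as a rotating code word confirmed by phone, delivers the core benefit: a message that sounds exactly like your teammate but cannot produce the agreed proof gets treated as suspect.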
Bottom Line
Authoritarians will use AI to simulate solidarity and sabotage it.
We must use both AI tools and human strategy to reinforce what is real.
Solidarity isn’t soft.
It’s a security system.
And in 2025, division isn’t just a risk — it’s a weapon.
Next Up:
Lesson 4 — They’ll Use AI to Erode Reality. We Must Use It to Anchor Shared Truth.
Subscribe: Searching for Hope
Follow: @trygveolson.bsky.social