Disinformation From Russian Trolls Hijacks American Politics and Public Opinion
Russian
disinformation used to feel like a Cold War relic, but it never went away. It
just got slicker, louder, and far better funded. And remember when the DOJ was
digging into Rudy Giuliani’s Russia-linked dealings and the FBI had to warn
members of Congress and certain media outlets that they were being manipulated
by Kremlin operators? That moment wasn’t an outlier. It was a snapshot of how
deeply Moscow understands the American mind and how boldly it tries to bend it.
What makes this whole thing surreal is
that the warnings are not subtle. U.S. intelligence agencies have had to tell
powerful people inside our own government that they are being manipulated by
Russian information ops. It is like watching someone walk into an obvious scam,
and you cannot stop them.
I started tracking all of this years ago
while researching my Corey
Pearson – CIA Spymaster Series. The series grew out of one
question: what if we looked at Russian disinformation not as background noise
but as the central threat it really is? The more I dug, the more disturbing it
got.
Take the notorious troll farm in St.
Petersburg that never sleeps. It has turned disinformation into a full-scale
industry. Between early 2016 and mid-2018, the Internet Research Agency burned
through over $35 million crafting fake personas, fake arguments, and fake news
designed to push Americans into real anger. During one six-month stretch in
2018, the operation spent almost as much as it had the previous year. That kind
of budget is not for casual mischief. It is for influence.
And the reach keeps growing. A decade ago
it was basic memes and clumsy sock-puppet accounts. Today it is high-end
deepfake videos, AI-generated audio, and content factories that can crank out
fake news faster than most real newsrooms. We now live in a world where a
convincing video of someone saying something they never said can be made in
minutes. Russia knows exactly how powerful that is, which is why U.S. Cyber
Command is actively identifying individual operatives and letting them know
they are being watched.
This all ties directly into a real-world
example from one of my spy thrillers, Mission of Vengeance,
where a former KGB officer defects and tells Corey Pearson about a Russian
troll operation hiding behind a DVD bootlegging business in the Dominican
Republic. That storyline was fiction, but only barely.
In the novel, Bocharov pushes his phone
across the table and reveals what he knows:
“Putin has rejuvenated the KGB mindset.
The old ‘Active Measures’ strategy is back. They are flooding the Caribbean
with disinformation about U.S. exploitation. Spetsnaz teams are in place. And
the same bloggers and hackers who interfered in your 2016 election are working
out of my estate in the Dominican Republic.”
That scene was pulled straight out of real
Russian playbooks. The old Soviet KGB planted the idea that the CIA killed
Martin Luther King Jr. In the eighties, they spread the rumor that AIDS was a
U.S. bioweapon created at Fort Detrick. Today, instead of whisper campaigns and
forged letters, they use Facebook, YouTube, Instagram, Reddit, and thousands of
coordinated bots.
Bocharov’s estimate in the novel that
“roughly 146 million Americans and Caribbean citizens” are exposed to
Kremlin-backed lies may sound dramatic, but it is not far from reality.
Facebook admitted that as many as 126 million American users saw Russian-generated
content by 2017. Twitter found tens of thousands of Russian bots pushing
political tweets during the 2016 election cycle. And that was before AI
supercharged the strategy.
Why does it matter? Russian
disinformation is not just an election problem. It is a mindset problem. It is
designed to make Americans doubt everything, including each other. That
confusion is the real win for Moscow. When trust collapses, steering public
opinion becomes easy.
You can see the impact most clearly
outside of politics. Public health is a prime example. Years before COVID hit,
Russian troll farms were already pumping out contradictory vaccine content
aimed at Americans. Some posts pushed extreme pro-vaccine arguments, others
pushed anti-vaccine conspiracies, and many played both sides at once. The goal
wasn’t to promote a position. It was to trigger outrage, make people fight, and
weaken public trust in science and medical institutions.
Researchers at George Washington
University and Johns Hopkins discovered that Russian-controlled accounts from
the Internet Research Agency quietly ran thousands of these contradictory
vaccine posts from 2014 to 2017. They weren’t trying to sway anyone toward a
single belief. They were trying to make vaccines themselves feel suspicious and
divisive. By the time COVID arrived, a lot of that groundwork had already taken
hold. Americans weren’t just disagreeing. They no longer trusted information
sources that had once been reliable.
That is the power of Russian
disinformation. It isn’t about pushing one lie. It is about eroding the shared
reality that keeps a society grounded. Once people start doubting institutions,
experts, and even basic facts, they become easier to manipulate, easier to
provoke, and easier to fracture. That is the true battlefield, and it goes far
beyond elections.
That is exactly why Corey Pearson keeps
running into these operations in the Corey Pearson – CIA Spymaster Series.
The threat is not a relic. It is a pressure point that keeps shaping American
politics and public opinion, and it is only getting more advanced.
Robert Morton is a member of the Association of Former Intelligence Officers (AFIO) and writes about the U.S. Intelligence Community (IC). He also writes the Corey Pearson – CIA Spymaster Series, which blends his knowledge of real-life intelligence operations with gripping fictional storytelling. His thrillers reveal the shadowy world of covert missions and betrayal with striking realism.