The decree came with little warning.
National Stability Ordinance 02.
Effective immediately, all public institutions will implement Emotion Regulation Surveillance (ERS) systems.
Coded phrases followed:
“To prevent emotional extremism.”
“To protect collective psychological safety.”
“To identify instability before it manifests.”
But the truth was simpler:
They were building a machine to detect feelings.
And punish them.
The first implementation hit schools.
Facial recognition cameras in classrooms.
Emotion-scoring algorithms trained to spot "instability":
Crying? Suspicious.
Too quiet? Watchlisted.
Too angry? Counseling order.
At first, it was framed as mental health support.
But reports leaked.
Students pulled from class for "emotional volatility."
Workers flagged during interviews.
Writers shadowbanned for "language risk markers."
Moneytory stared at the headlines, rage crawling up his throat.
“They weaponized empathy tech.”
Haejin nodded, face cold.
“They took your blueprint…
and inverted it.”
He got the invitation in a silver envelope.
No stamp.
Just a sigil—stylized eye over a heartbeat line.
The Ministry of Emotional Security.
They didn’t ask him to speak.
They asked him to advise.
At the glass tower headquarters, they smiled.
“You were right. Emotions matter.”
“But left unchecked… they become chaos.”
“We’re simply… containing the fire you lit.”
Moneytory leaned in.
“You’re not containing it.
You’re extinguishing it.”
The lead advisor didn’t flinch.
“Containment is peace.”
They showed him the CORE AI:
a system built to assign “Emotional Trust Scores” to citizens.
It monitored posts. Eye movements. Word choice.
Even silence.
“We’re only targeting extremes,” they insisted.
“So did every tyrant who thought they were the hero.”
Back in the ruins of his old lab,
Moneytory met with the last of his students.
A programmer.
A poet.
A former therapist turned data analyst.
“They’re building a cage shaped like safety,” he told them.
“We’re going to short-circuit it.”
Haejin returned from her own mission—
with a drive full of leaked training data.
AI bias.
Suppressed audits.
Emotion suppression thresholds built without clinical input.
They had proof.
But proof wasn’t enough.
They needed interruption.
They hijacked the government’s national empathy forum.
A place once used for “emotional unity exercises.”
He appeared on every screen.
Not as a hero.
Not as a ghost.
As a man.
“I built tools to help us understand each other.
Not to filter each other into silence.”
“Empathy isn’t a net to catch dissidents.
It’s the bridge between difference.”
“You can’t criminalize grief.
Or mandate serenity.
You can only earn trust—
and you’ve failed.”
The system crashed for four minutes.
Just long enough.
The Ministry doubled down.
Sweeps increased.
Surveillance sharpened.
But something else grew in the cracks:
Protest poems encoded into digital art.
Songs passed through encrypted streams.
People writing again—not just with logic, but with love.
And on a wall in Daegu:
A new mural.
Moneytory’s silhouette.
Below it, one phrase:
“You cannot fix what you cannot feel.”
To be continued…