I spent twenty-six years as an employee.
Four years short of a full pension.
That’s not a résumé line.
That’s a lifetime of early mornings, long shifts, responsibility, and trust built one day at a time.
I wasn’t new.
I wasn’t careless.
I wasn’t trying to bend rules or disappear into the system.
I was known for something specific.
I was a safety advocate.
I spoke up when equipment, processes, or decisions put people at risk.
I stood up for employee rights—not to stir trouble, but because keeping people safe and informed is how work actually gets done.
That context matters, because it tells you who was standing there when things went wrong.
When I needed medical leave, I did what any reasonable person would do.
I communicated the need.
I acted in good faith.
I trusted that the system I had given most of my working life to would respond with the same basic fairness.
What I did not know—because no one told me—was that I had specific legal protections under the California Family Rights Act.
That absence of information wasn’t loud.
It wasn’t hostile.
It didn’t come with warnings or red flags.
It came as silence.
And silence, I learned, is one of the most powerful tools an institution has.
Later, the court ruled that California Portland Cement Company violated the law by failing to inform an employee of his CFRA medical leave rights. Because it had never communicated those rights, the company could not lawfully terminate him for taking that leave.
The court didn’t say the company was cruel.
It didn’t say they acted with malice.
It said something far more important:
You cannot punish someone for rules they were never told existed.
That ruling wasn’t about emotion.
It wasn’t about sympathy.
It was about structure.
And that’s where this stops being “my story” and becomes a lens on the times we’re living in now.
Today, people are surrounded by systems that assume understanding without earning it.
HR platforms that replace conversations with portals.
Compliance frameworks that say “policy available upon request.”
Automated systems that make decisions faster than a human can ask a question.
The language is always the same:
“You should have known.”
“It’s in the documentation.”
“You agreed when you clicked.”
But here’s the truth most people learn too late:
Systems that don’t explain themselves manufacture failure.
Not because people are incapable.
Because they are uninformed by design.
The burden has quietly shifted.
If you don’t know, it’s your fault.
If you misinterpret, that’s on you.
If you fail, the system shrugs and moves on.
But the law—and life—still recognize something older and sturdier than software:
Information asymmetry creates harm.
Silence always favors the side with power, time, and lawyers.
That lesson didn’t end in a courtroom.
It carried forward.
It’s one of the reasons the Faust Baseline exists at all.
The Baseline isn’t about rules for the sake of rules.
It’s about explainability before consequence.
It starts from a simple moral position that technology keeps trying to outrun:
If a system can affect a human’s livelihood, health, standing, or future,
that system owes the human clarity before it enforces outcomes.
Not afterward.
Not buried in links.
Not written for specialists.
Before.
What happened to me didn’t require bad intent.
It required opacity.
And opacity is everywhere now—especially where automation and AI are concerned.
People are evaluated, filtered, flagged, and guided by systems they don’t understand, using criteria they never see, with consequences they only discover after the fact.
That isn’t intelligence.
It’s acceleration without accountability.
The Faust Baseline is built to interrupt that pattern—not with outrage, not with abstraction, but with discipline.
It insists that systems must be understandable to the people they touch.
That enforcement without explanation is not order—it’s erosion.
That predictability isn’t comfort—it’s fairness.
I learned that lesson the expensive way.
Twenty-six years in.
Four years from retirement.
A career built on safety, trust, and showing up.
The times haven’t changed as much as we think.
Only the interfaces have.
And that’s why this history still matters.
Not as a grievance.
Not as nostalgia.
But as a reminder:
When systems stop explaining themselves, people pay the price.
And when clarity disappears, trust goes with it.
The Baseline exists to keep that from becoming normal.
The Faust Baseline™ Purchasing Page – Intelligent People Assume Nothing
micvicfaust@intelligent-people.org
Unauthorized commercial use prohibited.
© 2026 The Faust Baseline LLC