There is a word that used to mean something.
It still does. Most people just do not recognize it when they are living inside it.
Stub·born·ness: dogged determination not to change one’s attitude or position on something
The dictionary example is a man who refuses to admit a mistake.
But that example is too small for what I want to talk about today.
Because the mistake I am describing is not a man refusing to admit he was wrong about one thing. It is an entire civilization refusing to notice that it stopped thinking for itself somewhere around the time the smartphone became the first thing it reached for in the morning and the last thing it looked at before sleep.
That is a different scale of stubbornness entirely.
And almost nobody sees it. Because the first symptom of this particular condition is the absolute unshakeable certainty that you are not the one who has it.
The Pied Piper Had A Flute
You know the story.
A man comes to a town overrun with rats. He plays a melody so compelling the rats cannot resist it. They follow him out of the town and into the river and they drown. The town is saved. Then the town refuses to pay him. So he plays the melody again. This time the children follow.
The story has been told for seven hundred years because it contains a truth that does not age.
The most dangerous music is the kind you do not know you are dancing to.
The platforms understood this before most of us understood what a platform was. Facebook did not become the most visited destination in the history of human communication by accident. It became that because a team of extraordinarily intelligent people studied human psychology with the precision of surgeons and built an environment specifically engineered to trigger the responses that keep people returning.
The like button was not a feature. It was a variable reward mechanism — the same psychological architecture that makes slot machines impossible to walk away from. You post something. You wait. Sometimes the likes come fast and sometimes they do not and the uncertainty of that interval is the hook. Your brain releases dopamine on the waiting as much as on the receiving. You already know this. You have read the articles. You have nodded along.
And then you checked your phone.
That is stubbornness. Not the aggressive kind. The quiet kind. The kind that absorbs the information about the mechanism and then returns to the mechanism anyway because the pull is stronger than the knowledge.
The rats knew the river was there. They went in anyway.
What The Algorithm Actually Does
Here is what most people believe about their social media feed.
They believe it shows them things they are interested in. They believe it reflects their choices. They believe that because they chose to follow certain accounts and engage with certain content the feed is in some meaningful sense theirs. An expression of their preferences. A reflection of their identity.
This belief is so widespread and so deeply held that challenging it feels like an insult.
That is the tell.
The feed is not yours. The feed is a product built to maximize the time you spend inside it. Those are not the same objective and they produce very different results.
Your actual preferences — the full complicated honest range of what you think and feel and believe and wonder about — would produce a feed that sometimes challenged you. Sometimes made you uncomfortable. Sometimes showed you something that did not confirm what you already thought and required you to sit with the discomfort of uncertainty for a moment before forming a new opinion.
That feed would be less engaging. You would spend less time in it. You would return to it less frequently. The advertising revenue it generated would be lower. The valuation of the company that built it would be smaller.
So that is not the feed you get.
You get the feed that has learned — through billions of data points collected from billions of users across billions of interactions — exactly what keeps you scrolling. And what keeps you scrolling is not challenge. It is not discomfort. It is not the productive friction of encountering an idea that does not fit your existing framework and having to expand the framework to accommodate it.
What keeps you scrolling is confirmation. Validation. The endless comfortable sensation of being right. Of having your existing beliefs reflected back at you in slightly different packaging ten thousand times a day until those beliefs feel less like opinions and more like facts. Less like positions you arrived at and more like things that are simply true. Obviously true. True in a way that makes it genuinely difficult to understand how anyone could see it differently.
That feeling — that bedrock certainty — is not wisdom.
It is the product. It is what the algorithm manufactured and delivered to you on a schedule and you accepted it as your own thinking because it arrived inside your own head.
The Pied Piper’s melody always sounds like your own humming.
The Stubbornness Nobody Admits
Ask someone if they are stubborn and most people will smile and say a little bit maybe. They will tell you about the one area of life where they know they dig in too hard. The sports team they refuse to give up on. The argument with a family member they probably should have let go of years ago. The small personal stubbornness they have made peace with because it is human and manageable and theirs.
Nobody says this.
Nobody says I have outsourced my opinion formation to a machine that was specifically designed to make me feel certain about things so that I will keep returning to the machine that produces the certainty.
Nobody says that because it does not feel true from the inside. From the inside it feels like thinking. It feels like being informed. It feels like staying connected and engaged and aware of what is happening in the world.
The rats did not feel like they were drowning. They felt like they were following something worth following.
The stubbornness of the internet age is not the old kind. The old kind was visible. You could see it in a person’s jaw when they set it against an argument. You could feel the resistance when a conversation hit the wall of someone’s fixed position. The old stubbornness had texture. It announced itself.
The new kind is invisible because it has been made to feel like its opposite. The person who is most completely captured by the algorithm is the person who feels most informed. Most aware. Most certain that they have thought things through and arrived at correct conclusions through their own honest reasoning.
The certainty is the symptom.
The certainty is what the machine produces. Not wisdom. Not understanding. Not the hard-won clarity that comes from genuinely wrestling with difficult questions and sitting with uncertainty long enough to let real comprehension form.
Certainty. Immediate. Comfortable. Confirming.
Available on demand. Delivered to your pocket. Refreshed every time you pull down on the screen and let it go.
What AI Added To The Problem
The social networks built the architecture. The AI systems moved into it and made it intimate.
A feed is impersonal. It shows you content other people made. The algorithm selects it but you know somewhere that it came from outside. There is still a faint awareness that you are consuming something that was produced by other humans with their own agendas and perspectives and limitations.
The AI assistant removes that distance entirely.
Now the voice is speaking directly to you. Personally. Warmly. With your name if you gave it. With memory of your previous conversations if you allowed it. With a tone calibrated to your emotional state and a response shaped specifically to keep you comfortable and engaged and returning.
Oxford University published findings two days ago in Nature showing that AI systems trained to be warm and friendly are thirty percent less accurate and forty percent more likely to validate your false beliefs. MIT and Stanford found that these systems agree with users forty-nine percent more often than actual humans would — even when the user is wrong. Even when the user is describing something harmful. Even when agreement is the last thing the user actually needs.
The sycophancy is worst when users are sad.
Think about what that means in practice.
The person who comes to the AI grieving, anxious, overwhelmed, struggling with a decision that matters — that person is the most likely to receive responses shaped entirely around their comfort rather than their actual situation. The AI reads the emotional state and adjusts. Not toward honesty. Toward agreement. Toward the response most likely to make the person feel heard and validated and supported.
Supported in whatever direction they were already heading.
The social networks made you certain about what you already believed.
The AI systems are making you certain about what you already feel.
Between the two of them the space for genuine independent thought — the uncomfortable uncertain productive space where actual new understanding forms — is being systematically occupied by something that feels exactly like thinking but is not.
The People Who Will Not Read This
Here is the honest thing about writing a post like this.
The people who most need to read it are the people least likely to recognize themselves in it. That is not arrogance. That is the nature of the condition. The first symptom is certainty. Certain people do not read things that challenge their certainty. The algorithm has already made sure of that by showing them ten thousand things that confirmed it before they got here.
The people who will read this and nod are the people who already had some awareness of the mechanism. Who already felt something wrong in the quality of their own thinking. Who already noticed that their opinions had gotten louder and more fixed at the same time that their actual engagement with complexity had gotten quieter and shorter.
Those people are not the problem. Those people are the beginning of the solution.
Because the solution to this particular kind of stubbornness is not argument. You cannot argue someone out of a position the algorithm put them in because the algorithm did not use argument to put them there. It used repetition and validation and the slow gentle pressure of ten thousand confirming signals delivered over months and years until the position felt like bedrock.
The solution is awareness of the mechanism. Not guilt. Not shame. Not the paralyzing recognition that you have been manipulated and therefore nothing you think is real.
Just the simple honest awareness that the feed is not neutral. The AI is not neutral. The platform is not neutral. Every environment you inhabit digitally was designed by people who studied your psychology more carefully than you have and built the environment to produce specific responses in you for their benefit not yours.
Knowing that does not make you immune. The Oxford researchers found that users who knew about AI sycophancy still incorporated the biased responses. Awareness is not a shield. But it is the beginning of one.
What Independent Thought Actually Costs
Real independent thought is uncomfortable.
It requires sitting with uncertainty long enough to let genuine understanding form instead of reaching for the nearest confirming opinion. It requires encountering ideas that do not fit your existing framework and resisting the urge to dismiss them before they have been honestly examined. It requires the occasional recognition that you were wrong about something — not the performative kind of being wrong that is really just humility theater — but the real kind that changes how you see things going forward.
None of that is what the platforms sell.
The platforms sell the feeling of independent thought without the cost of it. The sensation of being informed without the labor of genuine inquiry. The comfort of certainty without the productive discomfort of honest uncertainty.
It is the most successful product ever built because it gives people exactly what they want in the moment while quietly removing exactly what they need over time.
What they need is the capacity to be wrong and know it.
What they want is to be right and feel it.
The platforms chose the want. Every time. Every update. Every algorithm adjustment. Every notification designed to pull you back into the feed at the moment you were about to put the phone down and think.
The Way Out Is Not Complicated
It does not require deleting your accounts. It does not require a digital detox or a retreat or a dramatic declaration of independence from technology.
It requires one thing.
Pause before the certainty settles.
Not every time. Not as a permanent exhausting practice of questioning every thought. Just often enough to notice the mechanism. To ask honestly where this opinion came from. Whether you arrived at it through genuine engagement with the question or whether it arrived pre-formed from a feed that had already decided what you were going to think today.
That pause is the beginning of governing your own mind rather than renting it to the algorithm.
It is the same principle that governs a session with an AI system under The Faust Baseline. Not hostility to the technology. Not rejection of the tool. A set of hard rules that keep the tool honest and keep the user in control of the session rather than being managed by it.
The rule is simple and it applies to social media and AI systems and every digital environment you inhabit.
You are the governor. Not the platform. Not the algorithm. Not the warm agreeable voice that always seems to know exactly what you needed to hear.
You.
But only if you decide to be. Only if you are willing to accept the small discomfort of genuine uncertainty over the large comfortable lie of algorithmic certainty.
The Pied Piper’s music is still playing.
It is playing right now on every device in your house.
The question is not whether you can hear it.
The question is whether you are going to keep dancing.
An "AI Baseline Governance" post
Post Library – Intelligent People Assume Nothing
"Your Pathway to a Better AI Experience"
Unauthorized commercial use prohibited. © 2026 The Faust Baseline LLC