What gets expensive in 2026 when AI makes everything cheap.
Happy New Year!
2026 has a great energy to it. Kinda zesty. I learned it’s a “1 Year” in numerology, which is supposed to correlate with new beginnings, leadership, bold action, and innovation.
Let’s get it.
Over the holidays, I was reflecting on how much content I read in 2025 that was well-written but ultimately unmemorable. Like a piece on ambition that consumed 5 minutes of my time but I couldn’t tell you who wrote it. Could’ve been any of a dozen smart creators who were picked up by the algorithm.
Or no one at all.
Have you noticed this too?
I realized it’s a signal of something structural shifting on the internet.
AI has been disruptive, obviously. Part of that disruption is the removal of friction from content production so that the cost of publishing above-average output has collapsed to near zero.
Like if we borrow a college paper grading rubric for a moment, I’d say there’s an abundance of B+ quality content online.
In 2023, the narrative was “AI will replace writers.”
In 2024, the narrative was “AI won’t replace writers. Writers using AI will replace writers not using AI.”
In 2025, we got “AI slop,” which is a somewhat subjective assessment of content quality but also such an emotionally charged topic that it became Merriam-Webster’s Word of the Year.
When we lay it out this way, I see the slop as a logical response to 2024 discourse. People were motivated (by fear or tech enthusiasm or both) to use AI to produce content.
Which gave us slop but also some really interesting content that couldn’t reasonably exist without AI because it would take 3-5x longer to produce, making it a bad investment of time.
So now it’s 2026.
We’ve got an abundance of B+ quality content online.
And yet… most of us know (consciously or subconsciously) that there hasn’t been a magical leap in human competence to where the average person behind the screen is capable of writing or speaking intelligently enough to produce B+ content all on their own, without AI.
Which means a default posture of suspicion is rising.
Suspicion of other people and reluctance to coordinate, because you can’t easily trust that the person behind the output understands what they’re saying.
A question I’m contemplating as we head into 2026 is what happens to culture when the cost of appearing competent drops to zero.
Trust premium
AI fluency is table stakes now.
It’s reasonable to expect the average person to know how to prompt effectively, fact-check output, and produce polished work with AI assistance.
What’s more difficult to do is cultivate trust.
Trust is a firm belief in the reliability, truth, ability, or strength of someone.
So where does it come from in an environment where output is cheap and someone’s competence could be smoke and mirrors?
Trust relocates to signals that remain expensive.
Here are a few I’ve come up with:
Consistency. Showing up the same way over months vs. weeks. Building a track record people can verify. If you’re able to say “I’ve been saying this since 2023,” you gain trust points.
Coherence across contexts. You sound like the same person in your writing, in meetings, on podcasts, off-screen. There’s no gap between your LinkedIn post and how you explain the same idea when someone asks you about it in person. The person matches the output.
Network density. Who vouches for you when you’re not in the room. The people you’ve worked with and what they say about you matters more than anything you could say about yourself.
Taking your time. Not racing to be the first person to post the hot take. Thoughtfulness and selectivity become more valuable than reactivity.
Presence. The people who trust you most are usually the ones who’ve met you outside a digital feed. Could be through events or dinners. Or through private channels like Zoom conversations or DMs. Trust accrues through closeness, physical or otherwise.
Depth. Going deep on one thing instead of surface-level commentary on many. Expertise that others can interrogate. The confidence to say “ask me anything about X…” that comes from years of focus and experience.
Process transparency. Showing your work. Explaining how you arrived at a conclusion instead of stating it. Letting people see your reasoning so they trust your judgment even when they disagree with your conclusion.
Error correction. Publicly updating your views as new data becomes available.
Durability. Work that’s still relevant 6-12 months from now. Ideas with staying power. A body of work people can link back to, reference in conversations, build on top of.
2026 is a sorting year.
The people who understand where trust is migrating will build leverage and command a premium in the market.
Those who don’t will plateau at competent.
We wrap with a few news signals on my radar about where the future of money and culture is headed...
OpenAI is building an audio-first device for 2026. Smartphones are so 2016. I almost added a Drake lyric here about cracked iPhone screens but…
Brookfield is starting its own cloud business to lower the cost of AI by leasing data center chips directly to AI developers.
Meta buys Manus for $2-3B. It was valued at $500M six months ago.
X is ready to tell YouTubers: “come get this bag.”
Follower counts died in 2025. LTK CEO: “the algorithm completely took over, so followings stopped mattering entirely.”
97% of CMOs plan to increase creator marketing budgets in 2026, in response to AI slop.