Do We Always Have to Pay for Our Knowledge Gap?
AI removed the technical tax, but exposed the one we can't avoid
There’s a tax we’ve all paid: want to edit a video? Hire a professional or spend weeks learning software. Need a logo designed? Pay a designer or master graphic design. Want to understand your business metrics? Hire an analyst or learn SQL and statistics.
The logic was simple: expertise is scarce, therefore expensive. If you want something done, you either pay in years of learning or dollars to someone who already did. AI is changing this equation in ways worth examining.
Mastery has always required suffering
Malcolm Gladwell popularized the 10,000-hour rule. Robert Greene’s Mastery goes deeper: expertise emerges through years of apprenticeship, failure, and refinement. Mozart, Darwin, Einstein—all paid their dues through extended periods of struggle that looked nothing like genius from the outside.
This wasn’t just time passing. It was painful iteration, failed experiments, dead ends—the slow accumulation of pattern recognition that only comes from repeated exposure to problems.
We built industries around this reality. Bootcamps charge $15k to accelerate the timeline. Agencies charge $200/hour because they’ve already paid the 10,000 hours. More recently: content creators selling courses for $499, promising to compress years of learning into weeks, to give you pattern recognition without the pain.
But can you really pay to skip the struggle?
Removing technical barriers exposes who actually has taste
When I’m building with AI, I’m not acquiring expertise in the underlying systems. But I can articulate what I want, iterate when it doesn’t work, and refine my approach. The implementation happens without me paying the traditional 10,000-hour tax. This feels like cheating because it is cheating—at least according to the old rules.
But here’s what’s become stark: when everyone can build technically, the difference between builders becomes painfully obvious. Technical complexity used to obscure this. You could blame a bad product on implementation challenges or limited resources. Those excuses are disappearing.
Now when someone builds something mediocre, it’s not because they couldn’t execute—it’s because they lacked the judgment to know what to build. The technical barrier was hiding the real differentiator all along: taste.
And taste is brutally binary in a way technical skill never was. You have either developed it through years of exposure and failure, or you haven’t. AI makes this gap more visible, not less.
Some curves have no shortcuts
Greene emphasizes that mastery isn’t just technical skill. It’s intuition, the ability to see patterns others miss, the judgment to know when rules should be broken. The 10,000 hours were about internalizing failure patterns, developing taste, building mental models that let you navigate ambiguity.
AI can compress the technical learning curve. It cannot compress the taste development curve.
Consider job searching: you can buy resume templates, pay for interview coaching, purchase LinkedIn optimization courses. But none of this eliminates the actual work—applying, getting rejected, refining your approach, learning what employers actually want through repeated exposure. The curve of rejection and refinement is the education.
Building products is the same. You can use AI to remove technical barriers, but you still have to go through the curve of building things that don’t work, recognizing why they failed, developing intuition about what people actually need. That curve cannot be purchased or shortcut.
The knowledge gap tax is decreasing for technical execution, but we still pay in the ways that actually matter—in building things that don’t work, in developing judgment through mistakes, in spending years getting better at recognizing what’s worth building.
The course economy is selling the illusion that you can skip the curves. But the curves are where the actual learning happens. You have to go through it yourself. That’s not a limitation—that’s how mastery works.