VCs say the damnedest things, and I hate to be the guy who agrees with a VC. But sometimes—not often, but sometimes—they say something useful. Today I came across this tweet by Anish Acharya of a16z:

the timeline is full of people using Claude Code to generate ads, recover corrupted video, make landing pages, manage business texts…

the narrow view of coding agents is that they primarily decrease the cost of software development

the ambitious view is that almost any problem/solution can be expressed in software and this capability is upstream of all knowledge work

I think that’s a really sensible view, and having experimented heavily with AI coding assistants, I agree with it. It got me thinking about the future of knowledge work. The first thing that popped into my head was Dickens:

It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to Heaven, we were all going direct the other way—in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only.

Right now, it feels exactly like that.

We Are All Cognitive Peasants

Much of what passes for knowledge work today is not really knowledge work. It’s cognitive peasantry.
I don’t use the term pejoratively. Our food is grown and produced by peasants. What I mean is that we do menial cognitive labor, the equivalent of using picks and shovels. There’s not a whole lot of actual knowledge work being done. Some work, yes. But is it really knowledge? Not really.

Much of the history of cognitive peasantry can be told through two phrases: “we need to find synergies” and “I will circle back.” There are truly dreadful and soul-sucking parts to this work. A large chunk of being a cognitive peasant means doing very boring, dull, depressing, annoying jobs. That’s just the reality.

The Worst of Times

This naturally leads to the question of AI job automation. My firm view is that whole swaths of menial, basic, entry-level knowledge work will be automated away. The direction feels like a certainty; the speed is anyone’s guess.
We’re already seeing signs. Life has gotten harder for entry-level graduates. There are articles about Ivy League grads in the US struggling to find jobs. Anecdotal reports of layoffs are piling up, though the data remains unclear.

Having used these tools, having seen implementations, having spoken to people at companies rolling them out, my thesis is this: anything that lives within the walls of a basic deterministic process is gone. Pressing a button, getting an output. Going through a sequence of steps. Those jobs are gone. Whether they’ll be replaced by something else is a question on the mind of everyone trying to make sense of AI. I don’t have an answer.

We have to be careful about the language here, though. In the last few months, I’ve spoken to senior developers who told me that AI is helping them do more than ever. They can effectively create a replica of themselves and generate far more code with AI assistance. But this isn’t the same as “AI is coding.” There’s a difference between a senior developer who’s been coding all his life directing an LLM the way he’d direct an assistant, and someone with no context telling an LLM what they want and having it spit out code. The senior developer knows what’s actually happening. He can read the code, spot issues, and implement it properly. The other person cannot.
And if senior developers can now transcend the limits of being human, if they can have digital replicas of themselves doing more, their teams will probably need to hire fewer people. The math is simple.

The Best of Times

But there’s another side to this. Until you’re using coding agents, describing your problems and watching them get solved, your view of AI is incomplete. The web versions of ChatGPT and Gemini are great, but they don’t show you what’s actually possible.
The label “AI coding tools” is a misnomer. Normies like me haven’t adopted them because we assume you need technical competence. Nothing could be further from the truth. Anyone can use these tools as long as they put in enough effort. It’s like fitness or building any habit. The barrier isn’t technical skill, it’s willingness to show up and experiment.

These tools aren’t about coding anymore. They’re general-purpose, malleable enough to fit your style of thinking, your way of working, your specific problems. You can wrangle them any which way you want.
Think about cognitive work for the last hundred years. A lot of drudgery. A lot of soul-sucking repetition. Now, AI tools can do the most boring parts better than you can, with higher certainty. They free you up. They’re a cognitive exoskeleton. Your brain does the thinking. The suit handles the lifting.

In many cases, these tools do things better than us. It’s like having two or three coworkers by your side. For the first time, we can create replicas of ourselves that are reasonably competent at a broad variety of things. They are force multipliers. You can now do things at the speed of thought.

A concrete example. For years, I’d wanted to build a website that makes public domain literary works more readable. Conceptualized it, thought about it, never had the skills. With AI coding tools, I built akshara.ink. What would have taken years, or more likely stayed an idea forever, took me a month.

There’s a lot of hype about these tools, a lot of blind boosterism. But as things stand today, they’re most valuable for people with actual domain knowledge and skills. A senior developer using AI as an assistant gets far more out of it than someone who can’t tell good code from bad. That’s just true.

But here’s the other thing that’s also true: these tools have lowered the barrier for amateurs to tinker, to fuck around and find out. I couldn’t code before. Now I’ve shipped a website. Both things are happening at the same time. The experts are becoming more powerful, and the amateurs are finally allowed in the room.

Room to Do More

A good mental model: these AI tools enable you to do more. More creative things. More ambitious things. They make problems feel more tractable than they ever were. They let you create custom solutions tailored to your personality, your style of work.

At no point in the history of knowledge work did most people have an assistant they could tell, “Go do this thing,” and have the thing done. Now we have this butler, this thinking partner, this creative sparring partner. You can offload things to it, and it goes and does them and comes back without complaining.

For the first time in the history of cognitive peasantry, you can offload the dreadful parts to a willing agent who will probably do it better than you ever could.

You no longer have to circle back. You can just go somewhere. You no longer have to find synergies. You can actually create them. You no longer have to write performative LinkedIn posts pandering to recruiters, doing your little employment pole dance in the hopes someone notices. You can build things. Share them. Stand out in a crowd of sameness and bland incompetence.