🦔 🦔 🦔
What if this is as good as software is ever going to be? What if AI stops getting better and what if people stop caring?
Imagine if this is as good as AI gets. If this is where it stops, you’d still have models that can almost code a web browser, almost code a compiler—and can even present a pretty cool demo if allowed to take a few shortcuts. You’d still get models that can kinda-sorta simulate worlds and write kinda-sorta engaging stories. You’d still get self-driving cars that almost work, except when they don’t. You get AI that can make you like 90% of a thing!
90% is a lot. Will you care about the last 10%?
I’m terrified that you won’t.
I’m terrified of the good-enough-to-ship, and I’m terrified of nobody else caring. I’m less afraid of AI agents writing apps that they will never experience than I am of the AI herders who won’t care enough to actually learn what they ship. And I sure as hell am afraid of the people who will experience the slop and be fine with it.
As a woodworking enthusiast I am slowly making my peace with standing in the middle of an IKEA. But at the rate things are going in this dropshipping hell, IKEA would be the dream. Software temufication stings much more than software commoditization.
I think Claude and friends can help with crafting good software and with learning new technologies and programming languages, though I sure as hell move slower when I stop to learn and understand than the guy playing Dwarf Fortress with 17 agents. But at the same time, AI models seem to constantly nudge towards that same median, good-enough Next-React-Tailwind app. These things just don’t handle going off the beaten path well.

Mind you, it’s not like slop is anything new. A lot of human decisions had to happen before your backside ended up in an extremely uncomfortable chair, your search results got polluted by poorly written, SEO-optimized articles, and your brain had to deal with a ticket booking website so badly designed that it made you cry. So it’s a people problem. Incentives just don’t seem to align to make good software. Move fast and break things, etc., etc. You’ll make a little artisan app, and if it’s any good, Google will come along with a free clone, kill you, then kill its clone, and the world will be left with net zero new good software. And now, with AI agents, it gets even worse: agent herders can do the same thing much faster.
Developers aside, there are also the users. AI models can’t be imaginative, and developers can’t afford to be, but surely, with AI tools, the gap between users and developers will be bridged, ChatGPT will become the new HyperCard, and people will turn their ideas into reality with just a few sentences? There are so many people out there who are coding without knowing it, from Carol in Accounting making insane Excel spreadsheets to all the kids on TikTok automating their phones with Apple Shortcuts and hacking up cool Notion notebooks.
But what if those people are an aberration? What if this state of learned helplessness around tech cannot be fixed? What if people really do just want a glorified little TV in their pocket? What if most people truly just don’t care about tech problems, about privacy, about Liquid Glass, about Microsoft’s upsells, about constantly dealing with apps and features that just don’t work? What if there will be nobody left to carry the torch? What if the future of computing belongs not to artisan developers or Carol from Accounting, but to whoever can churn out the most software the fastest? What if good enough really is good enough for most people?
I’m terrified that our craft will die, and nobody will even care to mourn it.
🦔 🦔 🦔
