TL;DR – The cost of servicing technical debt is plummeting because of LLMs, assuming coding models keep improving. You’re better off taking on more technical debt in your projects and betting that LLMs will clean up the debt in the future.
Through most of my software engineering journey, I understood technical debt as taking shortcuts during development that bite you at some later point. Here are some other explanations and takes if you’re unfamiliar with the term.
A step change in my understanding of the term came when I read Avery’s post on extending the tech debt metaphor.
A few ideas from the post that help me make my point:
Not all (Tech) Debt is bad
A family that takes on high-interest credit card debt for a visit to Disneyland is wasting money.
But if you want to buy a $500k machine that will earn your factory an additional $1M/year in revenue, it would be foolish not to buy it now, even with 20% interest ($100k/year). That’s a profit of $900k in just the first year! (excluding depreciation)
The idea being that debt in service of more [insert a metric you care about] is probably prudent to take. This is well understood in software development through Knuth’s famous maxim – premature optimization is the root of all evil.
This comes with the usual caveats of when and how much debt you should take. It’s easier to take shortcuts at the start of a project/company/codebase (smaller principal). And the shortcuts shouldn’t be so egregious that the time spent fixing them later (a higher rate of interest) outweighs the upside.
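The factory example above can be reduced to a one-line calculation. This is my own toy sketch, not something from Avery’s post – debt is worth taking when the return it unlocks exceeds the interest it accrues:

```python
# Toy model: first-year net gain from a debt-financed investment.
# (Ignores depreciation, just like the factory example above.)

def net_gain(principal: float, interest_rate: float, extra_revenue: float) -> float:
    """Extra revenue earned minus interest paid on the borrowed principal."""
    return extra_revenue - principal * interest_rate

# The $500k machine at 20% interest that earns an extra $1M/year:
print(net_gain(500_000, 0.20, 1_000_000))   # 900000.0 – take the debt

# High-interest consumer debt with no return (the Disneyland trip):
print(net_gain(5_000, 0.25, 0))             # -1250.0 – wasting money
```

The sign of `net_gain` is the whole decision rule: positive means the debt pays for itself, negative means it is pure cost.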
Debt to income ratios
In the tech world, the interest-to-income equivalent is how much time you spend dealing with overhead compared to building new revenue-generating features. Again, driving overhead all the way to zero is probably not worth it.
If you can grow your revenue/DAUs/MAUs fast enough, you could probably throw more resources down the line to make up for the shortcuts you took.
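To make the ratio concrete, here is a minimal sketch (my own framing, not a standard metric) of a tech debt-to-income ratio for an engineering team:

```python
# Toy "debt-to-income" ratio for a team: hours spent servicing debt
# (overhead) relative to hours spent building revenue-generating features.

def debt_to_income(overhead_hours: float, feature_hours: float) -> float:
    return overhead_hours / feature_hours

# A team spending 10 h/week on overhead and 30 h/week on features:
print(round(debt_to_income(10, 30), 2))  # 0.33
```

As with household finances, there is no universally correct threshold; the point is that a growing ratio means interest payments are crowding out income.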
Avery extended the analogy a fair way; I would like to go further.
The Risk Free Interest Rate
The risk-free rate of return is a theoretical concept: how much return you could get on an asset assuming no underlying risk. It is most commonly proxied by the US Fed rate (although that could be changing slowly).
Most structured debt in the world is directly downstream of this number. Your mortgage is priced at some percentage points above this rate. You should be buying riskier assets when Fed rates are low and fleeing to low-risk assets when Fed rates are high.
This risk-free rate decides how much liquidity/money there is in the system and is the single most important number in finance. It is revisited up to eight times a year by a central committee that senses the vibes of the economy and decides the number. If that seems a bit arbitrary, it is (by design).
Extending this analogy to technology, the risk-free rate of return is akin to how much we can expect underlying technologies to improve without any intervention from us. Because the long arc of technology is up and to the right, tech’s risk-free rate has monotonically fallen over the years.
An example of this was Moore’s Law in the ’90s and early aughts: CPUs were reliably getting faster every generation, and developers were incentivized to work on features over speed optimizations because the hardware would catch up. Another example: software tooling getting better over time made software development as a vocation accessible to more people.
Over the years, we have internalized a certain rate of improvement in underlying technologies. The advent of coding with LLMs, however, challenges that intuitive rate.
LLMs Lower the Risk Free Rate of Interest (dramatically)
The cost of writing a useful line of code is falling every month. The mess we (or our LLMs) create by taking shortcuts in code today is a problem to be tackled by a future LLM.
It seems intuitive but horrid to say: the amount of care we put into writing code should be trending down. I say this for people writing code in service of something, not for people who take joy in writing code for its own sake (me!). Artisanal coding has its place, like handmade goods do, but it will be a shrinking share of the code we write.
You Need to Be a (Slightly) Worse Coder
We have all grown up under a regime of what counts as good coding practice. For instance, Harold Abelson in Structure and Interpretation of Computer Programs (SICP):
Programs must be written for people to read, and only incidentally for machines to execute.
Or, Robert C. Martin:
It doesn’t take a huge amount of knowledge and skill to get a program working. Getting it right is another matter entirely.
And the Boy Scout Rule – “Leave the code cleaner than you found it” – treat every commit as an opportunity to improve the codebase.
Or Brian Kernighan:
Debugging is twice as hard as writing the code in the first place.
Or Jeff Atwood:
Code tells you how; comments tell you why.
These are all still true, but… less important than they were. Handcrafting routines, writing helpful comments, and finding the right level of abstraction are still good principles, but they are less consequential now that most code going forward is going to be read by machines.
As software engineers trained in the BCE (Before Claude Era), we have calibrated ourselves to taking a certain amount of tech debt, as allowed by our circumstances. If we’re writing mission-critical software going on a rocket, we would (understandably, and hopefully) take fewer shortcuts; but when writing some front-end help page that three other people in the world are going to see, it is okay to take some.
We might be due for a recalibration. Not taking shortcuts comes at a cost, and the bar for taking a shortcut just went down.
Redeeming the Vibe Coder (partially)
The Vibe Coder understandably gets a bad rap. However, they might be on to something. Where seasoned software engineers have underestimated the fall in the risk-free rate, most vibe coders have the opposite problem: they overestimate what the underlying technology can deliver today. Over time, though (if model improvements continue), they’ll be more right than wrong.
The Consequences of Long Term Lowering Interest Rates (software’s ZIRP era)
When interest rates fall in the economy, capital flows into a lot of stupid things. Remember those million-dollar monkey-JPEG crypto scams? They were all downstream of low interest rates – such ventures were labeled ZIRP phenomena, after the Zero Interest Rate Policy era that spawned them. When interest rates go up, these projects clear out, hopefully leaving behind durable, value-accretive businesses.
In this case, however, I don’t think the interest rate is going to rise. The amount of crummy software will just keep growing – the enshittification/ensloppification of software has begun – and we will need new infrastructure and heuristics to tell good software from bad.
Notes: