I’ve always had this loosely held opinion that the actual code you write is the least important part of coding.
The important parts are deciding what code to write, deciding how that code fits into the overall software design, deciding why you should work on this feature vs. that feature.
Given that perspective, I have no fear of AI taking my job as a programmer.
AI is fantastic at writing contained, low-context code. If you give it a box, and tell it to fill the box, it will fill that box like an army of hungry junior engineers.
Things start to fall apart if it needs to understand anything outside of the box. The more constraints you put on filling the box, the worse it gets.
And not just worse: it forgets, hallucinates, and does all the classic LOL AI things.
But having an army of junior engineers is honestly great. You just have to manage them.
A couple points about this:
- AI is useful enough as a programmer that I will continue to use it
- The things the AI is bad at are the things that have always been more impactful for me to focus on
- AI is a net productivity gain, so humans should put real effort into building the skillset to manage it
Saying Nice Things About TDD To Make A Point
Test Driven Development is not great.
It’s an interesting idea, but if you follow the TDD-brick-road far enough, you end up getting shanked in the alley by an entity with a mocking smile.
The good thing about TDD is that it forces you to write code that is testable.
Easily testable code forces good patterns like single-responsibility functions.
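To make that concrete, here's a toy sketch (hypothetical names, not from any real codebase): a single-responsibility function is trivial to test because it's just data in, data out, with nothing to mock.

```python
# Hypothetical example: a single-responsibility function is easy to test
# because it takes plain data in and returns plain data out -- no database,
# no network, nothing to mock around.

def apply_discount(price_cents: int, discount_percent: float) -> int:
    """Return the discounted price, rounded down to whole cents."""
    if not 0 <= discount_percent <= 100:
        raise ValueError("discount_percent must be between 0 and 100")
    return int(price_cents * (1 - discount_percent / 100))


def test_apply_discount():
    assert apply_discount(1000, 25) == 750
    assert apply_discount(999, 0) == 999
```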
My theory is that “vibe coding”, AI-Driven Development, is the TDD of the 2020s.
An interesting idea, that will absolutely shank you if you go too deep into the alley.
But just like TDD, it will force programmers into focusing on good programming patterns.
Specifically, it will force humans to focus a lot more on the overall software architecture of their program(s).
If the AI is not great at wide-context problems, humans will construct programs that isolate context: a web of boxes that the AI can fill, with tons of care and attention on the inputs and outputs that connect the boxes together.
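Here's a minimal sketch of what one of those boxes might look like (all names and types here are hypothetical): the human pins down the inputs, outputs, and invariants; the AI only has to fill in the body, with no context needed from the rest of the system.

```python
# Hypothetical "box": the contract (types, purity, error behavior) is the
# human's job; the low-context body is the part the AI is good at filling.
from dataclasses import dataclass


@dataclass
class InvoiceLine:
    description: str
    quantity: int
    unit_price_cents: int


@dataclass
class InvoiceTotal:
    subtotal_cents: int
    tax_cents: int
    total_cents: int


def total_invoice(lines: list[InvoiceLine], tax_rate: float) -> InvoiceTotal:
    """Contract: pure function, no I/O, raises ValueError on bad input."""
    if tax_rate < 0:
        raise ValueError("tax_rate must be non-negative")
    subtotal = sum(line.quantity * line.unit_price_cents for line in lines)
    tax = round(subtotal * tax_rate)
    return InvoiceTotal(subtotal, tax, subtotal + tax)
```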
One could even make the argument that the next generation of software devs will be better at software design because the AI is so punishingly bad at it.
brew install AI-bad-pattern-linter-thing
AI-bad-pattern-linter-thing isn’t a real thing, and shouldn’t be.
But it’s an example of where one could see the industry going.
Humans have been bad at programming for a lot longer than AI has, and humans have developed entire ecosystems to try to unsuck the code.
Linters are a great example: programs that complain at you if you’re not following the same conventions as the rest of the codebase.
My prediction is that a wave of unsuck-AI-tools will start flooding the ecosystem.
Like, a common AI coding problem is that it will repeat itself dozens of times to do some action, violating the Don’t Repeat Yourself principle to an egregious degree.
Someone is going to write some janky tool that will spot this kind of pattern and complain at you that you should consolidate the common calls.
Honestly, behavioral linters sound like an awesome idea in general, but the common pitfalls of AI make them a necessity.
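Something in the spirit of this sketch could do the job. It's entirely hypothetical (the threshold and names are made up), but it shows how janky such a tool could afford to be and still be useful: fingerprint every function call in a file and complain about the ones that show up suspiciously often.

```python
# Hypothetical sketch of a janky repeat-yourself detector: walk a Python
# file's AST, fingerprint every function call by its source text, and
# complain about any call that appears more than a threshold number of times.
import ast
import sys
from collections import Counter

REPEAT_THRESHOLD = 5  # made-up number; tune to taste


def find_repeated_calls(source: str) -> Counter:
    tree = ast.parse(source)
    calls = Counter()
    for node in ast.walk(tree):
        if isinstance(node, ast.Call):
            calls[ast.unparse(node)] += 1
    return calls


if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        counts = find_repeated_calls(f.read())
    for call, n in counts.items():
        if n >= REPEAT_THRESHOLD:
            print(f"complaint: `{call}` appears {n} times, consider consolidating")
```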
AI Coding is Kinda Great
The way I see it, AI coding is like sugar.
Great ingredient to be able to cook with, but too much and you ruin the meal.
Sugar isn’t going to put the chef out of a job.
And chefs aren’t going to stop using sugar.
And as a chef, you’d better get good at using sugar in your meals, because the chefs who use it effectively are going to do better than you.