Making things is often not just about making the thing right in front of you, but about building the skills to make bigger and better things. Taken with the long view, the struggle that makes the thing at hand harder to build is well worth it. We have long considered shortcuts that don't build skills to be detrimental in the long term; that stops being true only when the skill you are shortcutting becomes entirely irrelevant. We have yet to see how the atrophying of programming skills will affect our collective ability to make reliable and novel software.
In my experience, I have not seen much new software that I’m happy about that is the fruit of LLMs. I have experienced web apps that I’ve been using for years getting buggier.
I worry that too much reliance on LLMs will leave engineers with, at best, a linear increase in skill over time, compared to the compounding returns of accumulated knowledge. For some, I fear the returns will actually be negative.