Speaking as someone deep in this space: the engineers who grow fastest in the AI era are the ones who treat AI tools as amplifiers, not replacements.
Concrete things that compound:
1. Get really good at system design and architecture. AI can generate code, but it can't design systems that scale well under real-world constraints. This skill gap is widening.
2. Learn to evaluate AI output critically. The ability to spot subtle bugs in AI-generated code is becoming a superpower. This requires deep understanding of the fundamentals.
3. Build things that are hard to automate: cross-team communication, understanding user needs, navigating ambiguity. These are the skills that get you from senior to staff.
4. Use AI tools aggressively but intentionally. I use them daily for boilerplate, tests, and exploration. But I always understand what they produce before shipping it.
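Point 2 deserves a concrete illustration. Here's a minimal sketch (the function names are made up for the example) of a classic subtle bug that AI-generated Python frequently contains: a mutable default argument is evaluated once at `def` time, so the same list is silently shared across calls.

```python
# Buggy pattern: the default [] is created once, when the function is
# defined, so every call without an explicit list shares the same one.
def add_tag_buggy(tag, tags=[]):
    tags.append(tag)
    return tags

# Fixed pattern: use None as a sentinel and create a fresh list per call.
def add_tag_fixed(tag, tags=None):
    if tags is None:
        tags = []
    tags.append(tag)
    return tags

first = add_tag_buggy("a")
second = add_tag_buggy("b")   # surprise: contains "a" too
```

The code looks plausible, type-checks, and passes a single-call test; it only misbehaves on the second call. Spotting this in a review requires knowing the fundamentals, not just reading the diff.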
The engineers who panic are the ones who were mostly writing boilerplate anyway. If you're solving genuinely hard problems, AI just makes you faster at the boring parts.
The GIGO principle ("garbage in, garbage out") was coined in the 1950s but still applies today: https://en.wikipedia.org/wiki/Garbage_in,_garbage_out. If a developer gives garbage prompts to the AI, they will get garbage results. You still need to develop expertise in all the areas a software engineer was traditionally strong in, from languages to architecture, because that expertise is what lets you craft LLM prompts that are not garbage and recognize which LLM output is worth keeping instead of discarding. Otherwise you will wind up as a sockpuppet operated by the LLM. (Many AI advocates will suffer that fate.)
Fortunately, LLMs can be your allies: they are effective explainers, search engines for finding learning resources, and providers of demo code. But you have to do the work to get the expertise into your own head.
Either you become the master of AI tools or AI tools become your master; those are your choices.
Make LLMs a part of your workflow. Use them like any other tool. It's where the industry is headed.
> Claude can make decisions about architecture
but they are often not very good decisions, and sometimes absolutely terrible
I have to kick it back onto the right track, but, as you sort of point out, I got that experience by making my own terrible decisions and finding out the hard way which blind alleys I was walking up.
Second-system effect, inner-platform effect, over-generalizing, planning for scenarios that never happen, designs that don't scale, building a giant OOP class taxonomy - I've made them all.
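To put a face on one of those mistakes, here's a hypothetical before/after (the names are invented for illustration): a speculative class taxonomy built "for flexibility" around one report format, next to the plain function the requirement actually called for.

```python
# Over-engineered: an abstraction layer for subclasses that never arrive.
class Report:
    """Base of a speculative hierarchy - only one subclass ever existed."""
    def render(self) -> str:
        raise NotImplementedError

class CsvReport(Report):
    def __init__(self, rows):
        self.rows = rows

    def render(self) -> str:
        return "\n".join(",".join(map(str, r)) for r in self.rows)

# What the requirement actually needed: one function, same output.
def to_csv(rows) -> str:
    return "\n".join(",".join(map(str, r)) for r in rows)
```

The hierarchy costs indirection on every read and buys nothing until a second format genuinely shows up, at which point extracting an interface is cheap anyway.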
I think I learned as much from reading as from doing, though; it let me put names on the mistakes I'd made.
So, for what it's worth: study the meta. Read about architectural theory and system design.
Study the craft and listen to the masters - https://www.youtube.com/watch?v=LKtk3HCgTa8
And when you talk to Claude, say "follow Rich Hickey's advice about explicit rules" - it knows what you mean better than you do!
My advice is that there's a big difference between "software engineering" as a career where you're employed by companies and "software engineering" where you're delivering solutions for yourself or clients.
The career track feels like it's being obsoleted by AI/LLMs, and that's because a lot of the work is either genuinely superfluous and not actually needed, or rote memorization (which LLMs indeed excel at), or a sign of over-hiring during the pandemic and even earlier during the ZIRP era, with AI/LLMs merely a convenient excuse for the layoffs. Technically speaking the career track still exists, but it's saturated, so let's assume it doesn't anymore.
The "problem solving" track is nowhere near obsoleted. LLMs can be used to automate away typing, but unless it's a specific task where you already expect to do a lot of typing, LLMs (from my experience) tend to produce worse results than just writing the code yourself directly. So software engineering expertise is still valuable.
The career track is threatened for several reasons:
* the hiring market is outright broken (I'm hiring and feeling it). Any job posting gets flooded with monkeys (or agents) slinging ChatGPT'd resumes left and right, which are indistinguishable from those of genuinely skilled candidates. There is no way to tell them apart short of interviewing them, which is difficult when a job posting gets DDoS'd with 1k applications a day. Even the interviews themselves are fraudulent: candidates use LLMs and screen-capture/speech recognition to answer the questions, or outright outsource the interview to a skilled person while the actual candidate doesn't know how to print a hello world. This doesn't mean demand is reduced, but hiring is happening behind closed doors, with people referring others they already know and trust, so it's nearly impossible for a new entrant to break in.
* A lot of "software engineering" jobs were genuinely just rote typing with little agency. The frontend sphere is especially affected - not surprising, because a lot of "frontend" was just reimplementing native browser behaviors that were perfected two decades ago and could be invoked with plain HTML, if only the architects had had the pragmatism to settle on a mix: static HTML for the boring bits, and React/$JS_FRAMEWORK_OF_THE_DAY for the interactive stuff.
Now, what do you want to do?
* you want the career track? Well, the issue is that a lot of the rote typing work (which nevertheless yielded many six-figure jobs just a few years ago and helped push software engineering from a nerd thing to a mainstream easy-money career) is going away, so it's probably not gonna happen. And the genuine roles are hard to get into because the hiring market is both broken and saturated from all the layoffs.
* you want to build products? That's still valid, but it requires a different skillset (which includes marketing yourself). There are just as many clients who need pixels put on screen (if anything, there are even more now that LLMs let any non-technical founder prove product-market fit with a vibe-coded prototype before committing to an actual production-grade codebase), but the skills required are different.
On the second point, you need an end-to-end understanding of the stack to succeed. Not super specialized, but you need to be able to go from the pixels on the screen to the server (databases and such), through networking, all the way to any third-party APIs you might use (do you know about idempotence and the CAP theorem? If not, an LLM can explain them, but as you can see, you still need enough initial understanding to know the right questions to ask).
LLMs are actually a boon in this case - they're like your own personal tutor and senior colleague you can ask anytime anywhere. But you still need to put in effort to build and refine products and you'll learn over time.
My advice is that software engineering (and generally being good with tech/computers) is a superpower that can greatly help you in your existing business. As a freelance path, it can work as long as you have enough skill (even if LLM-assisted) to deliver things someone will actually pay good money for. But purely as a career track (without a direct, profit-supported deliverable tied to it), it's dead, and LLMs aren't even to blame; they're just a convenient scapegoat for getting rid of all the over-hired employees from the ZIRP era.
Good luck. Email in my profile if you want to chat further - just don't expect any miracles; I don't have any particularly good news for a junior, and even for seniors it's really difficult right now.