I've been a super early adopter of AI. I basically had something like an MCP server up in the early days of ChatGPT (essentially a bunch of prompts that guided the model to spit out specific data for the wrapper to catch and execute).
I can safely say that, the way things are heading, AI is not going to take my job as a software engineer. Nobody is building AI. Everyone is effectively building explicitly coded agents that still generally can't reason.
The only thing AI has improved is the time from knowing what you want to getting it into working code. Which is significant. But for any time savings this gives, it also means things can be written wrong faster, which means more time spent later undoing it.
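The pre-tool-calling pattern described above (prompting the model to emit structured data for a wrapper to catch and execute) can be sketched roughly like this; the `<action>` tag convention, the prompt wording, and the handler names are all made up for illustration:

```python
import json
import re

# The system prompt asks the model to wrap any action in a tagged JSON
# object; the wrapper scans each reply for that tag and dispatches it.
SYSTEM_PROMPT = (
    "When you need to run an action, reply with a JSON object wrapped in "
    '<action>...</action>, e.g. <action>{"tool": "search", "query": "..."}</action>.'
)

ACTION_RE = re.compile(r"<action>(\{.*?\})</action>", re.DOTALL)

def extract_action(model_reply: str):
    """Return the parsed action dict, or None if the reply contains no action."""
    match = ACTION_RE.search(model_reply)
    if match is None:
        return None
    return json.loads(match.group(1))

def dispatch(action: dict, handlers: dict):
    """Look up the named tool and call it with the remaining JSON fields."""
    tool = action.pop("tool")
    return handlers[tool](**action)
```

This is essentially what modern tool-calling APIs formalized: the only difference is that the "protocol" here lives entirely in the prompt, so it breaks whenever the model drifts from the requested format.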
I plan to do the same thing I did the last several times new automation took my old job: carry right along making software.
Jevons' paradox tells us that AI will create more demand for software than ever. I see no threat to my career here.
> Jevons' paradox tells us that AI will create more demand for software than ever. I see no threat to my career here.
AI will also create essentially infinite supply with, if you believe the predictions, little if any need for human contributions.
How would cyber/AI security survive in a world where AI can do software development? I'm not a LLM bull but I am a security engineer and it's already very good, I don't see how this field could possibly continue if it advances enough to destroy dev jobs.
It’s been well over a decade and 5 jobs ago that I had to sell myself on my coding ability even though coding has always been part of my job.
I’m going to keep getting jobs the way I have for the past decade: through my ability to use my now 30+ years of development experience, 10+ years of leading architecture projects, and my skill at dealing with the business, disambiguating requirements, and solving XY problems. AI has changed nothing about the value I bring to the company. Now I just do much of the implementation drudgery faster myself, where before I would have delegated it to other people.
So, in the '80s it took around a year and a half for a team of ~10 people at Namco to develop Pac-Man. Nowadays it would probably take any engineer between a day and a weekend to develop it solo. Does that mean engineers are out of jobs? No. Things get easier to develop, but humans always want more. So in X years, when developing a Figma may take one or two people a week of effort, society will expect "better" (more complex) software. And there I will be, ready to develop that software (probably using AI as yet another tool). The AI available in X years will probably do wonders, but it won't be a silver bullet for the software required in those days. Our needs are always ahead of what technology can provide; that's why technology keeps evolving. It's silly to think that we'll reach a ceiling and that a given technology will do everything from that point onward.
> So, in the 80s it took around a year and a half for a team of ~10 people to develop Pac-Man (Namco). Nowadays it probably would take any engineer between a day and a weekend to develop it solo.
Including hardware? For that matter, developing the software on the hardware it had would not be a weekend job.
If you mean "develop it using current tools on current hardware", then yes, it might be doable in a weekend, maybe even a day.
But I think I agree with your overall point. Better tools let us level up. We get to do harder things. We've got cancer to cure, and Alzheimer's, and space to conquer. We're not going to get there by writing assembly for Z80s, so to speak. We need to move up.
We have no idea what the world's going to look like in 5 years. Maximize your ability to adapt, grow, learn and get things done. Any plan you make today is going to be worse than a plan made 3 years from now with more information.
I saw the writing on the wall when ChatGPT was first released. I'm a good 2 years into a pivot to become an online creator who makes weird web experiences and content.
I am knee-deep in a challenge to build 25 projects in 25 weeks. I'm working on project #20 today.
I hope to build an audience through Patreon, sponsorships, and monetizing some of the projects. Maybe a micro SaaS or two? Maybe a newsletter or two?
> Maybe a micro SaaS or two? Maybe a newsletter or two?
No offence, but I'm not sure a pivot into micro SaaS software development or newsletter writing is much of a plan for AI job destruction.
Probably right, but I'd rather be 1-2 years into figuring this out than starting now. I have a lot of traction and I've built up an audience.
There's a lot of clarity to be gained by decomposing the question of "a living" into its parts.
How do you plan to continue the processes of gaining sustenance and economic power as needed? Do you own the land or capital needed for your sustenance or do you need to trade for them? If you need to trade, what assets or capabilities do you have to trade?
How well are these guarded against theft, cyberattack, romance scams against you, quasi-legal expropriation and changes in the legal system that eliminate your status of having property rights?
We'd better hope that robotic replacements for human hands aren't coming soon.
My plan is to learn how to use AI and be proficient with it, so they will be more likely to keep me, as opposed to those who refuse to use it. Some people on my team still refuse to use AI, thinking you can't trust anything since it's all hallucinations.
From my data of ~1.8M jobs, postings with "Software Engineer" as the job title are down ~25% and "Senior Software Engineer" postings are down ~10%, comparing the first 2 months of 2026 with the same period in 2025.
Wondering what the data would say if you compared it to the numbers in 2015. It definitely feels like there are more jobs now (at least from my POV).
Yea totally agree, would like a 10+ year chart. I only had good data from the beginning of 2025
Where is your data from? I saw comparisons of job postings from Jan 2025 to Jan 2026 and SWE was trending upwards: https://fred.stlouisfed.org/series/IHLIDXUSTPSOFTDEVE
I run a job search site (UnlistedJobs) that monitors about 200k company career pages, so my data comes from that, and I've normalized it to look at the same group of companies (for analysis's sake). Ah yeah, I've seen that Indeed chart circulating.
I'm in the process of putting together a market analysis tool with my data, since it isn't biased toward one platform (not saying the Indeed data is wrong, though). Mine will only go back to the beginning of 2025 tho.
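The year-over-year comparison described above boils down to a percent-change calculation per job title; a minimal sketch, with made-up posting counts chosen only so the arithmetic matches the stated figures:

```python
def yoy_change(current: int, prior: int) -> float:
    """Percent change from the prior period's count to the current one."""
    return (current - prior) / prior * 100

# Hypothetical Jan-Feb posting counts per title (illustrative numbers only).
postings = {
    "Software Engineer":        {"2025": 40_000, "2026": 30_000},
    "Senior Software Engineer": {"2025": 20_000, "2026": 18_000},
}

for title, counts in postings.items():
    change = yoy_change(counts["2026"], counts["2025"])
    print(f"{title}: {change:+.1f}%")
```

The normalization to a fixed set of companies matters here: without it, the percent change would conflate real hiring shifts with companies entering or leaving the crawl.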
The premise is flawed. I don't see AI taking jobs, because companies generally want to grow productivity and would rather just make their existing employees do more; all productivity-saving technologies have always been net positive in the amount of jobs created.
> all productivity-saving technologies have always been net positive in the amount of jobs created
To what extent has the net increase in jobs been because there have just been more people who needed to work in order for society to not collapse?
Population growth is slowing (expected to peak around 2080). To some, AI feels like a different sort of "productivity enhancer" than we've seen in the past.
I don't think the person's premise is flawed. It's more that you just disagree with it.
The counterargument to that is what happened with horses. Since domestication, every advance of human civilization led to more horses, until cars were invented and improved, which almost overnight eliminated 90% of the horses in use.
So the fact that in the past new technologies have created new jobs is not a guarantee that AI will create new jobs.
On top of that, look at what happened during the Industrial Revolution in Britain. You'd have a village with 2,000 workers producing clothes or the materials to make them. A rich man opens a factory in that village that employs 200 people and produces more than the whole village did before. 90% are unemployed, and the 10% who work in the factory have far worse working conditions. Studies of graves from that time show that people's average height went down during the Industrial Revolution, as conditions in the factories were far worse than what workers had before. Hence the Luddite movement; but since the rich owned the mass media, the Luddites were portrayed as crazy. Eventually, many years later, new and better jobs did appear.
There's nothing you can do against civilizational movement. Most people exist because of other people. When those other people lose jobs, so will you.
I'm trying to save money and could take early retirement in 2-3 years if necessary. If I'm fired before then, and if I can't find a job as an SWE or manager, I think I could teach.
But who knows if your job will be gone. There could be more jobs in order to fix the tech debt and AI slop. I'm using Claude daily to write all my code, I wouldn't bet my life on the fact I'm more productive individually, and I don't think my team/company is. But we'll see.
What happens to the economy if AI is a super success and 90% of the financial benefits flow to the already existing top .01%?
Does the same thing happen if AI is a failure and all AI related stocks and investments crash?
IMHO no one knows what the heck is going to happen. We'll probably have to adapt to the situation as needed... In other words, it's too hard to predict to come up with a plan ahead of time?
At a time like this, don't put all your eggs in one basket. So maybe come up with a plan to get more baskets.
What would that look like? Well, I've seen the statement that you can become a licensed phlebotomist (one who draws blood) for $500. That gives you an option that is not "write code until I get laid off". (Of course, in a true AI takeover, we're going to have blood-drawing robots eventually, so it's not a permanent fix.)
More generally, even if software stays around forever, you may not be doing the same kind of software for your whole career. (I have done both internet security software and embedded systems.) You almost certainly won't be at the same company for your whole career. Keep learning new things, and keep your eyes open for new opportunities. Right now is more scary than it often is, but we have always needed to keep our eyes open for what we'll do next.
UBI
I remember saying ~2 years ago that people should probably assume the role they're in would be their last programming job. I feel like that was an unreasonably good prediction given almost everyone on HN at the time was arguing that LLMs were just stochastic parrots.
I suspect people will still be needed to create software for some time but increasingly it won't be software engineers who have spent decades learning syntax. It will be a relatively poorly paid role compared to today. If you're happy to be paid a fraction of what you're paid today, perhaps you can transition into a vibe coder role.
There will probably be some more senior people maintaining projects, but the number of such people you'd realistically need at any org is probably in the single digits. The main hiring criterion will be that they're very personable, not the typical antisocial engineer type. The autistic 10x engineer's value is basically zero in this new AI economy.
Longer term (~10-20 years) we're all dead anyway so your priority probably shouldn't be to optimise for income or prestige.
fingers crossed!
If AI takes enough jobs from people, we will all get together, start a revolution, and overthrow our respective governments. Desperate people will do desperate things to survive.
There are always more of us than them.
I truly hope that it happens in my lifetime. I really am sick of this shitty world. It's not what I was promised 69 years ago when I was born.
The governments may by that time have armies of drones and robots, controlled by a few loyal people or by AI.
If my heat stops working then my heat starts working.
Those people may have armed robot guards.
Well then it sounds like I’ll be free of my problems soon enough