NHacker Next
The ladder is missing rungs – Engineering Progression When AI Ate the Middle (negroniventurestudios.com)
ngburke 24 hours ago [-]
Spot on. All those years of slinging code and debugging gave me and others the judgement and eye to check on all the AI generated code. I now wonder often about what hiring looks like in this new era. As a small startup, we just don't need junior engineers to do the day to day implementation.

Do we instead hire a small number of people as apprentices to train on the high level patterns, spot trouble areas, develop good 'taste' for clean software? Teach them what well organized, modular software looks like on the surface? How to spot redundancy? When to push the AI to examine an area for design issues, testability, security gaps? Not sure how to train people in this new era, would love to hear other perspectives.

kace91 19 hours ago [-]
Here's a depressing take.

Most places I worked at, seniors were expected to do the junior work, only faster. All the actual senior stuff (architecture, refactoring, code quality, you name it) is usually done either against management or as a concession to humor the devs.

Now that our ability to go fast has been supercharged, I suspect we're just going to see a massive lowering of quality across everything. We seem to be seeing it already in Windows, macOS, iOS, Azure...

Either the market stops accepting that lowering and we see a counterpush, or people become content with 97% availability. Considering how normalized it is nowadays to have data leaks, I think the frog's already half boiled.

hapticmonkey 17 hours ago [-]
I don’t think this is limited to coding. We’ll eventually see it across all products and services. It’s just a matter of how much customers are willing to accept.
tdeck 8 hours ago [-]
I unfortunately agree with this. Look how many senior engineers are already abdicating using their "judgement and eye to check on all the AI generated code" in favor of leaving an agent running all night and maybe skimming through 30k lines in the morning.
xboxnolifes 18 hours ago [-]
> As a small startup, we just don't need junior engineers to do the day to day implementation.

> Do we instead hire a small number of people as apprentices...

Are you not just re-describing what a junior engineer is? Someone with potential you hire to have them learn and grow on your team?

I don't understand what is different. Pre-LLM, companies seemingly expected juniors to know a ton about coding. Post-LLM they'll expect them to know a ton about LLMs.

alephnerd 17 hours ago [-]
> Are you not just re-describing what a junior engineer is?

Basically, the salary expectations are out of whack for expected output.

The 25th and 50th percentile TC for an entry level SWE in the US are $100K and $140K respectively [0].

Meanwhile, the 90th percentile TC for an entry level SWE in Canada is US$115K [1] and US$120K [2] in the United Kingdom.

Why should I hire an entry level CS major from Sacramento State or UMass Dartmouth when I can hire a UWaterloo/UBC or Oxbridge/Imperial CS grad who is guaranteed to have done multiple internships?

CoL is similar across most of the US, Canada, and the UK, if not more expensive in Canada and the UK. And the excuse of "healthcare" doesn't hold either: both Canadian and American employees pay about the same in healthcare fees and benefits, even including COBRA during a layoff or an ACA plan. And increasingly in the UK, our PortCos have started offering private healthcare plans because of NHS issues.

What is happening is that, globally, new grad hiring will be significantly reduced, with hiring occurring at target programs where curricula and student quality are already well understood.

On the other side of the bell curve, the bottom half of apprentices globally will be trained by Accenture/Deloitte/PwC, WITCH, or FPT-type companies, which tend to pay bottom-barrel new grads around $6K TC starting salaries (roughly what one could earn farming or as an automation engineer on a factory line in India, the Philippines, or Vietnam), but force them to study on the job through their education and university programs, and will merge their output with GenAI platforms.

The kind of organization that viewed software as a loss leader before AI still doesn't have an incentive to hire internally even with AI. Meanwhile, companies who view software as critical to their operations will continue to expand GCCs and pick-and-choose the top tier of talent to incubate internally.

IF you are a new grad in North America, this means you need to move to a Tier 1 tech hub like SF or NYC ASAP - these are the only hubs with the right density of talent and self-sustaining software hiring markets that can ensure you will find your next job if you get laid off or need to find an entry-level role.

IF you are a new grad and already have a role - UPSKILL ASAP. A decently regarded online MSCS like GT's or UT Austin's doesn't cost more than $10K total, and other programs like UIUC's MCS or Dartmouth's MEng cost $20K-40K in their entirety, which is worth it. Additionally, you will have to self-skill in your free time as well.

[0] - https://www.levels.fyi/t/software-engineer/levels/entry-leve...

[1] - https://www.levels.fyi/t/software-engineer/levels/entry-leve...

[2] - https://www.levels.fyi/t/software-engineer/levels/entry-leve...

lovich 6 hours ago [-]
I remember in college we were taught that early economists thought capitalism and increasing productivity from innovation would lead to less work and effort needed from people, not more.

It must have been nice to be that optimistic and not have to see how it's actually playing out.

jplusequalt 5 hours ago [-]
https://en.wikipedia.org/wiki/Hedonic_treadmill

Economists typically point to this phenomenon when people talk about the relatively stable working hours over the last 50-60 years. I've seen some of them argue it's an issue of supply and demand, and that if people truly wanted to work less we'd see more demand for such careers. I think this ignores that retirement and medical benefits are almost exclusively tied to jobs expecting you to work 40 hours a week.

alephnerd 5 hours ago [-]
> we got taught that early economists thought capitalism and increasing productivity from innovation would lead to less work and effort needed from people and not more

Keynes wasn't wrong. The issue is macro-level productivity is orthogonal to personal effort and productivity. And what Keynes was talking about was macroeconomics, not individuals.

For example, it takes 10% the workforce it took in early 20th century to produce the same amount of agricultural output in the US in the 21st century.

Similarly, end-to-end automotive manufacturing via industrial robots has reduced the need for a line worker whose job was to screw in a door on an assembly line.

The economy is much more productive and efficient today than it was a century ago, but automation leads to a subset of workers specializing and a larger set being deskilled or unemployed because they didn't upskill when they had the chance.

It's interesting to watch the same class of people who told coal miners "they should learn to code" back in the early 2010s now getting the same comeuppance.

Frankly, American SWEs got lazy and lost their competitive edge especially during the early 2020s.

lovich 54 minutes ago [-]
> It's interesting to watch the same class of people who told coal miners "they should learn to code" back in the early 2010s now getting the same comeuppance.

When I told people to learn to code in that situation it was with pity and I would talk to them about how I felt forced to do so after I graduated with a useless degree during the Great Recession.

It was more of a “here’s one of the few growth areas left that are feasible to self teach”, rather than contempt for people not being on the same class as me.

> Frankly, American SWEs got lazy and lost their competitive edge especially during the early 2020s.

If “competitive” edge at this point means needing to get a masters on top of needing to train unpaid on your free time I think it’s more that corporations in America have gotten to the point of wanting increasingly rare or expensive to acquire skills in their labor force, while simultaneously deciding that they will be paying approximately $0 in any and all training costs.

AI is only accelerating that, as every manager and exec is drooling at the mouth at the idea of never hiring juniors again. It'll be some other asshole's problem, like their future self, who has to deal with what happens after the lack of people in training finally catches up to the industry.

alephnerd 36 minutes ago [-]
> It was more of a “here’s one of the few growth areas left that are feasible to self teach”, rather than contempt for people not being on the same class as me

> If “competitive” edge at this point means needing to get a masters on top of needing to train unpaid on your free time...

Doesn't it suck being asked to completely retool and reskill in the middle of your career?

lovich 34 minutes ago [-]
I never said it didn’t. I also wanted tech workers to unionize while we had the power because I expected this the second it was feasible, but alas we have no more leverage.
alephnerd 31 minutes ago [-]
> I also wanted tech workers to unionize while we had the power

You could, but that does nothing to prevent job losses, as can be seen with Hollywood completely offshoring to the United Kingdom [0][1] despite SAG-AFTRA and WGA dominating the entertainment industry.

Or even the loss of the entirely unionized coal industry.

The economics of IP-driven industries require an entirely different approach from manufacturing industries.

You can't ignore economics. This is what globalization looks like.

[0] - https://www.nytimes.com/2025/04/19/movies/hollywood-filming-...

[1] - https://www.bloomberg.com/news/articles/2026-03-13/hollywood...

jplusequalt 5 hours ago [-]
>It's interesting to watch the same class of people who told coal miners "they should learn to code" back in the early 2010s now getting the same comeuppance.

There are millions of software engineers in the US alone. Don't put all of them into a single bucket.

waterTanuki 19 hours ago [-]
When your senior developers retire, and if the LLMs haven't caught up to their level by that time, where do you think new senior developers will come from?
orangecoffee 19 hours ago [-]
The response I get offline is that there's millions of seniors right now that can last a couple decades.
FromTheFirstIn 15 hours ago [-]
I don't know about the rest of them, but my rates keep going up. And the more I read takes like this, the more I ask for.
arcanemachiner 19 hours ago [-]
Buy Now, Pay Later
raw_anon_1111 19 hours ago [-]
That's a tragedy-of-the-commons problem. But not my problem. I'm judged as a senior+ by meeting my quarterly and at most yearly goals and strategy.

No current manager (I’m not one) is incentivized to care.

6510 17 hours ago [-]
MBA! lol

We've actually been here before with higher-level languages. Assembly is itself a higher-level language: performance is worse than hand-tuned machine code, and it can't really self-modify or do code generation. To squeeze all of the wine from the rock you do need 100 times more effort. C is luxurious compared to assembly. Python is even more productive. We don't use HTML/CSS/JavaScript because it is fast; it's gawd-awful slow. I can, however, get something up and available to the world in less than a minute.

Then we pretend to be optimizing our websites for performance, but we have no idea what code is triggered by our instructions. If the button responds in 0.2 seconds we are good. You know, the time it takes for the CPU to execute tens of billions of instructions?
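Back of the envelope (the clock speed, IPC, and core count here are just assumptions to make the point):

```python
# Rough count of instructions a modern CPU could retire while a button
# takes 0.2 s to respond. All hardware numbers below are assumptions.
clock_hz = 3e9      # assumed 3 GHz clock
ipc = 4             # assumed instructions retired per cycle
cores = 8           # assumed core count
budget_s = 0.2      # the "responsive enough" button latency

instructions = clock_hz * ipc * cores * budget_s
print(f"{instructions:.2e} instructions")  # ~1.92e+10, i.e. tens of billions
```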

We already are MBAs!

orangecoffee 19 hours ago [-]
How much are you willing to pay? Is there any expectation of payoff?
cyanydeez 18 hours ago [-]
Give juniors local models and plan for a workflow that doesn't require subsidized compute with lock-in.
raw_anon_1111 19 hours ago [-]
As much as I get pushback for saying that since AI, I never look at the code and I can still be sure it meets the functional and non-functional requirements, no one has been able to dispute my methodology.

For functional requirements I review both the unit and more often the integration tests and make sure they align with the requirements.

For security, I validate the API endpoints can’t be accessed without authentication and these days I use Amazon Cognito.

The host environments - Lambda, EC2, and Docker runtimes (ECS/EKS) - have IAM roles attached with least privilege.

Then someone asked about multi tenancy and RBAC. Since mostly I deal with B2B clients with large business customers, each customer gets their own AWS account.

For RBAC, DynamoDB and Postgres at least on AWS both have Row level security that you can tie to a user or group - again authorized by Cognito or IAM. Even if the code does miss something - it’s still protected.

The database itself doesn't have access to the outside world, and if I can, I don't even assign a user to the database and instead use the AWS Data API when possible, which goes through the AWS control plane and IAM. If I do end up using a database user, it again has least privilege.

Of course UX and UI testing has to be done manually.

I do carefully review the “works on my machine” and “works with small dataset” footguns - like concurrency implementations and I also have scalability tests.
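As a sketch of that unauthenticated-access check (the client and endpoint names are made up, not my actual stack):

```python
class FakeResponse:
    """Minimal stand-in for an HTTP response."""
    def __init__(self, status_code):
        self.status_code = status_code

class FakeClient:
    """Stand-in for a real HTTP client; paths in open_paths simulate
    handlers that forgot to require auth."""
    def __init__(self, open_paths):
        self.open_paths = set(open_paths)

    def get(self, path):
        # This client never sends an Authorization header.
        return FakeResponse(200 if path in self.open_paths else 401)

def endpoints_missing_auth(client, endpoints):
    """Return every endpoint reachable without credentials."""
    return [p for p in endpoints
            if client.get(p).status_code not in (401, 403)]

# A misconfigured /reports endpoint shows up immediately:
client = FakeClient(open_paths=["/reports"])
print(endpoints_missing_auth(client, ["/orders", "/reports", "/users"]))
# ['/reports']
```

Against a real deployment you'd run the same loop with an actual HTTP client and the list of your API routes.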

dasil003 18 hours ago [-]
I agree that the way that today's generation of seasoned programmers learned their craft is going away, and that we don't know how the next generation will learn. I disagree very much with the conclusion:

> They didn’t trade speed for learning. They traded learning for nothing. There was no trade-off. There was just loss.

I believe this conclusion is due to a methodological problem, a form of begging the question if you will. One thing I am certain of is that humans who set their mind to something learn something, and good programmers are among the most tenacious in setting their mind to something. With agentic coding, they definitely learn different things, and so I would expect syntax knowledge to be weaker, but debugging and review skill will increase overall. Why? Because there will be more code, and more breakage, and I still haven't seen any tooling that allows a non-technical person to be effective at this.

Programming knowledge has always had a half-life. The way I see it, this is a big sea change that will fundamentally change the job of software engineers, and some non-trivial percentage will either change careers or find a sheltered, slow-moving place to finish out their working years. But for those who were not attached to hand-crafted code, AI provides power tools that empower technically minded people more than anyone else. I have full faith that the younger generation still has the same distribution of technical potential, and they will still find ways to develop their craft just as previous generations of hackers have always done.

beej71 21 hours ago [-]
Kids growing up with PCs and learning by tinkering and making crap code reminds me of the Go (game) proverb: "Lose your first 50 games as quickly as possible." That time was so valuable.

Or, as I tell my students, "Every failure is a growth opportunity." I let them resubmit corrected projects for points, too. I'm desperate for them to get the reps in that they'd normally have had as juniors in the field.

gburgett 18 hours ago [-]
We hired a software engineer fresh out of college this January, and we are a very Claude-heavy shop. We have noticed, and have explained to him, that we still have to do the real engineering work: requirements decomposition, interface design, verification, and integration. Claude is helpful, but he is still responsible for his output. He's been doing well so far! There's still plenty for him to bang his head on and grow from.
datadrivenangel 24 hours ago [-]
Demand for software is large and as the cost goes down we'll want more of it, so there will be demand to keep training people.
mitthrowaway2 21 hours ago [-]
Maybe. Depends on how good the substitute is. Demand for number crunching went up as costs went down, but nobody is training human "computers" anymore.
FuckButtons 16 hours ago [-]
I don’t know that those people were exactly out of a job though, they didn’t do that job, but I find it hard to believe that any of the people solving orbital mechanics by hand wound up with nothing to do but twiddle their thumbs for the remainder of their lives. Similarly, I don’t know that there’s any realistic prospect, even if ai winds up writing all the software, that there wont also be incentive to have people that also understand it.
FromTheFirstIn 15 hours ago [-]
What makes you think the cost has gone down?
quantummagic 18 hours ago [-]
Digital technology has been gutting other fields for decades. For a few examples, we have lost the ladder that trained people as Linotype Operators/Typesetters, Photographic Lab Technicians, Legal Research Associates, Dictation Typists and Stenographers, etc. All which could lead to higher skilled and higher paid jobs.

We didn't care until the same process came knocking on the door for us.

FromTheFirstIn 15 hours ago [-]
All of those are examples where technology replaced the core skill completely. In the age of digital photography and printers I can run a Photo Booth without worrying about the specific techniques of photographic lab technicians. LLMs aren’t replacing the core skill of engineering, because the core skill isn’t writing code, it’s articulating complex systems.
gexla 18 hours ago [-]
> I saw a LinkedIn post a week or two ago by a senior engineer with 25 years’ experience in the industry...

And how many years have we had capable AI? Maybe it's going to take a similar timeline for people to figure out how to be good with an AI assisted workflow (if not fully automated.)

paxys 19 hours ago [-]
I'm old enough to remember when people complained that we would never have competent engineers again because new hires are starting with higher level languages rather than doing "real" programming. AI or not, the profession will be fine.
iugtmkbdfil834 19 hours ago [-]
I think you are right long term, but not based on misapplied historical trends. In some sense, it does not appear to be a simple "no code/low code" hype as in some of the previous iterations. FWIW, I can only speak for myself and for what I see around me.

Still, what I anecdotally do see is economizing on current AI spend in the typical way demanded by most MBA types (give it to everyone, but somehow make it less expensive being one of the more amusing symptoms).

bearfox 23 hours ago [-]
This fits my bias so well that I'm skeptical but can't refute it. So well, in fact, that the title reminds me of an SF story I always come back to when thinking about the effect of AI on society: "The Plateau" by Christopher Anvil.
jjk166 22 hours ago [-]
> and those tasks were never just tasks. They were the mechanism that built judgment, intuition, and the ability to supervise the systems we now delegate to AI.

Bullshit. The busywork wasn't being done by low-level engineers to train them up; they were doing it because it needed doing, it was undesirable, and they were lowest on the totem pole.

Jobs are self-training. Sure, doing other jobs may give you some intuition that can be applied to new ones. Manually writing code and fixing your human-created mistakes obviously carries over to debugging AI-written code. But people who start their careers with AI-written code will also learn how to debug AI code. You don't learn how to architect a system by coding a system somebody else architected. At best you might pick up some common patterns by osmosis, but this often breeds worse engineers who do things as they have been done in the past, without understanding why and without regard to how they really ought to be done. True understanding of why A was chosen in one case and B works better in another comes from actually doing the high-level work.

Indeed, if AI usage is like any other tool that has come before it, those who grow up using it will be much more adept at utilizing it in practice than those who are adopting it now after spending a lifetime learning a different skillset. We don't exactly lament how much worse software engineers have gotten since they no longer learn how to sort their punch cards if they drop them.

Even if you are of the opinion that the tasks junior engineers do, which now AI can do, are fundamental to becoming competent at higher level skills, that's no problem. You can train people without them doing value-added work. Have engineers code the old fashioned way for training purposes. It's no different from doing math problems despite calculators existing. This is a problem only for extracting underpaid labor out of junior engineers with the lie that they are being paid in experience.

hungryhobbit 22 hours ago [-]
>> and those tasks were never just tasks. They were the mechanism that built judgment, intuition, and the ability to supervise the systems we now delegate to AI.

> Bullshit. The busywork wasn't being done by low level engineers to train them up, they were doing it because it needed doing, it was undesirable, and they were lowest on the totem pole.

Why not both? It was work that needed doing AND it taught people to be better engineers.

jjk166 22 hours ago [-]
> it taught people to be better engineers.

It generally does not.

And if it does, they can still do those tasks as exercises.

sublinear 22 hours ago [-]
Haven't we been saying similar for all other aspects of software engineering too as they have changed over time? Writing code is just one responsibility amongst many.

I don't want code from someone/something that doesn't know the needs of the business, cannot find where to compromise effectively, does not understand the deployment environments their app will run in, would not know how to respond to an incident with their application in production, etc.

I don't think writing code with AI is relevant to career progress at all. What matters is that I can hold someone accountable for the code we have in prod, and they'd better have answers or they don't have a job.

If they are dependable there, only then can they be trusted with more responsibility. That's all we're really talking about. You get paid to be accountable. You do not get paid to do one narrow thing well. It should not take you a decade to learn to read and write code quickly and effectively. I'd argue that should have happened when you were in high school and college (as it did for everyone currently in upper management).

I feel like the quality of new hires has progressively become worse over the years, and we have made so many concessions to remedy it (AI included), and all it's doing is making the problem worse.

DGAP 22 hours ago [-]
There's going to be very very very few engineers.
wolttam 1 day ago [-]
I feel like some of the data in this is horrendously out of date. They're referencing articles from the end of 2024.

There was a massive step-change in the capability of these models towards the end of 2025.

There is just no way an experienced developer should be slower using the current tools. That doesn't match my experience at all.

The title of the article, though - absolutely true IMO

Esophagus4 1 day ago [-]
Yeah…

> For tasks that would take a human under four minutes—small bug fixes, boilerplate, simple implementations—AI can now do these with near-100% success. For tasks that would take a human around one hour, AI has a roughly 50% success rate. For tasks over four hours, it comes in below a 10% success rate

Opus 4.6 now does 12hr tasks with 50% success. The METR time horizon chart is insane… exponential progression.
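To illustrate what that exponential implies (the ~7-month doubling time is an assumption loosely based on METR's published trend; treat the numbers as illustrative):

```python
# If the 50%-success task horizon doubles every ~7 months (assumption),
# how long until a 1-hour horizon passes a 40-hour work week?
horizon_hours = 1.0
doubling_months = 7      # assumed doubling time
months = 0
while horizon_hours < 40:
    horizon_hours *= 2
    months += doubling_months
print(months, horizon_hours)  # 42 months to reach a 64-hour horizon
```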

indoordin0saur 1 days ago [-]
Really depends on what you're working in. For me, I work with a lot of data frameworks that are maybe underrepresented in these models' training sets and it still tends to get things wrong. The other issue is business logic is complex to describe in a prompt, to the point where giving it all the context and business logic for it to succeed is almost as much work as doing it myself. As a data engineer I still only find models to be useful with small chunks of code or filling in tedious boilerplate to get things moving.
blonder 24 hours ago [-]
Agreed. For common use cases like creating a simple LMS, Opus is shockingly good, saving hours upon hours of reinventing the wheel. For other things, like simple queries to and interactions with our ERP system, it is still quite poor, and it increases development time rather than shortening it.
drzaiusx11 17 hours ago [-]
Just anecdotal but I work on some fairly left field service architectures; today it was a highly parallelized state machine processor operating on an in-house binary protocol.

Opus 4.6 had no issue correctly identifying and mitigating a hairy out-of-order state corruption issue involving a non-trivial sequence of runtime conditions from thrown errors and failed recoveries. This was simply from having access to the code repository and a brief description of the observed behavior that I provided. Naturally I verified it wasn't bullshitting me, and sure enough it was correct. Impressive really, given none of the specifics could have been in its training set, but I guess we're finding that nothing really is "new", just a remix of what's come before in various recombinations.

alistairSH 24 hours ago [-]
How is success defined in those metrics? Is success "perfect - can deploy to prod immediately" or "saved some arbitrary amount of engineering time"?

Anecdotal experience from my team of 15 engineers is that we rarely get "perfect," but we do get enough for massive time savings across several common problem domains.

Esophagus4 21 hours ago [-]
I think for me, it’s not so much an objective success metric as it is showing its progression over time.

What amazes me is how fast LLMs are progressing. And it still feels like early days (!).

For methodology, I would check out the METR website though, they’ve published their results.
