Sigh, here we go again. Model release day is always the worst day of the quarter for me. I always get a lovely anxiety attack and have to avoid all parts of the internet for a few days :/
I feel this way too. Wish I could fully understand the 'why'. I know all of the usual arguments, but nothing seems to fully capture it for me - maybe it's all of them, maybe it's simply the pace of change and having to adapt quicker than we're comfortable with. Anyway, best of luck from someone who understands this sentiment.
Really? I think it's pretty straightforward, at least for me - fear of AI replacing my profession and also fear that it will become harder to succeed with a side project.
Yeah, I can understand that, and sure this is part of it, just not all of it. There are also broader societal issues (e.g. inequality), personal questions around meaning and purpose, and a sprinkling of existential dread (but not much). I suspect anyone surveyed would have a different formula for what causes this unease - I struggle to define it (yet think about it constantly), hence my comment above.
Ultimately, when I think deeper, none of this would worry me if these changes occurred over 20 years - societies and cultures are constantly in flux, and that includes jobs and what people value. It's the rate of change, and the inability to adapt quickly enough, that overwhelms me.
I'm not worried about inequality, at least not in the sense that AI would increase it - I'm expecting the opposite. Being intelligent will become less valuable than it is today, which will make the world more equal, though it may not be a net positive change for everybody.
Regarding meaning and purpose, I have some worries here too, but can easily imagine a ton of things to do and enjoy in a post-AGI world. Travelling, watching technological progress, playing amazing games.
Maybe the unidentified cause of unease is simply the expectation that the world is going to change and we don't know how and have no control over it. It will just happen and we can only hope that the changes will be positive.
See, I don't have any of this fear. I have zero concerns that LLMs will replace software engineering, because the bulk of the work we do (which isn't writing code) is not at risk.
I felt this way from a year ago up until February 2026. Claude Code and Codex becoming the norm cemented for me that a lot of the projects people are working on (including mine) are totally obsolete. As far as I'm concerned, most code is now abstracted away, and people only want better agents - not traditional software products, except as infrastructure or platforms.
It also looks like the final form of the AI roll-out: whatever the model or application, this is the era of agents, and probably in the near future mostly automated agents. We'll see an overflow of bespoke automation and in-house agents doing everything from personal task management to enterprise business processes, so releasing a "Personal Fitness Tracker" or a "CRO Auditor" in 2026 doesn't make any sense.
All of my anxiety around it has evaporated because I can see what it actually is: an ouroboros of AI output generating automation of more AI output. What most software engineers will be working on now is guiding that output, making it easier to inspect/configure it, optimizing it, and improving the consumer and developer experience.
Otherwise, we just have to drop our old concepts for projects and work on something else.
For the consumer the floor is rising, and for the experienced developer the ceiling is rising. I personally hate web dev anyway, and I'm glad I can work on interesting engineering problems (even with the help of an AI) instead of having to manually stitch together yet another REST API, or website, or service pipeline.
I think what "very basic programming tasks" actually means keeps sliding across the years. Surely in the beginning, being able to write assembly counted as a "very basic programming task", but as Algol and Fortran took over, suddenly those instead became the "very basic programming tasks".
Repeat this for decades, and "very basic programming tasks" might be creating a cross-platform browser by using LLMs via voice dictation.
Skill atrophy is intrinsically embarrassing, no matter what those skills are. I am embarrassed to admit that I have forgotten a lot of how to hand-optimize C code with inline assembly, even though few people do that anymore.
And the person you are responding to is asserting that the response to incompetence of this level should be the SAME as if it were directed and intentional malice. Which is a completely valid way to view a fuckup like this.
>response to incompetence of this level should be the SAME
sure.
but this was not a deliberate attack by microsoft employees to shut down wireguard. that is what i was trying to say, and the essence of the quote in question.
Microsoft drove a truck through a school yard at 150mph. It was not a deliberate attack, it was just the fastest route and their map says there's a highway there. Is it malice?
A certain level of recklessness is automatically malice.
in that case, it certainly wouldn't be called a deliberate attack, right?
the edit in my original comment should hopefully clear up any confusion about my intended point. and, well... the comment you replied to should also make it clear that my entire point is centered around something being a deliberate attack vs. ridiculous incompetence.
the deliberateness of it is the entire reason i wrote my comment. choosing the phrase "malice vs. incompetence" was a poor choice on my part, when i should have been extremely explicit. that would have avoided all of this back-and-forth.
whether something is a deliberate attack or not is not worth pointing out?
it's, like, the only thing worth pointing out. if microsoft were deliberately targeting projects and literally attacking them, that would be huge fucking news. like crazy news. lawsuits galore.
If this is true, then a PM's Jira tickets are an abstraction over an engineer's code. That's not necessarily wrong under some interpretations, but it's not how the majority of engineers would define the word.
I think this is a pretty interesting comment because it gets to the heart of differing views on what quality means.
For you, non-buggy software is important. You could also reasonably take a more business-centered approach, where having some number of paying customers is an indicator of quality (you've built something people are willing to pay for!). Personally I lean towards the second camp: the bugs are annoying, but there is a good sprinkling of magic in the product which overall makes it something I really enjoy using.
All that is to say, I don't think there is a straightforward definition of quality that everyone is going to agree on.