Is anyone else experiencing this thing where your fellow senior engineers seem to be lobotomised by AI?
I've had 4 different senior engineers in the last week come up with absolutely insane changes or code that they were instructed to do by AI. Things that, if you used your brain for a few minutes, you'd realise just don't work.
They also can rarely explain why they made these changes or what the code actually does.
I feel like I'm absolutely going insane, and it also means I can't trust anyone's answers or analyses, because I /know/ there is a high chance they just asked AI and passed it off as their own.
I think the effect AI has had on our industry's knowledge is really significant, and it's honestly very scary.
-
@Purple
You are absolutely right. Lots of people blindly trust AI tools and completely bypass their critical-thinking braincells.
I am glad to be the only engineer on my team and thus do not have to deal with people like that. Even though I have access to a few AI tools, I will never have them make engineering decisions for me.
-
@Purple Have you tried _hiring_ lately? It's the wild west of mediocrity out there.
-
@Purple Yeah, I've seen it happen, but only once or twice. I think it happens when the engineer runs out of ideas on how to improve/maintain/whatever the solution they're working on, so they turn to AI for "suggestions".
-
@Purple Yes. I like to think of this as a positive factor for my job security. Every other outlook on this is too grim to think about.
-
@Purple big tech companies are pushing the AI narrative so that little tech companies use AI and cripple their own development.
Humans are the only species that not only kill their food, but spend time hunting and killing the other predators that hunt their food.
-
@swift I sometimes end up talking to them about it, and they seem to be aware that AI is sometimes wrong... and yet they keep using it for literally everything.
It's almost like an addict who knows the addiction might be hurting them, but can't stop using.
I can see the damage it's doing to the long-term maintainability of our environment and platform, too.
-
@Purple @swift you've actually nailed it
it looks like an addiction because prompting is like a slot machine: each prompt is another pull of the lever in the hope of a good result, and maybe, just maybe, if they prompt just one more time, they'll get one
<insert rant about needing human psych as a mandatory class during education here>
-
@spiqueras @Purple
People need to start hiring the old-fashioned way and demand no AI use on the job or in the resume & application.
It should be a contractual, fireable offence.
It might attract engineers who get the issues and prefer to use their brains, and who don't want to work somewhere they are forced to use it. It's an opportunity for the really skilled engineers who are anti-AI.
-
@Purple On the bright side, there are "some" benefits that I've seen in my experiments. The biggest is "planning".
We get requests from clients for modifications that often make no sense and are simply a disaster... yet we often just get to it and start framing it out so we can really get an idea of what needs to be done.
With this AI stuff, you can write up the entire process - the business logic - then include the steps to get it done and really get it planned out. Then you can have something like Claude CLI dig through it and actually write up a plan - not the code. And THEN you can break it into manageable pieces, push them into individual branches, and THEN start work.
At that point, I'd say 70% of the front-end UI can be done with AI, and then you do the rest. I mean, it's no different from the scaffolding you get with AngularCLI or VisualStudio, etc.
The real problem is laziness. It's too easy to just make it work and push your PR without even looking at "how" it works.
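To make the "break it into manageable pieces, push them into individual branches" step concrete, here's a minimal sketch in Python. The `PLAN.md` file name, the `## Task:` heading convention, and the branch prefix are all my own assumptions for illustration, not anything Claude actually emits:

```python
# Sketch: split a plan document into per-task git branch names.
# The "PLAN.md" file name, the "## Task:" heading convention, and the
# "feature/" branch prefix are assumptions -- adapt to your own plan format.
import re
from pathlib import Path

def tasks_from_plan(plan_path: Path) -> list[str]:
    """Return one suggested branch name per '## Task:' heading in the plan."""
    text = plan_path.read_text(encoding="utf-8")
    titles = re.findall(r"^## Task:\s*(.+)$", text, flags=re.MULTILINE)
    branches = []
    for i, title in enumerate(titles, start=1):
        # Turn the task title into a lowercase, hyphenated slug.
        slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
        branches.append(f"feature/task-{i:02d}-{slug}")
    return branches

if __name__ == "__main__":
    plan = Path("PLAN.md")  # hypothetical plan file written before any code
    if plan.exists():
        for branch in tasks_from_plan(plan):
            print(branch)  # e.g. feature/task-01-add-billing-form
    else:
        print("No PLAN.md here; this is only an illustration.")
```

The script is deliberately trivial: the point of the workflow is that the plan document, not the generated code, is the artifact you read and argue about first.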
-
@clew @Purple I both code and own the business. They *should* be worried, but they don't understand how an LLM works, nor how much better it is than them at a lot of things, like finding logical gaps.
Creative programming... finding a better way to accomplish a goal... finding a more economical way to get a short-budget project done? So far, AI is not going to replace the experience and intelligence of a good programmer with business experience. On the other hand, a lot of project managers can be ditched.
-
@Purple so yeah, you are not alone.
I used to think my (now previous) employer mostly had a staff that actually somewhat cared about the craft. I am not so sure anymore.
On the contrary, I more and more get questioned on, or have to justify, my decision to avoid these tools wherever I can (sure, being able to do so is a privilege, depending on how much pressure one is under, wherever they work, to feed and house themselves).
I get that the _craft_ changes over time (I had colleagues who grew up on assembler, and their C was sweet), but I'm pretty bewildered by how bad stuff is, and by why the _customers_ accept it. The customers, who give companies the _money_?
Then I deal with a not-so-hilarious string of systemic errors from, oh, my cable company's billing system. The only cable provider I have, and they're trying to persuade regulators to let them expand their monopoly.
-
@Purple I usually try to avoid blaming tools here ... If someone was prone to copy-pasting a bunch of code they didn't understand before ... nothing has really changed.
Senior devs should take responsibility for the shit they write and then want maintained long term.
-
@Purple A lot of people were always sloppy, and now they have free rein to have absolutely zero care.
-
@Purple @mhartle That is scary, but the lack of sufficiently trained and skilled professionals in software engineering has been scary for a long time. My colleague (a bona fide wizard) once wrote a cross-compiler that would turn other people's code into something he would write. Often it would fail, because the other code would fail. We didn't really use it as a test tool, but it was hella great at optimizing!
-
@suriele
This tool is noteworthy because it can produce bad code orders of magnitude faster than a human. It's faster even than doing a web search and hitting Stack Overflow.
It's also good at naming variables and adding comments that, at first glance, can look like what you want, so it takes longer to review.
It has also democratized things, such that anyone who can track down a bug report can now slam keywords into a chatbot and get code that purports to fix the issue, even if they don't know the language (or, indeed, how to program at all).
It's noble to try to avoid blaming tools, but this one is drastically different from most others, if it can be called a tool at all. Last I heard, companies making heavy use of it aren't seeing any productivity gain.
LLMs have drastically increased the cost of doing open source, at questionable-if-any benefit.
-
@Purple I do not have any fellow senior engineers I work with, so I cannot speak directly to the question you asked. I do, however, have a lot of experience with AI. This is my take on it, for what that is worth.
I have been working on AI since the late '80s, and there have been some drastic changes. Modern-day artificial neural networks are a real game changer. It is amazing what we can achieve with them. With that said, there are a lot of concerns to be had with the monetization of AI by large corporations. I personally only use local AI, with models I train specifically for a task on high-quality, public-domain data. Corporate AI, by contrast, tends to train on anything it can scrape from the internet with little to no vetting, which produces inferior training and, frequently, poor results.
People tend to forget that AI is more than just chatbots on the internet. We use visual AI to search for landmarks to help find lost people in the wilderness. We use visual AI to provide adaptive cruise control, parallel-parking assistance, etc. AI is a lot more than chatbots. Yet people tend to treat corporate AI as *the* AI, when it is typically the worst-quality AI you could choose to use. Now, I get it: you cannot train a base model yourself without a significant investment in a high-end GPU. You can, though, train a smaller, minimally trained model with an RTX 3060 Ti that is only around $200.00 - $300.00 USD. It is going to be painfully slow, but doable.
I constantly see such negative comments on AI from people who really do not understand what AI actually is, as demonstrated by their comments. Corporations setting up server farms that would require their own dedicated nuclear power plant? No, I am not with that. Local AI that can be tailored to very specific needs can indeed be a great help. We should always keep in mind that we are dealing with a neural network. Just like humans are not always correct, AIs will not always be correct either. You have to vet the information you get from an AI.
AI is like fire. Fire has burned down communities, and it has made food more nutritious (via cooking) and in some cases edible (many vegetables are toxic, even to a lethal level, when consumed raw). It has heated habitations to make them livable in colder months. To judge something only by its worst possible metric is not rational. AI has been used in the medical field, and many others, for a very long time, saving the lives of many people. I wish that when people get upset at corporate AI, they would direct their frustrations where they belong: at the corporations that are doing everything they can to monetize AI, including offering ridiculous classes on writing prompts.
-
@Purple I ripped out some unit tests from a project recently because I couldn't see any use for them (e.g. one test had PHPUnit create an object based on a trait, then check it implements the trait... which means it's literally testing PHPUnit's ability to create mocks instead of doing anything useful).
After a while staring at the code and wondering why on earth anyone would have written it, I noticed a comment at the top of the file saying it was generated by Claude 🫠
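For anyone who hasn't run into that pattern, here is roughly what such a test boils down to, sketched in Python with unittest.mock rather than PHPUnit (the class and test names are invented for illustration):

```python
# A deliberately useless test, analogous to the PHPUnit/trait example above:
# it builds the object with the mocking framework and then asserts something
# the mocking framework itself guarantees. Names are invented for illustration.
import unittest
from unittest.mock import Mock


class Greeter:
    def greet(self) -> str:
        return "hello"


class TestGreeterMock(unittest.TestCase):
    def test_mock_is_a_greeter(self) -> None:
        fake = Mock(spec=Greeter)             # the mock library creates the object
        self.assertIsInstance(fake, Greeter)  # passes because spec= sets __class__,
                                              # so this only tests unittest.mock itself


if __name__ == "__main__":
    unittest.main()
```

Not a single line of your own code runs, which is why the test tells you nothing.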