Yes. Let's hope we get another 15-20 years. But then we might see some big changes in just 2 years.
And yes, programmers are going to be hit hardest. Other white collar jobs more or less have some barriers or are already too cheap to be replaced.
I predict frontend and data people will be the first to be impacted. Reasons: 1) too close to business, 2) in general not really very technical.
The real fear should be that certain crafts will disappear: the craft of programming, the craft of technical writing, the craft of formulating and maintaining a budget for a large organization. Instead of being a programmer, you'll be someone who runs sanity checks on AI-generated code and occasionally pokes around strange-looking code when things seem not quite right.
Imagine you were a clockmaker at the time the first electric clocks (e.g., General Electric's Telechron) were introduced. Suddenly the skills you worked so hard to acquire and which gave you such satisfaction were obsolete.
White collar jobs won't go away, but they will become progressively less satisfying and less lucrative.
Quartz-based watches were introduced in the '70s and had a macroeconomic impact on Switzerland (the "quartz crisis").
People have been trying to sell gullible investors on "this will make programmers obsolete!" more or less since programmers have existed (_COBOL_ was sometimes marketed like this). I'm... unconvinced that This Time Is Different (TM), especially after seeing the actual output of these tools.
All of this seems to be predicated on "LLMs suddenly magically becoming intelligent", and it's just not at all clear why people think that might happen.
Same goes for most other white collar stuff, even moreso for, say, lawyers and accountants (where, really, part of the job is _taking responsibility_).
I and many others have said this before, but I'll say it again.
Programming is the easy part. It isn't hard, and has never been hard, to write software that generates code and other software. We've been doing that since the early 2000s with frameworks like WinForms, and, oh yeah, compilers. The only difference now is that we're doing statistical analysis that costs literally billions of dollars in order to generate that code with a data-first approach.
To generate code that actually does what you want and what your client wants, all while not being a clusterfuck of spaghetti that breaks after 6 months of changes? That's the challenging part.
So sure, if after two years as a junior developer you still need someone to hold your hand every step of the way and literally tell you, step by step, what must be done, then yeah, you probably don't have a bright future. But that would be true with or without the AI boom.
To answer the question more directly: no I'm not afraid.
Not at all. There is very little, practically none, of my day-to-day work that AI can do.
The other threat is of course reducing the number of people needed, like the automatic loom. Again, not too worried; I think that will generate new demand.
The deepfake side of things is more scary IMO.
What is something knowledge workers will still do better than AI in 5 years?
The automatic loom is something different because, like other advances in history, it just automated manual work. These and other industrial advances gradually shifted most jobs from manual to office-type work. Computers automated the boring/"mechanical" parts of thinking (simulations, solving equations, etc.), but now they are taking over the whole thinking part: intuition, systematization, requirements gathering, soft skills, etc. Nothing will be spared as far as I can see.
Stakeholders at the top of the ladder will still be there (as they will own the means of production: AI and compute centers), but they will mostly just instruct AI to do what they want. The AI will ask clarifying questions, do research, and maybe spin up a few dozen instances of itself to simulate the emergent results of having a group of people or an organization design and think things through, rather than a lone-wolf kind of thing.
What you say is certainly possible, but I have not seen anything from LLMs so far that makes me think it is probable.
They just make too many mistakes and are not really thinking. I think they are a dead end. If the scenario you describe happens, it will be because we discover some previously unimagined architecture, along with advances in processor efficiency and price.
Consensus in the machine learning community is that current architectures are enough for AGI, no new architecture needed. There is also plenty of leeway in how much money we can pour into training, and if the scaling laws for model capabilities still hold (and we have no reason yet to believe they won't), we have the resources, both financial and technological, to reach this within 10-20 years tops.
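For reference, and as an assumption about which scaling laws are meant here: the empirical laws reported by Kaplan et al. (2020) are power laws, i.e. linear only on log-log axes. The loss-versus-parameters form, with approximate constants, is:

    % Empirical test-loss scaling with (non-embedding) parameter count N,
    % from Kaplan et al. (2020); constants are approximate.
    L(N) \approx \left( \frac{N_c}{N} \right)^{\alpha_N},
    \qquad \alpha_N \approx 0.076, \quad N_c \approx 8.8 \times 10^{13}

One consequence of that form: each constant drop in loss requires a multiplicative increase in parameters (and, per the companion laws, in data and compute), which is what makes "how much money we can pour in" the binding question.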
> Consensus in the machine learning community is that current architectures are enough for AGI, no new architecture needed.
I don't think there's anything like consensus on this. There isn't even consensus on whether AGI is possible in the first place.
There isn't even consensus on what AGI means.
A few counterpoints: most developers prefer Claude 3.5 (or the slightly improved 3.7, released a few days back), which is already a year old. What was supposedly going to be ChatGPT 5 will be called 4.5 instead, after two multi-month and hugely expensive training runs. Grok 3 was trained on a massive cluster of 200,000 H100s and still doesn't beat SOTA models.
There's certainly not "a clear way forward", even though one may hope the billions invested will lead to algorithmic breakthroughs.
Physics is merciless, especially the laws of thermodynamics. You can dislike it as much as you want, but the creation simply can't be more perfect than its creator. So no, large language models won't replace us. They are Google 2.0, nothing more.
Concerned, yes. Even just considering current AI tech, it seems very likely that there will be an extreme and painful shakeout of today's white collar jobs.
A small subset of workers will be able to swim upstream and capture value from the tools, and that might only be for some limited window of time.
Exciting times though.
I'm genuinely worried that within 5 years AI will completely replace me as a software developer. For perspective, ChatGPT is three years old, and it went from spewing non-working code snippets to scaffolding a whole project in one try.
It's also better at soft skills, communication, understanding requirements and humans, and so on. I don't really see what will be left for white-collar workers once intelligence is completely automated. I think blue-collar jobs are safe for the most part, which is unprecedented in history.
No, I have no fear that's going to happen in any significant numbers. What I do fear, personally as a dev, is that AI will become mandatory and make my job unpleasant.
AI will not "replace" white collar jobs, but it will transform how those jobs are done. If you're concerned, it's less depressing to think of these times as a fast transition and adaptation phase.
If I make an analogy to nature & evolution, there's been a rapid change in the "environment" we need to survive in. There will be many casualties, but it's not your fault, so just focus on surviving for now, and you will eventually thrive.
Forcing ourselves to change comes with huge cognitive dissonance, and it's going to be a battle with ourselves during this quick adaptation and reprogramming phase as we learn new ways to solve old problems more efficiently.
I think AI will just change jobs. There will still be jobs.
I think AI poses a threat to knowledge though. Being able to distinguish between what is real and what is not is becoming increasingly difficult.
I'm not scared. I'm figuring out how to adapt to the inevitability.
The insightful mathematical property of exponential progress is that it is a law without memory.
In concrete terms, that means blue-collar workers are protected for far less time than you would expect, because at this rate of progress the limits of robotics automation are being plowed through.
So to answer your question: I do think AI will create greater and greater pressure on many jobs, starting with bad software engineers and gradually but quickly eating its way through every part of society.
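To spell out the "without memory" point with a small worked statement (assuming, as the comment does, that capability grows exponentially): if capability follows C(t) = C_0 e^{rt}, then the time needed to multiply it by any factor k is the same no matter how far along it already is:

    % Constant multiplicative-gain time under exponential growth:
    % the time to gain a factor k depends only on k and r, not on C_0 or t.
    C(t) = C_0 e^{rt} \;\Longrightarrow\; \Delta t_{\times k} = \frac{\ln k}{r}

In that sense the process has no memory: how long robotics took to get this far tells you nothing about how long the next tenfold improvement will take; under sustained exponential growth it is a fixed interval.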
I will be a psychiatrist in a few years, and I know there are ways to practice psychiatry that rely only on basic soft skills and the application of pharmacological heuristics and guidelines. All of this could be accomplished quite well by not-even-frontier models, and not long from now.
But for every job, there are also ways to practice it that are completely irreplaceable: the software engineer who invents Linux, the psychiatrist who works miracles with mostly his voice, the mechanic with invaluable insights and golden fingers. And of course, it is extremely hard to reach that level of mastery if you were never able to practice the job as a beginner because AI took over the junior-level work.
So yeah, I am sometimes scared that AI will completely replace huge swaths of society. That seems so extreme and yet so believable that the only thing I can think is that prediction markets may be among the most important technologies right now for figuring out our best moves, at both the individual and societal level.
And on the topic of psychiatry, there are also things I don't think humans are ready to let AI do. I can see how some assessments could be done more easily with AI, and how some therapies could also be delivered by AI… But can you imagine a chronically depressed person being willing to let an AI administer ECT on them??
Depends what you mean by replace.
I don’t think I will disappear as an interpreter between the machine and the imprecise meatware known as a product owner. Maybe I will need to move in that direction more myself.
Will I write much code by hand? I doubt it. Debug maybe.
What makes you think that the interpreting won't be taken over by AI? It already does a far better job at understanding me across more areas than most people can.
I'm scared that the executive class will think that AI could replace white collar jobs and then the whole world will become completely enshittified.
How people live and work has undergone massive changes over the last few hundred years. At no point was there a time when people couldn't find a way to support themselves. However, there is no entitlement to a certain style of life.