>>106473795 (OP)
>Bill Gates not agitating developers
For once, thank you. This market is not really developer friendly, and even if it's not true, I really appreciate the sentiment. Sifting through the constant stream of doom-and-gloom news about coding gets frustrating, especially since coding seems to be a specific target of AI.
Do I think coding will stay 100% human? Yes. People still have trouble connecting to Wi-Fi and Bluetooth even if you spoon-feed them. Gen Z and Gen Alpha are historically bad at tech, to the point where Boomers are teaching them how to install Adobe. I don't think that will change until there's another generational breakthrough in tech literacy like there was with Millennials.
I really wish AI would automate at least some of it, or the most annoying parts of it, where I have to announce, "TURN IT OFF AND TURN IT ON AGAIN" until it works. If AI is so great, why do people still need developers and IT to tell them to turn it off and on again?
Don't get me wrong, I think AI is great. I just think the kind of AI that will automate coding is worse than an actual developer. Current systems can spend a day or two on a task and rack up a few thousand dollars in fees from trying and failing over and over again. That easily becomes millions of dollars a year to brute-force an AI that is only as capable as a professional developer.
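Quick napkin math to show how fast that adds up. The $2k per task and 1,000 tasks a year are made-up numbers, just to illustrate the scale:

# every number here is an assumption, not a real figure
cost_per_task = 2_000       # dollars burned per agent run, retries included
tasks_per_year = 1_000      # roughly the workload of one full-time developer
print(cost_per_task * tasks_per_year)   # 2,000,000 -> about $2M a year for one "developer"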
When would this not be true? We're currently dealing with exponentials. It took about a year of training after ChatGPT was first released to get to GPT-4, then about two years for GPT-5 after that. It might take around four years (unless there's some breakthrough) for GPT-6, which would land around 2029 or 2030. Here's the release schedule, assuming training time keeps doubling:
GPT-7 -----> 2035
GPT-8 -----> 2045
GPT-9 -----> 2065
GPT-10 ----> 2105
That's just the training time, and it may not keep scaling like that anyway. On top of that, we would also need exponentially more training data, which we simply don't have.
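If you want to poke at the doubling assumption yourself, here's a quick sketch. The 2030 start for GPT-6 and the 5-year first gap are just guesses picked to line up with the list above:

# assumes GPT-6 in 2030 and a gap that starts at 5 years and doubles per release
year, gap = 2030, 5
for version in range(7, 11):
    year += gap
    print(f"GPT-{version} -----> {year}")
    gap *= 2
# prints 2035, 2045, 2065, 2105 -- the same schedule as the list above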