I’m surprised one or more of the software providers don’t just license this app and integrate it.
I am using AI to write more code. Experienced programmers can probably work 10+ times faster with AI. Right now, we have a strong pool of programmers and software engineers with decades of experience programming the old-fashioned way and an excellent understanding of software. The question is whether they can pick up AI and work with it. Some probably won't be able to - I don't know. I do wonder why Amazon and so many other companies are laying off experienced software engineers. Perhaps they just don't have any use for more software.
Anyway, experienced programmers who can write code in 6+ languages can make very, very good use of AI.
I am pretty sure the newer programmers coming out of universities, who have been using Copilot from the start, will be limited, and most will never attain the programming skill of the pre-AI software engineers.
So, I predict a plethora of software systems coming down the road. Many will be almost identical, for sure. EVENTUALLY, MOST WILL NOT BE MARKETABLE - THEY WILL BE OPEN-SOURCED. In fact, that is already happening.
Appraisers and other professionals who can learn enough about computers and software to use open-source software will have an advantage. What software engineers will be able to sell is their time: helping other professionals set up and run open-source software, along with training, courses, and books. It can go in many directions.
The other prediction is that this software will become complex as hell - because you have AI constantly jumping in and adding more functionality.
Yes, the software will basically be free, but so complex to use that you will need training, advice, and maintenance help - at times.
A good computer nowadays is the Mac Studio M2 Ultra, or maybe even a Mac mini M4 Pro. I asked ChatGPT about upgrading my M2 Ultra to an M3 or M5 Ultra with maximum memory and GPUs. It pooh-poohed the idea, saying what really matters is memory bandwidth, which it didn't expect to improve with the newer models; the best I could hope for was a slight increase in performance. It said if I really needed more power for AI, I would have to go to one of the more expensive Claude, ChatGPT, or Grok plans, in particular the so-called API versions. Probably it will be Claude Max - if I really need it.
My experience is that the ollama deepseek-r1:14b model does an excellent job of making changes to my code. It also handles the tedious work of creating unit and other tests, checking code in and out of GitHub, and so on. If I need to do something really complex, like mathematical reasoning, I can move up to the deepseek-r1:70b model, which runs fine on the M2 Ultra but is a little on the slow side.
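For anyone curious how such a local setup can be scripted, here is a minimal sketch that builds a request for ollama's local REST API (assuming ollama is serving on its default port, 11434; the model tag and prompt below are just illustrations, not a prescription):

```python
import json

def build_ollama_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for ollama's /api/generate endpoint."""
    return {
        "model": model,    # e.g. a locally pulled deepseek-r1 tag
        "prompt": prompt,  # the code-editing instruction
        "stream": False,   # request one complete response, not token chunks
    }

# Hypothetical example: ask the local model to tidy a small function.
payload = build_ollama_request(
    "deepseek-r1:14b",
    "Add type hints to this function:\ndef area(w, h):\n    return w * h",
)
print(json.dumps(payload, indent=2))
```

To actually send it, POST the payload to http://localhost:11434/api/generate (for example with the requests library) and read the "response" field of the reply; that is the piece a scripted test-and-check-in loop would consume.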
Nonetheless, I will probably buy the M5 Ultra when it comes out. I hope it has more than 512 GB of unified memory; 1+ TB would be nice. And more memory bandwidth. We don't know the exact specs yet, but going from the M2 to the M5 is predicted to provide only about a 30% increase in performance - we are starting to hit a performance wall.
BTW, existing software companies are going to be making very hefty profits unless their prices come down. With AI, they can get by with far fewer programmers. So, you should expect prices to decrease - look for price reductions of at least 20%.