I’m sort of a Luddite, so take this with a grain of salt.
I don’t see this going anywhere.
If it’s really good and we can actually have 1 developer instead of 2, why would any developer want to adopt it? It would basically be a piece of automation that diminishes the value we’re creating.
If it’s crap, it’s going to create mountains of verbose code written to pump up the LOC numbers. That’s terrible. And after that you’re going to end up maintaining and enhancing whatever the “AI” spewed out.
I’m not buying the argument that this is a problem that needs solving. It’s in the same vein as self-driving cars: it looks impressive, it’s good PR, it’s insanely hard to get right, and the benefits (even if we do get it right) are questionable.
I am skeptical of, and against, the hype about this "replacing" programmers, which it certainly won't: the AI engine it uses (GPT-3) is limited, and the code it generates can be garbage or introduce insecure, vulnerable code. That is why it will always be 'assistive' rather than a 'replacement' for anything. Ten years on, self-driving cars are still unsafe and immature.
The hype squad behind this tool know it is limited, but they want to capitalise on the 'AI' automation narrative with those who don't know any better.
The way it was introduced is also disingenuous.