It looks like we're in an interesting cycle: millions of engineers contribute to open source on GitHub, and the best of our minds use that code to develop powerful models to replace those very engineers. In fact, the more code a group contributes to GitHub, the easier it is for companies to replace that group. Case in point: frontend engineers have been impacted the most so far.
Does this mean people will be less incentivized to contribute to open source as time goes by?
P.S. I think the current trend is a wake-up call for us software engineers. We thought we were doing highly creative work, but in reality we spend a lot of time doing the basic job of knowledge workers: retrieving knowledge and interpolating some basic and highly predictable variations. Unfortunately, current AI is really good at automating exactly this type of work.
My optimistic view is that in the long term we will invent or expand into more interesting work, but I'm not sure how long we will have to wait. The current generation of software engineers may suffer from high supply and low demand for our profession for years to come.
As much as I support community-developed software and "free as in freedom", "Open Source" got completely perverted into tricking people to work for free for huge financial benefits for others. Your comment is just one example of that.
For that reason, all my silly little side projects are now in private repos. I don't care that the chance somebody builds a business around them is slim to none. Don't think putting a license on them will protect you either: you'd have to know somebody is violating your license before you can even think about doing anything, and that's basically impossible if your code gets ripped into a private codebase and isn't obvious externally.
What harm is there to you if someone uses some of your code to build a business, as compared to not doing so? How are you worse off?
I’ve never understood this mentality. It seems very zero-sum and kind of antisocial. I’ve built a couple of businesses, and there’s always economic or technical precedent. I honestly don't mind paying it forward if someone can benefit from side projects I enjoyed doing anyways.
And much as people joke about "exposure" not putting food on the table, being able to walk into a job interview with your name known because the company is already using your code/project is huge.
If that someone then takes the work you're providing for free for other people to build on, makes a closed-source product out of it, and gives you no attribution, then you can be darn well sure I want to protect it.
So let's say your side project improves your life by 5 happiness points. You have two options:
--- OPTION A - Keep your project private.
• You get five happiness points.
--- OPTION B - Make your project public.
• Other individuals may get a small number of happiness points.
• A megacorp might turn your project into a major product without compensating you and get a million happiness points.
• You get five happiness points.
----------
In either scenario, you still end up with five happiness points. If you release your code, other people may get even more happiness points than you, which isn't really fair. But you are no worse off, and you've increased humanity's total wealth of happiness points.
You really don't see why somebody wouldn't like a megacorp taking their hard work and using it to make a billion dollars, while they themselves don't see a cent and are struggling to buy a house in this very unaffordable housing market?
Google, Microsoft, Meta, IBM, Red Hat, etc. are huge players in open source; they probably contribute significantly more hours of work building and maintaining major open source projects than the hobbyists do.
Not that hobbyists don't contribute, but these models are certainly being trained on the work of salaried engineers as much as they're trained on hobbyists' spare-time projects.
This assumes that making your project public or private has no other effect on your personal utility function, which may be true for you personally, but it certainly can't be assumed to be true in general.
> "Open Source" got completely perverted into tricking people to work for free for huge financial benefits for others
I'm quite conflicted about this assessment. On one hand, I wonder whether we would have a better job market if there weren't so many open-sourced systems. We might have seen much slower growth, but that growth would have lasted many more years, which means we might have enjoyed our profession until retirement and beyond. On the other hand, open source did create large pies, right? Like the "big data" market, the ML market, the distributed-systems market, and so on. Think of the millions of data scientists who could barely use Pandas and SciPy, or the hundreds of thousands of ML engineers who couldn't even be bothered to learn what a positive semi-definite matrix is.
> P.S. I think the current trend is a wake-up call for us software engineers. We thought we were doing highly creative work, but in reality we spend a lot of time doing the basic job of knowledge workers: retrieving knowledge and interpolating some basic and highly predictable variations. Unfortunately, current AI is really good at automating exactly this type of work.
Most of the hours in most creative work go to this type of drudgery. Professional painters and designers spend most of their time executing ideas that are already well fleshed out. Musicians spend most of their time rehearsing existing compositions.
There is a point to be made that these repetitive tasks are a prerequisite for coming up with creative ideas.
I disagree. AI has shown itself to be most capable in what we consider creative jobs: music creation, voice acting, text/story writing, art creation, video creation, and more.
The difference, of course, being that synthesizers and drum machines are instruments that require actual skill and talent and can be used to express the unique musical style of an artist, whereas AI requires neither skill nor talent, and it cannot generate anything with actual artistic direction, intent or innovation, much less a unique creative style.
AI is never going to give the world a modern Kraftwerk or Silver Apples or Brian Eno. The best an AI "artist" can do is have the machine mimic them.
Still the same thing. The argument then was that synths weren’t “real instruments” and that sequencers meant people weren’t “real musicians”.
AI relies on prompting. In the hands of a skilled artist it is just another tool. In the hands of an amateur hack, it is no different than giving a drum kit to a 4 year old.
They were right in many cases. You can choose to pick out the small percentage of musicians who succeeded with those tools, or you can recognize the many who were never known.
You can do the same for photography.
People keep lowering expectations or demands on quality because things get easier and humans always prefer the easy option.
Look, in the hands of a skilled artist, generative fill is really useful.
In the same way that the synth on Stevie Wonder's "Superstition" is banging.
Sampling in the hands of a legend is also spectacular; see The Prodigy and a breakdown of the samples they used (or any half-decent hip-hop band).
Then you get Akon, who just sped up a single sample, put a beat on it, and shat out some half-arsed shit.
AI music is disposable, generic, soulless trash, even if it is technically correct according to the rules and conventions of music theory. AI generates Muzak: totally generic and derivative.
There is no AI equivalent to Kurt Cobain, or James Brown, or Tori Amos.
There is no AI equivalent to Kurt Cobain or any other artist because 99% of what they are is not about their musical skill at all but about marketing. There are thousands of musicians just as skilled as Kurt Cobain or James Brown, if not more so, who don't have their fame. I also have no doubt an AI will outperform most musicians in short order. The step from making no music at all to making acceptable music is gigantic compared to the step from acceptable music to great music.
If you mean create in the literal sense, sure. But not in the sense of being creative. AI can't solve novel problems yet. The person you're replying to obviously means being creative, not literally creating something.
You can't say AI is creating something new but isn't being creative without clearly explaining why you think that's the case. AI is creating novel solutions to problems humans haven't cracked in centuries. I don't see anything more creative than that.
Yes, really. Just yesterday Google announced that their AI was able to improve on human SotA algorithms in 25% of the cases fed into it. One of them was 4x4 complex matrix multiplication, which had pretty huge pressure to be improved.
What is the qualifier for this? Didn't one of the models recently create a "novel" algorithm for a math problem? I'm not sure this holds water anymore.
My pessimistic view is that we're liable to end up cutting off the pipeline into the industry. Similar to lawyers replacing clerks with bots, if senior engineers can now command bots rather than mentor new hires, where is the on-ramp? How does one actually gain enough experience to become a senior?
Or is all this a nothing-burger, since the new hires will just be commanding bots of their own, but on lower level tasks that they are qualified to supervise?
> Does this mean people will be less incentivized to contribute to open source as time goes by?
Yes. I certainly don't intend to put any free code online until I can legally bar AI bros from using it without payment. As Mike Monteiro put it long ago, "F** you, pay me" (https://www.youtube.com/watch?v=jVkLVRt6c1U)
If you extrapolate and generalize further... what is at risk is any task that involves taking information as input (text, audio, images, video, etc.) and applying it to produce some useful information output or perform some useful action.
That's basically the definition of work. It's not just knowledge work, it's literally any work.