1. You don't need massive amounts of funding to work on ML. A good deal of important work in ML was done in universities (e.g. GANs, DPM, DDPM, DDIM) or was published before the hype (Attention). The only qualifier here is that training costs a lot right now. Even so, you don't need billions to train, and costs may fall as memory prices drop and hardware competition increases.
2. You don't need VC-type investors to fund ML research. Large tech companies like Facebook, Google, Microsoft, ByteDance and Huawei will continue investing in ML no matter what, even if the total amount they invest goes down (which I personally don't think it will). Even if they shift away from chatbots and focus only on simpler NLP tasks as described above, related research will still continue, because all these tasks are related. For example, Attention was originally developed for translation, and Llama 3.2 isn't just a chatbot: it can also do general image description, which is clearly important to Facebook and ByteDance for recommendations, and to Google for image search and ads. Understanding what people like and what they are looking at is a difficult NLP problem, and one that many tech companies would like to solve. And better image descriptions could improve existing image datasets by enabling better text-image pairs, which could in turn improve image generation (see the sketch below). So hard NLP, image generation and translation are all related and are increasingly converging into single multimodal LLMs. That is, the best OCR, image generation, translation etc. models may be the ones that also understand language in general (i.e. broad and difficult NLP tasks). The issue is that OP assumes it must be AGI or bust.
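To make the recaptioning loop concrete, here is a minimal sketch using the Hugging Face transformers image-to-text pipeline to generate captions for a folder of images, producing (image, caption) pairs that could later feed a text-to-image trainer. The BLIP checkpoint and the `images/` directory are illustrative assumptions, not the specific models or data discussed above.

```python
# Minimal recaptioning sketch: generate richer text for existing images,
# yielding (image_path, caption) pairs as candidate text-image training data.
# Model choice (BLIP) and paths are assumptions for illustration only.
from pathlib import Path

from transformers import pipeline

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

def recaption(image_dir: str) -> list[tuple[str, str]]:
    """Return (image_path, generated_caption) pairs for every .jpg in a directory."""
    pairs = []
    for path in sorted(Path(image_dir).glob("*.jpg")):
        # The pipeline returns a list of dicts with a "generated_text" field.
        caption = captioner(str(path))[0]["generated_text"]
        pairs.append((str(path), caption))
    return pairs

# Each pair is a candidate training example for an image-generation model.
for img, text in recaption("images/")[:3]:
    print(img, "->", text)
```

The point of the sketch is just the data-flow: a stronger captioner (e.g. a multimodal LLM instead of BLIP) slots into the same loop and directly upgrades the text side of the pairs.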