In the context of another multilingual model, I've heard that the majority of its training was in a single language, since that training seems to transfer to languages added later. That sounds plausible to me: adding a new language mostly means learning vocabulary and grammar, while the understanding of concepts should already be there.

Intuitively, supporting 140 languages instead of, say, the 5 most common ones would otherwise conflict with the goal of a small model that fits on a single GPU.
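
To make the transfer idea concrete, here's a minimal sketch of language-adaptive continued pretraining in the HuggingFace transformers style: take a model pretrained mostly in one language and keep training it on text in a new language, so it picks up the vocabulary and grammar while reusing the concepts it already encodes. The model name ("gpt2") and corpus file ("new_language.txt") are placeholders for illustration, not the actual recipe behind any particular model.

    # Illustrative only: continued pretraining on a new language.
    # Placeholders: "gpt2" stands in for any causal LM,
    # "new_language.txt" for a corpus in the language being added.
    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling,
                              Trainer, TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # Plain-text corpus in the language being added.
    dataset = load_dataset("text", data_files={"train": "new_language.txt"})

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=512)

    train_set = dataset["train"].map(tokenize, batched=True,
                                     remove_columns=["text"])

    trainer = Trainer(
        model=model,
        args=TrainingArguments(
            output_dir="adapted-model",
            per_device_train_batch_size=4,
            num_train_epochs=1,
            learning_rate=5e-5,  # small LR so prior knowledge survives
        ),
        train_dataset=train_set,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()

Nothing here is specific to the model being discussed; it's just the generic "teach an existing model a new language" loop, which is cheap precisely because the conceptual knowledge carries over and only the surface form has to be learned.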
