LLMs are also much better at faking it in remarkably unpredictable ways, and are hitting diminishing returns (or even backsliding) on improvements.
And LLMs can’t actually do anything without humans structuring everything and doing the work on their behalf.