They’re not even “stupid” though. It’s more like if you somehow trained a parrot with every book ever written and every web page ever created and then had it riff on things.
But, even then, a parrot is a thinking being. It may not understand the words it’s using, but it understands emotion to some extent, and it grasps the mechanics of “conversation” – taking turns talking, and so on. An LLM just statistically predicts which word should appear next.
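(For anyone curious what “statistically predicts which word should appear next” means in practice, here’s a minimal toy sketch in Python. The bigram table and its probabilities are entirely made up for illustration; a real LLM conditions on the whole context with a neural network over billions of parameters, not a lookup table.)

import random

# Toy bigram "language model": for each word, made-up probabilities
# of the word that follows it. Purely illustrative numbers.
BIGRAMS = {
    "the":    {"cat": 0.5, "dog": 0.3, "parrot": 0.2},
    "cat":    {"sat": 0.6, "ran": 0.4},
    "dog":    {"ran": 0.7, "sat": 0.3},
    "parrot": {"talked": 1.0},
}

def next_word(word):
    """Sample the next word from the conditional distribution."""
    candidates = BIGRAMS.get(word)
    if not candidates:
        return None  # no known continuation
    words = list(candidates)
    probs = [candidates[w] for w in words]
    return random.choices(words, weights=probs, k=1)[0]

# Generate a short "riff" one word at a time, the same way an LLM
# emits one token at a time conditioned on what came before.
word, sentence = "the", ["the"]
while word is not None and len(sentence) < 5:
    word = next_word(word)
    if word:
        sentence.append(word)
print(" ".join(sentence))  # e.g. "the cat sat"

No understanding anywhere in there, just sampling from learned frequencies – which is the point being made above.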
An LLM is nothing more than an incredibly sophisticated computer model designed to generate words in a way that fools humans into thinking those words have meaning. It’s almost more like a lantern fish than a parrot.
LLMs are the most well-read morons on the planet.
And how do you think it predicts that? All that complex math can be clustered into higher-level structures. One could almost call it… thinking.
Besides, we have reasoning models now, so they can at least emulate thinking.