Okay, I disagree that our brains “suffer similarly when outside of training data”. The capacity of a mind to infer meaning and dynamically problem-solve is qualitatively different from that of an LLM. We can see something completely new to us and immediately start making connections and inferences. LLMs don’t make inferences because they don’t understand meaning.
However, I agree that our brains are effectively just organic neural networks. That’s definitionally true, because neural networks are biomimicry, and we can tell we’ve got it right because they successfully mimic pieces of our brain. An LLM is effectively like the language-planning centre of our brain, but that planning centre just gives us phrases. We have to pass those phrases through our consciousness, our context engine, to determine whether they really mean what we want. When someone is choosing their words carefully, that’s what they’re doing. If we aren’t careful, we sometimes shit out words without really thinking, and we sound dumb, just like an LLM.