I think the human brain works kind of the opposite of that. Babies are born with a shitload of synaptic connections, and those connections get pruned away over a person's lifetime (synaptic pruning). ANNs typically end up doing something similar during training: many connection weights get pushed toward zero, so they have little or no effect and can be pruned outright (rough sketch below).
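A minimal sketch of what magnitude pruning looks like, assuming a toy random weight matrix rather than a real trained network (the `threshold` value is hypothetical; real pruning schemes pick it per layer or by sparsity target):

```python
import numpy as np

# Toy "trained" weight matrix; in practice many entries would have
# been driven near zero by regularization during training.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 1.0, size=(256, 256))

# Magnitude pruning: zero out connections whose weight is close to zero,
# loosely mirroring synaptic pruning in the brain.
threshold = 0.1  # hypothetical cutoff
mask = np.abs(weights) >= threshold
pruned_weights = weights * mask

kept = mask.mean()
print(f"kept {kept:.1%} of connections, pruned {1 - kept:.1%}")
```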
But yeah, these LLMs are typically trained once and then frozen during use. "Online learning" is training that keeps updating the model as new data arrives, but current online methods typically produce worse models: ANNs "forget" old things they've "learned" when learning new ones (catastrophic forgetting). Toy demo below.
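A minimal sketch of catastrophic forgetting under online updates, assuming two synthetic "tasks" (two differently-generated classification datasets) standing in for old and new data; the model is fit on task A, then updated online on task B only, and accuracy on task A typically degrades:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

# Two synthetic "tasks" with different underlying distributions.
X_a, y_a = make_classification(n_samples=2000, n_features=20, random_state=0)
X_b, y_b = make_classification(n_samples=2000, n_features=20, random_state=1)

model = SGDClassifier(random_state=0)

# Train on task A first; partial_fit needs the class labels up front.
model.partial_fit(X_a, y_a, classes=np.array([0, 1]))
print("task A accuracy after training on A:", model.score(X_a, y_a))

# Online updates on task B only -- no replay of task A data.
for _ in range(20):
    model.partial_fit(X_b, y_b)

print("task A accuracy after training on B:", model.score(X_a, y_a))
print("task B accuracy:", model.score(X_b, y_b))
```

This is a linear model, not an LLM, but the failure mode is the same one the comment describes: the parameters drift toward the new data and away from whatever fit the old data.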
IMHO, as long as no new random "neurons" form, it's not AI as in Artificial Intelligence, just "a lot of ifs".