LLMs do not "teach," and that is why learning from them is dangerous. They synthesize words and return other words, but they do not understand the content presented to them in any sense. Because of this, there is always the chance that they are simply spouting bullshit.
Learn from them if you like, but remember they are no substitute for a human, and essentially everything they tell you must be checked for correctness.
GPT-4 did teach me. I say this as the one who learned, whatever that's worth.