I think LLMs will be fundamentally unable to do certain things because of how they work. They're only as good as their training data, and given how fast and loose companies have been with that, they'll have learned patterns from incorrect information.
Fundamentally, an LLM doesn't know what it generates. If I ask it for a citation, it'll likely give me a result that looks legitimate but doesn't actually exist.