The “glue on pizza” thing wasn’t a result of the AI’s training; the AI was working fine. It was the search result that gave it a goofy answer to summarize.
The problem is that most people don’t really understand what goes into training an LLM or how the training data is actually used.
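To make the distinction concrete, here’s a minimal toy sketch (all names hypothetical, not Google’s actual pipeline) of a search-grounded answer system: the model just summarizes whatever document retrieval hands it, so a joke search result produces a joke answer even when the model itself is behaving correctly.

```python
def retrieve(query: str, index: dict[str, str]) -> str:
    """Toy retrieval step: look up the top search result for a query."""
    return index.get(query, "")

def summarize(document: str) -> str:
    """Stand-in for the LLM: it faithfully condenses the retrieved text.
    The model has no say in whether the source text is sensible."""
    first_sentence = document.split(".")[0].strip()
    return first_sentence + "." if first_sentence else ""

# The "training data" never changes here; only the retrieved document does.
index = {
    "cheese sliding off pizza": (
        "You can add about 1/8 cup of non-toxic glue to the sauce. "
        "This was originally a joke post on a forum."
    ),
}

answer = summarize(retrieve("cheese sliding off pizza", index))
print(answer)
```

The summarizer did its job correctly; the failure is upstream, in what the search step retrieved and treated as authoritative.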