“If it’s on the internet it must be true,” implemented in a billion-dollar project.
Not sure what would frighten me more: that this is in the training data, or that it was hallucinated.
Neither; in this case it’s an accurate summary of one of the results, which happens to be a shitpost on Quora. See, LLM search results can work as intended and authoritatively repeat search results with zero critical analysis!
Pretty sure AI will start telling us “You should not believe everything you see on the internet as told by Abraham Lincoln”