Repeat after me:
“Current AI is not a knowledge tool. It MUST NOT be used to get information about any topic!”
If your child is learning Scottish history from an AI, you have failed as a teacher/parent. This isn’t even about bias; it’s about what an AI model fundamentally is. It isn’t supposed to be correct, because that’s not what it’s for. It’s built to appear as correct as the material it was trained on. And as long as there are two opinions in the training data, the AI will gladly make up a third.
That doesn’t matter, though. People will definitely use it to acquire knowledge; they’re already doing it now. Which is why it’s so dangerous to let these “moderate” inaccuracies slide.
You even summed up perfectly why that is: LLMs are built to give a possibly correct answer in the most convincing way.