I’ve seen this with GPT-4. If I ask it to proofread text with errors, it consistently does a great job, but if I prompt it to proofread text without errors, it hallucinates them. It’s funny to see Microsoft having the same issue.
I’m pretty sure MS uses GPT-4 as the foundation of all their AI stuff, so it’s not surprising to see them have the same issues. Funny, as you said, but not surprising.