Thing is, there was never any real chance of a full Tay incident. Tay's problem was that it learned from user interactions, so people could deliberately teach it to be more messed up.
ChatGPT, on the other hand, doesn't learn from conversations; it was trained on a fixed dataset (which is why its knowledge only goes up to September 2021). So the heavy censorship is more likely there to prevent much more minor incidents, which imo is dumb.