GPT-2 was just a bullshit generator. It was like a politician trying to explain something they know nothing about.

GPT-3 was just a bigger version of GPT-2. As far as I followed the research, it was the same architecture, just with more parameters and more data. But that one could suddenly do a lot more than the previous version, almost by accident. And then the AI scene exploded.
To be fair, they're not accidentally good enough: they're intentionally good enough.

That's where all the salary money went: finding people who could make them good intentionally.
So the architecture just needed more data to generate useful answers. I don't think that was an accident.