Now THAT is the AI innovation I’m here for
LLMs are in a position to make boring NPCs much better.
Once they can be run locally at a good speed it’ll be a game changer.
I reckon we’ll start getting AI cards for computers soon.
We already do! And on the cheap! I have a Coral TPU running for presence detection on some security cameras. I'm pretty sure they can run LLMs, but I haven't looked around.
GPT4All runs rather well on a 2060, and I'd imagine it runs a lot better on newer hardware.