Is it as good as chatgpt?
Quick answer: none of them is currently that good, open or not.
Anyway, it seems this is just a manager. I see some competitors available that I've heard good things about, like Mistral.
Local LLMs can beat GPT 3.5 now.
I think a good 13B model running on 12GB of VRAM can do pretty well. But I’d be hard pressed to believe anything under 33B would beat 3.5.
Asking as someone who doesn’t know anything about any of this:
Does more B mean better?
B stands for Billion (Parameters) IIRC
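Roughly, yes: more parameters generally means a more capable model, but also more memory. A back-of-the-envelope sketch of the weight footprint (weights only; real usage adds KV cache and runtime overhead, and the bytes-per-parameter figures here are typical assumptions, not exact):

```python
def vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Rough VRAM needed for model weights alone, in GiB."""
    return params_billion * 1e9 * bytes_per_param / 2**30

# fp16 weights use 2 bytes per parameter; 4-bit quantization ~0.5 bytes.
fp16 = vram_gb(13, 2.0)    # a 13B model at full fp16
q4 = vram_gb(13, 0.5)      # the same model quantized to 4-bit
print(f"13B fp16: {fp16:.1f} GiB, 4-bit: {q4:.1f} GiB")
```

This is why a 13B model only fits on a 12GB card once it's quantized.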
3.5 fuckin sucks though. That’s a pretty low bar to set imo.
Many are close!
In terms of usability though, they are better.
For example, ask GPT-4 for an example of cross-site scripting in Flask and you'll get an ethics discussion. Grab an uncensored model off HuggingFace and you're off to the races.
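For anyone wondering what that example even looks like: a minimal stdlib-only sketch of the vulnerability class (no Flask needed, so it runs anywhere; the function name is made up for illustration). The bug is interpolating user input into HTML without escaping:

```python
import html

def render_comment(comment: str, escape: bool = True) -> str:
    """Build an HTML fragment from user input.

    With escape=False this mirrors the classic reflected-XSS bug:
    a browser would execute any <script> tag the user submitted.
    """
    body = html.escape(comment) if escape else comment
    return f"<p>{body}</p>"

payload = "<script>alert('xss')</script>"

# Unescaped: the script tag survives and would run in the victim's browser.
print(render_comment(payload, escape=False))
# Escaped: angle brackets become entities, so it renders as harmless text.
print(render_comment(payload))
```

(Flask's Jinja templates escape by default; the bug usually comes from bypassing that, e.g. building HTML strings by hand.)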
Seems interesting! Do I need high end hardware or can I run them on my old laptop that I use as home server?
Oh no you need a 3060 at least :(
Requires CUDA. They're essentially large mathematical equations that estimate the probability of the next word.
The equations are derived by trying different combinations of values until one works well (this is the "learning" in machine learning). The trick is changing the numbers in a way that gets better each time (see e.g. gradient descent).
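The "gets better each time" part can be shown in a toy form: a one-parameter model `w * x` fitted to data where the true answer is `w = 2`, nudging `w` against the gradient of the squared error. (A deliberately tiny illustration of the idea, not how LLMs are actually trained at scale.)

```python
# Fit w so that w * x approximates y, using plain gradient descent.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # y = 2x

w = 0.0    # initial guess
lr = 0.05  # learning rate: how big a step to take each update

for _ in range(200):
    # gradient of mean squared error (w*x - y)^2 with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # step "downhill", so the error shrinks

print(round(w, 3))  # converges toward 2.0
```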
How’s the guy who said he’s running off a 1060 doing it?
Slowly
Then you don’t need a 3060 at least
Oh this is unfortunate ahahahaha
Thanks for the info!