I can run a pretty alright text generation model and the Stable Diffusion models on my 2016 laptop with two GTX 1080M cards. You can try with these tools:
Oobabooga text-generation-webui
AUTOMATIC1111 Stable Diffusion web UI (image generation)
They might not be the most performant applications, but they are very easy to use.
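For anyone who wants to try them, here's a minimal sketch of getting both up from source, assuming git and Python are already installed (both projects also ship one-click installer scripts that handle dependencies for you):

```shell
# Oobabooga text-generation-webui (text generation)
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui
pip install -r requirements.txt
python server.py          # serves the web UI locally
cd ..

# AUTOMATIC1111 Stable Diffusion web UI (image generation)
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui
cd stable-diffusion-webui
./webui.sh                # Linux/macOS launcher; webui-user.bat on Windows
```

You'll still need to download model weights separately (GGUF/safetensors files for text generation, Stable Diffusion checkpoints for images) and drop them into each project's models folder.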
You seem to have missed the point a bit
Just read it again and you’re right. But maybe someone else finds it useful.
I do, so thank you :)
deleted by creator
Funny how these comments only appeared on my instance today; I guess there are still some federation issues.
new phone who dis?
“I wish I had X”
“Here’s X”
What point was missed here?
The post “I wish X instead of Y”
The comment: “And run it [X] locally”
The next comment: “You can run Y locally”
Also, the person I replied to literally admitted I was right, and you're still arguing.