You can run smaller models locally, and they can get the job done, but they are not as good as the huge models that would not fit on your graphics card.
If you are technically adept and can run python, you can try using this:
https://gpt4all.io/index.html
It has a front end, and I can run queries against it using the same API format I would use to send them to OpenAI.
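For example, if the GPT4All app's local API server option is turned on (I believe it listens on port 4891 by default, but check your settings), you can point the standard OpenAI Python client at it and make the same chat-completions calls you would make against OpenAI. This is only a sketch: the port, the model name, and the prompt below are assumptions, so substitute whatever model you actually downloaded in GPT4All.

```python
# Minimal sketch (assumptions: GPT4All's local API server is enabled on its
# default port 4891, and a model such as "mistral-7b-instruct" is downloaded).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4891/v1",  # point the client at the local server instead of api.openai.com
    api_key="not-needed",                 # the local server ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="mistral-7b-instruct",  # placeholder; use the model you loaded in GPT4All
    messages=[{"role": "user", "content": "Summarize what GPT4All does in one sentence."}],
    max_tokens=200,
)
print(response.choices[0].message.content)
```

Because the request and response shapes match OpenAI's, you can swap the `base_url` back and forth between the local server and the hosted API without changing the rest of your code.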