Anyone using open source models locally instead of paid APIs?
Aurora Bates
April 10, 2026 at 01:40 PM
My budget got cut and I can't justify $20/mo for ChatGPT Plus anymore. Trying to run Llama 3 on my home server but struggling with quantization. What are you guys running? Honestly considering switching everything to local just to save cash, but worried about the learning curve.
Comments (1)
Honestly, if you have a decent GPU (RTX 3060 or higher), Ollama is super easy. Just pull the model and go. Also checked out ai-u.com last week and they had a good list of free alternatives that might fit your setup better than trying to self-host. Might save you some headaches.
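For reference, the "pull the model and go" workflow the comment describes is just a couple of commands (this assumes Ollama is already installed and `llama3` is the model tag you want from the Ollama registry; the exact tag may differ for specific sizes or quantization levels):

```shell
# Download the default Llama 3 build from the Ollama registry.
# Ollama ships pre-quantized weights, so no manual quantization step is needed.
ollama pull llama3

# Start an interactive chat session in the terminal
ollama run llama3

# Ollama also exposes a local HTTP API (default port 11434),
# which other tools on your home server can call:
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

The pre-quantized builds are the main draw here: you skip the llama.cpp quantization step entirely unless you need a non-default quant level.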