- cross-posted to:
- technews@radiation.party
- hackernews@derp.foo
There’s also an unofficial web frontend https://github.com/ollama-webui/ollama-webui
Though I can’t get Docker Compose to use my GPU.
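For NVIDIA cards, Compose won’t pass the GPU through unless you add an explicit device reservation (and have the NVIDIA Container Toolkit installed on the host). A sketch, with the service name and volume path assumed:

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ./ollama-data:/root/.ollama   # hypothetical path for model storage
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all              # or an integer to limit GPUs
              capabilities: [gpu]
```

Then `docker compose up -d` and check the container logs to confirm the GPU was detected.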
Thanks for sharing this. Not an expert, so here’s my dumb question: can we potentially train these models with local data, kind of like Stable Diffusion checkpoints?
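As far as I know, Ollama doesn’t do training itself, but it can load fine-tuned weights produced elsewhere: a Modelfile’s `ADAPTER` directive layers a LoRA adapter onto a base model. A sketch, with hypothetical file names:

```
# Modelfile — layer a locally fine-tuned LoRA adapter onto a base model
FROM llama2
ADAPTER ./my-local-adapter.gguf
SYSTEM You answer questions using my local data.
```

Then something like `ollama create my-model -f Modelfile` registers it for use.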
Ollama works great with Big-AGI too; look it up on GitHub.
Thanks Ollama
Ollama is pretty sweet. I’m self-hosting it with 3B models on an old X79 server, and I created a neat terminal AI client, “Jeeves Assistant”, that makes requests to it over the local network.
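For anyone wanting to build something similar: a client like that boils down to POSTing to Ollama’s `/api/generate` endpoint. A minimal stdlib-only Python sketch — the host and model name are placeholders for whatever your server runs:

```python
import json
import urllib.request

def build_request(prompt, host="http://localhost:11434", model="llama3"):
    # Assemble the JSON request for Ollama's /api/generate endpoint.
    # stream=False asks for a single JSON response instead of chunks.
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{host}/api/generate",
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
    )

def ask(prompt, **kwargs):
    # Send the request and pull the generated text out of the reply.
    with urllib.request.urlopen(build_request(prompt, **kwargs)) as resp:
        return json.loads(resp.read())["response"]
```

Point `host` at the machine running Ollama on your LAN and `ask("hello")` returns the model’s reply as a string.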
I’m thoroughly disappointed by the lack of Obama memes in this repo.