There’s also an unofficial web frontend: https://github.com/ollama-webui/ollama-webui
Though I can’t get Compose to use my GPU.
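Not sure about your exact setup, but on NVIDIA cards Docker Compose usually needs an explicit device reservation before containers can see the GPU (plus the NVIDIA Container Toolkit installed on the host). A minimal sketch, assuming the service is named `ollama`:

```yaml
services:
  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            # Requires the NVIDIA Container Toolkit on the host
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

Without this block, `docker compose up` will happily run the container on CPU only.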
Ollama is pretty sweet. I’m self-hosting it with 3B models on an old X79 server, and I wrote a neat terminal AI client called “Jeeves Assistant” that makes requests to it over the local network.
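I haven’t seen the Jeeves Assistant code, but a client like that boils down to POSTing JSON to Ollama’s `/api/generate` endpoint on the server’s address. A minimal sketch, assuming a non-streaming request and a 3B model such as `orca-mini` already pulled on the server:

```python
import json
import urllib.request

# Ollama's default port; swap localhost for the server's LAN address
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for a non-streaming /api/generate request."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send the prompt to a running Ollama server and return its reply text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (needs a live server):
#   print(ask("orca-mini", "Good evening, Jeeves."))
```

With `"stream": False` the server returns one JSON object; leave streaming on and you get newline-delimited chunks to read token by token instead.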
Thanks for sharing this. Not an expert, so here goes my dumb question: can we potentially train these models on local data, kind of like Stable Diffusion checkpoints?
Ollama works great with Big-AGI too; look it up on GitHub.
I’m thoroughly disappointed by the lack of Obama memes in this repo.
Thanks Ollama