I'm using Ollama on my server with the WebUI. It has no GPU, so it's not quick to reply, but not too slow either.
I'm thinking about removing the VM as I just don't use it. Are there any good uses or integrations into other apps that might convince me to keep it?
Ollama without a GPU is pretty useless unless you're running it on Apple silicon. I'd just get rid of it until you get a GPU.
Works fine on an 11th Gen i5. Not fast, but not slow.
I have never tested it on Apple silicon, but it works fine on my laptop.
What are your laptop specs?
Intel 12th gen i5
The CPU is only one factor regarding specs, and a small one at that. What kind of t/s performance are you getting with a standard 13B model?
I don't have enough RAM to run a 13B. I just stick to Mistral 7B and it works fine.
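For anyone wanting to put a number on t/s: Ollama's `/api/generate` response includes `eval_count` (tokens generated) and `eval_duration` (in nanoseconds), so you can compute tokens per second from any non-streamed response. A minimal sketch, with an illustrative response fragment rather than a real benchmark:

```python
def tokens_per_second(resp: dict) -> float:
    """Compute generation speed from Ollama's /api/generate response fields.

    eval_count is the number of tokens generated; eval_duration is in
    nanoseconds, so scale by 1e9 to get tokens per second.
    """
    return resp["eval_count"] / resp["eval_duration"] * 1e9

# Illustrative numbers only (not a real benchmark):
# 120 tokens generated over 24 seconds -> 5.0 t/s
sample = {"eval_count": 120, "eval_duration": 24_000_000_000}
print(round(tokens_per_second(sample), 1))  # 5.0
```

You can get the real fields by POSTing to `http://localhost:11434/api/generate` with `"stream": false`, or just run `ollama run <model> --verbose`, which prints the eval rate directly.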