

Ollama is a big thing. Do you want it to be fast? Then you will need a GPU. How large is the model you will be running? A 7/8B model on CPU is not as fast, but no problem; 13B is slow on CPU but possible.
I would take that any day!
Now I get why it does what it does and how it works. I never thought that the colon was the variable name, but it makes so much sense!
Probably because of accessibility, I would say. Not good design, but it is what it is.
Meh, that sucks. I even have a perfectly working DDNS. I mean, I know I don't get something like a PTR record, but I wish that mail hosters would allow for more self-hosting options.
Oh yeah, I heard about this and saw that Mutahar (SomeOrdinaryGamers) was doing it once on Windows with a 4090. I would love to do that on my GPU and then split it between my host and my VM.
Wonderful thank you so much!
I need that wallpaper! Is there a way you could provide me that?
Just want to piggyback on this. You will probably need more than 6 GB of VRAM to run good enough models with acceptable speed and coherent output, but the more the better.
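For a rough sanity check on whether a model fits in your VRAM, here's a minimal back-of-the-envelope sketch (not an official formula, just the common rule of thumb: weight count times quantization width, plus some headroom for KV cache and activations; the 20% overhead factor is my assumption):

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: float = 4.0,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate for a local LLM.

    params_billion: model size, e.g. 8 for an 8B model
    bits_per_weight: quantization width (4-bit is a common default)
    overhead: fudge factor (~20%) for KV cache and activations -- an assumption
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return round(weight_bytes * overhead / 1e9, 1)

print(estimate_vram_gb(8))   # 8B at 4-bit: ~4.8 GB
print(estimate_vram_gb(13))  # 13B at 4-bit: ~7.8 GB
```

By that estimate an 8B model at 4-bit just about squeezes into 6 GB, while a 13B model won't, which matches the "more than 6 GB" advice above.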
They also created Ghidra! Probably the second best.