Appleseed_ss@alien.top
joined 11 months ago
I’m extremely confused about system requirements. Some people are worried about RAM and others about VRAM. I have 64 GB of RAM and 12 GB of VRAM. What size of model can I run?
in c/localllama@poweruser.forum
Appleseed_ss@alien.top · 1 point · 11 months ago
I have the same specs, and basically any of the 7B and 13B models run great using Ollama.
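To see why 7B and 13B models fit comfortably in 12 GB of VRAM, a rough rule of thumb (an assumption, not an Ollama-specific formula) is: a 4-bit quantized model needs roughly params × ~4.5 bits per weight, plus a GB or two of runtime and context overhead. A minimal sketch, with the bits-per-weight and overhead figures as ballpark assumptions:

```python
def approx_model_gb(params_b: float, bits_per_weight: float = 4.5,
                    overhead_gb: float = 1.5) -> float:
    """Rough memory estimate for a quantized LLM.

    params_b: model size in billions of parameters (e.g. 7, 13).
    bits_per_weight: ~4.5 is a ballpark for common 4-bit quantizations (assumption).
    overhead_gb: KV cache + runtime overhead (rough guess).
    """
    weights_gb = params_b * bits_per_weight / 8  # billions of params -> GB
    return weights_gb + overhead_gb

# Check against the specs in the question: 12 GB VRAM, 64 GB system RAM.
for size in (7, 13, 34, 70):
    need = approx_model_gb(size)
    print(f"{size:>2}B ~ {need:4.1f} GB  "
          f"fits 12 GB VRAM: {need <= 12}  fits 64 GB RAM: {need <= 64}")
```

By this estimate, 7B (~5 GB) and 13B (~9 GB) fit entirely on a 12 GB GPU, which matches the answer above; larger models like 34B or 70B would spill into system RAM and run much more slowly on CPU.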