dazld@alien.top
joined 1 year ago
What’s recommended hosting for open source LLMs?
in
c/localllama@poweruser.forum
dazld@alien.top
1 point
1 year ago
Did you think about running it on a local M1 Mac mini? Ollama uses the Mac GPU out of the box.
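To expand on that: once Ollama is running locally, it exposes an HTTP API on port 11434 by default, so you can query a model from any script. Here's a minimal stdlib-only sketch against Ollama's `/api/generate` endpoint; the model name `llama2` and the prompt are just examples, substitute whatever model you've pulled.

```python
# Minimal sketch of querying a local Ollama server via its REST API.
# Assumes Ollama is running on the default port 11434 and the model
# has already been pulled (e.g. `ollama pull llama2`).
import json
import urllib.request

def build_generate_request(prompt, model="llama2"):
    # Request body for Ollama's /api/generate endpoint.
    # stream=False asks for a single JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt, model="llama2", host="http://localhost:11434"):
    body = json.dumps(build_generate_request(prompt, model)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the full answer in "response".
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_ollama("Why is the sky blue?"))
```

On Apple Silicon the GPU acceleration happens transparently inside the Ollama server process, so nothing GPU-specific is needed on the client side.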