this post was submitted on 25 Nov 2023
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
I'm currently running Windows with my AMD card, but only because I prefer Windows. Pretty much nothing works, except Stable Diffusion at very slow speeds via DirectML, and inference through koboldcpp-rocm. I was able to get normal Stable Diffusion running on Ubuntu after about two hours of trying; sadly, it randomly stopped working the next week. I never managed to get Ooba working, but I gave up rather quickly after I found koboldcpp-rocm.