this post was submitted on 12 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

 

Curious if anyone got the whole rig and then realized they didn't really need it, etc.

Relief-Impossible@alien.top 1 points 2 years ago

I'm waiting for the RTX 5090 to release. I heard it's going to have 32 GB of VRAM. Right now I only have a 2060 with 6 GB of VRAM, which is barely enough (or not enough) for a lot of AI things, and it's slower at AI tasks.
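For anyone weighing a similar upgrade, here's a rough back-of-the-envelope for whether a model's weights fit in a given amount of VRAM. The function name and the 1.2x overhead factor are illustrative assumptions on my part (overhead for KV cache and activations varies by context length and runtime), not a precise formula:

```python
def estimate_vram_gb(n_params_billion, bits_per_weight, overhead=1.2):
    """Rough VRAM estimate in GB: weight storage times an overhead
    factor for KV cache and activations. A rule of thumb, not exact."""
    return n_params_billion * bits_per_weight / 8 * overhead

# A 7B model at 4-bit quantization: ~4.2 GB, tight but plausible on a 6 GB 2060.
print(round(estimate_vram_gb(7, 4), 1))

# A 13B model at 4-bit: ~7.8 GB, already past 6 GB, but fine in 32 GB.
print(round(estimate_vram_gb(13, 4), 1))
```

By this kind of estimate, 6 GB caps you at small quantized models, which matches the "barely or not enough" experience above, while 32 GB would cover much larger quantized models.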