this post was submitted on 17 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


I upgraded my system a year ago. Among the changes was a GPU swap from a GTX 1070 to an RTX 3090; the old card is now gathering dust in the cellar. I've seen people here mention that they run their desktop on their old card to completely free up the VRAM on their workhorse card.

Is this worth doing just for loading LLMs and playing around with them (no training yet)? The downside is a worse thermal situation if I cram both cards into the case.
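
For reference, a minimal sketch of how to verify that the 3090 really is free once the desktop is driven by the 1070. It assumes the 3090 enumerates as CUDA device 0 (check with nvidia-smi first); hiding the 1070 via CUDA_VISIBLE_DEVICES keeps the LLM process from touching it:

```python
# Minimal sketch, assuming the 3090 is CUDA device 0 (verify with `nvidia-smi`).
import os

# Hide the 1070 from this process so allocations can only land on the 3090.
# Must be set before torch initializes CUDA.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import torch

device = torch.device("cuda:0")  # the only visible GPU: the 3090
free_bytes, total_bytes = torch.cuda.mem_get_info(device)
print(f"free: {free_bytes / 2**30:.1f} GiB of {total_bytes / 2**30:.1f} GiB")
```

With the monitor plugged into the 1070, the desktop compositor's few hundred MB of VRAM land on the old card, so close to the full 24 GB of the 3090 should report as free here (minus a small CUDA context overhead).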

panchovix@alien.top 1 point 11 months ago

It will work, but you will be limited to 1070 speeds if you split a model across both GPUs; the slower card gates each generated token.
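
A minimal sketch of the placement trade-off being described, using Hugging Face transformers (assumed tooling, not specified in the thread): keeping every layer on one card avoids being gated by the slower one, whereas device_map="auto" would shard across both.

```python
# Minimal sketch, assuming transformers + accelerate are installed and the
# 3090 is CUDA device 0. The model name is just an example.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",   # example model; any causal LM works
    device_map={"": 0},           # put the whole model on GPU 0 (the 3090)
)

# device_map="auto" would instead shard layers across both cards; each
# generated token then flows through the 1070's layers too, so the
# slower card sets the per-token pace.
```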