this post was submitted on 21 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

founded 10 months ago

I have an old PC.

It currently has 512MB of RAM, a Xeon E3-1240 CPU, and a 500W PSU.

Is it worth keeping this to throw a GPU in, like a 3090, or is it too old to be useful? It has PCIe 2.0, but my friend says throughput is less important for LLM use.

Just curious after a discussion with my friend.

Thanks. 😎👍

[–] MeMyself_And_Whateva@alien.top 1 points 10 months ago

You need a completely new PC.

[–] ThisGonBHard@alien.top 1 points 10 months ago

512MB RAM

When did they thaw you out of the ice?!

Jokes aside, you probably mean 512 GB of RAM. That platform is slow and old: at best DDR3-1333 in dual channel, much worse than even bottom-of-the-barrel dual-channel DDR4.

A 3090 will not care as long as you are doing pure GPU inference and never touching system RAM; if you offload to the CPU, the DDR3 and PCIe 2.0 will kill performance.
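To put the PCIe 2.0 concern in rough perspective, a back-of-the-envelope sketch (my numbers, not from the thread; theoretical per-lane maxima after encoding overhead, real-world throughput is lower): once the weights are resident in VRAM, the bus mostly matters at model load time.

```python
# Rough estimate of how long it takes to push a VRAM-sized model
# over a x16 link on different PCIe generations.
# Per-lane figures are theoretical ceilings, not measured speeds.

PER_LANE_GBPS = {      # GB/s per lane, approximate
    "PCIe 2.0": 0.5,   # 5 GT/s with 8b/10b encoding
    "PCIe 3.0": 0.985, # 8 GT/s with 128b/130b encoding
    "PCIe 4.0": 1.969, # 16 GT/s with 128b/130b encoding
}

def load_time_seconds(model_gb: float, gen: str, lanes: int = 16) -> float:
    """Seconds to transfer model_gb gigabytes at the theoretical link rate."""
    return model_gb / (PER_LANE_GBPS[gen] * lanes)

model_gb = 24.0  # a 3090's full 24 GB of VRAM worth of weights
for gen in PER_LANE_GBPS:
    print(f"{gen}: {load_time_seconds(model_gb, gen):.1f} s")
```

Even at PCIe 2.0 speeds this is a one-time cost of a few seconds per model load, which is why the older bus matters far less than RAM speed does for CPU offloading.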
