You need a completely new PC.
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
512MB RAM
When did they thaw you out of ice?!
Jokes aside, you probably mean 512 GB of RAM. That platform is old and slow: at best it has DDR3-1333 in dual channel, which is far slower than even bottom-of-the-barrel dual-channel DDR4.
A 3090 won't care as long as you are doing pure GPU inference and never touching system RAM; otherwise the DDR3 and PCIe 2.0 will kill performance.
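To put numbers on the DDR3 vs DDR4 gap, here is a back-of-envelope sketch. It uses nominal peak JEDEC transfer rates (not measured bandwidth), and the DDR4-3200 figure is just a common example speed for comparison:

```python
def dual_channel_bandwidth_gbs(mt_per_s, bus_bytes=8, channels=2):
    """Peak theoretical bandwidth in GB/s: transfers/s x bus width x channel count."""
    return mt_per_s * bus_bytes * channels / 1000

ddr3 = dual_channel_bandwidth_gbs(1333)   # DDR3-1333, dual channel
ddr4 = dual_channel_bandwidth_gbs(3200)   # DDR4-3200, dual channel

print(f"DDR3-1333 dual channel: {ddr3:.1f} GB/s")  # ~21.3 GB/s
print(f"DDR4-3200 dual channel: {ddr4:.1f} GB/s")  # 51.2 GB/s
```

Since CPU inference speed is roughly memory bandwidth divided by the bytes read per token, that ~2.4x gap translates almost directly into tokens per second whenever any layers spill out of VRAM.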