LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

This is the reason why you can't find them at your local Best Buy. They are paying a premium for them. But they would indeed be very helpful if I could get my hands on a few for my build.

[–] mcmoose1900@alien.top 1 points 10 months ago

I'm sick of Nvidia's VRAM business model

At the top end, they are actually limited by how much they can physically hang off the die (48GB for current silicon, or 196GB(?) for the interposer silicon).

But yeah, below that it's price gouging. What are ya gonna do, buy an Arc?

AMD is going along with this game too. You'd see a lot more 7900s on this sub, and on GitHub, if AMD let their board partners double up the VRAM to 48GB.
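
For context on those capacity figures, here's a minimal sketch of checking how much memory a card actually exposes. It assumes a PyTorch install built with CUDA (NVIDIA) or ROCm (AMD) support; it's an illustrative example, not anything from the thread:

```python
# Illustrative sketch: report per-GPU VRAM as seen by PyTorch.
# Works with CUDA builds (NVIDIA) and ROCm builds (AMD), which reuse the torch.cuda API.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)          # name, total_memory, etc.
        free_bytes, total_bytes = torch.cuda.mem_get_info(i)  # (free, total) in bytes
        print(f"GPU {i}: {props.name} - "
              f"{total_bytes / 1024**3:.1f} GiB total, "
              f"{free_bytes / 1024**3:.1f} GiB currently free")
else:
    print("No CUDA/ROCm device visible to PyTorch")
```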

[–] bassoway@alien.top 1 points 10 months ago

VRAM is not located on the GPU die