this post was submitted on 21 Jan 2024
324 points (97.1% liked)

Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

[–] Dra@lemmy.zip 21 points 10 months ago* (last edited 10 months ago) (8 children)

I haven't paid attention to GPUs since I got my 3080 on release day back in Covid.

Why has the acceptable level of VRAM suddenly doubled vs 4 years ago? I don't struggle to run a single game on max settings at high framerates @ 1440p, so what's the benefit that justifies the cost of 20GB of VRAM outside of AI workloads?

[–] Eccitaze@yiffit.net 28 points 10 months ago (1 children)

An actual technical answer: apparently it's because the PS5 and Xbox Series X, while technically regular x86-64 machines, use a unified memory design that lets the CPU and GPU share a single 16GB pool with no loss in performance. That makes it easy to allocate a shitload of that memory to textures very quickly. The last generation had far less to work with (the PS4 and Xbox One shipped with 8GB total), so as the games industry shifts from developing for those consoles first to developing for the PS5/XSX first, VRAM requirements are spiking: it's a lot easier to port to PC if you keep the assumption that the GPU can hold 10-15GB of texture data at once, instead of refactoring your code to reduce VRAM usage.
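To put rough numbers on that 10-15GB figure, here's a back-of-the-envelope sketch (the compression format, mip overhead, and texture count are illustrative assumptions, not from any actual game):

```python
# Back-of-the-envelope texture memory estimate. BC7-compressed textures
# store 1 byte per texel, and a full mip chain adds roughly 33%.

def texture_mb(width, height, bytes_per_texel=1.0, mip_overhead=1.33):
    return width * height * bytes_per_texel * mip_overhead / 1024**2

per_texture = texture_mb(4096, 4096)                # one 4K texture: ~21 MB
print(f"One 4K BC7 texture: ~{per_texture:.0f} MB")

# A scene keeping ~500 unique 4K material textures resident:
print(f"500 resident textures: ~{500 * per_texture / 1024:.1f} GB")
```

Five hundred resident 4K textures is already ~10GB before you count render targets, meshes, and shadow maps, which is exactly the kind of budget a unified 16GB console makes comfortable.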

[–] Dra@lemmy.zip 3 points 10 months ago

Perfect answer, thank you!

[–] Asafum@feddit.nl 24 points 10 months ago (1 children)

Lmao

We have your comment: "what am I doing with 20GB of VRAM?"

And one comment down: "it's actually criminal there's only 20GB of VRAM"

[–] Dra@lemmy.zip 4 points 10 months ago
[–] Blackmist@feddit.uk 10 points 10 months ago

Current gen consoles becoming the baseline is probably it.

As games running on last-gen hardware drop away and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, which makes it almost Scrooge-like not to offer 16GB on a £579 GPU.

That said, I think the pricing is still much more of an issue than the RAM. People just don't want to pay these ludicrous prices for a GPU.

[–] Space_Racer@lemm.ee 9 points 10 months ago

I'm maxed on VRAM in VR for the most part with a 3080. It's my main bottleneck.
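
For context on why VR in particular chews through VRAM, a minimal sketch (the panel resolution, supersampling factor, and buffer formats are assumptions based on typical headsets, not any specific one):

```python
# Rough VR render-target memory estimate. Headsets render two eyes, and
# runtimes typically request ~1.4x the panel resolution per axis to
# compensate for lens distortion.

w, h = int(1440 * 1.4), int(1600 * 1.4)   # ~2016 x 2240 per eye
eyes = 2
color_bytes = 8    # RGBA16F HDR colour target
depth_bytes = 4    # 32-bit depth
msaa = 4           # 4x MSAA, common in VR forward renderers

mb = w * h * eyes * (color_bytes + depth_bytes) * msaa / 1024**2
print(f"Stereo render targets alone: ~{mb:.0f} MB")   # ~413 MB
```

And that's just the render targets; the textures still have to be resident at full detail, because in VR you're looking at them from arm's length.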

[–] AlijahTheMediocre@lemmy.world 5 points 10 months ago

If only game developers optimized their games...

The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.

[–] rndll@lemm.ee 4 points 10 months ago

GPU rendering and AI.

[–] Hadriscus@lemm.ee 3 points 10 months ago* (last edited 10 months ago)

Perhaps not the biggest market, but consumer cards (especially Nvidia's) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They're the most logical investment for freelancers and small-to-mid studios thanks to hardware raytracing. CUDA and later OptiX may be a footnote on the gaming front, but they completely changed the game over here.
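
If anyone wants to see how much headroom their card actually has before kicking off a render, something like this works (a minimal sketch using the nvidia-ml-py bindings, `pip install nvidia-ml-py`; NVIDIA cards only):

```python
# Query total/free VRAM on the first GPU via NVML.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)    # GPU 0
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)     # values in bytes
print(f"total: {mem.total / 1024**3:.1f} GiB, free: {mem.free / 1024**3:.1f} GiB")
pynvml.nvmlShutdown()
```

Once a scene spills out of VRAM, most GPU renderers either drop to much slower out-of-core paths or refuse to render at all, which is why the 20GB+ cards sell to this crowd.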

[–] Obi@sopuli.xyz 2 points 10 months ago

Personally I need it for video editing & 3D work, but I get that that's a niche case compared to the gaming market.