this post was submitted on 21 Jan 2024
submitted 9 months ago* (last edited 9 months ago) by RedNight@lemmy.ml to c/buildapc@lemmy.world
 

UPDATE: They responded to me. It is indeed the RTX 4070 SUPER 12GB (Not 16GB). The listing will appear correctly again on 1/24/2024

I know this community is mainly about building a PC, but wanted to get some opinions from the experts.

So I know very little about hardware, but wanted to move away from my old standard laptop to a desktop for gaming and local LLM inference. I decided a current-series Nvidia GPU with 16GB of VRAM would be good.

Saw this pre-order product on newegg and jumped on it: "iBUYPOWER Gaming Desktop RTX 4070 Super 16GB, i7-47000F, 32GB DDR5, 2TB SSD, Windows 11 Home" $1,799.99

Only afterward did I realize that the CPU model number doesn't exist, and the RTX 4070 Super maxes out at 12GB VRAM, right? Do you think they meant the RTX 4070 Ti Super?

What do you guys think? Thanks

[–] TropicalDingdong@lemmy.world 9 points 9 months ago (1 children)

I mean, maybe ask first, buy later next time?

If you really want to self-host LLMs, you'll need far more GPU memory than that.

[–] RedNight@lemmy.ml 1 points 9 months ago (1 children)

Should have spent more time researching...you're right.

According to some articles, you can self-host smaller-parameter LLMs and/or quantized versions of them, for example the 7B models. The recommendation was 16GB+ of VRAM; some people even pulled it off with less.
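As a back-of-envelope check on those recommendations: weight memory scales with parameter count times bits per weight, plus some overhead for the KV cache and activations. This is a rough sketch (the helper name and flat ~20% overhead factor are illustrative assumptions, not from any specific article):

```python
# Rough VRAM estimate for holding an LLM's weights at a given quantization.
# Real usage also depends on context length, KV cache, and runtime overhead.

def model_vram_gb(params_billion: float, bits_per_weight: int,
                  overhead: float = 1.2) -> float:
    """Approximate GiB of VRAM for the weights, with ~20% headroom."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1024**3

# A 7B model at common precisions:
for bits in (16, 8, 4):
    print(f"7B @ {bits}-bit: ~{model_vram_gb(7, bits):.1f} GiB")
```

By this estimate, a 7B model at 16-bit precision lands around 15-16 GiB (hence the 16GB+ advice), while a 4-bit quantized version needs only about 4 GiB for weights, which is how people get away with less.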

[–] TropicalDingdong@lemmy.world 3 points 9 months ago

There are used setups that can take you into the range of 40 GB.