this post was submitted on 26 Nov 2025
414 points (96.8% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

[–] bold_atlas@lemmy.world 3 points 4 days ago* (last edited 4 days ago) (2 children)

And who's going to be powering that NPC's LLM? Unless all you want is a free hotlinked chatbot window disguised as a character? Because the publishers and developers sure as hell won't power it on their end, and if they do you'll be paying out the ass for it. Otherwise that NPC's LLM will have to run locally on your own hardware... in addition to the game itself.

So yeah, have fun with that.

And dialogue generation is ALL they can do, btw. They can't navigate a character around a 3D environment or even play against you in a grand strategy game. So, looking at RAM and GPU prices... yeah, the novelty of LLMs in games will run its course pretty quick.
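For context, "run locally" here would look roughly like the sketch below: a small quantized model loaded next to the game through llama-cpp-python, competing with the game itself for VRAM. The model file, persona prompt, and sampling settings are placeholder assumptions, not anything a shipping game actually uses.

```python
# Minimal sketch of local NPC dialogue generation with llama-cpp-python.
# Model path, persona prompt, and sampling settings are illustrative placeholders.
from llama_cpp import Llama

# Load a small quantized model; n_gpu_layers=-1 offloads all layers to the GPU,
# which is exactly the VRAM contention the comment above is pointing at.
npc_llm = Llama(
    model_path="models/tiny-npc-q4.gguf",  # hypothetical quantized model file
    n_ctx=1024,        # short context: just the NPC persona plus recent dialogue
    n_gpu_layers=-1,
)

def npc_reply(persona: str, player_line: str) -> str:
    prompt = f"{persona}\nPlayer: {player_line}\nNPC:"
    out = npc_llm(prompt, max_tokens=48, stop=["Player:", "\n\n"], temperature=0.7)
    return out["choices"][0]["text"].strip()

print(npc_reply("You are a grumpy blacksmith in a fantasy town.", "Got any work for me?"))
```

The call blocks until the reply is finished, so a real game would have to run it on a background thread or stream tokens, and it only produces text: pathfinding, combat, and strategy would still be handled by the game's own systems.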

[–] 1rre@discuss.tchncs.de 5 points 4 days ago (1 children)

It'd be a small model run locally, taking up maybe half a GB of VRAM

Bruh, that's like 25-50% on an Nvidia card. Too much overhead! /s
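The half-a-GB figure holds up as back-of-the-envelope math if you assume something like a ~0.5B-parameter model quantized to 4 bits plus a modest KV cache. The parameter count, layer count, and context length below are assumptions, not a specific model:

```python
# Rough VRAM estimate for a small quantized NPC model (all figures are assumptions).
params = 0.5e9          # ~0.5 billion parameters
bits_per_weight = 4     # 4-bit quantization
weight_bytes = params * bits_per_weight / 8           # ~0.25 GB of weights

# KV cache: 2 (K and V) * layers * context length * hidden dim * 2 bytes (fp16)
layers, ctx_len, hidden = 24, 1024, 1024
kv_bytes = 2 * layers * ctx_len * hidden * 2          # ~0.10 GB

total_gb = (weight_bytes + kv_bytes) / 1e9
print(f"~{total_gb:.2f} GB VRAM")                     # ~0.35 GB, in the "half a GB" ballpark
```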

[–] Appoxo@lemmy.dbzer0.com 2 points 4 days ago

In theory they could offer a setting to use the NPU if one is available.
Basically the same situation as it was with ray tracing.
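A settings toggle like that could boil down to picking an inference backend at load time. Here is a minimal sketch using ONNX Runtime execution providers; which providers actually show up depends on the installed ONNX Runtime build and NPU drivers, and the model file is a placeholder:

```python
# Minimal sketch: prefer an NPU execution provider if present, else GPU, else CPU.
# Provider availability depends on the ONNX Runtime build and vendor drivers;
# "npc_dialogue.onnx" is a placeholder model file.
import onnxruntime as ort

PREFERRED = [
    "QNNExecutionProvider",       # Qualcomm NPUs
    "OpenVINOExecutionProvider",  # Intel NPUs/iGPUs
    "DmlExecutionProvider",       # DirectML GPU fallback on Windows
    "CPUExecutionProvider",
]

available = ort.get_available_providers()
providers = [p for p in PREFERRED if p in available] or ["CPUExecutionProvider"]

session = ort.InferenceSession("npc_dialogue.onnx", providers=providers)
print("Running NPC model on:", session.get_providers()[0])
```

Same idea as a ray tracing toggle: the option only does anything when the hardware and drivers actually support it.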