this post was submitted on 09 Mar 2024
101 points (94.7% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments.
  10. Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-english sources. If the title is clickbait or lacks context you may lightly edit the title.)

founded 1 year ago
[–] umbrella@lemmy.ml 3 points 6 months ago (1 children)

tbf you would need a pretty beefy gpu to do both rendering and ai locally.

as much as i hate to say it (because this idea sounds awesome) the tech is not there yet, and depending on the cloud for this always goes wrong.

[–] cynar@lemmy.world 2 points 6 months ago

A limited LLM would run on a lot of newer gfx cards. It could also be done as a semi-online thing: if you have the grunt, you run it locally; otherwise, you farm it out to the online server.
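The semi-online idea above could be sketched as a simple dispatcher: check whether the local GPU has enough memory, run the model locally if so, and otherwise send the prompt to a remote server. Everything here is a hypothetical illustration — the function names, the 8 GB threshold, and the stubbed GPU check are assumptions, not any real game or engine API.

```python
# Hypothetical sketch of a "run locally if you have the grunt, otherwise
# farm it out" dispatcher. All names and numbers are illustrative.

def detect_vram_gb():
    # Stand-in for a real GPU memory query (a real game would ask the
    # driver or a vendor library); pretend this machine has a modest card.
    return 4

def run_local(prompt):
    # Placeholder for on-device LLM inference.
    return f"[local model] reply to: {prompt}"

def run_remote(prompt):
    # Placeholder for a request to the game's online inference server.
    return f"[cloud server] reply to: {prompt}"

def generate_dialogue(prompt, min_vram_gb=8):
    """Prefer the local model when the card has enough memory,
    otherwise fall back to the online server."""
    if detect_vram_gb() >= min_vram_gb:
        return run_local(prompt)
    return run_remote(prompt)

print(generate_dialogue("Greet the player."))
```

With the stubbed 4 GB card, the call falls through to the remote path; on a beefier GPU the same call would stay local, which is the appeal of the hybrid approach.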