this post was submitted on 03 Jun 2024
82 points (95.6% liked)
PC Gaming
8581 readers
For PC gaming news and discussion. PCGamingWiki
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let's Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments, within reason.
- Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source where possible, unless the source is paywalled or non-English. If the title is clickbait or lacks context, you may lightly edit it.)
founded 1 year ago
you are viewing a single comment's thread
view the rest of the comments
In this context it is being used to reduce rendering load, and therefore to be less demanding on system resources.
While itself consuming a metric ton of electricity. The system works 🤪
No ...
No, I'm saying you are fundamentally misunderstanding what technology they're talking about, and are assuming every type of AI is the same. In this article she is talking about graphics AI running on the local system as part of the graphics pipeline. It is less performance-intensive and therefore less power-intensive. There is no "vast AI network" behind AMD's presumptive work on a competitor to DLSS/frame generation.
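The point about reduced rendering load can be shown with simple arithmetic: with upscaling, the GPU shades a frame at a lower internal resolution and an AI model reconstructs it to the output resolution. A minimal sketch, using hypothetical example resolutions (native 4K output, 1440p internal render, roughly what a "quality" upscaling preset might use):

```python
# Hypothetical illustration: why AI upscaling reduces rendering load.
# The GPU shades far fewer pixels per frame; the upscaler fills in the rest.
native_pixels = 3840 * 2160    # pixels shaded when rendering natively at 4K
internal_pixels = 2560 * 1440  # pixels shaded at an assumed 1440p internal res

savings = 1 - internal_pixels / native_pixels
print(f"Pixels shaded per frame drops by {savings:.0%}")
# → Pixels shaded per frame drops by 56%
```

The upscaling pass itself has a cost, but it is typically far smaller than shading the skipped pixels, which is why the net effect is lower GPU load and power draw rather than higher.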