this post was submitted on 16 Sep 2024
71 points (94.9% liked)
PC Gaming
Can't or won't?
"money"
Bosses said use AI so we use AI.
Why wouldn't it?
It's talking about two things under "AI", which is actually a pretty good use of the label:
Generating a lower-res frame and upscaling it.
Generating additional frames based on what might happen in between real frames.
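A toy sketch of those two ideas in plain pixel math (this is NOT how DLSS/FSR actually work; real upscalers and frame generators use trained neural networks plus motion vectors, while this just shows nearest-neighbor upscaling and a naive average for an in-between frame):

```python
# Toy illustration only: frames are 2D lists of brightness values.
# Assumed names (upscale_nearest, interpolate_frames) are made up for this sketch.

def upscale_nearest(frame, factor):
    """Upscale a 2D grid of pixels by an integer factor (nearest neighbor)."""
    return [
        [frame[y // factor][x // factor]
         for x in range(len(frame[0]) * factor)]
        for y in range(len(frame) * factor)
    ]

def interpolate_frames(frame_a, frame_b):
    """Generate a 'fake' in-between frame by averaging two rendered frames."""
    return [
        [(a + b) / 2 for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

low_res = [[0, 10], [20, 30]]           # tiny 2x2 frame the GPU actually rendered
high_res = upscale_nearest(low_res, 2)  # 4x4 output from 2x2 of real work

frame1 = [[0, 0], [0, 0]]
frame2 = [[10, 10], [10, 10]]
between = interpolate_frames(frame1, frame2)  # midpoint frame, never rendered
```

The point is the same as in the comment: you pay rendering cost for a small frame (or for every other frame) and synthesize the rest.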
There's no valid reason not to use that. Hardware costs more, so you'd be paying a lot more money for the same performance. With fewer people making that choice, the price differential would be even greater.
Like, this is right. They can't make them cheap enough that people will buy them without this.
It's facts bro
In the future it'll be less about upscaling and more about giving the algorithm a minimalist, basic 3D representation as a starting point, then telling it to make it photoreal. Ray tracing isn't really going anywhere, but AI radiosity is going to supplant it in many applications.
Think about it. These algorithms are already making impressive, if uncanny, images from simple text prompts, in less time than it would take most CPU/GPU combinations on consumer hardware to actually ray trace a scene. Ray tracing will always be there when you need the accuracy, but AI radiosity is going to offer benefits most people don't even comprehend yet.
For instance, once it makes its way into consumer hardware, suddenly a lot of older games will be able to have their graphics upgraded with no recoding or tricks, just using the input video stream as a reference.
That's an entirely different thing and not happening anytime soon...
That's the worst thing about labeling this stuff "AI": it gets lumped in with crazy, non-feasible shit like what you're talking about.
Wrong as usual. Here's a video from two years ago, from an actual light transport researcher for 3D rendering. Again, this likely won't be on your 50-series GPU, but it is something very actively being researched, and it will likely start showing up in consumer hardware before the end of the decade.
A two year old YouTube prediction with no other evidence?
Obviously you're a man of science...
It's literally just a two-to-three-minute video, with literal links to the research paper in question. Perhaps you should read more and talk less.
It's weird that on a day-old thread you immediately get two extra upvotes, and I get three downvotes as soon as you reply...
I don't help people who play those silly games. Have a nice life.