Game creation is one thing, but AMD's CEO believes that AI is going to be increasingly used by developers to get games onto your screen without necessarily rendering everything.
While itself consuming a metric ton of electricity. The system works 🤪
Not really: AMD’s FSR upscaling can increase visual quality/fidelity while using less power than rendering at full resolution. This is easy to see in the Steam Deck’s battery-life improvement when you enable it. Scaling this to millions of devices can indeed reduce energy usage.
When you read about “AI power consumption”, it’s mostly about training the models, not so much about running them after they’re trained.
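To put rough numbers on that intuition (purely illustrative, nothing below is a measured FSR or Steam Deck figure): most per-frame GPU work scales with how many pixels you shade, and the upscale pass only adds back a small fixed overhead. A minimal sketch, assuming made-up resolutions and overhead:

```python
# Back-of-envelope sketch of render-low-then-upscale vs. native rendering.
# All values are illustrative assumptions, not measurements of FSR or the Steam Deck.

native = (1280, 800)        # assumed native display resolution
internal = (832, 520)       # assumed lower internal render resolution
upscale_overhead = 0.10     # assumed cost of the upscale pass, as a fraction of a native frame

native_pixels = native[0] * native[1]
internal_pixels = internal[0] * internal[1]

# Treat per-frame GPU work as roughly proportional to the number of shaded pixels.
relative_cost = internal_pixels / native_pixels + upscale_overhead
print(f"Upscaled frame costs ~{relative_cost:.0%} of a native frame")  # ~52% under these assumptions
```

Under those made-up assumptions the GPU does roughly half the work per frame, which is where the battery-life gain comes from.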
Removed by mod
FSR in this case doesn’t need to be trained any further. The model is already trained, so it can now be released to run on MILLIONS of devices and reduce their load. And then you knock railroads, which are one of the most efficient forms of land transportation we have. Just full of bad takes here.
Training an AI model is intensive, but using it after the fact is relatively cheap. It’s cheaper than traditional rendering to reach the same level of detail. The upfront cost of training is offset by the savings on every video card running the tech from then on. Kinda like how railroads are expensive to build but much cheaper to operate after the fact.
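For the sake of argument, here’s how that amortization plays out with invented round numbers (none of these are real figures for FSR, AMD, or any actual model):

```python
# Illustrative amortization of a one-time training cost across a fleet of devices.
# Every number here is an invented placeholder, not a real measurement.

training_energy_kwh = 500_000       # assumed one-time training cost
devices = 5_000_000                 # assumed number of devices running the model
saving_per_device_kwh_year = 2.0    # assumed yearly energy saved per device by rendering fewer pixels

training_share = training_energy_kwh / devices       # one-time cost attributed to each device
fleet_saving_per_year = devices * saving_per_device_kwh_year

print(f"Training cost per device: {training_share:.2f} kWh (one-time)")
print(f"Fleet-wide saving: {fleet_saving_per_year:,.0f} kWh per year")
```

The point isn’t the specific numbers, it’s that a fixed one-time cost divided across millions of devices becomes tiny next to a recurring per-device saving, same as the railroad analogy.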
It’s pretty simple. If you can’t understand delayed gratification, then you’re right: school did fail you.
PS: the railroad comparison really breaks down when you consider that railroads are cheaper to build than the highways that trucks use, and that we don’t, in fact, need to truck in the resources anyway. We’ve been building railroads longer than trucks have existed, after all.
Thanks for the totally made-up figures. I’m glad we agree that training itself is quite costly. There’s no data on how much energy AI will save vs. rendering (we don’t know how much rendering we can avoid; there has to be a cap), so you can’t really keep riding that horse.
You’re right though, the rail analogy sucks. Not for the reasons you list, but because they will never stop training AI. Unless you feel AI will stop learning and needing to evolve.
No …
Removed by mod
No, I’m saying you are fundamentally misunderstanding what technology they’re talking about and assuming every type of AI is the same. In this article she is talking about graphics AI running on the local system as part of the graphics pipeline. It is less performance-intensive and therefore less power-intensive. There is no “vast AI network” behind AMD’s presumptive work on a competitor to DLSS/frame generation.