Intel is aiming to get into the dedicated graphics business alongside Nvidia and AMD, having launched its Arc line of GPUs last fall. We had the chance to put the Intel Arc A750 Limited Edition through its paces this past month to see whether Intel's $250 GPU is worth a look compared to the competition.
In other words, you simply DO NOT NEED AI for high-quality temporal upscaling (something both AMD AND Epic Games seem to have realized). That's nothing but a load of utter horses**t perpetuated by Nvidia to justify the MASSIVE silicon die space they dedicated to Tensor Cores starting with Turing / the RTX 2000 series, cores that were ACTUALLY added for professional compute reasons.
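For what it's worth, the heart of a hand-written (non-AI) temporal upscaler like FSR 2 or Epic's TSR really is just heuristics: accumulate an exponential history buffer and clamp the history sample to the current frame's local neighborhood to reject ghosting. Here's a toy 1-D sketch of that idea; the function name and parameters are illustrative, not any vendor's actual algorithm:

```python
def temporal_accumulate(history, current, alpha=0.1):
    """Blend the current frame into the accumulated history (exponential
    moving average). Before blending, clamp each history sample to the
    min/max of the current pixel's 1-D neighborhood -- the classic TAA
    heuristic for rejecting stale history, with no neural network involved.
    """
    out = []
    n = len(current)
    for i, h in enumerate(history):
        left = current[max(i - 1, 0)]
        mid = current[i]
        right = current[min(i + 1, n - 1)]
        lo, hi = min(left, mid, right), max(left, mid, right)
        h_clamped = min(max(h, lo), hi)  # reject history outside the neighborhood
        out.append(alpha * mid + (1.0 - alpha) * h_clamped)
    return out
```

When the scene changes abruptly (history far outside the current neighborhood), the clamp snaps the result to the new frame immediately; when it's stable, the low `alpha` accumulates samples over time, which is where the extra effective resolution comes from.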
(Nvidia would never tell you this, but modern DLSS 2 doesn't even significantly use the Tensor Cores anymore! They gave up on that after the disaster that was DLSS 1 which was PROPERLY Tensor accelerated.)
Also, XeSS only gives acceptable visual results ON ARC; the compatibility layer used for other brands' GPUs looks ABSOLUTELY HORRIBLE, meaning game developers are EVEN LESS likely to add it to their games over FSR & DLSS. For that very reason, banking on widespread XeSS support is FREAKING IDIOTIC!!!
Buying ARC Alchemist over RDNA 2 primarily "because it has hardware AI acceleration" is a fool's errand of clueless stupidity.
I am very interested in how the Battlemage is going to perform. Also, Celestial? Why not Cleric, fml.
The reason Nvidia includes tensor cores in their GPUs is the compute market. And not even the entire compute market, but a substantial subset of it: machine learning. Nvidia makes more money in the compute market than they do selling GeForce cards to gamers.
I think that Intel is after the compute market, and actually more interested in that market than consumer graphics. But you can't compete in the GPU compute market if you don't have credible GPUs.
Tensor cores are basically useless for consumer graphics. Nvidia has shoehorned some code into using them for marketing reasons, but that's hardly evidence that they're actually useful. Tensor cores are also an enormous pain to use by any means other than calling a library that Nvidia has written.
One reason to put compute hardware that is useless for consumers into consumer graphics parts is that the same GPU chips are sometimes used for both. The GPU chip used in the GeForce RTX 3090 was also sold as the A40 and A10 for compute, and the chip in the GeForce RTX 4090 is also sold as the L40.
But another reason is driver support. AMD's "ROCm" compute drivers are complete garbage, which makes the compute cards that rely on them pretty useless. Part of the problem is that hardly anyone has hardware that can run them, so bugs don't get reported and fixed: the handful of supported cards cost a fortune and are difficult to even find for sale.
In contrast, plenty of hobbyists and students have a GeForce card that is supported by Nvidia's GPU compute drivers. If someone wants to try an Nvidia GPU for compute, it's easy to buy one, and a lot of people even happen to already have one. That helps to build more of an ecosystem for people who use Nvidia GPUs for compute.
Intel basically has to offer GPU compute support in their consumer parts in order to make a credible play for the GPU compute market. And that means putting in things like their version of tensor cores that isn't actually useful for consumer graphics. They're making progress, and at least on Linux, their driver support for GPU compute is already better than AMD's, though that's admittedly a very low bar.
Yes, Intel is having board partners build their video cards just like AMD and Nvidia do, and probably for the same reasons. If they get enough uptake from board partners, they might well stop selling their own cards directly and just let board partners handle it. But they had to start somewhere, and at least initially, Intel didn't have any GPU board partners because they didn't have any discrete GPUs at all.
Those partners being? Acer and Asus? Anyone else?
But unless you're planning on buying a current-generation Intel card, it doesn't matter much who Intel's partners are at the moment. New partners can sign up quickly enough if Intel's next generation looks like it will sell in large numbers.