Intel recently announced with some fanfare that they had updated their video drivers for Intel HD Graphics 4000 to increase performance by about 10% in some games, as well as adding OpenCL 1.2 support. That's the occasion for this thread, really.
There are several noteworthy items here. First of all, Intel is actually updating their video drivers for a product that launched about 11 months ago. Given that AMD and Nvidia typically offer video driver updates for several years after launching a new video card (they still offer video driver updates for Radeon HD 2000 and GeForce 8000 series cards, respectively, though AMD will likely end the former soon), maybe this shouldn't be shocking. But it wasn't that long ago that I was looking for video driver updates for someone with a then-5-year-old Intel GMA something or other, and the latest drivers available were released a mere three months after launch.
Also of note is that Intel is now supporting the latest version of a meaningful API. That doesn't always happen, for a variety of reasons. With OpenGL and OpenCL, the Khronos Group announces a new version, and then AMD and Nvidia may take some months after that to fully implement it in drivers. Nvidia didn't support DirectX 11 until about 6 months after it launched, because their first cards intended to support it were delayed. Supporting an API version that launched in 2011 only well into 2013 isn't going to win Intel any speed awards, but it's better than never. For what it's worth, Nvidia still doesn't support OpenCL 1.2, though that's largely because they're more interested in pushing CUDA rather than OpenCL for GPU compute.
And it's not like Intel is suddenly supporting the latest and greatest version of everything. While AMD supports DirectX 11.1, OpenGL 4.3, OpenCL 1.2, and OpenGL ES 3.0, Intel hasn't gotten past DirectX 11 and OpenGL 4.0. But still, for Intel, that's progress of sorts.
Perhaps a quick history lesson is in order here. Intel's first graphics product, the i740, launched in 1998. It was so successful that, to this day, it is still the only discrete graphics card that Intel has launched. Intel would soon move into chipset-based integrated graphics with their GMA graphics line, which was generally known for being able to display the desktop and not much more. Well, unless that desktop was the Vista version, in which case, it couldn't handle even that. Video playback? Have fun trying. Games? Hope you didn't mean 3D.
But then at some point it became clear that the future for broad swaths of the market was a GPU and a CPU on the same chip. AMD announced that they would do this shortly after buying ATI in 2006. Intel would eventually go this route as well. Traditionally, people who wanted an Intel processor as well as graphics that actually worked could buy a discrete video card for it. If one wanted to go the integrated route to save money, one could get an Nvidia chipset with Nvidia integrated graphics. But that wouldn't be possible anymore if the GPU had to be built into the same chip as the CPU.
But there are advantages to having the CPU and GPU in a single chip, especially for smaller form factors. In cell phones, if you don't have an SoC, you won't be taken seriously. In tablets, it's awfully close to that. Even in laptops, a discrete video card is best avoided if you can for reasons ranging from heat dissipation to battery life to physical space.
So Intel would need to build a GPU that people thought was good enough, or else lose CPU sales. With Arrandale/Clarkdale at the start of 2010, they had Intel HD Graphics, which put the CPU and GPU in the same package, even if they were two different dies. Gaming performance and API compliance were still dismal and energy efficiency was ridiculously bad, but they at least offered working video playback capabilities, which made it considerably less bad than their older GMA lines.
With Sandy Bridge at the start of 2011, Intel offered Intel HD Graphics 2000 and 3000. The latter had double the performance of the former, so if you wanted relatively less slow Intel graphics, you could get it. Unfortunately, API compliance was still dismal; it still didn't support DirectX 11, which launched in 2009, nor even OpenGL 3.2, which was roughly the OpenGL equivalent of 2007's DirectX 10. And then there was the question of whether the integrated graphics even worked at all; at launch, one Linux site reported that they couldn't get any benchmarks at all to even run, whether slowly or otherwise.
Intel could plausibly claim that Sandy Bridge integrated graphics were the best performing integrated graphics available at the time. AMD's latest was the Radeon HD 4200 series, which was based on the bottom-of-the-line discrete card from the then-ancient Radeon HD 3000 series. Nvidia was out of integrated graphics for desktops and laptops entirely. Cell phone integrated graphics were much slower due to power constraints.
Then Llano hit several months later, and AMD demonstrated that it was possible to make integrated graphics that was actually good. Llano graphics (Radeon HD 6550D for desktops, 6620G for laptops) would support the latest APIs, and offer performance that blew away Intel HD Graphics 3000 in basically every metric that you can think of, plus some that you can't. While Llano's CPU performance left much to be desired, that level of graphical performance in integrated graphics with laptop-friendly power consumption was revolutionary.
Intel tried to counter with Ivy Bridge and Intel HD Graphics 4000 last year. In spite of being a full process node ahead of AMD, Intel was unable to catch Llano in graphical performance. AMD extended its lead considerably with Trinity later last year, and then once again with Richland in recent days.
But Intel didn't necessarily need to beat AMD in integrated graphics performance. They only needed to offer good enough performance. For many people, "runs the latest games" isn't a requirement for "good enough". But "may or may not work right even for simple non-gaming tasks" most certainly isn't good enough.
Enter video drivers. Intel video drivers have traditionally been maligned as awful to worthless, and for good reason. If Intel graphics offer plenty of performance when they work, but sometimes mysteriously fail due to driver bugs, then they're not necessarily good enough. AMD offers graphics that are good enough for most of the market, and for many purposes, also a CPU that is good enough.
So Intel would need not merely higher graphical performance and recent API support, but also drivers that actually worked. And so they have started updating drivers well after a product launches. Will that continue for several years, as it would with any respectable graphics vendor? Maybe; it will take several years to find out.
But with Ivy Bridge, Intel looks like they may well be doing the things they need to in order to have a respectable graphics product. That's not just a departure from Intel's long-ago history; it's a major departure even from how they handled the previous generation Sandy Bridge graphics. With Haswell, Intel promises to do even more of it, offering the latest DirectX 11.1 support, a third, higher-performing graphics option, as well as on-package cache on some chips to greatly improve memory bandwidth.
Intel even recently announced a DirectX extension to do order-independent transparency properly, which is (in my opinion) the one thing glaringly missing from the latest versions of DirectX and OpenGL. I have no idea if the extension actually works, or if it will completely kill your performance. I mean, kill your performance because of the extension in particular, not just because you're using Intel graphics.
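For anyone wondering why order-independent transparency is worth an extension at all: standard alpha blending only gives the right answer if transparent fragments reach the blender sorted back-to-front, and games often can't guarantee that per pixel. Here's a toy simulation of the problem (my own illustration, not Intel's extension or any real graphics API) using scalar "colors" and the standard "over" blend equation. OIT hardware effectively does the depth-sorted version per pixel for you.

```python
# Toy demonstration: naive alpha blending depends on fragment submission
# order; sorting fragments back-to-front by depth (what OIT automates
# per pixel) makes the result order-independent.

def blend(dst, src_color, src_alpha):
    """Standard 'over' alpha blending: new fragment drawn on top of dst."""
    return src_color * src_alpha + dst * (1.0 - src_alpha)

def composite(fragments, background=0.0):
    """Blend (color, alpha, depth) fragments in the order given."""
    result = background
    for color, alpha, _depth in fragments:
        result = blend(result, color, alpha)
    return result

def composite_oit(fragments, background=0.0):
    """Sort back-to-front (largest depth first), then blend."""
    ordered = sorted(fragments, key=lambda f: f[2], reverse=True)
    return composite(ordered, background)

near = (1.0, 0.5, 0.2)   # bright fragment close to the camera
far  = (0.0, 0.5, 0.8)   # dark fragment behind it

# Submission order changes the naive result...
print(composite([near, far]))   # 0.25 -- wrong: far drawn over near
print(composite([far, near]))   # 0.5  -- correct order by luck

# ...but the depth-sorted result is the same either way.
print(composite_oit([near, far]))   # 0.5
print(composite_oit([far, near]))   # 0.5
```

The interesting question for Intel's extension is whether doing that per-pixel sorting in hardware is cheap enough to use in practice.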
Of course, you don't just need to offer video driver updates and claim that you support APIs. You need the drivers to actually work and for bugs to be fixed in a timely manner, as AMD and Nvidia have done for years. So it's too soon to say that Intel graphics are perfectly fine just yet. But neither is it glaringly obvious anymore that the latest Intel graphics are complete garbage. And that constitutes progress for Intel.