
Is Intel finally taking graphics seriously?

Quizzical Member Legendary Posts: 25,355

Intel recently announced with some fanfare that they had updated their video drivers for Intel HD Graphics 4000 to increase performance by about 10% in some games, as well as adding OpenCL 1.2 support.  That's the occasion for this thread, really.

There are several noteworthy items here.  First of all, Intel is actually updating their video drivers for a product that launched about 11 months ago.  Given that AMD and Nvidia typically offer video driver updates for several years after launching a new video card (they still offer video driver updates for Radeon HD 2000 and GeForce 8000 series cards, respectively, though AMD will likely end the former soon), maybe this shouldn't be shocking.  But it wasn't that long ago that I was looking for video driver updates for someone with a then-5-year-old Intel GMA something or other, and the latest drivers available were released a mere three months after launch.

Also of note is that Intel is now supporting the latest version of a meaningful API.  That doesn't always happen, for a variety of reasons.  With OpenGL and OpenCL, the Khronos Group announces a new version and then AMD and Nvidia may take some months to fully implement it in drivers.  Nvidia didn't support DirectX 11 until about 6 months after it launched, because their first cards intended to support it were delayed.  Finally supporting, well into 2013, an API version that launched in 2011 isn't going to win Intel any speed awards, but it's better than never.  For what it's worth, Nvidia still doesn't support OpenCL 1.2, though that's largely because they're more interested in pushing CUDA for GPU compute than OpenCL.
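
If you want to check what your own driver actually exposes rather than taking vendor announcements at face value, the OpenCL API itself will tell you.  Here's a minimal C sketch (my own illustration, not anything from Intel's release notes) that lists each OpenCL device along with the version string its driver reports; it assumes you have the OpenCL headers and an ICD loader installed and that you link against -lOpenCL:

    /* List each OpenCL device and the OpenCL version its driver reports.
     * Build (assuming headers and an ICD loader are installed):
     *   gcc cl_version.c -lOpenCL -o cl_version */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;
        if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
            fprintf(stderr, "No OpenCL platforms found.\n");
            return 1;
        }
        if (num_platforms > 8) num_platforms = 8;

        for (cl_uint p = 0; p < num_platforms; ++p) {
            cl_device_id devices[8];
            cl_uint num_devices = 0;
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8,
                               devices, &num_devices) != CL_SUCCESS)
                continue;
            if (num_devices > 8) num_devices = 8;

            for (cl_uint d = 0; d < num_devices; ++d) {
                char name[256], version[256];
                clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
                clGetDeviceInfo(devices[d], CL_DEVICE_VERSION, sizeof(version), version, NULL);
                /* CL_DEVICE_VERSION reads like "OpenCL 1.2 <vendor-specific info>". */
                printf("%s: %s\n", name, version);
            }
        }
        return 0;
    }

On an HD Graphics 4000 with the updated driver, the version string should now start with "OpenCL 1.2" rather than the 1.1 the older drivers reported.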

And it's not like Intel is suddenly supporting the latest and greatest version of everything.  While AMD supports DirectX 11.1, OpenGL 4.3, OpenCL 1.2, and OpenGL ES 3.0, Intel hasn't gotten past DirectX 11 and OpenGL 4.0.  But still, for Intel, that's progress of sorts.

Perhaps a quick history lesson is in order here.  Intel's first graphics product, the i740, launched in 1998.  It was so successful that, to this day, it is still the only discrete graphics card that Intel has launched.  Intel would soon move into chipset-based integrated graphics with their GMA line, which was generally known for being able to display the desktop and not much more.  Well, unless that desktop was Vista's, in which case it couldn't even handle that.  Video playback?  Have fun trying.  Games?  Hope you didn't mean 3D.

But then at some point it became clear that the future for broad swaths of the market was a GPU and a CPU on the same chip.  AMD announced that they would do this shortly after buying ATI in 2006.  Intel would eventually go this route as well.  Traditionally, people who wanted an Intel processor as well as graphics that actually worked could buy a discrete video card.  If one wanted to go the integrated route to save money, one could get an Nvidia chipset with Nvidia integrated graphics.  But that wouldn't be possible anymore if the GPU had to be built into the same chip as the CPU.

But there are advantages to having the CPU and GPU in a single chip, especially for smaller form factors.  In cell phones, if you don't have an SoC, you won't be taken seriously.  In tablets, it's awfully close to that.  Even in laptops, a discrete video card is best avoided if you can manage it, for reasons ranging from heat dissipation to battery life to physical space.

So Intel would need to build a GPU that people thought was good enough, or else lose CPU sales.  With Arrandale/Clarkdale at the start of 2010, they had Intel HD Graphics, which put the CPU and GPU in the same package, even if they were two different dies.  Gaming performance and API compliance were still dismal and energy efficiency was ridiculously bad, but they at least offered working video playback, which made them considerably less bad than the older GMA lines.

With Sandy Bridge at the start of 2011, Intel offered Intel HD Graphics 2000 and 3000.  The latter had double the performance of the former, so if you wanted relatively less slow Intel graphics, you could get it.  Unfortunately, API compliance was still dismal; it still didn't support DirectX 11, which launched in 2009, nor even OpenGL 3.2, which was roughly the OpenGL equivalent of 2007's DirectX 10.  And then there was the question of whether the integrated graphics even worked; at launch, one Linux site reported that they couldn't get any benchmarks to run at all, slowly or otherwise.

Intel could plausibly claim that Sandy Bridge integrated graphics were the best performing integrated graphics available at the time.  AMD's latest was the Radeon HD 4200 series, based on the bottom-of-the-line discrete card from the then-ancient Radeon HD 3000 series.  Nvidia was out of integrated graphics for desktops and laptops entirely.  Cell phone integrated graphics were much slower due to power constraints.

Then Llano hit several months later, and AMD demonstrated that it was possible to make integrated graphics that were actually good.  Llano's graphics (Radeon HD 6550D for desktops, 6620G for laptops) supported the latest APIs and offered performance that blew away Intel HD Graphics 3000 in basically every metric that you can think of, plus some that you can't.  While Llano's CPU performance left much to be desired, that level of graphical performance in integrated graphics with laptop-friendly power consumption was revolutionary.

Intel tried to counter with Ivy Bridge and Intel HD Graphics 4000 last year.  In spite of being a full process node ahead of AMD, Intel was unable to catch Llano in graphical performance.  AMD extended its lead considerably with Trinity later last year, and then once again with Richland in recent days.

But Intel didn't necessarily need to beat AMD in integrated graphics performance.  They only needed to offer good enough performance.  For many people, "runs the latest games" isn't a requirement for "good enough".  But "may or may not work right even for simple non-gaming tasks" most certainly isn't good enough.

Enter video drivers.  Intel video drivers have traditionally been maligned as awful to worthless, and for good reason.  If Intel graphics offer plenty of performance when they work, but sometimes mysteriously fail due to driver bugs, then they're not necessarily good enough.  AMD offers graphics that are good enough for most of the market, and for many purposes, also a CPU that is good enough.

So Intel would need not merely higher graphical performance and recent API support, but also drivers that actually worked.  And so they have started updating drivers well after a product launches.  Will that last several years into the future, as any respectable graphics vendor would?  Maybe; it will take several years to find out.

But with Ivy Bridge, Intel looks like they may well be doing the things they need to in order to have a respectable graphics product.  That's not just a departure from Intel's long-ago history; it's a major departure even from how they handled the previous-generation Sandy Bridge graphics.  With Haswell, Intel promises to do even more of it, offering DirectX 11.1 support, a third, higher-performing graphics option, and on-package cache on some chips to greatly improve memory bandwidth.

Intel even recently announced a DirectX extension to do order-independent transparency properly, which is (in my opinion) the one thing glaringly missing from the latest versions of DirectX and OpenGL.  I have no idea if the extension actually works, or if it will completely kill your performance.  I mean, kill your performance because of the extension in particular, not just because you're using Intel graphics.
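
In case it's not clear why order-independent transparency is worth caring about, here's a toy C example (my own illustration; it has nothing to do with Intel's extension specifically) showing that the standard "over" blend used for transparency gives different answers depending on the order the layers are drawn in.  That order dependence is exactly why engines normally have to sort transparent geometry back to front, and why hardware or API support for doing it properly would be welcome:

    /* Demonstrate that "over" alpha blending is order-dependent.
     * Build with: gcc blend_order.c -o blend_order */
    #include <stdio.h>

    typedef struct { float r, g, b; } Color;

    /* Standard "over" blend: dst = src*alpha + dst*(1 - alpha) */
    static Color blend_over(Color src, float alpha, Color dst) {
        Color out = {
            src.r * alpha + dst.r * (1.0f - alpha),
            src.g * alpha + dst.g * (1.0f - alpha),
            src.b * alpha + dst.b * (1.0f - alpha)
        };
        return out;
    }

    int main(void) {
        Color white = {1, 1, 1}, red = {1, 0, 0}, green = {0, 1, 0};

        /* The same two 50%-opaque layers over the same background,
         * blended in opposite orders. */
        Color a = blend_over(red,   0.5f, blend_over(green, 0.5f, white));
        Color b = blend_over(green, 0.5f, blend_over(red,   0.5f, white));

        printf("green then red: %.2f %.2f %.2f\n", a.r, a.g, a.b);  /* 0.75 0.50 0.25 */
        printf("red then green: %.2f %.2f %.2f\n", b.r, b.g, b.b);  /* 0.50 0.75 0.25 */
        return 0;
    }

The two results differ even though the scene is the same, which is the whole problem an order-independent technique has to solve without tanking performance.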

Of course, you don't just need to offer video driver updates and claim that you support APIs.  You need the drivers to actually work and for bugs to be fixed in a timely manner, as AMD and Nvidia have done for years.  So it's too soon to say that Intel graphics are perfectly fine just yet.  But neither is it glaringly obvious anymore that the latest Intel graphics are complete garbage.  And that constitutes progress for Intel.

Comments

  • Ichmen Member Uncommon Posts: 1,228

    meh, if Intel can at least match AMD in power while being acceptable price-wise.. then that's OK by me, in all honesty.  I can never really understand how AMD sets up their stats for things.. Intel always seems so simple and clear to understand :/ though I guess that might be due to me dealing with Intel for years, idk


    while I'm not a fan of integrated graphics, I'm sure this could make for some pretty decent laptops :o which wouldn't be a bad thing imo

    just typically power and price screw everything over.. even if it's the best thing since sliced bread.. if it costs 4x the price of uncut bread and a knife... why would you buy it :/

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by Ichmen

    just typically power and price screw everything over.. even if it's the best thing since sliced bread.. if it costs 4x the price of uncut bread and a knife... why would you buy it :/

    Price could easily end up being rather problematic for Intel.  Pricing on Crystalwell (the variant of Haswell with the on-package memory) is rumored to start at $300 and go up from there.  Even if it does catch Richland in integrated graphics performance, if AMD will sell you a quad core for half the price that Intel charges for a dual core with their best graphics, why buy Intel?

    Perhaps this is why the rumored Ivy Bridge chip with on-package memory was cancelled:  the only laptop vendor interested in it at all was Apple, and Apple wasn't necessarily willing to pay what Intel wanted to charge.

  • Fatality001133 Member Posts: 10
    Wow... only a 10% increase in performance? That sucks. Intel is good but ATI is better; the only problem with an ATI setup is overheating.


  • Quizzical Member Legendary Posts: 25,355
    Originally posted by Fatality001133
    Wow... only a 10% increase in performance? That sucks. Intel is good but ATI is better; the only problem with an ATI setup is overheating.

    A 10% performance increase is certainly nice to have.  Remember that this isn't "if you buy a new part, you get a 10% performance increase".  Rather, it's "that part you bought several months ago will run 10% faster if you update your drivers".  A 10% increase from drivers is in line with what you'd hope for from AMD and Nvidia cards.   I don't know how broadly applicable the improvements are; they may only apply to a few particular games.

    AMD graphics offer much better energy efficiency than Intel's, so they shouldn't have a problem with overheating unless the cooling system is botched.  A Trinity APU with a 35 W TDP will beat an Ivy Bridge quad core with a 45 W TDP in most games--in spite of Intel offering substantially better energy efficiency on the CPU side.  AMD's advantage here is large enough that I'd be very surprised if an 8 W AMD Temash SoC doesn't crush a 10 W Intel Haswell SoC in more than a few games, in spite of Intel having a process node advantage on top of more power headroom.  That said, Temash's poor single-threaded performance will cripple it in more than a few games, so it won't win at everything.

  • Trionicus Member Uncommon Posts: 498
    I'm gathering that in my lifetime everything will be on 1 chip. It's nice to see Intel trying; I may actually purchase an integrated graphics laptop now... well, maybe in the future.
  • KenFisher Member Uncommon Posts: 5,035

    Intel taking integrated graphics performance seriously?  About 10 years late, if you ask me.


    I've always wondered what "extreme" meant in Intel Extreme Graphics... extremely poor D3D support?  *grin*



    Ken Fisher - Semi retired old fart Network Administrator, now working in Network Security.  I don't Forum PVP.  If you feel I've attacked you, it was probably by accident.  When I don't understand, I ask.  Such is not intended as criticism.
  • Rthuth434 Member Posts: 346
    Taking it seriously, sure. Too bad they have less chance of being competitive on this front than AMD does on the high-end CPU front.
  • Quizzical Member Legendary Posts: 25,355
    Originally posted by Trionicus
    I'm gathering that in my lifetime everything will be on 1 chip. It's nice to see Intel trying; I may actually purchase an integrated graphics laptop now... well, maybe in the future.

    Desktops probably aren't moving to everything on one chip anytime soon, outside of the low end.  If a desktop wants to offer 36 PCI Express lanes, 14 USB ports, 6 SATA ports, and various other things, putting those all into a single chip means that you need a ridiculous number of pins for that chip to offer the pinouts for everything.  What is traditionally done for desktops and laptops is that there will be a separate chipset that has the pinouts for a bunch of ports like that, and then far fewer pins needed to connect the chipset to the CPU.  This commonly means that you can't use the full bandwidth for all of the PCI Express lanes and all of the USB ports and all of the SATA ports all at once, but that's okay, because you don't need to.

    You might think this immediately causes the same problem for the chipset.  But chipsets are made on an old, cheap process node, so burning a lot of cheap silicon (and package pins) there is much less of a concern.  At Intel, chipsets are commonly made in fabs that would otherwise lie idle or be destroyed entirely (or rather, repurposed for newer process nodes).  Furthermore, system memory takes a ton of pins (240 per channel of DDR2 or DDR3), and in modern chips, those connect to the CPU directly, without needing to go through the chipset.

    I don't think that those reasons to have a chipset in a desktop are going to go away anytime soon.  Neither is it likely that motherboard vendors will want to lose the ability to add their own additional chips and features, rather than only offering a single barebones model.

    In a tablet that only needs 1 SATA port, 2 USB ports, and no PCI Express lanes, you don't have the need for a ton of pins to give the option of connecting a ton of devices.  Whoever is making the tablet probably has considerable input into what goes into the SoC, and ARM will let you put their cores in a chip with whatever else you want.  So there or in cell phones, there's no need for a bunch of separate chips.

    Laptops are an in-between case.  Like tablets, laptops usually don't need the pins available to connect a huge number of devices.  But if a laptop chip is an SoC (System on a Chip; i.e., everything built into a single chip), then trying to use it in a desktop restricts you to a very low-end product.  If some of your chips that were meant for laptops come back from the fabs and perform fine but use too much power for a laptop, you may be able to stick them in a desktop that can handle the added power consumption, and it will be fine.  Intel does this with Ivy Bridge and AMD with Trinity, for example.

    For laptop chips that are low end and incapable of making decent desktops anyway, this isn't a big deal.  AMD's upcoming Kabini chip for ultraportable laptops will be a true SoC, for example.  For that matter, Kabini is the higher-wattage version of the design; the same die will be used as Temash in tablets.

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by Rthuth434
    Taking it seriously, sure. Too bad they have less chance of being competitive on this front than AMD does on the high-end CPU front.

    I'm not predicting that Intel will soon have better integrated graphics than AMD.  But Intel doesn't need to have better graphics than AMD.  If Intel could offer graphics that uniformly delivered 80% of the performance of AMD's, that would be a huge improvement for them, as it would mean that Intel had graphics that were clearly good enough for everyone but gamers.  It would mean good drivers, good API support, a good feature set, and so forth, all of which are areas in which Intel has traditionally been woefully deficient.
