Radeon HD 7970 GHz Edition launches

Quizzical Member Legendary Posts: 25,348

Apparently it's nothing more than a Radeon HD 7970 with higher clock speeds.  The rumors of greatly reduced voltages were wrong.

The new feature is "PowerTune Technology with Boost".  Basically, AMD is saying, we can do Nvidia's GPU Boost trick, too.  And here, we'll prove it.  There's a nominal clock speed of 1 GHz, with boost clock up to 1.05 GHz.  AMD is guaranteeing that every Radeon HD 7970 GHz Edition will hit the 1.05 GHz clock speed but never go higher apart from overclocking, so there should be a lot less difference between cards than there is with the GeForce GTX 680.

The idea seems to be that if you can adjust clock speeds up and down with power consumption, you can set the nominal clock speed equal to the maximum clock speed that you'll allow, or you can set the nominal clock speed somewhat lower.  At the 7970 launch, AMD did the former.  At the GTX 680 launch, Nvidia did the latter.  So now at the 7970 GHz launch, AMD is saying, we can do the latter, too.
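
To make that concrete, here's a toy sketch of how a boost scheme can pick a clock speed from power headroom. The base and boost clocks are the published 7970 GHz Edition figures; the power budget and the linear ramp are invented for illustration, not AMD's or Nvidia's actual control algorithm:

```python
# Toy model of a boost scheme: run faster when there is power
# headroom, fall back to the nominal clock at the power limit.
# BASE/BOOST are the published 7970 GHz Edition clocks; the power
# budget and the linear ramp are made up for illustration.

BASE_CLOCK_MHZ = 1000   # nominal clock
BOOST_CLOCK_MHZ = 1050  # guaranteed top boost clock
POWER_LIMIT_W = 250     # hypothetical board power budget

def boosted_clock(current_power_w: float) -> int:
    """Pick a clock between base and boost based on power headroom."""
    if current_power_w >= POWER_LIMIT_W:
        return BASE_CLOCK_MHZ
    headroom = (POWER_LIMIT_W - current_power_w) / POWER_LIMIT_W
    clock = BASE_CLOCK_MHZ + headroom * (BOOST_CLOCK_MHZ - BASE_CLOCK_MHZ)
    return min(int(clock), BOOST_CLOCK_MHZ)

print(boosted_clock(150))  # plenty of headroom -> 1020
print(boosted_clock(250))  # at the power limit -> 1000
```

Setting the nominal clock equal to the maximum boost clock, as AMD did at the original 7970 launch, is just the special case where the two constants are equal.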

Speaking of differences between cards, the top GPU Boost clock speed has been measured to vary from card to card.  Rumors say that Nvidia cherry-picked the best cards to send to the press.  One French site bought a retail GTX 680 and found that their press sample was faster than the retail card in every single game they tested--though only by about 2%.  Cheating benchmarks by 2% might not seem like much, but the 7970 GHz and the GTX 680 are close enough in performance that for quite a few review sites, 2% could easily decide which card they call faster on average.
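
To see how a 2% edge can flip a verdict, here's a quick sketch with invented frame rates (only the 2% figure comes from the paragraph above), averaged with a geometric mean, which is one common way review sites summarize results across games:

```python
from math import prod

# Invented frame rates for illustration; only the 2% gap is taken
# from the story above. A cherry-picked press sample that runs 2%
# faster can flip which card "wins" the averaged comparison.

gtx680_retail  = [62.0, 55.0, 71.0, 48.0]   # hypothetical retail-card FPS
radeon_7970ghz = [63.0, 55.5, 71.5, 48.5]   # hypothetical competitor FPS

def geomean(xs):
    return prod(xs) ** (1 / len(xs))

gtx680_press = [fps * 1.02 for fps in gtx680_retail]  # 2% faster press sample

print(geomean(radeon_7970ghz) > geomean(gtx680_retail))  # True: 7970 GHz wins
print(geomean(radeon_7970ghz) > geomean(gtx680_press))   # False: press 680 wins
```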

There is a penalty to pay for higher clock speeds, of course:  higher power consumption, and likely more noise.  The Radeon HD 7970 GHz Edition uses quite a bit more power than the GeForce GTX 680.  The situation is bad enough that I'd say don't get a reference card of the former.  It does all right on temperatures, but it gets unreasonably loud.  It should be fine with the better coolers that several board partners will use.  We're not approaching GTX 480 territory here.

This leaves the question of whether you should get a Radeon HD 7970 GHz Edition or a GeForce GTX 680.  That depends on a variety of factors, most notably, whether AMD and Nvidia can actually supply the cards--and at what price.

If you could get both at the $500 MSRP, then I'd say to go with a GTX 680 for a single monitor setup or a 7970 GHz for Eyefinity.  But that's a big "if"; right now, you can't get either at MSRP, and you might not be able to for a while.  Three months after the GTX 680 launch, Nvidia still can't keep cards in stock at MSRP; the cheapest one on Newegg right now is $525.  AMD says 7970 GHz Edition cards should start showing up next week, with much greater volume in July.

Comments

  • Quizzical Member Legendary Posts: 25,348

    Apparently there might not be a Radeon HD 7970 GHz Edition reference card at all.  Rather, it will be all non-reference cards right from the start.  That's probably a good thing, as the reference card was no good.

    Tech sites seem to be all over the place with their verdicts.  My quick summary:

    Hard OCP:  You're better off getting a normal 7970 with a good cooler and overclocking it.

    Tech Report:  AMD wins at the high end, if you want to go there.

    Anandtech:  It's a tie.

    Tom's Hardware:  Avoid the high end and get a GeForce GTX 670 instead.

    Tech Power Up:  The 7970 GHz is faster, but I'd rather have a cooler, quieter GTX 680.

    Tech Spot:  Most gamers should get a 7950 or GTX 670 instead.

    Hot Hardware:  We can't give you a clear recommendation.

    Hardware Canucks:  Best for ultra high resolutions, not lower resolutions.

    Hexus:  Get a GTX 670 instead and overclock it.

    Guru of 3D:  Blah, blah, blah, I don't have any recommendation.

    -----

    The verdict seems to be that on average, a 7970 GHz Edition is a little faster than a GTX 680, but not a lot.  But it's close enough that Nvidia could probably release an overclocked press edition GTX 680 and reclaim the top spot.

    Even so, being able to get a good card from either vendor is never a bad thing.  If you need vendor-specific features, that makes your decision for you.

    In other news, prices on the Radeon HD 7870 seem to be falling.  Newegg has one now for $299 with a promo code, before a $20 rebate, and with a free game:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814161404

    A 7870 for $350 seemed like too much, especially once the GeForce GTX 670 arrived.  But if it settles in around $300, that's a far more compelling purchase.

  • UOvet Member Posts: 514

    I think Anandtech said it was nothing to get all hot over. Get a 670, wait for Maxwell, coming Q2 2014. It's what I'm gonna do anyway... probably won't even need to upgrade to Maxwell, but it'll be a new architecture, so we'll see.

    I've personally never owned AMD. I just hear too many horror stories about drivers and such. I'll stick to Nvidia.

  • Quizzical Member Legendary Posts: 25,348
    Originally posted by UOvet

    I just hear too many horror stories about drivers and such.

    Which are FUD, and nothing more.  Vague claims about unspecified AMD driver problems in no particular games are so common that sometimes I wonder if they're Nvidia-paid viral marketers.  A real driver problem is a particular problem in a particular game, not drivers that are somehow generically horrible.

    Now, there are some particular problems with AMD drivers that usually get fixed pretty quickly.  But the same is true of Nvidia drivers.  (E.g., remember the Starcraft II title screen frying Nvidia cards?)  The gap in driver quality is small enough that it's not even clear which side has better drivers.  It's certainly not the sort of chasm that would justify dismissing one side out of hand.

    Now, a GeForce GTX 670 is a nice card if that's your budget.  But on a sub-$150 budget or anything around $300-$350, if you insist on an Nvidia card today, you're going to massively overpay for inferior hardware.  And I say that's completely stupid.

  • Terronte Member Posts: 321

    It used to be NVIDIA was the power hogs and the noisy ones and AMD was slightly quieter (iirc).

    Do card companies introduce any new features, or is it just speed nowadays?

  • Ridelynn Member Epic Posts: 7,383

    Originally posted by Terronte
    It used to be NVIDIA was the power hogs and the noisy ones and AMD was slightly quieter (iirc).
    Do card companies introduce any new features, or is it just speed nowadays?

    I wouldn't go so far as to call either company a power hog with their latest generations - thankfully. Competition is a good thing. With the 600 series, nVidia has caught up on power efficiency, and now the two companies are very close with respect to power per pixel.

    There have been many new features added recently. In fact, I'd argue that the speed increases have been more or less meaningless lately, as PC gaming has been hamstrung by console development.

    In the last couple of generations, we've seen huge changes in power management (both at load and idle), noise reduction, dynamic clock adjustments (PowerTune and GPU Boost), DX11.1 compliance, PCIe 3.0 compliance, vapor chamber heat sinks, multi-monitor and very high resolution support, and 3D support.

    The power management features have been the biggest changes by far - as video cards have started to bump against power delivery ceilings, in turn producing massive amounts of heat (and noise).

  • Quizzical Member Legendary Posts: 25,348

    In a typical generation, neither side is particularly ahead of the other in performance per watt.  When one side (usually AMD) gets to a new process node first, that gives a temporary advantage in performance per watt until the other side catches up.

    Fermi was something of an aberration, as that was simply a bad architecture that used a lot more power than it should have.  So when it was Evergreen (Radeon HD 5000 series) against broken Fermi (GeForce 400 series) or Northern Islands (Radeon HD 6000 series) against fixed Fermi (GeForce 500 series), AMD was way ahead in performance per watt.  But that advantage is gone with the current generation, just as such advantages eventually faded in earlier generations of parts.
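
    Performance per watt is easy to sanity-check yourself from any review's numbers: average frame rate divided by measured board power.  A minimal sketch, with made-up figures rather than measurements:

    ```python
    # Performance per watt = frames per second / watts drawn.
    # Both numbers below are invented placeholders; substitute a
    # review's measured average FPS and board power under load.

    cards = {
        "Radeon HD 7970 GHz Edition": {"fps": 70.0, "watts": 250.0},
        "GeForce GTX 680": {"fps": 68.0, "watts": 195.0},
    }

    for name, data in cards.items():
        print(f"{name}: {data['fps'] / data['watts']:.3f} FPS per watt")
    ```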
