
PowerColor announces Radeon HD 7990

Quizzical Member Legendary Posts: 22,094

http://www.powercolor.com/event/Devil13/

Their site is an awful pain to navigate, so try a write-up here:

http://www.xbitlabs.com/news/graphics/display/20120823235642_PowerColor_Unleashes_the_Devil_World_s_First_Dual_GPU_Radeon_HD_7990_Graphics_Card_Launched.html

Two Radeon HD 7970s at 1 GHz on a single card should easily trounce a GeForce GTX 690 in raw performance, since the GTX 690 had to underclock the GPUs considerably from GTX 680 reference speeds in order to fit inside a 300 W power envelope.

The trouble is that it gets that extra speed by drawing enormous amounts of power.  The card comes with three 8-pin PCI-E power connectors, which means it would be allowed to draw as much as 525 W.  It also uses a huge 3-slot cooler with three fans.  PowerColor says the card can handle 550 W, which I find plausible but not obvious.
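
If you want to sanity-check that 525 W figure, it's just the PCI-E spec limits added up: 75 W from the x16 slot plus 150 W per 8-pin connector.  A quick Python sketch (spec ceilings only, not a claim about what any particular card actually draws):

    # PCI-E power budget arithmetic, using the spec ratings only.
    PCIE_SLOT_W = 75     # an x16 slot can supply up to 75 W
    SIX_PIN_W = 75       # each 6-pin connector is rated for 75 W
    EIGHT_PIN_W = 150    # each 8-pin connector is rated for 150 W

    def max_board_power(eight_pin=0, six_pin=0):
        """Maximum draw allowed while staying within the connector specs."""
        return PCIE_SLOT_W + eight_pin * EIGHT_PIN_W + six_pin * SIX_PIN_W

    print(max_board_power(eight_pin=3))  # 525 W, the figure above
    print(max_board_power(eight_pin=2))  # 375 W, the ceiling for a two-connector card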

And then the other problem is the limited utility of dual GPU cards.  If you want two GPUs in SLI or CrossFire, you're better off getting two single GPU cards.  That's probably also true if you want three GPUs.  So the Radeon HD 7990 is really meant for people who want four GPUs in quad CrossFireX.

And so we go back to power consumption.  Two GeForce GTX 690s mean 600 W in heat from the video cards.  Two PowerColor Radeon HD 7990 Devil 13s mean what, maybe 1000 W?  Add in a processor and power supply inefficiencies and you're looking at a 1200 W space heater of a computer.  Is that what you really want?  I don't.
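
As a rough back-of-the-envelope check on those totals, something like this, where the card wattages, CPU draw, and power supply efficiency are all assumed round numbers rather than measurements:

    # Very rough wall-power estimate; every number here is an assumption.
    def wall_power(card_w, cards, cpu_w=100, psu_efficiency=0.9):
        dc_load = card_w * cards + cpu_w   # DC load the PSU must deliver
        return dc_load / psu_efficiency    # what the computer pulls from the wall

    print(round(wall_power(300, 2)))  # two GTX 690s: about 780 W
    print(round(wall_power(500, 2)))  # two 7990 Devil 13s: about 1220 W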

So it looks like this generation will basically end up as the reverse of the previous one.  Last time, the Asus Mars II (two GTX 580s on a single card) was the fastest dual GPU card on the market, but the Radeon HD 6990 was far more sensible because it used so much less power.  This time, some variant of a Radeon HD 7990 (possibly but not necessarily PowerColor's) will be the fastest card, but the GTX 690 is the more sensible one, again because of an enormous disparity in power consumption.

Unless, of course, you want to use more than four monitors, in which case I'm not sure whether GTX 690s can do it.  A single GK104 chip can only drive four.  I'm not sure whether Nvidia Surround can span more than four monitors by plugging them into different cards.  Each GTX 690 has four monitor ports, and you'd think that if the cap were four per GPU, Nvidia would wire some ports to each GPU so you could use more monitors.

Does it sound crazy to talk about wanting to use more than four monitors?  Well, consider that we're talking about using four GPUs, and driving a single monitor with that is simply nuts, with the possible exception of a quad HD 3840 x 2160 monitor.  Except that AMD supports that resolution on Radeon HD 7000 series cards, and I don't think Nvidia does.
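
The pixel arithmetic behind that exception, in case it isn't obvious:

    # "Quad HD" is literally four 1080p screens' worth of pixels.
    print(3840 * 2160)                      # 8294400
    print(3840 * 2160 == 4 * 1920 * 1080)   # True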

The upshot is that dual GPU cards are a very narrow niche market, and while some variant of a Radeon HD 7990 will be the way to get a quad CrossFireX setup with the highest graphical performance possible, that only makes sense for a tiny handful of people in the world.  And in particular, it doesn't make sense for most of the people who will buy a 7990.

If I had to have a dual GPU card, I'd rather have a GeForce GTX 690 than a Radeon HD 7990, due to the power consumption difference.  But I don't want a dual GPU card at all.  If I had to buy a new card right now, I'd probably grab a Radeon HD 7870 or so.

Comments

  • Cleffy Member Rare Posts: 6,247

    Reminds me of the 6990 ROG edition.  However, the solution does not look ideal to me, considering PowerColor does use watercooling blocks on some of their graphics cards.

    There is just something about how the fans are positioned and the rest of the cooler is enclosed that seems like it would not be too efficient.  Where does the air go?  It's also my main concern with all PCI-E slot cooling solutions now: where does the air go?  In this case it pushes hot air onto the board, and that gets force-exhausted to the rear of the case or out the other side of the card.  Considering this card has to dissipate the heat generated from over 500 W at maximum, it could get toasty.  Also, this board uses three 6-pin connectors.  Seems a little weird, considering it would be less obtrusive with two 8-pin, or an 8-pin and a 6-pin.  The card is also so heavy that it needs more support than just the PCI-E slot.

    The upside to AMD is that you can get a board that supports quad CrossFire with four single-GPU cards, like MSI's top-range board.

  • Quizzical Member Legendary Posts: 22,094
    It's three 8-pin connectors.  Their picture is wrong.
  • Ridelynn Member Epic Posts: 7,060

    Last I remember reading, QuadFire was bugged horribly, with many games outright crashing and totally unplayable, although that was a while ago and the drivers have had a chance to improve. I've not seen anything recent past TriFire to confirm its performance or improvement.

    To be fair, Quad-SLI had some problems as well, but it wasn't as bad. But again, I can't find much of anything to confirm whether it's any better or not.
