
nVidia's Kepler GTX 680 Specs Leaked

Khrymson Member Uncommon Posts: 3,090

Tons more images here, as well as some early benchmarks!

 

It looks like our sources were spot on yesterday when we heard that the new GTX 680 will feature two 6-pin PCI-Express connectors and have a TDP of around 200W (190W to be precise). When combined with 1536 shader units, 2GB of GDDR5 memory clocked at 6GHz and, according to what we heard, a very high GPU clock, it comes as no surprise that Nvidia is claiming higher performance per Watt for its upcoming GTX 680.



This is a big win for Nvidia; we aren't used to hearing Nvidia tout performance per Watt, but now it appears the tide has turned. The GTX 680's performance sits somewhere around the Radeon HD 7970's, winning some games and benchmarks and losing others. Of course, these results all came from Nvidia, so we'll withhold judgement until we see independent reviews.



As noted before, the card is based on Nvidia's 28nm GK104 Kepler chip, which is around 300mm² in size.



The bad part of the story is that our sources were again spot on with the US $549 price tag. Nvidia has a card with similar performance and a better TDP, so the US $549 price isn't really a surprise. We suspect Nvidia has additional maneuvering room on price; we'll just have to wait and see whether AMD bites and heads into a new round of price wars.
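Quick back-of-the-envelope math on those leaked specs, if you're curious how they stack up on paper. Note the 256-bit memory bus and the ~1 GHz core clock below are my own assumptions based on the rumors, not confirmed numbers:

```python
# Rough peak numbers for the leaked GTX 680 specs.
# ASSUMPTIONS: the 256-bit bus width and ~1.0 GHz core clock are
# rumored/guessed values, not confirmed by Nvidia.

shaders = 1536          # leaked shader count
core_clock_ghz = 1.0    # assumed core clock ("very high" per the rumor)
mem_clock_ghz = 6.0     # leaked effective GDDR5 data rate
bus_width_bits = 256    # assumed memory bus width
tdp_watts = 190         # leaked TDP

# Each shader can do one fused multiply-add (2 FLOPs) per clock.
gflops = shaders * 2 * core_clock_ghz
# Bandwidth = effective data rate x bus width in bytes.
bandwidth_gbs = mem_clock_ghz * (bus_width_bits / 8)

print(f"Peak compute:     {gflops:.0f} GFLOPS single precision")
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")
print(f"Perf per Watt:    {gflops / tdp_watts:.1f} GFLOPS/W")
```

That works out to roughly 3 TFLOPS and 192 GB/s, which at least lines up with the article's claim of Radeon HD 7970-class performance.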


Oh me wants it...!

 

Apparently these cards are gonna be in very limited supply when they release on March 22nd/23rd, but I'm gonna be hovering over my F5 key trying to score one. Currently I have the GTX 580, which destroys just about every title out there, but in many games, and especially MMOs, it can struggle a bit in cities when there are tons of players around. Usually it only drops as low as around 30-35 FPS, which is still fairly good considering I run my games on ultra settings with 8-16x AA, sometimes 32x CSAA forced through the nVidia control panel, plus many other forced settings.

And with the massive WvWvW battles in GW2 this summer, I want the best experience, 50-60+ FPS, while there are hundreds of players on screen...heh

 

As you can see from the benchmarks, the GTX 680 is a fair bit better than the GTX 580, and considering the 580 just had a price reduction this past week, I might as well sell mine off sooner rather than later, while it still has decent value. Ultimately I want whatever nVidia has planned with their GK110 chip coming later this fall/winter, probably to be named the GTX 780 or whatnot, but by then the 580 would probably drop in price again, plus I hope we're running around Tyria and the Mists by then.

 

Anyway, I'm excited!

Comments

  • Soraksis Member Uncommon Posts: 294

    I am not convinced this card is as big of an upgrade as they would have us believe. My 590 scored much higher in 3DMark11 than that 680 did, and with a lesser CPU in my system. I also have a system running a pair of 570s in SLI with an i7 2700K overclocked to 4.4GHz that beat that benchmark by almost 4k points. I am sure it's a nice card, but I cannot see myself upgrading to it when I have two computers with older cards outscoring it.

  • Khrymson Member Uncommon Posts: 3,090

    You're running SLI in both PCs though, and comparing that against a single GPU. Of course your setups will destroy this new card. Also, it's not listed in what I posted above, but this GTX 680 is only nVidia's mid-range Kepler GPU. It's the GK110, and I think GK112, coming later this fall that are their flagship cards.

     

    When it comes to just the 580 vs the 680, it's a fairly good jump in performance, plus I'm really liking that 190W TDP.

     

    Even the GTX 295 is still holding its own quite well to this day. And that GPU is from some 3 generations ago...

  • Quizzical Member Legendary Posts: 25,351

    Let's wait for real reviews to gauge performance, rather than some cherry-picked and/or fake results in synthetics and Nvidia-friendly games.

    As for GK104 being midrange, that's a bit of a stretch.  Rumors say that there was supposed to be a GK100 that was the high end, with GK104 the next chip down, until GK100 was cancelled because Nvidia couldn't make it.  But the GK104 die size is rumored to be closer to Tahiti than Pitcairn.  A rumored $300 is perhaps a little high to be midrange, and a rumored $550 definitely isn't midrange.

    Before we get too excited about TDP, let's not forget that Nvidia's claimed TDPs the last two generations have basically been lies.  Hopefully their clock boost stuff will include a hardware-enforced cap on power consumption, analogous to AMD's PowerTune.  But for real-world power consumption, we'll still have to wait and see how close to the cap cards come.
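    For anyone curious what a hardware-enforced cap actually does, here's a toy sketch of the control loop a PowerTune-style limiter runs: estimate power each interval, throttle the clock when the estimate exceeds the cap. Every name and number in it is invented for illustration, not anyone's actual firmware:

    ```python
    # Toy PowerTune-style limiter: sample a power estimate each
    # interval, drop the clock when over the cap, creep back up
    # when there's headroom. Illustrative only; all values made up.

    POWER_CAP_W = 190        # board power limit (the claimed TDP)
    CLOCK_STEP_MHZ = 13      # granularity of one throttle step

    def next_clock(clock_mhz, estimated_power_w, max_clock_mhz=1006):
        """Pick the clock for the next interval from a power estimate."""
        if estimated_power_w > POWER_CAP_W:
            return clock_mhz - CLOCK_STEP_MHZ          # over cap: throttle
        return min(clock_mhz + CLOCK_STEP_MHZ, max_clock_mhz)  # headroom

    # A heavy synthetic load pushing the estimate over the cap:
    clock = 1006
    for estimate in (170, 185, 198, 203, 188):
        clock = next_clock(clock, estimate)
        print(f"estimate {estimate:3d} W -> next clock {clock} MHz")
    ```

    The point is that a cap only bounds the worst case; real-world draw can still sit anywhere below it, which is why we have to wait for measured numbers.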

    The claimed specs do fit the widespread rumors that Nvidia is finally adopting AMD's approach of more shaders clocked lower. AMD won the last three generations pretty handily with that approach, so this could finally negate AMD's large advantage in performance per mm^2. Probably not coincidentally, if Nvidia had realized the day the Radeon HD 4870 launched that they needed to copy that approach, the first Nvidia cards to reach market after that change would be Kepler.
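    The physics behind "more shaders clocked lower" is straightforward: dynamic power scales roughly with voltage squared times frequency, while throughput scales linearly with shader count times clock. A toy comparison, with every figure invented for the example:

    ```python
    # Why "wide and slow" wins on performance per Watt.
    # Dynamic power ~ shaders * V^2 * f; throughput ~ shaders * f.
    # All voltages and clocks below are made up for illustration.

    def rel_power(shaders, clock_ghz, volts):
        return shaders * volts**2 * clock_ghz

    def throughput(shaders, clock_ghz):
        return shaders * clock_ghz

    designs = {
        "narrow, hot-clocked": (512, 1.5, 1.10),   # Fermi-ish shape
        "wide, lower-clocked": (1536, 1.0, 0.95),  # Kepler/GCN-ish shape
    }
    for name, (shaders, clock, volts) in designs.items():
        perf = throughput(shaders, clock)
        power = rel_power(shaders, clock, volts)
        print(f"{name}: perf {perf:.0f}, power {power:.0f}, "
              f"perf/W {perf/power:.2f}")
    ```

    Lower clocks let you run lower voltage, and the V^2 term means the wide design gets more work done per unit of power even though its total draw can be higher.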

    If the claimed 1.5 GHz GDDR5 is true, it means that Nvidia finally has a working GDDR5 memory controller after the embarrassment of the previous three generations.  When the memory controller in your high end chips can't handle the stock clock speeds of the bottom commercial bin of GDDR5 memory chips, you're doing something wrong.  AMD's early GDDR5 memory controllers had their flaws, too, but they fixed that way back in 2009.

  • Cleffy Member Rare Posts: 6,412

    Looks like an AMD part with a couple more months to develop. When I looked at the architecture of the stream processors I thought the same thing: it looks like an AMD board. But that's kind of a given, considering it's going to be AMD boards that the next generation of games are built around.
