nVidia GTX 680 Introduction Video

Comments

  • kadepsyson, Member Uncommon, Posts: 1,919

    Thanks Quizzical.

     

    I'm especially interested in the forthcoming 2560 and Nvidia Surround benchmarks mentioned in one of the initial quotes you listed. I wonder whether the 2 GB of memory turns out to be a large detriment, or whether something else is at work that either kills off or seriously improves performance at such resolutions.

  • Zekiah, Member Uncommon, Posts: 2,483

    The history of recent cards is extremely pertinent and important, thanks for sharing.

    I haven't researched the 680, but I know that recent nVidia cards were large and ran loud and hot. Does anyone know if that has changed with this one?

    "Censorship is never over for those who have experienced it. It is a brand on the imagination that affects the individual who has suffered it, forever." - Noam Chomsky

  • Quizzical, Member Legendary, Posts: 25,353

    Originally posted by kadepsyson

    From what little I read, though, it seems the card is barely faster than an HD 7970 while having 1 GB less memory. The lower memory is a dealbreaker for me personally.

    The difference between 2 GB and 3 GB doesn't matter unless you're running a 3-monitor setup.  Even at a resolution of 2560x1600, 2 GB will pretty much invariably be adequate.

    On the other hand, I'm skeptical that the leaks are representative of final performance.  See the comments in post #2 of this thread.  That's from someone who runs a major hardware site that has a card, has tested it, and simply can't post the review quite yet, so it's not random speculation from someone who doesn't know anything.

  • Quizzical, Member Legendary, Posts: 25,353

    Originally posted by kadepsyson

    Thanks Quizzical.

     

    I'm especially interested in the forthcoming 2560 and Nvidia Surround benchmarks mentioned in one of the initial quotes you listed. I wonder whether the 2 GB of memory turns out to be a large detriment, or whether something else is at work that either kills off or seriously improves performance at such resolutions.

    What happened last generation was that Fermi had more geometry hardware, while Northern Islands had more shaders. If your performance was limited by geometry, then Nvidia won; if it was limited by shaders, then AMD won. Higher resolutions put a lot more load on the shaders, but not so much additional load on the geometry. Southern Islands seems to balance things about the way Northern Islands did. Kepler is a new architecture, so it could take a very different approach from Fermi--and rumors say that Kepler is a lot more shader-heavy than Fermi was.

    Video memory really comes down to whether you have enough or you don't. If you have enough, then it doesn't matter whether you barely have enough or have ten times what you need. If you don't have enough, your performance will be terrible, or the game might even refuse to run at all. Higher graphical settings and especially higher monitor resolutions tend to require more video memory.
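
    To make that geometry-versus-shader trade-off concrete, here is a minimal toy model in Python. The throughput numbers are invented purely for illustration (real GPUs pipeline these stages in far more complicated ways); the point is only that frame time is set by whichever stage is the bottleneck, and that resolution scales the shader load but not the geometry load.

```python
# Toy model of the geometry-vs-shader balance described above.
# All throughput numbers are invented for illustration.

def frame_time_ms(pixels, triangles, shader_rate, geometry_rate):
    """shader_rate: pixels shaded per ms; geometry_rate: triangles per ms.
    Frame time is set by whichever stage is slower."""
    return max(pixels / shader_rate, triangles / geometry_rate)

shader_heavy   = dict(shader_rate=200_000, geometry_rate=50)   # "AMD-style" split
geometry_heavy = dict(shader_rate=120_000, geometry_rate=100)  # "Fermi-style" split

triangles = 600  # per frame; does not grow with resolution
for w, h in [(1280, 1024), (1920, 1080), (2560, 1600)]:
    a = frame_time_ms(w * h, triangles, **shader_heavy)
    b = frame_time_ms(w * h, triangles, **geometry_heavy)
    print(f"{w}x{h}: shader-heavy {1000/a:.0f} fps, geometry-heavy {1000/b:.0f} fps")
```

    Run it and the geometry-heavy card wins at 1280x1024 while the shader-heavy card wins at 2560x1600, which is the crossover pattern described above.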

  • kadepsyson, Member Uncommon, Posts: 1,919

    The Nvidia promotional video did mention that one card can power Surround with the GTX 680, which I think is new for them (?).

    I'd really like comparisons of that card running 1920x1080, 5760x1080, and 7680x1600 (what I have), and then those three results compared with the 7970. I think it'd be very interesting.

  • DLangley, Member, Posts: 1,407

    The on-topic post has been restored.

  • Quizzical, Member Legendary, Posts: 25,353

    Originally posted by Zekiah

    The history of recent cards is extremely pertinent and important, thanks for sharing.

    I haven't researched the 680, but I know that recent nVidia cards were large and ran loud and hot. Does anyone know if that has changed with this one?

    For desktops, that was true of the early GF100-based cards, that is, the reference GeForce GTX 470 and 480. For the 470 and 480, I think the problem was that Nvidia built coolers meant to dissipate a given amount of heat, then saw that the clock speeds they could reach within that TDP would lose to the Radeon HD 5850 and 5870 on performance. They decided it was better to lose badly in every way imaginable except for performance than to merely lose somewhat in every way imaginable including performance. So they clocked the cards higher and put out more heat than the coolers were meant to handle, which is how you get hot and loud.

    But that was a one-time blunder, and not repeated on other cards.  Indeed, some of the GeForce GTX 470s and 480s with custom coolers managed to largely avoid the heat and noise problem simply by spending what it took for a better cooler.  300 W means you're going to have some noise, but it doesn't have to be anywhere near the reference GTX 480 levels.

    I guess the GeForce GTX 590 had related problems. But that was for a different reason: no one has figured out how to safely dissipate 450 W on air in a two-slot cooler that complies with PCI Express standards for size. Nvidia tried, and couldn't. Asus made a custom card that could cool it right (the Mars II), but that was a huge three-slot cooler, and heavy enough that you ought to worry about the card's weight damaging the motherboard.

    But more to the point, there's no reason to believe that the GeForce GTX 680 will suffer from that sort of problem. It's not going to have a runaway TDP, as it wasn't even meant to be the top card in the current generation. Nvidia ended up cancelling the top GPU chip (GK100) and gave the GeForce GTX 680 marketing name to the top GK104 chip. If GK100 had not been cancelled, then the GeForce GTX 680 probably would have been the top GK100 chip, and the card that we're calling a GeForce GTX 680 now would have been a GTX 660 or some such.

    In laptops, there is the problem that it's hard to dissipate so much heat in so little space. AMD winning big on performance per watt for the last two generations was a huge deal in laptops. But I'm expecting Kepler to roughly catch up to Southern Islands, so AMD's advantage there likely disappears. We'll find out when the reviews are out.

  • Zekiah, Member Uncommon, Posts: 2,483

    That's good news then; I'll hold out hope of finally switching back. I've always preferred nVidia and want to switch back on my next rig if it makes sense. These ATI cards have been really good to me; I was surprised at how durable mine has been. It's always been the issue of drivers, though, and nVidia has always been better at that. I'm having issues with the latest driver and had to turn off anti-aliasing in STO. This concerns me, and although I'm sure they'll have a fix soon, I'm once again looking at the new nVidia cards.

    Thanks for the input.

    "Censorship is never over for those who have experienced it. It is a brand on the imagination that affects the individual who has suffered it, forever." - Noam Chomsky

  • Quizzical, Member Legendary, Posts: 25,353

    Originally posted by kadepsyson

    The Nvidia promotional video did mention that one card can power Surround with the GTX 680, which I think is new for them (?).

    I'd really like comparisons of that card running 1920x1080, 5760x1080, and 7680x1600 (what I have), and then those three results compared with the 7970. I think it'd be very interesting.

    Ah, yeah, if you're running 7680 x 1600, then you want 3 GB.  Or 4 GB or 6 GB if they make them.  You're the sort of person that they make the cards with crazy high amounts of video memory for.

    A Radeon HD 5450 could run three monitor Eyefinity.  Not in demanding games at playable frame rates, though.

    Having hardware support for four monitors built into the die is new for Nvidia. Assuming it's built into the die, that is, and not handled by some other chip on the PCB that splits the output. I guess we'll find out when the reviews launch. There were Nvidia cards from previous generations that could run three monitors on a single card, but it took some custom work from the board partner.

    Nvidia could do that custom work for their reference cards, and then say, hey, we can do that, too.  That's kind of a hack, though, and it's much better to have it properly supported in the GPU die.  Which they probably do.  We'll find out when reviews are posted.
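
    As a rough sanity check on the memory amounts discussed above, here is a back-of-the-envelope estimate in Python of the render-target memory alone at these resolutions. The buffer count and MSAA level are assumptions chosen for illustration; textures, geometry buffers, and driver overhead come on top of this and usually dominate, so treat the numbers as a floor, not a budget.

```python
# Back-of-the-envelope floor on video memory: render targets only.
# Assumed values (4 bytes/pixel, 4x MSAA, three full-resolution
# buffers) are illustrative, not measured.

BYTES_PER_PIXEL = 4  # 32-bit color

def render_targets_mb(width, height, msaa=4, buffers=3):
    return width * height * BYTES_PER_PIXEL * msaa * buffers / 2**20

for w, h in [(1920, 1080), (2560, 1600), (5760, 1080), (7680, 1600)]:
    print(f"{w}x{h}: ~{render_targets_mb(w, h):,.0f} MB in render targets alone")
```

    Even this crude floor puts 7680x1600 over half a gigabyte in render targets at 4x MSAA, roughly six times the 1920x1080 figure, before a single texture is loaded, which is why the extreme resolutions are where 2 GB versus 3 GB starts to matter.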

  • kadepsyson, Member Uncommon, Posts: 1,919

    When's the media embargo lift for these benchmarks?

  • Quizzical, Member Legendary, Posts: 25,353

    Originally posted by kadepsyson

    When's the media embargo lift for these benchmarks?

    The date and time at which an embargo ends tends to itself be embargoed.  Sometimes it leaks early, but I'm guessing tonight.

    One general PR rule is that you don't release news on weekends (Friday through Sunday) unless it's bad news and you don't want anyone to notice.  For example, the GTX 470 and 480 reviews were posted on a Friday evening, but that's the only time that I can think of that a major hardware NDA ended on a weekend like that.  (This applies to politics, too:  if a cabinet official is caught doing something horrible and has to resign, they'll wait until the next Friday afternoon/evening to announce it if they possibly can.)

    So if it's not coming this weekend, then that means either Thursday (most likely tonight, though it could be during the day tomorrow) or not until next week. And while Newegg commonly posts cards for sale a few hours early, or sometimes even a day or two, I've never seen them post cards five days early, as it would be if the GTX 680 weren't available until next week. So that's why I'm guessing tonight. But that's just a guess.

  • SuperXero89, Member Uncommon, Posts: 2,551

    Won't be buying a new video card for about two years :P

  • VooDoo_Papa, Member Uncommon, Posts: 897

    I'm still shocked gamers shell out so much to play such a limited selection of games, most of which run fine on two-year-old hardware.

    Maybe this is a bigger factor in non-MMOs, but most MMOs I play run on toasters.

  • Quizzical, Member Legendary, Posts: 25,353

    A lot depends on the settings you want to use. If you want to run games at 7680x1600 like Kade does above, you need a rather stronger video card than it takes for 1280x1024 (about nine times as many pixels). It also takes a lot more hardware to max settings than to just make games look nice.

    There's also the issue that you don't necessarily know what games you're going to play in the future or how demanding the games will be.

  • Zolgar, Member, Posts: 533

    According to the stickied thread about Kepler news on Tom's Hardware, people are saying that Newegg has responded to a few e-mails saying it will release at 6 AM (PST, I assume).

     

    Then someone else linked THIS from xtremesystems. 

    0118 999 881 999 119 725... 3

  • Kabaal, Member Uncommon, Posts: 3,042

    Originally posted by Zolgar

    According to the stickied thread about Kepler news on Tom's Hardware, people are saying that Newegg has responded to a few e-mails saying it will release at 6 AM (PST, I assume).

     

    Then someone else linked THIS from xtremesystems. 

    It's true as far as I can tell; the info I posted last night, saying the NDA will be lifted today at 13:00 and stores will start listing them, came from the purchasing manager of overclockers.co.uk.

     

    There's only a few hours until sites start posting some in-depth reviews, but here's some more info he's posted:

    Regarding the new boost system on the card:

    Correct, the boost is marginal, unless you set overclocks.



    For example, when I ran Heaven earlier my GPU clock increased to 1201 MHz, yet when I was playing Quake III it stayed at 1006 MHz, even though I had not changed any of the settings in Afterburner.



    __

    I assume they are all 100% reference hardware-wise, and the only USP will be warranty, bundled software, and cables?

    Correct.

    __

    UK price will be ~£420

    __

    In short: all cards are reference design, and no 4 GB parts anytime soon. OC models are about 2-4 weeks away, but you can just OC yourself.



    The card is mega quiet and has only 2x 6-pin power connectors, as these cards have a very low TDP and run very cool.

    __

    Regarding card size:

    Two slots, about 10" long.
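
    The boost behavior quoted above (1201 MHz under Heaven, base 1006 MHz in Quake III) suggests an opportunistic control loop. The sketch below, in Python, is a guess at the general flavor only and is not Nvidia's actual algorithm; the step size, utilization threshold, and power figures are all made-up values for illustration.

```python
# A guess at the flavor of a boost control loop: clocks ratchet up
# under heavy load when there is power headroom, and sit at the base
# clock in light games. NOT Nvidia's algorithm; all constants made up.

BASE_MHZ, MAX_BOOST_MHZ, STEP_MHZ = 1006, 1201, 13
POWER_LIMIT_W = 195  # the GTX 680's rated TDP

def next_clock(clock_mhz, utilization, power_draw_w):
    if utilization > 0.95 and power_draw_w < POWER_LIMIT_W:
        return min(clock_mhz + STEP_MHZ, MAX_BOOST_MHZ)  # headroom: step up
    return max(clock_mhz - STEP_MHZ, BASE_MHZ)           # otherwise fall back

clock = BASE_MHZ
for _ in range(20):  # heavy load (Heaven) pegs the GPU under the power limit
    clock = next_clock(clock, utilization=1.0, power_draw_w=170)
print("heavy load settles at", clock, "MHz")  # -> 1201

clock = next_clock(BASE_MHZ, utilization=0.4, power_draw_w=90)
print("light load stays at", clock, "MHz")    # -> 1006
```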

     

     

  • Zolgar, Member, Posts: 533

    0118 999 881 999 119 725... 3

  • Kabaal, Member Uncommon, Posts: 3,042

    Saw that earlier, Zolgar. Not a very interesting benchmark, tbh, given the games they selected, and they mixed up some of the card results.

  • Khrymson, Member Uncommon, Posts: 3,090

    3D Center Benchmarks ~ more ~ more

    Love it: the 680 is destroying the 580 handily, and in many cases is on par with or beating the GTX 590.

     

    Tweaktown GTX 680 Review and Benchmarks

    nVidia GTX 680 301.10 Drivers

     

     

     

  • Kabaal, Member Uncommon, Posts: 3,042

    I wouldn't say it's destroying anything. From the info that's starting to flood the net now, the 680 is coming out ahead at 1200p, but as soon as the resolution is upped it starts performing the same as the 7970; BF3 benchmarks at 2560x1600, for example, put them practically neck and neck.

    The real thing that's going to differentiate them is overclocking and three-monitor performance, which has yet to be seen. Another hour and that info should start appearing.

  • Khrymson, Member Uncommon, Posts: 3,090

    Originally posted by Kabaal

    I wouldn't say it's destroying anything. From the info that's starting to flood the net now, the 680 is coming out ahead at 1200p, but as soon as the resolution is upped it starts performing the same as the 7970; BF3 benchmarks at 2560x1600, for example, put them practically neck and neck.

    The real thing that's going to differentiate them is overclocking and three-monitor performance, which has yet to be seen. Another hour and that info should start appearing.

     

    Don't care about gaming over 1920x1080, so it's a great GPU for my habits. Once you start going higher, you can't run all the extra Ultra, AA, and additional forced settings like I do without taking a huge FPS drop.

     

    The percentage of gamers that play at 2560+ resolutions is very small, so that's not something the average gamer should be worried about. The 680 performs very well at the standard 1920x1080, and that's the resolution the majority of gamers play at.

     

    There is a video of 680 SLI and triple-monitor performance already out:

    Newegg TV: EVGA nVidia GeForce GTX 680 SLi & Triple Monitor Benchmarks

     

    Oops, link fixed...

  • Kabaal, Member Uncommon, Posts: 3,042

    You do realise that these cards are aimed at that exact small percentage of gamers? It's great that you're only interested in your own 1080p monitor, but most people who may buy the cards, including myself, play at higher resolutions.

    That Newegg benchmark was a waste of time; no game performance information whatsoever.

  • Betaguy, Member Uncommon, Posts: 2,629

    Originally posted by Quizzical

    Don't you think it would make more sense to wait until reviews are out, and then decide after that?  What if the GTX 680 isn't much faster than a GTX 580?

    Kyle on HardOCP has some interesting comments:

    http://hardforum.com/showthread.php?t=1681041

    "We have 2560 and 5760 coverage coming your way. Not the same story as 1080P."

    "You will see some of it this time because we feel as though we are not telling the entire story if we fully leave 1080P out this time."

    (Usually they go for highest playable settings, and if a game is playable on a card at 2560x1600 and fairly high settings, they won't even post results at lower resolutions.)

    "I am thinking this is what you will see out of us.



    Stock clocks - some 1080p - 2560 - 5760



    then - SLI



    then OCed cards head to head



     As you will see on launch day, there are some things making comparisons a little less black and white. And quite frankly, people relying on canned and synthetic benchmarks are [censored] in having a true analysis. And it is just going to get harder to evaluate properly. Brent and I are on top of it and will be making the investments going forward to make HardOCP the best GPU site in the world."

    -----

    As I read it, it sounds like there's something weird going on, and it's not a simple case of this card being 20% faster than that one. Charlie on SemiAccurate said a while ago that Kepler would have a lot more variation in performance from one game to the next than normal. As in, GK104 would beat Tahiti in some games and lose to Pitcairn in others.

    In the previous generation, Northern Islands scaled to high resolutions a lot better than Fermi. So it was pretty common to have two cards playing the same game at fixed settings (changing only the resolution), with the AMD card winning handily at 2560x1600 and the Nvidia card winning handily at 1280x1024. The other way around basically never happened unless the AMD card didn't have enough video memory and the Nvidia card did.

    This created the odd situation where, for example, a GeForce GTX 560 Ti would often post higher frame rates than a Radeon HD 6950 in situations where both cards were fast enough that the difference didn't matter (e.g., 100 frames per second versus 90).  But in situations where the frame rates were low enough for the difference to matter (e.g., 45 versus 40), the Radeon HD 6950 nearly always beat a GeForce GTX 560 Ti, and sometimes by a lot.
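
    A quick way to see why the same fps gap matters more at low frame rates is to convert those example numbers to frame times; this tiny Python check is just arithmetic on the figures quoted above.

```python
# The same fps gap costs more frame time at low frame rates.
for fast, slow in [(100, 90), (45, 40)]:
    t_fast, t_slow = 1000 / fast, 1000 / slow
    print(f"{fast} vs {slow} fps: {t_fast:.1f} vs {t_slow:.1f} ms "
          f"per frame (+{t_slow - t_fast:.1f} ms)")
```

    The 100-vs-90 case differs by about 1.1 ms per frame, while the 45-vs-40 case differs by about 2.8 ms per frame, and at the low end every extra millisecond is felt.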

    Retracting my comment; I don't usually talk tech because it's like talking politics and religion. It never ends well.

    "The King and the Pawn return to the same box at the end of the game"

  • Khrymson, Member Uncommon, Posts: 3,090

    EVGA GTX 680 ~ Overview

    EVGA Precision X & nVidia GPU Boost ~ the new Precision X is very nice!

  • Jimmy562, Member Uncommon, Posts: 1,158

    Originally posted by Khrymson

    Originally posted by Kabaal

    I wouldn't say it's destroying anything. From the info that's starting to flood the net now, the 680 is coming out ahead at 1200p, but as soon as the resolution is upped it starts performing the same as the 7970; BF3 benchmarks at 2560x1600, for example, put them practically neck and neck.

    The real thing that's going to differentiate them is overclocking and three-monitor performance, which has yet to be seen. Another hour and that info should start appearing.

     

    Don't care about gaming over 1920x1080, so it's a great GPU for my habits. Once you start going higher, you can't run all the extra Ultra, AA, and additional forced settings like I do without taking a huge FPS drop.

     

    The percentage of gamers that play at 2560+ resolutions is very small, so that's not something the average gamer should be worried about. The 680 performs very well at the standard 1920x1080, and that's the resolution the majority of gamers play at.

     

    There is a video of 680 SLI and triple-monitor performance already out:

    Newegg TV: EVGA nVidia GeForce GTX 680 SLi & Triple Monitor Benchmarks

     

    Oops, link fixed...

    You won't be needing a 680 for 1080 resolution. I use a GTX 480 at 1080 and run everything maxed at decent fps, so I can't see the reason to spend whatever the 680 will cost for an unnecessary FPS boost, unless for some reason you like seeing a higher number.

     

This discussion has been closed.