
nVidia GTX 680 Introduction Video

Khrymson (Eorzea, MO), Posts: 3,090, Member

 

Enjoy~!

 

Definitely gonna be getting one of these and selling my GTX 580 shortly thereafter! 


Comments

  • Quizzical, Posts: 14,784, Member Uncommon

    Don't you think it would make more sense to wait until reviews are out, and then decide after that?  What if the GTX 680 isn't much faster than a GTX 580?

    Kyle on HardOCP has some interesting comments:

    http://hardforum.com/showthread.php?t=1681041

    "We have 2560 and 5760 coverage coming your way. Not the same story as 1080P."

    "You will see some of it this time because we feel as though we are not telling the entire story if we fully leave 1080P out this time."

    (Usually they go for highest playable settings, and if a game is playable on a card at 2560x1600 and fairly high settings, they won't even post results at lower resolutions.)

    "I am thinking this is what you will see out of us.



    Stock clocks - some 1080p - 2560 - 5760



    then - SLI



    then OCed cards head to head



     As you will see on launch day, there are some things making comparisons a little less black and white. And quite frankly, people relying on canned and synthetic benchmarks are [censored] in having a true analysis. And it is just going to get harder to evaluate properly. Brent and I are on top of it and will be making the investments going forward to make HardOCP the best GPU site in the world."

    -----

    As I read it, it sounds like there's something weird going on, and it's not a simple case of "this card is 20% faster than that one."  Charlie on SemiAccurate said a while ago that Kepler will have a lot more variation in performance from one game to the next than normal.  As in, GK104 would beat Tahiti in some games and lose to Pitcairn in others.

    In the previous generation, Northern Islands scaled to high resolutions a lot better than Fermi.  So it was pretty common to have two cards playing the same game at fixed settings (except changing resolution), and the AMD card wins handily at 2560x1600, while the Nvidia card wins handily at 1280x1024.  The other way around basically never happened unless the AMD card didn't have enough video memory and the Nvidia card did.

    This created the odd situation where, for example, a GeForce GTX 560 Ti would often post higher frame rates than a Radeon HD 6950 in situations where both cards were fast enough that the difference didn't matter (e.g., 100 frames per second versus 90).  But in situations where the frame rates were low enough for the difference to matter (e.g., 45 versus 40), the Radeon HD 6950 nearly always beat a GeForce GTX 560 Ti, and sometimes by a lot.
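    One way to see why the low-frame-rate case matters more is to convert those example frame rates into frame times; the numbers below are the ones from this post, and the arithmetic is just an illustration.

```python
# The same roughly 10% gap is a much bigger absolute latency difference
# at low frame rates than at high ones.
def frame_time_ms(fps):
    """Time spent on each frame, in milliseconds."""
    return 1000.0 / fps

# High-frame-rate case from the post: 100 fps vs 90 fps.
fast_gap = frame_time_ms(90) - frame_time_ms(100)   # about 1.1 ms per frame

# Low-frame-rate case: 45 fps vs 40 fps.
slow_gap = frame_time_ms(40) - frame_time_ms(45)    # about 2.8 ms per frame

print(fast_gap, slow_gap)
```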

  • Zekiah (Aurora, CO), Posts: 2,499, Member

    I agree with Quiz.

    I've been waiting a long time for nVidia to step up to the plate because, all things being equal, I prefer their product; things haven't been equal for a long time, though. The AMD patch system is clumsy, but until nVidia surpasses AMD, I'll stick with AMD.

    "Censorship is never over for those who have experienced it. It is a brand on the imagination that affects the individual who has suffered it, forever." - Noam Chomsky

  • Quizzical, Posts: 14,784, Member Uncommon

    I'm not saying not to buy it.  I'm saying, wait until we see some real reviews, not just official Nvidia marketing.  What matters is how the Nvidia cards you can buy compare to the AMD cards you can buy, as they'll function in your computer in the real world in the games you'll play.

  • Zekiah (Aurora, CO), Posts: 2,499, Member

    Originally posted by Quizzical

    I'm not saying not to buy it.  I'm saying, wait until we see some real reviews, not just official Nvidia marketing.  What matters is how the Nvidia cards you can buy compare to the AMD cards you can buy, as they'll function in your computer in the real world in the games you'll play.

    Oh I know, I just meant that until they can "prove" they have a superior product worth investing in it's better to wait.

    BTW, I appreciate your posts here, I'll be upgrading in the near future and will search out your posts as I believe they are fair and well-researched. I'm hoping nVidia makes a big jump before my next purchase because I've always preferred their cards...when they're worth it.

    At any rate, thanks for all the info. you post.

    "Censorship is never over for those who have experienced it. It is a brand on the imagination that affects the individual who has suffered it, forever." - Noam Chomsky

  • Kabaal (Edinburgh, Scotland), Posts: 3,012, Member Uncommon

    13:00 GMT tomorrow the NDA will be lifted and stores will start listing the cards so there should be plenty of info over the weekend.

  • Khrymson (Eorzea, MO), Posts: 3,090, Member

    Originally posted by Quizzical

    I'm not saying not to buy it.  I'm saying, wait until we see some real reviews, not just official Nvidia marketing.  What matters is how the Nvidia cards you can buy compare to the AMD cards you can buy, as they'll function in your computer in the real world in the games you'll play.

     

    Ya see Quizz, that's one thing I can't give a single iota about.  I don't care how much better you keep preaching AMD supposedly is compared to nVidia.  AMD doesn't exist in my world.  I'll never consider one of their cards, no matter how great you or anyone else thinks they are compared to nVidia.

     

    I am through and through nVidia only.  From what I've seen from the leaked benchmarks, including several that have been taken down, the GTX 680 is considerably better than the GTX 580.  Sure, some of the benchmarks may not have perfectly exact FPS numbers just yet, but that doesn't bother me.  Also, and I'm sure you already know this, the GTX 580 has gotten a price cut recently and may drop even more before this fall, when the better Kepler GPUs are due.

    I'd much rather sell off my 580 now, while it still has a decent value, to help fund the 680, and then later this fall sell the 680, which will still retain its full value, in favor of the speculative 780 or whatever.

     

    Also, as far as benchmarks for gaming above 1920x1080 go: I don't care about that, as I only game on a single monitor and never go higher.  I'm more than content with that resolution, and as long as I can keep a great FPS while gaming on ultra/maximum settings while forcing additional high-end options from the nVidia control panel...even better!

  • Quizzical, Posts: 14,784, Member Uncommon

    If you want to go the fanboy route, it's your money, not mine.  You might want to note that the GTX 680 isn't the thing forcing price cuts on the GTX 580, though.

    Leaks don't always end up being accurate.  Remember the 512-shader, 750 MHz, 175 W GeForce GTX 480 that was supposed to launch in November 2009?  It would have been a great card if it existed, but the GTX 480 that actually existed in the real world was a complete fiasco.

    Or how about all those rumors that the GeForce GTX 680 was going to cost $300?  Newer rumors put it closer to $500.

  • MMOarQQ (Boogalululu), Posts: 636, Member

    Already paid out the ass for my 3gb 580. Unless this thing projects an alternate reality directly to my visual cortex, I think I'll just go my usual route and skip a generation.

  • RefMinor (MyTown), Posts: 3,452, Member
    Originally posted by Khrymson


    Originally posted by Quizzical

    I'm not saying not to buy it.  I'm saying, wait until we see some real reviews, not just official Nvidia marketing.  What matters is how the Nvidia cards you can buy compare to the AMD cards you can buy, as they'll function in your computer in the real world in the games you'll play.

     

    Ya see Quizz, that's one thing I can't give a single iota about.  I don't care how much better you keep preaching AMD supposedly is compared to nVidia.  AMD doesn't exist in my world.  I'll never consider one of their cards, no matter how great you or anyone else thinks they are compared to nVidia.

     

    I am through and through nVidia only.  From what I've seen from the leaked benchmarks, including several that have been taken down, the GTX 680 is considerably better than the GTX 580.  Sure, some of the benchmarks may not have perfectly exact FPS numbers just yet, but that doesn't bother me.  Also, and I'm sure you already know this, the GTX 580 has gotten a price cut recently and may drop even more before this fall, when the better Kepler GPUs are due.

    I'd much rather sell off my 580 now, while it still has a decent value, to help fund the 680, and then later this fall sell the 680, which will still retain its full value, in favor of the speculative 780 or whatever.

     

    Also, as far as benchmarks for gaming above 1920x1080 go: I don't care about that, as I only game on a single monitor and never go higher.  I'm more than content with that resolution, and as long as I can keep a great FPS while gaming on ultra/maximum settings while forcing additional high-end options from the nVidia control panel...even better!

     

    Wow, so even if someone produces an undeniably better card at a lower price point you will only buy nVidia, and you give people advice in the hardware forum?
  • Khrymson (Eorzea, MO), Posts: 3,090, Member

    Originally posted by Quizzical

    If you want to go the fanboy route

    That's hilarious coming from you, of all posters here.  You should re-read the posts you make, because you're just as hardcore a fanboy when it comes to AMD.  All you ever do is preach how much better you think AMD is and downplay nVidia in every post concerning GPUs.  That's your opinion, sure, but it's a bit irritating...

     

  • Khrymson (Eorzea, MO), Posts: 3,090, Member

    Originally posted by RefMinor

     

    Wow, so even if someone produces an undeniably better card at a lower price point you will only buy nVidia, and you give people advice in the hardware forum?

    I have yet to see an undeniably better card from AMD.  I don't give a fuck about AMD, but if you want to play that card: the 7970 is $549, and new info is suggesting the 680 may launch at $499, plus most of the benchmarks (game dependent as usual) are showing the 680 coming out well ahead of the 7970.  Hrmmm....

     

    Yet again though, I don't give a damn if you AMD fanboys think your cards are better than nVidia's.  I don't make my GPU purchases based on that.  The 680 is better than the 580...OK, I will buy it, end of story!

     

    And yes, I will give advice in the hardware forums, because it all comes down to preference concerning GPUs.  Not everyone thinks AMD is better at everything the way you, and especially Quizz, do...sheesh!

  • Quizzical, Posts: 14,784, Member Uncommon

    Originally posted by Khrymson

    Originally posted by Quizzical

    If you want to go the fanboy route

    That's hilarious coming from you, of all posters here.  You should re-read the posts you make, because you're just as hardcore a fanboy when it comes to AMD.  All you ever do is preach how much better you think AMD is and downplay nVidia in every post concerning GPUs.  That's your opinion, sure, but it's a bit irritating...

     

    I'm not the one saying I'll buy from one particular company without regard to price, performance, power consumption, features, or anything else that most gamers ought to care about.  When people come in asking for advice on what parts to get, I've linked Nvidia cards on many occasions.

    And if I'm such an AMD fanboy, then I'm surely a bad one, considering that I've never owned an AMD desktop processor.

  • BizkitNL (Netherlands), Posts: 2,280, Member Common

    Originally posted by Khrymson

    Originally posted by RefMinor

     

    Wow, so even if someone produces an undeniably better card at a lower price point you will only buy nVidia, and you give people advice in the hardware forum?

    I have yet to see an undeniably better card from AMD.  I don't give a fuck about AMD, but if you want to play that card: the 7970 is $549, and new info is suggesting the 680 may launch at $499, plus most of the benchmarks (game dependent as usual) are showing the 680 coming out well ahead of the 7970.  Hrmmm....

     

    Yet again though, I don't give a damn if you AMD fanboys think your cards are better than nVidia's.  I don't make my GPU purchases based on that.  The 680 is better than the 580...OK, I will buy it, end of story!

     

    And yes, I will give advice in the hardware forums, because it all comes down to preference concerning GPUs.  Not everyone thinks AMD is better at everything the way you, and especially Quizz, do...sheesh!

    Pot calling the kettle black, if you ask me.

    None of you should advise people, since when it comes to hardware, there's a lot more than the top-of-the-line cards. I could point out certain budget areas where AMD beats NVIDIA, and vice versa. But I'm not gonna, since it would be a waste.

    There's nothing wrong with preferring a certain developer, by the way.

  • Khrymson (Eorzea, MO), Posts: 3,090, Member

    Originally posted by Quizzical

    Originally posted by Khrymson

    I'm not the one saying I'll buy from one particular company without regard to price, performance, power consumption, features, or anything else that most gamers ought to care about.  When people come in asking for advice on what parts to get, I've linked Nvidia cards on many occasions.

    And if I'm such an AMD fanboy, then I'm surely a bad one, considering that I've never owned an AMD desktop processor.

    Once again...that's your opinion...do you not understand!?  I like nVidia for my own reasons, regardless...

     

    OK, so since when did this GPU conversation turn toward CPUs?  Totally different discussion...but this just makes my point more clear.  Also notice how you've come into yet another thread I started about nVidia in the past few days and totally derailed it with your "AMD is better" preaching.

  • Antarious (Greenville, SC), Posts: 2,802, Member

    I guess my point of view is that if you are interested in a GTX 680 and wait... you had better be of a mind to wait for some specialty cooler and/or overclocked version.

     

    Regardless of reviews I would imagine the stock on hand will be out of stock pretty quick just like with the 7970 launch.

     

    Personally, if I can get my hands on one... then I will be buying one.  It uses less power than current nVidia cards (which is a big deal to me), and two major games on my personal *buy* list... are going to use engines developed in collaboration with Nvidia and built around PhysX (which, as far as I know, neither of my 7970's will do).

     

     

    Moderators on this site allow certain posters to create endless troll threads, yet "warn" people for giving recommendations... account *pending* deletion because... why bother.

  • Khrymson (Eorzea, MO), Posts: 3,090, Member

    I'll be camping Newegg and EVGA's store as soon as I return from work tomorrow morning.  I heard they'll be extremely limited due to the chips having unexpectedly low yields, but regardless of a 22nd or 23rd release date, I have both days free entirely.  

    Gonna do my best to nab one!

     

    Personally though, I prefer the reference design with the vapor chamber cooling, and I'll be ok with just a standard over the superclocked.  Nothing challenges my regular non-OC'd 580, thus I'll be content, but if there is a superclocked ver released, I may consider getting one. 

  • Quizzical, Posts: 14,784, Member Uncommon

    Well, here it is:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814162095

    Weird specs on it.  1536 shaders at 2012 MHz?  That would get you about quadruple the theoretical shader performance of a GTX 580, so it's probably wrong.  Or at the very least, a Kepler shader at a given clock speed would be vastly weaker than a Fermi shader.  I guess that's true of AMD shaders for the last six generations.  But still, nearly double the theoretical shader power of Tahiti in a smaller die?
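    Here's the arithmetic behind that "quadruple" estimate, assuming the usual 2 FLOPs per shader per clock; the GTX 580's 1544 MHz shader clock is my number, not from the listing, so treat this as a sanity check rather than a spec.

```python
# Theoretical single-precision throughput: shaders x clock x 2 FLOPs/clock.
def gflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1000.0

gtx580 = gflops(512, 1544)       # GTX 580 at its shader ("hot") clock
listed_680 = gflops(1536, 2012)  # the implausible listing's numbers

ratio = listed_680 / gtx580      # roughly 3.9x, i.e. "about quadruple"
```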

    And an effective memory clock of 6008 MHz?  That would mean they can't use the 1.5 GHz bin of GDDR5 memory and have to go with a higher bin or run it out of spec.  That would be a lot of added cost for no apparent reason, kind of like buying 1866 MHz DDR3 memory and then running it at 1602 MHz.  If that's all the performance you wanted, then save some money, get 1600 MHz memory, and run it at 1600 MHz.
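    For reference, the quad-pumping that links those two numbers (GDDR5 transfers four bits per pin per memory-clock cycle; that 4x factor is standard GDDR5 behavior, not something from the listing):

```python
# GDDR5's "effective" clock is quadruple the actual memory clock.
def gddr5_base_clock_mhz(effective_mhz):
    return effective_mhz / 4.0

base = gddr5_base_clock_mhz(6008)   # 1502 MHz
common_bin = 1500.0                 # the widely used 1.5 GHz GDDR5 bin

# 1502 MHz sits just above the 1.5 GHz bin, which is why hitting it means
# either paying for a pricier bin or running 1.5 GHz parts out of spec.
overshoot = base - common_bin       # 2 MHz over the common bin
```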

    Newegg has gotten specs wrong on pre-release cards before, I suppose.  So it wouldn't be a surprise if they did it again.

    It also looks like it has only a single 6-pin PCI-E power connector, which would cap the card at 150 W.  That means either awesome performance per watt, not nearly the performance people hoped for, no overclocking headroom, running things way out of spec, or some combination of more than one of those.  Or maybe they have more PCI-E power connectors, but didn't put them next to each other for some bizarre reason.
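    The 150 W figure comes straight from the PCI Express power budget: the slot itself supplies up to 75 W, and each auxiliary connector adds a fixed amount (75 W per 6-pin, 150 W per 8-pin). A quick tally:

```python
# Maximum board power allowed by the PCI-E spec for a given connector loadout.
SLOT_W = 75        # delivered through the x16 slot itself
SIX_PIN_W = 75     # per 6-pin auxiliary connector
EIGHT_PIN_W = 150  # per 8-pin auxiliary connector

def power_cap_watts(six_pins=0, eight_pins=0):
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

single_six = power_cap_watts(six_pins=1)                    # the 150 W cap
six_plus_eight = power_cap_watts(six_pins=1, eight_pins=1)  # 300 W (GTX 580 style)
```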

    Most of the features listed are pretty bog standard for Nvidia cards by now.  PhysX, CUDA, SLI, Surround, 3D Vision... yawn.  PCI Express 3.0 is new for Nvidia, but was expected.

    Still a few features listed are more interesting.  There's support for four displays, which isn't a surprise.  AMD cards can do six, but hardly anyone cares about more than three, so Nvidia basically caught up in number of monitors.  There's still the question of software support, but that's not really knowable yet.

    There's also "NVIDIA Adaptive Vertical Sync".  Not sure what that is, but if they managed to improve on the usual way of doing vertical sync, then that could be a cool feature.  A lot of games implement vertical sync by saying, we've finished rendering a frame so we're going to stop entirely until we display it.  Champions Online has a niftier method of saying, don't stop rendering unless you've completed a frame that you started rendering after the last frame was displayed.  That gets you the anti-tearing features of vertical sync without reducing your frame rate unless you're over the monitor refresh rate.  Is that what Nvidia applied to all games?  I have no clue, but it would be cool if they did.
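    The difference between the two approaches can be sketched with a toy simulation: a 60 Hz display and a constant 20 ms render time (both numbers are purely illustrative, and this is only my reading of the description above, not the actual implementation in Champions Online or in Nvidia's driver).

```python
# Two tear-free display strategies at 60 Hz with a GPU that needs 20 ms/frame.
REFRESH = 1000.0 / 60   # ms between display refreshes
RENDER = 20.0           # ms to render one frame

def classic_vsync(duration_ms):
    """Render a frame, then stall until the next refresh before displaying."""
    t, shown = 0.0, 0
    while t + RENDER <= duration_ms:
        t += RENDER                        # finish rendering the frame
        t = (t // REFRESH + 1) * REFRESH   # stall until the next refresh
        shown += 1
    return shown

def keep_rendering(duration_ms):
    """Never stall; at each refresh, display the newest finished frame that
    was started after the previously displayed one."""
    frames, t = [], 0.0
    while t + RENDER <= duration_ms:       # render frames back-to-back
        frames.append((t, t + RENDER))     # (start, finish) of each frame
        t += RENDER
    shown, last_start = 0, -1.0
    for r in range(1, int(duration_ms / REFRESH) + 1):
        vblank = r * REFRESH
        ready = [f for f in frames if f[1] <= vblank and f[0] > last_start]
        if ready:                          # a newly started frame is ready
            last_start = ready[-1][0]
            shown += 1
    return shown

# Over one second, classic vsync falls to 30 distinct frames (every other
# refresh), while the keep-rendering variant displays nearly all of the
# frames the GPU can produce, still without tearing.
```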

    And then it lists DirectX 11.  Not 11.1, like the 7970 has.  Of course, Newegg also lists the 7970 as only having DirectX 11, which is wrong.  So they might be wrong about the GTX 680, too.

    "two major games in my personal *buy* list.. are going to use engines developed in collaboration with Nvidia and built around PhysX"

    Is that CPU or GPU PhysX?  The overwhelming majority of games that use PhysX run it on the CPU, in which case, it will perform the same no matter what video card you have.  If you're buying one of the few games that do GPU PhysX and really want to max the PhysX, then have at it.  That's why Nvidia pays a developer to offer GPU PhysX every now and then.

    "I became bored after the word "You've" because once again you still just don't get it."

    You're replying to what you expected me to write.  It has absolutely nothing to do with what I actually wrote.  You should try reading a post before replying to it sometime.

  • DLangley (Beaumont, TX), Posts: 1,407, Member

    Stay on topic please.

  • Quizzical, Posts: 14,784, Member Uncommon

    Originally posted by DLangley

    Stay on topic please.

    Did you actually read my post before deleting it?  The topic is the GeForce GTX 680, right?  That's sure what I thought from the title of this thread.

    I talked at length about what reviews of the GeForce GTX 680 will tell us.  How is that off topic?

  • Quizzical, Posts: 14,784, Member Uncommon

    Originally posted by DLangley

    Stay on topic please.

    So let me get this straight.  The thread is about a GeForce GTX 680.  The original poster says he wants to buy one.  So when I point out that they're available for sale on Newegg right now, that merits post deletion and a warning for thread hijacking?  Or are you objecting to me quoting some of the specs from Newegg?

  • Cabaloc (Fort Pierce, FL), Posts: 116, Member

    I read your last post that was deleted, and it boggles my mind as well; as usual, a very informative post on GPUs.  Half this forum has learned much from Quizz, myself included.  Ridelynn is also good on the info.

  • RefMinor (MyTown), Posts: 3,452, Member
    Originally posted by Khrymson


    Originally posted by RefMinor
     
    Wow, so even if someone produces an undeniably better card at a lower price point you will only buy nVidia, and you give people advice in the hardware forum?

    I have yet to see an undeniably better card from AMD.  I don't give a fuck about AMD, but if you want to play that card: the 7970 is $549, and new info is suggesting the 680 may launch at $499, plus most of the benchmarks (game dependent as usual) are showing the 680 coming out well ahead of the 7970.  Hrmmm....

     

    Yet again though, I don't give a damn if you AMD fanboys think your cards are better than nVidia's.  I don't make my GPU purchases based on that.  The 680 is better than the 580...OK, I will buy it, end of story!

     

    And yes, I will give advice in the hardware forums, because it all comes down to preference concerning GPUs.  Not everyone thinks AMD is better at everything the way you, and especially Quizz, do...sheesh!

     

    "I'll never consider one their cards no matter how great you or anyone else thinks they are over nVidia"

     

    "I am through and through nVidia only."

     

    I was going from your own quotes.  I rarely come into the hardware forum; I don't care who makes a GPU and never buy top-end ones.  In fact, I haven't custom-built a PC in about 6 years.  But what I do do is base my decisions on fact rather than blind loyalty to a public company whose shares I don't own, and if I did base them on loyalty, I would hope I wouldn't give such blinkered advice to other people.
  • kadepsyson (Sun Prairie, WI), Posts: 1,937, Member

    I'd ask you to retype it Quizzical, so that I could be better informed about the GTX 680, but I don't want you to risk getting banned for being informative and constructive here.

     

    From what little I read though, it seems the card is barely faster than an HD 7970, while having 1 GB less memory.  The lower memory is a dealbreaker for me personally.

    El Psy Congroo

  • Zekiah (Aurora, CO), Posts: 2,499, Member

    I'm not sure what was deleted, but I'd appreciate it if any and all posts about products were left alone; I really depend on them, and they're one of the main reasons I come here.  I'm sure I'm not alone either.  As consumers, we're trying to find the best product for what we're looking for, and it would be a shame if information were being deleted because it didn't show something in the most positive light.

    Thanks.

    "Censorship is never over for those who have experienced it. It is a brand on the imagination that affects the individual who has suffered it, forever." - Noam Chomsky

  • Quizzical, Posts: 14,784, Member Uncommon

    You know what?  The post deletions were so absurd that I'm just going to post the gist of them again.  If the forum admins decide that posting useful, relevant, on-topic posts is a bannable offense, then I don't need to post here anyway.  So let's see if they delete the repost.

    One deleted post:

    There were two GeForce GTX 680s available for sale on Newegg.  I gave a link to one of them.  They seem to be gone now, so I can't give the same link again.  Some of the specs were wonky, most notably 1502 MHz GDDR5 memory.  There's a common commercial bin of 1.5 GHz (which is what the Radeon HD 7970 uses), so if they buy the next bin up, that's a ton of extra cost for basically no benefit.

    The card also only has one 6-pin PCI-E power connector visible.  That would mean a TDP cap of 150 W or less, which is much lower than I'd expect for Nvidia's flagship card.  (The previous generation GeForce GTX 580 was officially 244 W, and an honest TDP was closer to 270 W.  And TDPs tend to be going up, not down, as time passes.)  There are a lot of possible explanations for that, including a picture of the wrong card, or having multiple power connectors that oddly aren't next to each other.

    Newegg also said only DirectX 11, not 11.1.  But they're probably simply wrong about that, since they also say that the 7970 is only DX11.

    Other deleted post:

    I actually have this one, because I was given a warning for thread hijacking, and they mailed me the post saying, here's your post that was deleted for thread hijacking.  So I could post the whole thing again.  But it was long, so I won't.  Here's the last several paragraphs:

    -----

    If the rumors are right, GK104 has a die size between Pitcairn and Tahiti, and closer to Tahiti than Pitcairn. If performance lands about there, too, then we basically have a tie in performance per mm^2, and can expect Nvidia and AMD to be roughly competitive on price all up and down their lineups once the rest of the Kepler cards are out. That would still be kind of a win for AMD this generation on the basis that they got to sell lots of cards between when Southern Islands launched and when Kepler did. But that's irrelevant to people who want to buy something this summer after all of the cards are out.



    If one side wins on performance per mm^2 by 10% or so, that might not filter into retail pricing. AMD has beaten Nvidia by something like 50% the last few generations, which does show up in retail pricing.



    And then there is performance per watt. Tahiti has some HPC and professional graphics stuff that GK104 presumably lacks (and Pitcairn certainly lacks), so the cleaner comparison here is Pitcairn versus GK104. Performance per watt isn't that big of a deal in desktops, but it sure does matter in laptops, where gaming laptops are fundamentally about putting too much heat into too little space.



    For this, you kind of have to check a bunch of different review sites, as different sites have their own methodology. I'm actually expecting an approximate tie in performance per watt, as I regard the whole Fermi fiasco as likely to be something of an outlier. If that happens, then the people who say you should either go Intel+Nvidia or else all AMD might finally have a point, for complicated reasons that I'll explain if it happens. If either AMD or Nvidia wins big on performance per watt, then that's what you want in a gaming laptop at the high end, though low end (integrated graphics) will still be all AMD, as Nvidia doesn't have the x86 license needed to compete in that market.



    And then, of course, there is the question of whether Nvidia can have enough cards ready by the time Ivy Bridge launches. Cape Verde and Pitcairn are certainly ready. If a laptop vendor wants to refresh its line for Ivy Bridge and Kepler isn't available yet, then they'd be foolish to go with GF116 over Cape Verde.

    -----

    Before some moderator decides this is off topic and deletes it again: GK104 is the die that the GeForce GTX 680 is built on.  Tahiti and Pitcairn are the GPU dies of the most analogous cards from AMD.  Kepler is the architecture of the GeForce GTX 680 and some other Nvidia cards.  Fermi is Nvidia's architecture from the previous generation.  There are a bunch of other code names for related parts in there, but it's all relevant.  So see, even if you don't know what all of that means, it's very much on topic.

    For what it's worth, the earlier paragraphs gave some historical context on the card, and how various cards in previous generations have compared.

This discussion has been closed.