
Nvidia is about to pwn j00

eyeswideopen Member Posts: 2,414

-Letting Derek Smart work on your game is like letting Osama bin Laden work in the White House. Something will burn.-
-And on the 8th day, man created God.-


Comments

  • choujiofkono Member Posts: 852

    Originally posted by eyeswideopen

    http://news.bigdownload.com/2010/11/07/call-of-duty-black-ops-used-to-tease-nvidias-next-pc-graphics-c/

     

    How does 2 billion polygons and multi-layer tessellation sound?

        Depends on whether the card will cost half as much as a regular computer.  I would use it more for GPU renders than for the occasional game that makes use of that particular tech, though.  With the GPU on my GTX260 I can render out an HD picture with a global illumination rendering engine in minutes instead of hours on the CPU.  I can imagine what it would be like on one of these new models.  Probably much, much faster.  That would promote animations with realistic lighting of a quality only seen in the most advanced movies up to this point.  I am talking full caustics and multiple refractions/reflections with unlimited AA.  That would be pretty awesome.  No need for a render farm at those speeds. 

         I won't go ballistic over GPUs till they are designed to accelerate raytraced engine games; that would be a whole new world of reality in gaming.  Till then it's mainly just a question of which sales gimmick you want to invest your money in. 
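
    Just to put the ray tracing wish in perspective, here's a back-of-the-envelope sketch of what a naive raytraced engine would have to chew through every frame (every number below is an assumption for illustration, not a benchmark):

        # Back-of-the-envelope cost of naive real-time ray tracing.
        # Every number here is an illustrative assumption, not a benchmark.
        width, height = 1920, 1080   # assumed display resolution
        rays_per_pixel = 16          # assumed samples for AA / soft shadows
        bounces = 3                  # assumed reflection/refraction depth
        tests_per_ray = 50           # assumed triangle tests even with a BVH

        rays = width * height * rays_per_pixel * bounces
        tests = rays * tests_per_ray
        print(f"rays per frame: {rays:,}")             # ~99.5 million
        print(f"triangle tests per frame: {tests:,}")  # ~5.0 billion
        print(f"tests per second at 60 fps: {tests * 60:.2e}")  # ~3e+11

    Numbers like that are why rasterization still owns real-time rendering, at least for now.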

    "I'm not cheap I'm incredibly subconsciously financially optimized"
    "The worst part of censorship is ------------------"

  • eyeswideopen Member Posts: 2,414

    Originally posted by choujiofkono

    Originally posted by eyeswideopen

    http://news.bigdownload.com/2010/11/07/call-of-duty-black-ops-used-to-tease-nvidias-next-pc-graphics-c/

     

    How does 2 billion polygons and multi-layer tessellation sound?

        Depends on whether the card will cost half as much as a regular computer.  I would use it more for GPU renders than for the occasional game that makes use of that particular tech, though.  With the GPU on my GTX260 I can render out an HD picture with a global illumination rendering engine in minutes instead of hours on the CPU.  I can imagine what it would be like on one of these new models.  Probably much, much faster.  That would promote animations with realistic lighting of a quality only seen in the most advanced movies up to this point.  I am talking full caustics and multiple refractions/reflections with unlimited AA.  That would be pretty awesome.  No need for a render farm at those speeds. 

         I won't go ballistic over GPUs till they are designed to accelerate raytraced engine games; that would be a whole new world of reality in gaming.  Till then it's mainly just a question of which sales gimmick you want to invest your money in. 

    Meh, I just wanna play an MMO and watch faces literally melt off when I throw a fireball at 'em.

    -Letting Derek Smart work on your game is like letting Osama bin Laden work in the White House. Something will burn.-
    -And on the 8th day, man created God.-

  • noquarter Member Posts: 1,170

    At the rate MMOs adopt graphics technology, we might see heavy tessellation in 10 or 12 years, considering even games that push tech aren't using heavy tessellation any time soon. :( But yeah, the technique is cool, and it's why I have been excited about DX11.

  • Cleffy Member Rare Posts: 6,412

    Seems more like pre-marketing to reduce sales of AMD's upcoming high end video cards.  2 billion isn't a special number; AMD achieved 1.7 billion last year.  They also achieved real-time ray tracing in 2006.  The problem is not with the technology, it's with adoption.  And of course the rest is just compute shader tricks that both companies can do as long as you adopt HLSL.

  • GrayGhost79 Member Uncommon Posts: 4,775

    Originally posted by Cleffy

    Seems more like pre-marketing to reduce sales of AMD's upcoming high end video cards.  2 billion isn't a special number; AMD achieved 1.7 billion last year.  They also achieved real-time ray tracing in 2006.  The problem is not with the technology, it's with adoption.  And of course the rest is just compute shader tricks that both companies can do as long as you adopt HLSL.

    Yeah, that's basically what it boils down to. In any case, it's not like we will see much of this coming to MMOs any time soon, lol. Even if you have a card that can keep up, I'm not sure how much this kind of thing would stress the server.

  • Quizzical Member Legendary Posts: 25,355

    Originally posted by eyeswideopen

    How does 2 billion polygons and multi-layer tessellation sound?

    Sounds like a synthetic benchmark not relevant to actual gaming.  You've got around 2 million pixels on a screen.  So you want to compute 2 billion polygons, and then be guaranteed to throw away at least 99.9% of them without even using them for a single pixel?  What is this, a goal to see who can make the most badly coded game?
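
    The arithmetic is simple enough to sketch, assuming a 1920x1080 display (which is where the "around 2 million pixels" comes from):

        # Polygons per pixel implied by the marketing figure (back-of-envelope).
        pixels = 1920 * 1080          # about 2.07 million pixels on screen
        polygons = 2_000_000_000      # the "2 billion polygons" claim
        print(f"{polygons / pixels:,.0f} polygons submitted per pixel")  # ~965
        # Even if every pixel showed a distinct polygon, at most this
        # fraction of the 2 billion could ever be visible:
        print(f"{pixels / polygons:.2%}")  # ~0.10%, i.e. 99.9% thrown away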

    I'm not against synthetic benchmarks, as they can give interesting information about how products perform in various edge cases.  But "performs well in synthetic benchmarks" is no substitute for "performs well in real games".

    And the card is what, a year and some odd months late by now?  Nvidia still hasn't launched the 512 shader Fermi card that they promised in September 2009, but rumors say this might finally be it.  Just in time to get obliterated by the high end Cayman cards in AMD's new Northern Islands architecture.

    We know that it's not a new architecture, as Kepler isn't coming until 2011--and probably late 2011 or possibly delayed into 2012, at that.  We know that it's not a new process node, as there aren't any new process nodes available.  Both TSMC and Global Foundries canceled their 32 nm nodes, and their 28 nm nodes aren't ready and won't be until maybe next summer or so.  It's unlikely that Nvidia could go much larger on die size, as GF100 was already so huge that they couldn't make it properly, and going larger yet makes yield problems worse, not better.

  • Izkimar Member Uncommon Posts: 568

    Exactly. The way things are right now, I don't see the 580 beating Cayman, and especially not Antilles. 

  • Axeion Member Uncommon Posts: 418

    Originally posted by Cleffy

    Seems more like pre-marketing to reduce sales of AMD's upcoming high end video cards.  2 billion isn't a special number; AMD achieved 1.7 billion last year.  They also achieved real-time ray tracing in 2006.  The problem is not with the technology, it's with adoption.  And of course the rest is just compute shader tricks that both companies can do as long as you adopt HLSL.

     

     Yeah, I kinda agree. But I still ordered a ZOTAC AMP! ZT-40102-10P GeForce GTX 480 (Fermi) 1536MB 384-bit GDDR5 PCI Express 2.0 x16 HDCP Ready SLI Support video card. Hoping it is the last card I need for my current comp. Next build I hope to have the 580; by that time its price should drop a bit.

    Graphics cards now are just getting out there on price. I think the best bang for the buck is always just under the bleeding edge. But it all comes down to what you're doing with the comp to begin with. For WoW players it doesn't take much; a friend of mine is getting by with integrated graphics in WoW, whereas I can't help but lag in AoC with two BFG GTX 260 cards in SLI. It really just depends on what you're gonna play, what screen size you have, and how pretty you want things to look.

    "Civilized men are more discourteous than savages because they know they can be impolite without having their skulls split, as a general thing." — Robert E. Howard, The Tower of the Elephant (1933)

  • Shinami Member Uncommon Posts: 825

    I guess these people have never heard of timedemos... It would have been better to actually post a timedemo result (Q3, Q4, CoD4) than to show off irrelevant crap. Oh well. Ignorance truly is bliss... if it can cause a thread like this to be started. Imagine all the kids ranting and chanting about some large number that represents next to nothing in real-world performance. 

  • Quizzical Member Legendary Posts: 25,355

    A TechPowerUp review that was posted earlier today and taken back down had the GTX 580 beating the GTX 480 by about 15%.  That's right in line with what one would expect from the specs, given that it's a respin of the GTX 480's GF100 GPU.  Not exactly earth-shattering performance.

    But that doesn't mean it won't be a good card.  It reportedly uses less power than the GTX 480, so Nvidia is finally getting the sort of performance per watt improvements that they should theoretically have gotten from the die shrink.  The reference GTX 580 has a much better cooler, too, so you can buy one without worrying that it will fry, unless you've got improper case airflow.

    It's extremely unlikely to be in the same league as Cayman from a corporate financial perspective, as similar performance out of a much larger die size and probably with much worse yields is a big loss.  If you're selling a part that costs you $100 to make (for just the GPU chip, not counting other components or the markup from the various companies involved), and your competitor can make something equivalent for $50, you're in a bad spot.
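
    As a toy model of why die size dominates cost, consider dies per wafer and yield (every figure below is invented for illustration, not actual foundry data):

        # Naive dies-per-wafer and yield model. All numbers are invented
        # for illustration; this is not actual foundry data.
        import math

        WAFER_DIAMETER_MM = 300
        WAFER_COST = 5000.0  # assumed cost of one processed wafer

        def cost_per_good_die(die_area_mm2, defects_per_cm2=0.4):
            wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
            dies = wafer_area * 0.9 / die_area_mm2  # ~10% lost at wafer edges
            # A bigger die is likelier to catch a defect (Poisson yield),
            # so cost per good chip rises faster than area.
            yield_frac = math.exp(-defects_per_cm2 * die_area_mm2 / 100)
            return WAFER_COST / (dies * yield_frac)

        print(f"~520 mm^2 die: ${cost_per_good_die(520):.0f} per good chip")
        print(f"~255 mm^2 die: ${cost_per_good_die(255):.0f} per good chip")

    With numbers like these, a die twice the size can easily cost several times as much per good chip, which is the $100-versus-$50 problem above.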

    But that's Nvidia's problem.  The GTX 580 will probably, for the first time in over a year, give Nvidia fans a genuinely high end video card to buy that doesn't run dangerously hot and obnoxiously loud.

    Unless it's a paper launch.

  • AmazingAvery Age of Conan Advocate Member Uncommon Posts: 7,188

    That TechPowerUp review had the GTX 580 performing pretty much the same as a 5970, which is pretty good. Average load was 244W, and it was 9 decibels quieter than a 5970, too. That surprised quite a few people reading around different sites.



  • Quizzical Member Legendary Posts: 25,355

    Some other sites said that the 5970 was using old drivers that broke something in CrossFire that has since been fixed.  I'm not sure if that's true.

    Regardless, the Radeon HD 5970 wasn't that great of a card.  The reference cards didn't cool the VRMs properly, and the non-reference cards cost too much because they were very limited edition cards.  The 5970 was really just a marketing gimmick so that AMD could claim that they had the fastest card on the market.  For someone who wanted that level of performance, getting two 5870s in CrossFire made a lot more sense.

    The real competition for the GTX 580 is the Radeon HD 6970 that probably launches in about two weeks.

  • choujiofkono Member Posts: 852

    Originally posted by Cleffy

    Seems more like pre-marketing to reduce sales of AMD's upcoming high end video cards.  2 billion isn't a special number; AMD achieved 1.7 billion last year.  They also achieved real-time ray tracing in 2006.  The problem is not with the technology, it's with adoption.  And of course the rest is just compute shader tricks that both companies can do as long as you adopt HLSL.

    Octane is the fastest, most advanced GPU render engine in the world atm, and it runs only on Nvidia CUDA.  I have used it for a few projects and it is really pretty amazing. 

    http://www.refractivesoftware.com/

    Not trying to sell it or anything, just saying the things you can do with a powerful GPU are amazing. 

    Here's a render I did with the early beta 0.8a a while ago.  I let it burn about 4-5 minutes on my GTX260. LoL 

    "I'm not cheap I'm incredibly subconsciously financially optimized"
    "The worst part of censorship is ------------------"

  • ElendilasX Member Posts: 243

    It may look nice, but what real use is it for gamers? There are no games that take advantage of new hardware, especially MMOs. Almost all of them can be run at best graphics on a 5-7 year old comp...

    So I will check this stuff out in 10 years, when maybe it will be useful/worth it...

  • Phelcher Member Common Posts: 1,053

    It's all irrelevant... 

    Think on this: we now have the ATi HD6850 @ $179 (2nd generation DX11 card) and the ATi HD6870 @ $249. With the nVidia GTX580 being really robust/powerful (i.e., GTX480 rev. 2), it's mainly for bravado, because how many people actually spend $550 on a video card? Oh... don't get me wrong, I had those years where I had tri-SLI in my water-cooled rigs... but understand, that rig wasn't 50% faster than one costing a third less.

    I game a lot and have gone through a lot of video cards. As of late I really prefer the ATi (soon to be AMD) video cards. They just offer more value. Take it for what it's worth. Don't let all the marketing grab you; just make sure you buy a video card that is going to sufficiently push your resolution. Good luck!

    "No they are not charity. That is where the whales come in. (I play for free. Whales pays.) Devs get a business. That is how it works."


    -Nariusseldon

  • Shinami Member Uncommon Posts: 825

    @Quizzical

     

    Crossfire and SLI were both broken for Civilization V across multiple monitors. There was a rush for drivers because this problem also eventually spread into real-time strategy games at higher resolutions. There were also critical glitches that prompted warnings telling players to run certain games at certain resolutions and to restrict SLI/Crossfire to one card.

    @everyone else

    When I compare cards, I know I've never had an Nvidia card blow out on me. I've also seen better warranties under Nvidia under certain brand names. My big problem is that people do not realize that while Nvidia and ATI go hand in hand when it comes to computer games, a lot changes outside of computer games.

    It's true that ATI has something with MLAA, and that is the wave of the future. It's anti-aliasing done right! ^_^ However, Nvidia itself sells its cards clocked for performance. They aren't really about saving loads of power as much as they are about delivering a product that has power and performance written all over it. 

    If I am spending $400-$500 for a top card, I want performance in as many areas as possible. Gaming performance is just ONE area of performance. It is the ONE area where ATI can do well against Nvidia. Due to this, ATI has capitalized on selling cheaper cards specialized only in gaming performance at the expense of everything else. 

    A GTX285 in my tests against the 6XXX series still beats the ATI line in most areas except gaming. What that means is that the only time the cards ATI offers are on par with Nvidia (or better, in some cases) is inside a full-screen game. Almost everywhere else (even on the desktop), the performance falls below that of a GTX285. 

    I already ran a performance-per-watt test on a GTX480 plus 5XXX and 6XXX series cards. Go ahead and do it yourselves. You will find that across all 26 areas, ATI wins in about 3 of them, and one of those is the gaming framerate at minimum settings and resolution. Thanks to MLAA they win a second gaming test. The third area they win is average power consumption under load. In the other 23 tests ATI not only loses HORRIBLY, it loses to a GTX285 as well, except in two areas. So in 5 out of 26 tests a 6XXX series card beats a GTX 285, but in only 3 out of 26 does a 6XXX series card beat a GTX 480.
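
    If you want to run the comparison yourself, the bookkeeping is trivial: performance per watt is just measured throughput divided by measured board power. A minimal sketch, with placeholder cards and numbers rather than my actual results:

        # Toy performance-per-watt comparison. All figures are placeholders,
        # not results from my tests.
        results = {
            # card: (average fps in some benchmark, average board power in W)
            "card_a": (95.0, 250.0),
            "card_b": (88.0, 170.0),
        }
        for card, (fps, watts) in results.items():
            print(f"{card}: {fps / watts:.3f} fps per watt")
        # The ranking can flip depending on which benchmark supplies the fps
        # figure, which is why the choice of tests matters so much.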

  • Quizzical Member Legendary Posts: 25,355

    "Crossfire and SLI were both broken for Civilization V across multiple monitors."

    In other words, badly coded games are badly coded.  When a GTX 460, a Radeon HD 5850, and a Radeon HD 6870 all have the same minimum frame rate of 0, it's most likely a problem with the game, not with drivers.

    http://www.hardocp.com/article/2010/10/21/amd_radeon_hd_6870_6850_video_card_review/3

    "I've also seen better warranties too under Nvidia under certain brand names."

    The three that Nvidia fanboys used to cite are EVGA, BFG, and XFX.  Now BFG is out of business, and XFX is AMD-only.  So much for that argument.

    "its true that ATI has something with MLAA and that is the wave of the future. Its anti-alliasing done right!"

    MLAA won't be that nice until they can apply it to the 3D rendered areas while not applying it to the 2D areas.  Blurry text is worse image quality, not better.  I have no idea whether that will be in the first non-beta drivers, or if we'll have to wait years for them to figure that out.
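
    The fix is an ordering problem: apply the filter to the 3D scene before the 2D UI is composited, instead of to the final framebuffer. A toy sketch of the two orderings (the stub functions are illustrative, not a real driver API):

        # Why driver-forced MLAA blurs text: it filters the final framebuffer,
        # UI included. The stubs below stand in for a real renderer; only the
        # ordering matters.
        def render_3d_scene():
            return "scene"

        def composite_ui(image):
            return image + "+ui"

        def mlaa_filter(image):
            return f"mlaa({image})"

        # Driver-forced: the UI is already baked in when the filter runs.
        print(mlaa_filter(composite_ui(render_3d_scene())))  # mlaa(scene+ui)

        # Engine-side: filter the scene first, then draw the UI on top.
        print(composite_ui(mlaa_filter(render_3d_scene())))  # mlaa(scene)+ui

    The driver can't easily do the second ordering on its own, because by the time it sees the frame it has no idea which pixels are text.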

    "However, Nvidia itself sells the cards clocked in a way for performance."

    And AMD doesn't?

    "They aren't really about saving loads of power as much as they are delivering a product that has power and performance written all over it. "

    Except that AMD wins in a landslide in performance per mm^2, too.  The natural competitor by die size (and hence cost of production) of a Radeon HD 6870 is a GeForce GTS 450.  The competitor for the Radeon HD 6970 will be a GeForce GTX 460 1 GB.  In some cases, Nvidia's cards targeted at one market segment manage to lose on performance to AMD's cards targeted at a lower market segment.
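
    That claim is easy to sanity-check with a sketch (the performance index below is made up; the die areas are rough figures):

        # Toy performance-per-die-area comparison. The performance index is
        # invented; the die areas are rough figures, for illustration only.
        cards = {
            # card: (relative performance index, approximate die area in mm^2)
            "~520 mm^2 part": (100.0, 520.0),
            "~255 mm^2 part": (80.0, 255.0),
        }
        for card, (perf, area) in cards.items():
            print(f"{card}: {perf / area:.3f} perf per mm^2")
        # Similar performance out of roughly half the silicon means roughly
        # half the production cost per chip.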

    "Due to this, ATI has capitalized on selling cheaper cards specialized only in gaming performance at a sacrifice of everything else."

    In most things other than gaming, there's no real difference in performance.  AMD cards can display the Windows desktop flawlessly, and so can Nvidia cards--and so can Intel integrated graphics, for that matter.  About the only other commonly used thing where video card performance matters is video playback, where AMD cards generally have better image quality than Nvidia cards, too.

    So basically, you're arguing that AMD has sacrificed performance in niche activities that virtually no one uses, in favor of clobbering Nvidia in the things that people actually do use.  And that's a bad thing?

    "A GTX285 in my tests against 6XXX still beats the ATI line in most areas except gaming."

    Areas such as what?

    "You will find in all 26 areas....ATI will win in about 3 of them, and one of them is the Gaming Framerate under Minimum Settings and Resolution."

    Any reference Radeon HD 5000 series card will clobber a GTX 480 in performance per watt in just about any gaming test you could come up with, unless perhaps you're trying to run a high end game on a low end card and it won't run, or something to that effect.  TechPowerUp gives some nice graphs of this, but the site seems to be down at the moment.

    So, what are your 26 tests?

  • Benthon Member Posts: 2,069

    You've made Quizzical angry. You don't like him when he's angry.

     

    To OP: Really, a Red vs Green thread? The GTX 580 is probably a paper release anyway, meant to hype themselves up over AMD's upcoming 69xx series, which will stomp Nvidia's current line (again). Hopefully AMD will get some competition this time and we won't have to wait a year.

    As Quizzical thoroughly points out (the guy is a library of knowledge!), even if Nvidia's card were better, AMD is making a much larger profit, since its die is much smaller.

    He who keeps his cool best wins.

  • AmazingAvery Age of Conan Advocate Member Uncommon Posts: 7,188

    http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=gtx+580 Posting from phone won't hardlink lol Gtx 580 is up on newegg

  • Benthon Member Posts: 2,069

    Originally posted by AmazingAvery

    http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=gtx+580 Posting from phone won't hardlink lol Gtx 580 is up on newegg

     Jesus, $600 for a card. Yikes.

    He who keeps his cool best wins.

  • choujiofkono Member Posts: 852

    Originally posted by Shinami

    It's true that ATI has something with MLAA, and that is the wave of the future. It's anti-aliasing done right! ^_^ However, Nvidia itself sells its cards clocked for performance. They aren't really about saving loads of power as much as they are about delivering a product that has power and performance written all over it. 

    I don't think MLAA is the wave of the future.  It makes the screen and the words on the screen appear blurry.  I have heard arguments that it doesn't actually make the textures etc. blurry and that it "just looks like it".  But I say to those people: who cares whether it "actually" makes a texture blurry if it "appears" blurry to the eye, which is the only case that matters. 

        

    "I'm not cheap I'm incredibly subconsciously financially optimized"
    "The worst part of censorship is ------------------"

  • Quizzical Member Legendary Posts: 25,355

    Odd that Newegg has 10% off promotional codes on launch day.  There's a limit of 2 per customer, so it looks like limited availability, at least to start.  Cyber Power PC has GTX 580s for sale, too, and charges $40 more for them than for a GTX 480.

    Odd that there aren't reviews up yet.  Maybe Nvidia had the NDA expire at a weird time.

  • Benthon Member Posts: 2,069

    Originally posted by Quizzical

    Odd that Newegg has 10% off promotional codes on launch day.  There's a limit of 2 per customer, so it looks like limited availability, at least to start.  Cyber Power PC has GTX 580s for sale, too, and charges $40 more for them than for a GTX 480.

    Odd that there aren't reviews up yet.  Maybe Nvidia had the NDA expire at a weird time.

     I found a few benchmarks comparing the GTX 580 to the 5870 (though not without Nvidia presenting the graphs in a way meant to deceive the reader), but no proper reviews yet.

    What they didn't show is that two GTX 460s schooled the single 580...

    He who keeps his cool best wins.

  • Vagrant_Zero Member Posts: 1,190


    Originally posted by ElendilasX
    It may look nice, but what real use is it for gamers? There are no games that take advantage of new hardware, especially MMOs. Almost all of them can be run at best graphics on a 5-7 year old comp...
    So I will check this stuff out in 10 years, when maybe it will be useful/worth it...

    Vanguard, Fallen Earth, LotRO, FFXIV, and AoC would all choke to death on 5-7 year old hardware.

    Hell, current hardware can still be brought to its knees by AoC.

    I suppose if the only MMO you play is WoW that might be true.

  • choujiofkono Member Posts: 852

    Newegg code for $50 off:

    Use promo code EMCZZYR24 for $50 in savings!

    "I'm not cheap I'm incredibly subconsciously financially optimized"
    "The worst part of censorship is ------------------"
