
An interesting read/benchmark of the GTX Titan

EMT-P | Chicago, IL | Posts: 19 | Member

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/1.html

 

I'm sure a lot of you are interested in this, so I decided to cross-post it from reddit.

Comments

  • EMT-P | Chicago, IL | Posts: 19 | Member

    Let's also remember that the drivers for the Titan are still new and maturing. I'm interested to see the numbers three months or so after release.

  • Aeonblades | Home, GA | Posts: 2,083 | Member
    Originally posted by EMT-P

    Let's also remember that the drivers for the Titan are still new and maturing. I'm interested to see the numbers three months or so after release.

    This is what I'm inclined to believe as well. In a few months with some new drivers it should see marginally higher scores.

    Currently Playing: ESO and FFXIV
    Have played: You name it
    If you mention rose tinted glasses, you better be referring to Mitch Hedberg.

  • Ridelynn | Fresno, CA | Posts: 4,179 | Member Uncommon

    The most interesting part of the article is actually in the single-Titan review (linked from that page), where they talk about GPU Boost 2.0:

    The new GPU Boost 2.0 takes temperatures into account, in addition to power draw, when adjusting core clock speeds and supportive voltages.

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/30.html

    This could finally be comparable to PowerTune. Later on in the article they go into a bit more detail. If you consider nVidia's max Boost speed to be the stock baseline, and that the card "underclocks" from there, it now works very similarly to AMD's PowerTune, with the added benefit that nVidia can also control chip voltage, not just clock speed.
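
    The temperature-and-power governor described above can be sketched as a toy model (the thresholds, step size, and clock values below are made up for illustration and are not nVidia's actual numbers or algorithm):

```python
def boost_clock(core_mhz, temp_c, power_w,
                temp_limit_c=80, power_limit_w=250,
                base_mhz=837, max_boost_mhz=993, step_mhz=13):
    """Toy GPU Boost 2.0-style governor: step the clock up while BOTH
    temperature and power draw are under their limits, and step it back
    down (never below the base clock) when either limit is exceeded.
    A real governor would also adjust voltage along with the clock."""
    if temp_c < temp_limit_c and power_w < power_limit_w:
        return min(core_mhz + step_mhz, max_boost_mhz)
    return max(core_mhz - step_mhz, base_mhz)

# Cool and under the power cap: clock steps up toward max boost.
print(boost_clock(900, temp_c=70, power_w=200))  # 913
# Too hot: clock steps down, but never below the base clock.
print(boost_clock(850, temp_c=85, power_w=200))  # 837
```

    The PowerTune comparison holds in this sketch too: treat max_boost_mhz as the "stock" speed and the governor only ever underclocks from it.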

    Also, in the same review they do a real power measurement: it looks like the Titan sticks to around 250W of actual draw, right on the 250W TDP that nVidia reports.

  • Quizzical | Posts: 14,784 | Member Uncommon
    Originally posted by Aeonblades
    Originally posted by EMT-P

    Let's also remember that the drivers for the Titan are still new and maturing. I'm interested to see the numbers three months or so after release.

    This is what I'm inclined to believe as well. In a few months with some new drivers it should see marginally higher scores.

    I doubt it.  They've had Kepler cards on the market for most of a year.  If they've got mature drivers for chips with 2, 3, 5, 6, 7, and 8 SMXes, how much work do you think it will be to adapt that to drivers for 14 SMXes?

    Furthermore, Titan already sees the gains over a GTX 680 that we'd theoretically hope to see from the hardware specs.  You can't fix what isn't broken.  Any big performance gains from drivers that could be applied to Titan would probably immediately apply to the rest of the Kepler cards.  If they're ever going to be found, then they likely would have been found and implemented months ago.

    The situation where you expect future driver updates to improve performance is for the first cards on a new architecture that is very different from previous ones.  When that happens, the people who are writing the original drivers really have no idea what will run the best, and are trying first to write stable drivers.  They'll come back to optimize performance later.  That's how you get the big post-launch gains.

  • Cleffy | San Diego, CA | Posts: 4,625 | Member Uncommon
    I think the big news here is how poorly CrossFire scales in those games.
  • Crunchy222 | New York, IL | Posts: 386 | Member

    To me, this card seems targeted at people who use the GPU for data-crunching applications rather than at gamers.

    Yeah, more VRAM, unchoked bandwidth, etc. However, no game comes close to making full use of my 680.

    I watched the Newegg overview of the card, and it really seems made specifically for people who will use the CUDA cores in double-precision mode. I don't see how this card would ever apply to a gamer's needs; I'm sure a 670 will be more than enough until the next DirectX is out, which I'm sure the Titan won't support.

    Seems kinda moot getting this thing for games.

  • Quizzical | Posts: 14,784 | Member Uncommon
    Originally posted by Crunchy222

    To me, this card seems targeted at people who use the GPU for data-crunching applications rather than at gamers.

    Yeah, more VRAM, unchoked bandwidth, etc. However, no game comes close to making full use of my 680.

    I watched the Newegg overview of the card, and it really seems made specifically for people who will use the CUDA cores in double-precision mode. I don't see how this card would ever apply to a gamer's needs; I'm sure a 670 will be more than enough until the next DirectX is out, which I'm sure the Titan won't support.

    Seems kinda moot getting this thing for games.

    The top-end GPU chip in every generation has some GPU-compute bloat.  AMD does the same thing.  You wouldn't say a Radeon HD 7970 isn't a gaming card because it has some features added for GPU-compute purposes (notably improved double-precision floating-point capability) that are useless for gaming, would you?
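
    The double-precision point is easy to see from a back-of-the-envelope calculation. A rough sketch using the commonly published specs (core counts and base clocks; FP64 runs at 1/3 the FP32 rate on Titan versus 1/24 on a GTX 680):

```python
def peak_gflops(cores, clock_mhz, fp64_ratio):
    """Theoretical peak throughput: each CUDA core does 2 FLOPs per
    cycle (fused multiply-add); FP64 is a fixed fraction of FP32."""
    fp32 = cores * 2 * clock_mhz / 1000.0
    return fp32, fp32 * fp64_ratio

# Published specs at base clock (illustrative, ignoring Boost).
titan_fp32, titan_fp64 = peak_gflops(2688, 837, 1 / 3)      # GTX Titan
gtx680_fp32, gtx680_fp64 = peak_gflops(1536, 1006, 1 / 24)  # GTX 680
print(f"Titan:   {titan_fp32:.0f} FP32 / {titan_fp64:.0f} FP64 GFLOPS")
print(f"GTX 680: {gtx680_fp32:.0f} FP32 / {gtx680_fp64:.0f} FP64 GFLOPS")
```

    That works out to roughly 4500/1500 GFLOPS for Titan versus roughly 3090/129 for the GTX 680: similar ballpark in FP32, but over a tenfold gap in FP64. Since games run almost entirely in FP32, the FP64 hardware is exactly the "bloat" being described, useful for compute but idle in games.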
