
GW2 and the GTX 660 Ti

Br3akingDawn (a City, CA), Posts: 1,357, Member Uncommon

OMG OMG, it's out

Amazon has it for $304 with a free Borderlands 2 PC game!!!

This is the card of choice for GW2!!!

beautiful specs:

 

  • Base Clock: 980 MHz
  • Boost Clock: 1059 MHz
  • Memory Clock: 6008 MHz
  • CUDA Cores: 1344
  • PCI-Express 3.0
  • 2048MB GDDR5 192bit


Comments

  • Izik (Granite Bay, CA), Posts: 111, Member

    GW2 isn't exactly a benchmark in graphical tech...the game will run fine on a lot worse. Hell the game is only in DX9 for christsakes...

  • Centhan (Toms River, NJ), Posts: 483, Member

    Off topic, but I totally forgot about Borderlands2 being released in September.

    2012 is really turning out to be a great year with some quality games being released.  Puts the last couple of years (at least) to shame.

  • Shoko_Lied (-, WA), Posts: 2,080, Member Uncommon
    I got the GTX 570 last year, and probably wont need an upgrade for another 5 years.
  • BioBreak (Columbus, OH), Posts: 48, Member

    It's the triple monitor setup that is the reason I am getting the 660 ti card.

     

  • Quizzical, Posts: 14,772, Member Uncommon
    Originally posted by Epic1oots

    beautiful specs:

    • 2048MB GDDR5 192bit

    Actually, that's quite hideous.  They've mismatched the memory channels.  Even at full memory bandwidth, the card is pretty strongly constrained by memory bandwidth.  And then they mismatch the memory channels, which makes it logically impossible for the card to get its full memory bandwidth if it's using over 1.5 GB.
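    In numbers, the mismatch looks something like this (a back-of-the-envelope Python sketch; the 512 MB + 512 MB + 1024 MB split is an assumption about how 2 GB gets placed on three 64-bit channels, not a documented layout):

```python
# Peak bandwidth of a 192-bit bus at 6008 MHz effective (GDDR5):
effective_mtps = 6008e6            # transfers per second
bus_width_bytes = 192 // 8         # 192 bits = 24 bytes per transfer
peak_gbps = effective_mtps * bus_width_bytes / 1e9
print(f"peak bandwidth: {peak_gbps:.1f} GB/s")       # 144.2 GB/s

# 2 GB can't be spread evenly over three 64-bit channels
# (e.g. 512 MB + 512 MB + 1024 MB).  Only the first 1.5 GB
# interleaves across all three channels; the last 0.5 GB sits
# on a single channel and gets a third of the bus at best.
tail_gbps = peak_gbps / 3
print(f"last 0.5 GB: at most {tail_gbps:.1f} GB/s")  # 48.1 GB/s
```

    So anything the game allocates past 1.5 GB is served far slower than the headline figure.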

  • heartless (Brooklyn, NY), Posts: 4,993, Member
    As it is, I'm getting 40+ fps on a GTX 460. I highly doubt that I'll need to upgrade for GW2. Plus the card plays every thing else at nearly max settings. I'll probably wait for the next generation of cards.


  • Quizzical, Posts: 14,772, Member Uncommon
    Originally posted by BioBreak

    It's the triple monitor setup that is the reason I am getting the 660 ti card.

    Bad idea.  The GeForce GTX 660 Ti is a single-monitor card, and ideally 1920x1200 or below.  You can add a second or third monitor if they're not used for gaming, but spreading a game window across three monitors is a bad idea.

    Larger resolutions mean you need more memory capacity and bandwidth, which is exactly what's crippled in the GeForce GTX 660 Ti.  Look here:

    http://www.techpowerup.com/reviews/MSI/GTX_660_Ti_Power_Edition/28.html

    At 1920x1200, a GeForce GTX 660 Ti barely loses to a Radeon HD 7950.  At 2560x1600, it loses by quite a bit.  A three-monitor setup is a higher resolution yet, which would probably widen the 7950's advantage.

    You can dodge the memory capacity and mismatched memory channels problem if you get the 3 GB version that Galaxy has, rather than the stupid 2 GB version.  But then you're paying enough to get a superior Radeon HD 7950, so you might as well just get the 7950.

    It's not that the GTX 660 Ti will completely choke on a triple monitor setup.  Rather, it will merely constitute overpaying for inferior hardware.
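    The resolution argument is easy to put in numbers (a quick Python sketch; 5760x1080 assumes a three-by-one 1080p Surround layout):

```python
# Pixels per frame at the resolutions discussed; framebuffer size and
# bandwidth demand scale roughly with pixel count (more with MSAA).
resolutions = {
    "1920x1200 (single)": 1920 * 1200,
    "2560x1600 (single)": 2560 * 1600,
    "5760x1080 (3x1080p)": 5760 * 1080,
}
base = resolutions["1920x1200 (single)"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.2f} MP, {px / base:.2f}x the pixels")
```

    Triple 1080p pushes 2.7x the pixels of 1920x1200, which is exactly where a bandwidth-starved card falls behind.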

  • sk8chalif (Montreal, QC), Posts: 599, Member Uncommon
    Originally posted by Izik

    GW2 isn't exactly a benchmark in graphical tech...the game will run fine on a lot worse. Hell the game is only in DX9 for christsakes...

    They said somewhere that later they will implement DX10 and then DX11, like in an expansion/patch, but that's not for now. I'll try to find the link.

    Here: http://www.guildwars2guru.com/topic/47480-official-update-on-dx11-support/

    For those of you who have been asking about DX11 support for Guild Wars 2, our goal with GW2 has always been to provide a gorgeous fantasy world while at the same time running on a wide range of gaming PCs.

    Focusing on DX9 allows us to do this, as it’s a much wider supported graphics API than DX11 is and we wanted our game to reach as many of our fans as possible.

    We will be evaluating supporting DX11 post launch. ~RB2

    ~The only opinion that matters is your own. Everything else is just advice.~

  • Aerowyn (BUZZARDS BAY, MA), Posts: 7,928, Member
    Originally posted by Quizzical
    Originally posted by BioBreak

    It's the triple monitor setup that is the reason I am getting the 660 ti card.

    Bad idea.  The GeForce GTX 660 Ti is a single-monitor card, and ideally 1920x1200 or below.  You can add a second or third monitor if they're not used for gaming, but spreading a game window across three monitors is a bad idea.

    Larger resolutions mean you need more memory capacity and bandwidth, which is exactly what's crippled in the GeForce GTX 660 Ti.  Look here:

    http://www.techpowerup.com/reviews/MSI/GTX_660_Ti_Power_Edition/28.html

    At 1920x1200, a GeForce GTX 660 Ti barely loses to a Radeon HD 7950.  At 2560x1600, it loses by quite a bit.  A three-monitor setup is a higher resolution yet, which would probably widen the 7950's advantage.

    You can dodge the memory capacity and mismatched memory channels problem if you get the 3 GB version that Galaxy has, rather than the stupid 2 GB version.  But then you're paying enough to get a superior Radeon HD 7950, so you might as well just get the 7950.

    It's not that the GTX 660 Ti will completely choke on a triple monitor setup.  Rather, it will merely constitute overpaying for inferior hardware.

    Radeon has always been ahead of Nvidia when it comes to multi-monitor and higher-resolution support. For average resolutions (1920x1080), Nvidia usually leads the pack, though. It just depends; things flip-flop a lot, but Radeon has always had the hold on multi-monitor support.

    I angered the clerk in a clothing shop today. She asked me what size I was and I said actual, because I am not to scale. I like vending machines 'cause snacks are better when they fall. If I buy a candy bar at a store, oftentimes, I will drop it... so that it achieves its maximum flavor potential. --Mitch Hedberg

  • heartless (Brooklyn, NY), Posts: 4,993, Member
    Originally posted by sk8chalif
    Originally posted by Izik

    GW2 isn't exactly a benchmark in graphical tech...the game will run fine on a lot worse. Hell the game is only in DX9 for christsakes...

    They said somewhere that later they will implement DX10 and then DX11, like in an expansion/patch, but that's not for now. I'll try to find the link.

    Here: http://www.guildwars2guru.com/topic/47480-official-update-on-dx11-support/

    For those of you who have been asking about DX11 support for Guild Wars 2, our goal with GW2 has always been to provide a gorgeous fantasy world while at the same time running on a wide range of gaming PCs.

    Focusing on DX9 allows us to do this, as it’s a much wider supported graphics API than DX11 is and we wanted our game to reach as many of our fans as possible.

    We will be evaluating supporting DX11 post launch. ~RB2

    Evaluating =/= implementing. That quote basically states that they are considering supporting DX11, not that they are going to. Besides, if they do decide to offer DX11 support, it's probably not going to come for another 6 months or so, possibly even longer.


  • fiontar (Dana, MA), Posts: 3,719, Member
    Originally posted by Izik

    GW2 isn't exactly a benchmark in graphical tech...the game will run fine on a lot worse. Hell the game is only in DX9 for christsakes...

    Huh?

    Ummm....ok....

    Believe me, the game rewards people who have the hardware to run the game on max quality settings and looks a lot better than most DX10/11 games. The API does not a beautiful game make and it most certainly does not make up for inferior art design. The best developers have always managed to squeeze better visuals from "dated" tech. I'd love to see DX11 in GW2, if the developers were to find that it could further improve the visuals, but it's one of the most visually stunning games in the MMO genre and, at max quality settings, is also one of the most demanding of PC hardware.

    Want to know more about GW2 and why there is so much buzz? Start here: Guild Wars 2 Mass Info for the Uninitiated

  • Izik (Granite Bay, CA), Posts: 111, Member
    Originally posted by fiontar
    Originally posted by Izik

    GW2 isn't exactly a benchmark in graphical tech...the game will run fine on a lot worse. Hell the game is only in DX9 for christsakes...

    Huh?

    Ummm....ok....

    Believe me, the game rewards people who have the hardware to run the game on max quality settings and looks a lot better than most DX10/11 games. The API does not a beautiful game make and it most certainly does not make up for inferior art design. The best developers have always managed to squeeze better visuals from "dated" tech. I'd love to see DX11 in GW2, if the developers were to find that it could further improve the visuals, but it's one of the most visually stunning games in the MMO genre and, at max quality settings, is also one of the most demanding of PC hardware.

     

    bro...play TSW, Lotro, Rift, AoC or Tera on max settings. GW2 is barely passable graphically compared to other mmos, and it's a turd for a 2012 release.

    Look at TSW (2012 release):

    DX11, Tessellation, HDR Lighting, Adaptive tone mapping, Ambient Occlusion, and the world's first game with TXAA. The game is gorgeous.

    GW2: Low quality textures with an extremely overbright bloom filter. Not to mention the character models (aside from the Asura imo) are just awful. I mean, have you seen the Norn?... they look like EQ2 models.

    Obviously there had to be compromises in visual quality to ensure smooth performance, (especially for WvWvW) but the fact is the game is just "meh" overall visually. Luckily this has nothing to do with gameplay, which is what makes GW2 worth playing.

  • Weretigar (winifrede, WV), Posts: 609, Member
    Originally posted by Izik
    Originally posted by fiontar
    Originally posted by Izik

    GW2 isn't exactly a benchmark in graphical tech...the game will run fine on a lot worse. Hell the game is only in DX9 for christsakes...

    Huh?

    Ummm....ok....

    Believe me, the game rewards people who have the hardware to run the game on max quality settings and looks a lot better than most DX10/11 games. The API does not a beautiful game make and it most certainly does not make up for inferior art design. The best developers have always managed to squeeze better visuals from "dated" tech. I'd love to see DX11 in GW2, if the developers were to find that it could further improve the visuals, but it's one of the most visually stunning games in the MMO genre and, at max quality settings, is also one of the most demanding of PC hardware.

     

    bro...play TSW, Lotro, Rift, AoC or Tera on max settings. GW2 is barely passable graphically compared to other mmos, and it's a turd for a 2012 release.

    Look at TSW (2012 release):

    DX11, Tessellation, HDR Lighting, Adaptive tone mapping, Ambient Occlusion, and the world's first game with TXAA. The game is gorgeous.

    GW2: Low quality textures with an extremely overbright bloom filter. Not to mention the character models (aside from the Asura imo) are just awful. I mean, have you seen the Norn?... they look like EQ2 models.

    Obviously there had to be compromises in visual quality to ensure smooth performance, (especially for WvWvW) but the fact is the game is just "meh" overall visually. Luckily this has nothing to do with gameplay, which is what makes GW2 worth playing.

    I'm not saying GW2 has amazing graphics.

    However, TSW looks like it ripped sprites from San Andreas on the PS2. The background might be amazing, but the char sprites are god awful, seriously. I would use SWTOR graphics before I would use TSW's.

  • Xten (over there, KS), Posts: 119, Member
    Originally posted by Weretigar
    Originally posted by Izik
    Originally posted by fiontar
    Originally posted by Izik

    GW2 isn't exactly a benchmark in graphical tech...the game will run fine on a lot worse. Hell the game is only in DX9 for christsakes...

    Huh?

    Ummm....ok....

    Believe me, the game rewards people who have the hardware to run the game on max quality settings and looks a lot better than most DX10/11 games. The API does not a beautiful game make and it most certainly does not make up for inferior art design. The best developers have always managed to squeeze better visuals from "dated" tech. I'd love to see DX11 in GW2, if the developers were to find that it could further improve the visuals, but it's one of the most visually stunning games in the MMO genre and, at max quality settings, is also one of the most demanding of PC hardware.

     

    bro...play TSW, Lotro, Rift, AoC or Tera on max settings. GW2 is barely passable graphically compared to other mmos, and it's a turd for a 2012 release.

    Look at TSW (2012 release):

    DX11, Tessellation, HDR Lighting, Adaptive tone mapping, Ambient Occlusion, and the world's first game with TXAA. The game is gorgeous.

    GW2: Low quality textures with an extremely overbright bloom filter. Not to mention the character models (aside from the Asura imo) are just awful. I mean, have you seen the Norn?... they look like EQ2 models.

    Obviously there had to be compromises in visual quality to ensure smooth performance, (especially for WvWvW) but the fact is the game is just "meh" overall visually. Luckily this has nothing to do with gameplay, which is what makes GW2 worth playing.

    I'm not saying GW2 has amazing graphics.

    However, TSW looks like it ripped sprites from San Andreas on the PS2. The background might be amazing, but the char sprites are god awful, seriously. I would use SWTOR graphics before I would use TSW's.

     

    What sprites?

    TSW uses full 3D models for the chars, but hey ho...

    All the background you see in TSW is actual geometry. I'm still playing the game, and it has tremendous draw distance on toons and the environment to boot; it sports impressive vistas. DX11 tessellation and advanced shader, lighting, and texture techniques really show, and they have created a great atmosphere artistically.

    Comparing TSW's graphics with San Andreas on the PS2 and saying that the toons are sprites is quite petty. It really just shows that you are in all likelihood nothing more than a blabbering fanboy, and I really don't care which game you fan over; it's st00pit no matter what.

    I love the look of both games. GW2 may not have DX10 or DX11, but I love how it looks and I'm just fine with it. The only thing, IMO, they could improve is the character models.

  • Quizzical, Posts: 14,772, Member Uncommon
    DirectX 11 doesn't matter unless they use tessellation aggressively.  In order to do tessellation properly, you can't just tack it on later.  Your models have to be designed from the start with tessellation in mind, or else you'll have to toss them all out and start over.
  • UtukuMoon (Paris), Posts: 1,066, Member
    Originally posted by Epic1oots

    OMG OMG, it's out

    Amazon has it for $304 with a free Borderlands 2 PC game!!!

    This is the card of choice for GW2!!!

    beautiful specs:

     

    • Base Clock: 980 MHz
    • Boost Clock: 1059 MHz
    • Memory Clock: 6008 MHz
    • CUDA Cores: 1344
    • PCI-Express 3.0
    • 2048MB GDDR5 192bit

    Really! Why buy that card when you can get this? http://www.overclockers.co.uk/showproduct.php?prodid=GX-095-GI

  • austriacus (lima), Posts: 624, Member
    Originally posted by Xten
    Originally posted by Weretigar
    Originally posted by Izik
    Originally posted by fiontar
    Originally posted by Izik

    GW2 isn't exactly a benchmark in graphical tech...the game will run fine on a lot worse. Hell the game is only in DX9 for christsakes...

    Huh?

    Ummm....ok....

    Believe me, the game rewards people who have the hardware to run the game on max quality settings and looks a lot better than most DX10/11 games. The API does not a beautiful game make and it most certainly does not make up for inferior art design. The best developers have always managed to squeeze better visuals from "dated" tech. I'd love to see DX11 in GW2, if the developers were to find that it could further improve the visuals, but it's one of the most visually stunning games in the MMO genre and, at max quality settings, is also one of the most demanding of PC hardware.

     

    bro...play TSW, Lotro, Rift, AoC or Tera on max settings. GW2 is barely passable graphically compared to other mmos, and it's a turd for a 2012 release.

    Look at TSW (2012 release):

    DX11, Tessellation, HDR Lighting, Adaptive tone mapping, Ambient Occlusion, and the world's first game with TXAA. The game is gorgeous.

    GW2: Low quality textures with an extremely overbright bloom filter. Not to mention the character models (aside from the Asura imo) are just awful. I mean, have you seen the Norn?... they look like EQ2 models.

    Obviously there had to be compromises in visual quality to ensure smooth performance, (especially for WvWvW) but the fact is the game is just "meh" overall visually. Luckily this has nothing to do with gameplay, which is what makes GW2 worth playing.

    I'm not saying GW2 has amazing graphics.

    However, TSW looks like it ripped sprites from San Andreas on the PS2. The background might be amazing, but the char sprites are god awful, seriously. I would use SWTOR graphics before I would use TSW's.

     

    What sprites?

    TSW uses full 3D models for the chars, but hey ho...

    All the background you see in TSW is actual geometry. I'm still playing the game, and it has tremendous draw distance on toons and the environment to boot; it sports impressive vistas. DX11 tessellation and advanced shader, lighting, and texture techniques really show, and they have created a great atmosphere artistically.

    Comparing TSW's graphics with San Andreas on the PS2 and saying that the toons are sprites is quite petty. It really just shows that you are in all likelihood nothing more than a blabbering fanboy, and I really don't care which game you fan over; it's st00pit no matter what.

    I love the look of both games. GW2 may not have DX10 or DX11, but I love how it looks and I'm just fine with it. The only thing, IMO, they could improve is the character models.

    He's not alone in his opinion, though; a lot of the complaints about TSW have come from its awful character models, and no one has complained about them in GW2. At least that's what I remember people discussing in beta.

    I personally enjoy GW2's character models more than the ones I've seen in TSW. That's just my opinion, though; girls in TSW look fugly.

  • BadSpock (Somewhere, MI), Posts: 7,974, Member

    Quizzical is always right 100% of the time.

    Don't waste your money on a 660.

  • Quizzical, Posts: 14,772, Member Uncommon
    Originally posted by Sylvarii
    Originally posted by Epic1oots

    OMG OMG, it's out

    Amazon has it for $304 with a free Borderlands 2 PC game!!!

    This is the card of choice for GW2!!!

    beautiful specs:

     

    • Base Clock: 980 MHz
    • Boost Clock: 1059 MHz
    • Memory Clock: 6008 MHz
    • CUDA Cores: 1344
    • PCI-Express 3.0
    • 2048MB GDDR5 192bit

    Really! Why buy that card when you can get this? http://www.overclockers.co.uk/showproduct.php?prodid=GX-095-GI

    Because that's too much to pay for a GeForce GTX 670 when you can get a Radeon HD 7970 with the same premium cooler and a hefty factory overclock for the same price from the same web site:

    http://www.overclockers.co.uk/showproduct.php?prodid=GX-097-GI&groupid=701&catid=56&subcat=938

    The basic principle here is, don't buy a lesser card if you can get a better one for the same price.  The hierarchy among higher end cards roughly goes:

    Radeon HD 7870 < GeForce GTX 660 Ti <  Radeon HD 7950 < GeForce GTX 670 < Radeon HD 7970 < GeForce GTX 680 < Radeon HD 7970 GHz Edition

    Or at least that's where I'd put things for a single monitor at 1920x1200 or below.  If you want three monitors, it's more like:

    (stuff you shouldn't get) < Radeon HD 7950 < GeForce GTX 670 < GeForce GTX 680 < Radeon HD 7970 < Radeon HD 7970 GHz Edition.
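    That hierarchy is just an ordered list, and the buying rule falls out of it. A toy Python sketch (the tiers encode this post's opinion for a single monitor at 1920x1200 or below, not benchmark data; `better_buy` is a made-up helper name):

```python
# Ordered worst-to-best, per the ranking above.
single_monitor_tier = [
    "Radeon HD 7870",
    "GeForce GTX 660 Ti",
    "Radeon HD 7950",
    "GeForce GTX 670",
    "Radeon HD 7970",
    "GeForce GTX 680",
    "Radeon HD 7970 GHz Edition",
]

def better_buy(card_a, price_a, card_b, price_b):
    """Return the higher-tier card; on a tier tie, the cheaper one."""
    rank = single_monitor_tier.index
    if rank(card_a) != rank(card_b):
        return card_a if rank(card_a) > rank(card_b) else card_b
    return card_a if price_a <= price_b else card_b

# A $304 GTX 660 Ti still loses to a same-or-cheaper 7950:
print(better_buy("GeForce GTX 660 Ti", 304, "Radeon HD 7950", 300))
```

    The point of the rule: never pay a higher price for a card that sits lower in the list.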

  • UtukuMoon (Paris), Posts: 1,066, Member
    Originally posted by BadSpock

    Quizzical is always right 100% of the time.

    Don't waste your money on a 660.

    I'll second that, even though I'm quite in the know myself... hehe. The card that I linked will give you near-on 680 performance for $300-ish; it's one of, if not the, best 670s on the market today.

    These charts give a good idea: http://www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_670_Windforce/1.html

  • Mothanos (Arnhem), Posts: 1,860, Member Uncommon

300 dollars or 300 pounds is quite a difference ;)

  • Quizzical, Posts: 14,772, Member Uncommon
    Originally posted by Sylvarii

    I'll second that, even though I'm quite in the know myself... hehe. The card that I linked will give you near-on 680 performance for $300-ish; it's one of, if not the, best 670s on the market today.

    These charts give a good idea: http://www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_670_Windforce/1.html

    The trouble is that the 7970 that I linked is overclocked, too--and sees a lot more benefit from the overclock, since it's not so heavily constrained by memory bandwidth.  Even in your own link, the 7970 starts out faster before the overclock (at 1920x1200 and higher), in addition to gaining more from the overclock (which isn't shown).  And that's even in charts that are skewed against the 7970, because it's using the old Catalyst 12.3 drivers, and AMD has gotten substantially bigger gains out of their drivers than Nvidia has since then.

    It's not that I'm against getting a GeForce GTX 670.  It's that I'm against getting a GTX 670 when you can get an equivalent premium SKU of a Radeon HD 7970 for the same price unless you need something specific to Nvidia.  I'd likewise be against getting a Radeon HD 7950 if you could get a GeForce GTX 670 for the same price unless you need something specific to AMD.

  • UtukuMoon (Paris), Posts: 1,066, Member
    Originally posted by Quizzical
    Originally posted by Sylvarii

    I'll second that, even though I'm quite in the know myself... hehe. The card that I linked will give you near-on 680 performance for $300-ish; it's one of, if not the, best 670s on the market today.

    These charts give a good idea: http://www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_670_Windforce/1.html

    The trouble is that the 7970 that I linked is overclocked, too--and sees a lot more benefit from the overclock, since it's not so heavily constrained by memory bandwidth.  Even in your own link, the 7970 starts out faster before the overclock (at 1920x1200 and higher), in addition to gaining more from the overclock (which isn't shown).  And that's even in charts that are skewed against the 7970, because it's using the old Catalyst 12.3 drivers, and AMD has gotten substantially bigger gains out of their drivers than Nvidia has since then.

    It's not that I'm against getting a GeForce GTX 670.  It's that I'm against getting a GTX 670 when you can get an equivalent premium SKU of a Radeon HD 7970 for the same price unless you need something specific to Nvidia.  I'd likewise be against getting a Radeon HD 7950 if you could get a GeForce GTX 670 for the same price unless you need something specific to AMD.

    Yeah, I can see where you are coming from; it makes sense. If you don't have a problem with either AMD or Nvidia, then bang for buck is the way to go.
