
AMD 290X beats TITAN for almost half the price!!!


Comments

  • Siveria (Member, Uncommon) Posts: 1,416

    Hmm, so they compared 4 of these cards in a quad formation to a pair of Titans in SLI and barely beat them? Let's bring out 4 Titans in that formation to be fair. Basically 1 Titan = 2 of these new AMD cards, if the 4 AMD vs 2 Titans result is how we're basing it. Needless to say, I can afford neither, nor do I see a need to yet; my GTX 560 1GB pretty much runs everything so far at an acceptable frame rate for me (anything over 30 fps is fine for me).

    Also, you need to consider that most PC games will be Xbone ports, and the Xbone as it is is far weaker than the PS4 in graphics power. I think they said the PS4 is 50% faster overall in every way. Since the PC ports will be based off the Xbone version, you'll probably not need a high-end card to run the games at a decent fps; after all, the PS4 was made out of current-gen PC parts, whereas the Xbone is like PC parts from one or two generations ago. If you have a fairly recent PC you should be OK. Besides, most people cannot see more than 60 frames per second, so anything higher is really just overkill.

    The main reason I am waiting a year before I get a PS4 is that its first year of announced games has nothing but garbage in my eyes, and most of those titles will probably end up on the PC anyway, so there's even less reason to get a PS4 right away. There's also the chance of a hardware revision to fix any nasty system-bricking issues (we all know the Xbone will have these; the previous two Xbox iterations did), as well as maybe a price drop. By then there might be some games I feel are worth buying for it.

    Gaming as a whole has gone down the shit tubes; the only saving graces for me have been the indie devs who actually try new things, compared to these AAA devs that just shit out the exact same thing in a new skin 1-2 times a year. FPS games like Call of Duty and such are especially bad, being basically the same game in a slightly different wrapper with nothing new, and let's not forget sports games, where more often than not the only difference between the 2013 and 2014 versions is the roster/stats.

    Being a pessimist is a win-win pattern of thinking. If you're a pessimist (I'll admit that I am!) you're either:

    A. Proven right (if something bad happens)

    or

    B. Pleasantly surprised (if something good happens)

    Either way, you can't lose! Try it out sometime!

  • Quizzical (Member, Legendary) Posts: 25,347
    Originally posted by Mawnee

    In 4-5 months I'll sell the 290X for a small loss and buy the new Nvidia big dog again. :D

    Titan will still be the Nvidia "big dog" then, and will likely remain so for a year or so.  It's going to be a while before 20 nm is ready, and Nvidia usually doesn't move to new process nodes as fast as AMD, anyway.  And it's not like they're going to improve performance by going with an even bigger GPU than Titan uses; GK110 is already a lot bigger than Hawaii.

  • Quizzical (Member, Legendary) Posts: 25,347
    Originally posted by Siveria

    Hmm, so they compared 4 of these cards in a quad formation to a pair of Titans in SLI and barely beat them? Let's bring out 4 Titans in that formation to be fair. Basically 1 Titan = 2 of these new AMD cards, if the 4 AMD vs 2 Titans result is how we're basing it. Needless to say, I can afford neither, nor do I see a need to yet; my GTX 560 1GB pretty much runs everything so far at an acceptable frame rate for me (anything over 30 fps is fine for me).

    With that much GPU power, it's likely that you're looking at a bottleneck elsewhere in the system.

  • Mawnee (Member, Uncommon) Posts: 245
    Originally posted by Quizzical
    Originally posted by Mawnee

    In 4-5 months I'll sell the 290X for a small loss and buy the new Nvidia big dog again. :D

    Titan will still be the Nvidia "big dog" then, and will likely remain so for a year or so.  It's going to be a while before 20 nm is ready, and Nvidia usually doesn't move to new process nodes as fast as AMD, anyway.  And it's not like they're going to improve performance by going with an even bigger GPU than Titan uses; GK110 is already a lot bigger than Hawaii.

    We shall see; either way I'll be sitting on the fastest card available until it's the right time to move it. :D

  • Gaia_Hunter (Member, Uncommon) Posts: 3,066
    Originally posted by Mawnee
    Originally posted by Quizzical
    Originally posted by Mawnee

    In 4-5 months I'll sell the 290X for a small loss and buy the new Nvidia big dog again. :D

    Titan will still be the Nvidia "big dog" then, and will likely remain so for a year or so.  It's going to be a while before 20 nm is ready, and Nvidia usually doesn't move to new process nodes as fast as AMD, anyway.  And it's not like they're going to improve performance by going with an even bigger GPU than Titan uses; GK110 is already a lot bigger than Hawaii.

    We shall see; either way I'll be sitting on the fastest card available until it's the right time to move it. :D

    The next generation of cards might be further away than you expect.

    It isn't that the 20 nm process isn't ready; it's that Apple and Qualcomm will be fighting for TSMC's 20 nm wafers, and they can afford to pay more than AMD or NVIDIA can.

    Until NVIDIA and AMD can get their hands on the 20 nm process (and that will happen as TSMC ramps up production), don't expect to see much higher performance than GK110 and Hawaii.

    Currently playing: GW2
    Going cardboard starter kit: Ticket to ride, Pandemic, Carcassonne, Dominion, 7 Wonders

  • skeaser (Member, Rare) Posts: 4,179
    Originally posted by Quizzical
    Originally posted by Illudo
    Originally posted by skeaser

    I don't care if it's a million times faster than Nvidia's best card. I will NEVER put another P.O.S. AMD/ATI card in my computer again. I made that mistake during one build and regretted it daily. They have no clue how to write drivers, and their Catalyst Control Panel doesn't make sense half the time and doesn't work the other half.

    Totally agree, never AMD again. I've coped with it for the past 3 years, but I'm glad I finally got rid of it. For one, Crossfire doesn't work in windowed mode, which is a big deal for me as I use dual monitors, and fullscreen has too much of a delay when switching between monitors. Secondly, the drivers are horrendous, again especially for Crossfire: lots of micro-stuttering, which they have apparently only fixed this summer (took them over 3 years!).

    I don't care if their cards are cheaper; if you're a power user and use your PC for hours every single day, you want something you can fully rely on.

    Crossfire and SLI really only target two types of people: the technically clueless, and those for whom one high-end video card isn't good enough, so they'll spend more and buy two. Budget considerations mean that very few people are in the latter category. No one else has any reason to care how well (or whether!) Crossfire or SLI work.

    As to how well Crossfire and SLI work: about two years ago, Tech Report started doing frame pacing tests to see just how useful multi-GPU setups were in the real world. Their initial results were that Crossfire and SLI were both a complete mess. Nvidia quickly realized that this was important and started working on fixing frame timing in SLI, while AMD ignored it.

    About a year later, Nvidia had fixed the frame pacing problems (though not the other problems that plague multi-GPU setups), while AMD still hadn't. So Nvidia sent hardware to various tech review sites that allowed for more precise frame pacing measurements (a better system than the software-based approach Tech Report had previously used), which made the differences really obvious. There still wasn't much of a difference for single-card systems, but as far as frame pacing went, SLI basically worked and Crossfire basically didn't.

    So AMD figured out that this was a problem and went to work fixing Crossfire. Today, they've mostly fixed it (entirely for the Radeon R9 290X, but that took hardware changes not available to older cards), and both SLI and Crossfire work, at least as far as frame pacing goes.
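
    To make the frame-pacing point concrete, here is a rough Python sketch (my own illustration, not Tech Report's actual tooling) of the kind of statistics those tests report. The frame times and the 20 ms threshold are made-up values; the point is that two setups with the same average FPS can feel completely different:

        # Toy frame-pacing summary: average FPS hides stutter that
        # percentile and time-beyond-threshold metrics expose.
        def frame_pacing_stats(frame_times_ms, threshold_ms=20.0):
            n = len(frame_times_ms)
            ordered = sorted(frame_times_ms)
            avg = sum(frame_times_ms) / n
            p99 = ordered[int(0.99 * (n - 1))]   # 99th-percentile frame time
            # Total time spent on frames slower than the threshold:
            # the stutter a player actually feels.
            beyond = sum(t - threshold_ms for t in frame_times_ms
                         if t > threshold_ms)
            return {"avg_fps": round(1000.0 / avg, 1),
                    "p99_ms": p99,
                    "ms_beyond_threshold": round(beyond, 1)}

        smooth = [16.7] * 100            # steady ~60 FPS
        stutter = [8.0, 25.4] * 50       # same average FPS, alternating frames
        print(frame_pacing_stats(smooth))   # p99 ~16.7, 0 ms beyond threshold
        print(frame_pacing_stats(stutter))  # p99 ~25.4, 270 ms beyond threshold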

    So does it matter that Nvidia fixed their frame pacing problems about a year before AMD did?  If you were buying a multi-GPU setup six months ago, it did.  (It also mattered that Titan was a lot faster than a Radeon HD 7970 GHz Edition; if you were looking at a multi-GPU setup, you should have been looking at Titan or at least a GTX 780.)  But today?  Does it matter that AMD had DirectX 11 support about six months earlier than Nvidia?  DirectX 11 surely matters vastly more than SLI or Crossfire.  But once the other side catches up, it doesn't matter who got there first.

    Considering that there are only around 80 games with DX11 support and around 20 announced upcoming games with DX11 support from now through next year, I'd say it still doesn't matter whether a card has DX11 support or not, especially since it's just a superset of DX10.1. Sure, the few good DX11 games have some pretty effects, but there just aren't enough of them to make it a big deal.

    Sig so that badges don't eat my posts.


  • Quizzical (Member, Legendary) Posts: 25,347
    Originally posted by skeaser

    Considering that there are only around 80 games with DX11 support and around 20 announced upcoming games with DX11 support from now through next year, I'd say it still doesn't matter whether a card has DX11 support or not, especially since it's just a superset of DX10.1. Sure, the few good DX11 games have some pretty effects, but there just aren't enough of them to make it a big deal.

    It's not just the games available today.  As time passes, do you think the trend will be toward greater use of DirectX 11 or OpenGL 4 or later, or do you think that we'll keep having DirectX 9.0c games forever?  Tessellation may be tricky to use properly, but geometry shaders aren't.

  • Quizzical (Member, Legendary) Posts: 25,347
    Originally posted by Gaia_Hunter
    Originally posted by Mawnee
    Originally posted by Quizzical
    Originally posted by Mawnee

    In 4-5 months I'll sell the 290X for a small loss and buy the new Nvidia big dog again. :D

    Titan will still be the Nvidia "big dog" then, and will likely remain so for a year or so.  It's going to be a while before 20 nm is ready, and Nvidia usually doesn't move to new process nodes as fast as AMD, anyway.  And it's not like they're going to improve performance by going with an even bigger GPU than Titan uses; GK110 is already a lot bigger than Hawaii.

    We shall see; either way I'll be sitting on the fastest card available until it's the right time to move it. :D

    The next generation of cards might be further away than you expect.

    It isn't that the 20 nm process isn't ready; it's that Apple and Qualcomm will be fighting for TSMC's 20 nm wafers, and they can afford to pay more than AMD or NVIDIA can.

    Until NVIDIA and AMD can get their hands on the 20 nm process (and that will happen as TSMC ramps up production), don't expect to see much higher performance than GK110 and Hawaii.

    High end GPU chips can sell for hundreds of dollars for just the GPU chip.  Tablet and cell phone chips tend to go for prices in the low tens of dollars.  That makes a big difference, and it's why AMD got a large fraction of TSMC's early chips at both 28 nm and 40 nm.

    Even so, what Apple and Qualcomm really want is not more transistors but lower power.  Apple has been pushing things forward for the last few years in getting top-notch CPU and GPU performance into tablets sooner than the competition.  Too bad Apple hasn't shown any interest in pushing GPU capabilities forward in anything besides raw performance.

  • Gaia_Hunter (Member, Uncommon) Posts: 3,066
    Originally posted by Quizzical
    Originally posted by Gaia_Hunter
    Originally posted by Mawnee
    Originally posted by Quizzical
    Originally posted by Mawnee

    In 4-5 months I'll sell the 290X for a small loss and buy the new Nvidia big dog again. :D

    Titan will still be the Nvidia "big dog" then, and will likely remain so for a year or so.  It's going to be a while before 20 nm is ready, and Nvidia usually doesn't move to new process nodes as fast as AMD, anyway.  And it's not like they're going to improve performance by going with an even bigger GPU than Titan uses; GK110 is already a lot bigger than Hawaii.

    We shall see; either way I'll be sitting on the fastest card available until it's the right time to move it. :D

    The next generation of cards might be further away than you expect.

    It isn't that the 20 nm process isn't ready; it's that Apple and Qualcomm will be fighting for TSMC's 20 nm wafers, and they can afford to pay more than AMD or NVIDIA can.

    Until NVIDIA and AMD can get their hands on the 20 nm process (and that will happen as TSMC ramps up production), don't expect to see much higher performance than GK110 and Hawaii.

    High end GPU chips can sell for hundreds of dollars for just the GPU chip.  Tablet and cell phone chips tend to go for prices in the low tens of dollars.  That makes a big difference, and it's why AMD got a large fraction of TSMC's early chips at both 28 nm and 40 nm.

    Even so, what Apple and Qualcomm really want is not more transistors but lower power.  Apple has been pushing things forward for the last few years in getting top-notch CPU and GPU performance into tablets sooner than the competition.  Too bad Apple hasn't shown any interest in pushing GPU capabilities forward in anything besides raw performance.

    You can fit an awful lot of tablet and cell phone chips in the space used by a single high-end GPU monster, so not only is the cost of each chip much lower, the yield is much higher due to the small die size.

    And of course Apple will turn those tiny chips into phones and tablets that rival the price of most high-end GPUs.

    We are talking die sizes of 438 mm² for something like Hawaii and 561 mm² for Titan/GTX 780, versus 0.45 mm² for an ARM Cortex-A7 core or 1.62 mm² for an A15 core in Tegra 4.

    AMD can fit at most 129 290X dies on a wafer (not counting defects, alignment markings, etc.). You can get some 150,000 A7 cores on the same wafer.

    Not counting defective units, and assuming a 28 nm wafer costs some $5,000, each Hawaii chip costs AMD $38+. An A7 costs some $0.03.
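
    As a sanity check on that arithmetic, here is a quick Python sketch using the same assumed numbers (a $5,000 wafer, naive area division). It lands a bit above the 129-die figure because it ignores the edge loss and alignment markings mentioned above:

        import math

        WAFER_COST = 5000.0                      # assumed 28 nm wafer price
        wafer_area = math.pi * (300.0 / 2) ** 2  # 300 mm wafer, ~70,686 mm^2

        def naive_dies_per_wafer(die_area_mm2):
            # Straight area division: no edge loss, scribe lines, or defects,
            # so it slightly overestimates for big dies.
            return int(wafer_area // die_area_mm2)

        for name, area in [("Hawaii (290X)", 438.0),
                           ("GK110 (Titan/780)", 561.0),
                           ("ARM Cortex-A7 core", 0.45)]:
            dies = naive_dies_per_wafer(area)
            print(f"{name}: ~{dies} dies/wafer, ~${WAFER_COST / dies:.2f} each")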

    Currently playing: GW2
    Going cardboard starter kit: Ticket to ride, Pandemic, Carcassonne, Dominion, 7 Wonders

  • Gaia_Hunter (Member, Uncommon) Posts: 3,066
    The 290X is so bad, so bad, that the 780 just dropped in price to match it.

    Currently playing: GW2
    Going cardboard starter kit: Ticket to ride, Pandemic, Carcassonne, Dominion, 7 Wonders

  • miguksaram (Member, Uncommon) Posts: 835

    In terms of raw power it's a good buy, but that only applies to those who are willing to accept that it fails in all other departments (in its currently available form).  Adding an aftermarket water-cooling block really isn't a fair comparison unless you are willing to do that for the GTX 780 as well, given the new price point.  Raw power is nice, but personally I'd much rather have a balanced card.

  • Quizzical (Member, Legendary) Posts: 25,347
    Originally posted by Gaia_Hunter
    The 290X is so bad, so bad, that the 780 just dropped in price to match it.

    The GTX 780 dropped to $500, cheaper than the R9 290X.  With the GTX 770 also dropping to $330, the expected Nvidia price cuts have now happened, and someone who wants an Nvidia card doesn't necessarily need to wait any longer.  Then again, the upcoming Radeon R9 290 might well force the GTX 780 a little lower.

    As for the Radeon R9 290X, yeah, wait for the aftermarket coolers.  You don't need a waterblock; you just need the sort of coolers that a lot of AMD board partners put on their higher-end video cards.  Better coolers may well help some with power consumption, too, as higher temperatures tend to mean more leakage.  That's not going to be a new process node's worth of difference, but it could easily be a difference of 5%.

  • Nephelai (Member, Uncommon) Posts: 185
    Originally posted by Classicstar

    Sorry for making another topic, but this NEW CARD is too important not to let you know about: the fastest single GPU, plus a cheap price.

    I know most Nvidia fans don't believe it or want to hear this, but the AMD 290X is faster in most benchmarks that count, and in Fire Strike, the DX11 showcase from Futuremark, it beats TITAN as well.

    The AMD 290X OC'd (an easy OC) is even more ridiculously fast.

    And that for half the price!!!

    It's HOT, very hot, so you need a case with good cooling, and it's also louder than most, but the card is fast and cheap.

    ADVICE: I would wait for the board partners like ASUS, XFX, MSI, and others, who will probably put a lot better cooling fans on the BEAST.

    I'm not biased one way or the other, as I like competition driving prices down, but you do realise that this card comes 8 MONTHS after the Titan was released? And with such poor thermals, noise, etc.

     

    Don't be surprised if its reign is short - on the positive side, it should keep the price of Nvidia's next card lower.

  • Ridelynn (Member, Epic) Posts: 7,383


    Originally posted by Nephelai
    I'm not biased one way or the other, as I like competition driving prices down, but you do realise that this card comes 8 MONTHS after the Titan was released? And with such poor thermals, noise, etc. Don't be surprised if its reign is short - on the positive side, it should keep the price of Nvidia's next card lower.


    And the 7970 (R9 280) was out how much earlier than Titan? Well over a year, if I recall correctly, and out months before the 680/770 (especially if you consider the early availability of the 680; it was practically non-existent until mid-summer).

    The card uses a lot of power, and the stock cooler has poor thermals. That concerns me, but not nearly as much as the nVidia 480 did (which is probably the closest fair comparison in terms of TDP), because of PowerTune. The 480 had no way to control power or temperature if it started to run away, other than a static temperature switch (which only trips after the incident). PowerTune (and, to be fair, nVidia's Boost will do it as well, but the 480 didn't have Boost) can catch it before it's a runaway problem: it may run hot, but it'll never run overly hot or exceed its TDP like the 480 could.
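
    To illustrate the difference in spirit (a toy model, not AMD's or nVidia's actual firmware logic; all the numbers are invented), here is a short Python sketch: the static trip switch does nothing until the die is already too hot, while a PowerTune-style loop trims the clock every sample to keep power under the cap:

        TDP_W = 250.0          # assumed board power cap
        TRIP_TEMP_C = 105.0    # assumed static trip point

        def powertune_step(clock_mhz, power_w, cap_w=TDP_W):
            # Proportional throttle: scale the clock down as soon as the
            # measured power overshoots, before it becomes a runaway.
            if power_w > cap_w:
                clock_mhz *= cap_w / power_w
            return clock_mhz

        def static_trip_step(clock_mhz, temp_c):
            # Old 480-style switch: hard throttle only after the
            # temperature has already exceeded the trip point.
            return 300.0 if temp_c >= TRIP_TEMP_C else clock_mhz

        clock = 1000.0
        for power in [220.0, 260.0, 280.0, 255.0]:   # simulated power samples
            clock = powertune_step(clock, power)
            print(f"{power:.0f} W -> {clock:.0f} MHz")

        print(static_trip_step(1000.0, 90.0))    # 1000.0: does nothing...
        print(static_trip_step(1000.0, 105.0))   # 300.0: ...until too hot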

  • Hrimnir (Member, Rare) Posts: 2,415

    I'm happy with my SLI'd 760s that I've had for the past 3 months. I paid $50 less than AMD's new hotness (haha, I love myself, double meaning there) for a setup that runs circles around it and a Titan all day.

    Edit: The even sadder part is I'd be willing to bet both of my 760s together draw less power than the one 290X.  I absolutely KNOW they're quieter.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • Gaia_Hunter (Member, Uncommon) Posts: 3,066
    Originally posted by Hrimnir

    I'm happy with my SLI'd 760s that I've had for the past 3 months. I paid $50 less than AMD's new hotness (haha, I love myself, double meaning there) for a setup that runs circles around it and a Titan all day.

    Edit: The even sadder part is I'd be willing to bet both of my 760s together draw less power than the one 290X.  I absolutely KNOW they're quieter.

    The GTX 760 SLI will consume a fair bit more power.

    The GTX 760 SLI is slightly faster.

    The GTX 760 SLI is pretty much as noisy as a single 290X.


    Currently playing: GW2
    Going cardboard starter kit: Ticket to ride, Pandemic, Carcassonne, Dominion, 7 Wonders

  • Ridelynn (Member, Epic) Posts: 7,383

    You can't compare noise levels on a GPU because noise is specific to the make and model of the card. Noise and temperature are a function of the cooler, not the GPU.

    I could make a GT 610 sound like a jet turbine and run at 100C if I were so inclined to do so, even though it only has a TDP of around 30W.

  • Ridelynn (Member, Epic) Posts: 7,383


    Originally posted by Hrimnir
    I'm happy with my SLI'd 760s that I've had for the past 3 months. I paid $50 less than AMD's new hotness (haha, I love myself, double meaning there) for a setup that runs circles around it and a Titan all day. Edit: The even sadder part is I'd be willing to bet both of my 760s together draw less power than the one 290X. I absolutely KNOW they're quieter.

    Glad you're happy. I wouldn't recommend SLI to many people, though, particularly with non-top-tier cards.

    You're happy, and that's all that matters; I think you could have done better for the money, but that's my opinion, and it's not my money being spent.

    Have fun with SLI profiles.

  • Terranah (Member, Uncommon) Posts: 3,575

    I have an ATI HD 5850.  I still don't feel a pressing need to upgrade, as it runs pretty much everything I use it for just fine.  Sometimes I can't run on ultra and have to run on high or very high, though.

     

    I do enjoy these posts though. Thanks for posting, because at some point the upgrade will happen.  I'm holding out as long as I can so I can get the biggest upgrade.

  • Vorthanion (Member, Rare) Posts: 2,749
    Originally posted by Gaia_Hunter
    Originally posted by Vorthanion

     

    Yeah, why is that?  Maybe it never occurred to you that Nvidia cards are better at working with a larger number of system configurations than AMD cards?  I'm no slouch when it comes to system building, and I know my way around a circuit board.  If I consistently have driver issues with one company's component, yet not with another's, then my sense of logic says to go with the other.  I really don't care that you didn't have issues; we obviously didn't have the same systems.

    If you regularly can't make AMD cards work with your system, something is wrong with your ability.

    You know what they say about people who make assumptions about people they know nothing about... I bow to the superior knowledge, foresight, and skill of the anonymous bulletin board poster.

  • fivoroth (Member, Uncommon) Posts: 3,916
    Personally, I will never buy an AMD processor or an AMD Radeon video card. If a PC doesn't have an Intel CPU and an Nvidia GPU, I am not buying it. Nvidia and Intel all the way. AMD is for poor people.

    Mission in life: Vanquish all MMORPG.com trolls - especially TESO, WOW and GW2 trolls.

  • Ridelynn (Member, Epic) Posts: 7,383


    Originally posted by Classicstar
    You should be worried as Nvidia fans: if AMD eventually loses the battle, you'll have only one choice then, and probably even more expensive cards with no competition. Or do you NVIDIA fanbois love stealing from your own WALLET? NVIDIA is the one that laughed in your faces for buying their way-too-expensive cards, lol. Bah, all you rabid fanbois, sad, real sad :(

    AMD does the same thing when they can get away with it. It's competition that brings down the price, not the fact that AMD is saintly and nVidia a bunch of greedy bastards.

    The 7970, when it launched, was $550. It stayed that way until nVidia finally got the 680 (released in early spring) out in sufficient numbers to hurt sales. Today, you can get the same GPU for $289, with free games and rebates on top of that.

    Competition helps everyone. The news here isn't that it beats or doesn't beat Titan in performance, or even that it draws 30W more power; it's that it offers roughly the same performance as Titan for $400 less - and that is a huge deal no matter how hot, noisy, or power-hungry it is. The fact that it's hot and noisy isn't what made nVidia's prices come down or forced the release of a 780 Ti; the price/performance ratio is what did that.
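
    To put rough numbers on that (the prices are this thread's approximate figures, and equal performance is an assumption made purely for the sake of the ratio), a quick Python sketch:

        # Perf-per-dollar with performance normalized to Titan = 1.0.
        cards = {
            "GTX Titan": {"price": 999.0, "perf": 1.0},
            "R9 290X":   {"price": 549.0, "perf": 1.0},  # ~Titan-class
        }
        for name, c in cards.items():
            per_k = c["perf"] / c["price"] * 1000   # performance per $1,000
            print(f"{name}: {per_k:.2f} per $1,000")
        # At these prices the 290X delivers ~1.8x the performance per dollar.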

    Regardless of whether you buy one or not, and for whatever reason you choose, Classicstar is right: the R9 290X has definitely helped improve the overall landscape of GPUs one way or another - much like the GTX 680 did when it finally got past its soft launch in summer 2012.

  • Ridelynn (Member, Epic) Posts: 7,383


    Originally posted by fivoroth
    Personally, I will never buy an AMD processor or an AMD Radeon video card. If a PC doesn't have an Intel CPU and an Nvidia GPU, I am not buying it. Nvidia and Intel all the way. AMD is for poor people.

    You can choose to do whatever you want to do - your choice. We can all choose to sit back and chuckle at you as well. It's your money.

    Not that all Intel/nVidia buys are poor choices, but if you are just going to put on blinders and buy strictly based on brand name, you're cutting out a lot of options, particularly some great price points outside of "poor people" territory.

  • fivoroth (Member, Uncommon) Posts: 3,916
    Originally posted by Ridelynn

     


    Originally posted by fivoroth
    Personally, I will never buy an AMD processor or an AMD Radeon video card. If a PC doesn't have an Intel CPU and an Nvidia GPU, I am not buying it. Nvidia and Intel all the way. AMD is for poor people.

     

    You can choose to do whatever you want to do - your choice. We can all choose to sit back and chuckle at you as well. It's your money.

    Not that all Intel/nVidia buys are poor choices, but if you are just going to put on blinders and buy strictly based on brand name, you're cutting out a lot of options, particularly some great price points outside of "poor people" territory.

    Intel/nVidia usually delivers superior performance to AMD. When I had a Radeon, I had several issues with different games (not a huge number, mind you), but I just never had those problems with Nvidia.

    Mission in life: Vanquish all MMORPG.com trolls - especially TESO, WOW and GW2 trolls.

  • RabidMouth (Member) Posts: 196
    Originally posted by fivoroth
    Originally posted by Ridelynn

     


    Originally posted by fivoroth
    Personally, I will never buy an AMD processor or an AMD Radeon video card. If a PC doesn't have an Intel CPU and an Nvidia GPU, I am not buying it. Nvidia and Intel all the way. AMD is for poor people.

     

    You can choose to do whatever you want to do - your choice. We can all choose to sit back and chuckle at you as well. It's your money.

    Not that all Intel/nVidia buys are poor choices, but if you are just going to put on blinders and buy strictly based on brand name, you're cutting out a lot of options, particularly some great price points outside of "poor people" territory.

    Intel/nVidia usually delivers superior performance to AMD. When I had a Radeon, I had several issues with different games (not a huge number, mind you), but I just never had those problems with Nvidia.

    Just curious, do you like all Apple products as well? You seem to think that if a product costs the most, it is obviously the best. Everything else is for "poor people."

    You can't reason someone out of a position they didn't reason themselves into.
