
AMD 290X beats TITAN for almost half the price!!!


Comments

  • RidelynnRidelynn Member EpicPosts: 7,383

    Not that I advocate going out and dropping this much money unless you just absolutely have to, but this is interesting news.

    Apparently 1 R9 290X is good, but 2 are even better. Yes, they are hot. Yes, the cooler sucks - that can be fixed, fortunately. But the Crossfire scaling using the new bridgeless configuration is very good, and performance in CFX is very nice - well beyond what even Titan can do in SLI, and especially notable at 4K.

    http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/1


    The GeForce GTX TITAN is the fastest single-GPU video card from NVIDIA that is available. The GeForce GTX 780 Ti is not yet available, performance is unknown, and we don't have one at this time. Even with the GTX 780 Ti, the TITAN pricing is not changing.

    As per the recent NVIDIA price drop announcement GeForce GTX TITAN pricing remains the same at $999. That makes GeForce GTX TITAN SLI cost $2,000. Compare that to AMD Radeon R9 290X CrossFire, at $1,100. Now go back through this review and watch which configuration provides the highest performance and the best gameplay experience. The Radeon R9 290X CrossFire configuration owned GTX TITAN SLI. There wasn't one situation where R9 290X CrossFire wasn't faster. It didn't just ride the line with TITAN SLI, it leaped well beyond GTX TITAN SLI performance at 5760x1200 on three displays and on a 4K display.

    With the price drop to $499 on the GeForce GTX 780 it is very price competitive with the AMD Radeon R9 290X at $549. When configuring SLI and CrossFire, the difference is only $100 though. GTX 780 SLI will be $1,000, while R9 290X CrossFire will be $1,100. For only $100 more, you get gigantic gains in performance from R9 290X CrossFire over GTX 780 SLI.

    We saw from 20-40% performance advantages with R9 290X CrossFire over GTX 780 SLI. These are real, large increases, in performance. That extra money spent, is more than going into more performance delivered and a better gameplay experience.

    We are simply impressed with the dominance that Radeon R9 290X CrossFire is showing over the GTX TITAN SLI and GTX 780 SLI.


    And for the people who never, ever have any trouble at all with nVidia:



    In Metro: Last Light we were able to run at the highest in-game settings at 3840x2160 on AMD Radeon R9 290X CrossFire, but not the other cards. It is possible this game has a bug on TITAN SLI and GTX 780 SLI, as we did not see the full SLI indicator bar fully green, as we have in other games. There could be an inefficiency in SLI performance at 4K in this game on those cards.

    The fact remains though, R9 290X CrossFire does not have this issue. Radeon R9 290X CrossFire is 78% faster than GTX TITAN SLI in its current state in this game at 4K. Radeon R9 290X CrossFire is 94% faster than GTX 780 SLI with this current performance. We tried many things to try and make TITAN SLI and GTX 780 SLI better, but we got this same constant result.


    78% faster. Than something that costs nearly twice as much. Sure, they use more power, but you can see that power is going toward something at least. If you OCed Titans to match those numbers (which you might manage on some of the benchmarks, but definitely not Metro: Last Light), you'd probably end up not far off the power mark of the R9 290 anyway. The 480, as hot and loud as it ran, never came anywhere close to those kinds of numbers compared to the 5970 - it burned more power to put out less performance for the most part.
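
    Rough back-of-the-envelope, purely to illustrate the arithmetic (prices are the ones quoted above; the 30% figure is just the mid-point of HardOCP's 20-40% claim, not a measurement):

        # Price vs performance, using the figures from the quoted review
        price_780_sli   = 2 * 499   # $998
        price_290x_cfx  = 2 * 549   # $1,098
        price_titan_sli = 2 * 999   # $1,998

        perf_gain_vs_780_sli = 0.30                      # mid-point of the quoted 20-40%
        extra_cost = price_290x_cfx / price_780_sli - 1  # ~10% more money

        print(f"~{perf_gain_vs_780_sli:.0%} more performance for ~{extra_cost:.0%} more money vs 780 SLI")
        print(f"290X CrossFire costs {price_290x_cfx / price_titan_sli:.0%} of TITAN SLI, and is still faster")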

    I'm coming across as a bit of an AMD fanboy - I'm really not, but I feel like I should defend AMD a bit against all the green fanboys out there. I'm for whichever solution works the best. I really like Kepler; I very nearly bought one irrationally because it is a great architecture. But for some reason nVidia has you guys brainwashed into thinking they can do no wrong (every single time I see "I have never had problems with nVidia..." I think of all the blown-up cards I've seen over the years), and that's not the case. Both have their good and bad, and always have.

    AMD did a good job with the 290X. They can continue to improve on it if they look at power efficiency, so hopefully the R10 or whatever they end up calling it will be even better. I thought the same thing about the 580 - it was a very good improvement over the 480 - and then nVidia totally turned it all around with the 680, and the 680 was very impressive (if they could have actually shipped it rather than soft-launching it, it would have been even better).

    Today, if I were going strictly for performance, it would be the 290X, especially in Crossfire. But not many people go strictly for performance, and not many people can afford even a single 290X, so it's mostly just a thought experiment.

    I am interested to see how the 780 Ti comes into play though - it will have to be pretty impressive to warrant its price tag right now.

  • RidelynnRidelynn Member EpicPosts: 7,383


    Originally posted by fivoroth
    Intel/nVidia usually delivers superior performance to AMD. When I had a Radeon I had several issues with different games (not a huge number mind you) but I just never had these problems with nvidia.

    A Lamborghini/Ferrari usually delivers superior performance to a Chevrolet. When I had a Chevrolet, I had to take it into the shop to have the alignment done (not a huge number of times, mind you), but I just never had those problems with my Ferrari.

  • miguksarammiguksaram Member UncommonPosts: 835
    Originally posted by Ridelynn

     


    Originally posted by fivoroth
    Intel/nVidia usually delivers superior performance to AMD. When I had a Radeon I had several issues with different games (not a huge number mind you) but I just never had these problems with nvidia.

     

    A Lamborghini/Ferrari usually delivers superior performance to a Chevrolet. When I had a Chevrolet, I had to take it into the shop to have the alignment done (not a huge number of times, mind you), but I just never had those problems with my Ferrari.

    ROFLMAO....that was funny.

     

    Very impressive numbers with the Xfire setup though.  That kind of performance would almost certainly require custom liquid cooling in order to not turn the room you were using it in into a furnace.  And once you start to factor in those costs, the difference in price (not performance) begins to narrow quite a bit.  But there is no denying that, right now, the Radeon R9 290X is the most powerful card you can purchase for the money.

  • RidelynnRidelynn Member EpicPosts: 7,383

    I won't really argue with you, the stock coolers are crap - but even replacing them with a premium aftermarket cooler, you're talking ~$150. That puts it in the same ballpark as the 780 Ti, and still a long way from a Titan though. The cards are only about 30W apart - that's a good deal of power, but when you are in the 200W+ range, it's not that big by comparison.

    But a stock cooler is a stock cooler, and no one should expect much out of a reference cooler; after all, we don't sit around and complain that the i5 4670K can't overclock well using the stock cooler from Intel...

    That being said, the HardOCP benchmarks are all run using the stock cooler, on a non-custom, readily available motherboard. There's nothing special or peculiar in their setup (although I wonder whether they left it open-air on the bench or actually enclosed it in a case; they aren't terribly clear on that, and it could make a difference).

    nVidia does have very good reference coolers; but even with that, they aren't the most popular models. You still see plenty of non-reference coolers that are very popular. I would go so far as to say the biggest reason nVidia has such good reference coolers today is because of the 480 debacle... so maybe AMD will learn something this time around.

    If AMD does something boneheaded like blocking non-reference coolers, or delaying them significantly (say, past Black Friday/Cyber Monday - just to stick a reference point in there, with no basis other than my feeling that a month is probably long enough), then that's definitely a point of criticism that I will share.

  • miguksarammiguksaram Member UncommonPosts: 835

    Well, unless the rumors from the major sites are unfounded, AMD is indeed delaying the release of 3rd-party non-reference models for at least a couple of months.  I hope, the same as you do, that this is false information, but I've not seen anything substantial to dispute it.

     

    EDIT:  I'll preface this next statement with the fact that it refers to a niche market within a niche market, but one that is nevertheless becoming increasingly popular in recent years.  When considering an SFF build, thermals are of utmost importance, as your ability to cool those systems is very limited compared to standard ATX or larger form factors.  Outside of a few popular cases that don't really fit the normal mold of SFF (i.e. much larger than they should be, such as the original BitFenix Prodigy), most will want to invest in a reference cooler or rear-exhaust blower design (assuming liquid cooling is just not an option).  In that market the quality of nVidia's Titan-based reference cooler (now found in a number of other GPUs) is second to none.

    The above is not in any way, shape, or form meant to be a reflection of fanboyism, as I too go where my dollar stretches furthest. With that said, I also place effective cooling and acoustics pretty high up on my priority list, as opposed to raw power, because I'm a family man with young children, and after my experience with ATI HD 4870s in Xfire (housed in a Coolermaster Cosmos 1000 case, which had PLENTY of airflow) I learned just how hot/loud a room can get when those two features aren't taken into consideration.  I'll leave the wife aggro that caused out of this thread.

  • RidelynnRidelynn Member EpicPosts: 7,383


    Originally posted by miguksaram
    That kind of performance would almost certainly require custom liquid cooling in order to not turn the room you were using it in into a furnace.


    Also, just a point of clarification, since I appear to be on a mission in this thread... I know you are just speaking tongue-in-cheek here, but there are a lot of other people who are quite misled about thermodynamics.

    It wouldn't matter if you used the stock cooler, or custom water cooling, or liquid nitrogen to cool the cards, it would heat up the room the same amount. 250W TDP is 250W, no matter how you dissipate it into the atmosphere (presumably, the room in this instance).

    The difference would be in the temperature you could keep the card, and how much noise the cooler will make in maintaining that temperature.
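
    To put a rough number on it (an assumed room size, no AC, and no heat escaping - so an upper bound, not a prediction):

        # How fast ~500W of dissipated heat (two ~250W cards at full load) warms the
        # air in a sealed room, no matter which cooler moves the heat off the die.
        room_volume_m3 = 40                    # assumed ~4m x 4m x 2.5m room
        air_mass_kg    = room_volume_m3 * 1.2  # air density ~1.2 kg/m^3
        cp_air         = 1005                  # J/(kg*K), specific heat of air
        heat_watts     = 500

        deg_per_hour = heat_watts * 3600 / (air_mass_kg * cp_air)
        print(f"~{deg_per_hour:.0f} C per hour rise if none of the heat escapes")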

  • miguksarammiguksaram Member UncommonPosts: 835

    You are of course correct and sometimes I forget my audience when I've had a drink or 5.  Thank you for pointing that out.

     

    EDIT: Totally off-topic for a moment, but something this thread reminded me of for whatever reason: if you own an EVGA GTX 770 and purchased it within the last 90 days (just prior to the recent price drop), you can opt to upgrade it to an EVGA GTX 780 for as little as $15 plus S&H.  Picking up a GTX 780 for around $430 total is about as good as it gets.  The random crap that comes to mind when drinking Honey Jack is absurd, btw.

    http://www.evga.com/support/stepup/

  • QuizzicalQuizzical Member LegendaryPosts: 25,348
    Originally posted by Ridelynn

     


    Originally posted by miguksaram
    That kind of performance would almost certainly require custom liquid cooling in order to not turn the room you were using it in into a furnace.

     


    Also, just a point of clarification, since I appear to be on a mission in this thread... I know you are just speaking tongue-in-cheek here, but there are a lot of other people who are quite misled about thermodynamics.

    It wouldn't matter if you used the stock cooler, or custom water cooling, or liquid nitrogen to cool the cards, it would heat up the room the same amount. 250W TDP is 250W, no matter how you dissipate it into the atmosphere (presumably, the room in this instance).

    The difference would be in the temperature you could keep the card, and how much noise the cooler will make in maintaining that temperature.

    While that's pretty close to true, video cards often have more leakage at higher temperatures.  The difference between a premium cooler and a bad one can sometimes reduce power consumption by several percentage points.  One test on a GTX 480 a while ago found that running the same exact card under different setups to change the temperature (higher fan speed and/or cooler room, I don't recall exactly) changed the power consumption by as much as 10%.  It's not an enormous difference in room temperature, but when you're comparing energy efficiency of cards (in performance per watt) that only differ by several percentage points to begin with, it matters.
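
    A quick made-up example of why that matters when comparing efficiency (the numbers below are illustrative, not measurements):

        # Same card, same performance; a hot-running sample draws ~10% more power
        # from leakage, which is bigger than a typical few-percent gap between rivals.
        perf = 100                                 # arbitrary units, held constant
        power = {
            "Card A, good cooler": 250,            # W
            "Card A, bad cooler":  250 * 1.10,     # ~10% extra draw when running hot
            "Card B (rival)":      260,            # ~4% less efficient on paper
        }
        for name, watts in power.items():
            print(f"{name:22s} {perf / watts:.3f} perf per watt")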

  • QuizzicalQuizzical Member LegendaryPosts: 25,348
    Originally posted by miguksaram

    Well, unless the rumors from the major sites are unfounded, AMD is indeed delaying the release of 3rd-party non-reference models for at least a couple of months.  I hope, the same as you do, that this is false information, but I've not seen anything substantial to dispute it.

    I find it wildly implausible that AMD would artificially delay non-reference cards longer than necessary.  What is highly plausible, however, is that they wanted to get the R9 290X out as soon as possible so there's going to be a shortage for a while, so that board partners didn't have much time to start working on custom designs before launch.  What is also quite plausible is that custom designs will take somewhat longer than normal because of some new hardware quirks of the card, such as the new CrossFire engine built in hardware to use the PCI Express bus rather than a CrossFire bridge or being the first video card ever to sport a 512-bit GDDR5 memory bus.  (Intel has already done so with the Xeon Phi, but that doesn't exactly have non-reference designs with premium coolers.)

  • RzepRzep Member UncommonPosts: 767

    The R9 290X is crap at OC, especially until other companies release cards with better cooling.

    A GTX 780 OC can surpass Titan.

    GTX 780: price cut, better cooling, best choice.

    Still, I am happy that I don't have to upgrade yet. My super-OC'd 580 runs BF4 just fine, and all I had to do was turn AA down to 2x and turn off post AA.

  • RidelynnRidelynn Member EpicPosts: 7,383


    Originally posted by Quizzical
    I find it wildly implausible that AMD would artificially delay non-reference cards longer than necessary.  What is highly plausible, however, is that they wanted to get the R9 290X out as soon as possible so there's going to be a shortage for a while, so that board partners didn't have much time to start working on custom designs before launch.  What is also quite plausible is that custom designs will take somewhat longer than normal because of some new hardware quirks of the card, such as the new CrossFire engine built in hardware to use the PCI Express bus rather than a CrossFire bridge or being the first video card ever to sport a 512-bit GDDR5 memory bus.  (Intel has already done so with the Xeon Phi, but that doesn't exactly have non-reference designs with premium coolers.)

    Well, the problem with that theory is that AMD doesn't make any of their own cards, they just provide the GPUs and a reference design. It's entirely up to the OEM to build the PCB and put the card together, including the cooler.

    If an OEM wants to deviate from the reference PCB, there will be design issues, but most don't. AMD hands out a reference PCB design with the GPU and a basic firmware file - it's up to the OEM to spec out all the parts to put on the PCB (aside from the GPU), assemble them, package them, and ultimately sell them.

    So... the OEMs have the GPUs, even if there were a shortage it wouldn't preclude them from putting on non-reference cooling.

    Now, you may have a point in that AMD didn't publish the reference PCB layout until late, so OEMs couldn't get aftermarket fan designs into production in time... but that doesn't really hold water either, because they are getting these reference fans from somewhere (AMD isn't making them). If there was time to provide for manufacture of a reference fan, there was time to do something better....

    The reference PCB has mounting bracket/holes for the cooler - it's not like you have to entirely redesign the PCB for a new cooler, you just need to tweak your existing cooler to fit the new PCB (make sure the VRMs are cooled, etc). It's like every new motherboard that comes out - sure a new socket may change the mounting for the heatsinks on the CPU a bit, but it's extremely rare to see a new motherboard come along that requires an entirely new heatsink design.

  • RidelynnRidelynn Member EpicPosts: 7,383


    Originally posted by Rzep
    R9 290x is crap at OC, especially till other companies release cards with better cooling. gtx 780 OC can surpass titan. gtx 780 price cut, better cooling, best choice.

    You are obviously oblivious to how Boost and PowerTune work. The 290X OCs pretty well, and all you need to get it to OC well is to keep it cool. It just doesn't OC like older generations did; now you have to play with thermal and power envelopes, not just clock speeds. nVidia and AMD work pretty similarly in this regard - the 780 is exactly the same way, it just has a better cooler to start with.

    Can a 780 beat Titan if you OC it? Sure. But you're trying to compare an overclock to a stock clock, and that's a false comparison - you could always OC the Titan as well.
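
    For anyone unclear on what "play with thermal and power envelopes" means, here's a toy sketch of the idea - not AMD's or nVidia's actual algorithm, just the general shape of it:

        # Toy model: the card runs at its boost clock until it hits a temperature
        # or power cap, then falls back toward base clock. Raising the caps (or
        # fitting a better cooler) is most of what "overclocking" means now.
        def settled_clock(base_mhz, boost_mhz, temp_c, power_w, temp_cap_c, power_cap_w):
            if temp_c >= temp_cap_c or power_w >= power_cap_w:
                return base_mhz   # throttled back
            return boost_mhz      # boost sustained

        # Reference cooler, stock caps: hot card, throttles.
        print(settled_clock(1000, 1100, temp_c=95, power_w=250, temp_cap_c=95, power_cap_w=250))
        # Better cooler, raised power cap: the same silicon sustains its boost clock.
        print(settled_clock(1000, 1100, temp_c=75, power_w=270, temp_cap_c=95, power_cap_w=300))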

  • QuizzicalQuizzical Member LegendaryPosts: 25,348

    OEMs that make custom cards don't ordinarily use the reference PCB and just stick their own custom cooler on it.  I don't think AMD and Nvidia even sell reference PCBs without a cooler attached.

    Look at the Radeon R9 280X, for example.  Newegg has 12 of them:  6 with a black PCB, 5 with a blue PCB, and 1 with a red PCB.  Now, the color of the PCB isn't important for its own sake, but it's the sort of thing that wouldn't vary if they were all using a reference PCB.

    And it's not like all of the PCBs of a given color are identical, either.  Three of the cards have a backplate that prevents you from getting a good look at the PCB, but among the other nine cards, the only pair that has PCBs that aren't obviously different from each other are two that are both made by HIS.  I count four obviously different monitor port combinations, too, and that's limited by the fact that there are only so many combinations of ports that aren't obviously stupid.

  • thinktank001thinktank001 Member UncommonPosts: 2,144

    290 is being priced at $400.    Does that mean 780s are dropping another $100?

  • BoudewijnsBoudewijns Member UncommonPosts: 162
    I don't care if AMD is faster and cheaper. I bought an AMD card once and had to buy a new one a few months later because the card was messed up; for me, it's NVIDIA all the way.



  • QuizzicalQuizzical Member LegendaryPosts: 25,348
    The Radeon R9 290 is a lot like the Radeon R9 290X:  fast, hot, and loud.  I wouldn't want one because it's hot and loud, but the cards with aftermarket coolers could be quite nifty and help force prices down.
  • RidelynnRidelynn Member EpicPosts: 7,383

    It's interesting reading the reviews on this card.

    Anandtech basically says "nice try" and "do not buy" because it's too loud.

    HardOCP gives it a Gold award, and tells you to run, not walk, to buy one (and make sure you get an aftermarket cooler when they become available).

    My takeaway: yeah, the cards run hot and loud. Anand did an interesting comparison of power versus the GTX 480, and it was pretty similar; they were neck and neck for everything except the runaway cases, where PowerTune clamped the 290 and nVidia didn't have anything at the time to catch TDP.

    Anand is hung up on the fact that it's loud - and that's a function of the crappy reference cooler. They should be hung up on the fact that it uses a lot of power, but they aren't really talking about that; instead they are showing graphs of "Performance/Noise ratio". Noise is important, but it has to be taken in context; it's a function of the cooler. Time to strike Ryan Smith off my list of credible reviewers.

    HardOCP, in their conclusion page, didn't even mention the word "noise" or "loud". They just make a one-line mention that the stock cooler is inefficient, and move on to show what performance the chip can deliver and how much it costs in relation to the performance of other chips. That is the proper way to look at this, in my opinion.

    There are some very accurate comparisons with the 480. Both chips are power hungry, and I derided the 480 because of it yet seem to be defending the 290 against the same charge. However, the big difference is that the 480 cost $500 at retail and was only marginally faster than the 5870 (which retailed at $400). The 290/290X family is similarly faster than Titan, but significantly cheaper. The 290, which often meets Titan levels of performance and consistently beats the 780, is roughly 40% of the price of Titan and 4/5 the price of the 780.

    The point here isn't the noise, or even the power. Neither is good, but the noise can be fixed and the power managed. The big news story here is the price. It almost doesn't even matter what the 780 Ti benchmarks at; it would have to be so much faster than the 290X to command that much of a price premium that I don't know it can manage it. Right now it's slated at almost double the 290's price, and I'd be really surprised if we see something that's even 20% faster.
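
    Quick sanity check on that, using ~$400 for the 290 (the price mentioned earlier in this thread) and ~$700 for the 780 Ti ("almost double", per the rumor) - both rough figures, not quotes:

        # What the 780 Ti would have to deliver just to match the 290's
        # performance per dollar at those rough prices.
        price_290   = 400   # $, rough
        price_780ti = 700   # $, rough
        needed = price_780ti / price_290 - 1
        print(f"780 Ti would need to be ~{needed:.0%} faster than the 290 to break even on perf/$")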

    The latest rumor going around: AMD is intentionally holding back non-reference coolers until the 780 Ti is released, so that they have something extra to bring to the table - additional performance from better-than-reference coolers, plus OEMs shipping factory OC firmware.

  • ClassicstarClassicstar Member UncommonPosts: 2,697

    If I go to a lot of sites and look at how most people react, it's that the nVidia boys just want to pay more to have an nVidia in their rig and then tell everyone that they have an nVidia.

    The 290 is going to cost around the 350-400 euro mark, a 780 Ti around the 650 euro mark, maybe even more.

    If ASUS, MSI, Sapphire and all the others come out with their versions and better coolers, you're either robbing your own pocket so you can brag "I have nVidia", or paying a lot less for similar or even faster cards.

    I really don't care if it's nVidia or AMD; I look at price and performance (with maybe a little sympathy for AMD as the underdog).

    Btw, looking to the future, the 290s perform very well in 4K benchmarks.

    AMD 290 at 350 euros or 780 Ti at 650 euros?

    They both run today's games extremely well.

    Your choice.


  • NephelaiNephelai Member UncommonPosts: 185
    Originally posted by Ridelynn

     


    Originally posted by Nephelai
    I'm not biased one way or the other as I like competition driving prices down however you do realise that this card is 8 MONTHS after the Titan was released? With such poor thermals, noise etc.

     

     

    Don't be surprised if its reign is short - on the positive side it should keep the price of Nvidia's next card lower.


     


    And the 7970 (r9 280) was out how much earlier than Titan? Well over a year if I recall correctly, and out months before the 680/770 (especially if you consider the availability of the 680 early on, it was practically non-existent until mid-summer).

    The card uses a lot of power, the cooler has poor thermals. It concerns me, but not nearly as much as the nVidia 480 did (which is probably the closest fair comparison in terms of TDP) - because of PowerTune. The 480 had no way to control power or temperature if it started to run away, other than a static temperature switch (which only hits after the incident). PowerTune (and to be fair, nVidia's Boost will do it as well, but the 480 didn't have Boost) can catch it before it's a run-away problem - it may run hot, but it'll never run overly hot or exceed it's TDP like the 480 could.

    Your timeline and sequence of events is a bit mixed up; however, as expected, the reign didn't last long - two weeks:

     

    http://www.anandtech.com/show/7492/the-geforce-gtx-780-ti-review

     

     

  • RabidMouthRabidMouth Member Posts: 196
    Originally posted by Nephelai
    Originally posted by Ridelynn

     


    Originally posted by Nephelai
    I'm not biased one way or the other as I like competition driving prices down however you do realise that this card is 8 MONTHS after the Titan was released? With such poor thermals, noise etc.

     

     

    Don't be surprised if its reign is short - on the positive side it should keep the price of Nvidia's next card lower.


     


    And the 7970 (r9 280) was out how much earlier than Titan? Well over a year if I recall correctly, and out months before the 680/770 (especially if you consider the availability of the 680 early on, it was practically non-existent until mid-summer).

    The card uses a lot of power, the cooler has poor thermals. It concerns me, but not nearly as much as the nVidia 480 did (which is probably the closest fair comparison in terms of TDP) - because of PowerTune. The 480 had no way to control power or temperature if it started to run away, other than a static temperature switch (which only hits after the incident). PowerTune (and to be fair, nVidia's Boost will do it as well, but the 480 didn't have Boost) can catch it before it's a run-away problem - it may run hot, but it'll never run overly hot or exceed it's TDP like the 480 could.

    Your timeline and sequence of events is a bit mixed up; however, as expected, the reign didn't last long - two weeks:

     

    http://www.anandtech.com/show/7492/the-geforce-gtx-780-ti-review

     

     

    Not sure you can justify the marginal performance boost for the price they are asking. Classic nvidia imo.


  • RidelynnRidelynn Member EpicPosts: 7,383

    AMD 7970 release: Jan 9, 2012
    nVidia 680 release: March 22, 2012 (with availability issues well into the summer)
    AMD 7970 Ghz re-release: June 22, 2012
    nVidia Titan release: Feb 21, 2013
    nVidia 780 release date: May 23, 2013
    AMD R9 290X release date: October 24, 2013
    nVidia 780Ti release date: November 7, 2013

    In case my dates are a little mixed up, those are the official release dates. 7970 -> Titan: about 1 year and 1.5 months. 7970 -> 290X: about 1 year and 9.5 months.

    You could call the 7970 GHz edition a new card, but it was just an official bump in the clock speed (to what many first-party providers were doing anyway); the chip didn't change at all -- similar to the 680 -> 770 (which I didn't even list because it doesn't seem too pertinent; nVidia brought out the new 780 alongside that rebadge, whereas AMD didn't have new silicon). I provided the date of that in the list as a re-release.
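
    If anyone wants to check the gaps, the standard library does the month math for you (dates as listed above):

        from datetime import date

        releases = {
            "7970":    date(2012, 1, 9),
            "TITAN":   date(2013, 2, 21),
            "R9 290X": date(2013, 10, 24),
            "780 Ti":  date(2013, 11, 7),
        }
        print((releases["TITAN"] - releases["7970"]).days, "days, 7970 -> TITAN")        # 409
        print((releases["R9 290X"] - releases["7970"]).days, "days, 7970 -> 290X")       # 654
        print((releases["780 Ti"] - releases["R9 290X"]).days, "days, 290X -> 780 Ti")   # 14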

  • ClassicstarClassicstar Member UncommonPosts: 2,697


    Originally posted by CowboyHat

    Originally posted by Nephelai
    Your timeline and sequence of events is a bit mixed up; however, as expected, the reign didn't last long - two weeks:
    http://www.anandtech.com/show/7492/the-geforce-gtx-780-ti-review

    Not sure you can justify the marginal performance boost for the price they are asking. Classic nvidia imo.

    Dubious review by Anand again; it smells of nVidia favoritism, he doesn't even mention the 4GB VRAM advantage, and he claims the CrossFire vs SLI battle is a tie, HUH?

    He also uses old games for this test, which also favor nVidia.

    And tests things nobody uses.

    The 290X/290 kick the 780 Ti's ass in the CrossFire vs SLI battle and at 4K resolution, and sometimes even at a normal single-GPU 2560-wide resolution - and that from a card that's $200 cheaper.

    The only big disadvantages are noise and heat, but once the cards come with two fans and noise reduction, it's a no-brainer which one to buy: the overpriced one or the future-proof, cheaper one.

    I really don't see the big fuss about this new 780 Ti; it costs a lot and isn't really a lot faster overall.


  • SiveriaSiveria Member UncommonPosts: 1,419
    Last I checked it took 4 of these ATI cards to match the power of 2 Titans, so I don't think it beats it at all. It doesn't even match the Titan; it takes 2 of these ATI cards to just barely be better than 1 Titan, so I don't see how it's better for half the price if it takes 2 to match the nVidia card.


  • KazuhiroKazuhiro Member UncommonPosts: 607

    I used to use ATI cards, from the x9700xtx to the 1680; then I got a 670 nVidia card, as it was the best at the time, and I'm not sure I can ever go back to ATI cards now (AMD now, technically). ATI may be cheaper, much cheaper in fact, but you're basically getting a knockoff GPU. I can't even begin to describe how much better Nvidia drivers are, or the fact that they just run with fewer issues.

    I remember that for Skyrim I bought a top-of-the-line ATI card for the game, and had so many damn issues with it, from frame-skipping to stutters and worse. Ended up getting an Nvidia card some time later that was only marginally more powerful, and the game ran like a dream. (And Skyrim is an ATI/AMD-advertised game, fyi.) In short, it seems like you buy an ATI/AMD card when you can't afford an NVIDIA card.

    The ATI/AMD card will always have some major drawback to balance out its lower price. (In the case of the card you're mentioning, it will melt a hole through your PC, through your desk, then through the floor, and then finally encase itself in molten stone beneath the earth.) So in the end, I just look at ATI/AMD cards as cheaper/lower-tier GPUs.


  • SmikisSmikis Member UncommonPosts: 1,045
    Originally posted by miguksaram

    I'm just going to leave this right here for those that have an open enough mind to accept a different perspective on the card from a well respected review site.

    http://www.overclock3d.net/reviews/gpu_displays/amd_r9_290x_review/1

    If you have the time and care about this card or it's potential impact on the market I suggest reading the written review and watching the associated video.

    I follow all the sites listed in this thread and find the majority of them to be decent but none of them are as up front as TTL IMHO.

    A reviewer who claims that the 290X won't overclock, when it easily does a 15%+ OC, which in turn outperforms an overclocked 780 Ti. I've never had ATI cards and have 2 GeForce cards in SLI, so no bias there - just a bullshit review.
