Sapphire Radeon HD 5850 Xtreme

Quizzical Member Legendary Posts: 25,355

What if you could get a Radeon HD 5850 for $150 or so, without rebates?  Great deal, right?  That's a good deal faster and cheaper than the competing Radeon HD 6850 and GeForce GTX 460 1 GB.  Well, it turns out that you can.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814102932

http://www.tigerdirect.com/applications/searchtools/item-details.asp?EdpNo=199805&SRCCODE=GOOGLEBASE&cm_mmc_o=VRqCjC7BBTkwCjCECjCE

http://www.amazon.com/Sapphire-Radeon-PCI-Express-Graphics-100282XTREME/dp/tech-data/B004W75ATI

That's been out for a week or so, and they're still around.  It's not just a US thing, either:

http://ncix.com/products/?sku=60714&vpn=11162%2D15%2D20G&manufacture=SAPPHIRE

http://www.overclockers.co.uk/showproduct.php?prodid=GX-259-SP

The weird thing about this is that they're all exactly the same SKU:  the Sapphire Radeon HD 5850 Xtreme.  This raises the obvious question of why this card should be so much cheaper than the competition.  Cypress is 334 mm^2, which is not a cheap die to build.  In particular, it's a lot more expensive than the Barts die of the Radeon HD 6850.  And why just Sapphire, and not any of AMD's other board partners?

There's also a Sapphire Radeon HD 5830 Xtreme.  But there notably isn't a Sapphire Radeon HD 5870 Xtreme, which is the top bin of Cypress.  So there is the question of how this ended up happening. AMD mostly discontinued their Cypress GPU chip last fall, when its replacements, Barts and Cayman, launched. The various SKUs of Radeon HD 5870, 5850, and 5830 have slowly been disappearing. At 334 mm^2, it's a large, expensive GPU chip, so it's not the sort of card that AMD could sell cheaply for years.



However, AMD can't discontinue Cypress entirely, as they still need it for FirePro (professional graphics) and FireStream (GPGPU/supercomputer) cards. While those will eventually be replaced by cards based on the Cayman GPU that AMD launched in their Radeon cards (Radeon HD 6970 and 6950) last December, releases in those markets are delayed. Gamers will accept moderately buggy drivers for a while in newly launched gaming cards, but in the professional graphics market, that's a non-starter. So AMD has to delay the launch of new FirePro and FireStream cards until they've worked out the driver issues, which takes several months after the Radeon cards are ready to launch. Thus, those cards still need Cypress GPU chips.



While the FirePro and FireStream cards do include salvage parts, they don't go that low. So AMD ends up with a bunch of chips that will make perfectly good gaming cards, but can't meet the requirements to go into FirePro or FireStream cards. What do they do with those chips?



My guess is that they're selling them to Sapphire, for use in their "Xtreme" Radeon HD 5850 and 5830 cards, since that's a new SKU that released just this month. It's possible that they cut a deal with Sapphire to take all of the spare GPU chips that don't meet any FireStream or FirePro bins off their hands cheaply. Meanwhile, Sapphire passes the discount on to consumers to make sure that they can get rid of the cards before they're thoroughly obsolete. They don't want to still have millions of Cypress GPU dies lying around once Southern Islands and Kepler are out. Ask Nvidia how that has worked out for them with the long-obsolete GT215, GT216, and G92b dies that they're still trying to get rid of.



Now, that's just a guess, so it's likely wrong. But it's the only thing I can think of that would explain the pricing. And the card is an excellent deal for someone who is looking for a new video card today, which is why I started this thread.

Comments

  • Shinami Member Uncommon Posts: 825

    Why not simply spend the $220-$250 needed to buy a Sapphire version of a 6950 and flash it to a 6970? For around $450-$500 you can have a CrossFire setup that matches a 6970 CrossFire's performance. That's what I did for my CrossFire system (which I use for game testing).

    Of course you can reply with "Some people just don't have $220," and I can retort with "sure...and they have plenty of money to buy their games each month as well as pay their cell phone bills...But they don't have enough money to actually upgrade their memory and video card to something decent."

    The fact that many are willing to buy a game today, then buy every DLC or expansion that comes out, while on top of it paying subscription fees and into cash shops, really points in that direction. If they don't have the money to upgrade and maintain a decent system, then I've got the quick fix for you...They are called "Consoles." ^_^

  • Catamount Member Posts: 773

    Originally posted by Shinami

    Why not simply spend the $220-$250 needed to buy a Sapphire version of a 6950 and flash it to a 6970? For around $450-$500 you can have a CrossFire setup that matches a 6970 CrossFire's performance. That's what I did for my CrossFire system (which I use for game testing).

    Of course you can reply with "Some people just don't have $220," and I can retort with "sure...and they have plenty of money to buy their games each month as well as pay their cell phone bills...But they don't have enough money to actually upgrade their memory and video card to something decent."

    The fact that many are willing to buy a game today, then buy every DLC or expansion that comes out, while on top of it paying subscription fees and into cash shops, really points in that direction. If they don't have the money to upgrade and maintain a decent system, then I've got the quick fix for you...They are called "Consoles." ^_^

    Hmm, so by that logic, why not simply spend the $2500 to get a pair of GeForce GTX 590s, a water cooling system to keep them at a reasonable temperature, a huge case to house them, and a 1500W PSU to power them?

    I mean, of course you can reply with "Some people just don't have $2500," and I can retort with "sure...and they have plenty of money to buy their games each month as well as pay their cell phone bills...But they don't have enough money to actually upgrade their memory and video card to something decent."

    The fact that many are willing to buy a game today, then buy every DLC or expansion that comes out, while on top of it paying subscription fees and into cash shops, really points in that direction. If they don't have the money to upgrade and maintain a decent system, then I've got the quick fix for you...They are called "Consoles." ^_^

    AMIRITE?!!1!LOL

     

    On a price note, only 2GB 6950s flash to 6970s, and those start at about $265. That's an 80%+ price hike over the $144 5850, but the 6970 is not 80% faster than a Radeon HD 5850; it's about 40-50% faster. This also assumes that the inferior memory on a 6950 will take the full overclock to 6970 speeds. Sometimes it does, but I'm not going to bank on it, not long-term.
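    To make the comparison concrete, here's a minimal sketch of that arithmetic in Python. The prices are the ones quoted above; the 1.40-1.50x speedup of a 6970 over a 5850 is my rough estimate, not a measured benchmark:

        # Rough price/performance check for the figures quoted above.
        # The 1.40-1.50x speedup of a 6970 over a 5850 is an estimate.
        price_5850 = 144        # Sapphire HD 5850 Xtreme
        price_6950_2gb = 265    # cheapest 2 GB 6950 that can flash to a 6970

        hike = price_6950_2gb / price_5850 - 1
        print(f"Price hike: {hike:.0%}")  # -> 84%

        # Dollars per unit of performance, with the 5850 normalized to 1.0:
        print(f"HD 5850 Xtreme: ${price_5850 / 1.0:.0f} per 5850-equivalent")
        for speedup in (1.40, 1.50):
            print(f"Flashed 6970 ({speedup:.2f}x): "
                  f"${price_6950_2gb / speedup:.0f} per 5850-equivalent")

    Even at the optimistic end of that range, the flashed 6970 costs about $177 per unit of 5850-level performance, against $144 for the 5850 itself.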

    It also goes without saying that you can put two of these 5850s in CrossFire as well, and at the resolutions most people play at, I highly doubt that you'd have trouble running the vast majority of games, if not all of them, on high graphical settings. Two of these would represent a vastly better value, in most cases, than a single 6950 (flashed or not). CrossFire scaling is really pretty good these days.

  • Quizzical Member Legendary Posts: 25,355

    Most people don't have unlimited budgets, and so "you can get something better if you pay more" isn't terribly useful information.  The relevant question is usually either, what's the best I can get on this budget, or else, what's the cheapest I can get this level of performance?  And the Sapphire Radeon HD 5850 Xtreme has a decent chance of being the answer in either case.

    The question I'm interested in is, why is the card so cheap?  Here are the Radeon HD 5850s on New Egg:

    http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600007320%20600007601&IsNodeId=1&name=Radeon%20HD%205850

    $140 with free shipping for the Sapphire Radeon HD 5850 with no rebate, and then $188 for the next cheapest 5850.  Taken in isolation, that could just mean that the others are overpriced.

    So let's look at comparable cards.  Here's the GeForce GTX 460 1 GB:

    http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600007323%20600062521%20600007801&IsNodeId=1&name=256-bit

    Excluding open box cards, the cheapest is $186 before rebate.  That's an extra $46 for a far inferior card, as it's substantially slower while using more power.

    Or consider the nearest AMD alternative, the Radeon HD 6850:

    http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600083901%20600083830&IsNodeId=1&name=Radeon%20HD%206850

    That starts at $170 before rebate.  The Radeon HD 6850 isn't just slower than the 5850; it's also cheaper to produce.  Barts is a much smaller die than Cypress, at about 3/4 of the size.  AMD understood the process node much better by the time they had to tape out the chip, so Barts probably has better yields.  The 6850 and 5850 are somewhat comparable salvage part bins, though the 6850 actually disables more of the chip.  The 6850 uses less power, so it can take a cheaper heatsink and power circuitry, making it a much cheaper card to produce.

    Indeed, a large reason why Barts exists at all is that the cards based on it are so much cheaper than those based on Cypress.  The savings are large enough to justify paying millions of dollars to design Barts.  So why is the Sapphire Radeon HD 5850 Xtreme so much cheaper than any 6850?
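    To see roughly how much the die size difference matters, here's a back-of-the-envelope sketch in Python, using the standard dies-per-wafer approximation and a simple Poisson yield model.  The wafer cost and defect density are pure placeholders (the real numbers aren't public), and Barts's area is taken as roughly 3/4 of Cypress's 334 mm^2, per the above:

        import math

        def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
            """Classic approximation: gross dies on a round wafer,
            with a correction for partial dies lost at the edge."""
            r = wafer_diameter_mm / 2
            return int(math.pi * r**2 / die_area_mm2
                       - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

        def yield_rate(die_area_mm2, defects_per_mm2):
            """Simple Poisson yield model: larger dies catch more defects."""
            return math.exp(-defects_per_mm2 * die_area_mm2)

        WAFER_COST = 5000.0     # placeholder 40 nm wafer price, not a real quote
        DEFECT_DENSITY = 0.002  # defects per mm^2, purely an assumption

        for name, area in [("Cypress", 334.0), ("Barts", 0.75 * 334.0)]:
            gross = dies_per_wafer(area)
            good = gross * yield_rate(area, DEFECT_DENSITY)
            print(f"{name}: {gross} dies/wafer, ~{good:.0f} good, "
                  f"~${WAFER_COST / good:.2f} per good die")

    With these made-up inputs, the smaller die comes out around 40% cheaper per good die, and that's before the cheaper heatsink and power circuitry widen the gap at the card level.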

    Furthermore, prices on the Sapphire Radeon HD 5850 Xtreme have been falling.  On New Egg, they started at $150, then dropped to $145, and are now $140.  That's not the sort of thing that happens if the low prices are a typo on their part, or a short-term Shell Shocker deal.  Rather, it's the sort of thing that happens if they have a huge overstock, to the degree that they might not otherwise be able to get rid of the cards in a timely manner.

  • Lawlie Member Posts: 49

    Kind of an off-topic question, but I have an NVIDIA GeForce 9800 GT... how bad is this card?

    How much better is the Sapphire Radeon HD 5850 compared to this one? Sorry, I'm a complete noob to all of this.

    Thanks.

  • Xero_Chance Member Posts: 519


    Originally posted by Lawlie
    Kind of an off-topic question, but I have an NVIDIA GeForce 9800 GT... how bad is this card?
    How much better is the Sapphire Radeon HD 5850 compared to this one? Sorry, I'm a complete noob to all of this.
    Thanks.

    Not bad, seeing as I have a Radeon HD 4600. It handles Assassin's Creed 2 and ME2 at full graphics flawlessly, and I'm pretty sure it could handle most new games at least on medium settings.

    Some of these new graphics cards and the 4+ core processors are just overkill. What's even funnier is when people overpay to install more RAM than their operating system can even use.

  • Quizzical Member Legendary Posts: 25,355

    A Radeon HD 5850 will typically offer around triple the graphical performance of a GeForce 9800 GT.  The GeForce 9800 GT would today be considered a budget gaming card, so it's not the sort of thing that you need to replace immediately in order to get games to run at all.  It was actually a pretty nice card when it launched in 2007 under the name GeForce 8800 GT.  Strictly speaking, there were two slightly different cards branded as a 9800 GT that performed the same, differing only in power consumption; one was a rebranded 8800 GT, and the other was later rebranded again as the GeForce GTS 240.

    If you're happy with the performance of your current video card, then keep it.  But if you're looking to upgrade, this card would be a good option to consider.

  • Ridelynn Member Epic Posts: 7,383

    The 9800 GT is still a decent card. I think most video games coming out today are generally aimed at people running video cards with about as much power as a 9800 GT, so it should run even modern games on medium settings, and many on high settings. It may not get MAX OMG settings on new titles, or support the latest DX11 effects, but it will still run great at single-monitor resolutions.

    If you're gaming at 1080p or lower, there aren't really a lot of compelling reasons to swap it out until you find a game where you can't keep the framerate up with the visual options you like. Sure, the newer cards are all faster, but when you're limited by your screen resolution (which the vast majority of gamers are), all that extra speed goes to waste until software catches up to find something to do with it.

  • reb007 Member Uncommon Posts: 613

    Quizzical, if I buy two of these HD 5850 Xtremes and set them up in Crossfire, will I be able to use dual monitors? (these cards only have a single DVI port)

     

    Also, can the 5850 Xtreme handle 1920x1080 at high settings? (my primary screen is 1920x1080)

  • Quizzical Member Legendary Posts: 25,355

    Performance should be the same as basically any other Radeon HD 5850.  It will run nearly any game on the market smoothly at high settings at a 1920x1080 monitor resolution.  For some games, you'll be able to max settings with no problems.  For others, you'll need to turn a handful of settings down a bit.

    If you want to run two monitors, then probably the simplest thing to do is to get an adapter.

    http://www.newegg.com/Product/Product.aspx?Item=N82E16812119274

    That will let you plug a DVI monitor into an HDMI port.  I'm not 100% certain that the two monitor cables won't physically block each other, though.  It would depend on how your monitor plugs are shaped.  Getting an HDMI cable for one of the monitors may be an option.

    If you get two cards in CrossFire, I'm not sure exactly how it works to connect multiple monitors.  For Eyefinity, I think it wants you to connect all of the monitors to a single card.  For two monitors, you're presumably not going to use Eyefinity, though.

    Depending on what your old card is, you might be able to keep it in your system, in addition to a new card, and plug extra monitors into the old card.  If you have Windows 7, it can have multiple video drivers installed simultaneously.

  • Ridelynn Member Epic Posts: 7,383

    The 5850 will actually run 3 monitors at the same time, and you can use any combination of 2 of the 3 ports. An HDMI->DVI adapter or a DP->DVI adapter will work for you to "add" a second DVI port.

    However, the DisplayPort on this card requires the more expensive Active adapter, from what I am hearing. This list helps a bit, but it's still kind of confusing:
    http://support.amd.com/us/eyefinity/Pages/eyefinity-dongles.aspx

    If you run Crossfire and want to use Eyefinity, however, you must hook all the monitors up to the same card. If you are just using the monitors as extra monitors (full screen gaming would still occur on a single monitor), then you can hook them up to either card and Crossfire will work fine.

    So, to answer the question a bit better:
    Yes, if you use Crossfire, you can use both DVI ports, but not if you plan on using Eyefinity across 3 monitors. An adapter would be a less expensive way to accomplish the same thing with a single video card, and you can use either an HDMI->DVI or a DP->DVI (active) adapter.

  • Travee Member Posts: 4

    Originally posted by Quizzical

    The question I'm interested in is, why is the card so cheap?

    To answer your question (as an HD 5850 Xtreme owner myself), I think Sapphire just wanted to get rid of a pile of cards they had lying around. There is nothing wrong with those cards.


  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Travee

    Originally posted by Quizzical
    The question I'm interested in is, why is the card so cheap?

    To answer your question (as an HD 5850 Xtreme owner myself), I think Sapphire just wanted to get rid of a pile of cards they had lying around. There is nothing wrong with those cards.

    Also, I read someplace (but I can't find a source, so it may not be true; Sapphire was an ATI partner before this anyway) that Sapphire took over ATI's retail arm (the ATI-branded cards) when AMD spun it off after the buyout. If that's the case, they may still have an inside track on ATI chips.

  • Quizzical Member Legendary Posts: 25,355

    Just don't spend too much on adapters, or else you lose the point of the card, which is that it's cheap.  If you'd have to spend $30 for a DisplayPort to DVI adapter, it might be better to just pick a card with two DVI ports.

    -----

    Sapphire couldn't have had a bunch of cards lying around and decided to get rid of them.  You don't do a new SKU, and hence a new design, and happen to already have the cards lying around.

    Nor would they likely have had a bunch of GPU dies lying around.  You don't order thousands of GPU dies unless you plan to put them in particular cards immediately and sell them.  Tech components lose value rather quickly, so if you're not going to use parts for a few months, you wait a few months to get them, as they might be cheaper by then.  Sapphire wouldn't be selling the cards at the price they are unless they were getting the dies cheaply.  If they only had a relative handful of cards to get rid of, they could charge $20 more for them, have them show up at $30 more at retail, and still be a pretty good deal, and plenty good enough to sell.

    AMD, on the other hand, might have had a bunch of GPU dies lying around.  When they run Cypress wafers, they don't get to pick how many of the dies can go in the top bin.  They try to make all of the dies good enough for the top bin, but TSMC just delivers whatever they can, and AMD makes the best of it.  The reason lower bins and salvage parts exist in the first place is that if, say, 30% of the dies meet the requirements of your top bin, maybe another 50% can still be useful as a lower bin, so you can sell 80% of the dies rather than only 30%.  Even if the lower bins have to be sold cheaper, you might get more than double the revenue this way.
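    As a quick sanity check on the "more than double the revenue" claim, here's the arithmetic in Python.  The 30%/50% bin split is from the paragraph above; the chip prices are made up purely for illustration:

        # Revenue per 100 dies, with and without a salvage bin.
        DIES = 100
        TOP_RATE, SALVAGE_RATE = 0.30, 0.50   # fraction of dies per bin
        TOP_PRICE, SALVAGE_PRICE = 120, 80    # hypothetical chip prices ($)

        top_only = DIES * TOP_RATE * TOP_PRICE
        with_salvage = top_only + DIES * SALVAGE_RATE * SALVAGE_PRICE

        print(f"Top bin only: ${top_only:,.0f}")            # $3,600
        print(f"With salvage bin: ${with_salvage:,.0f}")    # $7,600
        print(f"Ratio: {with_salvage / top_only:.2f}x")     # 2.11x

    With this bin split, as long as the salvage bin fetches more than 60% of the top-bin price, the revenue more than doubles.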

    That leaves the question of why AMD would have had a bunch of Cypress dies lying around, but not ones that fit the 5870 bin, and also how those dies made it to Sapphire.  I've taken my best guess at that in the initial post of this thread, but it's only a guess.

  • Quizzical Member Legendary Posts: 25,355

    Originally posted by Ridelynn

    Originally posted by Travee

    Originally posted by Quizzical

    The question I'm interested in is, why is the card so cheap?

    To answer your question (as an HD 5850 Xtreme owner myself), I think Sapphire just wanted to get rid of a pile of cards they had lying around. There is nothing wrong with those cards.

    Also, I read someplace (but I can't find a source, so it may not be true; Sapphire was an ATI partner before this anyway) that Sapphire took over ATI's retail arm (the ATI-branded cards) when AMD spun it off after the buyout. If that's the case, they may still have an inside track on ATI chips.

    Sapphire is AMD's closest board partner.  They manufacture the reference cards that end up being sold under the brand names of all of the other board partners, for example.  If there's a shortage of GPU dies, Sapphire is likely to get more than what an outside observer might think is their "fair share".  This happened in the early days of Cypress GPUs, for example, and the Sapphire version of the reference cards ended up with about as many reviews on New Egg as those of all of the other board partners combined.

    Incidentally, Zotac is basically a different division of the same company as Sapphire, and sells Nvidia cards.  Sapphire is based in Hong Kong, while ATI was in Canada.  Sapphire was also an ATI board partner long before the company was bought by AMD, so it can't be something that was spun off after AMD bought ATI.

    But being a close partner of AMD doesn't, in itself, explain why Sapphire would get a ton of GPU chips at a discounted price, though it does partially explain why, if AMD wanted to get rid of a bunch of chips cheaply, they might turn to Sapphire rather than another board partner.
