

Why Juniper (Radeon HD 5750/5770/6750/6770) is the most impressive GPU chip in a long time

Quizzical Member Legendary Posts: 22,130

When the Radeon HD 5770 launched in October 2009, it offered performance about on par with a Radeon HD 4870 or GeForce GTX 260, but cost substantially more than either card.  That didn't last long, as the older cards were basically on clearance and soon disappeared, leaving the Radeon HD 5770 unrivaled as the mid-range gaming card of choice.

Lots of video cards have dominated a market segment for a period of time over the years.  What makes Juniper unique is that it held this position for over two years, an eternity in a world ruled by Moore's Law.  From around the start of 2010 all the way until Cape Verde launched in February 2012, if you wanted a video card in Juniper's market segment, either you bought a Juniper-based card or else you overpaid for inferior hardware.

And this was true almost regardless of your gaming preferences, apart from rabid Nvidia fanboydom.  If you cared greatly about power consumption, Juniper was your best bet.  If you didn't care about power consumption, Juniper was still the best.  If you wanted modern API compatibility, Juniper had it.  If you didn't care about DirectX 11 or later OpenGL 4, Juniper was better at the older APIs than competing older cards.  If you needed several monitors, Juniper could do it.  If you only wanted one, you still couldn't do better.

It doesn't matter if you defined Juniper's market segment by retail price, performance, power consumption, or die size.  It doesn't matter if your metric of greatness was performance per dollar, performance per watt, or performance per mm^2.  Juniper was the best in its market segment no matter which metrics you chose.
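To make those comparisons concrete, here is a minimal sketch (Python, using purely hypothetical placeholder numbers rather than real price, power, performance, or die-size figures for any actual card) of how the three efficiency metrics are computed.  Juniper's position amounted to winning all three columns at once in its segment.

    # Minimal sketch of the three efficiency metrics discussed above.
    # All numbers are hypothetical placeholders, not real figures for any card.
    cards = {
        "Card A": {"perf": 100.0, "price_usd": 160.0, "tdp_w": 110.0, "die_mm2": 170.0},
        "Card B": {"perf": 100.0, "price_usd": 150.0, "tdp_w": 120.0, "die_mm2": 240.0},
    }

    for name, c in cards.items():
        print(f"{name}: "
              f"{c['perf'] / c['price_usd']:.2f} perf/$, "
              f"{c['perf'] / c['tdp_w']:.2f} perf/W, "
              f"{c['perf'] / c['die_mm2']:.3f} perf/mm^2")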

Part of Juniper's longevity stems from the long gap before a new process node arrived.  If you can do a die shrink, even by a half node, it's pretty easy to make a new card that beats your old one: a straight die shrink without any architecture improvements already brings easy performance per watt and performance per mm^2 gains.  Juniper was made on a 40 nm process, and TSMC then cancelled its 32 nm process, so no process node improvement could come until TSMC's 28 nm node was ready.
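For a rough sense of why a die shrink makes this so easy, here is a back-of-the-envelope sketch (Python) of ideal area scaling between nodes.  Real processes never scale this cleanly, so treat the result as an upper bound rather than a measurement.

    # Back-of-the-envelope: ideal scaling from a die shrink.
    # Under ideal scaling, linear dimensions shrink by new_node / old_node,
    # so area per transistor shrinks by the square of that ratio.
    old_node_nm = 40.0
    new_node_nm = 28.0

    linear_ratio = new_node_nm / old_node_nm   # 0.70
    area_ratio = linear_ratio ** 2             # ~0.49

    print(f"Ideal area per transistor: {area_ratio:.2f}x the old size")
    print(f"Ideal transistor density:  {1 / area_ratio:.2f}x the old density")

Even under these idealized assumptions, a full jump from 40 nm to 28 nm roughly doubles transistor density, so with no such jump available for over two years, there was no natural successor to displace Juniper.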

If you go back a ways further, Nvidia's G80 chip (most GeForce 8800 series cards) completely dominated its market segment at launch.  But only a year later, it was clearly beaten by G92 (some GeForce 9800 series cards) in the relevant efficiency metrics.  G92's dominance would last less than a year before G92b (other GeForce 9800 series cards, and also GeForce GTS 150 and GTS 250) and then RV770 (Radeon HD 4800 series) arrived.  That's what a die shrink can do for you, and is the reason why dominant products rarely remain dominant for long.

Part of it is also that Juniper came very early on the 40 nm process node, yet somehow escaped the yield problems that often come with being early on a new node.  It had a top bin (the 5770), a salvage part (the 5750) that disabled only one of the chip's 10 SIMD engines, and no further salvage bins in most of the world.  A Juniper-based 5670 was sold only in China, precisely because AMD didn't have a ton of defective chips that it needed to get rid of.  Contemporary Nvidia chips such as GF100 and GF104 had catastrophically bad yields, and so would soon give way to GF110 and GF114, which were basically fixed versions of the original chips.  Juniper got it right the first time, with no such need for a fix.

Juniper wasn't the only chip in AMD's Radeon HD 5000 series, of course.  But the others didn't have anywhere near the longevity.  Cypress (Radeon HD 5800 series) had a soft launch, and hadn't even been reliably in stock for a year before it gave way to Barts (Radeon HD 6800 series).  Redwood (Radeon HD 5500/5600 series) was undercut by RV730 (Radeon HD 4600 series) on price for much of its lifetime, then got replaced by Turks (Radeon HD 6500/6600 series).  Cedar (Radeon HD 5450) couldn't hang with RV710 (Radeon HD 4350/4550) in performance even on launch day.  GPUs in the Radeon HD 6000 series typically had about a year before being demolished by newer GPUs on a 28 nm process, and other than Barts, they didn't dominate any market segment even in their heyday.

Not only did AMD see no need to replace Juniper when making its 6000 series, but Nvidia simply couldn't compete with it.  GF106 (GeForce GTS 450) managed to compete with a Radeon HD 5770 in retail price, but couldn't match its performance, and needed a much larger die.  Its successor, GF116 (GeForce GTX 550 Ti), managed to catch a Radeon HD 5770 in performance, but at the cost of a much larger die, considerably more power consumption, and a substantially higher price tag.  No matter what metric you preferred, Juniper was a better GPU chip than GF116.

Even today, more than 34 months after Juniper's launch and with Juniper now discontinued and pulled from the market, Nvidia doesn't have an answer for it in desktops.  They do in laptops, where the GeForce GTX 660M is clearly superior to the old Mobility Radeon HD 5870.  That's what a die shrink can do for you.

None of today's GPU chips are likely to have anywhere near Juniper's long dominance in a market segment, either.  Cape Verde dominates its market segment today, but will that last once Nvidia brings a proper GK107 card to desktops?  Probably not.  Tahiti's run of dominance at the high end lasted two months before GK104 arrived, or four if you want to wait until the latter was broadly available to people who actually wanted to buy one.  Today, Tahiti and GK104 are competitive with each other, leaving neither to dominate its market segment.  Pitcairn has no real rival in its market segment, but that will probably change as soon as Nvidia gets around to launching an upper mid-range Kepler card.

This doesn't really have any bearing on what you should buy today.  Juniper is now discontinued and mostly off the market, and has given way to Cape Verde (Radeon HD 7700 series).  But I just thought it was interesting and unusual.

Comments

  • TheLizardbones Member Common Posts: 10,910

    You should publish this stuff in an industry blog or something. I don't know who would read it, but it seems like something that people in the "industry" would read. Even if I don't really understand everything you're saying, I feel like I've learned something. That's pretty good.

    ** edit **

    Also, I have a 5770 card and have been extremely happy with it for a long time. It doesn't surprise me that it's a great card.

    I can not remember winning or losing a single debate on the internet.

  • bobbymo Member Uncommon Posts: 48
    Purchased an HIS HD5770 in December 2010 for $99. Paired with an AMD X3 at 3.3 GHz and 4 GB of RAM, it does the job. For a budget system it has handled every game I have installed. I have no plan (or need) to upgrade any time soon. Definitely the best pound-for-pound video card I have ever used.
  • Trionicus Member Uncommon Posts: 498

    Got them crossfire 5770's XFX I think. The ones that I thought ran kind of hot, they seem fine now, had to do a little tinkering. All I have to say is that they seem to perform like my 6970, and def better than the 6770 I have. I haven't seen a need to jump to the 7000 series.

    Maybe I will soon enough but for now, 2x5770 are STILL hanging tight in i5 1155 systems.

     

    And yes, can you please start blogging again Quiz, maybe throw a few hyperlinks for those of us that are less technically inclined?

  • Quizzical Member Legendary Posts: 22,130
    Originally posted by Trionicus

    Got them crossfire 5770's XFX I think. The ones that I thought ran kind of hot, they seem fine now, had to do a little tinkering. All I have to say is that they seem to perform like my 6970, and def better than the 6770 I have. I haven't seen a need to jump to the 7000 series.

    A Radeon HD 5770 and 6770 are just two different names for the same card.  (Technically there is a BIOS change so that the 6770 supports HDMI 1.4a, but the GPU chip itself is the same.)  When AMD moved from their 5000 series to their 6000 series, they decided to continue using the older Juniper GPU, but renamed it the 6770 to make it sound like part of the new series.
