
Radeon HD 7950 arrives

Quizzical Member LegendaryPosts: 22,234

It's faster than a GeForce GTX 580, cheaper, uses less power, and has a better feature set.  Even if you didn't think the 7970 made the GTX 580 irrelevant, the 7950 certainly does, at least at current prices.  For Nvidia, Kepler can't arrive soon enough.

AMD is still charging quite a lot for their Tahiti cards, and cheaper than a GTX 580 is not the same thing as cheap.  I'd expect to see prices drop a bit once they can more reliably keep them in stock.  But just a bit, unless GK104 manages to start a good price war.  Which it might, as rumors seem to be pretty bullish on GK104 performance.

Further down the line, Pitcairn should be coming soon, and then Cape Verde.  For laptops, I'd expect to see Pitcairn and Cape Verde show up, but not Tahiti.  I'm not sure if Nvidia will try to cram GK104 into a laptop, but even if not, lower end Kepler cards may finally make Nvidia competitive in laptops again, as they haven't been for two years.

Comments

  • Ridelynn Member EpicPosts: 7,076

    I can certainly understand why they have the price points at $550 and $450 for the 7900 series - they have no competition.

    Really though, that's a price point with a small market.

    Until Kepler gets out here (or whatever nVidia finally pushes out the door), prices are going to stay high. But AMD played this same game with the 6800 series: the 6870 started at $240, and dropped quickly to about $170 once nVidia released its 560 several months later.

    There was about 6 months between AMD's 5800 series release and nVidia finally getting Fermi rushed out the door.

    I don't see nVidia dropping the 580's price much, really; it's an expensive chip to make in the first place, and nVidia fanboys will buy it regardless (those few that don't have it yet). They may be seeing the Osborne effect though, where everyone is just holding their breath waiting on the 600 series, since AMD's new cards are out now and the Kepler hype is definitely out there. Even when Kepler comes out, the older series don't drop in price, with rare exception (there was the summer of the 5850, where you could find Sapphire 5850's for basically half price, but that was a clearance-type event, and many retailers will do short-term sell-offs just to clear up inventory space).

    I am impressed that the 7950 competes more or less on equal footing with the 580, although honestly, I had expected nothing less. What is somewhat of a surprise is that the Cayman architecture seems to be a one-shot deal: the 6900 series has it, and that's basically it, unless we see it revived in the 7000 series somewhere, which I haven't really seen any indication of.

    Really, nVidia doesn't need to push performance to stay competitive. They need to take a step back and concentrate on power management and efficiency per mm2. I think they are stuck in a similar rut that Intel was with Netburst - so focused on raw performance that they can't get past the roadblocks of their own making: mainly, efficiency. I think this is a large reason why nVidia hasn't been competitive in any of the lower graphics tiers. Sure, they retained the single-GPU performance crown for a long time, but it was at a high price, and they are at a relative dead end with regard to that architecture. nVidia's big push with GF110 wasn't "more efficient" - it was "Quieter and Cooler" - because the 480 had such a well-earned reputation for being very hot, and subsequently, quite noisy. Although this isn't any surprise: the 480 was more or less a stop-gap to get something out the door, and may be a portent for what we see with the 680.

    Kepler doesn't need to beat Tahiti to remain viable. Cayman didn't beat Fermi in raw performance, but competed very well in performance per dollar and crushed in performance per watt. As long as you are competitive performance per dollar, you will be able to sell cards. Competitiveness in performance per watt makes it easier to get those cards to customers (easier to cool, lower power requirements, quieter, easier to scale in small form factors and laptops, generally better overclock performance, etc).
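    The perf-per-dollar versus perf-per-watt point is easy to make concrete. A minimal sketch, using illustrative round-number figures (relative performance normalized to the GTX 580, approximate launch prices, and rough typical board power - not official specs):

    ```python
    # Rough value comparison: performance per dollar and per watt.
    # All figures are illustrative approximations for the sake of the math.
    cards = {
        #          (relative perf, launch price $, typical board power W)
        "GTX 580": (1.00, 499, 244),
        "HD 7950": (1.00, 449, 200),
        "HD 7970": (1.15, 549, 250),
    }

    for name, (perf, price, watts) in cards.items():
        print(f"{name}: {1000 * perf / price:.2f} perf per $1000, "
              f"{100 * perf / watts:.2f} perf per 100 W")
    ```

    Even with equal raw performance, the card with the lower price wins perf per dollar, and the one with the lower board power wins perf per watt - which is exactly the position the 7950 is in against the 580 here.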

    By all real indications - Kepler is still a few months off. We haven't seen many of the traditional leaked pictures, alleged benchmark scores, NDA-be-damned posts from testers, "Official release dates" according to SKU's, etc. I do want Kepler to come, and I want it to be good: not because I am an nVidia fan, but because competition keeps prices low. The day Kepler releases, we'll see the 7000 series price drop, and by a good margin, unless nVidia does something suicidal with the release pricing.

  • Kabaal Member UncommonPosts: 3,042

    I can't see these selling well at the current price of £350 and up, as they aren't really needed; there are very few titles out there that even come close to taxing the 69** series or a 580 enough to make the cost justifiable for even many enthusiasts. The 7970 hasn't been selling anywhere near as well as had been hoped, with many retailers having over-estimated their stock orders, and the pricing of the 7950 isn't much below the 7970's.

    On the plus side, at least they aren't a side step like the 6*** series was coming from the 5***; some of those were even downgrades when power consumption isn't factored in. Now games just need to catch up to make them worth buying.

  • karter64 Member UncommonPosts: 96

    How do you guys keep all those numbers straight without going insane? And why do lower numbers represent newer or better cards? It just doesn't make sense to me.

    Oh well, glad you can do it though. It makes for interesting reading even if I don't have a clue what you're talking about.


  • Quizzical Member LegendaryPosts: 22,234

    At least AMD uses different numbers for different cards.  Sometimes Nvidia uses the same number for different cards.  Well, AMD sometimes gives the same number to a desktop card and to a very different laptop card.  But Nvidia will give the same number to two different desktop cards, or sometimes to two radically different laptop cards.

  • Brodter Member Posts: 73
    OP I think you should start a blog.


  • Ridelynn Member EpicPosts: 7,076


    Originally posted by karter64
    How do you guys keep all those numbers straight without going insane?  And why do lower numbers represent newer or better cards. Just doesn't make sense to me.
    Oh well, glad you can do it though, it makes for interesting reading even if I don't ave a clue what you're talking about.

    You work around them, read about them, and deal with them all the time, and you just kinda get to know them.

    Kinda like motorheads can run around talking about 350's, 327's, LS1's, 305's, etc.
    Or fashion-conscious folks can run around talking about Armani, Gucci, Versace, Klein, etc.

    At least they follow a semi-logical pattern.
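    For what it's worth, the AMD HD-series numbers do follow a consistent pattern: first digit is the generation, second digit is the market tier (higher generally means a faster family), and the remaining digits are the variant within that tier. A hypothetical little parser, purely to illustrate the pattern:

    ```python
    def decode_amd_hd(model: str) -> dict:
        """Split an AMD HD-series model number (e.g. 'HD 7950') into its parts.

        Illustrative only: first digit = generation, second = market tier,
        remaining digits = variant within the tier.
        """
        digits = model.strip().lstrip("HD ").strip()
        return {
            "generation": int(digits[0]),
            "tier": int(digits[1]),
            "variant": int(digits[2:]),
        }

    print(decode_amd_hd("HD 7950"))  # generation 7, tier 9, variant 50
    print(decode_amd_hd("6870"))     # generation 6, tier 8, variant 70
    ```

    So a 7950 is newer than any 6000-series card but sits one variant step below the 7970 in the same tier - which is why the "bigger number" intuition only works within a single generation.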

  • Ridelynn Member EpicPosts: 7,076


    Originally posted by Kabaal
    I can't see these selling well at the current price of £350 and up, as they aren't really needed; there are very few titles out there that even come close to taxing the 69** series or a 580 enough to make the cost justifiable for even many enthusiasts. The 7970 hasn't been selling anywhere near as well as had been hoped, with many retailers having over-estimated their stock orders, and the pricing of the 7950 isn't much below the 7970's.
    On the plus side, at least they aren't a side step like the 6*** series was coming from the 5***; some of those were even downgrades when power consumption isn't factored in. Now games just need to catch up to make them worth buying.

    I think there is a lot of truth in this.

    If I'm honest about it, I could still be using my GTX260 perfectly well. I haven't budged from a 1920x1200 resolution in years. Resolution was the first driver in 3D, then it was AA, then it was shaders, now it's ... ancillary computing, which developers are struggling to implement (while keeping a code base that will work on consoles, which lack a lot of this). Heck, a lot of gaming engines are still catching up on shaders.

    Sure, DX11 and Tessellation and blah blah, but so very few games use it, and even those that do, the differences in visual quality are very minor, but the performance required to get that is huge.

    If anything, that performance overhead can go toward brute force improvements - which is what something like Eyefinity and 3D need to work well, and to a lesser extent AA. But since we have been more or less stuck at 1080p, and our polygon count is high enough to not be terribly constraining, and texture memory is fairly large and flexible (Rage) - I honestly think it's console development that's holding large AAA developers (and those making general purpose graphics engines) from using all the tricks that have advanced the graphics field since the introduction of the nVidia RSX and ATI Xenos.

    Although your statement that the 7970 hasn't been selling well - I don't know about that. I haven't seen any sales numbers; all I know is that stock has been spotty lately, and that could be high demand, or low supply, or anything in between really. I don't expect the 7950 to sell any better though: when you're in that high a price tier, I think most people are just going to be willing to shell out the little bit extra and get the full-blown 7970 anyway. Once the 7950 drops to the mid-to-upper $300s and gains more separation from the 7970, we may see it pick up and find the niche where the current 6970's and 570's have been.
