AMD launches desktop Richland: a little more CPU performance, a little more GPU performance, a little higher price

Quizzical Member Legendary Posts: 25,347

And, of course, a little higher price tag--all as compared to previous generation Trinity, of course.

Tom's Hardware found that with 2133 MHz memory (which Richland officially supports, by the way), the integrated GPU in an A10-6800K tends to outperform a Radeon HD 6670 discrete desktop card.  That's very much an entry-level gaming card, of course, but it's also the fastest DDR3 discrete card that makes much sense to buy.
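
To put rough numbers on why that memory speed matters (a back-of-envelope sketch, not Tom's Hardware's data): an integrated GPU shares the CPU's two DDR3 channels, so peak memory bandwidth is usually what caps its frame rates.

```python
# Back-of-envelope peak bandwidth for dual-channel DDR3 (illustrative only):
# effective transfer rate (MT/s) x 8 bytes per 64-bit transfer x 2 channels.
for mts in (1600, 1866, 2133):
    gb_per_s = mts * 1e6 * 8 * 2 / 1e9
    print(f"DDR3-{mts}: ~{gb_per_s:.1f} GB/s peak")
# DDR3-1600: ~25.6 GB/s; DDR3-2133: ~34.1 GB/s. That extra ~33% of bandwidth
# is a big part of how the A10-6800K's iGPU pulls ahead of a DDR3 discrete card.
```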

Richland isn't an earth-shaking product, of course.  While it's better than Trinity in every way, it's only a little better.  Richland is really just a stopgap product until Kaveri arrives later this year.  Even so, in desktops, Richland is arguably a bigger advance over Trinity than Haswell was over Ivy Bridge.

Comments

  • Adalwulff Member, Newbie Common Posts: 1,152

    You know, I never really understood why AMD went south, when they used to be OK graphics-wise.

    During the '90s I bought only AMD, but they just couldn't keep up. I am now sporting an i7 with a GTX 680 card, and I can't even imagine ever going back.

  • jdnewell Member Uncommon Posts: 2,237
    Originally posted by Adalwulff

    You know, I never really understood why AMD went south, when they used to be OK graphics-wise.

    During the '90s I bought only AMD, but they just couldn't keep up. I am now sporting an i7 with a GTX 680 card, and I can't even imagine ever going back.

    He is talking about integrated graphics, which AMD does very well, better than Intel. And AMD discrete GPUs have been very good for years.

  • Quizzical Member Legendary Posts: 25,347
    If we're talking about desktop graphics, then the current generation is basically a draw between AMD and Nvidia, while AMD won the previous three generations before that.  As for processors, AMD's problem isn't so much that they got worse as that Intel got a lot better.  AMD should close the gap substantially when their Steamroller cores launch in Kaveri, but I don't think AMD has any meaningful hope of catching Intel in desktop CPU performance before about 2015.  I'm not predicting that AMD will catch Intel in desktop CPU performance in 2015, mind you; I'm only noting that AMD will be pushing a radically different architecture around then and we have no clue right now how it will perform.
  • GroovyFlower Member Posts: 1,245
    Originally posted by Adalwulff

    You know, I never really understood why AMD went south, when they used to be OK graphics-wise.

    During the '90s I bought only AMD, but they just couldn't keep up. I am now sporting an i7 with a GTX 680 card, and I can't even imagine ever going back.

    I think most of you have the impression that AMD still spews out bad drivers, or have you been brainwashed by all the games that say 'play it on Nvidia'?

    AMD/ATI Radeon cards have been very good since the DX11 era, and also cheaper than Nvidia. After my 8800 GT I switched over to an AMD Radeon and never regretted it; they're superb cards, and over the last couple of years their drivers have also become a lot better than they were before the DX11 era.

    One huge mistake the gaming industry made was letting games be sponsored by one video card vendor, in a way saying that if you buy anything other than the card in the ad, you're a loser, slapping the other side (Nvidia or AMD) in the face.

    Game publishers should support both equally. I've refused games when they said 'Nvidia only,' just for sentimental reasons, hehe.

  • Quizzical Member Legendary Posts: 25,347
    Originally posted by GroovyFlower

    Game publishers should support both equally. I've refused games when they said 'Nvidia only,' just for sentimental reasons, hehe.

    That's not a case of "game requires Nvidia only." It's a case of "Nvidia paid the game publisher to put an Nvidia ad in the loading screen."

    There are a handful of cases where a game implements something that only runs on Nvidia cards, but the game will still run on AMD cards just fine with a setting or two turned off.  Games really only do that when Nvidia pays them to implement the Nvidia-only features.

  • Adalwulff Member, Newbie Common Posts: 1,152
    Originally posted by jdnewell
    Originally posted by Adalwulff

    You know, I never really understood why AMD went south, when they used to be OK graphics-wise.

    During the '90s I bought only AMD, but they just couldn't keep up. I am now sporting an i7 with a GTX 680 card, and I can't even imagine ever going back.

    He is talking about integrated graphics, which AMD does very well, better than Intel. And AMD discrete GPUs have been very good for years.

     

    Then why does AMD fail with so many games, while Intel performs?

  • Adalwulff Member, Newbie Common Posts: 1,152
    Originally posted by GroovyFlower
    Originally posted by Adalwulff

    You know, I never really understood why AMD went south, when they used to be OK graphics-wise.

    During the '90s I bought only AMD, but they just couldn't keep up. I am now sporting an i7 with a GTX 680 card, and I can't even imagine ever going back.

    I think most of you have the impression that AMD still spews out bad drivers, or have you been brainwashed by all the games that say 'play it on Nvidia'?

    AMD/ATI Radeon cards have been very good since the DX11 era, and also cheaper than Nvidia. After my 8800 GT I switched over to an AMD Radeon and never regretted it; they're superb cards, and over the last couple of years their drivers have also become a lot better than they were before the DX11 era.

    One huge mistake the gaming industry made was letting games be sponsored by one video card vendor, in a way saying that if you buy anything other than the card in the ad, you're a loser, slapping the other side (Nvidia or AMD) in the face.

    Game publishers should support both equally. I've refused games when they said 'Nvidia only,' just for sentimental reasons, hehe.

     

    I think you guys are not understanding; it's not about brainwashing... lol

    Like I said, I always bought AMD, but now I don't, because it simply does not perform as well. It had NOTHING to do with seeing NVIDIA on the box. Seriously, man, most of us gamers are smarter than that.

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Adalwulff

    Originally posted by jdnewell

    Originally posted by Adalwulff You know, I never really understood why AMD went south, when they used to be OK graphics-wise. During the '90s I bought only AMD, but they just couldn't keep up. I am now sporting an i7 with a GTX 680 card, and I can't even imagine ever going back.
    He is talking about integrated graphics, which AMD does very well, better than Intel. And AMD discrete GPUs have been very good for years.
     

    Then why does AMD fail with so many games, while Intel performs?


    I can't think of any games where AMD outright fails - be it CPU or GPU.

    Sure, many (most, maybe) games perform better on an Intel CPU than an AMD CPU, but an Intel CPU can also cost roughly 2x-5x more than the AMD CPU, so you kind of expect that. I can't think of any game where a current-generation AMD CPU isn't enough to play the game (I won't say at MAX MAX detail, though, which some of you may argue is "required" to play the game).

    I can think of many games where an Intel GPU fails. Not only "fail" as in the WoW fanboi "OMGZ Epic Fail" sense of not working well, but also fail in the classic definition: the game is completely unable to run at all.

    I will say that nVidia does have the GPU performance crown, especially with Titan, but you're going to pay for it, and it isn't like AMD isn't at least competitive in common price brackets: I would say that in most of the price brackets gamers are interested in ($100-$500, in various discrete steps), AMD is outperforming nVidia in the majority of them. There are only a few cases where nVidia actually outperforms its similarly priced AMD competitor.

    As far as most gamers being smarter than... whatever that is: as a subset of humans in general, I don't have that much faith in humanity's ability to objectively divorce itself from effective marketing and propaganda (myself included). Given that I don't believe in people as a whole, I extend that to the subset of humans who are gamers: I totally believe, and have a certain amount of anecdotal evidence, that gamers really aren't smarter than... whatever that is.

    Case in point: how many people bought nVidia 470/480s?

  • Iselin Member Legendary Posts: 18,719

    My first PC ever had an AMD 386-40 (I was an Atari and Amiga guy before that.) I've been an AMD fan for more than 2 decades.

    I wish they'd stop pissing around with incremental releases that will be largely ignored since they coincide with the Haswell release, and start seriously challenging the i7, instead of competing against Intel's low-end desktops or going after an increased share of the bottom-feeder integrated-GPU laptop market.

    I'd love to be able to build an enthusiast desktop based around an AMD CPU again, but that still ain't happening any time soon... I'm not that much of a fan. I guess I'll just have to be satisfied knowing my Radeon 7970 is a damn good GPU :)

    "Social media gives legions of idiots the right to speak when they once only spoke at a bar after a glass of wine, without harming the community ... but now they have the same right to speak as a Nobel Prize winner. It's the invasion of the idiots”

    ― Umberto Eco

    “Microtransactions? In a single player role-playing game? Are you nuts?” 
    ― CD PROJEKT RED

  • Adalwulff Member, Newbie Common Posts: 1,152
    Originally posted by Ridelynn

     


    Originally posted by Adalwulff

    Originally posted by jdnewell

    Originally posted by Adalwulff You know, I never really understood why AMD went south, when they used to be OK graphics-wise. During the '90s I bought only AMD, but they just couldn't keep up. I am now sporting an i7 with a GTX 680 card, and I can't even imagine ever going back.
    He is talking about integrated graphics, which AMD does very well, better than Intel. And AMD discrete GPUs have been very good for years.
     

     

    Then why does AMD fail with so many games, while Intel performs?


     

    I can't think of any games where AMD outright fails - be it CPU or GPU.

    Sure, many (most, maybe) games perform better on an Intel CPU than an AMD CPU, but an Intel CPU can also cost roughly 2x-5x more than the AMD CPU, so you kind of expect that. I can't think of any game where a current-generation AMD CPU isn't enough to play the game (I won't say at MAX MAX detail, though, which some of you may argue is "required" to play the game).

    I can think of many games where an Intel GPU fails. Not only "fail" as in the WoW fanboi "OMGZ Epic Fail" sense of not working well, but also fail in the classic definition: the game is completely unable to run at all.

    I will say that nVidia does have the GPU performance crown, especially with Titan, but you're going to pay for it, and it isn't like AMD isn't at least competitive in common price brackets: I would say that in most of the price brackets gamers are interested in ($100-$500, in various discrete steps), AMD is outperforming nVidia in the majority of them. There are only a few cases where nVidia actually outperforms its similarly priced AMD competitor.

    As far as most gamers being smarter than... whatever that is: as a subset of humans in general, I don't have that much faith in humanity's ability to objectively divorce itself from effective marketing and propaganda (myself included). Given that I don't believe in people as a whole, I extend that to the subset of humans who are gamers: I totally believe, and have a certain amount of anecdotal evidence, that gamers really aren't smarter than... whatever that is.

    Case in point: how many people bought nVidia 470/480s?

     

    As I don't play any MMOs currently, and haven't for some time, I can't say which MMOs can't run on AMD.

    But, I do play a lot of Planetside 2, and the AMD people are getting the shaft. Most of them couldn't even play during beta. I think Sony finally came up with a patch that fixes most of their issues.

    But from what I am reading on the forums, there are many who are still crashing. Something about a bottleneck in the AMD design.

  • Vunak23 Member Uncommon Posts: 633

    AMD boasts the power but lacks the quality. That is why Intel and Nvidia will always be ahead in their respective fields. 

    Not to mention Intel's continued work on light/laser-based processors, which will blow everything completely out of the water.

    "In the immediate future, we have this one, and then we’ve got another one that is actually going to be – so we’re going to have, what we want to do, is in January, what we’re targeting to do, this may or may not happen, so you can’t hold me to it. But what we’re targeting to do, is have a fun anniversary to the Ilum shenanigans that happened. An alien race might invade, and they might crash into Ilum and there might be some new activities that happen on the planet." ~Gabe Amatangelo

  • Quizzical Member Legendary Posts: 25,347
    Originally posted by Adalwulff
    Originally posted by GroovyFlower
    Originally posted by Adalwulff

    You know, I never really understood why AMD went south, when they used to be OK graphics-wise.

    During the '90s I bought only AMD, but they just couldn't keep up. I am now sporting an i7 with a GTX 680 card, and I can't even imagine ever going back.

    I think most of you have the impression that AMD still spews out bad drivers, or have you been brainwashed by all the games that say 'play it on Nvidia'?

    AMD/ATI Radeon cards have been very good since the DX11 era, and also cheaper than Nvidia. After my 8800 GT I switched over to an AMD Radeon and never regretted it; they're superb cards, and over the last couple of years their drivers have also become a lot better than they were before the DX11 era.

    One huge mistake the gaming industry made was letting games be sponsored by one video card vendor, in a way saying that if you buy anything other than the card in the ad, you're a loser, slapping the other side (Nvidia or AMD) in the face.

    Game publishers should support both equally. I've refused games when they said 'Nvidia only,' just for sentimental reasons, hehe.

     

    I think you guys are not understanding; it's not about brainwashing... lol

    Like I said, I always bought AMD, but now I don't, because it simply does not perform as well. It had NOTHING to do with seeing NVIDIA on the box. Seriously, man, most of us gamers are smarter than that.

    Why then did Nvidia pay to have their logo on the box if it doesn't affect video card sales?  More generally, why does marketing exist at all?

    In the current generation, AMD and Nvidia have about as good of architectures as each other.  At the very top end, Nvidia does have the highest performing card, but that's only because Nvidia was willing to build a 550 mm^2 die and AMD wasn't.  In the sub-$600 market (which is nearly the entire market), they're fairly evenly matched for gaming purposes.

    Now, if you actually have the budget to buy a GeForce GTX Titan and other hardware appropriate to it in a $3000 system, and you aren't terribly sensitive to price, then you might not care about a value for the money proposition.  But the price tag makes Titan irrelevant to most gamers.

  • Quizzical Member Legendary Posts: 25,347
    Originally posted by Iselin

    I wish they'd stop pissing around with incremental releases that will be largely ignored since they coincide with the Haswell release, and start seriously challenging the i7, instead of competing against Intel's low-end desktops or going after an increased share of the bottom-feeder integrated-GPU laptop market.

    It's not that AMD doesn't want to compete with Intel at the high end.  Their FX chips are primarily server chips, and not really any more competitive with Intel in their server guise than in desktops.  It's just that they've tried and failed.  Sometimes it happens.

    AMD moved to a very different architecture with Bulldozer in 2011.  Unfortunately, it was quite a bad architecture, for reasons that AMD surely didn't understand until it was too late to fix them.  It takes time to fix problems, and Kaveri around the end of this year should be a lot better.  I'm betting that AMD will make an 8-12 core chip next year based on the same Steamroller cores as in Kaveri and probably paired with DDR4 memory, and that may well be a viable chip for high end gaming desktops.  There's little chance that such a chip would catch Ivy Bridge or Haswell in single-threaded performance, but if it's not that far behind, having so many CPU cores could make it a decent chip for a long, long time.

  • Nevulus Member Uncommon Posts: 1,288
    and a little too LATE
  • Quizzical Member Legendary Posts: 25,347
    Originally posted by Vunak23

    AMD boasts the power but lacks the quality. That is why Intel and Nvidia will always be ahead in their respective fields. 

    Not to mention Intel's continued work on light/laser-based processors, which will blow everything completely out of the water.

    And how exactly is AMD behind either Intel or Nvidia in quality at a company-wide level?  In desktop graphics, AMD is roughly even with Nvidia this generation, but had a better architecture than Nvidia in each of the three generations that immediately preceded it.

    As for Intel, AMD had a better desktop CPU architecture from the time the Athlon 64 launched in 2003 until the Core 2 Duo arrived in 2006, though Intel has been ahead ever since.  But AMD has had far better graphics than Intel continuously ever since AMD bought ATI in 2006.  And Intel doesn't even have a serious competitor to AMD's Jaguar cores for tablets or microservers--or, for that matter, game consoles.

    What makes you think that Intel will make photonics commercially viable anytime soon?  Or even if they do, that Intel will be the best at it?

  • Purutzil Member Uncommon Posts: 3,048
    Originally posted by Quizzical
    Originally posted by Vunak23

    AMD boasts the power but lacks the quality. That is why Intel and Nvidia will always be ahead in their respective fields. 

    Not to mention Intel's continued work on light/laser-based processors, which will blow everything completely out of the water.

    And how exactly is AMD behind either Intel or Nvidia in quality at a company-wide level?  In desktop graphics, AMD is roughly even with Nvidia this generation, but had a better architecture than Nvidia in each of the three generations that immediately preceded it.

    As for Intel, AMD had a better desktop CPU architecture from the time the Athlon 64 launched in 2003 until the Core 2 Duo arrived in 2006, though Intel has been ahead ever since.  But AMD has had far better graphics than Intel continuously ever since AMD bought ATI in 2006.  And Intel doesn't even have a serious competitor to AMD's Jaguar cores for tablets or microservers--or, for that matter, game consoles.

    What makes you think that Intel will make photonics commercially viable anytime soon?  Or even if they do, that Intel will be the best at it?

    The "Apple mentality" I like to call it "Its more expensive so it naturally is better" when AMD itself has never been far behind. Is AMD 'top benchmark' all the time? No, but they sure is hell aren't as far as a lot of Nvidia/intel fanbois like to call it. AMD has always been a good competitor. Much to their efforts, Nvidia itself  still hasn't reached to the level Radeon cards have with utilizing SLI. They have been getting better but still yet to break even, though to their benefit Nvidia cards have been better single card wise, even if its not such a huge gap. 

     

    The biggest benefit with AMD is the price: you get so much more for the same money. The biggest issue is that it tends to rely on overclocking for its full power, something that is a bit too advanced for novice users to tamper with. If you overclock, you will find the cards can be quite vicious competitors at a much more affordable price point.

     

    The 'best' is a line that has always jumped back and forth, and likely always will. If you're completely discrediting one side or the other, you are just fooling yourself.

     

    As far as 'exclusive rights' on games go: when a game advertises a graphics card, it does so because they ARE supporting the card. Oftentimes the vendor gets first dibs on samples of the game, so the game runs better on that card, at least initially, simply because they get more time to tamper with it and write drivers for that specific game before the other company does. In the long run it's something that can likely be made up for.

  • drbaltazar Member Uncommon Posts: 7,856
    And they ignore a lot of basic stuff, like this example: for pixel format settings you get YCbCr 4:4:4, YCbCr 4:2:2, RGB 4:4:4 limited, and RGB 4:4:4 full, yet none of them fits the H.264 standard (still the most popular codec until H.265 takes over). A lot of people stream live on Twitch.tv, and they get bad quality as a result. Even something this simple can't happen properly: yes, it matters if you use the perceptual rendering intent, because if the data is 4:2:0 and AMD outputs 4:2:2, there is missing chroma that simply isn't there, and it causes no end of issues, looks-wise. AMD is the better one here; Nvidia is worse, and Intel seems to be the best (it baffles me, lol). Or take color profiles: sRGB is the default, yet only one profile works properly, and not with perceptual intent either, only the non-black-scaled profile set to relative. How hard is it for MS to use the screen's actual min/max range instead of 0 or 255, while still keeping the 255-value range? As we know, scRGB uses sRGB as a base, so scRGB inherits the same error; luckily, last I checked, nothing uses scRGB, or people would be annoyed. Yes, YCbCr and xvYCC use it in some fashion, but luckily those play in the 16-240 range (or the equivalent in a 1024-value range). But what if some company implements scRGB via Thunderbolt 2? Then all hell breaks loose. Sadly, some of these companies forgot to evolve their foundations: what was good for CRT isn't good for IPS. In the end, it is hard to want to buy something when you know that, software-wise, it isn't complete, or is dated.
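
    For what it's worth, the subsampling mismatch buried in that post is easy to state concretely. A minimal sketch of the chroma-sample counts, using the standard J:a:b notation (the code itself is just illustration, not anything from the post):

    ```python
    # Chroma samples per 2x2 pixel block for common Y'CbCr subsampling modes
    # (counts are per chroma channel; there are always 4 luma samples).
    SUBSAMPLING = {
        "4:4:4": 4,  # full chroma resolution
        "4:2:2": 2,  # chroma halved horizontally
        "4:2:0": 1,  # chroma halved both ways (typical for H.264 video)
    }
    for mode, chroma in SUBSAMPLING.items():
        print(f"{mode}: 4 luma, {chroma} chroma samples per 2x2 block")
    # A 4:2:0 stream shown through a 4:2:2 or 4:4:4 pipeline has to interpolate
    # the missing chroma samples somewhere, which is the quality loss described.
    ```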
  • Cleffy Member Rare Posts: 6,412

    I think Richland is more of a stopgap until Kaveri.  It would have made a good official offering of a 28nm AMD processor if they had used the Intel-esque tick-tock approach.  It's a lot more akin to the 200W desktop chips AMD is releasing now, just biding time until the die shrink.

    On PlanetSide 2, that's mainly the developer's fault.  The AMD Bulldozer architecture is a bit unusual, so it's justifiable for software to read its 8 integer cores as only 4 cores.  However, for a piece of software to view a multi-core-focused processor as one core is an error on the developer's part.  The thing to really consider now is that game developers no longer have the option to ignore Jaguar's architecture, which should be positive for how future AMD processors perform.
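
    As a minimal sketch of that failure mode (hypothetical thread-pool code; it assumes the third-party psutil package, and the counts in the comments are for an FX-8350-style chip):

    ```python
    # Hypothetical thread-pool sizing, showing why topology matters on Bulldozer.
    import os

    import psutil  # third-party package, assumed to be installed

    logical = os.cpu_count()                    # OS-visible hardware threads: 8 on an FX-8350
    physical = psutil.cpu_count(logical=False)  # physical cores/modules: 4 (may be None)

    # Bulldozer pairs two integer cores per module, sharing one front end and FPU.
    # An engine that sizes its worker pool off the wrong number (or, as in the
    # PlanetSide 2 case described above, treats the chip as a single core) either
    # leaves most of the silicon idle or oversubscribes shared module resources.
    workers = max(physical or 1, 1)
    print(f"logical={logical}, physical={physical}, spawning {workers} workers")
    ```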

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by drbaltazar
    And they ignore a lot of basic stuff, like this example:
    whole lot of garble that has nothing to do with AMD, CPUs, or GPUs

    Good to know.

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Cleffy
    The thing to really consider now is that game developers no longer have the option to ignore Jaguar's architecture, which should be positive for how future AMD processors perform.

    That actually is a really good point, and one I hadn't really considered until you brought it up.

    *edit

    Although, the more I think about it, the problem isn't so much with developers, per se.
    Yes, some developers still get down to low-level code to debug, but when you're talking about the architectural differences between two chips, that's really mostly on the compilers, and on the OS (and drivers for the OS), to work correctly and feed the most efficient instructions to the CPU.

    If you have an OS scheduler that loves to thrash your L2 cache, there isn't much you can do as a developer except rework your program to limit how much you depend on L2-cache performance-boosting techniques. Something similar to this was part of the reason Bulldozer performed poorly on Win7 and below, although it turned out not to be the only reason for poor performance.

    Likewise, if you're trying to work with a compiler that refuses to use a multiply instruction and keeps working it in as an inefficient loop of add instructions (just a simple example; I don't think any compiler would go this crazy, except maybe some crazy RISC ones), your only choices are to change compilers (possible) or rewrite a whole lot of low-level code (the more realistic outcome for most developers, and you're doing it for a corner case of your user installation base: this shows AMD vs Intel, but it doesn't break it down between architectures, and we're only dealing with the more modern module-based AMD processors, not the older Phenom/Athlon design).
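
    To make that toy example concrete (a deliberately silly sketch; as the post says, no real compiler emits the loop form for a plain multiply):

    ```python
    def mul_direct(a: int, b: int) -> int:
        return a * b                # what sane codegen gives you: one multiply, O(1)

    def mul_by_adds(a: int, b: int) -> int:
        acc = 0
        for _ in range(b):          # the pathological version: b dependent adds, O(b)
            acc += a                # (assumes b >= 0)
        return acc

    # Same answer, wildly different instruction counts, which is the point:
    # the source code is identical, and the code generation decides the cost.
    assert mul_direct(7, 6) == mul_by_adds(7, 6) == 42
    ```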

  • GroovyFlower Member Posts: 1,245
    Originally posted by Quizzical
    Originally posted by Adalwulff
    Originally posted by GroovyFlower
    Originally posted by Adalwulff

    You know, I never really understood why AMD went south, when they used to be OK graphics-wise.

    During the '90s I bought only AMD, but they just couldn't keep up. I am now sporting an i7 with a GTX 680 card, and I can't even imagine ever going back.

    I think most of you have the impression that AMD still spews out bad drivers, or have you been brainwashed by all the games that say 'play it on Nvidia'?

    AMD/ATI Radeon cards have been very good since the DX11 era, and also cheaper than Nvidia. After my 8800 GT I switched over to an AMD Radeon and never regretted it; they're superb cards, and over the last couple of years their drivers have also become a lot better than they were before the DX11 era.

    One huge mistake the gaming industry made was letting games be sponsored by one video card vendor, in a way saying that if you buy anything other than the card in the ad, you're a loser, slapping the other side (Nvidia or AMD) in the face.

    Game publishers should support both equally. I've refused games when they said 'Nvidia only,' just for sentimental reasons, hehe.

     

    I think you guys are not understanding; it's not about brainwashing... lol

    Like I said, I always bought AMD, but now I don't, because it simply does not perform as well. It had NOTHING to do with seeing NVIDIA on the box. Seriously, man, most of us gamers are smarter than that.

    Why then did Nvidia pay to have their logo on the box if it doesn't affect video card sales?  More generally, why does marketing exist at all?

    In the current generation, AMD and Nvidia have about as good of architectures as each other.  At the very top end, Nvidia does have the highest performing card, but that's only because Nvidia was willing to build a 550 mm^2 die and AMD wasn't.  In the sub-$600 market (which is nearly the entire market), they're fairly evenly matched for gaming purposes.

    Now, if you actually have the budget to buy a GeForce GTX Titan and other hardware appropriate to it in a $3000 system, and you aren't terribly sensitive to price, then you might not care about a value for the money proposition.  But the price tag makes Titan irrelevant to most gamers.

    But still, the AMD 7970 does the job on most games perfectly.

    Crysis 3 runs on my PC with everything maxed, very smooth; I really don't see any framerate drops, at a steady 60 fps most of the time.

    Ivy Bridge system

     

    i7 at 4.6 GHz

    16 GB RAM at 1866 MHz

    SSD: 2x 256 GB (not RAID 0)

    Asus 7970 OC

    Asus Z77 Deluxe

    PSU: Corsair 850W Gold

    I dunno what game is even more demanding?

    But if I can run Crysis 3 maxed with a 7970, I don't see any reason to change to a GeForce Titan, even though I could afford it.
