
Those promised Intel price cuts still haven't happened.

Quizzical Member LegendaryPosts: 25,351
In late February, some web site posted a claim that Intel was slashing prices on its entire lineup of CPUs in anticipation of Ryzen.  No fewer than three threads in the hardware section of this forum alone made a big to-do about it.  At the time, I loudly called it "fake news", saying that while it was plausible that Intel would cut prices in the future, they certainly hadn't yet.

Ryzen has now been available for about a month and a half.  Ryzen 5 is available now, too, not just Ryzen 7.  So let's have a look at prices:

https://www.newegg.com/Product/Product.aspx?Item=N82E16819117726

Here we have a Core i7-7700K for $350.  No price cuts there.

https://www.newegg.com/Product/Product.aspx?Item=N82E16819117728

And the Core i5-7600K for $240.  Nope, no price cuts there, either.  Now, the choice between those and Ryzen is really just a question of more cores or faster cores.  So one could make a case for buying those CPUs at those prices.  So let's have a look at Broadwell-E:

https://www.newegg.com/Product/Product.aspx?Item=N82E16819117643

A Core i7-6950X for $1650.  No price cuts there, either.  Though Intel could at least claim that it has 10 cores and Ryzen 7 only has 8, so it's better.

https://www.newegg.com/Product/Product.aspx?Item=N82E16819117645

That argument falls apart if you want to talk about the Core i7-6900K, however, which is still $1050.  That's really no better of a CPU than a Ryzen 7 1800X that only costs $500.  If price cuts were going to happen in response to Ryzen, surely they would happen here.  Intel can at least make a case that they have four channels of DDR4 rather than two and more PCI Express bandwidth.

https://www.newegg.com/Product/Product.aspx?Item=N82E16819117647

So here's the Core i7-6850K for $610.  This has only six cores, making it unambiguously inferior to the cheaper Ryzen 7 if you need a lot of cores.  And it's really only competitive with a Ryzen 5 1600X for $250.

It is perhaps worth noting that Intel still charges the original prices for the older Haswell-E, even though it's pointless to buy it new anymore because Broadwell-E is so clearly superior.  So it's possible that Intel slashed production of Broadwell-E in anticipation of Ryzen, and their plan is to simply not be competitive for a while until Skylake-E arrives.
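To make the gap concrete, here's a quick back-of-the-envelope sketch of price per core, using the Newegg prices quoted above (core counts are the published specs for each part; this deliberately ignores clocks, IPC, and platform costs, so treat it only as a rough value measure):

```python
# Price per core for the CPUs cited in this post.
# Prices are the Newegg listings quoted above, not official MSRPs.
cpus = {
    "Core i7-7700K": (350, 4),
    "Core i5-7600K": (240, 4),
    "Core i7-6950X": (1650, 10),
    "Core i7-6900K": (1050, 8),
    "Core i7-6850K": (610, 6),
    "Ryzen 7 1800X": (500, 8),
    "Ryzen 5 1600X": (250, 6),
}

for name, (price, cores) in cpus.items():
    print(f"{name}: ${price / cores:.2f} per core")
```

The i7-6900K works out to $131.25 per core against the 1800X's $62.50, which is the mismatch the post is pointing at.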

Comments

  • Ridelynn Member EpicPosts: 7,383
    True. Good thing you were right about it.
  • kitarad Member LegendaryPosts: 7,910
    Why won't they cut their prices? Aren't they going to do so to compete? I mean, that is the logical thing to do.

  • Quizzical Member LegendaryPosts: 25,351

    OG_Zorvan said:

    kitarad said:
    Why won't they cut their prices? Aren't they going to do so to compete? I mean, that is the logical thing to do.


    When you have the superior hardware, you're not the one who needs to cut prices. Of course, with caveats liberally sprinkled, AMD can be declared "as good as" or "good enough" to make the lower quality seem good for the lower price, but overall Intel overshadows Ryzen and AMD's other offerings. Processors are not aimed at the gaming market as a priority. Business and scientific use of computers overshadows the gaming market by a large margin, and overshadows the "build your own rig" gamer by an even larger margin.

    So while gamers on a gaming website may wonder "Why won't they lower the price to be competitive?", they really don't need to, as gamers who build their own rigs aren't really the priority business-wise. Even when a particular CPU is marketed toward the gaming demographic, price cuts aren't needed unless/until the competition is actually outpacing you, which Intel currently has no fear of from AMD in the foreseeable future.


    You can make that case for Kaby Lake.  For Broadwell-E, not so much.

    At the moment, Ryzen stops at eight cores.  But Naples will release shortly and goes up to 32.  It is probable that for many situations, Naples is going to be superior to anything Intel has available today even if you ignore price.  Not all situations or even close to all, but for a lot.
  • Xoph Member UncommonPosts: 176

    Quizzical said:



    OG_Zorvan said:


    kitarad said:
    Why won't they cut their prices? Aren't they going to do so to compete? I mean, that is the logical thing to do.


    When you have the superior hardware, you're not the one who needs to cut prices. Of course, with caveats liberally sprinkled, AMD can be declared "as good as" or "good enough" to make the lower quality seem good for the lower price, but overall Intel overshadows Ryzen and AMD's other offerings. Processors are not aimed at the gaming market as a priority. Business and scientific use of computers overshadows the gaming market by a large margin, and overshadows the "build your own rig" gamer by an even larger margin.

    So while gamers on a gaming website may wonder "Why won't they lower the price to be competitive?", they really don't need to, as gamers who build their own rigs aren't really the priority business-wise. Even when a particular CPU is marketed toward the gaming demographic, price cuts aren't needed unless/until the competition is actually outpacing you, which Intel currently has no fear of from AMD in the foreseeable future.




    You can make that case for Kaby Lake.  For Broadwell-E, not so much.

    At the moment, Ryzen stops at eight cores.  But Naples will release shortly and goes up to 32.  It is probable that for many situations, Naples is going to be superior to anything Intel has available today even if you ignore price.  Not all situations or even close to all, but for a lot.


    Not a single game out there will use 32 cores, you're lucky if your game uses 2.  It's meaningless in a gaming environment.
  • GladDog Member RarePosts: 1,097
    A 32 core CPU is not aimed at the gaming market.  It's aimed at the server market.  A friend that used to work at Intel told me that one of his test beds was a 32 core CPU.  But it is a one-off, nowhere near a production model.  So yeah, Intel has the tech to make a 32 core CPU, but does not have the impetus, i.e. no one is pushing them to make 32 core CPUs, neither friend nor foe.


    The world is going to the dogs, which is just how I planned it!


  • Gdemami Member EpicPosts: 12,342

    kitarad said:

    Why won't they cut their prices? Aren't they going to do so to compete? I mean, that is the logical thing to do.


    The one that tries to 'compete' is AMD, not Intel.
  • Gaia_Hunter Member UncommonPosts: 3,066
    Why would they?
    You just need to read the comments.
    Despite AMD having a more well-rounded platform in Ryzen at whatever price point you choose, people are still singing "Intel, Intel".
    I have a 6700K but if I was buying a new machine today, it would probably have a r5 1600 or r7 1700.

    The way reviewers test the products caters to higher IPC (which atm is a 5-10% difference) and higher clocks (no question Intel can reach higher clocks: 4.4-4.6 GHz on Skylake, 4.5-5 GHz on Kaby Lake, and around 4.4 GHz on Broadwell-E).

    But you, the consumer, aren't running with a fresh installation of windows every week, with minimal software installed.

    You probably have your game up, half a dozen browser tabs, maybe a video playing, plus all the clutter you have accumulated over the last 6 months.

    People are still going to buy the 7700K even if they can get an r5 1600 for considerably cheaper and never notice a difference during gameplay.
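    As a rough sketch of what that 5-10% IPC edge plus the higher clocks works out to in single-threaded terms (the Intel figures are the ones above; the ~4.0 GHz Ryzen clock is my assumption, picked as a round number):

```python
# Single-threaded performance scales roughly with IPC * clock.
# The 5-10% IPC edge and 4.5-5.0 GHz Kaby Lake clocks come from the
# post above; the 4.0 GHz Ryzen clock is an assumed round number.
ryzen_clock = 4.0           # GHz (assumption)
intel_clock = (4.5, 5.0)    # GHz, Kaby Lake overclock range
ipc_edge = (1.05, 1.10)     # Intel's 5-10% IPC advantage

low = ipc_edge[0] * intel_clock[0] / ryzen_clock
high = ipc_edge[1] * intel_clock[1] / ryzen_clock
print(f"Intel single-thread advantage: {low - 1:.0%} to {high - 1:.0%}")
# prints roughly an 18% to 38% advantage
```

    That's the best-case benchmark gap; the post's point is that background load eats into it in practice.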

    Currently playing: GW2
    Going cardboard starter kit: Ticket to ride, Pandemic, Carcassonne, Dominion, 7 Wonders

  • Thomas2006 Member RarePosts: 1,152

    kitarad said:

    Why won't they cut their prices? Aren't they going to do so to compete? I mean, that is the logical thing to do.


    When you dominate the market as badly as Intel does, you don't have to compete on any level. Now at some point, when AMD wakes up and decides to pull up to the table with something that can actually compete with Intel, then that's a different story.  But so far AMD is behind, and they have been for some time.

    This has given Intel the leg up, and Intel hasn't been sitting by idly in that regard. Intel is playing the future game while AMD is just trying to catch up to the now.  The x86 and x64 processors are today's current tech, but they are not the future, and Intel in that regard is already way ahead, where AMD hasn't shown any research into future tech beyond the current crop of processors.
  • Gaia_Hunter Member UncommonPosts: 3,066
    While in the gaming market Intel can claim the highest average FPS, even though the minimum, 1%, and 0.1% FPS figures often aren't any better or are even worse, the prosumer and especially the server market are the areas where Ryzen has the most potential.
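    For anyone unfamiliar with those metrics, here is a minimal sketch of one common way the 1% and 0.1% lows are computed from a frame-time capture (the frame-time numbers below are made up purely for illustration):

```python
# "1% low" / "0.1% low": average FPS over only the slowest 1% (or
# 0.1%) of frames in a capture, which exposes stutter that a plain
# average hides.
def percent_low_fps(frame_times_ms, percent):
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst) * percent / 100))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Hypothetical capture: mostly 10 ms frames (100 FPS) plus a few
# 40 ms stutter spikes.
frames = [10.0] * 990 + [40.0] * 10
print(round(1000 / (sum(frames) / len(frames))))  # average FPS: 97
print(round(percent_low_fps(frames, 1)))          # 1% low: 25
```

    The average barely moves, but the 1% low collapses, which is why two CPUs with similar averages can feel very different in play.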

    Currently playing: GW2
    Going cardboard starter kit: Ticket to ride, Pandemic, Carcassonne, Dominion, 7 Wonders

  • Phry Member LegendaryPosts: 11,004
    I think the biggest hurdle AMD has to overcome is that consumers generally perceive Intel CPUs to be superior, and that the higher prices are because they are better. It's also the kind of thinking that is reflected in Nvidia's dominance in the GPU market.
    When it comes to CPUs, and GPUs for that matter, AMD really does have to prove to the consumer that their hardware is better. It's not an easy battle, as historically AMD has been the 'cheaper' and 'less powerful' alternative in both markets. :o
  • Quizzical Member LegendaryPosts: 25,351

    DMKano said:





    While in the gaming market Intel can claim the highest average FPS, even though the minimum, 1%, and 0.1% FPS figures often aren't any better or are even worse, the prosumer and especially the server market are the areas where Ryzen has the most potential.





    AMD has a tough fight ahead in the server market - as intel holds about 80% market share as of Jan 2017


    AMD pulled out of the server market entirely because they saw that Bulldozer derivatives weren't going to be remotely competitive, so why put all the effort into creating products that no one will buy?  At the moment, their latest server chip still has Piledriver cores from 2012.  Thus, their server market share right now is basically zero.  But Naples is going to be good, and that makes a difference.
  • Quizzical Member LegendaryPosts: 25,351

    Phry said:

    I think the biggest hurdle AMD has to overcome is that consumers generally perceive Intel CPUs to be superior, and that the higher prices are because they are better. It's also the kind of thinking that is reflected in Nvidia's dominance in the GPU market.
    When it comes to CPUs, and GPUs for that matter, AMD really does have to prove to the consumer that their hardware is better. It's not an easy battle, as historically AMD has been the 'cheaper' and 'less powerful' alternative in both markets. :o


    Ryzen brings AMD the nearest to catching Intel that they've been since Conroe arrived in 2006, unless you count that the Phenom II would have caught up to Intel if Nehalem hadn't launched two months prior.  The gap has mostly been driven by Intel having better process nodes and, from 2011 until earlier this year, by Bulldozer being a disaster.

    The comparison to Nvidia is totally different.  There, AMD has typically been competitive with Nvidia, with only the exception that Nvidia commonly had the fastest top end card purely because they were willing to build a larger die.  But who has the fastest top end card only matters if you're going to buy that top end card, and AMD has usually been competitive and sometimes ahead outright in the sub-$400 market.

    Claiming that Nvidia usually has better GPUs than AMD is pure recency bias.  Yes, Maxwell/Pascal is a better gaming architecture than GCN/Polaris, and so Nvidia has had an efficiency advantage for the last two and a half years, or three years if we're talking laptops.  But that's the only time Nvidia has had an efficiency advantage over AMD since the RV670 arrived way back in 2007.  And even there, Fiji was more or less competitive with GM200 at the top end for a year.

    Furthermore AMD had a considerable efficiency advantage over Nvidia for nearly four years, from the arrival of RV770 (Radeon HD 4870) until Nvidia finally caught up with Kepler.  For about seven months of that period in late 2009 and early 2010, AMD's lead was a gaping chasm, comparable to the advantage Nvidia has over AMD today.
  • Loke666 Member EpicPosts: 21,441

    Quizzical said:


    You can make that case for Kaby Lake.  For Broadwell-E, not so much.

    At the moment, Ryzen stops at eight cores.  But Naples will release shortly and goes up to 32.  It is probable that for many situations, Naples is going to be superior to anything Intel has available today even if you ignore price.  Not all situations or even close to all, but for a lot.


    Yeah, 32 cores will be extremely useful for us gamers... not. Many games can't even use 6 cores, and even the ones that use 8 don't really blow us away with the improvement compared with 4 cores.

    32 cores could be useful if you crunch a lot of data or run a server, but for normal uses it is pointless, at least until someone makes games that competently use all those cores.

    We don't really need more cores; we need higher clock speeds again, and one would think that new technology could actually make the cores faster instead of just adding more of them. With nanotechnology, one would think that 10 GHz cores would be possible.

    Phry said:

    I think the biggest hurdle AMD has to overcome is that consumers generally perceive Intel CPUs to be superior, and that the higher prices are because they are better. It's also the kind of thinking that is reflected in Nvidia's dominance in the GPU market.
    When it comes to CPUs, and GPUs for that matter, AMD really does have to prove to the consumer that their hardware is better. It's not an easy battle, as historically AMD has been the 'cheaper' and 'less powerful' alternative in both markets. :o


    With Nvidia, it is more that people never got over how bad the ATI drivers used to be; it is not that Nvidia's cards were better, but their drivers were, so the average user had far fewer problems with them. People haven't forgotten that.

    With AMD vs Intel, more than a few non-computer-geeks I know have the opinion that AMD makes good laptops but Intel makes the good desktops, for some reason. Oh, and that AMD's desktop CPUs overheat (they still remember those Athlons).

    Myself, I have a Haswell-E now; my last was a Phenom II X6. I get the best I can afford when I build a new computer; AMD or Intel does not really matter to me. I generally go for Nvidia, though; an old ATI card annoyed me so much with driver issues that I just can't get over it. It was a long time ago, but AMD seriously needs something amazing for me to go back to them.
  • Quizzical Member LegendaryPosts: 25,351

    OG_Zorvan said:



    GladDog said:


    A 32 core CPU is not aimed at the gaming market.  It's aimed at the server market.  A friend that used to work at Intel told me that one of his test beds was a 32 core CPU.  But it is a one-off, nowhere near a production model.  So yeah, Intel has the tech to make a 32 core CPU, but does not have the impetus, i.e. no one is pushing them to make 32 core CPUs, neither friend nor foe.



    Intel currently has 64 core cpus out commercially. https://ark.intel.com/products/94033/Intel-Xeon-Phi-Processor-7210-16GB-1_30-GHz-64-core



    After the demise of the Itanium, Intel apparently felt that they needed another ridiculous CPU.  And so the Xeon Phi was born.  It deserves to be compared more to GPUs than to Intel's other CPUs, even if it technically is a CPU.  It's basically a bunch of Atom cores with AVX-512 instructions added.

    The basic idea of it is that some algorithms scale well to enormous numbers of weak cores.  The CPU paradigm of a core trying to get one thread done as quickly as possible is really inefficient for such algorithms.  So Intel would build a CPU with an enormous number of weak cores to target such algorithms.

    The problem is that that sort of embarrassingly parallel algorithm tends to fit GPUs very well, so the market for Xeon Phi consists almost entirely of people willing to throw a ton of money at hardware who have no clue what to buy, and so they buy Intel.  Someone said that people say "Intel, Intel" whether it's a good product or not; the Xeon Phi is basically a test of that proposition.

    It's not just that the paper specs on the Xeon Phi are unimpressive, though they are.  It offers 102 GB/s of memory bandwidth.  For comparison, the Radeon RX 460 and GeForce GTX 1050 offer more than that.  At max turbo, it offers about 3 TFLOPS of theoretical peak performance.  For comparison, a Radeon RX 480 offers about double that, and a GeForce GTX 1080 about triple it, and both while using less power.
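    That ~3 TFLOPS figure is easy to sanity-check. Here's a sketch of the peak-FLOPS arithmetic, assuming the 7210's two AVX-512 FMA units per core; the ~1.5 GHz max turbo clock is my assumption (the ark page linked above lists the base clock at 1.3 GHz):

```python
# Theoretical peak double-precision throughput for the Xeon Phi 7210.
# Per core, per cycle: 2 AVX-512 units * 8 double-precision lanes
# * 2 ops per fused multiply-add = 32 FLOP.
cores = 64
turbo_ghz = 1.5               # assumed max turbo clock
flop_per_cycle = 2 * 8 * 2    # VPUs * DP lanes * FMA
peak_tflops = cores * turbo_ghz * flop_per_cycle / 1000
print(f"~{peak_tflops:.2f} TFLOPS double precision")  # ~3.07 TFLOPS
```

    And remember that is a theoretical peak you only approach with near-perfect AVX-512 utilization.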

    The cache hierarchy of a GPU is very different from that of a CPU.  But the GPU cache hierarchy is the result of decades of development of figuring out how best to optimize for the many-threaded programming model.  The Xeon Phi basically ignores this in favor of just doing what x86 CPUs traditionally do.  So not only does a GPU offer far higher theoretical peak performance, but it's commonly easier to actually get close to that performance on a GPU.

    In order to take proper advantage of Xeon Phi cores, you don't just need a lot of threads; you need maximum width AVX-512 instructions almost everywhere.  GPUs make it easy to transfer data between threads within a warp/wavefront, and to a somewhat lesser extent, within the same block/work-group.  There simply isn't a nice way to pass data back and forth among AVX lanes on a CPU.  GPUs can also use local memory to do efficient table lookups with independent threads grabbing independent values, while there simply is no AVX equivalent to this.  So while there are algorithms that fit nicely on a Xeon Phi but not a GPU, the other way around is far more common.

    Some people will reasonably point out that GPU programming is harder than CPU programming.  If you're comparing it to single-threaded, scalar CPU programming, that's true.  But if you're doing single-threaded, scalar CPU programming, the Xeon Phi will lose badly to a low power laptop CPU, let alone a more sensible CPU.  Many threads but still scalar code will probably leave it slower than an ordinary Xeon.  If you're comparing it to well-threaded code that uses AVX for most of its instructions, is it still easier than a GPU?  I'd argue that it's harder.

    Use of AVX, like SSE, is still fairly immature, even though they've been around for many years.  Using the intrinsics works, but then you have to rewrite your code every few years for the new intrinsics.  Trying to use pragmas and for loops as with OpenMP works in the simplest of cases, but falls apart pretty quickly if you hope to do anything non-trivial.  If you hope to really extract maximum performance out of a Xeon Phi, you have to fuss with all of that, and it's far less mature than GPU programming.

    Maybe it's worth it if the payoff is awesome performance.  People put up with the struggles of FPGA programming in some cases because the performance genuinely is awesome.  But the Xeon Phi has no such payoff at the end, as it will get scorched in nearly everything by either a GPU or an ordinary Xeon, depending on how parallel your algorithm is.  Or lose badly to both sometimes.

    So why would anyone buy a Xeon Phi?  Three reasons:

    1)  The computations fit a GPU well, but there's a big PCI Express bottleneck if you try to use a GPU.
    2)  You're in a weird corner case that fits the cache hierarchy of the Xeon Phi but not a GPU.
    3)  The person making the purchasing decision goes by brand name with no clue about technical details.

    Reason (1) disappears overnight if AMD releases a big APU.  Reason (2) is rarer than you might think.  So that leaves option (3).  And with the price tag, Intel isn't selling them to the "buy something random at Wal-Mart" crowd.
  • Gaia_Hunter Member UncommonPosts: 3,066
    edited April 2017


    Loke666 said:


    Phry said:



    I think the biggest hurdle AMD has to overcome is that consumers generally perceive Intel CPUs to be superior, and that the higher prices are because they are better. It's also the kind of thinking that is reflected in Nvidia's dominance in the GPU market.
    When it comes to CPUs, and GPUs for that matter, AMD really does have to prove to the consumer that their hardware is better. It's not an easy battle, as historically AMD has been the 'cheaper' and 'less powerful' alternative in both markets. :o






    With Nvidia, it is more that people never got over how bad the ATI drivers used to be; it is not that Nvidia's cards were better, but their drivers were, so the average user had far fewer problems with them. People haven't forgotten that.

    With AMD vs Intel, more than a few non-computer-geeks I know have the opinion that AMD makes good laptops but Intel makes the good desktops, for some reason. Oh, and that AMD's desktop CPUs overheat (they still remember those Athlons).

    Myself, I have a Haswell-E now; my last was a Phenom II X6. I get the best I can afford when I build a new computer; AMD or Intel does not really matter to me. I generally go for Nvidia, though; an old ATI card annoyed me so much with driver issues that I just can't get over it. It was a long time ago, but AMD seriously needs something amazing for me to go back to them.




    NVIDIA has had drivers that killed cards.
    And that was way more recently than the problems ATI had almost 20 years ago.

    Currently playing: GW2
    Going cardboard starter kit: Ticket to ride, Pandemic, Carcassonne, Dominion, 7 Wonders

  • Hashbrick Member RarePosts: 1,851
    When you are Intel, you don't have to cut anything; the only thing you need to do is laugh at AMD's performance results.  I encourage competition, as it is very good for us consumers, but AMD does nothing but supply the gamers who prefer to go in on the cheap end, knowing they are on inferior technology.

    It's like that song "Anything You Can Do I Can Do Better", except Intel executives are the ones saying "No You Can't!" and AMD still believes "Yes I Can!"  The time, resources, money, and development cycles are just not in AMD's favor.  I've had tons of problems with AMD CPUs in the past and zero issues with Intel.

    It's quality over value, honestly.  It's like buying Great Value cheese vs Kraft cheese.
    [[ DEAD ]] - Funny - I deleted my account on the site using the cancel account button.  Forum user is separate and still exists with no way of deleting it. Delete it admins. Do it, this ends now.
  • Quizzical Member LegendaryPosts: 25,351

    Hashbrick said:

    When you are Intel, you don't have to cut anything; the only thing you need to do is laugh at AMD's performance results.  I encourage competition, as it is very good for us consumers, but AMD does nothing but supply the gamers who prefer to go in on the cheap end, knowing they are on inferior technology.

    It's like that song "Anything You Can Do I Can Do Better", except Intel executives are the ones saying "No You Can't!" and AMD still believes "Yes I Can!"  The time, resources, money, and development cycles are just not in AMD's favor.  I've had tons of problems with AMD CPUs in the past and zero issues with Intel.

    It's quality over value, honestly.  It's like buying Great Value cheese vs Kraft cheese.


    If you're trying to cite Kraft as an example of a premium cheese worth paying extra for, then I think your analogy has broken down.
  • laserit Member LegendaryPosts: 7,591

    Quizzical said:



    Hashbrick said:


    When you are Intel, you don't have to cut anything; the only thing you need to do is laugh at AMD's performance results.  I encourage competition, as it is very good for us consumers, but AMD does nothing but supply the gamers who prefer to go in on the cheap end, knowing they are on inferior technology.

    It's like that song "Anything You Can Do I Can Do Better", except Intel executives are the ones saying "No You Can't!" and AMD still believes "Yes I Can!"  The time, resources, money, and development cycles are just not in AMD's favor.  I've had tons of problems with AMD CPUs in the past and zero issues with Intel.

    It's quality over value, honestly.  It's like buying Great Value cheese vs Kraft cheese.




    If you're trying to cite Kraft as an example of a premium cheese worth paying extra for, then I think your analogy has broken down.


    I'll go with a Velveeta vs Brie

    Much tastier analogy IMHO ;)


    "Be water my friend" - Bruce Lee

  • IceAge Member EpicPosts: 3,120

    Quizzical said:

    In late February, some web site posted a claim that Intel was slashing prices on its entire lineup of CPUs in anticipation of Ryzen. 


    There you go. You can't call out Intel for not cutting their prices when you never actually heard an official statement and just read some rumors from X/Y website(s).

    Other than that, while I understand you love AMD's new products, I must say that I am pretty sure Intel knows what they are doing, with or without your QQ post.

    Reporter: What's behind Blizzard success, and how do you make your gamers happy?
    Blizzard Boss: Making gamers happy is not my concern, making money.. yes!

  • Quizzical Member LegendaryPosts: 25,351

    IceAge said:



    Quizzical said:


    In late February, some web site posted a claim that Intel was slashing prices on its entire lineup of CPUs in anticipation of Ryzen. 




    There you go. You can't call out Intel for not cutting their prices when you never actually heard an official statement and just read some rumors from X/Y website(s).

    Other than that, while I understand you love AMD's new products, I must say that I am pretty sure Intel knows what they are doing, with or without your QQ post.


    I'm not blaming Intel.

    Rather, there is a saying to the effect that a lie can travel halfway around the world before the truth can tie its shoes.  There were a number of threads claiming Intel had cut prices in response to Ryzen.  I decided to do a follow-up now that we actually know what happened (or more to the point, didn't) rather than just guessing about it.
  • Hashbrick Member RarePosts: 1,851

    Quizzical said:



    Hashbrick said:


    When you are Intel, you don't have to cut anything; the only thing you need to do is laugh at AMD's performance results.  I encourage competition, as it is very good for us consumers, but AMD does nothing but supply the gamers who prefer to go in on the cheap end, knowing they are on inferior technology.

    It's like that song "Anything You Can Do I Can Do Better", except Intel executives are the ones saying "No You Can't!" and AMD still believes "Yes I Can!"  The time, resources, money, and development cycles are just not in AMD's favor.  I've had tons of problems with AMD CPUs in the past and zero issues with Intel.

    It's quality over value, honestly.  It's like buying Great Value cheese vs Kraft cheese.




    If you're trying to cite Kraft as an example of a premium cheese worth paying extra for, then I think your analogy has broken down.


    The point is that they are both shit cheeses, but one is known for quality and one is known for being cheap and sold at a value.  It's the exact reference to AMD and Intel.  I live in Wisconsin; we have no shortage of amazing crafted cheeses, and I refuse to eat any major chain store-bought cheese, but the point still makes perfect sense anywhere in the world.
    [[ DEAD ]] - Funny - I deleted my account on the site using the cancel account button.  Forum user is separate and still exists with no way of deleting it. Delete it admins. Do it, this ends now.
  • Ridelynn Member EpicPosts: 7,383
    edited April 2017







    Yeah, that's pretty much spot on as far as your comments on ATI drivers.  I've heard that they're a lot better now, but why should I care?  They sucked so badly for so long that I see no reason to buy another AMD card when Nvidia cards have served me well for 16 years across the entire spectrum of gaming, using either Direct3D or OpenGL.  If that changes then sure, I'll give AMD a try, but until then forget it.  I know Malabooga's head would explode were he still here to read this, but them are the facts. :)




    Just to stir the pot:

    So, is it that you'll never try AMD again, or if the cards are better that you'll give AMD a try?

    Because I'm pretty sure nearly everyone is telling you that AMD cards and drivers have changed, and have been changed for the better since around 2008.

    And to poke a bit more: You didn't buy an AMD video card 16 years ago. You bought an ATI video card.
  • laserit Member LegendaryPosts: 7,591

    Ridelynn said:











    Yeah, that's pretty much spot on as far as your comments on ATI drivers.  I've heard that they're a lot better now, but why should I care?  They sucked so badly for so long that I see no reason to buy another AMD card when Nvidia cards have served me well for 16 years across the entire spectrum of gaming, using either Direct3D or OpenGL.  If that changes then sure, I'll give AMD a try, but until then forget it.  I know Malabooga's head would explode were he still here to read this, but them are the facts. :)






    Just to stir the pot:

    So, is it that you'll never try AMD again, or if the cards are better that you'll give AMD a try?

    Because I'm pretty sure nearly everyone is telling you that AMD cards and drivers have changed, and have been changed for the better since around 2008.

    And to poke a bit more: You didn't buy an AMD video card 16 years ago. You bought an ATI video card.


    Ah..... Fondue ;)

    I keep my eye on the performance of just a couple of applications when it comes to selecting a new GPU. Historically, AMD has lagged way behind, probably because of poor driver support for these applications.

    Hopefully that changes.

    "Be water my friend" - Bruce Lee

  • laserit Member LegendaryPosts: 7,591
    edited April 2017


    Torval said:


    I'm a bit confused. AMD would write drivers for their hardware and provide API support through Vulkan or DirectX. How is it not the problem of the app developer to support that hardware?

    It's reasonable to say my application developer doesn't support your hardware. It's not reasonable to say your hardware is shit and your drivers are shit because my application developer doesn't support you. I'm sure there's more to it though.




    I'm not in the know on all the nuances of how GPU drivers and applications work together. But as an example, back in January Nvidia released their version 378.49 driver update, which introduced a memory leak into one of my applications. The issue still has not been resolved, and we're up to version 381.65 now. One has to roll back to version 376.33 to be rid of the memory leak.

    http://www.prepar3d.com/forum/viewtopic.php?f=6315&t=124227

    Many driver versions over the years have had very noticeable positive or negative effects on performance in some of the applications that I use.

    "Be water my friend" - Bruce Lee

  • SlyLoK Member RarePosts: 2,698




    Ridelynn said:















    Yeah, that's pretty much spot on as far as your comments on ATI drivers.  I've heard that they're a lot better now, but why should I care?  They sucked so badly for so long that I see no reason to buy another AMD card when Nvidia cards have served me well for 16 years across the entire spectrum of gaming, using either Direct3D or OpenGL.  If that changes then sure, I'll give AMD a try, but until then forget it.  I know Malabooga's head would explode were he still here to read this, but them are the facts. :)








    Just to stir the pot:

    So, is it that you'll never try AMD again, or if the cards are better that you'll give AMD a try?

    Because I'm pretty sure nearly everyone is telling you that AMD cards and drivers have changed, and have been changed for the better since around 2008.

    And to poke a bit more: You didn't buy an AMD video card 16 years ago. You bought an ATI video card.




    If I start to have problems with Nvidia cards/drivers, then I'll consider giving AMD a try.  Until then, why would I do that?  I have had consistently great experiences with Nvidia cards over the years.  The only bad experiences I've had with video cards, ever, were ATI cards (yes, now they're owned by AMD, I realize that).


    I have had fewer issues with AMD than Nvidia recently.

    The "driver issues" excuse is played out.