NVIDIA vs AMD Videocards

24 Comments

  • Muke Member Rare Posts: 2,614
    Originally posted by Seelinnikoi

    Have you ever seen any game endorse an AMD card?

    Nope. And for a reason...

    I regret to this day having bought this Radeon 6950 card, so I am saving up for a higher-quality Nvidia card.

    Well, Nvidia is very aggressive in putting developers under pressure to promote its products and not AMD's. And Nvidia is notoriously difficult, if not impossible, to work with if you are an outsider.


    "going into arguments with idiots is a lost cause, it requires you to stoop down to their level and you can't win"

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by Lobotomist

    People are living in the past.

    AMD was relevant 5-10 years ago and had slightly more competitive prices than Nvidia.

    However, for at least 5 years they have been lagging behind. They have worse graphics cards at prices similar to Nvidia's. And their drivers are bad, with poor support from game developers and poor game compatibility.

    There is really not one reason to buy an AMD product today, except perhaps brand loyalty.

     

    So if you like buying a brand name for no other reason than its nifty logo or some nostalgia, then buy AMD.

    Otherwise keep away.

    You haven't a clue what you're talking about.  Ten years ago, AMD didn't have any GPUs at all; they got into graphics by buying ATI.  AMD's drivers, like Nvidia's drivers, mostly work fine.  Game developers code for common APIs such as DirectX and OpenGL; at most, they write multiple, slightly different code paths to optimize for different architectures.
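    As an illustration of that last point, here is a hypothetical sketch (in Python, for brevity) of what choosing between such code paths might look like; the vendor strings follow what OpenGL's GL_VENDOR query typically reports, and the render functions are made-up stand-ins, not any real engine's API:

```python
# Hypothetical sketch of a vendor-specific render-path split.
def render_generic(scene): ...      # baseline path that runs anywhere
def render_amd_tuned(scene): ...    # e.g. batches sized for 64-wide wavefronts
def render_nvidia_tuned(scene): ... # e.g. batches sized for 32-wide warps

def pick_render_path(vendor_string):
    """Choose a render function based on the reported GPU vendor."""
    v = vendor_string.lower()
    if "amd" in v or "ati" in v:
        return render_amd_tuned
    if "nvidia" in v:
        return render_nvidia_tuned
    return render_generic  # unknown vendor: fall back to the common path

render = pick_render_path("NVIDIA Corporation")  # -> render_nvidia_tuned
```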

  • jdnewell Member Uncommon Posts: 2,237
    Originally posted by Seelinnikoi

    Have you ever seen any game endorse an AMD card?

    Nope. And for a reason...

    I regret to this day having bought this Radeon 6950 card, so I am saving up for a higher-quality Nvidia card.

    What??

    At least get your facts straight before posting lol

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by Phry

    Technically, AMD cards can often be the more powerful; the problem, as is usually the case when comparing these cards, is the games themselves, or to put it another way, real-world application.

    Nvidia has the advantage for the most part: games are usually heavily optimised in Nvidia's favour, where you can sometimes get the crazy situation in which two virtually identical PCs can be running the same game, but one, despite having a less 'technically' capable GPU, outperforms the one with the higher-spec'd GPU, and it's just down to how the game is coded. It's not fair, but it's business, and it has so far allowed Nvidia to keep ahead of AMD.

    http://www.forbes.com/sites/jasonevangelho/2014/05/28/nvidia-fires-back-the-truth-about-gameworks-amd-optimization-and-watch-dogs/

    So, Nvidia tends to work better with games. It does make buying a GPU a bit more complicated, and sometimes it just means that buying an Nvidia GPU is the 'easy' option.

    Going all the way back to 2007, AMD has usually had hardware with higher peak instruction throughput than Nvidia, but with restrictions that made it harder to fully utilize that hardware.  Both AMD and Nvidia have moved their architectures toward what the other offered, but that gap is still around today.

    Most notably, AMD's GCN architecture has a wavefront size of 64, while Nvidia's Fermi, Kepler, and Maxwell all have a warp size of 32.  If you want to do something on GCN, you have to do 64 of it; on Nvidia, you can do just 32.  That means that if your code needs 17 threads to do something, you're using twice as many resources to do it on AMD as on Nvidia.  But if you can fill wavefronts nicely, you're using far fewer scheduling resources on AMD than on Nvidia, which frees up space in silicon for more of everything and increases performance.  This may be why AMD cards have long scaled better to ultra-high monitor resolutions than Nvidia cards.
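    To make the wavefront arithmetic concrete, here is a minimal sketch in plain Python, using the 64- and 32-wide group sizes named above:

```python
import math

def lane_utilization(threads, group_width):
    """Fraction of SIMD lanes doing useful work when `threads` work-items
    are packed into groups of `group_width` (wavefronts or warps)."""
    groups = math.ceil(threads / group_width)
    return threads / (groups * group_width)

# 17 threads of work, per the example above:
print(lane_utilization(17, 64))  # 0.265625 on GCN: 47 of 64 lanes idle
print(lane_utilization(17, 32))  # 0.53125 on Nvidia: 15 of 32 lanes idle
```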

    Even so, there are always a ton of corner cases in which either vendor's architecture completely blows away the other.  If a vendor can convince some game developer to implement such a thing in games, you can get outlier results where a game looks far more favorable to one vendor's architecture than the other's.  But that's vendor shenanigans, not typical game performance.

    For example, it's not hard to write code that will make a Radeon R9 290X massively outperform a GeForce GTX Titan X if that's what you're trying to do--even with both GPUs compiling exactly the same source code.  Try doing something that leans heavily on rotate operations (this is why AMD won by so much at bitcoin mining), register capacity, or local memory bandwidth, for example.  If you want to make Nvidia win by a huge margin, try tessellating stuff to stupid degrees just to overwhelm AMD's hardware tessellation units, or make extensive use of a table arbitrarily sized at 40 KB.
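    To illustrate the rotate case, here is a minimal sketch in plain Python of the operation itself; the hardware cost noted in the comment is the gap the post above describes:

```python
def rotl32(x, n):
    """Rotate a 32-bit value left by n bits. Hardware with a native
    rotate instruction does this in one operation; hardware without one
    needs two shifts and an OR, which is why rotate-heavy workloads like
    SHA-256 hashing (the core of bitcoin mining) favored AMD so heavily."""
    n &= 31
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

assert rotl32(0x80000001, 1) == 0x00000003
```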

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by Fdzzaigl

    Nvidia usually has the advantage performance-wise.

    That said, you can't go wrong with the higher-end AMD cards price/quality-wise, imo. Yes, they're slower with drivers, but it usually isn't that big of a deal, imo.

    I've been sticking with AMD cards for the last decade. And the one time I switched to an Nvidia GeForce card, I actually had a crapload of issues.

    Nvidia usually has the fastest top end card because they're willing to go larger on die size than AMD.  But that only matters if you're willing to pay up for that top end card.

    To take a CPU comparison, suppose that you're doing something that scales well to arbitrarily many CPU cores.  Vendor A offers CPUs with 2, 4, 8, or 16 cores.  Vendor B offers CPUs with 2, 4, or 8 cores, but not 16.  On a per-core basis, vendor A isn't that different from vendor B.  Obviously, the winner in raw performance is going to be vendor A's 16-core processor.  If you've got the budget to buy it, then that's what you want.  But if vendor A's 16-core processor is out of your budget, it shouldn't matter to you that vendor A offers it.  If you're going to buy a quad-core CPU because that's what fits your budget, then what matters is whether vendor A's quad core or vendor B's quad core is better, and that has nothing to do with who offers a 16-core CPU that you're not going to get.

    It's foolish to say that a GeForce GTX Titan X is better than a Radeon R9 290X, so Nvidia must be better than AMD today, and so I'm going to buy a GeForce GTX 650 over a Radeon R7 250X.  If the choices that fit your budget are the GTX 650 and the R7 250X, then look at those cards to see which of them is better.  Both the Titan X and the R9 290X are irrelevant if you can't afford them anyway.
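    A minimal sketch of that buying logic; the prices and scores below are made-up placeholders, not benchmark data:

```python
def best_card_in_budget(cards, budget):
    """Return the highest-scoring card that fits the budget. Flagships
    priced above the budget never enter the comparison, which is the
    point of the post above."""
    affordable = [c for c in cards if c["price"] <= budget]
    return max(affordable, key=lambda c: c["score"], default=None)

# Made-up prices and scores, purely for illustration:
cards = [
    {"name": "GeForce GTX Titan X", "price": 999, "score": 100},
    {"name": "Radeon R9 290X",      "price": 330, "score": 70},
    {"name": "GeForce GTX 650",     "price": 110, "score": 20},
    {"name": "Radeon R7 250X",      "price": 100, "score": 18},
]
# With a $120 budget, only the GTX 650 and R7 250X are even candidates:
print(best_card_in_budget(cards, 120)["name"])  # GeForce GTX 650
```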

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by DMKano
    Originally posted by Muke
    Originally posted by Seelinnikoi

    Have you ever seen any game endorse an AMD card?

    Nope. And for a reason...

    I regret to this day having bought this Radeon 6950 card, so I am saving up for a higher-quality Nvidia card.

    Well, Nvidia is very aggressive in putting developers under pressure to promote its products and not AMD's. And Nvidia is notoriously difficult, if not impossible, to work with if you are an outsider.


    I've said this before and I'll say it again:

    When it comes to MMO dev studios, almost all of them are exclusively using Nvidia cards to develop their games on.

    This is true of all the Asian dev studios, Blizzard, Zenimax, Trion, NCSoft, etc.; it's almost all 100% Nvidia in-house.

    Yes, QA has all different systems they test on, including AMD, but look at the developer PCs: 99.9% are Nvidia.

    Yes, the games will run fine on both - but the vast majority of MMOs specifically are made on Nvidia PCs.

    That only matters if they are specifically trying to optimize their code on Nvidia and don't try to optimize it for AMD.  To believe that is to believe that all MMO game development studios are stupid.

    Furthermore, if a game is going to run on consoles, then it's going to run on AMD.  Both the PS4 and the Xbox One use AMD's GCN architecture for the GPU.  The most recent console to use an Nvidia GPU is the PS3, and that's an ancient GPU that precedes the unified shader era and has nothing to do with today's architectures.  Most MMOs aren't on consoles, of course, but any game that is is going to have to heavily optimize the code for AMD--and doesn't necessarily have to do so for Nvidia.

    If a developer writes CPU code that isn't even aware that there is a GPU in the system, it doesn't matter what GPU he uses to display his code on the screen.  And the overwhelming majority of code in a game is purely CPU code.

  • Gretelda Member Uncommon Posts: 359

    Very easy.

    You want constant 60 fps in all games? Nvidia, because of more frequent drivers, and games with the Nvidia logo on them are usually better optimized for Nvidia than for AMD. My experience, anyway.

    You don't care? AMD. It's cheaper and still great.

    my top MMOs: UO, DAOC, WoW, GW2

    most of my posts are just my opinions, they are not facts; it is the same for you too.

  • Mikeha Member Epic Posts: 9,196

    AMD will take the stage at E3 in a couple of weeks and show the world that the True King has returned.


    http://www.worldsfactory.net/2015/04/30/pc-has-its-own-e3-conference-now

  • Azmodeus Member Uncommon Posts: 268
    The rage runs high in this thread.  Don't let anyone talk you out of anything.  Research, try things out, and go with your gut.
      OMG I am Ancient!
  • Ridelynn Member Epic Posts: 7,383

    I think it's clear that nVidia has a bigger marketing budget, and a much better viral campaign.

    Why does it matter what video card a dev has in their system running Visual Studio anyway?

    Power use does matter - but not really for cost. The true cost adds up to single-digit dollars per year - I remember seeing the math somewhere, and yours isn't incorrect, it just makes some inaccurate assumptions (the video card at 100% TDP for 30 hours a week, for instance - which may be true for a mining rig, but definitely not for a gaming rig). And it's not like the more power-efficient card uses no electricity, so you have to subtract that back out to see the additional cost of the more power-hungry architecture.
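    For anyone who wants to redo that math with their own assumptions, a minimal sketch; the defaults reproduce the 100 W / 30 hours-per-week / $0.20-per-kWh / 4-year figures quoted later in the thread, and avg_load is the knob the disagreement is about:

```python
def extra_power_cost(extra_watts, hours_per_week, price_per_kwh, years,
                     avg_load=1.0):
    """Lifetime electricity cost of a card that draws `extra_watts` more,
    with `avg_load` scaling the assumption that it sits at full TDP."""
    kwh = extra_watts / 1000 * hours_per_week * 52 * years * avg_load
    return kwh * price_per_kwh

# The quoted assumptions (100 W extra, 30 h/week, $0.20/kWh, 4 years,
# card pinned at 100% TDP) give the ~$125 lifetime figure:
print(extra_power_cost(100, 30, 0.20, 4))             # ~124.80
# A gaming rig averaging, say, 30% of TDP over those hours (an assumed
# figure) lands in single digits per year instead:
print(extra_power_cost(100, 30, 0.20, 4, 0.3) / 4)    # ~9.36 per year
```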

    Today, nVidia has the more power efficient architecture. Last generation, it was AMD. It flips back and forth almost every other generation.

    One difference you could talk about is their power management: PowerTune vs GPU Boost. AMD defaults to 100% clock, and only downclocks on TDP or temperature. nVidia defaults to a lower (~80%) clock, and clocks up if there is headroom. I won't say one is better than the other, they both have pros and cons, but it is a difference.
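    A toy sketch of those two strategies as just described; real power-management firmware is far more involved, and the clock numbers here are arbitrary:

```python
def powertune_step(clock, max_clock, over_tdp, over_temp, step=25):
    """The PowerTune behavior described above: run at 100% of rated
    clock and back off only on a TDP or temperature violation."""
    if over_tdp or over_temp:
        return max(clock - step, 0)
    return min(clock + step, max_clock)  # recover toward the rated clock

def gpu_boost_step(clock, base_clock, boost_clock, has_headroom, step=25):
    """The GPU Boost model described above: start from a lower base clock
    (~80% in the post's description) and clock up while headroom remains."""
    if has_headroom:
        return min(clock + step, boost_clock)
    return max(clock - step, base_clock)  # settle back toward base

# Toy usage: a card rated for 1000 MHz under each scheme.
print(powertune_step(1000, 1000, over_tdp=True, over_temp=False))  # 975
print(gpu_boost_step(800, 800, 1000, has_headroom=True))           # 825
```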

    True, ATI had a huge problem with drivers. That was one of the first issues to get resolved when AMD bought them. They don't do monthly updates just for the sake of pushing updates any longer, but the situation is vastly superior to the ATI days - and on par with what nVidia pushes.

    The only one behind in the driver race now is Intel, and well, you could make a solid case that it doesn't matter because their GPUs are well behind everything else - despite being the #1 GPU manufacturer in the world.

  • Iselin Member Legendary Posts: 18,719
    Originally posted by NightHaveN


    But when was the last official ATI driver released? December, 5 months now.

    And I had zero problems with their drivers playing a variety of games that push the HW.

     

    You guys do know that needing to release drivers every couple of weeks can also be spun as a negative, don't you?

    "Social media gives legions of idiots the right to speak when they once only spoke at a bar after a glass of wine, without harming the community ... but now they have the same right to speak as a Nobel Prize winner. It's the invasion of the idiots”

    ― Umberto Eco

    “Microtransactions? In a single player role-playing game? Are you nuts?” 
    ― CD PROJEKT RED

  • mbrodie Member Rare Posts: 1,504
    Originally posted by Iselin
    Originally posted by NightHaveN


    But when was the last official ATI driver released? December, 5 months now.

    And I had zero problems with their drivers playing a variety of games that push the HW.

     

    You guys do know that needing to release drivers every couple of weeks can also be spun as a negative, don't you?

    I have to agree with this... Regardless of the fact that I keep up with beta drivers with AMD (basically because I switched to Windows 10 and it's in my favor to do so), I've never HAD to update my drivers to get a game to run, and on my 290Xs I've never experienced the performance issues with any of the games that others have. It could come down to luck of the draw, but I'm excited to see the 300 range get released, and there is a good chance that in the interim AMD has stolen me away from Nvidia. My last experience, with EVGA GTX 780s, was less than desirable and left a sour taste in my mouth. Now I know multiple people with 900 series cards who aren't overly impressed, while I'm sitting back with the crossfire setup I paid $850 for, which were top-of-the-line AMD cards; if I had gone with a single GTX 980 it would have cost me the same price, or twice as much for 2, and I have 0 complaints.

    At the end of the day, though, I generally go for bang for the buck, unless someone recommends something to me, which is how I got the AMD cards: a recommendation from a friend who was really happy with them, at a time when I was sour on Nvidia.

    You have to buy what's best for you and your budget.

  • Loke666 Member Epic Posts: 21,441
    Originally posted by Quizzical
    Originally posted by APRIME
    I've only ever used NVIDIA products but was thinking about picking up an AMD card.  However, I keep hearing that AMD's cards are more prone to crashes and BSODs, and that the drivers aren't as good as NVIDIA's.  What comments do any of you have on the issue?  I'd especially like to hear from people that have used GPUs from both companies.  Thanks!

    Claiming that AMD cards crash a lot is pure FUD, whether from Nvidia or their fanboys trying to scare you away from perfectly good products.  I've had an AMD card in my current computer for about 5 2/3 years, and the computer has crashed exactly three times in that duration.  That doesn't mean that the video card caused the computer to crash three times; it means three crashes in total for any reason--such as flooding or power outages, let alone other drivers or programs that don't even touch the GPU doing something stupid.

    Well, some AMD/ATI cards have been prone to crashes, but not because of the cards themselves; it was due to drivers.

    However, that was more common way back; AMD's drivers have become better. I think Nvidia is still in the lead for drivers, but not by very much anymore.

    Certain games work better on one card type or the other, though; The Witcher 3 works better on Nvidia cards, for example, but few people plan to play just a single game on their computer, so it rarely matters much.

    It is indeed best to check which card will give you the most performance for the cash you want to spend. I would go for Nvidia if the performance is equal, but if the AMD card performs best, you should probably go for that.

    Here is a helpful chart (for 1080p; there are others if you go back for specific games and other resolutions): http://www.tomshardware.com/charts/2015-vga-charts/20-Index-1080p,3683.html

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by NightHaveN
    ATI these days installs something similar to NVidia Experience. They install a customized (what companies call a reskin these days) Raptr.

    For the last couple of years I have used Tom's Hardware's current-month recommendations, based on my budget, to buy cards around $280. While Nvidia recently took that spot, it was ATI territory for years (the $200-$325 range).

    Maybe that's the reason management demanded a new 30x line to cover all markets, instead of rebrands, and a new driver team (again).

    Their previous team was doing a good job with frequent drivers at first, but I dunno if they made cuts there or relocated people to consoles. Today, driver maintenance feels like a ghost town.

    Nvidia drivers may not be rock solid in every version, but their fast releases usually mean you are not stuck with a problem for long.

    But when was the last official ATI driver released? December, 5 months now.

    Video driver updates are a means, not an end.  The end is having good drivers that work properly.  I don't want a new driver every week that doesn't work.  I want a driver that works, even if it's a year old.

    Remember that AMD's GCN architecture is now more than three years old.  There have been some slight tweaks to the architecture as new cards have released, but only slight tweaks.  AMD figured out how to make the architecture perform well a long time ago and no longer has a big stack of things that constantly need to be fixed.  For the same reason, Nvidia isn't constantly having to issue new drivers to fix things broken in their Kepler architecture.

    But Nvidia has launched a new architecture and new cards for it a lot more recently with Maxwell.  That means a more frequent need for drivers to fix things that aren't yet working perfectly in the new architecture.  AMD's current situation, with almost its entire lineup being such old cards, is very unusual historically, though with no new process node to move to, I can understand why they've done things that way.

  • barasawa Member Uncommon Posts: 618

    I've used both companies' cards over the years, and they both have advantages and drawbacks, as well as pretty similar gaming output. For the past few years I've not had enough fundage to run both, so things could have changed some without me knowing. Besides, everyone's experience will differ somewhat.

     

    AMD. In general my experience has been that they will be less expensive and will look better on paper, but you won't notice it in any actual gameplay. Also, they've inherited the ATI curse (their entire graphics department used to be ATI), so driver issues and a lack of proper updates for everything that's not their newest chip are a problem. Sure, that's ancient history, but it's also a pattern that they seem to keep repeating.

     

    Nvidia. You are going to pay more, and on paper they seem to look second best a lot of the time. When you are gaming, though, you just won't notice any of those supposed differences.

    Gee, it says it only displays 12 million colors instead of 16.7 million. Who cares? You have less than 1 million pixels on your screen. Note: all those numbers are examples, not actual statistics.

     

    So you make a choice. For me, it's a card I'm going to have for a long time, and I have to choose between saving some money or getting a card that will be properly updated for some time to come.

    Lost my mind, now trying to lose yours...

  • mbrodie Member Rare Posts: 1,504
    Originally posted by Loke666
    Originally posted by Quizzical
    Originally posted by APRIME
    I've only ever used NVIDIA products but was thinking about picking up an AMD card.  However, I keep hearing that AMD's cards are more prone to crashes and BSODs, and that the drivers aren't as good as NVIDIA's.  What comments do any of you have on the issue?  I'd especially like to hear from people that have used GPUs from both companies.  Thanks!

    Claiming that AMD cards crash a lot is pure FUD, whether from Nvidia or their fanboys trying to scare you away from perfectly good products.  I've had an AMD card in my current computer for about 5 2/3 years, and the computer has crashed exactly three times in that duration.  That doesn't mean that the video card caused the computer to crash three times; it means three crashes in total for any reason--such as flooding or power outages, let alone other drivers or programs that don't even touch the GPU doing something stupid.

    Well, some AMD/ATI cards have been prone to crashes, but not because of the cards themselves; it was due to drivers.

    However, that was more common way back; AMD's drivers have become better. I think Nvidia is still in the lead for drivers, but not by very much anymore.

    Certain games work better on one card type or the other, though; The Witcher 3 works better on Nvidia cards, for example, but few people plan to play just a single game on their computer, so it rarely matters much.

    It is indeed best to check which card will give you the most performance for the cash you want to spend. I would go for Nvidia if the performance is equal, but if the AMD card performs best, you should probably go for that.

    Here is a helpful chart (for 1080p; there are others if you go back for specific games and other resolutions): http://www.tomshardware.com/charts/2015-vga-charts/20-Index-1080p,3683.html

    See, you say Nvidia runs Witcher 3 better, but my experience, against probably 5 other friends who have it on Nvidia cards ranging from the 600 to the 900 series, has been the opposite: they have experienced nothing but trouble, while my R9 290Xs have been smooth and fantastic with no crashes so far...

    But I've read about a lot of users experiencing problems with AMD cards also, so I don't understand why my experience has been so good.

  • snowman22 Member Uncommon Posts: 54

    The Witcher 3 is flawless on my R9 290 from AMD. Yes, it's louder; yes, it's hotter; yes, it uses more power. But we are using desktops, people! Why the hell should I care how loud or hot it is, or how much power it's using? I want it to run the fastest and have the best bang for the buck.

     

    So, OP, if you can wait until August when the 390X comes out, I say wait and get an AMD. At $500 USD it will outperform the Titan X by double: half the cost, twice the performance. Also, with DX12 coming around the corner in 1 to 2 years, Nvidia will be behind the gun a tad, since they never wanted anyone to have access to the die, whereas AMD, with Mantle, is basically at DX12 already.

    AMD vs Nvidia is the same as AMD vs Intel: if you care about heat, noise, and the power bill, then go Nvidia/Intel. But what's funny is that you are paying almost double the price to save on power consumption, which would take about 3 years of power bills to recoup by going non-AMD. I personally upgrade at every evolutionary jump, so I will not upgrade my FX-8350 until 2016 when the new AMD CPUs come out, and as for the GPU, I will skip the 300 series until it is on a 14nm die; the new AMD GPU this year is designed on a 28nm die, but come 2016 it will be 14nm.

     

    Also, on the driver issue: there isn't one. You just need to do a clean AMD driver install when you upgrade the driver.

    Heat? Well, come August with the new 300 series, all AMD gaming cards (x80-x95x) will be water cooled, so heat will not be an issue.

    Really, the only reason to go NVIDIA right now is that the new 980 apparently supports HDMI 2.0, so you could try to enjoy 4K at 60 fps.

  • Lokero Member Rare Posts: 1,514

    I think most people just look for the best bang for the buck when deciding which to get.  Most people aren't going to spend $500+ on one of the top-end cards, so it becomes a chess match in the low-to-mid range of cards.

    It just depends who has the best quality for the cash at the time you are looking, usually.

    Personally, I always look for Nvidia because I've just found them much more reliable. But if AMD has a great offering at the right time, I'd still consider it.

    My current creed is to find one that runs cool and doesn't sound like a jet engine.  The newer generations that come out always have that problem until the cheaper, better-optimized mid-range versions release down the road.

  • daltanious Member Uncommon Posts: 2,381
    Originally posted by APRIME
    I've only ever used NVIDIA products but was thinking about picking up an AMD card.  However, I keep hearing that AMD's cards are more prone to crashes and BSODs, and that the drivers aren't as good as NVIDIA's.  What comments do any of you have on the issue?  I'd especially like to hear from people that have used GPUs from both companies.  Thanks!

    I've had nVidias, but my last 2 cards were AMD, and ZERO problems.

  • Leon1e Member Uncommon Posts: 791

    Just pick a GPU within the price range that makes you feel good and weigh the tradeoffs between each class.

    AMD graphics cards are solid; they tend to come off cheaper, but you never know, really.

    Usually it's nVidia that has the better GPU (currently the GTX 980), but whether it's worth being priced $200 over the current AMD flagship for a minor difference in performance pretty much comes down to how much of a fanboy you are or aren't. Atm the R9 290X with aftermarket cooling is the better price for your buck.

    Things may switch after the R9 390X. If it's the better GPU, it certainly *WILL* be priced higher than the current 980. But then nVidia will release the 980 Ti to compete, so you see, it's pretty much cat and mouse between the vendors.

    It all depends on when you decide to "bite the bullet". Check promotions and whatnot. If you go after high end GPUs, it really doesn't matter which one you pick. 

    Been playing Witcher 3 on ultra on my "rusty" R9 290X for the past couple of days. I think it can last a few years more :)

    Also, just FYI, if you expect a single GPU to pull off 4K @ 60fps, you'll have to wait a couple of years more.

  • BC_Animus Member Uncommon Posts: 115

    I just switched to an NVIDIA card recently...  but my previous 4 cards had been medium-to-high-end AMD cards.

    My experience is this: AMD cards are very affordable and you can get some decent performance from them, but sadly they use up a lot of power and can overheat quickly.  Not sure if it's just me, but my cards' fans never seemed to work properly automatically out of the box either, and I had to manually override them and rev them up during graphics-intensive gameplay sessions to keep the card stable.  And this is coming from a non-overclocker, too.

    Unlike a lot of others, personally I had no complaints about the drivers on offer.  However, I have had a few BSODs and graphical glitches - but that was all due to overheating, for which I should partly blame myself for not having a more efficient case/fan setup, and perhaps for refusing to lower my graphics settings while gaming.

    With NVIDIA cards...  like I said, I just switched recently, to a GTX 980.  Not sure if it's because it's more powerful than my previous cards and can process more without stressing itself, but thus far I have been impressed by just how cool the card stays, as well as by its energy requirements - it uses nearly 40% less power than AMD's R9 series.  There's some useful first-party software available from Nvidia too, like GeForce Experience, which makes for really simple driver upgrades and game-settings optimizing.

    I'm probably gonna stick with NVIDIA for my next few cards...  but having said that, NVIDIA cards are kinda pricey - for gaming, the average user can probably get a similar experience for half the price if they go with AMD.

  • roxorxl Member Uncommon Posts: 7

    I have had 4 Nvidia cards and 6 AMD cards.

    In benchmarks and tests the AMD cards scored a bit higher, but I always had problems with drivers.

    Now I have an Alienware laptop with 2 AMDs and still have driver problems, plus the CrossFire sucks (it turns off, bugs).

    My experience is that AMD drivers are a mess.

    My other laptop, with 2 Nvidia cards in SLI, runs smooth with no problems.

     

    Now i stick with Nvidia. Never again AMD.

  • Cleffy Member Rare Posts: 6,412
    Pick the one that is best for your needs. For professional work without paying for a professional card, AMD and NVidia perform differently in different applications, so you have to search the net for the differences on your particular workload. For gaming it doesn't matter, as they are all priced well for the performance they offer. Just pick the most modern one in your price range.
  • CalmOceans Member Uncommon Posts: 2,437
    If you like red, buy AMD. If you like green, buy Nvidia. It really doesn't matter, to be honest.
  • BC_Animus Member Uncommon Posts: 115
    Originally posted by centkin

    Total cost of ownership...

    If you use your computer 30 hours a week, and the card uses 100 extra watts (3 kWh per week), and you pay 20 cents per kWh of electricity, and you keep that video card for 4 years, you are looking at 3 × $0.20 × 52 × 4 = $124.80 extra going with the card that uses more electricity.

    It also means more wear and tear on the power supply (if not using a larger one), more heat (shortening the life of the computer), and a shorter run time on a UPS if you have one...

    To be honest, I think the only concern when it comes to power use and requirements is for those looking to upgrade older systems with weaker PSUs, or for folks wanting to run multiple cards in their systems.  Sure, running certain cards MIGHT save you $30 or so a year in power, maybe - but anyone who can afford the kinds of cards we're talking about here shouldn't have issues with that (or so you would hope).
