
ATI or nVidia


Comments

  • Ceridith, Member Uncommon, Posts: 2,980

    Originally posted by Paradoxy

    Originally posted by Ceridith

    ATI gives you the most bang for your buck in speed, however...

    ATI cards consume more power relative to the amount of speed you get, and as a result also run a good deal hotter. Driver support can also leave a lot to be desired. The combination of the last two issues can result in a good deal of instability, not to mention the card's lifespan being fairly short.

    Nvidia, on the other hand, does technically cost more for the processing power you get, but their cards tend to run cooler, have better driver support, run more stably, and last longer than ATI cards.

    If you'd prefer to save a bit of cash, but don't mind fiddling with drivers more often and have decent cooling on your PC, ATI is better.

    If you don't mind spending a little more but would prefer a card that doesn't require as much maintenance, Nvidia is better.

    I think you mixed the two up, because ATI consumes less power and runs cooler in comparison to Nvidia cards, which are power hogs and run hotter.

    Unless things have changed recently, I haven't.

    I usually buy the card just under the top end for that particular generation. Every time I've bought ATI, it's always run much hotter and less stably than when I've bought Nvidia.

    I'm currently running an Nvidia card with no problems, whereas my last two ATI cards burnt out in about a year each, even though I don't overclock and I regularly clean out my PC.

  • Paradoxy, Member, Posts: 786

    Originally posted by Ceridith

    Originally posted by Paradoxy


    Originally posted by Ceridith

    ATI gives you the most bang for your buck in speed, however...

    ATI cards consume more power relative to the amount of speed you get, and as a result also run a good deal hotter. Driver support can also leave a lot to be desired. The combination of the last two issues can result in a good deal of instability, not to mention the card's lifespan being fairly short.

    Nvidia, on the other hand, does technically cost more for the processing power you get, but their cards tend to run cooler, have better driver support, run more stably, and last longer than ATI cards.

    If you'd prefer to save a bit of cash, but don't mind fiddling with drivers more often and have decent cooling on your PC, ATI is better.

    If you don't mind spending a little more but would prefer a card that doesn't require as much maintenance, Nvidia is better.

    I think you mixed the two up, because ATI consumes less power and runs cooler in comparison to Nvidia cards, which are power hogs and run hotter.

    Unless things have changed recently, I haven't.

    I usually buy the card just under the top end for that particular generation. Every time I've bought ATI, it's always run much hotter and less stably than when I've bought Nvidia.

    I'm currently running an Nvidia card with no problems, whereas my last two ATI cards burnt out in about a year each, even though I don't overclock and I regularly clean out my PC.

    http://www.anandtech.com/show/2977/nvidia-s-geforce-gtx-480-and-gtx-470-6-months-late-was-it-worth-the-wait-

    The power/heat situation also bears mentioning, since it often goes hand-in-hand with yield issues. With a 500mm2+ die on the 40nm process, it should come as no surprise that both the GTX 480 and GTX 470 are hot cards. NVIDIA has to pay the piper for having such a large die, and this is one of the places where they do so. The TDP for the GTX 480 is 250W while it’s 215W for the GTX 470; meanwhile the cards idle at 47W and 33W respectively. NVIDIA’s large die strategy usually leads to them having power-hungry parts, but from a historical perspective the GTX 480 is the hungriest yet for a single-GPU card; even the GTX 280 wasn’t quite as high. We’ll get into this more when we take a look at measured power consumption.

    I used to have a GTX 470; I changed it to an ATI 5870 and never looked back. Runs cooler and doesn't hog power.

    Who could have thought that WoW could bring a superpower like the USA to its knees?


    Originally posted by Arcken

    To put it in a nutshell, our society is about to hit the fan: grades are dropping, childhood obesity is going up, and the USA is going to lose its superpower status before too long. But hey, as long as we have a cheap method to babysit our kids, all will be well, no?
    I'm picking on WoW btw because it's the beast that made all of this possible.

  • Cleffy, Member Rare, Posts: 6,412

    O.O so many people getting facts from the Bullshit Prophet.  I based my post on facts.  Even though I am partial to AMD, I did not base my assessment on that, but rather on a poor business strategy by nVidia.  AMD cards do consume less power and generate less heat than comparable nVidia cards.  AMD cards do outperform nVidia cards per dollar.  AMD cards have had audio going through the card since the HD 2xxx series, based on the guidelines set for DX10.  AMD cards have higher processing power than competing nVidia cards (they are the only ones in gigaflop territory off one card).  AMD cards can generate more polys than competing nVidia cards.  This is not fanboyism, but sheer facts.  What nVidia has done in the last 3 years is build up walls like Apple; they have slowly closed off their market competitiveness by bogging down their cards with exclusive extras.  In PC hardware, exclusivity is not a positive thing.  As a result, they failed to target what matters in the GPU business: creating a card that is the best graphics processor, period.  Instead of earning the title through actual engineering, they have relied on their past reputation and underhanded tactics to hinder competing products in order to maintain their rather poor 30% market share.
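
    (To put a number on the processing-power point: the standard theoretical peak is shader ALUs x clock x 2, counting one fused multiply-add per ALU per cycle as two operations. A minimal Python sketch, using the vendors' published specs for that generation's flagships; peak figures say nothing about delivered performance, only raw arithmetic throughput.)

        # Theoretical single-precision peak: ALUs x clock x 2 ops per cycle,
        # counting a fused multiply-add as two floating-point operations.
        def peak_gflops(alus: int, clock_mhz: float) -> float:
            return alus * clock_mhz * 2 / 1000

        print(peak_gflops(1600, 850))   # Radeon HD 5870  -> 2720.0 GFLOPS
        print(peak_gflops(480, 1401))   # GeForce GTX 480 -> ~1345 GFLOPS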

    It's clear that nVidia made an error in its competition with AMD, and it's going to be a while until these errors are corrected.  The market has already spoken.  No one, no matter how much you like nVidia, can really justify spending more and getting less.

  • Quizzical, Member Legendary, Posts: 25,348

    Originally posted by Ceridith

    ATI gives you the most bang for your buck in speed, however...

    ATI cards consume more power relative to the amount of speed you get, and as a result also run a good deal hotter. Driver support can also leave a lot to be desired. The combination of the last two issues can result in a good deal of instability, not to mention the card's lifespan being fairly short.

    Your information is badly out of date.  Nvidia did win in performance per watt about four years ago, but today AMD wins handily in performance per watt if you're comparing Evergreen to Fermi, regardless of the market segment.  Some Fermi cards manage to go very low on idle power if you have only one monitor attached, but that's about all that can be said in their favor regarding power consumption.  See here, for example:

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/28.html
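
    Performance per watt is simply average benchmark frame rate divided by board power, so the claim is easy to check against any review. A minimal sketch; the FPS values below are invented placeholders, while the board powers are the cards' official TDPs, so substitute measured numbers from a review like the TechPowerUp link above:

        # FPS numbers here are invented; board powers are the official TDPs.
        cards = {
            "Radeon HD 5870":  {"avg_fps": 100.0, "board_power_w": 188},
            "GeForce GTX 480": {"avg_fps": 110.0, "board_power_w": 250},
        }
        for name, c in cards.items():
            print(f"{name}: {c['avg_fps'] / c['board_power_w']:.2f} FPS per watt")

    Even with the Nvidia card given higher raw FPS, the perf-per-watt ratio can still favor the lower-power part, which is the point being made here.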

    Both AMD and Nvidia have good Windows drivers for their consumer graphics cards.

  • ThomasN7, Member Common, Posts: 6,690

    I have had both Nvidia and ATI cards, and I must say that when both have worked properly I have found little to no difference at all in gaming performance. The biggest issue is price, and by far ATI gives you better value for your dollar. I bought an Nvidia GTX 465 this year and the card is just way too big, but I managed to make it fit. If I had to do it again I would have bought an ATI, because I would have saved money and wouldn't have had the trouble of making the card fit in my PC. It is all about preference really.

  • DataDay, Member Uncommon, Posts: 1,538

    ATI is currently the best brand to get. I used to be an "NVIDIA" fanboy of sorts, meaning that's the company I preferred and defended, but as time went on Nvidia got too full of themselves; they beat ATI in terms of latest-and-greatest technology and let it go to their heads. Now their GPUs are not all that good; the quality has dropped significantly. I know quite a few people, as well as tech labs, that use current-gen Nvidia cards, and not only are there driver issues, but the cards themselves flake out if not die altogether. All the ATI cards right now are more stable and solid. Though they do not have the image Nvidia has at the moment, they are putting more effort into their products in order to be industry leaders once more. Nvidia cheapened their already expensive product to make more money while not really pushing the technology on the hardware front, and at the same time they have been focusing too much on their little pet projects on the software side.

    The only reason Nvidia really got ahead was due to allowing third parties to make cards with their GPUs, whereas ATI kept it strictly first party.

    If you want the best product right now (stability, price, and performance), though perhaps lacking some of the gimmicky software features, then ATI is the safest investment.

    Like me, you don't have to stick with any one company; just go where the best product is, and right now that means ATI. Maybe once Nvidia gets their heads out of their rear and starts delivering better quality products I'll go with them, but not now.

  • Zekiah, Member Uncommon, Posts: 2,483

    5870 here, the first ATI I've ever owned. NVidia is easier to set up, with drivers and all, but in the past I always had issues with crashes with their cards. Since I got my 5870 almost a year ago, I think I've crashed maybe twice. I can't tell you how great that card has been for me; it's just incredible. I'll have no problem switching back to NVidia if they get things back into gear, but right now ATI has my vote.

    "Censorship is never over for those who have experienced it. It is a brand on the imagination that affects the individual who has suffered it, forever." - Noam Chomsky

  • Catamount, Member, Posts: 773

    Originally posted by SaintViktor

    I have had both Nvidia and ATI cards, and I must say that when both have worked properly I have found little to no difference at all in gaming performance. The biggest issue is price, and by far ATI gives you better value for your dollar.

    Right here; this is basically everything that needs to be said.

    For all the red herrings about the GPGPU capabilities of the GeForce cards, for the .001% of the market who want to build $1,000 computers just to have expensive folding@home boxes, or the 1% of the market (give or take) who actually care about Linux, especially among gamers, whose software is largely not even supported, the simple fact is that at the end of the day, the only thing the vast majority of customers for high-performance GPUs, ESPECIALLY on a gaming forum, care about is value for gaming.

    It isn't a case where AMD makes good gaming cards that suck at everything else, while Nvidia makes good gaming cards that are also good at everything else, thereby making them better. It's a case where AMD makes sleek, streamlined gaming cards that deliver the same level of performance with far lower complexity and manufacturing costs than a typical Nvidia card, while Nvidia doesn't even MAKE gaming cards: it takes GPGPUs that are mediocre at best at gaming and crams more and more transistors into them until it brute-forces them into being fast gaming cards, ending up with hot, power-hungry, convoluted designs that fail to compete in any way with AMD's offerings unless Nvidia drops them in price to the point of not making a profit.

     

    If Nvidia wants to sell GeForce GTX 460s at the cost of production, way below their intended price, just to compete with AMD's very profitable prices on the Radeon HD 6850, then there's no shame in taking advantage of it as a consumer. But this is about which company is generally better, and given that AMD can make a far cheaper and simpler card than Nvidia for a given level of gaming performance, due to making actual dedicated gaming cards instead of trying to fudge GPGPUs into that role, it means that overall AMD is the better company, in the better market position to offer the best deals for gamers, their primary target. If someone wants to hire me to build GPGPU workstations to model weather or climate, or simulate protein folding, then I'll doubtless use Nvidia GPUs, but this isn't NASA or a university school of medicine; it's MMORPG.com.

  • Shinami, Member Uncommon, Posts: 825

    The great masses of the people only think about buying a house to live in, while forgetting that a house must be built in order to be sold. Nvidia is what is used to BUILD and PLAY a game, while ATI can only PLAY a game. I am in the middle of creating a fantasy shooter, where ATI doesn't even give me 10 FPS. My GTX 480 gives me 37 FPS with physics + tessellation enabled.

     

    When I TEST the game (in short, actually run the level of the game after compiling) the two video cards give me similar framerates (GTX 480 and 6870). 

  • Quizzical, Member Legendary, Posts: 25,348

    Originally posted by Shinami

    Nvidia is what is used to BUILD and PLAY a game, while ATI can only PLAY a game. I am in the middle of creating a fantasy shooter, where ATI doesn't even give me 10 FPS. My GTX 480 gives me 37 FPS with physics + tessellation enabled.

    Congratulations, you figured out how to make a badly coded game.

    You really think that developers don't test their games on hardware from both major video card vendors?  Most early DirectX 11 games were developed on AMD hardware, simply because Nvidia didn't have DirectX 11 hardware until considerably later.

  • Vagrant_Zero, Member, Posts: 1,190


    Originally posted by Paradoxy

    Originally posted by galoa309

    I prefer nVidia.
    My point is that the raw numbers aren't the only thing that matters ...
    As far as I'm concerned, most games are better optimized for nVidia (they have more market share as well as more developer tools) ...

    Nope. I don't know where you are getting your info from; please don't try to pass your assumptions off as legit information.

    Nvidia does have a larger market share, so he wasn't assuming anything. http://store.steampowered.com/hwsurvey/

    Please don't speak on things you know nothing about. Ever.

  • Quizzical, Member Legendary, Posts: 25,348

    Yeah, Nvidia has a lot more market share among older cards, because they were well ahead three years ago.  AMD has more market share among newer cards.

    But that's if you restrict to gaming cards, of course.  If you count everything, then Intel has more market share, with their awful integrated graphics.

  • vanderghast, Member Uncommon, Posts: 309

    I got my first ATI card in a LONG time last year with the 5870.  I haven't had this many driver issues with games in a VERY long time.  Games either just didn't work or were missing features, etc.  Maybe I just play some weird games, I dunno.

     

    That said, I saw an article recently that compared Nvidia's and ATI's base image quality, and Nvidia at default settings was quite a bit better than ATI at defaults.  I'm not a brand-name type person who supports a single company; I thought the 5870 was a good buy at the time, but frankly I don't think I'd buy another ATI card.  In fact I just put my 5870 on eBay and am going to replace it with an Nvidia, and I'm also considering getting their 3D Vision setup; it could add some extra fun to games.

  • Quizzical, Member Legendary, Posts: 25,348

    Originally posted by vanderghast

    That said, I saw an article recently that compared Nvidia's and ATI's base image quality, and Nvidia at default settings was quite a bit better than ATI at defaults.

    You can find plenty of articles that say either side has better image quality than the other.  AMD and Nvidia are constantly squabbling about which settings for each card need to be chosen in order to be comparable.  Indeed, the pursuit of comparable settings is a big argument in favor of turning off anti-aliasing and anisotropic filtering entirely when comparing benchmarks.

    Did you make sure to update the drivers properly?  The early beta drivers had some problems, but those have long since been fixed, and there don't seem to still be widespread problems with the drivers of either side.  I halfway expect Cayman to change that for a while, but whatever problems show up there will be fixed soon enough.

  • noquarter, Member, Posts: 1,170


    Originally posted by vanderghast
    I got my first ATI card in a LONG time last year with the 5870. I haven't had this many driver issues with games in a VERY long time. Games either just didn't work or were missing features, etc. Maybe I just play some weird games, I dunno.

    That said, I saw an article recently that compared Nvidia's and ATI's base image quality, and Nvidia at default settings was quite a bit better than ATI at defaults. I'm not a brand-name type person who supports a single company; I thought the 5870 was a good buy at the time, but frankly I don't think I'd buy another ATI card. In fact I just put my 5870 on eBay and am going to replace it with an Nvidia, and I'm also considering getting their 3D Vision setup; it could add some extra fun to games.

    Yes, that issue is a red herring to me. In the latest driver update ATI added another quality level to the settings slider and changed the 'default' setting to have a couple more optimizations turned on than it used to.


    This caused very minor anomalies in 5+ year old games. No one has shown a single modern game with an IQ difference because of that change to the default slider position, and if you did notice an IQ drop you could always just put the slider up one notch, back to where it used to be under the old drivers.

    The issue is compounded by the fact that the nVidia drivers used in the article have a bug that causes them to use the highest-quality filtering settings even when they're supposed to be using default quality.

  • jvxmtg, Member, Posts: 371

    AMD processors have failed me so many times that I have boycotted their products since 2001. I used to buy ATI over Nvidia, but once AMD acquired ATI I stopped buying ATI. Nvidia's performance has improved, especially when built by eVGA.

     

    So right now, I'm running an Nvidia by eVGA playing Cata (tweaked using Precision).


    Ready for GW2!!!
  • Quizzical, Member Legendary, Posts: 25,348

    Originally posted by jvxmtg

    AMD processors have failed me so many times that I have boycotted their products since 2001. I used to buy ATI over Nvidia, but once AMD acquired ATI I stopped buying ATI. Nvidia's performance has improved, especially when built by eVGA.

    Processors are pretty resilient and virtually never fail unless overclocked or improperly cooled.  This is especially so because both Intel and AMD clock modern processors far slower than they're capable of running.  And anything before 2001 is ancient history in computer terms.

    You do realize that EVGA doesn't actually build any video cards, don't you?  EVGA handles marketing and warranty service and does some design, but they contract out to have others actually build their cards.  A lot of their cards are reference cards, even, which means that it's exactly the same card as other Nvidia board partners sell, but with a different sticker on it.  The sticker doesn't magically increase performance.

    Now, there are reasons why one might prefer to buy an EVGA card over, say, ECS or Sparkle.  Warranty service does matter.  But better performance isn't such a reason, unless perhaps you're looking at a particular factory overclocked card, but lots of companies do factory overclocks.

  • Cleffy, Member Rare, Posts: 6,412

    Originally posted by Vagrant_Zero

    Originally posted by Paradoxy

    Originally posted by galoa309

    I prefer nVidia.
    My point is that the raw numbers aren't the only thing that matters ...
    As far as I'm concerned, most games are better optimized for nVidia (they have more market share as well as more developer tools) ...

    Nope. I don't know where you are getting your info from; please don't try to pass your assumptions off as legit information.

    Nvidia does have a larger market share, so he wasn't assuming anything. http://store.steampowered.com/hwsurvey/

    Please don't speak on things you know nothing about. Ever.

    Actually, AMD has the larger market share in discrete graphics.  The Steam survey is a lagging indicator of market share.

    http://www.guru3d.com/news/amd-gpu-marketshare-declining-again/
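
    A toy model makes the lagging-indicator point concrete: a survey measures the installed base, which turns over slowly, so a vendor can win current sales for years before the survey reflects it. Every number below is invented for illustration:

        # Invented numbers: Nvidia starts with 65% of the installed base,
        # AMD wins 60% of new sales, and old cards retire over ~3 years.
        installed = {"nvidia": 65.0, "amd": 35.0}   # millions of cards
        sales_share = {"nvidia": 0.40, "amd": 0.60}
        quarterly_sales = 5.0                        # millions per quarter
        retirement = 1 / 12                          # fraction retired per quarter

        for quarter in range(1, 9):
            for vendor in installed:
                installed[vendor] *= 1 - retirement            # old cards drop out
                installed[vendor] += quarterly_sales * sales_share[vendor]
            share = installed["nvidia"] / sum(installed.values())
            print(f"Q{quarter}: survey would still show Nvidia at {share:.0%}")

    The printed survey share drifts from 65% toward the 40% sales share only a point or two per quarter, which is exactly why an installed-base survey lags current sales.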

  • Isturi, Member, Posts: 1,509

    I'm running an nVidia GeForce GTX 260 and I played the Cata beta just fine on ultra high. Heck, for that matter I've played Batman: Arkham Asylum on high. Actually, I don't think there is much that this card can't handle on the highest settings.


  • noquarter, Member, Posts: 1,170


    Originally posted by jvxmtg
    AMD processors have failed me so many times that I have boycotted their products since 2001. I used to buy ATI over Nvidia, but once AMD acquired ATI I stopped buying ATI. Nvidia's performance has improved, especially when built by eVGA.

    So right now, I'm running an Nvidia by eVGA playing Cata (tweaked using Precision).

    Weird, I've never had a processor itself die. A mobo, for sure. Back in those days a lot of CPUs did get damaged by heat sinks, though, due to the exposed core getting chipped or cracked during installation. That's why all CPUs have a protective heat spreader glued on now.

    I understand your position, but statistically Intel and AMD probably have equally low failure rates. CPUs are thoroughly binned, and it makes much more sense for both of them to bin a CPU as a cheaper stable model than as an unstable expensive model. That's why CPUs are the most reliable part of the system and the last thing I assume is wrong in a problem system.

  • Quizzical, Member Legendary, Posts: 25,348

    Originally posted by noquarter

    I understand your position, but statistically Intel and AMD probably have equally low failure rates. CPUs are thoroughly binned, and it makes much more sense for both of them to bin a CPU as a cheaper stable model than as an unstable expensive model. That's why CPUs are the most reliable part of the system and the last thing I assume is wrong in a problem system.

    Actually, I'd expect AMD processors to have a considerably higher failure rate than Intel processors, mainly because they're a lot more likely to be overclocked.  AMD's market share in desktops is much higher than in laptops or servers, and desktops are the only ones that get overclocked.  Furthermore, AMD caters more to enthusiasts who are inclined to overclock processors (see how many Black Edition processors AMD has, for example), while Intel probably has a higher market share in market segments where people wouldn't even consider overclocking the processor, such as businesses.  Now, Intel probably has a majority in nearly every market segment, but a 90% market share in a market segment where overclocking is out of the question versus 70% in a segment where a considerable fraction will be overclocked means that an AMD processor is more likely to be overclocked.

    If you restrict to desktop processors that are left at the stock speeds, I have no idea which would be more likely to fail.  If I had to guess, I'd say AMD processors were still more likely to fail, on the grounds that they tend to have less overclocking headroom.  Most such processor failures are probably really attributable to a bad motherboard or a bad power supply, too.  Excluding those, if it's 1 in 5000 will fail from one company and 1 in 10000 from the other, does the difference really matter?
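
    That closing question is easy to put in numbers. Using the hypothetical 1-in-5000 and 1-in-10000 rates from the paragraph above (not measured data), the chance of ever seeing a stock-clocked CPU die is tiny either way:

        # Hypothetical failure rates from the post above, not measured data.
        p_a, p_b = 1 / 5000, 1 / 10000
        n = 10  # CPUs bought over a lifetime of upgrades

        for label, p in (("1-in-5000 vendor", p_a), ("1-in-10000 vendor", p_b)):
            at_least_one = 1 - (1 - p) ** n
            print(f"{label}: {at_least_one:.3%} chance of ever seeing a failure")

    That works out to roughly 0.2% versus 0.1% over ten purchases: a factor of two, but both negligible.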

  • jvxmtg, Member, Posts: 371

    Originally posted by noquarter

    Originally posted by jvxmtg

    AMD processors have failed me so many times that I have boycotted their products since 2001. I used to buy ATI over Nvidia, but once AMD acquired ATI I stopped buying ATI. Nvidia's performance has improved, especially when built by eVGA.

    So right now, I'm running an Nvidia by eVGA playing Cata (tweaked using Precision).

    Weird, I've never had a processor itself die. A mobo, for sure. Back in those days a lot of CPUs did get damaged by heat sinks, though, due to the exposed core getting chipped or cracked during installation. That's why all CPUs have a protective heat spreader glued on now.

    I understand your position, but statistically Intel and AMD probably have equally low failure rates. CPUs are thoroughly binned, and it makes much more sense for both of them to bin a CPU as a cheaper stable model than as an unstable expensive model. That's why CPUs are the most reliable part of the system and the last thing I assume is wrong in a problem system.

    Not really sure what the real reason is, but I guess it's one of those moments where you have a Chevy, for example, that you expect to run properly, and then all of a sudden the engine fails. It convinces you that Chevy has failed you and that you're never going to buy from them again. It may or may not be reasonable, but I'm a consumer.

     

    It only takes one serving of bad food for me to boycott your restaurant and give you a bad rating, regardless of whether it's an isolated incident. This is why the free market is great: you don't have to keep eating the same crap as if there were no other choices.

     

    If Intel or nVidia failed me, which has not been the case so far, I'd move on also.


    Ready for GW2!!!
  • Catamount, Member, Posts: 773

    Originally posted by jvxmtg

    Originally posted by noquarter

    Originally posted by jvxmtg

    AMD processors have failed me so many times that I have boycotted their products since 2001. I used to buy ATI over Nvidia, but once AMD acquired ATI I stopped buying ATI. Nvidia's performance has improved, especially when built by eVGA.

    So right now, I'm running an Nvidia by eVGA playing Cata (tweaked using Precision).

    Weird, I've never had a processor itself die. A mobo, for sure. Back in those days a lot of CPUs did get damaged by heat sinks, though, due to the exposed core getting chipped or cracked during installation. That's why all CPUs have a protective heat spreader glued on now.

    I understand your position, but statistically Intel and AMD probably have equally low failure rates. CPUs are thoroughly binned, and it makes much more sense for both of them to bin a CPU as a cheaper stable model than as an unstable expensive model. That's why CPUs are the most reliable part of the system and the last thing I assume is wrong in a problem system.

    Not really sure what the real reason is, but I guess it's one of those moments where you have a Chevy, for example, that you expect to run properly, and then all of a sudden the engine fails. It convinces you that Chevy has failed you and that you're never going to buy from them again. It may or may not be reasonable, but I'm a consumer.

    It only takes one serving of bad food for me to boycott your restaurant and give you a bad rating, regardless of whether it's an isolated incident. This is why the free market is great: you don't have to keep eating the same crap as if there were no other choices.

    If there's one thing that years of retail work taught me, it's that consumers are stupid, and typically understand little to nothing about the product they're purchasing, or the process involved in getting that product to them. This is especially true with technology.

     

    You, however, already know that the failure rates are going to be low no matter which company you buy from. I'm sure you're more than able to figure out that your previous failure is in no way linked to subsequent hardware purchases, which means that statistically, you're as likely to get a "dud" piece of hardware from Intel or Nvidia as from AMD, and the previous hardware failure you experienced has no bearing on that, whatsoever. As such, it's illogical to base future purchasing decisions on a past event that's physically incapable of affecting said future purchases. It's equally illogical to try to judge an average of a company from a small sample size of products, especially when that sample size is one.

     

    ATI cards right now are simpler, faster devices that will, on average, offer far better value than Nvidia parts, and will, on average, give you no more trouble (be it failures, driver issues, etc.). Given the notorious "30%" of crashes that Nvidia drivers caused in Windows Vista in 2007 (three times as many as ATI drivers), I'd be inclined to say you might actually do statistically worse with Nvidia hardware, but that's really neither here nor there. What's important is that regardless of which company you buy from, 9,999 times out of 10,000 you'll pop their card/CPU in, install the drivers, and play games happily without ever experiencing a problem. Given this fact, it seems a tad silly, to say the least, to intentionally purchase parts that are an inferior value (at least with video cards; Intel makes fine CPUs) for what really boils down to no reason whatsoever, because your past experiences cannot have any bearing on the reliability of future cards (unless you believe in some sort of weird computer karma).

    As you said though, you're a "consumer", so you can buy things that are an inferior value if you want. That's the wonderful thing about free markets: you don't have to purchase the best product if you want to go with brand loyalty instead, because no one mandates that you purchase anything from anyone (public services excluded). That doesn't make it the smartest move you can make, however.

     

    This isn't like not going to a restaurant because they gave you a bad meal. It's like never again going to the best restaurant in town, as rated by everyone else who lives there, because one day a waiter slipped and spilled a glass of water on you. It shouldn't change the fact that they make the best food around.


    If Intel or nVidia failed me, which has not been the case so far, I'd move on also.

    Might I ask whom you'd move on to?

    I've had failures from all three companies. CPU failures are extremely rare, but I did once have a desktop Pentium 4 (Northwood) in a laptop, and that thing actually didn't live all that long (is it any wonder?). I once had the video memory die on a Radeon HD 5770 (it kept giving me BC116 errors, specifically "page fault in non-paged area" errors in the video driver). I once had a GeForce 6150 flat out blow up inside a laptop (a different laptop... my luck hasn't always been the best with them).

    This is why I think your approach is a bit silly. If I did things your way, what would I have left, Matrox video cards and Motorola CPUs? I don't think that would get me very far in high-performance computing :p

  • Vagrant_Zero, Member, Posts: 1,190


    Originally posted by Cleffy

    Originally posted by Vagrant_Zero

    Originally posted by Paradoxy

    Originally posted by galoa309

    I prefer nVidia.
    My point is that the raw numbers aren't the only thing that matters ...
    As far as I'm concerned, most games are better optimized for nVidia (they have more market share as well as more developer tools) ...

    Nope. I don't know where you are getting your info from; please don't try to pass your assumptions off as legit information.

    Nvidia does have a larger market share, so he wasn't assuming anything. http://store.steampowered.com/hwsurvey/
    Please don't speak on things you know nothing about. Ever.

    Actually, AMD has the larger market share in discrete graphics. The Steam survey is a lagging indicator of market share.
    http://www.guru3d.com/news/amd-gpu-marketshare-declining-again/

    That's only DX11 cards/cards sold in 2010, since Nvidia dragged their feet on Fermi. When you look at the bigger picture (i.e. NOT just 2010), Nvidia is still far in the lead thanks to their relative dominance starting from the 6800 series and ending with the 280s.

    Go reread your article, bro; the very first sentence points out that their info is based on 2010 sales.

    The Steam survey is just that: it takes everyone who is using Steam and counts whether they have Nvidia or ATI. Clearly, Nvidia is a good deal in the lead.

  • Quizzical, Member Legendary, Posts: 25,348

    For simplicity, let's regard anything before the GeForce 8000 and Radeon HD 2000 series as ancient history, and discard all Intel and VIA graphics entirely.  Those add up to about a 23% market share.

    The first generation after that was the GeForce 8000 series versus the Radeon HD 2000 series.  Nvidia won that series by an enormous margin, both because they had better products and because they launched the products on time.

    The next generation was the GeForce 9000/100 series versus the Radeon HD 3000 series.  Nvidia won by a huge margin here as well.  AMD got products out on time, but Nvidia simply had better cards.

    After that, we have the GeForce GTX 200 series versus the Radeon HD 4000 series.  AMD won this generation by a substantial margin, largely because Nvidia priced themselves out of much of the market, with their intended lower end derivative chips delayed by nearly a year.

    Next, we have the GeForce GT 200/300 series versus the Radeon HD 5000 series.  AMD won this generation by an enormous margin, as Nvidia basically didn't show up.  Nvidia was launching low end DirectX 10.1 cards that were borderline obsolete the day they launched as AMD was launching a full lineup of DirectX 11 cards.

    That brings us to present day, with the GeForce 400/500 series against the Radeon HD 6000 series.  Nvidia might be more competitive than they were the previous two generations, now that they seem to have figured out how to fix their Fermi architecture.  But Nvidia still trails way behind on performance per mm^2 and significantly behind in performance per watt.  Maybe they can keep a good chunk of market share by offering deep discounts on their cards (like what AMD is doing in the CPU arena right now), but outright winning the generation will be tough when they have a far inferior architecture.

    Furthermore, AMD has an entirely new class of products coming this generation that Nvidia has no answer for:  APUs.  Bobcat probably won't show up much in the Steam Hardware Survey, but Llano sure will, as it's going to be the first budget gaming laptop, which is a product that a lot of people seem to want to buy.  Because they lack an x86 license, Nvidia wouldn't be able to compete in this market segment even if they gave away their products for free.

    -----

    If you look at who has more cards out there, then yes, Nvidia has more than AMD.  But AMD is selling more now than Nvidia, and has more cards of the more recent generations out there.  I'm betting that older cards will tend to be retired from use before newer cards, and that alone means that if AMD and Nvidia were exactly even on sales from today onward, AMD would probably eventually pull ahead in market share on Steam.  Indeed, if you restrict to DirectX 10/11 systems, Nvidia has more market share than AMD on Steam, but not a lot more.  It's about a 53/47 split if you restrict to those two vendors.
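
    For reference, the 53/47 figure is just the two vendors' DX10/11 survey percentages renormalized to exclude everyone else. A quick sketch with placeholder percentages (the real survey numbers move month to month):

        # Placeholder percentages of all surveyed systems, not actual Steam data.
        nvidia_dx10_11 = 32.0
        amd_dx10_11 = 28.4

        total = nvidia_dx10_11 + amd_dx10_11
        print(f"Nvidia {nvidia_dx10_11 / total:.0%} / AMD {amd_dx10_11 / total:.0%}")
        # -> Nvidia 53% / AMD 47%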

    Let's not forget that AMD has some big advantages that will help in future generations, too.  For starters, they have a better architecture right now.  Evergreen beat Fermi handily, and Northern Islands will probably win by even more, as it's still competing against Fermi.  Now, it's easier to improve on a bad architecture than a good one, but it's also easier for the next architecture to be good if you already know how to make a good architecture because your previous one was.

    Additionally, there aren't APUs on the market just yet, but there are going to be an awful lot of them and very soon.  Intel will get a significant chunk of this market, but it's mostly going to be AMD showing up on Steam.  And Nvidia is locked out entirely, as they don't have an x86 license and ARM can't run Windows.

    So if you're a developer today working on a game that you expect to launch in 2012 (which means you know in the back of your mind that it will probably be delayed until 2013), who do you expect to have the most market share once the game is ready?  The real answer is that you optimize for both AMD and Nvidia cards.  But you'd be a fool to only focus on Nvidia cards and ignore AMD.
