
ATI Crossfire vs Nvidia SLI Performance Comparison Review

Dreadknot357 Member Posts: 148

http://www.overclockersclub.com/reviews/crossfire_vs_sli/

SLI won 36 out of 40 tests in Quad SLI vs Quad CF, and 37 out of 40 in the other categories.

Conclusions:

* SLI is best overall for multi-GPU performance

* SLI has better driver support and delivers more consistent performance than CF

* ATI HD 4870x2 CF has worst power consumption

* ATI’s drivers are frustrating

* Best bang for the buck is GTX 260-216 SLI

* GeForce offers much more than gaming perf: Folding@Home perf is better than ATI; 3D Vision; CUDA accelerates a variety of non-gaming apps

Key points I think people should read (marked in red in the review)

Conclusion:

ccokeman's Thoughts

So what did we learn from this little exercise? I learned that SLI is the more mature multi-GPU solution currently. In each of the three classes, the Nvidia technology and video cards lost no more than four times out of 40 tests, all things being equal - that being in the quad GPU class. Each of the other classes delivered a 37 to 3 margin of victory for team Green. Each set of cards was run at the default factory settings in the control panel to show what Joe Average will get out of a plug-and-play system. Could performance be improved by manipulating the settings in the control panel? Sure it could - on both counts - but this was about running what the system defaults to. The only deviation from this is in 3DMark Vantage, as the Nvidia cards have a distinct advantage in the PhysX tests. The reasoning is that Futuremark does not allow the scores when the GPU does the PhysX calculations, because ATI does not have an equivalent technology to compete right now.

Right now, SLI is the way to go for multi-GPU performance in the games tested in this comparison. The downside is that it will cost you a little bit more to get this performance, as ATI seems to have the pricing game down pat at the performance levels I looked at here. With just raw cost as a factor, the GTX 295 quad GPU setup will set you back just over $1100, while the HD 4870x2 combo will only (insert sarcasm here) set you back $800-$1000 depending on the cards you buy. The Sapphire cards will only set you back 400 bucks each, instead of the more popular pricing of 500 bucks. In the single GPU card class, the GTX 285 combo will cost you almost $700 - at a cozy $680 - while the HD 4890 combo only goes for $530, with prices scaling downward based on the video card's capabilities. That being said, the best value per frame per second delivered is the GTX 260 SLI combination at $3.44 per FPS at 1920x1200, and $4.59 per frame at 2560x1600. In most of the games tested, the GTX 260 SLI combo did not deliver the best performance, but it was able to hold its own. Raw performance goes to the Quad SLI GTX 295 combo.

For most of us, at some point the power bill gets to be a concern - especially when you run a distributed computing project on both your CPU and GPUs. The power company wants their piece of the pie just like everyone else. While our test systems are not the most power hungry, they pull a decent amount of power under load. The loser in the power consumption testing has to be the HD 4870x2 CrossFireX setup. With our systems and these two cards under load, the system pulled a total of 936 watts from the outlet. As a comparison, the Quad SLI setup used 757 watts from the mains. This pattern continued when the dual GPU cards were pulled out of multi-GPU mode, with the 4870x2 pulling 656 watts and the GTX 295 pulling 471 watts under load. In the third class, the GTX 285 SLI combo did pull more current than the HD 4890 combo, with the GTX 260 setup falling between the HD 4870 and HD 4850 CrossFire setups.
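Those wall-outlet numbers translate directly into money on the power bill. A minimal sketch of that math, using the load figures quoted above; the electricity rate and daily load hours are assumptions for illustration, not from the review:

```python
# Wall-outlet draw under load (watts), as measured in the review.
load_watts = {
    "HD 4870x2 CrossFireX": 936,
    "GTX 295 Quad SLI": 757,
    "HD 4870x2 single": 656,
    "GTX 295 single": 471,
}

RATE_PER_KWH = 0.12   # assumed electricity rate in $/kWh (hypothetical)
HOURS_PER_DAY = 4     # assumed hours under load per day (hypothetical)

def monthly_cost(watts, rate=RATE_PER_KWH, hours=HOURS_PER_DAY, days=30):
    """kWh consumed (watts/1000 * hours * days) times the rate."""
    return watts / 1000 * hours * days * rate

for setup, watts in sorted(load_watts.items(), key=lambda kv: -kv[1]):
    print(f"{setup}: {watts} W -> ${monthly_cost(watts):.2f}/month")
```

At those assumed rates, the 179-watt gap between the CrossFireX pair and the Quad SLI setup only amounts to a couple of dollars a month - it matters mostly if the cards spend serious hours folding.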

When you look at the scaling in performance you get from going to a quad GPU setup, it is not going to be anywhere near 100% in most cases. There were a few instances where the HD 4850x2 and HD 4870x2 scaled close to 100% - which was a real surprise - but that was the exception, not the rule. In turn, there were games where there was no scaling whatsoever with cards in CrossFireX, while the SLI combo scaled well. That is something a driver fix could address, to enable better CrossFire support in games. To use all this video horsepower, you will need a monitor that runs at no less than 1920x1200 resolution, as anything else means you are just throwing money away.   (THIS IS SOMETHING ALL YOU ATI FANS NEED TO KNOW!!)
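Scaling like that is easy to quantify: divide the multi-GPU frame rate by the number of GPUs times the single-GPU frame rate. A minimal sketch; the FPS numbers here are hypothetical, not taken from the review:

```python
def scaling_efficiency(single_fps, multi_fps, num_gpus):
    """Fraction of ideal linear scaling achieved: multi / (n * single)."""
    return multi_fps / (num_gpus * single_fps)

# Hypothetical numbers: one card renders 40 FPS, two cards render 72 FPS.
print(f"{scaling_efficiency(40, 72, 2):.0%}")   # 90% of ideal - good scaling
# No scaling at all: two cards deliver the same 40 FPS as one card.
print(f"{scaling_efficiency(40, 40, 2):.0%}")   # 50% - the second GPU did nothing
```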

The ideal solution is to go with that 30+ inch monitor you have been lusting over to take advantage of the firepower. Currently, Nvidia and its SLI technology are the performance winners here, on this system, with the drivers and video cards tested. The numbers are the numbers, and this is what they show. ATI has great price points, but they just can't deliver overpowering performance for that price. On the other hand, Nvidia delivers the performance, but at a steeper price. The old adage "you have to pay to play" comes to mind here. By paying, you have a GPU that is capable of doing much more than just creating pretty pictures for us to look at on that magic screen. With Nvidia's CUDA technology, you have a wide array of applications ready to take advantage of the parallel computing capabilities of its architecture: vReveal from Motion DSP, which allows you to clean up poorly shot video; Badaboom, for converting video to the most popular mobile formats; Folding@Home, where performance is just amazing; with more coming each week, it seems. You have GeForce 3D Vision to immerse you in the game. GPU acceleration in Photoshop! What more do you need? Drivers? Why yes indeed, Johnny. Nvidia seems to be more committed to delivering drivers almost as fast as I change my underwear (yes, it's a daily occurrence), while ATI is still stuck on a once a month schedule, and you need to hope and pray they work.  (THIS IS FUNNY) One thing that could overcome the performance problems is user-adjustable CrossFire profiles, so the performance is there. When there is no scaling, or negative scaling, a profile for that game may be all that's needed to excel, but currently that's not the case. Price is a point of difference, but there is more to it than a GPU that costs less but offers less. You don't pay Hyundai dollars and expect Corvette performance. Sometimes you get a surprise, but now it's ATI's turn to swing for the fences. The potential is there on the CrossFire side of the fence.



Bosco's Thoughts

To tap into what Frank said, what did I learn? That's an easy question to answer: consistency!

Nvidia has been very aggressive with getting their drivers to perform, either before a game is launched or on the launch date of the game. They have strived to be very aggressive, and it shows in our testing. When we all talk about multi-GPU performance, opinions vary on which company is better, and with ATI's driver issues it's kind of hard not to get frustrated. I have been very vocal about this in the past, and during our testing I was burning the phone lines up with Frank again complaining.

As Frank said, pricing is a big thing here, especially when it comes to running GTX 295s in Quad SLI, but really, if you are going to spend over $1000 on a 30" LCD, what's another $1000+ on video cards...seriously. If the situation were reversed, nobody could sit here and tell me that ATI would not charge $1100 for their cards. The reason is simple: they are cheaper because they offer inconsistent performance - pretty basic concept. Personally, I would not spend $800+ on ATI when I don't get the performance return out of it. What's the point? I might as well spend the extra $300 since I like Folding@Home, and I play a lot of games that scale well on Nvidia cards, like Dead Space for example. So why would I buy ATI? The answer is I wouldn't. I want a GPU that I can do more with than just gaming, and Nvidia has proven to have a ton more uses for their GPUs than ATI, as Frank said above. Sure ATI can fold, but nowhere near the level that Nvidia can, so it's a pretty cut and dried choice for me. I want the company that busts their asses to give you maximum performance without having to wait a month or two at a time for a driver update, if not longer, to add support for the game that you want to play now. To me, that is completely unacceptable, and this has been a complaint for a very, very long time from communities all over the Internet when it comes to CrossFire scaling. Who knows the reason for this, but they need to start working on their drivers to be more efficient, because in some games it's just embarrassing, and it has to be frustrating to owners.  (THIS IS WHY I LEFT ATI)

If you are running a CrossFire motherboard, what do you do? Simple - either buy a high-end Nvidia card such as the GTX 295, or run CrossFire. You may not get the same performance that you would with SLI, but you will still get an upgrade nonetheless. ATI cards are decent in certain games, no doubt about that, they are just not consistent across the board like I said earlier. My suggestion is, if you plan on doing a multi-GPU setup, do your research. Look at the games you want to play, and see how they scale. See where Nvidia and ATI are performance-wise, and make your purchase based on what you learn from reading and asking questions.

The idea of this article was to give people an idea as to where everything stands with different GPU setups. At the end of the day, you the user will have to decide exactly where your hard earned money is going to go. Our goal is to try and keep these types of articles coming for you as newer cards come out, so you can make good choices - because you can bet there are going to be more games like Crysis coming.

I know people will call me a fanboy, but it's my money. My Nvidia rigs are way more consistent than people's CrossFire rigs, and I get a lot more Folding@Home work units completed than the majority of people - and that's what matters to me.

As for me, my choice is a given - I have Green in my blood!

 

 

Price & Power

 

If you do any cruising through the video card threads on any number of web sites, the controversy is always price versus performance. Some people will always pay for higher performing cards, while the rest of the world looks for the balance and tries to get the most bang for their buck. People buying the top-end cards with their top-end prices is always going to happen. But when it comes to multi-GPU systems, the thought is usually on performance and not so much on cost. I remember my first multi-GPU system was with two ATI 1900XTs; the total cost was almost $1200 when the cards first came out. As an early adopter, I paid a premium for these cards - but at the time, this combination was pretty much top of the heap, until the Nvidia G80 cards came out and just cleaned house. Compared to that $1200+ price tag and the level of performance delivered, the highest-priced combo in this comparison comes in at just over 1100 bucks, more than $100 less than my 1900XT combo from just a few years back. Just thinking of how far performance has come, and the costs associated with that performance, over the last few years is astounding.



Cost will always be a concern, so along that line I decided to see what the cost per FPS would be for each combo. To do this, I will add up the total FPS per card at each resolution and divide that total by 18, the number of tests run where a result was given in FPS. The cost for the combo will then be divided by the average FPS to give a cost per FPS - effectively giving us a price versus performance comparison. Pricing will be the cards' current cost from Newegg. The measurement used will be dollars and cents.
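The method above boils down to two divisions. A minimal sketch; only the 18-test count comes from the article, while the FPS total and combo price are made-up illustration values:

```python
NUM_FPS_TESTS = 18  # tests in the review that reported a result in FPS

def cost_per_fps(total_fps, combo_price, num_tests=NUM_FPS_TESTS):
    """Average the FPS over the tests, then divide the combo price by it."""
    avg_fps = total_fps / num_tests
    return combo_price / avg_fps

# Hypothetical combo: $450 for the pair, 2340 total FPS across 18 tests
# (an average of 130 FPS).
print(f"${cost_per_fps(2340, 450):.2f} per FPS")  # $3.46 per FPS
```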

 

 

The most power consumed by any combination in this test is hands down the HD 4870x2 CrossFireX combo, weighing in at 936 watts. This is almost 200 watts higher than the GTX 295s in Quad SLI. The 4850x2, of course, had the lowest power consumption in the quad GPU category. When the x2 cards were run in single card mode, power consumption dropped as expected, with the GTX 295 still outperforming the HD 4870x2 - and coming close to using less power than the 4850x2. In the two GPU category, the GTX 285 SLI combo used the most power - but also delivered the highest performance. When it comes to pricing, the least expensive quad GPU setup is the HD 4850x2 CrossFireX combination, coming in at 520 bones. The GTX 295 is far and away the most expensive setup, but does prove its worth throughout the testing. Usually you get what you pay for. The least expensive dual GPU setup in this comparison is the Toxic HD 4850 setup at $260, and performance-wise the adage holds true - this combo delivered the lowest performance of the group. When it came time to figure out which combo offered the most bang for your dollar, the GTX 295 Quad SLI setup is just about out of the running at $9.10 per FPS at 2560x1600. Even with the exceptional performance, the price tag is a bit steep at $1120 for the pair of cards. The HD 4870x2 setup is $320 less expensive, and begins to compete with the other combinations. The hands down best bang for your buck, based on the testing I have done here, is the GTX 260-216 SLI setup, which costs less than any other combination based on total cost versus performance delivered.   (I have said this before)

I post this to further my points in other threads..lol and to start pulling back the BS and AMD hype that's going around... There are a lot of people buying hardware right now...and you need the truth...not hype..**

Also, there are physics in play here too.... Over 40+ titles where, going Nvidia, you get an extra bonus of improvements using CUDA and PhysX that can't be had with an ATI GPU.

The future may tell a different tune...But right now... Nvidia IS offering the best bang for your buck. And has no limits on physics-enabled games... coming out in the next year and more.

** All internet benchmarks and sites can be biased. But from my experience with ATI and Nvidia

this is dead on....

My last few GPUs:

GTX 295

GTX 285 SLI

GTX 280 SLI

8800 GTS OC SLI

My system now:

Antec 1200 full tower

i7 920 OC @ 4.2GHz on air

Cooler Master V8 CPU cooler

EVGA X58 SLI MB

Corsair DDR3 6GB 1600 9-9-9-24 1T

EVGA GTX 285 stock, in SLI

WD Raptor X 150GB x2 RAID 0

Corsair 1000W PSU

Creative Sound Blaster X-Fi Titanium Fatal1ty Champion

Pioneer Elite A/V receiver

Infinity 5.1 speakers

Samsung 26-inch SyncMaster LCD 1920x1200

Wacom Cintiq 21-inch digital art monitor

Logitech G15 keyboard

Logitech MX laser mouse

Logitech 5.1 surround

 

"Beauty is only skin deep..." said the AMD/ATI fan. "Blah..That's just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..


Comments

  • Sir_Drip Member Posts: 133

    You get what you pay for!

    I don't care if it's in-game physics or Nvidia PhysX! Nvidia cards are going to play both the way they were meant to be played! ATI will not have Nvidia PhysX for a long time! So.... ANY game that comes out, for the rest of this year and prolly the next, if you purchased an Nvidia card it's a win-win! ATI is out of luck! I don't mind paying an extra $20 for a card over the other manufacturer, as long as I can play ALL games the way they were meant to be PLAYED! Sad part is that all of you that purchased ATI and saved $20.00 cannot! You got ripped off for saving 20 bucks! LOL! And even if it was 20 bucks more on top of that, the Nvidia driver support is worth 20 bucks alone!

    Also look at the games that have Nvidia PhysX (CryEngine 3...Cryostasis, and Unreal Engine...Mirror's Edge).

    http://www.overclockersclub.com/reviews/crossfire_vs_sli/10.htm

    http://www.overclockersclub.com/reviews/crossfire_vs_sli/17.htm

     It's something more than just breaking glass! It's a pimp slapping!

    The one that gets me is "Dead Space". I don't believe this game is supported by Nvidia PhysX, but it acts as if it was.

    http://www.overclockersclub.com/reviews/crossfire_vs_sli/19.htm

    It prolly has some kind of physics in the game engine and the Nvidia cards are picking it up!? Or could it be that the game makers didn't say anything about the support? Whatever the case may be...I believe it's in the game engine.

     

     

     

    EDIT Peace...


  • Vortigon Member Uncommon Posts: 723

    I have an 8800GT with 1GB of memory.

    I have to say that PhysX is the biggest waste of space ever to enter the graphics market. It's buggy as hell, causes no end of compatibility issues with all sorts of games, and has almost no future in this business.

    It is purely a marketing ploy, and a badly implemented one at that. 99% of all games with physics do not use ANY part of the PhysX system. It is useless junk, and within a short time span we will see it removed from Nvidia cards, or entirely reworked from the ground up.

  • Sir_Drip Member Posts: 133
    Originally posted by Vortigon


    I have an 8800GT with 1GB of memory.
    I have to say that PhysX is the biggest waste of space ever to enter the graphics market. It's buggy as hell, causes no end of compatibility issues with all sorts of games, and has almost no future in this business.
    It is purely a marketing ploy, and a badly implemented one at that. 99% of all games with physics do not use ANY part of the PhysX system. It is useless junk, and within a short time span we will see it removed from Nvidia cards, or entirely reworked from the ground up.



     

    Sorry to hear that you got the 1GB 8800GT. The 512MB cards are so much faster when overclocked. Also, the 8800GTs are on their way out.

     

     EDIT...I'll come back to this one in a year or two...We'll see if you're still around!


  • Varlok91 Member Posts: 396
    Originally posted by Sir_Drip
     
    Sorry to hear that you got the 1GB 8800GT. The 512MB cards are so much faster when overclocked. Also, the 8800GTs are on their way out.
     
     EDIT...I'll come back to this one in a year or two...We'll see if you're still around!

     

    The extra 512 megs of memory doesn't slow down the card at all. Sure, the extra memory does not do much, but it does not make the card perform worse.

    --------------------------------
    Desktop - AMD 8450 Tri Core, 3 gigs of DDR2 800 RAM, ATI HD 3200 Graphics, Windows Vista Home Premium 64-bit
    Laptop (Dell Latitude E6400) - Intel P8400, 2 GIGs of RAM, Intel X4500, Windows XP Professional

  • Sir_Drip Member Posts: 133
    Originally posted by Varlok91

    Originally posted by Sir_Drip
     
    Sorry to hear that you got the 1GB 8800GT. The 512MB cards are so much faster when overclocked. Also, the 8800GTs are on their way out.
     
     EDIT...I'll come back to this one in a year or two...We'll see if you're still around!

     

    The extra 512 megs of memory doesn't slow down the card at all. Sure, the extra memory does not do much, but it does not make the card perform worse.

    No....the 512 meg cards overclock higher.

     


  • Erowid420 Member Posts: 93
    Originally posted by Dreadknot357


    http://www.overclockersclub.com/reviews/crossfire_vs_sli/
    SLI won 36 out of 40 tests in Quad SLI vs Quad CF, and 37 out of 40 in the other categories.
    Conclusions:

    * SLI is best overall for multi-GPU performance

    * SLI has better driver support and delivers more consistent performance than CF

    * ATI HD 4870x2 CF has worst power consumption

    * ATI’s drivers are frustrating

    * Best bang for the buck is GTX 260-216 SLI

    * GeForce offers much more than gaming perf: Folding@Home perf is better than ATI; 3D Vision; CUDA accelerates a variety of non-gaming apps
    Key POINTS I think that people should read (are in red)
    Conclusion:

    ccokeman's Thoughts
    So what did we learn from this little exercise? I learned that SLI is the more mature multi-GPU solution currently. In each of the three classes, the Nvidia technology and their video cards

    lost no more than four times out of 40 tests, with all things being equal - this being in the quad GPU class. Each of the other classes delivered a 37 to 3 margin of victory for team Green. Each set of cards was run at the default factory settings in the control panel to show what Joe Average will get out of a plug and play system. Could performance be improved upon by manipulating the settings in the control panel? Sure it could - on both counts, but this was about running what the system defaults to. The only deviation from this is in 3DMark Vantage, as the Nvidia cards share a distinct advantage in the PhsyX tests. The reasoning is that Futuremark does not allow the scores when the GPU does the PhysX calculations, because ATI does not have an equivalent technology

    to compete right now. Right now, SLI is the way to go for multi-GPU performance in the games tested in this comparison. The downside to this is that it will cost you a little bit more to get this performance, as ATI seems to have the pricing game down pat at the performance levels I looked at here. With just raw cost as a factor the GTX 295 quad GPU setup will set you back just over $1100, while the HD 4870x2 combo will only (insert sarcasm here) set you back $800-$1000 depending on the cards you buy. The Sapphire cards will only set you back 400 bucks each, instead of the more popular pricing of 500 bucks. In the single GPU card class, the GTX 285 combo will cost you almost $700 - at a cozy $680 - while the HD 4890 combo only goes for $530, with prices scaling downward based on the video card's capabilities. That being said, the best value per frame per second delivered is the GTX 260 SLI combination at $3.44 per FPS at 1920x1200, and $4.59 per frame at 2560x1600. In most of the games tested, the GTX 260 SLI combo did not deliver the best performance, but it was able to hold its own. Raw performance goes to the Quad SLI GTX 295 combo.
    For most of us, at some point the power bill gets to be a concern - especially when you run a distributed computing project on both your CPU and GPUs. The power company wants their piece of the pie just like everyone else. While our test systems are not the most power hungry, it pulls a decent amount of power under load. The loser in the power consumption testing has to be the HD 4870x2 CrossFireX setup. With our systems and these two cards under load, the system pulled a total of 936 watts from the outlet. As a comparison, the Quad SLI setup used 757 watts from the mains. This scenario continued when the dual GPU cards were pulled out of multi-GPU mode, with the 4870x2 pulling 656 watts, and the GTX 295 pulling 471 watts under load. In the third class, the GTX 285 SLI combo did pull more current than the HD 4890 combo, with the GTX 260 setup falling between the HD 4870 and HD 4850 CrossFire setups.
    When you look at the scaling in performance you get from going to a quad GPU setup, it is not going to be anywhere near 100% in most cases. There were a few instances where the scaling on the HD 4850x2 and HD 4870x2 scaled close to 100% - which was a real surprise. This was the exception, and not the rule. In turn, there were games where there was no scaling whatsoever with cards in CrossFireX, while the SLI combo scaled well. That's something that could be a driver fix, to enable better CrossFire support in games. To use all this video horsepower, you will need a monitor that runs no less than a 1920x1200 resolution, as anything else means you are just throwing money away.   (THIS IS SOMETHING ALL YOU ATI FANS NEED TO KNOW!!)
    The ideal solution is to go with that 30+ inch monitor you have been lusting over to take advantage of the firepower. Currently, Nvidia and its SLI technology is the performance winner here, on this system, with the drivers and video cards tested. The numbers are the numbers, and this is what they show. ATI has great price points, but they just can't deliver overpowering performance for that price. On the other hand, Nvidia delivers the performance but at a steeper price. The old adage "you have to pay to play" comes to mind here. By paying, you have a GPU that is capable of doing much more than just creating pretty pictures for us to look at on that magic screen. With Nvidia's CUDA technology, you have a wide array of applications ready to take advantage of the parallel computing capabilities of its architecture, such as vReveal from Motion DSP which allows you to clean up poorly shot video, Badaboom for converting video to most popular mobile formats, Folding@Home where performance is just amazing, with more coming each week, it seems. You have GeForce 3D Vision to immerse you in the game. GPU Acceleration in Photoshop! What more do you need? Drivers? Why yes indeed, Johnny. Nvidia seems to be more committed to delivering drivers almost as fast as I change my underwear (yes, it's a daily occurrence), while ATI is still stuck on a once a month schedule, and you need to hope and pray they work.  (THIS IS FUNNY) One thing that could overcome the performance problems is having user adjustable CrossFire profiles so the performance is there. When there is no scaling or negative scaling, a profile for that game may be all that's needed to excel, but currently that's not the case. Price is a point of difference, but there is more to it than a GPU that costs less but offers less. You don't pay Hyundai dollars and expect Corvette performance. Sometimes you get a surprise, but now its ATI's turn to swing for the fences. 
The potential is there on the CrossFire side of the fence.


    Bosco's Thoughts
    To tap into what Frank said, what did I learn? That's an easy question to answer: consistency!
    Nvidia has been very aggressive with getting their drivers to perform, either before a game is launched or on the launch date of the game. They have strived to be very aggressive, and it shows in our testing. When we all talk about multi-GPU performance, opinions vary on which company is better, and with ATI's driver issues it's kind of hard not to get frustrated. I have been very vocal about this in the past, and during our testing I was burning the phone lines up with Frank again complaining.
    As Frank said, pricing is a big thing here, especially with it comes to running GTX 295s in Quad SLI, but really, if you are going to spend over $1000 on a 30" LCD, what's another $1000+ on video cards...seriously. If the situation were reversed, nobody could sit here and tell me that ATI would not charge $1100 for their cards. The reason is simple, they are cheaper because they offer inconsistent performance - pretty basic concept. Personally, I would not spend $800+ on ATI when I don't get the performance return out of it. What's the point? I might as well spend the extra $300 since I like Folding@Home, and I play a lot of games that scale well on Nvidia cards, like Dead Space for example. So why would I buy ATI? The answer is I wouldn't. I want a GPU that I can do more with than just gaming, and Nvidia has proven to have a ton more uses for their GPUs than ATI, as Frank said above. Sure ATI can fold, but nowhere near the level that Nvidia can, so it's a pretty cut and dried choice for me. I want the company that busts their asses to give you maximum performance without having to wait a month or two at a time for a driver update, if not longer, to add support for the game that you want to play now. To me, that is completely unacceptable, and this has been a complaint for a very, very long time from communities all over the Internet when it comes to CrossFire scaling. Who knows the reason for this, but they need to start working on their drivers to be more efficient, because in some games it's just embarrassing, and it has to be frustrating to owners.  (THIS IS WHY I LEFT ATI)
    If you are running a CrossFire motherboard, what do you do? Simple - either buy a high-end Nvidia card such as the GTX 295, or run CrossFire. You may not get the same performance that you would with SLI, but you will still get an upgrade nonetheless. ATI cards are decent in certain games, no doubt about that, they are just not consistent across the board like I said earlier. My suggestion is, if you plan on doing a multi-GPU setup, do your research. Look at the games you want to play, and see how they scale. See where Nvidia and ATI are performance-wise, and make your purchase based on what you learn from reading and asking questions.
    The idea of this article was to give people an idea as to where everything stands with different GPU setups. At the end of the day, you the user will have to decide exactly where your hard earned money is going to go. Our goal is to try and keep these types of articles coming for you as newer cards come out, so you can make good choices - because you can bet there are going to be more games like Crysis coming.
    I know people will call me a fanboy, but it's my money. My Nvidia rigs are way more consistent than people's CrossFire rigs, and I get a lot more Folding@Home work units completed than the majority of people - and that's what matters to me.
    As for me, my choice is a given - I have Green in my blood!

     
     
    Price & Power
     
    If you do any cruising through the video card threads on any number of web sites, the controversy is always price versus performance. Some people will always pay for higher performing cards, while you have the rest of the world who look for the balance and try to get the most bang for their buck. People buying the top-end cards with their top-end prices is always going to happen. But, when it comes to multi-GPU systems, the thought is usually on performance and not so much on cost. I remember my first multi-GPU system was with two ATI 1900XT's; the total cost was almost $1200 when the cards first came out. As an early adopter, I paid a premium for these cards - but at the time, this combination was pretty much top of the heap, until the Nvidia G80 cards came out and just cleaned house. Thinking about that $1200+ price tag and the level of performance delivered, the highest price combo in this comparison comes in at just over 1100 bucks, more than $100 less than my 1900XT combo from just a few years back. Just thinking of how far performance has come, and the costs associated with that performance, over the last few years is astounding.


    Cost will always be a concern, so along that line I decided to see what the cost per FPS would be for each combo. To do this, I will add up the total FPS per card at each resolution and divide that total by 18, the amount of tests run where a result was given in FPS. The cost for the combo will then be divided by the average FPS to give a cost per FPS - effectively giving us a price versus performance comparison. Pricing will be the cards' current cost from Newegg. The measurement used will be dollars and cents.
     
     
    The most power consumed by any combination in this test is hands down the HD 4870x2 CrossFireX combo, weighing in at 936 watts. This is almost 200 watts higher than the GTX295 in Quad SLI. The 4850x2, of course, had the lowest power consumed in the quad GPU category. When the x2 cards were run in single card mode, power consumption dropped as expected, with the GTX 295 still outperforming the HD 4870x2 - and almost using less power than the 4850x2. In the two GPU category, the GTX 285 SLI combo used the most power - but also delivered the highest performance. When it comes to pricing, the least expensive quad GPU setup is the HD 4850x2 CrossFireX combination, coming in at 520 bones. The GTX 295 is far and away the most expensive setup, but does prove its worth throughout the testing. Usually you get what you pay for. The least expensive dual GPU setup in this comparison is the Toxic HD 4850 setup at $260, and performance-wise the adage holds true - as this combo delivered the lowest performance of the group. When it came time to figure out which combo offered the most bang for your dollar, the GTX 295 Quad SLI setup is just about out of the running at $9.10 per FPS at 2560x1600. Even with the exceptional performance, the price tag is a bit steep at $1120 for the pair of cards. The HD 4870x2 setup is $320 less expensive, and begins to compete with the other combinations. The hands down best bang for your buck based on the testing I have done here is the GTX 260-216 SLI setup, which costs less than any other combination based on total cost versus performance delivered.   (I have said this before)


    I post this to further my points in other threads..lol, and to start pulling back the BS and AMD hype that's going around... There are a lot of people buying hardware right now... and you need the truth, not hype.**
    Also, there are physics in play here too.... Over 40+ titles where going Nvidia gets you an extra bonus of improvements using CUDA and PhysX that can't be done with an ATI GPU.
    The future may tell a different tune... but right now, Nvidia IS offering the best bang for your buck, and there's no shortage of PhysX-enabled games coming out in the next year and beyond.
    ** All internet benchmarks and sites can be biased, but from my experience with ATI and Nvidia
    this is dead on....
    My last few GPUs:
    GTX 295
    GTX 285 SLI
    GTX 280 SLI
    8800 GTS OC SLI
    My system now
    Antec 1200 full tower

    i7 920 OC @ 4.2 GHz on air

    Cooler Master V8 CPU cooler

    EVGA X58 SLI MB

    Corsair DDR3 6 GB 1600 MHz 9-9-9-24 1T

    EVGA GTX 285 stock in SLI

    WD Raptor X 150 GB x2 RAID 0

    Corsair 1000 watt PSU

    Creative Sound Blaster X-Fi Titanium Fatal1ty Champion

    Pioneer Elite A/V receiver

    Infinity 5.1 speakers

    Samsung 26 inch SyncMaster LCD, 1920x1200

    Wacom Cintiq 21 inch digital art monitor

    Logitech G15 keyboard

    Logitech MX laser mouse

    Logitech 5.1 surround
     



     

    Thanks for the link...  it's good to know that $2,116 worth of Nvidia cards (quad SLI) is the winner of that review, because I would hate to see a $299 offering do 70% of that....

    Oh wait....   lol.

     

    Gotta love fanbois who don't understand bang for the buck! (Seeing that most of us upgrade every 10 months or so and don't buy four video cards all at once...lol)

    ___________________________

    - Knowledge is power, I've been in school for 28 years!

  • Dreadknot357Dreadknot357 Member Posts: 148
    Originally posted by Erowid420

    Thanks for the link...  it's good to know that $2,116 worth of Nvidia cards (quad SLI) is the winner of that review, because I would hate to see a $299 offering do 70% of that.... Oh wait....   lol. [...]

    $2,116 worth of Nvidia cards?  Quad?

    You buy hardware?

    "Beauty is only skin deep..." said the AMD/ATI fan. "Blah.. That's just what ugly people say..." said the Intel/Nvidia fan. You want price/performance? Use the dollar menu..

  • csthaocsthao Member UncommonPosts: 1,121

    Does anyone ever really need QUAD GPU's? Seriously...I'm still only using 1 GPU and am happy with it.

  • Dreadknot357Dreadknot357 Member Posts: 148
    Originally posted by csthao


    Does anyone ever really need QUAD GPU's? Seriously...I'm still only using 1 GPU and am happy with it.



     

    Yes, it can be needed. It's just like someone adding 200 horsepower to their car when the speed limit is 65 (if you want the power, it's there when you need it).

    This review is not just about the quads, it's about GPUs and companies, and what is best for the money.

    If you have a high-rez monitor over 1920x1200, you need more power to run higher settings in game...

    So yes... there are still games on the market that cannot be played at uber-high rez and max settings @ 60 frames (which is the goal for most)....

    And most killer setups can still have problems....

     

    What rez and games are you playing?

     


  • csthaocsthao Member UncommonPosts: 1,121
    Originally posted by Dreadknot357

    Originally posted by csthao

    Does anyone ever really need QUAD GPU's? [...]

    What rez and games are you playing?

     



     

    1440x900. I'm not playing any games right now, but I used to play VG, EQ, WoW, and Warhammer. I'm using an 8600 GT, which is pretty much good enough for me. It does enough to make me happy; I'm not too serious about graphics. I can still play those games on somewhat high settings (with a few tweaks) and still get good performance, about 20-40 fps.

  • Loke666Loke666 Member EpicPosts: 21,441

    First: Stop quoting the whole post when it is really long; cut out the parts you aren't answering. Otherwise the thread is a nightmare to read.

    Secondly, do any MMOs even support regular SLI? I kind of doubt it. This is not the right place to compare these things; you just make people believe these setups give better performance in MMOs, but for those of us who don't play FPS games it won't help. We need to find a single card with maximum performance instead.

    I ran a lot of games in SLI a year ago: no difference whatsoever from a single card (in MMOs, mind you; FPS games did get better performance). Clocking up my GFX cards gave a 5-10% increase, so I gave up, got rid of my two 8800 GTS cards and bought a 280 GTX instead. The rise in performance was very impressive.

    I only rarely play FPS games.

  • Dreadknot357Dreadknot357 Member Posts: 148
    Originally posted by Loke666


    Do any MMOs even support regular SLI? I kind of doubt it. [...] I only rarely play FPS games.



     

    I got a 50% performance boost in AoC DX10 with SLI.

    When I had one GTX 285 I was getting trouble in this one spot in Thunder River, on an overlook of the valley -

    about 13~32 frames avg.... I was pissed. I figured the AoC DX10 beta was the problem (and it mostly was).

    I got a second GTX 285 for SLI and brute forced that game into giving me my frames!... I was sitting at 60~74 avg.

    I got about a 20% boost in WoW.....so yea. (Not that I need it... but in the middle of Stormwind it didn't lag as bad.)

    The rise you got was because that 280 card is a beast...lol. I know, I had two of them; I swapped out two 8800 GTS OC 640s

    for one GTX 280, then got the second one a week later.

    Right now there are not many MMOs that need SLI...

    But the newer MMOs coming out (Unreal 3 engine) will see a great gain from SLI... and since you can get SLI or CF for the price of some single-card solutions, it's a good buy.

    And all of this is if you need to max your settings... (which is the only way I can play; graphics mean everything to me, I'm an artist).

     

     

     


  • Sir_DripSir_Drip Member Posts: 133
    Originally posted by Dreadknot357

    Originally posted by Erowid420

    Thanks for the link...  it's good to know that $2,116 worth of Nvidia cards (quad SLI) is the winner of that review [...]

    $2,116 worth of Nvidia cards?  Quad?

    You buy hardware?

    Yea... he needs to learn to READ! And ADD! 28 years WASTED!

    Retarded is retarded, it is just that!... No matter how many years you try to teach them... they're still retarded!


  • Erowid420Erowid420 Member Posts: 93
    Originally posted by Sir_Drip


    You get what you pay for!
    I don't care if it's in-game physics or Nvidia PhysX! Nvidia cards are going to play both the way they were meant to be played! ATI will not have Nvidia PhysX for a long time! So.... ANY game that comes out for the rest of this year and probably the next, if you purchased an Nvidia card it's a win-win! ATI is out of luck! I don't mind paying an extra $20 bucks for a card over the other manufacturer, as long as I can play ALL games the way they were meant to be PLAYED! Sad part is that all of you that purchased ATI and saved $20.00 cannot! You got ripped off for saving 20 bucks! LOL! And even if it was 20 bucks more on top of that, the Nvidia driver support is worth 20 bucks alone!
    Also look at the games that have Nvidia PhysX (Cry3 engine... Cryostasis, and Unreal engine... Mirror's Edge).
    http://www.overclockersclub.com/reviews/crossfire_vs_sli/10.htm
    http://www.overclockersclub.com/reviews/crossfire_vs_sli/17.htm
    It's something more than just breaking glass! It's a pimp slapping!
    The one that gets me is "Deep Space." I don't believe this game is supported by Nvidia PhysX, but it acts as if it was.
    http://www.overclockersclub.com/reviews/crossfire_vs_sli/19.htm
    It probably has some kind of physics in the game engine and the Nvidia cards are picking it up!? Or could it be that the game makers didn't say anything about the support? Whatever the case may be... I believe it's in the game engine.
     
     
     
    EDIT Peace...



     

    Unfortunately, that is not true... as everyone knows, you need a second Nvidia card (SLI) to make real use of PhysX in those games.

    Secondly, the PhysX effects that do work with one Nvidia card are ancillary "effects" that add fluff, like realistic glass breaking. You and your friend both know this. Odd how you keep posting this marketing stuff; perhaps you are sponsored by Nvidia..?

    ATI earlier this year bowed out of using their cards to enhance physics in games, because the industry is moving towards Havok, OpenCL and other proprietary CPU-based physics (Project Offset, Infernal Engine). As already mentioned a thousand times over in other threads.

    CUDA is great for other things, but you can't run it and gain something for free while the video card itself is trying to render the game. That is what you fail to understand. You still need several Nvidia cards to use physics and keep smooth, stutter-free gameplay. Several benchmark programs didn't compensate for CUDA, thus early reviews showed gains. Newer high-end cards can handle ancillary physics (using PhysX) in games. But again, do you really need to spend $100 more just to have glass break more realistically... or wait and just spend that added money later?

    Since you gain nothing in actual gameplay.

     

     

    BTW, Overclockersclub is not an objective site and their reviews are extremely biased. Even long-time members question some of their latest reviews. Other, more reputable sites have contradicted several of their recent reviews. If this is where the 55SIX boys are getting ALL of their information, I can understand their myopic perspective.

    Nothing wrong with Nvidia's cards, or even CUDA, but their actual use and performance gain over ATI is nil, dollar for dollar. PhysX and Nvidia's marketing partners are an orchestrated gimmick to sell more video cards. You don't get anything for free. Of course, if you are playing a nifty pinball game that doesn't stress your video card, it might be able to handle PhysX.

     

     


  • Dreadknot357Dreadknot357 Member Posts: 148
    Originally posted by Erowid420




     
    Unfortunately, that is not true... as everyone knows you need a second Nvidia card (SLI) to make use of PhysX in those games. [...] BTW, Overclockersclub is not an objective site and their reviews are extremely biased. [...]
     
     



     

    One: that was a link I stumbled upon... I hate benchmarks and websites... I like to go on my own hands-on experience.

    But I'm sure they know more than all of us...

     

    If you read anything, you would notice I said to watch out for sites that can be biased... and that could be true...

    But with my "real experience" with this hardware... it's dead on.

     

    And before you try to act like you know anything about hardware...

    Answer the question I asked before.

     "it's good to know that $2,116 worth of Nvidia cards (quad SLI) is the winner of that review, because I would hate to see a $299 offering do 70% of that"

    What do you mean, $2,116 worth of Nvidia cards quad SLI?

     

    And a $299 offering doing 70% of that....? What?

    I don't need any other answers, or BS about anything else; I just want the answer to these questions....


  • viralzviralz Member Posts: 78

    sli/crossfire is just a marketing ploy. enjoy 15 extra fps for double the money.


  • Sir_DripSir_Drip Member Posts: 133
    Originally posted by viralz


    sli/crossfire is just a marketing ploy. enjoy 15 extra fps for double the money.



     

    Depends on the game.


  • Dreadknot357Dreadknot357 Member Posts: 148
    Originally posted by viralz


    sli/crossfire is just a marketing ploy. enjoy 15 extra fps for double the money.



     

    someone  has no idea  what they are talking about...lol


  • JackcoltJackcolt Member UncommonPosts: 2,170

    Had you taken a performance review test about a year ago, ATI would have won, with the 4850 in CF simply giving the best bang for the buck (according to Tom's Hardware). But then again, everybody knows that Overclockers Club are nVidia fanboys. Let me give you another opinion on price/performance as of May: www.tomshardware.com/reviews/radeon-geforce-graphics,2296.html

    ATI wins most ranges or at least ties them. I agree with a lot of what's said on Overclockers Club, but the fact is, unless you have a lot of dough to spend, you'll still get the best price/performance at most tiers with an ATI solution. Of course, if you want GPGPU, then CUDA is the way to go as of now.

    Furthermore, ATI is working with Intel to get hardware support for Havok, which right now is the most widespread physics SDK.


  • Dreadknot357Dreadknot357 Member Posts: 148
    Originally posted by Jackcolt


    Had you taken a performance review test about a year ago, ATI would have won [...] everybody knows that Overclockers Club are nVidia fanboys [...]



     

    So you are saying OC fudged the benchmarks and the prices...?


  • Sir_DripSir_Drip Member Posts: 133
    Originally posted by Erowid420

    [...]

    "Unfortunately, that is not true... as everyone knows you need a second Nvidia card (SLI) to make use of PhysX in those games."

     

     

    REPLY - NOPE! You are wrong once again. You will need to go back to Nvidia's website and read it AGAIN! Hint: there are 4 configurations/setups that all run Nvidia PhysX.

     

    "Secondly, the PhysX that does work with 1 Nvidia card is ancillary 'effects' that add fluff, like realistic glass breaking. You and your friend both know this. Odd why you keep posting this marketing stuff, perhaps you are sponsored by Nvidia..?"

     

    REPLY - Yep... you guessed at it again! And wrong again! You will need to go back to Nvidia's website and read it AGAIN! Nvidia PhysX is Nvidia PhysX... there are not different versions for single-card users! The single-card setup just doesn't offload the CPU as much as a two-card setup.

     

    "BTW, Overclockersclub is not an objective site and their reviews are extremely biased. Even long time members question some of their latest reviews. Other, more reputable sites have counterdicted several of their recent reviews. If this is where the 55SIX boys are getting ALL of their information, I can understand their myopic perspective."

     

    Got the Overclockers Club link from Tom's Hardware's site. You say Overclockersclub is extremely biased; sounds about right coming from an ATI fan. Tom's Hardware's GPU comparison doesn't give you the test bed specs, the driver version used for the test, nor whether PhysX was on or off! Now who would you think is extremely biased? At least Overclockers Club gives you all that needed information.


  • Erowid420Erowid420 Member Posts: 93
    Originally posted by Dreadknot357

    [...]

    What do you mean, $2,116 worth of Nvidia cards quad SLI?

    And a $299 offering doing 70% of that....? What?

     

    Obviously that was a typo... it was $1,116... duh (as the GTX 295 costs about $550 a piece...).

    Yes, and for whatever performance those two GTX 295s offer, 70% of that performance can be had with an HD 4850x2 for $299. Maximum frame rates do not matter; you just don't want low frame rates or stutter. Quad SLI is pointless unless you fold.

    Plus, who really wants to spend $1,200 every 10 months? I understand you don't care about that, because you are sponsored by Nvidia and get your stuff for free. But most people aren't, therefore we buy with bang for the buck in mind.

    Currently, it's pretty hard to beat the HD 4770, HD 4870 (1GB), HD 4890 (1GB) or the HD 4850x2. Nvidia's top-of-the-line cards have fallen nearly 50% in price in just 3 months and have lost several OEM partners. Plus, only an idiot would buy an SLI setup this close to the DirectX 11 cards coming out.
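    The bang-for-the-buck argument above can be made concrete. Taking the claimed figures at face value (a $1,116 quad SLI setup versus a $299 card said to deliver 70% of its performance - both numbers from this thread, not independently verified), the cheaper card's performance per dollar works out to roughly 2.6x higher:

    ```python
    # Value comparison using the figures claimed in this thread.
    # "Performance" is normalized so the quad SLI setup = 1.0.
    quad_sli_price, quad_sli_perf = 1116.00, 1.00
    hd4850x2_price, hd4850x2_perf = 299.00, 0.70   # claimed: 70% of quad SLI

    value_quad = quad_sli_perf / quad_sli_price     # performance per dollar
    value_4850 = hd4850x2_perf / hd4850x2_price

    print(round(value_4850 / value_quad, 2))  # ≈ 2.61x the performance per dollar
    ```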

     


  • Erowid420Erowid420 Member Posts: 93
    Originally posted by Sir_Drip

    Originally posted by Erowid420

    Originally posted by Sir_Drip


    You get what you pay for!
    I dont care if its In-game physics or Nvidia Physx! Nvidia Cards are going to play both the way they were ment to be played! ATI will not have Nvidia Physx for a long time! So.... ANY game that comes out, for the rest of this year and prolly the next, If you purchased a Nvidia card It's a Win win! ATi is out of luck! I dont mind paying an exrta $20 bucks for a card over the other manufacture, as long as I can play ALL games the way they were ment to be PLAYED! Sad part is that all you that purchased ATI and saved $20.00 can not! You got ripped off for saving 20 bucks! LOL! And even if it was 20 bucks more ontop of that, The Nvidia Driver support is worth 20 bucks alone!
    Also look at the games that have Nvidia.Physx. (Cry3 engine...Cryostasis and Unreal engine....Mirrors Edge).
    http://www.overclockersclub.com/reviews/crossfire_vs_sli/10.htm
    http://www.overclockersclub.com/reviews/crossfire_vs_sli/17.htm
     It's something more than just breaking glass! It's a pimp slapping!
    The one that get me is "Deep Space" I don't beleave that this game is supported by Nvidia Physx, but acts as if it was.
    http://www.overclockersclub.com/reviews/crossfire_vs_sli/19.htm
    It prolly has some kind of phisics in the game engine and the Nvidia cards are picking it up!? Or could it be that the game makers didnt say anything about the support? What ever the case may be...I beleave it's in the game engine.
     
     
     
    EDIT Peace...



     

    Unfortuneatly, that is not true... as everyone knows you need a second Nvidia card (SLI) to make use of PhysX in those games.

    Secondly, the physX that do work with 1 Nvidia card are ancillary "effects" that add fluff, like realistic glass breaking. You and your friend both know this. Odd why you kep posting this marketing stuff, perhaps you are spondored by Nvidia..?

    ATI earlier this year bowed out of using their cards to enhance physics in games, because the industry is moving towards Havok, OpenCL and other proprietary CPU based physics (Project Offset, Infernal Engine). As already mentione a thousand times over in other threads.

    CUDA is great for other things, but you can't run it and gain something for free, while the video card itself, is trying to render the game. That is what you fail to understand. You still need several Nvidia cards to use physics without stuttering or or smooth gameplay. Several benchmark programs didn't compensate for CUDA, thus early reviews showed gains. Newer reviews of high-end cards can handle ancillary physic (using PhysX) in the games. But again, do you really need to spend $100 more, just to have glass break more realistically... or wait and just spend that added money later?

    Since you gain nothng in actual game play.

     

     

    BTW, Overclockersclub is not an objective site and their reviews are extremely biased. Even long time members question some of their latest reviews. Other, more reputable sites have counterdicted several of their recent reviews. If this is where the 55SIX boys are getting ALL of their information, I can understand their myopic perspective.

    Nothing wrong with Nvidias card, or even CUDA, but their actual use and performance gain over ATi is nil, dollar for dollar. As PhysX and Nvidias Marketing partners are just an orchestrated gimmick to sell more of their videa cards. You don't get anything for free. Off coarse, if you are playing a nifty pinball game, that doesn't stress your video card, it might be able to handle PhysX.

     

     

    "Unfortuneatly, that is not true... as everyone knows you need a second Nvidia card (SLI) to make use of PhysX in those games."

     

     

    REPLY - NOPE! You are wrong once again. You will need to go back to Nvidia's website and read it AGAIN! Hint: there are 4 configurations/setups that all run Nvidia PhysX.

     

    "Secondly, the PhysX effects that do work with one Nvidia card are ancillary "effects" that add fluff, like realistic glass breaking. You and your friend both know this. It's odd that you keep posting this marketing stuff; perhaps you are sponsored by Nvidia?"

     

    REPLY - Yep... you guessed at it again! And wrong again! You will need to go back to Nvidia's website and read it AGAIN! Nvidia PhysX is Nvidia PhysX... there aren't different versions for single-card users! The single-card setup just doesn't offload the CPU as much as a two-card setup.

     

    "BTW, OverclockersClub is not an objective site and their reviews are extremely biased. Even long-time members question some of their latest reviews. Other, more reputable sites have contradicted several of their recent reviews. If this is where the 55SIX boys are getting ALL of their information, I can understand their myopic perspective."

     

    Got the OverclockersClub link from Tom's Hardware's site. You say OverclockersClub is extremely biased. Sounds about right, coming from an ATI fan. Tom's Hardware's GPU comparison doesn't give you the test bed specs, the driver version used for the test, nor whether PhysX was on or off! Now who would you think is extremely biased? At least OverclockersClub gives you all that needed information.



     

    I do not need to go to Nvidia's website... when almost every review of PhysX has proven it. You can either turn on all the PhysX effects and slow down your frame rates, or dial it back to ancillary "effects" and keep better FPS.

    You don't get something for free... the video card either has to use all of its cores for rendering the graphics, or use a percentage for graphics and the rest for PhysX... so the more you dedicate to PhysX, the less graphical power you have to render the game. That's how it works... that's the fundamental concept and design of Nvidia's CUDA... which is the middleware that PhysX runs on.
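    The split described above can be sketched as back-of-the-envelope arithmetic. This is a toy illustration only, assuming a purely linear trade-off between physics and rendering work (real GPU scheduling is more complicated):

```python
# Toy model (not real CUDA code): if a single GPU splits its compute
# between rendering and PhysX, frame rate scales roughly with the
# share of the GPU left for rendering.

def effective_fps(base_fps, physx_share):
    """base_fps: frame rate with 100% of the GPU on rendering.
    physx_share: fraction of GPU compute diverted to PhysX (0.0 to <1.0)."""
    if not 0.0 <= physx_share < 1.0:
        raise ValueError("physx_share must be in [0, 1)")
    return base_fps * (1.0 - physx_share)

# A card that renders 60 FPS, giving a quarter of its shaders to physics:
print(effective_fps(60, 0.25))  # 45.0
```

    Under this (oversimplified) model, the only way to keep both full frame rate and physics effects is a second card dedicated to PhysX, which is the point being argued.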

     

    That is why you need a SECOND card to do any sort of "effects" without stuttering or slowing down your games. This is very well hidden by Nvidia, and it is the reason they never demo with just one video card. If they do, the demo is of just "effects" like glass breaking, etc.

    Nvidia isn't going to come out and tell you that, because it's all marketing.

     

    ___________________________

    - Knowledge is power, I've been in school for 28 years!

  • Dreadknot357 Member Posts: 148
    Originally posted by Erowid420

    Originally posted by Dreadknot357




     
    One... that was a link that I stumbled upon... I hate benchmarks and websites... I like to do my own hands-on exp.
    But I'm sure they know more than all of us...
     
    If you read anything, you would notice I said to watch out for sites that can be biased... and this could be true...
    But with my "real exp" with this hardware... it's dead on.
     
    And before you try and act like you know anything about hardware...
    Answer the question that I asked before. 
     "it's good to know that $2,116 worth of Nvidia cards (quad SLI) is the winner of that review, because I would hate to see a $299 offering do 70% of that"
    What do you mean, $2,116 worth of Nvidia cards quad SLI?

     
    And a $299 offering doing 70% of that...? What?
    I don't need any other answers, or BS about anything else; I just want the answers to these questions....



     

     

    Obviously that was a typo... it was $1,116... duh. (As the GTX 295 costs $550 apiece...)

    Yes, and for whatever performance those two GTX 295s offer, 70% of that performance can be had with an HD 4850 X2 for $299. Maximum frame rates do not matter; you just don't want low frame rates or stutter. Quad SLI is pointless, unless you fold.

    Plus, who really wants to spend $1,200 every 10 months? I understand you don't care about that, because you are sponsored by Nvidia and you get your stuff for free. But most people aren't, so we buy with bang for the buck in mind.

    Currently, it's pretty hard to beat the HD 4770, HD 4870 (1GB), HD 4890 (1GB), or the HD 4850 X2. Nvidia's top-of-the-line cards have fallen nearly 50% in just 3 months and have lost several OEM partners. Plus, only an idiot would buy an SLI setup this close to the DirectX 11 cards coming out.
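    The bang-for-the-buck argument both sides keep circling (and the dollars-per-FPS figures in the review) comes down to simple cost-per-frame arithmetic. A minimal sketch, where the setup prices and frame rates below are placeholders for illustration, not benchmark results:

```python
# Dollars per average FPS: lower means better value per frame.
def cost_per_fps(setup_price_usd, avg_fps):
    """setup_price_usd: total price of the (multi-)GPU setup.
    avg_fps: average frame rate across the tested games."""
    if avg_fps <= 0:
        raise ValueError("avg_fps must be positive")
    return round(setup_price_usd / avg_fps, 2)

# Hypothetical numbers: a $299 setup at 70 FPS vs a $1,116 setup at 100 FPS.
print(cost_per_fps(299, 70))    # 4.27 dollars per frame
print(cost_per_fps(1116, 100))  # 11.16 dollars per frame
```

    By this metric a cheaper setup can win on value even while losing every raw benchmark, which is exactly the 70%-of-the-performance-for-a-quarter-of-the-price point being made.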

     



     

    lol Nice try..... you thought it took 4 GTX 295s to make Quad SLI... lol

    You run around talking hardware... and you know all about tech????

    you stated:

    "Thanks for the link... it's good to know that $2,116 worth of Nvidia cards (quad SLI) is the winner of that review, because I would hate to see a $299 offering do 70% of that....

    Oh wait.... lol."

     

    Gotta love fanbois who don't understand bang for the buck! (Seeing that most of us upgrade every 10 months or so and don't buy 4 video cards all at once... lol)"



    Anyone as far up on their game as you claim to be... would know that the GTX 295 is a dual-GPU card... lol

    You're going to tell people the difference between hardware, when you have now proven you don't know shit...

    You have been following me around MMORPG trying to debunk me,

    and now you F'd up

    You're not sidestepping this one...

    Don't even say that was a typo!!... You were adamant about how your point was going to stump us... seeing how the Nvidia quad and the ATI quad are in the same price range, and the GTX 295 quad's performance is one card class higher than the ATI quad.

    There is no way that statement would have made sense...

    No one would have acted that way... (unless you thought it was 4 cards × $529 = $2,116)

    And the GTX 295 is not $550; it runs below or above that... but it is $529. And $2,116 (the figure you posted) divided by 4 is exactly $529...

    You're a clown and full of shit... and it took me 4 threads and 2 weeks to prove it, but the proof is now in front of everyone's faces...

    Take a walk, punk... you're nobody... and don't know shit about tech... you're just another Google copy-and-paste guy.

     

     

    "Beauty is only skin deep..." said the AMD/ATI fan. "Blah... that's just what ugly people say..." said the Intel/Nvidia fan. You want price/performance, use the dollar menu..

  • Sir_Drip Member Posts: 133
    Originally posted by Dreadknot357

    *snip*

     Yeah... kind of reminds me of an old joke... "How to torture Helen Keller."

     

