

Graphics Card Upgrade Help


Comments

  • HrimnirHrimnir Member RarePosts: 2,415
    Originally posted by 13lake
    Originally posted by Quizzical
    Upon further review, your current power supply doesn't have the appropriate PCI-E power connectors to hook up a Radeon R9 290.  But that's redundant, really, as the GeForce GTX 970 will hook up to it just fine.

    You didn't check thoroughly enough or accidentally overlooked it; the HEC 550TB, aka ADATA BN-550, has 2x 6/8-pin PCI-E connectors on different cables, which is exactly what a 290 needs.

    The exact PSU has two 12V rails, the first rated at an overstated 28 amps and the second at 20 amps. The combined rated wattage for both rails (probably a single rail split into two) is 480W, and 480W/12V = 40A, so both rails together have access to an actual 40A rather than 48A.

    The R9 290's TDP is 275W, and 275W/12V ≈ 22.9A, so at max usage there are about 17 amps or 205 watts of headroom, which isn't a lot; I usually prefer to have at least half the value of the card's TDP in amps of headroom, to account for PSU degradation and non-premium components, but that headroom is plenty and would do the job fine.

    Did you forget the 130W CPU and the various other components in the system?  No way in hell is that PSU going to reliably power a 290; if he did that, it would be a ticking time bomb.
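    The 12 V budget being argued over can be sketched as a quick back-of-envelope check. This uses the posts' figures; treating the CPU as fully 12 V-fed is a simplifying assumption (modern boards feed the CPU VRM from the 12 V EPS connector), and the 84 W CPU figure is the one Quizzical gives later in the thread.

    ```python
    # Back-of-envelope 12 V budget for the HEC 550TB + R9 290 debate above.
    RAIL_12V_W = 480        # combined 12 V rating claimed for the HEC 550TB
    GPU_TDP_W = 275         # Radeon R9 290 TDP
    CPU_TDP_W = 84          # CPU TDP figure cited later in the thread

    rail_amps = RAIL_12V_W / 12                 # available amps on the 12 V rails
    gpu_amps = GPU_TDP_W / 12                   # card draw at TDP
    cpu_amps = CPU_TDP_W / 12                   # CPU draw at TDP
    headroom = rail_amps - gpu_amps - cpu_amps  # what's left for everything else

    print(f"rail {rail_amps:.1f} A, GPU {gpu_amps:.1f} A, CPU {cpu_amps:.1f} A")
    # → rail 40.0 A, GPU 22.9 A, CPU 7.0 A
    print(f"headroom {headroom:.1f} A ({headroom * 12:.0f} W)")
    # → headroom 10.1 A (121 W)
    ```

    With the CPU counted against the 12 V rails, only about 10 A is left for everything else, which is the crux of the objection above.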

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • 13lake13lake Member UncommonPosts: 718

    Nvidia's failures in the past 2 months so far:

    1) The 3.5GB + 0.5GB GTX 970 fiasco

    2) Locking down and forbidding both overclocking and underclocking on 900-series mobile GPUs with the latest driver

    3) A leaked mobile G-Sync driver, which works without a G-Sync board in the monitor

    In the meantime, AMD has done the following:

    1) Arguably the best driver they have ever released: the AMD Catalyst 14.12 OMEGA driver

    Oh the irony, ...

  • 13lake13lake Member UncommonPosts: 718
    Originally posted by Hrimnir
    Originally posted by 13lake
    Originally posted by Quizzical
    Upon further review, your current power supply doesn't have the appropriate PCI-E power connectors to hook up a Radeon R9 290.  But that's redundant, really, as the GeForce GTX 970 will hook up to it just fine.

    You didn't check thoroughly enough or accidentally overlooked it; the HEC 550TB, aka ADATA BN-550, has 2x 6/8-pin PCI-E connectors on different cables, which is exactly what a 290 needs.

    The exact PSU has two 12V rails, the first rated at an overstated 28 amps and the second at 20 amps. The combined rated wattage for both rails (probably a single rail split into two) is 480W, and 480W/12V = 40A, so both rails together have access to an actual 40A rather than 48A.

    The R9 290's TDP is 275W, and 275W/12V ≈ 22.9A, so at max usage there are about 17 amps or 205 watts of headroom, which isn't a lot; I usually prefer to have at least half the value of the card's TDP in amps of headroom, to account for PSU degradation and non-premium components, but that headroom is plenty and would do the job fine.

    Did you forget the 130W CPU and the various other components in the system?  No way in hell is that PSU going to reliably power a 290; if he did that, it would be a ticking time bomb.

    The other components don't use the 12V rail either exclusively or even primarily; don't jump the gun. The 12V rail has always been, and still is, reserved primarily for the GPU.

    What do the 3.3V and 5V rails do? Collect dust? Play poker?

    You can even connect the card to an even lower-wattage PSU which lacks a second 6-pin PCI-E connector; why do you think GPU manufacturers include the elusive Molex to 6-pin adapter in almost every GPU box?

    It's so you can tap into the combined wattage/amperage of the 3.3V and 5V rails in extreme situations and make it work that way.

  • QuizzicalQuizzical Member LegendaryPosts: 22,687
    Originally posted by 13lake
    Originally posted by Quizzical
    Upon further review, your current power supply doesn't have the appropriate PCI-E power connectors to hook up a Radeon R9 290.  But that's redundant, really, as the GeForce GTX 970 will hook up to it just fine.

    You didn't check thoroughly enough or accidentally overlooked it; the HEC 550TB, aka ADATA BN-550, has 2x 6/8-pin PCI-E connectors on different cables, which is exactly what a 290 needs.

    The exact PSU has two 12V rails, the first rated at an overstated 28 amps and the second at 20 amps. The combined rated wattage for both rails (probably a single rail split into two) is 480W, and 480W/12V = 40A, so both rails together have access to an actual 40A rather than 48A.

    The R9 290's TDP is 275W, and 275W/12V ≈ 22.9A, so at max usage there are about 17 amps or 205 watts of headroom, which isn't a lot; I usually prefer to have at least half the value of the card's TDP in amps of headroom, to account for PSU degradation and non-premium components, but that headroom is plenty and would do the job fine.

    HEC says otherwise:

    http://hecgroupusa.com/home/hec550tb/

    It even shows a picture.  That's a 6-pin connector, not 6+2-pin.

    Rebranded power supplies with the same internal hardware don't always have exactly the same cable configuration.

  • 13lake13lake Member UncommonPosts: 718

    That's the lamest thing ever; it's on par with Gigabyte's sub-$100 motherboards losing power phases and the second BIOS in a new revision while the box still says Dual BIOS. ...

    Does the ADATA have it or not? Which revisions of the HEC have it or not? How can you even figure it out if you don't have the PSU in your hands? ...

  • QuizzicalQuizzical Member LegendaryPosts: 22,687
    Originally posted by RollieJoe

    Appreciate the replies folks.  I went ahead and ordered:

    http://www.amazon.com/MSI-GTX-970-4GD5T-OC/dp/B00NN0GIA0/ref=sr_1_2?ie=UTF8&qid=1424140353&sr=8-2&keywords=msi+gtx+970

    Which is an MSI 970 that I think is slightly overclocked, for $320 + tax after rebate. Hopefully it was a good choice. I had $200 in Amazon credit, which is why I went with Amazon over Newegg.

    One quick follow-up to some of the replies, though: I know my case is extremely dated and doesn't have the best airflow. But if, under prolonged max load for the applications/games I run, my CPU and GPU both stay within safe, standard temp ranges, never anything even close to throttle/shutdown range, does it really matter? From my understanding, keeping my processor at 50C instead of 60C under load isn't going to improve its performance at all, and won't extend its lifespan in any meaningful way, for example.

    The CPU and GPU aren't the only things in your computer that don't like excessive heat.  They get most of the attention, as they put out most of the heat.  But there are a bunch of other chips elsewhere on the motherboard and video card that don't like to get too hot, either.  Most of them don't have fans attached and don't have much of a heatsink, but they'll be fine from general case airflow so long as you don't do something stupid like put a high-powered CPU and GPU into a case that doesn't have much airflow.

    On the GeForce GTX 590, for example, the GPUs were cooled well enough, but the VRMs sure weren't, and that led to cards not even surviving the initial reviews.  If you don't have a stupid design on the motherboard or video card, it really only takes having the ambient air in the case decently close to room temperature to keep everything cool.  But if you're not going to do that, you should worry about problems with the motherboard overheating--and there are a whole bunch of chips, capacitors, and other electronic circuitry that can independently cause trouble.

    It might work out fine for you, just as running your CPU and GPU at 90 C every day might still work fine for you.  But higher temperatures mean a greater probability of problems for most of the important components in a computer other than the hard drives.

    Throttling clock speeds for temperature is a last-ditch resort for when things have gone horribly wrong and the chip doesn't want to fry within seconds.  Staying just below the threshold at which that happens doesn't mean you're safe for long-term use.

  • 13lake13lake Member UncommonPosts: 718

    Haha Quizzical, check this out, you're gonna laugh your ass off:

    http://livedoor.blogimg.jp/admtan/imgs/4/6/462d839b.jpg

    Forgot the second pic: http://blog-imgs-51.fc2.com/a/d/m/administratan/IMG_6037.jpg

    This version has 1x 6-pin and 1x 6+2-pin, and it seems the ADATA rebrand is the one that has 2x 6/8-pin.

    This is the last time I give HEC the benefit of the doubt; thanks for pointing it out.

  • QuizzicalQuizzical Member LegendaryPosts: 22,687
    Originally posted by Hrimnir
    Originally posted by 13lake
    Originally posted by Quizzical
    Upon further review, your current power supply doesn't have the appropriate PCI-E power connectors to hook up a Radeon R9 290.  But that's redundant, really, as the GeForce GTX 970 will hook up to it just fine.

    You didn't check thoroughly enough or accidentally overlooked it; the HEC 550TB, aka ADATA BN-550, has 2x 6/8-pin PCI-E connectors on different cables, which is exactly what a 290 needs.

    The exact PSU has two 12V rails, the first rated at an overstated 28 amps and the second at 20 amps. The combined rated wattage for both rails (probably a single rail split into two) is 480W, and 480W/12V = 40A, so both rails together have access to an actual 40A rather than 48A.

    The R9 290's TDP is 275W, and 275W/12V ≈ 22.9A, so at max usage there are about 17 amps or 205 watts of headroom, which isn't a lot; I usually prefer to have at least half the value of the card's TDP in amps of headroom, to account for PSU degradation and non-premium components, but that headroom is plenty and would do the job fine.

    Did you forget the 130W CPU and the various other components in the system?  No way in hell is that PSU going to reliably power a 290; if he did that, it would be a ticking time bomb.

    His CPU has a TDP of 84 W, and a decent chunk of that is really for the integrated graphics that aren't even going to be used.

  • QuizzicalQuizzical Member LegendaryPosts: 22,687
    Originally posted by Hrimnir
    Originally posted by Ridelynn

     


    Originally posted by yaminsux
    In my personal experience, I wouldn't touch Radeon with a 10-foot pole. I have lost two top-of-the-line Radeons (at their respective release dates) due to overheating.

     

    That being said, go 970. It'll fit your rig like a glove, no hassle.


     

    Because nVidia never overheats.

    Next we'll be hearing about how the drivers are so much better. Or Physx...

    /sigh

    The drivers ARE better; there have been literally MOUNTAINS of evidence to support that.  AMD only recently got their shit together, in the last couple of years, after they were FORCED to fix their drivers when some prominent people in the industry discovered they were half-assing things and it was causing micro-stuttering.

    As far as Nvidia cards not overheating, that hasn't been a problem since the 4xx-series cards; every card Nvidia has released since then has been free of heat issues.  Now, statistically speaking, yes, people are going to get defective products, but that affects both Nvidia and AMD, so that's a moot point.

    I used to be a hardcore AMD video card fan, but my entire group of friends and I stopped using them after several of us had hardware or driver issues with AMD cards MULTIPLE times.  I gave ATI a chance for several generations and had consistent issues with them, switched to Nvidia, and haven't looked back since.

    In my entire history of video cards (which goes back to the days when video cards had 8MB of RAM), the ONLY driver-related issue I've had with an Nvidia card was back in the FX-series days: I had a couple of cards which for some reason wouldn't ramp the fan up past 70% with the stock driver, so I had to use a utility to set the fan curve manually.  That was almost 10 years ago.  Since then, zero issues.

    That being said, I don't necessarily see anything wrong with AMD cards; they generally represent good value for price/performance, but the tradeoff is heat generated and power usage.

    PhysX, on the other hand, was and is a joke.  God, I remember when companies were selling separate $200 PhysX cards, talking about how it would be the wave of the future and such.

    Because only Nvidia gives you driver problems like this:

    http://www.tomshardware.com/news/Nvidia-196.75-drivers-over-heating,9802.html

    Really, though, any problems with ATI drivers are basically ancient history at this point.  Even if one vendor did have clearly superior drivers 10 or 15 years ago, is that still a good reason to buy from that vendor today?

    For heat, AMD's GCN isn't terribly different from Nvidia's Kepler.  Nvidia's Maxwell is better, but that's recent and a newer generation.  AMD is promising that Carrizo will bring a new GPU architecture and AMD's biggest energy efficiency improvements ever.  That might just be marketing bluster, but it's also quite possible that Nvidia's energy efficiency advantage will be short-lived--just as advantages that various GPU vendors have had in many previous generations ended when the competitor released its new generation.

    Of course, if you want to buy a card today, you don't care what AMD is going to launch in a few months.  Kind of like how when I bought my current computer in October 2009, AMD was way ahead because it was a comparison of Radeon HD 5850/5870 versus various GeForce 200 series parts that didn't have DirectX 11/OpenGL 4 support.  Nvidia would catch up there several months later.

    -----

    Doing physics computations via OpenCL on the integrated GPU may make some sense if you're rendering the game on a discrete card.  But not enough that I'm aware of any game that actually does that.  Context switching between CUDA and DirectX or OpenGL every single frame on a single GPU is what makes no sense.

  • HrimnirHrimnir Member RarePosts: 2,415
    Originally posted by Quizzical
    Originally posted by Hrimnir
    Originally posted by Ridelynn

     


    Originally posted by yaminsux
    In my personal experience, I wouldn't touch Radeon with a 10-foot pole. I have lost two top-of-the-line Radeons (at their respective release dates) due to overheating.

     

    That being said, go 970. It'll fit your rig like a glove, no hassle.


     

    Because nVidia never overheats.

    Next we'll be hearing about how the drivers are so much better. Or Physx...

    /sigh

    The drivers ARE better; there have been literally MOUNTAINS of evidence to support that.  AMD only recently got their shit together, in the last couple of years, after they were FORCED to fix their drivers when some prominent people in the industry discovered they were half-assing things and it was causing micro-stuttering.

    As far as Nvidia cards not overheating, that hasn't been a problem since the 4xx-series cards; every card Nvidia has released since then has been free of heat issues.  Now, statistically speaking, yes, people are going to get defective products, but that affects both Nvidia and AMD, so that's a moot point.

    I used to be a hardcore AMD video card fan, but my entire group of friends and I stopped using them after several of us had hardware or driver issues with AMD cards MULTIPLE times.  I gave ATI a chance for several generations and had consistent issues with them, switched to Nvidia, and haven't looked back since.

    In my entire history of video cards (which goes back to the days when video cards had 8MB of RAM), the ONLY driver-related issue I've had with an Nvidia card was back in the FX-series days: I had a couple of cards which for some reason wouldn't ramp the fan up past 70% with the stock driver, so I had to use a utility to set the fan curve manually.  That was almost 10 years ago.  Since then, zero issues.

    That being said, I don't necessarily see anything wrong with AMD cards; they generally represent good value for price/performance, but the tradeoff is heat generated and power usage.

    PhysX, on the other hand, was and is a joke.  God, I remember when companies were selling separate $200 PhysX cards, talking about how it would be the wave of the future and such.

    Because only Nvidia gives you driver problems like this:

    http://www.tomshardware.com/news/Nvidia-196.75-drivers-over-heating,9802.html

    Really, though, any problems with ATI drivers are basically ancient history at this point.  Even if one vendor did have clearly superior drivers 10 or 15 years ago, is that still a good reason to buy from that vendor today?

    For heat, AMD's GCN isn't terribly different from Nvidia's Kepler.  Nvidia's Maxwell is better, but that's recent and a newer generation.  AMD is promising that Carrizo will bring a new GPU architecture and AMD's biggest energy efficiency improvements ever.  That might just be marketing bluster, but it's also quite possible that Nvidia's energy efficiency advantage will be short-lived--just as advantages that various GPU vendors have had in many previous generations ended when the competitor released its new generation.

    Of course, if you want to buy a card today, you don't care what AMD is going to launch in a few months.  Kind of like how when I bought my current computer in October 2009, AMD was way ahead because it was a comparison of Radeon HD 5850/5870 versus various GeForce 200 series parts that didn't have DirectX 11/OpenGL 4 support.  Nvidia would catch up there several months later.

    -----

    Doing physics computations via OpenCL on the integrated GPU may make some sense if you're rendering the game on a discrete card.  But not enough that I'm aware of any game that actually does that.  Context switching between CUDA and DirectX or OpenGL every single frame on a single GPU is what makes no sense.

    All fair points, but come on now: you're comparing one iteration of a driver which had an issue to an extended, multi-year history of (well-documented) driver issues, as recent as late 2013 with the whole FCAT/micro-stuttering fiasco, where they found out AMD had been half-assing things for several years.  That's outside of the random other minor BS they had prior to that, like booting up and finding your desktop at a different resolution than when you turned it off, or random settings in the driver not saving properly, etc.

    Again, like you said, it's mostly in the past, but that guy playing his fanboi card like Nvidia had just as many issues is really stretching.  Have Nvidia had issues? Yes, but compared to AMD in the past, well, it's not a comparison.

    As far as the energy efficiency thing, I'll believe it when I see it.  AMD has been making promises like that for ages in both the GPU and CPU markets and has yet to deliver on anything of the sort.  One of the main reasons they had to refocus on the entry-to-mid-level CPU market to be profitable is that they couldn't produce CPUs that competed with Intel on the high end without going thermonuclear and requiring an insanely large die, which of course meant more heat, more power draw, etc.

    Do I have an obvious bias towards Nvidia/Intel? Yes.  But mine has been based on almost 20 years of personal experience, building, fixing, and troubleshooting PCs for me and about 10-15 friends over the years.  It's just been far too common an occurrence to have an issue with an AMD/ATI product for me to believe we just happen to be statistical anomalies that aren't the norm.

    Either way, people can take their own gambles; I can only give the information as best I know it, and people should be adult enough to make their own decisions.

    For the record, about the only thing I am irrational about is the console vs. PC thing.  I don't care how good they make consoles; I refuse to buy or support them purely on principle.  Would I likely buy an AMD product in the future? No, but I'm not completely opposed to the idea.

    Edit: Actually, I lied a bit; I did buy an HD 5450 a while back as an HTPC card because all the reviews suggested it was far better suited to that task, and aside from a few minor niggles back in the 13.x driver series, where it really hated saving certain image settings, it's been perfect.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • gervaise1gervaise1 Member EpicPosts: 6,919
    Originally posted by Quizzical
    Originally posted by gervaise1

    Also the MSI Razer fan design is very good at getting rid of heat as well.

    Nearly all video cards do a decent job of getting heat off of the GPU chip and away from the card.  It's getting the heat out of the case that I'm more worried about here, as he's not going to get much help from the case.

    .

    Yes; I should have been more specific: I was talking about getting rid of it as in out of the case.

    The link the OP posted shows a picture of the Razer fan design; it pulls the air over a heat pipe / heat-sink arrangement and pushes it out of the case.

     

  • QuizzicalQuizzical Member LegendaryPosts: 22,687
    Originally posted by Hrimnir

    All fair points, but come on now: you're comparing one iteration of a driver which had an issue to an extended, multi-year history of (well-documented) driver issues, as recent as late 2013 with the whole FCAT/micro-stuttering fiasco, where they found out AMD had been half-assing things for several years.  That's outside of the random other minor BS they had prior to that, like booting up and finding your desktop at a different resolution than when you turned it off, or random settings in the driver not saving properly, etc.

    Again, like you said, it's mostly in the past, but that guy playing his fanboi card like Nvidia had just as many issues is really stretching.  Have Nvidia had issues? Yes, but compared to AMD in the past, well, it's not a comparison.

    As far as the energy efficiency thing, I'll believe it when I see it.  AMD has been making promises like that for ages in both the GPU and CPU markets and has yet to deliver on anything of the sort.  One of the main reasons they had to refocus on the entry-to-mid-level CPU market to be profitable is that they couldn't produce CPUs that competed with Intel on the high end without going thermonuclear and requiring an insanely large die, which of course meant more heat, more power draw, etc.

    It used to be that neither GPU vendor took frame timing seriously.  Then a few years ago, in one of the most important tech articles of the last decade, Tech Report demonstrated that what matters for smooth gameplay is not just how many frames you can produce per second, but how those frames are timed.  That's especially the case with CrossFire/SLI rigs, but also true of single GPU systems, though the latter is harder to screw up.

    The initial article showed that both Nvidia and AMD were a mess on it, as neither had really tried to optimize for it up to that point.  In their defense, they were optimizing for the benchmarks that tech sites were running, and tech sites were looking for high frame rate averages.

    Nvidia saw the article and said, hey, this is important.  We should optimize this.  And so over the course of the next year, they did.  And AMD didn't.  Once they had their optimizations in place, they went back and said, hey, you should try your frame timing stuff again, and here's some tools that we'll give you to help you time things more precisely.  That's what the whole FCAT thing was about.

    Sure enough, once Nvidia had optimized for frame timings and AMD hadn't, Nvidia was massively better at it.  At that point, AMD said, we have a problem.  We need to optimize for frame timings, too.  It took AMD about a year or so to catch up, but catch up they did.  And today, the "Nvidia is better at FCAT than AMD" debacle is firmly in the past.
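    As a toy illustration of why frame timing matters beyond average FPS (the numbers here are made up, not from any benchmark): two frame-time traces can average the same FPS while one of them stutters badly.

    ```python
    # Two hypothetical 60-frame traces with identical total render time:
    # one perfectly paced, one alternating fast and slow frames.
    even = [16.7] * 60           # ms per frame, evenly paced
    jitter = [8.4, 25.0] * 30    # same total time, alternating spikes

    def avg_fps(frame_ms):
        """Average frames per second over the whole trace."""
        return 1000 * len(frame_ms) / sum(frame_ms)

    def p99_ms(frame_ms):
        """Approximate 99th-percentile frame time in milliseconds."""
        ordered = sorted(frame_ms)
        return ordered[int(0.99 * (len(ordered) - 1))]

    print(f"{avg_fps(even):.1f} fps avg, {p99_ms(even):.1f} ms p99")
    # → 59.9 fps avg, 16.7 ms p99
    print(f"{avg_fps(jitter):.1f} fps avg, {p99_ms(jitter):.1f} ms p99")
    # → 59.9 fps avg, 25.0 ms p99
    ```

    Both traces report the same average FPS, but the second spends its worst frames at 25 ms, which is exactly the kind of thing frame-time benchmarking exposes and FPS averages hide.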

    Should Nvidia get some credit for taking frame timings seriously before AMD did?  Sure.  Just like AMD should get credit for supporting more monitors better before Nvidia did, reintroducing SSAA for higher image quality, or any of the other cases where one vendor offered something first and the other later caught up.  (Unless that thing is stupid, such as video transcoding or "True Audio".)

    But that hardly means that Nvidia offers better video drivers than AMD today.

    ----

    For energy efficiency, let's separate CPUs from GPUs here.  When buying a video card, GPUs are the thing that matters.  Carrizo is an APU, so it's possible that AMD's claims of improved energy efficiency apply mostly to the CPU side of things, not the GPU.  But the GPU eats up a considerable chunk of the TDP, so there's only so much to gain if you're not improving GPU energy efficiency.

    Since AMD bought ATI, they've been ahead of Nvidia in GPU energy efficiency more often than Nvidia has been ahead.  A substantial fraction of the time, this was due to a process node advantage, such as 55 nm Radeon HD 3000 series versus 65 nm GeForce 9000, the 55 nm Radeon HD 4000 series versus the 65 nm GeForce GTX 260/280, or the 40 nm Radeon HD 4770 versus the 55 nm GeForce GTX 200 series.  And, of course, Fermi was terrible on energy efficiency, even on the same 40 nm process node AMD was using.  So it's hardly inconceivable that AMD will pull ahead on energy efficiency again.  Indeed, whoever gets to 14/16 nm first will probably be well ahead on energy efficiency until the other side gets there--and it's been many years since Nvidia got to a process node first.

    In CPUs, AMD has pretty much always been behind on process nodes, as Intel has the best fabs in the world.  Nvidia, of course, does not have Intel's fabs.  In recent years, AMD has compounded the problem with a Bulldozer architecture and its derivatives that were pretty much a mess in their own right, on top of having to compete with Intel's excellent Sandy Bridge and its derivatives.  But AMD's Bobcat/Jaguar cores did beat quite a few generations of Intel Atom cores in efficiency, at least at the clock speeds and voltages that made sense for a laptop or nettop.  Intel has lately taken to paying vendors a bunch of money to use Atom rather than AMD's Beema or Mullins to mask how far behind Intel is in that market.

    AMD's Zen architecture, due out next year, won't be another Bulldozer derivative, so it will offer a chance for AMD to be competitive again.  Will AMD be competitive again on CPUs?  My guess is, not in desktops, but we'll see.

  • 13lake13lake Member UncommonPosts: 718

    @ Hrimnir

    I have suggested, built, helped build, etc, ... computers for about 10 or so of my friends and about 20 or so acquaintances ( This is all IRL, not counting any people over forums.) over the past 10 years.

    And they and I have had more problems with Nvidia and Intel hardware than with AMD and ATI hardware. That hasn't stopped me from recommending Intel and (to a much lesser degree) Nvidia parts over the years, simply because bias is an imaginary concept for me, regardless of the context or the situation.

     

    Right now it's your statistical insignificance vs. my statistical insignificance; in the wider scale of things, we and our friends represent a sample so small that we cannot judge the worldwide state of the hardware in question based purely on our personal experience and feeling.
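    The small-sample point can be made concrete with a quick binomial sketch. The 5% failure rate and 15 builds per person are made-up numbers for illustration only.

    ```python
    # If both builders sample from the SAME true failure rate, small samples
    # still routinely produce opposite anecdotes.
    from math import comb

    def prob_at_least(k, n, p):
        """P(X >= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    n, p = 15, 0.05                   # 15 builds, hypothetical 5% failure rate
    unlucky = prob_at_least(2, n, p)  # chance one builder sees 2+ failures
    lucky = (1 - p) ** n              # chance another builder sees zero failures
    print(f"2+ failures: {unlucky:.2f}, zero failures: {lucky:.2f}")
    ```

    At the same true failure rate, roughly one builder in six sees two or more failures while nearly half see none, so two people drawing opposite conclusions from their own machines is entirely expected.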

     

    Our personal experience and feeling should be one building block in the whole combination of logic, emotion, experience, intuition, research, fact, hearsay, and so on, not the overwhelming majority behind the final decision/suggestion/action.

    And this is a key difference in the decision-making process among the various people suggesting and recommending on this website, as it is among all 7 billion of us on this planet for all types of decisions.

     

    And that is the key problem here, which I'm trying to point out over and over again :(

     

    It is also a problem that is just a result of Homo sapiens' biodiversity and social upbringing and nothing more; we all change and adapt over the years, for better or worse, and a multitude of things affects every part of all of our lives.

  • QuizzicalQuizzical Member LegendaryPosts: 22,687
    Originally posted by gervaise1
    Originally posted by Quizzical
    Originally posted by gervaise1

    Also the MSI Razer fan design is very good at getting rid of heat as well.

    Nearly all video cards do a decent job of getting heat off of the GPU chip and away from the card.  It's getting the heat out of the case that I'm more worried about here, as he's not going to get much help from the case.

    .

    Yes; I should have been more specific: I was talking about getting rid of it as in out of the case.

    The link the OP posted shows a picture of the Razer fan design; it pulls the air over a heat pipe / heat-sink arrangement and pushes it out of the case.

     

    The card that the original poster bought is an internal exhaust model that will spray hot air off in every which direction.  A little bit may go right out of the case, but not very much.  I linked some external exhaust models earlier in the thread to say this is what you should buy, but he ignored that.

  • HrimnirHrimnir Member RarePosts: 2,415
    Originally posted by 13lake

    @ Hrimnir

    Right now it's your statistical insignificance vs. my statistical insignificance; in the wider scale of things, we and our friends represent a sample so small that we cannot judge the worldwide state of the hardware in question based purely on personal experience and feeling.

    It's a fair point, and I think we're all on the same page there.  I guess I (emotionally) reacted to your statement and interjected the assumption of fanboyism based on what was, admittedly, a cryptic statement akin to things fanbois normally state.

    For that, I apologize.

    As a heads-up, the only reason I posted my experience was just to say, "this is where I'm coming from"; it wasn't intended as some sort of superiority play or anything like that.  I was just saying that, based on my experience, X is Y.  Which is also why I said it leads me to believe it's *not* a statistical anomaly.

    Regardless, I think we've both cleared things up quite well.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • 13lake13lake Member UncommonPosts: 718
    Originally posted by Hrimnir
    Originally posted by 13lake

    @ Hrimnir

    Right now it's your statistical insignificance vs. my statistical insignificance; in the wider scale of things, we and our friends represent a sample so small that we cannot judge the worldwide state of the hardware in question based purely on personal experience and feeling.

    It's a fair point, and I think we're all on the same page there.  I guess I (emotionally) reacted to your statement and interjected the assumption of fanboyism based on what was, admittedly, a cryptic statement akin to things fanbois normally state.

    For that, I apologize.

    As a heads-up, the only reason I posted my experience was just to say, "this is where I'm coming from"; it wasn't intended as some sort of superiority play or anything like that.  I was just saying that, based on my experience, X is Y.  Which is also why I said it leads me to believe it's *not* a statistical anomaly.

    Regardless, I think we've both cleared things up quite well.

    Apology accepted :),  i tend to be overzealous at times, and then just give up the next dozen or so times,  guess it just piles up. And i also share the opinion that "this is where im coming from" is 99% of the time the best way to start.

    And i sure like to throw around the words imaginative and bias at the same time, but i find myself in situations where i'm definitely biased sometimes without even noticing it :)

    I apologize also for assuming things, nvidia and intel fanbois, have made a grumpy old man out of me at a way too young age.

    Honestly, there are just so many more nvidia and intel fanbois (and bad information and assumptions) than there are amd fanbois.

    They just seem like they're everywhere at all times xD, my paid shill paranoia has also been getting worse lately :)

  • HrimnirHrimnir Member RarePosts: 2,415
    Originally posted by 13lake

    Apology accepted :),  i tend to be overzealous at times, and then just give up the next dozen or so times,  guess it just piles up. And i also share the opinion that "this is where im coming from" is 99% of the time the best way to start.

    And i sure like to throw around the words imaginative and bias at the same time, but i find myself in situations where i'm definitely biased sometimes without even noticing it :)

    I apologize also for assuming things, nvidia and intel fanbois, have made a grumpy old man out of me at a way too young age.

    Honestly, there are just so many more nvidia and intel fanbois (and bad information and assumptions) than there are amd fanbois.

    They just seem like they're everywhere at all times xD, my paid shill paranoia has also been getting worse lately :)

    It pains me to admit it, because i *do* believe nvidia and intel have the superior products, but there are a lot of dumbasses out there who like to spew things, and intel / nvidia does seem to have a much greater frequency of dipshittery.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • zasdzasd Member UncommonPosts: 45
    Originally posted by 13lake
    Originally posted by zasd

    I rarely post but I have been building computers for almost 20 years fwiw.

    The PSU

    • Wattage vs 12v rail AMPS: many people look at the wattage of the PSU only and think the bigger the number, the better the PSU.  Your current PSU is just under 30 amps on the rails; I try to stay above 30... but it is passable.
    How did u come to the conclusion that it has just under 30 amps on the (i assume you mean 12v) rails ?
     
     
    pls elaborate ?

    http://hecgroupusa.com/home/hec550tb/ Rail 1 has 28 amps, rail 2 has 20 amps... whether there is this "fake" rail or not I don't know, but last time I checked 28 amps is just under 30 amps...

  • 13lake13lake Member UncommonPosts: 718
    Originally posted by zasd
    Originally posted by 13lake
    Originally posted by zasd

    I rarely post but I have been building computers for almost 20 years fwiw.

    The PSU

    • Wattage vs 12v rail AMPS: many people look at the wattage of the PSU only and think the bigger the number, the better the PSU.  Your current PSU is just under 30 amps on the rails; I try to stay above 30... but it is passable.
    How did u come to the conclusion that it has just under 30 amps on the (i assume you mean 12v) rails ?
     
     
    pls elaborate ?

    http://hecgroupusa.com/home/hec550tb/ Rail 1 has 28 amps, rail 2 has 20 amps... whether there is this "fake" rail or not I don't know, but last time I checked 28 amps is just under 30 amps...

    You're looking at the wrong thing. Specifically, the wires for the 6-pin connectors can be connected in a weird way to both rails, not necessarily one to 12v1 and the second to 12v2.

    That's why you need to look at the combined wattage. Most of the time i see power supplies with, let's say, 4 rails (12v1, 12v2, 12v3, 12v4), where it says each one of them can handle 20 amps, but then it lists a combined wattage for all of them of 480 watts, which is 40 amps.

    So for those rails, 40 amps between them is the maximum; they wouldn't be able to handle more even though the individual numbers theoretically say they can. In reality they have 40 amps between them and that's it.

     

    Look again at the picture of the sticker on the side of the psu. Under the amperage of the rails it lists the combined wattage; that's the maximum amount of watts all the 12v rails together can take. Amps are related to watts via volts, and you cannot change the laws of physics.

    If u divide the watts by the voltage you get the amperage; it's the ABC of electrical engineering.

     

    480 watts divided by 12 volts = 40 amps, therefore 40 amps is what both rails together are rated for, and the first rail can do 28 amps only if the second does 12 amps or less; if the second rail is pulling more than 12 amps, the first one can only do 27, for instance.

     

    12v1     12v2

    28A       12A

    27A       13A

    26A       14A

    ....          ....

    20A       20A

     

    The combined amperage of the 12v1 and 12v2 rails must never exceed 40A, regardless of what each individual rail is rated for.
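    The watts-to-amps arithmetic above can be sketched as a quick check. This is just an illustrative snippet (the function names are made up for this example; the 480W combined limit and 28A/20A rail ratings come from the HEC 550TB sticker discussed earlier):

    ```python
    def amps_from_watts(watts, volts=12.0):
        # W = V * A, so A = W / V -- the "ABC of electrical engineering" from the post above
        return watts / volts

    def within_combined_limit(rail_draws_amps, combined_watts, volts=12.0):
        # Individual rail ratings don't matter if the SUM of the draws
        # exceeds the combined limit printed on the sticker.
        return sum(rail_draws_amps) <= amps_from_watts(combined_watts, volts)

    print(amps_from_watts(480))                  # 40.0 -- amps shared between 12v1 and 12v2
    print(within_combined_limit([28, 12], 480))  # True  -- 28A + 12A = 40A, exactly at the limit
    print(within_combined_limit([28, 20], 480))  # False -- both rails at their individual ratings exceeds 40A
    ```

    Running both rails at their printed maximums fails the check, which is exactly the trap the sticker's per-rail numbers set for you.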

  • RidelynnRidelynn Member EpicPosts: 7,147

    That's why I like single rail power supplies - much easier to see what you can actually do with it.

  • 13lake13lake Member UncommonPosts: 718

    A huge number of "multi rail" power supplies are not actually multi rail at all; they don't have distinct rails, just hot-wired weirdness inside. It's usually just a single rail split 2 or 3 ways, or sometimes even 4.

    This makes things exponentially more confusing and troublesome, as u literally have no clue what the heck is going on in that abomination inside :)

     

    The only company i could "trust" to actually put independent rail setups in their PSUs is Enermax.

     

    And i just want to clarify that regardless of whether the rails are truly independent or not, u still have to consider the Combined Wattage/Amperage at ALL times.

     

    Here's the link on the official enermax website for a psu of theirs : 

    http://www.enermax.com/home.php?fn=eng/product_a1_1_2&lv0=1&lv1=58&no=188

    You might notice that unlike other PSU manufacturers who like to twist the truth, forget to write it, or just intentionally omit it, enermax spells out every single little thing for you, even listing individual rail amperage not as a static value but as a dynamic 0-25A per rail, ...

     

    Also notice how next to the combined wattage they have written the combined amperage: 744 divided by 12 is 62.

  • TrionicusTrionicus Member UncommonPosts: 498

    I've been super impressed with the performance and quality of AMD 7xxx series with both XFX and Sapphire AND Nvidia's 6xx & 7xx series from EVGA & Gigabyte models in the past  year.

    130+ various gpu's bought from newegg / amazon from the above vendors. No DOA's or funny heating issues from ANY vendor. The only thing I factor now is price / performance, hunting for good MIR's and extras (free games etc...). In the past I've had issues with Nvidia that kept me away from them but since the GTX 660 / TI's I've been buying them almost equally to AMD.

    The whole coin mining fad made things screwy but since prices have normalized I'd recommend going straight for price points. Assuming adequate case cooling.

     

  • 13lake13lake Member UncommonPosts: 718

    Y, prices have gotten back to almost how they were before the craze, and u don't need to be specifically looking for the 7000 series; they are the same cards as the 280 and 280x :)

    280x would be the best bin of the 7970 for instance, so a much safer bet.
