GTX 970 4GB or AMD 390 8GB

jpnolejpnole Member UncommonPosts: 1,698

OK, the time to upgrade my GPU keeps drawing nearer. Which is the better value? Which one would you get in the price range of these two cards?

 

Edit: Gaming at 1080p on a 22" monitor. I'd consider moving to 1440p eventually, though probably not until a year or so after I do my build.

Comments

  • QuizzicalQuizzical Member LegendaryPosts: 25,355

    If power consumption is a huge deal to you, then the GTX 970.  If you think you'll need more than 4 GB in the useful life of the card (which I'm skeptical of), then the R9 390.  If you're planning on getting a new monitor with adaptive sync in the useful life of the card, then probably the R9 390, unless Nvidia decides to support it.  (G-sync does the same thing, but adds about $150 to the price tag of a monitor over adaptive sync.)

    Otherwise, let's wait and see some reviews first.  On performance alone, likely the GTX 970, but I'd want to see some reviews to see just what AMD got out of a respin.

     

  • ChaserzChaserz Member UncommonPosts: 317

Hands down, the 970.  The 300 series is just another rebrand of the power-hungry 200 series with minor tweaks, overclocked about as far as it will go.  They're heat machines with no headroom left for further overclocking.  Maxwell beats all of them very efficiently.

Tom's Hardware: "So, is this just another rebrand? Sadly, that's pretty much the bottom line. There's no real innovation to speak of in AMD's 300 series, at least as far as these models go. Let's hope for the company's sake that Fiji doesn't turn into Bermuda."

If you are looking towards 4K, neither of these cards is in that arena.  You'll pay $600+ for something that can handle that comfortably.

     

     

  • ShodanasShodanas Member RarePosts: 1,933

You ask us about two graphics cards yet you leave out the most important piece of information: the size of your monitor and the resolution you play at.

Anyway, the 970 runs quieter and far cooler, consumes much less power, and overclocks quite high without raising its temperatures. Performance-wise, both score about the same at 1080p. At higher resolutions the AMD card pulls ahead.

  • QuizzicalQuizzical Member LegendaryPosts: 25,355

    Temperature and noise depend on other things, especially the cooling system, and not just the GPU chip itself.  Now, it is easier to make a card cooler and quieter with a GPU chip that doesn't put out very much heat.

    But "it's easier" isn't the same as "it's always done".  The GeForce FX 5800 was notoriously loud with a TDP no higher than 75 W (and possibly a lot less than that; I can't find a reliable source).  The original Pentium was notoriously hot with a TDP of 5.5 W.

  • KabaalKabaal Member UncommonPosts: 3,042
Another alternative, if you don't need the 8GB of memory, is to buy a 290X and save money. The new modded drivers have almost closed the slight performance gap between the 390X and the 290X. Chances are AMD will hold back the driver version that works on the 290X until after the 390X launch, but the modded version works just fine.
  • booniedog96booniedog96 Member UncommonPosts: 289
If you are a control freak, go with AMD; if you want autopilot, go Nvidia.  If you are playing at 1080p, buy a used GTX 680 or HD 7970, since that's the optimal value at 1920x1080 if you can haggle them down to $100 US.
  • jpnolejpnole Member UncommonPosts: 1,698
Hey, thanks for the replies as usual. I am currently at 1080p on a 22" monitor and looking to do a Skylake build in the fall. I was thinking maybe I'd act a little sooner on the GPU and let it choke in my 4.5-year-old PCI 2.0 slot for a few months.
  • IAmMMOIAmMMO Member UncommonPosts: 1,462
What's with all these PC gamers using HDMI over DVI? Are you all using IPS office monitors instead of gaming monitors? No PC gamer will use a 60 Hz monitor in 2015; they're on 120 Hz, 144 Hz, and now sync-free monitors connected using DVI-D. There's no point getting a top-of-the-range graphics card just to limit it to 30 Hz or 60 Hz. And why would any PC gamer want speakers on their monitor and use HDMI to bring lackluster audio to it?  No HDMI setup will beat a PC with a dedicated sound card connected directly to a top speaker system.  Is HDMI 1080p the new budget gaming? It's certainly not the choice of PC gamers building proper systems to compete online.
  • KabaalKabaal Member UncommonPosts: 3,042
    Originally posted by IAmMMO
What's with all these PC gamers using HDMI over DVI? Are you all using IPS office monitors instead of gaming monitors? No PC gamer will use a 60 Hz monitor in 2015; they're on 120 Hz, 144 Hz, and now sync-free monitors connected using DVI-D. There's no point getting a top-of-the-range graphics card just to limit it to 30 Hz or 60 Hz. And why would any PC gamer want speakers on their monitor and use HDMI to bring lackluster audio to it?  No HDMI setup will beat a PC with a dedicated sound card connected directly to a top speaker system.  Is HDMI 1080p the new budget gaming? It's certainly not the choice of PC gamers building proper systems to compete online.

Nonsense. I'd be willing to put money on almost all gamers still using 60 Hz monitors; the enthusiasts using 144 Hz etc. are a tiny minority. Cutting edge is not the norm.

I use HDMI to the TV when playing games like FIFA 2015, The Witcher, etc. I must not be a proper gamer, going by your thinking.

  • booniedog96booniedog96 Member UncommonPosts: 289
    Originally posted by jpnole
    Hey thanks for the replies as usual. I am currently at 1080p 22" and looking to do a Skylake build in the fall. I was thinking maybe I'd act a little sooner on the GPU and let it choke in my 4.5 year old PCI 2.0 slot for a few months.

I would look into upgrading to 1440p with those cards you're interested in; otherwise it's like putting a Hemi in a Toyota Corolla with a speed limiter set at 60 mph.

     

    PCI? or PCI-e 2.0?

     

If it's PCIe x16 2.0, it's still very viable tech; I don't think even slotting a GTX 980 Ti will be able to saturate the bandwidth of PCIe x16 2.0.  If you were slotting an Intel 750 Series SSD, then you'd want 3.0.  By the fall, though, a lot of these prices will drop, and the GPU wars have just gotten started with AMD's new lineup.  Some tech sites have predicted HDD prices will drop to 10 cents per GB.
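As a rough sanity check on the bandwidth point (a hypothetical back-of-the-envelope snippet, not from anyone in the thread), per-direction PCIe bandwidth can be estimated from lane count, transfer rate, and line-encoding overhead:

```python
# Per-direction PCIe bandwidth estimate, in GB/s.
# Each lane carries gt_per_s gigatransfers/sec of 1 bit each,
# reduced by the line-encoding efficiency, then bits -> bytes.
def pcie_bandwidth_gbs(lanes, gt_per_s, enc_efficiency):
    return lanes * gt_per_s * enc_efficiency / 8

# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding (80% efficient)
gen2_x16 = pcie_bandwidth_gbs(16, 5.0, 8 / 10)     # 8.0 GB/s
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding (~98.5% efficient)
gen3_x16 = pcie_bandwidth_gbs(16, 8.0, 128 / 130)  # ~15.75 GB/s
```

So a Gen 2 x16 slot still offers 8 GB/s each way, which is why a single GPU of this era rarely bottlenecks on it; Gen 3 roughly doubles that.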

     

Whatever you choose for your GPU, figure out what resolution you want to play at, then go from there and buy the best GPU you can afford if you are going to use the PC mostly for gaming.  If you are going to use the PC for professional work, then you'll have to balance your budget between the CPU and GPU.  For gaming, get a good CPU and splurge on the GPU; don't hold out on the GPU, but don't skimp on the other components either.  Keep an eye on the Fury X, especially if you're going to jump on the VR bandwagon.

     

     

  • booniedog96booniedog96 Member UncommonPosts: 289
    Originally posted by Kabaal
    Originally posted by IAmMMO
What's with all these PC gamers using HDMI over DVI? Are you all using IPS office monitors instead of gaming monitors? No PC gamer will use a 60 Hz monitor in 2015; they're on 120 Hz, 144 Hz, and now sync-free monitors connected using DVI-D. There's no point getting a top-of-the-range graphics card just to limit it to 30 Hz or 60 Hz. And why would any PC gamer want speakers on their monitor and use HDMI to bring lackluster audio to it?  No HDMI setup will beat a PC with a dedicated sound card connected directly to a top speaker system.  Is HDMI 1080p the new budget gaming? It's certainly not the choice of PC gamers building proper systems to compete online.

Nonsense. I'd be willing to put money on almost all gamers still using 60 Hz monitors; the enthusiasts using 144 Hz etc. are a tiny minority. Cutting edge is not the norm.

I use HDMI to the TV when playing games like FIFA 2015, The Witcher, etc. I must not be a proper gamer, going by your thinking.

    Breh, it's got bigger numbers.  Everyone knows that when the numbers go up (insert sarcasm) it's soooooo much better,  just like my Alienware. My Alienware has so much Gee Bees - like, a lot.  I had the Alienware X but then I got the XX and gave it to my younger brother because I got Alienware X Black xtreme Gee Bee edition w/ additional molecules. Not with HDMI, not with DVI, but with DisplayPort 96.  Because MOBAs and CS:GO.

  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    Originally posted by IAmMMO
    sync free monitors connected using DVI-D.

    I don't care to correct all of the problems with your post, but the adaptive sync standard requires use of DisplayPort.  DVI is a legacy port on its way out.
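For what it's worth, the refresh-rate side of that debate comes down to pixel clock. A hypothetical sketch (ignoring blanking intervals, which add several percent on top of the active-pixel rate) shows why 1080p at 144 Hz needs dual-link DVI or DisplayPort, while 60 Hz fits single-link:

```python
# Approximate pixel clock (MHz) needed for a mode: active pixels only,
# ignoring the horizontal/vertical blanking a real timing standard adds.
def pixel_clock_mhz(width, height, refresh_hz):
    return width * height * refresh_hz / 1e6

SINGLE_LINK_DVI_MHZ = 165.0          # single-link TMDS pixel clock limit
DUAL_LINK_DVI_MHZ = 2 * 165.0        # dual-link doubles the TMDS pairs

clk_60 = pixel_clock_mhz(1920, 1080, 60)    # ~124 MHz: fits single-link
clk_144 = pixel_clock_mhz(1920, 1080, 144)  # ~299 MHz: needs dual-link
```

That's why the 144 Hz panels of this era shipped with dual-link DVI or DisplayPort inputs, and why single-link connections top out around 1080p60.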

  • RidelynnRidelynn Member EpicPosts: 7,383


    Originally posted by IAmMMO
    Is HDMI 1080P the new budget gamers? As it's certainly not the choice Pc gamers building proper systems to compete online with.

HDMI 1080p has been the budget gamer's standard for many years now. I could make up some statistics, but I won't -- it has been a very popular interface for whatever reason. Probably because the first wave of "affordable" LCD monitors back in the early/mid-2000s were mostly rebranded televisions, and it just carried over after that.

  • AlumicardAlumicard Member UncommonPosts: 388

If you're building it in the fall, then I'd wait. The AMD Fury (X) is about to release, and from what I read the Fury X2 will release in the fall. Maybe prices will drop with its release, so my suggestion is to wait a bit.

     

Also, the 290X and 390X perform nearly the same, so that could be something to consider.

  • HyperpsycrowHyperpsycrow Member RarePosts: 915
I always liked AMD, but lately I changed to Nvidia because the card basically updates the driver itself and fine-tunes your game graphics and experience through their driver program. With AMD I had to search the jungle for new drivers, and even their auto-driver app failed sometimes. I had so many problems with AMD that I had to spend a lot of time on their forums asking people. Now I just love Nvidia for not giving me any problems. I have the GTX 970; best money ever spent.



