R9 390 vs GTX 970 in 2016.

Comments

  • 13lake13lake Member UncommonPosts: 719
    edited December 2016
    Torval said:
    Malabooga said:
    And I don't know any that use those cards, but then, I'm in real engineering, you know: structural, mechanical, geotechnical, geological, electrotechnical...
    So all engineering programs run on the GPU? Interesting, I didn't know that. lol, what's up with the "real" engineer thing? Feeling a little insecure or something? Don't worry, engineers are important too.

    What kind of real engineers would buy some scrub retail card like the 390 to do work on? I would think that something as important as engineering would require enterprise-level hardware.
    He means the original gangsta, aka the engineers not related to software development: the ones using AutoCAD, various math programs, weather prediction, seismic software, oil spill tracking, ice tracking, animal tracking, etc. ... the programs that use double precision, which Nvidia and AMD have gimped incredibly since the GeForce 600 and AMD 7900 series. Programs which need many CPU cores, raw CPU power, and GPUs or professional graphics cards exclusively.

    He probably means engineers not related to IT hardware development as well.

    As professional graphics cards are very expensive, and original engineers have different management and economic models to work around in their companies compared to software engineers (developers), the only alternative to professional multi-thousand-dollar graphics cards is the AMD 300 series and below.

    Why? The AMD 300 series is a rebadged 200 series, which still has OK double precision and raw power (it had more of both compared to then-current-gen Nvidia especially, and this is important because OG engineering companies won't change PCs very often).

    As Quizzical said, the best bang-for-your-buck double-precision GPU is the 7970; second best is the 290/390. If you need multiple machines running OG engineering software, you get a 7970, a 290/390, or a professional Quadro/Tesla. You don't get a silly 970/980/Ti or 1070/1080/Ti/Titan, period.

    Different people in different fields with different needs and different economics at play. 

    It is fiscally irresponsible, even company-ruining, to rule out AMD cards for basic individual employee workstations in original gangsta engineering companies.
    Yes, they will probably have a single supercomputer, and a few extra professional-GPU- and Xeon-equipped workstations for employees to share.

    But employees need their individual workstations to test on before wasting time waiting in line for the big computers. These companies usually have more people, and depending on whether they can bring their own computers to work or not there are even more factors in play, so it's pretty different from your usual graphic design studio or game design studio.
  • MukeMuke Member RarePosts: 2,614
    Demrocks said:
    A 1070 costs 500 euros here; I'm not even going to mention a 1080...

    AMD cards always age so much better than Nvidia's, and gamers do get more performance over the lifespan of AMD GPUs.

    I'm waiting for Vega and Zen, as I want AMD to survive; they really do amazing work with their limited budget.

    Nvidia is pooping out so many GPUs and is currently dominating the charts with its monster GPUs, but it also asks premium prices and its GPUs age really, really badly, not to mention their performance, or lack thereof, in Vulkan/DX12.

    2017 is hopefully going to be a fantastic year in both CPU and GPU hardware :)
    If their GPUs are so bad, why have they been dominating AMD for years? Because of marketing?
    Or because they poop out so many good GPUs?
    And yes, they are overpriced, but why is that? No real competition?

    With Zen you just have to wait and see if it is any good, rather than watching some biased AMD execs showing off their CPU.
    I'll wait till the reviews are in (the unbiased ones) before ordering.


    "going into arguments with idiots is a lost cause, it requires you to stoop down to their level and you can't win"

  • 13lake13lake Member UncommonPosts: 719
    edited December 2016
    For instance, I know a small company doing all kinds of outsourcing for bigger developers (game, non-game, non-IT as well), and they run purely AMD-based GPU rendering machines. Not a single Nvidia card.

    They do have a regular employee whose only job is to set up the cards to work in V-Ray, Redshift, Mental Ray, Arnold, etc. ... because anyone who has ever worked with any of these programs (or, God forbid, tried to use the GPU for After Effects, for example) knows that while it's a slight hassle to get some of them to work on Nvidia (OptiX ray-tracing problems, for instance, and similar), it is an absolute nightmare to make AMD cards work in them.

    So this guy spends an enormous amount of time setting up the Adobe and Autodesk families of programs to work with the rendering programs, but it pays off.

    Why? Well, they have spent, and are spending, less money on AMD plus a full-time AMD-setup guy than they would have spent on an Nvidia setup, because they're a small company and don't get discounts on parts/computers.
  • RidelynnRidelynn Member EpicPosts: 7,383
    Malabooga said:
    There are other people participating too, and from what I've seen the vast majority don't understand. AAA games (games that actually require something more than an APU) are not selling all that well on PC. Even Nvidia themselves published statistics showing that 80% of Nvidia users are running hardware below PS4 level. Pushing price hikes and badly optimized games ain't gonna help that lol
    http://www.digitaltrends.com/gaming/pc-market-grew-in-2016-led-by-mobile-and-pc-gaming/

    For as poorly as AAA games are supposedly selling on the PC, the PC only beat out the console market by a factor of 5. An MMO even gets mentioned as a high revenue generator for the industry (Guild Wars 2). You can say AAA games are selling poorly, but take the top 3 titles, LoL, DFO, and Crossfire: those alone are more than half of the entire console market. I'd say those games have a development budget and staff that allow them to be called AAA by pretty much every measure of the term.
  • gervaise1gervaise1 Member EpicPosts: 6,919
    Ridelynn said:
    Malabooga said:
    There are other people participating too, and from what I've seen the vast majority don't understand. AAA games (games that actually require something more than an APU) are not selling all that well on PC. Even Nvidia themselves published statistics showing that 80% of Nvidia users are running hardware below PS4 level. Pushing price hikes and badly optimized games ain't gonna help that lol
    http://www.digitaltrends.com/gaming/pc-market-grew-in-2016-led-by-mobile-and-pc-gaming/

    For as poorly as AAA games are supposedly selling on the PC, the PC only beat out the console market by a factor of 5. An MMO even gets mentioned as a high revenue generator for the industry (Guild Wars 2). You can say AAA games are selling poorly, but take the top 3 titles, LoL, DFO, and Crossfire: those alone are more than half of the entire console market. I'd say those games have a development budget and staff that allow them to be called AAA by pretty much every measure of the term.
    Warning Will Robinson - Superdata guesstimate incoming.

    (Apologies Ridelynn but it has to be said.)
  • RidelynnRidelynn Member EpicPosts: 7,383
    Superdata could be off by as much as a factor of 2 on their estimates, in the least-favorable direction in each case, and PC still handily beats out Console sales by a significant margin. 

    So sure, it may be a guesstimate, but it doesn't have to be exactly accurate for it to tell which way the wind is blowing either.
  • RenoakuRenoaku Member EpicPosts: 3,157
    edited December 2016
    Buy the Nvidia, hands down. Radeon always has bad drivers and many bugs, not to mention shader issues in OpenGL (used in some older games) that they refuse to fix lol...

    Nvidia, on the other hand, works purrrfectly.

    Oh, and whatever you do, do not buy Sapphire cards. I purchased one of these things years ago and it failed after like 1-2 years max, with no warranty support, yet I have graphics cards from 2004 by Nvidia (still working, funnily enough...)

    I would love AMD to explain this to me lol, or Sapphire for that matter...

    If you do go with AMD, at least go with XFX or Asus, something with a lifetime warranty; even BFG back in the day was great...

    On the other hand, I am happy with my 750 Ti, and this...
    http://imgur.com/a/Mr8u8
  • MalaboogaMalabooga Member UncommonPosts: 2,977
    edited December 2016
    Renoaku said:
    Buy the Nvidia, hands down. Radeon always has bad drivers and many bugs, not to mention shader issues in OpenGL (used in some older games) that they refuse to fix lol...

    Nvidia, on the other hand, works purrrfectly.

    Oh, and whatever you do, do not buy Sapphire cards. I purchased one of these things years ago and it failed after like 1-2 years max, with no warranty support, yet I have graphics cards from 2004 by Nvidia (still working, funnily enough...)

    I would love AMD to explain this to me lol, or Sapphire for that matter...

    If you do go with AMD, at least go with XFX or Asus, something with a lifetime warranty; even BFG back in the day was great...

    On the other hand, I am happy with my 750 Ti, and this...
    http://imgur.com/a/Mr8u8
    Nvidia has DX12/Vulkan issues that they refuse to fix... no wait, they just straight-up lied about their GPUs' DX12/Vulkan capabilities, so they lose performance in every DX12/Vulkan game.

    Sapphire is the best AMD board partner; your anecdotal evidence is hilarious.

    Hey, look at the "best" Nvidia cards:

    http://wccftech.com/nvidia-gtx-1080-evga-catches-fire-video/

    Good for New Year's Eve fireworks. Now we know what "FE" stands for: "Fireworks Edition" lol

  • RidelynnRidelynn Member EpicPosts: 7,383
    Malabooga said:
    Ridelynn said:
    Superdata could be off by as much as a factor of 2 on their estimates, in the least-favorable direction in each case, and PC still handily beats out Console sales by a significant margin. 

    So sure, it may be a guesstimate, but it doesn't have to be exactly accurate for it to tell which way the wind is blowing either.
    And Ridelynn strikes again, not having a clue what he links: that's just DIGITAL console revenue, which is a minor part.

    And this proves that.... AAA games are doing poorly? Seems like that graph is also saying PC gaming is doing just fine. 
  • 13lake13lake Member UncommonPosts: 719
    edited December 2016
    Torval said:
    So this corner case for buying gimped retail hardware is based on a fringe scenario of some engineering companies that use cheap parts and cut corners to save money.

    That is a frightening but unsurprising summary, and it explains why modern precision engineering has gone to shit.
    Take a moderate-sized company with, for instance, 2,000-3,000 employees who need their own company-provided workstations. At $3,000 per professional GPU, that alone makes it $6-9 million just for the GPUs in those workstations :)

    The number of original engineering companies who can afford to throw money around like this can be counted on the fingers of one hand (OK, Germany + USA + Britain might have slightly more, but you get the point).

    And look outside Germany/USA/Britain, and maybe, just maybe, France and Israel, and you won't find even one :)

    Also, what's the biggest market for GPUs in the world? China.
    Which country has the highest number of ongoing projects that involve old-school engineering? China.
    (And that's just by nature of having built monstrous amounts of infrastructure in the past few years, infrastructure that all the countries who could afford to do the same have already had in place for a hundred years now xD)

  • 13lake13lake Member UncommonPosts: 719
    edited December 2016
    Amec Foster Wheeler, for instance, has an operating income of £334 million for 40,000 employees (and no, I'm not doing an assessment of how many of those 40k actually need a workstation ... just rough figures to show the general economics at play).

    Scale that down and imagine a similar company in China, India, Russia, heck, even Korea. The per-employee cost of individual (or multiple) workstations, on top of the shared Xeon workstations, skyrockets.

    And we aren't even counting all the other expenses :)
  • GladDogGladDog Member RarePosts: 1,097
    edited December 2016
    filmoret said:
    If you wanna surf the web then get AMD.  If you want to play a video game then buy Nvidia.
    My AMD RX-480 playing BDO and ESO at max graphics begs to differ.

    Also the AMD APU in my XBOX ONE S would like a word with you...


    The world is going to the dogs, which is just how I planned it!


  • The user and all related content has been deleted.

    A turtle does not move when it sticks its neck out.
  • 13lake13lake Member UncommonPosts: 719
    If you want to surf, get Intel or AMD APUs, not discrete cards...
  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    Quizzical said:
    Malabooga said:
    DMKano said:
    Malabooga said:
    If you don't care, why post at all lol
    Oh, I do care; doesn't it just seem strange to not test a 1070?
    And why test the 1070? The 1070 is a $400 card; it would only prove how bad it is compared to the 970/390 in performance/price (especially the 390) lol
    It depends on what you do with your card. Is it just for gaming? Sure, pick the 390 (8 GB of course).
    If you do other things besides gaming (and that too), the 1000 series or 400 series are great performance-per-watt cards.
    Passmark is better.
    Lower TDP
    Higher texture rate
    Higher memory clock speed
    The floating-point performance is better (which, as a developer, is important to me)
    Much better DirectCompute power
    Higher pixel rate
     
    Who cares about Passmark?  It's a synthetic benchmark of who knows what that doesn't even measure an illuminating corner case.

    Memory clock speed isn't relevant in isolation.  Memory bandwidth is the salient consideration, and there, the Fury cards are still king because of HBM, and the R9 390 has far more bandwidth than even a GTX 1080 because of the wider memory bus.
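
    For reference, peak memory bandwidth is just the bus width times the effective memory clock divided by 8; plugging in the published specs (512-bit at 6 Gbps effective for the R9 390, 256-bit at 10 Gbps for the GTX 1080) as a quick sanity check:

        \text{BW}_{\text{R9 390}} = \frac{512 \times 6\ \text{Gbps}}{8} = 384\ \text{GB/s}, \qquad \text{BW}_{\text{GTX 1080}} = \frac{256 \times 10\ \text{Gbps}}{8} = 320\ \text{GB/s}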

    If you're comparing the top of the line from a previous generation to the top of the line today, then of course the latter is going to be better at most things.  But that's not the comparison most people make.  Usually it's the best $300 card from a previous generation to the best $300 card today, or $200 or $100 or whatever.

    And if that's what you want, the comparisons get more complicated.  The Fury cards still blow away all else on the market at local memory bandwidth and capacity, for example.  If you want double precision compute in a sub-$1000 card, the venerable Radeon HD 7970 is still the best there is, as it was the last card where the consumer version didn't cripple the double precision compute.
      We care about Passmark.
      The comparison with the 1070 came up, and it was said to be no better than the 390. I disagree, as do the numbers that back that disagreement up.

    ;)

    Sorry. As an owner of a 1080 who came from a 380, my workstation power shot up dramatically.
    No one is going to convince me otherwise.
    Well yes, a GeForce GTX 1080 is a much faster card at most things than a Radeon R9 380, which I think is the comparison you're making.  That's a big enough difference that to find something where the 380 wins outright, you'd have to go fishing for bugs or weird errata; I'd expect more typical performance to be the GTX 1080 getting about triple the performance of the R9 380.

    And it's perfectly reasonable to care about performance in the programs you care about.  But why Passmark in particular?  Why not cite games or whatever programs you use on your workstation?
  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    Torval said:
    Outside of folding, bitcoin mining, and maybe trying to find prime numbers, what sort of workstation graphics processing are we talking about that needs double precision?
    For double precision to matter, you usually have to leave graphics behind.  It's also completely irrelevant to bitcoin mining, which is all about integer instructions, not floating point of any precision.  I'm skeptical that double precision would be of use to find prime numbers, either, though I don't know that for certain.

    If you've done CPU programming, it's basically a question of when do you absolutely have to use a double because a float just won't be good enough.  It's the same question on GPUs, except that on GPUs you perhaps think more about it because it's a big performance difference and your kernel code isn't very long.
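
    To make that concrete, here's a minimal sketch (using numpy purely for its fixed-width float types) of the point where a float just won't be good enough but a double still is:

        import numpy as np

        # float32 has a 24-bit mantissa, so at 2**24 whole numbers start to blur:
        a = np.float32(16777216.0)          # 2**24
        print(a + np.float32(1.0) == a)     # True: the +1 is rounded away entirely

        # float64 has a 53-bit mantissa, so the same sum is still exact:
        b = np.float64(16777216.0)
        print(b + np.float64(1.0) == b)     # False: float64 keeps the +1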
  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    Malabooga said:
    Torval said:
    Outside of folding, bitcoin mining, and maybe trying to find prime numbers, what sort of workstation graphics processing are we talking about that needs double precision?
    Every single engineering application. Engineers don't deal in "approximate" or "good enough" values lol

    I'm not sure if that was sarcasm; approximate and good enough are pretty much all that engineers deal with.  Quantum mechanics will spoil your day if you expected to have exact values of anything in the real world.

    If you need exact values in computations, you usually can't use floating point anything at all, of any precision.  At minimum, you would have to be extremely careful about overflowing the mantissa and getting rounding errors.
  • 13lake13lake Member UncommonPosts: 719
    Malabooga said:
    And I don't know any that use those cards, but then, I'm in real engineering, you know: structural, mechanical, geotechnical, geological, electrotechnical...
    I'm in game development, as people here already know. I work with software engineers.
    Why do I get the feeling you work for Riot Games... scratch that, I would bet money you work for Riot Games...

    When I remember how many players I've unintentionally and indirectly brought you and the HoN guys...

    I wish I knew what I was doing before I siphoned all those players from DotA 1 (even though it's not that much compared to the whole world...)
  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    Torval said:
    Malabooga said:
    And I don't know any that use those cards, but then, I'm in real engineering, you know: structural, mechanical, geotechnical, geological, electrotechnical...
    So all engineering programs run on the GPU? Interesting, I didn't know that. lol, what's up with the "real" engineer thing? Feeling a little insecure or something? Don't worry, engineers are important too.

    What kind of real engineers would buy some scrub retail card like the 390 to do work on? I would think that something as important as engineering would require enterprise-level hardware.
    It all depends on what you're going to do with the computer.  A lot of people--surely including some engineers--don't do anything on their computer for which a top-of-the-line video card offers any benefits over integrated graphics.
  • MalaboogaMalabooga Member UncommonPosts: 2,977
    edited December 2016
    Quizzical said:
    Malabooga said:
    Torval said:
    Outside of folding, bitcoin mining, and maybe trying to find prime numbers, what sort of workstation graphics processing are we talking about that needs double precision?
    Every single engineering application. Engineers don't deal in "approximate" or "good enough" values lol

    I'm not sure if that was sarcasm; approximate and good enough are pretty much all that engineers deal with.  Quantum mechanics will spoil your day if you expected to have exact values of anything in the real world.

    If you need exact values in computations, you usually can't use floating point anything at all, of any precision.  At minimum, you would have to be extremely careful about overflowing the mantissa and getting rounding errors.
    You are looking at this from a layman's/theoretical point of view, but it doesn't work that way.

    FEA is an approximate (numerical) method. But when it comes down to the actual calculation, it demands high precision (as an example). You can have millions of nodes, each one depending on the previous node's value; even the smallest error at each node results in big mistakes at later nodes (and the error grows with each new node).
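
    (A toy illustration of that snowballing, nothing like real FEA, just a chained update where each value feeds the next:)

        import numpy as np

        # Each step depends on the previous value, a million times over.
        # float32 can't even represent the 1.000001 multiplier exactly, and
        # that tiny representation error compounds at every "node".
        x32 = np.float32(1.0)
        x64 = np.float64(1.0)
        for _ in range(1_000_000):
            x32 = x32 * np.float32(1.000001)
            x64 = x64 * np.float64(1.000001)
        print(x32)   # drifts visibly from the double-precision result
        print(x64)   # ~2.71828, essentially the exact 1.000001**1000000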

    In the end, as I said, it goes well beyond the scope of this thread (and quite a few posters).

    1. Nvidia GP100, with DP capability: $15,000
    2. Nvidia GP102, the same chip in everything but without DP capability: $5,000

    The next best thing for DP is AMD Hawaii, aka the "390/X", in its Pro variants.

    But I've come to terms with the fact that most people insist on staying uneducated.
  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    Malabooga said:
    Quizzical said:
    Malabooga said:
    Torval said:
    Outside of folding, bitcoin mining, and maybe trying to find prime numbers, what sort of workstation graphics processing are we talking about that needs double precision?
    Every single engineering application. Engineers don't deal in "approximate" or "good enough" values lol

    I'm not sure if that was sarcasm; approximate and good enough are pretty much all that engineers deal with.  Quantum mechanics will spoil your day if you expected to have exact values of anything in the real world.

    If you need exact values in computations, you usually can't use floating point anything at all, of any precision.  At minimum, you would have to be extremely careful about overflowing the mantissa and getting rounding errors.
    You are looking at this from a layman's/theoretical point of view, but it doesn't work that way.

    FEA is an approximate (numerical) method. But when it comes down to the actual calculation, it demands high precision (as an example). You can have millions of nodes, each one depending on the previous node's value; even the smallest error at each node results in big mistakes at later nodes (and the error grows with each new node).
    Certainly, there are some things that demand high precision.  There are reasons why some programs need to use double precision rather than single, and some need higher precision than that, even.  I once dealt with an algorithm that required rounding numbers larger than 10^1000000 to the nearest integer.

    But high precision is not at all similar to exact values.  Encryption or hash functions, for example, typically require every bit to be exactly right all the way through, and a single bit getting flipped at any point quickly blows up into the whole thing being complete garbage.  When you need that kind of exact values, you're very leery of using floating-point anything at all, and double precision is useless for the same reasons as single precision.
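
    (A quick illustration of that blow-up: flip one input bit and the whole digest changes.)

        import hashlib

        msg = bytearray(b"exact values matter")
        print(hashlib.sha256(bytes(msg)).hexdigest())

        msg[0] ^= 0x01   # flip a single bit of the first byte
        print(hashlib.sha256(bytes(msg)).hexdigest())   # a completely different digest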

    Or to take another example, suppose that you roll a fair die 1000 times.  What is the probability that you get a multiple of 3 on an even number of those rolls?  It's extremely close to 1/2, but not quite equal to it.  There's no hope of experimentally determining whether the true probability is more or less than 1/2, and throwing more hardware at it won't make a bit of difference.  But 1/2 is not the exact answer, and it's actually not very hard to compute by hand.
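
    (For the curious, the exact value falls out of a standard parity trick: if X ~ Bin(1000, 1/3) counts the rolls that land on a multiple of 3, then

        P(X \text{ even}) = \frac{1 + (1 - 2p)^n}{2} = \frac{1 + (1/3)^{1000}}{2} = \frac{1}{2} + \frac{1}{2 \cdot 3^{1000}},

    which is just barely above 1/2, and far too close to ever resolve by simulation.)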
  • gervaise1gervaise1 Member EpicPosts: 6,919
    edited December 2016
    13lake said:
    Torval said:
    So this corner case for buying gimped retail hardware is based on a fringe scenario of some engineering companies that use cheap parts and cut corners to save money.

    That is a frightening but unsurprising summary, and it explains why modern precision engineering has gone to shit.
    Take a moderate-sized company with, for instance, 2,000-3,000 employees who need their own company-provided workstations. At $3,000 per professional GPU, that alone makes it $6-9 million just for the GPUs in those workstations :)

    The number of original engineering companies who can afford to throw money around like this can be counted on the fingers of one hand (OK, Germany + USA + Britain might have slightly more, but you get the point).

    And look outside Germany/USA/Britain, and maybe, just maybe, France and Israel, and you won't find even one :)

    Also, what's the biggest market for GPUs in the world? China.
    Which country has the highest number of ongoing projects that involve old-school engineering? China.
    (And that's just by nature of having built monstrous amounts of infrastructure in the past few years, infrastructure that all the countries who could afford to do the same have already had in place for a hundred years now xD)

    Yep. Which is why most engineers will just have a PC - or laptop - for stuff like Office. And most will be normal, possibly outdated, PCs with Intel or AMD CPUs, Nvidia or Radeon GPUs, or even Intel or AMD on-board graphics.

    Make no mistake, they will also have access to workstations - or even supercomputers - for stuff like fluid dynamics calculations (it varies by discipline). And there are different ways all this can be, and is, arranged.

    End result: the employees get what they need; the company doesn't spend a fortune every year updating its PCs.
  • MalaboogaMalabooga Member UncommonPosts: 2,977
    edited December 2016
    "I'm not sure if that was sarcasm, approximate and good enough are pretty much all that engineers deal with."

    I was responding to this.

    But if you have something that's the "best to work with", you work with it; it's better than nothing at all, even if you know that it's not 100% correct, or you just threw out some things for the sake of simplicity, because even if they do add to the result, that addition is insignificant in the grand scheme of things (and it might even be compensated for by other, simpler things). But after you've dealt with the theory, you don't just go doing half-arsed calculations because you know that theoretically it isn't 100% correct; quite the opposite lol

    And then there are statements like this:

    "Much better DirectCompute power"

    when that 390 beats the whole Maxwell and Pascal line of cards except the P100 in DP lol (or, to be more precise, those can't even do DP in practice; in theory they do it at 1/32 rate)
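
    (Rough peak numbers, taken from the published spec sheets rather than from anything in this thread, show the gap:

        \text{R9 390: } \frac{5.1\ \text{TFLOPS FP32}}{8} \approx 0.64\ \text{TFLOPS FP64}, \qquad \text{GTX 1080: } \frac{8.9\ \text{TFLOPS FP32}}{32} \approx 0.28\ \text{TFLOPS FP64})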

    But as I said, out of scope for this thread, and the forum as a whole lol

  • GladDogGladDog Member RarePosts: 1,097
    GladDog said:
    filmoret said:
    If you wanna surf the web then get AMD.  If you want to play a video game then buy Nvidia.
    My AMD RX-480 playing BDO and ESO at max graphics begs to differ.

    Also the AMD APU in my XBOX ONE S would like a word with you...
    Awesome! I have no favorite between the Green and Red camps. I buy the best card out there at the time I'm buying one, which is not usually too often. I like a beefy card that lasts.

    That 480 you have is a great card =)

    I'm with you; I go with the best deal I can find.  When I bought the RX-480 I was comparing it with the GTX 1060.  The 1060 was slightly more money, and the 480 had 8 GB of RAM vs 6 GB for the 1060.  Also, the 480 had slightly better performance in DX12 than the 1060, but they were so close that a driver update could erase the difference.

    In the end, cheaper + more RAM + a $10 rebate won me over to the RX-480.


    The world is going to the dogs, which is just how I planned it!


  • MalaboogaMalabooga Member UncommonPosts: 2,977
    New 2017 tests of the new-gen cards, 19 games tested:



    The GTX 1070 is only 31% faster than the RX 480 but costs double. The GTX 1070 is one of THE most overrated and overhyped cards in history (thanks, corrupted media), given its ABYSMAL performance/price and value.

    Won't even comment on the $1,200 Titan XP that's only 17% faster than the GTX 1080 lol

    And yeah, the RX 480 is faster than the GTX 1060 (and will only improve that score over time).

    The RX 470 4GB/RX 480 4GB are BY FAR the best performance/price cards.

    The 1050 Ti is an abysmal card to buy (almost as bad as the 1070), as the RX 470 is only $20-30 more expensive but 44% faster.
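
    (Plugging the thread's own percentages into a quick perf-per-dollar check; the prices here are placeholder assumptions, not figures from any review:)

        # Relative performance figures are the ones claimed above;
        # prices are rough placeholders for the sake of the arithmetic.
        cards = {
            "RX 480":   (1.00, 240),   # baseline
            "GTX 1070": (1.31, 480),   # "31% faster but costs double"
        }
        for name, (perf, price) in cards.items():
            print(f"{name}: {perf / price:.5f} relative perf per dollar")

    By that arithmetic the RX 480 delivers roughly 50% more performance per dollar, which is the whole point of the post.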