Nvidia announces the GeForce GTX 1080 Ti, on sale for $700 next week

Quizzical Member LegendaryPosts: 25,351
As expected, it's a salvage part of GP102.  It disables a little bit of a lot of things, from compute units to memory channels.  Nvidia decided not to repeat the GTX 970 "4 GB" fiasco and will market it as an 11 GB card.  Both the GPU itself and the memory will be clocked higher than the Titan X, making it likely a hair faster in typical situations.

With reviews not yet out because the card isn't launched, we don't officially know exactly how it will perform yet.  But really, we pretty much do from the paper specs, as it's a salvage part of a GPU already on the market.
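
For anyone who wants to see that paper-spec reasoning laid out, here is a minimal back-of-the-envelope sketch. The core counts, clocks, and memory figures are the commonly listed launch specs rather than anything from reviews, so treat them as assumptions:

```python
# Rough paper-spec comparison of the GTX 1080 Ti against the Titan X (Pascal).
# All numbers below are assumed launch specs, not measured results.
# Both chips are cut down from the full 3840-core GP102.

def paper_specs(cores, boost_mhz, mem_channels, mem_gbps):
    tflops = cores * 2 * boost_mhz / 1e6        # FP32: 2 FLOPs per core per clock
    bus_bits = mem_channels * 32                # each GDDR5X channel is 32 bits wide
    capacity_gb = mem_channels                  # 1 GB per channel on GP102 boards
    bandwidth_gbs = mem_gbps * bus_bits / 8     # GB/s
    return tflops, bus_bits, capacity_gb, bandwidth_gbs

titan_x     = paper_specs(3584, 1531, 12, 10)   # ~11.0 TFLOPS, 384-bit, 12 GB, 480 GB/s
gtx_1080_ti = paper_specs(3584, 1582, 11, 11)   # ~11.3 TFLOPS, 352-bit, 11 GB, 484 GB/s

print("Titan X (Pascal):", titan_x)
print("GTX 1080 Ti:     ", gtx_1080_ti)
```

On paper that works out to roughly 3% more compute and about 1% more memory bandwidth than the Titan X, with one of GP102's twelve memory channels disabled - hence the 11 GB figure and the "hair faster" expectation.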

With the GTX 1080 Ti available at $700, a GTX 1080 at $600 looks rather overpriced.  Nvidia agreed and slashed prices on the latter to $500.  They're also cutting prices on the GTX 1070 founders edition to $400, but you can get other GTX 1070s for cheaper than that, so that's not terribly interesting.

One could ask why it took Nvidia 7 months after the launch of the Titan X for the salvage part to arrive.  My guess is that yields weren't very good, and as Nvidia had the top end all to themselves with the GTX 1080 anyway, they didn't feel the need to rush.  Do respins, let the process node mature, or whatever it takes for better yields and get good yields before you start really producing GP102 in high volume.

This sets the bar for AMD Vega, due out in the second quarter of this year.  Vega should beat a GTX 1080 Ti in most of the paper specs (TFLOPS, memory bandwidth, etc.), but that's been the case for most of the last decade, and Nvidia was able to counter with the ability to use the available hardware more efficiently.  For a CPU analogue, think of it as Nvidia having better IPC but AMD having more cores--except that in the GPU world, unlike consumer CPUs, this sometimes meant AMD won.

GCN was a nice architecture in its day, but AMD has been selling derivatives of it for over five years now.  If Vega is nothing more than a lightly tweaked GCN, AMD's GPU side could be in trouble, as that's not going to catch Pascal.  A scaled-up Polaris isn't going to catch a GTX 1080 Ti inside of 300 W and 500 mm^2.  That might be why AMD didn't bother to produce such a chip last year.  GCN was competitive in 2012 and 2013, but Nvidia made Maxwell a lot more efficient in 2014, and AMD has generally been behind ever since.

If Vega is an all new architecture, or at least a massive overhaul, then it could be just about anything.  It could be this generation's equivalent of the Radeon 9700 Pro that lapped the competition, or it could be this generation's equivalent of the Radeon HD 2900 XT that was hot, late, and slow.

The upshot of this is, if you want a high end gaming card, you really should wait until next week.  I don't see a dire need to wait for Vega, as I'd be very surprised if Vega blows away Pascal or forces Nvidia to sell the GTX 1080 Ti for $500 and the GTX 1080 for $350.  If the top end Vega does match a GTX 1080 Ti in performance, it will probably match it in price, too.

Comments

  • filmoret Member EpicPosts: 4,906
    They delivered much better than expected this time.  35% faster than the fastest GPU on the market.  Really low price.  And a new cooling system that helps it run 41 degrees cooler than the 1080.  Downside is the 250 W power draw.

    Yeah, AMD got one over on Intel for sure.  But getting one over on Nvidia isn't going to be easy at all.
    Are you onto something or just on something?
  • Cleffy Member RarePosts: 6,412
    edited March 2017
    From the news I've heard, I am pretty sure Vega will be a modified GCN. I don't think that is a bad thing. Pascal is efficient, but it does poorly in DX12 and Vulkan. I really can't see the justification in buying a card that you will use for 4 years if it does poorly with these APIs; I would only consider it sub-$300. The major advantage of using GCN over the last few generations is that the cards aged incredibly well: many older AMD cards now compete with cards a tier higher than they did at release. The architecture itself also has an advantage in GPU compute.
    I would also wait for benchmarks. Vega is releasing in May; this is releasing in April. Are they doing a soft release in order to drive down Vega sales, or is it a hard release because they know the card is not competitive?
  • kitarad Member LegendaryPosts: 7,910
    However, people don't buy two cards, so if they buy an Nvidia card now, they aren't likely to buy an AMD one so soon after it releases. Lots of people are just waiting for Nvidia prices to go down.

  • filmoret Member EpicPosts: 4,906
    I don't think the high end Vega will be less than $600.  Nvidia releasing this card for only $700 is nothing more than a counter to what they expect from AMD.  It's a strong-arm move to prevent AMD from taking their market.
    Are you onto something or just on something?
  • Ridelynn Member EpicPosts: 7,383
    Quizzical said:
    As expected, it's a salvage part of GP102.  

    ...

    One could ask why it took Nvidia 7 months after the launch of the Titan X for the salvage part to arrive.  My guess is that yields weren't very good, and as Nvidia had the top end all to themselves with the GTX 1080 anyway, they didn't feel the need to rush.  Do respins, let the process node mature, or whatever it takes for better yields and get good yields before you start really producing GP102 in high volume.

    ...

    GCN was a nice architecture in its day, but AMD has been selling derivatives of it for over five years now. 

    I don't know if yields had as much to do with it as them simply wanting something in their back pocket to take the wind out of the sails of an impending Vega. Except they didn't wait for Vega, they attacked Ryzen (which is odd if you think about nVidia being worried about a CPU competing against their GPU business, but I guess in a business sense, the CPU business is what AMD is betting the bank on right now, so if you wanted to put them down for the count, you'd try to attack them where it hurts the most). I don't think the 1080 Ti has really done anything to drown out Ryzen news; Ryzen talk is everywhere, and I barely hear anything about the 1080 Ti - probably because... yawn... it's not really any different from the Titan X, just a bit cheaper. Ironically, the same could be said for Ryzen - it's not really any different from Intel HEDT, just a bit cheaper - but there it's between two companies, not one company competing against itself, so I guess that does make it newsworthy.

    As far as GCN goes - I don't know whether it necessarily needs a massive overhaul or not. I guess Vega will be a big tell as far as that goes. My understanding was that GCN does receive some pretty significant updates generation over generation -- Fiji isn't the same GCN that's in Tahiti. I always thought it was more like x86 - it's a common platform, but implementations of it get more and more efficient through the generations, and additional instruction sets are added over the years.
  • Reizla Member RarePosts: 4,092
    @Quizzical I think you've written better hardware reviews than this one. I know the GTX 1080 Ti is not out yet, but you're mostly making a speculative comparison with AMD Vega rather than looking at what the card itself will be like (aside from the 1st paragraph).

    That being said, I think the pricing of this card is decent if you've got the CPU to pull it and are 'in need' of a replacement (coming from a 780 or 970).
  • Hefaistos Member UncommonPosts: 388
    Are there games on the market where a 980 Ti won't fill the need? I mean, I use a 980 Ti for 1440p and I am totally satisfied.

    I would say it's a great day to buy something from a 980 Ti up to a 1080... but getting a 1080 Ti now, considering there are no games out there that another card wouldn't perform very well in, would be a bad call.

    My 5 cents.
  • Ridelynn Member EpicPosts: 7,383
    Hefaistos said:
    Are there games on the market where a 980 Ti won't fill the need? I mean, I use a 980 Ti for 1440p and I am totally satisfied.

    I would say it's a great day to buy something from a 980 Ti up to a 1080... but getting a 1080 Ti now, considering there are no games out there that another card wouldn't perform very well in, would be a bad call.

    My 5 cents.
    Well, the short answer to that question is Yes, there are.

    People bought the Titan X, and it went for $1,200. So there's a market out there for people who want/need all the performance they can get. If you are trying to do something like push MSAA at 4K, I doubt even the 1080 Ti would get you 60 FPS in a lot of games.

    The question I think you are trying to ask is: does the 1080 Ti meet ~my~ needs? There are certainly games out there that it will perform well on. It sounds like you're happy, and a lot of other people would certainly agree with you (I'm perfectly happy with my 980 right now, but then again I'm pretty forgiving; I don't need 60+ FPS constant and I'm OK with adjusting some settings down) - but there are some people who want more, and a smaller set of folks who actually really could use more power.

    If you're gaming at 1080p, it's absolutely still overkill. The 1080 continues to be overkill for that market segment, so that's nothing new.
    For the 1440p/2K folks - you could make a case for it either way. It just depends on whether you're one of those MAX-MAX folks, or a 60+ FPS purist, or not (and whether you have the budget to buy it in the first place).
    If you're going to be getting into 4K, then you want all the power you can get your hands on.

    Vega may still be a decent ways out. It was announced as a second quarter release, which could be as early as April, but it could also be as late as July - and with every possibility that it could slip back further than that again. There are also the rumors (just rumors and speculation, but hey, what else do we have to go on around here) that Vega is competitive in the existing 1070/1080 market, but here we are looking at a performance tier that's (slightly) above that, so if you're looking for all the performance you can get, Vega may not be worth waiting for, and if you're looking to spend $600+ anyway on your GPU - Titan X/1080 Ti are it, and that probably won't change much even after Vega's release (unless Vega blows us all away).

    But if you're in that $300-$500 range, Vega could shake things up a good deal. AMD is already competitive in the <$300 market with the RX 4xx series, so Vega won't do much in that segment.
  • Ridelynn Member EpicPosts: 7,383
    kitarad said:
    However, people don't buy two cards, so if they buy an Nvidia card now, they aren't likely to buy an AMD one so soon after it releases. Lots of people are just waiting for Nvidia prices to go down.
    Don't underestimate the folks who are willing to go top tier no matter what it takes. They have a very good habit of buying whatever is fastest and selling whatever they are upgrading from on eBay to cut some of the losses.

    I know a lot of people who jumped from a 1080 to a Titan X and just eBayed their used 1080s for around $400-450. If Vega (or whatever, really) proves to be better than what they have now, they will fork out the cash and just flip whatever they had before.

    It's not a big market, but it definitely exists. I wish I had that kind of disposable income.
  • sacredfool Member UncommonPosts: 849
    Ridelynn said:
    Quizzical said:
    As expected, it's a salvage part of GP102.  

    ...

    One could ask why it took Nvidia 7 months after the launch of the Titan X for the salvage part to arrive.  My guess is that yields weren't very good, and as Nvidia had the top end all to themselves with the GTX 1080 anyway, they didn't feel the need to rush.  Do respins, let the process node mature, or whatever it takes for better yields and get good yields before you start really producing GP102 in high volume.

    ...

    GCN was a nice architecture in its day, but AMD has been selling derivatives of it for over five years now. 

    I don't know if yields had as much to do with it as them simply wanting something in their back pocket to take the wind out of the sails of an impending Vega. Except they didn't wait for Vega, they attacked Ryzen
    They didn't "attack" Ryzen.

    nVidia probably decided that releasing the 1080 Ti alongside Ryzen means there will be a lot of people who will put this new GPU in their new Ryzen builds.

    Of course, those same people would have probably put the 1080 in that build but I guess nVidia wanted to give people a new and shiny GPU to pair with their new and shiny CPU.


    Originally posted by nethaniah

    Seriously Farmville? Yeah I think it's great. In a World where half our population is dying of hunger the more fortunate half is spending their time harvesting food that doesn't exist.


  • Cleffy Member RarePosts: 6,412
    edited March 2017
    Looks like another soft launch.
    A 1080 Ti or Titan (Pascal) can make sense with a Ryzen build depending on your workload. For instance, if you use Blender a lot, then nVidia makes a bit more sense here, as most of the GPGPU functionality is written in CUDA. There is also a 3D scanning tool that composes the final model using CUDA. It's typical to do test renders on the GPU and the final render on the CPU: GPUs are designed to calculate 32-bit color data very fast, while it's easier to get 64-bit data from the CPU and calculate the bounces. Personally, I would still wait for Volta.
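    As a concrete illustration of that test-render-on-GPU, final-render-on-CPU workflow, here is a minimal sketch. It assumes Blender's Cycles engine, has to be run from Blender's own Python console, and assumes CUDA is already selected as the compute device in the user preferences; it is only an example of the idea, not Cleffy's exact setup, and the output paths are hypothetical.

```python
import bpy  # Blender's Python API; only available inside Blender

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Quick, noisy test render on the GPU (CUDA on an Nvidia card).
scene.cycles.device = 'GPU'
scene.cycles.samples = 64                    # low sample count for a fast preview
scene.render.filepath = '//test_render.png'  # hypothetical output path
bpy.ops.render.render(write_still=True)

# Final render on the CPU with a much higher sample count.
scene.cycles.device = 'CPU'
scene.cycles.samples = 1024
scene.render.filepath = '//final_render.png'
bpy.ops.render.render(write_still=True)
```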
  • AmazingAvery Age of Conan Advocate Member UncommonPosts: 7,188
    Vega will have to beat it by 10% in game performance and be 10% less expensive for it to be worth it or even considered a success. Should be easy with the 8 months they had with the 1080 out.



  • Quizzical Member LegendaryPosts: 25,351
    AmazingAvery said:
    Vega will have to beat it by 10% in game performance and be 10% less expensive for it to be worth it or even considered a success. Should be easy with the 8 months they had with the 1080 out.
    Nonsense.  If Vega matches a GTX 1080 Ti in price, performance, and power consumption, then it will be successful.  Not super-awesome blow-away-the-competition successful, but plenty good enough to make a ton of money.
  • filmoret Member EpicPosts: 4,906
    Quizzical said:
    Vega will have to beat it by 10% in game performance and be 10% less expensive for it to be worth it or even considered a success. Should be easy with the 8 months they had with the 1080 out.
    Nonsense.  If Vega matches a GTX 1080 Ti in price, performance, and power consumption, then it will be successful.  Not super-awesome blow-away-the-competition successful, but plenty good enough to make a ton of money.
    It might.  Meanwhile the 1080 will be considered last generation by the time it comes out.
    Are you onto something or just on something?
  • Ozmodan Member EpicPosts: 9,726
    edited March 2017
    Most of you should know that a decent 4K monitor or even a good TV costs well over the price of a nice computer system ATM.  So I don't really know what all the argument is about.

    As to Vega, if AMD can put a price-competitive GPU out there, it will sell well, because Nvidia has been resting on their laurels for too long - clearly shown by the high prices and poor driver support lately.  If nothing else, all of you should be congratulating AMD for making them sit up and take notice that they are not the only game in town.
  • Ridelynn Member EpicPosts: 7,383
    filmoret said:
    Quizzical said:
    Vega will have to beat it by 10% in game performance and be 10% less expensive for it to be worth it or even considered a success. Should be easy with the 8 months they had with the 1080 out.
    Nonsense.  If Vega matches a GTX 1080 Ti in price, performance, and power consumption, then it will be successful.  Not super-awesome blow-away-the-competition successful, but plenty good enough to make a ton of money.
    It might.  Meanwhile the 1080 will be considered last generation by the time it comes out.
    There is a big if here, but bear with me.

    IF Vega matches 1080Ti
    THEN why does it matter that the 1080 would be considered last generation at that point? You're comparing Vega to the 1080 Ti, not the 1080, so whatever the 1080 may or may not be would be irrelevant at that point.


  • AmazingAvery Age of Conan Advocate Member UncommonPosts: 7,188
    Quizzical said:
    Vega will have to beat it by 10% in game performance and be 10% less expensive for it to be worth it or even considered a success. Should be easy with the 8 months they had with the 1080 out.
    Nonsense.  If Vega matches a GTX 1080 Ti in price, performance, and power consumption, then it will be successful.  Not super-awesome blow-away-the-competition successful, but plenty good enough to make a ton of money.
    Not really. The amount of time they have had to sort their house out has been ages....
    IMO it needs to improve on the Ti to be successful - they've had so much time to strategize and plan. That's the minimum.
    At worst it needs to be equal and 10% less $$$$$$

    Would be nice to see AMD competitive at the high end; it's been so long.



  • filmoret Member EpicPosts: 4,906
    RX 490 (or RX fury, or whatever)... or 1080 ti. Both valid candidate for my next graphic card.
    Considering how good the GTX 1080 already is, the ti will be a bomb.
    EH?  It's 35% faster and runs 40 degrees cooler.  Looks like it's a nuke.
    Are you onto something or just on something?
  • Cleffy Member RarePosts: 6,412
    edited March 2017
    35% is correct for the 1080 -> 1080 Ti. You are going from 8 gigaflops to 12 gigaflops, so 35%-50% is to be expected. I glossed over some reviews that showed about that performance delta. 40C cooler is an exaggeration considering it's using a blower-style cooler. If it's compared to the R9 Fury X (because he didn't see Picard was talking about the next-generation Fury X card), then it's about 60% faster, but it will run much hotter. The Fury X uses an AIO water cooler; the thing tops out at 50C for a GPU.
    The R9 Fury X was not that long ago; it was competitive at the high end and still competes against the 1070. For me, I have no intention of buying a Pascal or Polaris card for my next GPU. If Vega is not very good, it's a wait for Volta.
  • Quizzical Member LegendaryPosts: 25,351
    Cleffy said:
    35% is correct for the 1080 -> 1080 Ti. You are going from 8 gigaflops to 12 gigaflops, so 35%-50% is to be expected. I glossed over some reviews that showed about that performance delta. 40C cooler is an exaggeration considering it's using a blower-style cooler. If it's compared to the R9 Fury X (because he didn't see Picard was talking about the next-generation Fury X card), then it's about 60% faster, but it will run much hotter. The Fury X uses an AIO water cooler; the thing tops out at 50C for a GPU.
    The R9 Fury X was not that long ago; it was competitive at the high end and still competes against the 1070. For me, I have no intention of buying a Pascal or Polaris card for my next GPU. If Vega is not very good, it's a wait for Volta.
    The GTX 1080 is theoretically about 8.9 TFLOPS and the GTX 1080 Ti is 11.3 TFLOPS, with both at their stock boost clock speeds.  Taken to more significant figures, the latter is 27.8% faster than the former at things that scale with TFLOPS--which includes some cache bandwidths.  It also has 37.5% more memory bandwidth, at least if you assume the same GDDR5X clock speed for both.  So you'd expect gains in that ballpark.  35% is on the high side, and you're not going to get to 50%.
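    For anyone who wants to check that arithmetic, here is a quick sketch using the commonly listed shader counts and boost clocks (2560 at 1733 MHz for the 1080, 3584 at 1582 MHz for the 1080 Ti); treat those figures as assumptions rather than review data.

```python
# Back-of-the-envelope check of the deltas quoted above.

def fp32_tflops(cuda_cores, boost_mhz):
    return cuda_cores * 2 * boost_mhz / 1e6   # 2 FLOPs per core per clock (FMA)

gtx_1080    = fp32_tflops(2560, 1733)   # ~8.9 TFLOPS
gtx_1080_ti = fp32_tflops(3584, 1582)   # ~11.3 TFLOPS

print(f"Compute advantage:   {100 * (gtx_1080_ti / gtx_1080 - 1):.1f}%")  # ~27.8%

# 352-bit vs 256-bit memory bus, assuming the same GDDR5X data rate on both.
print(f"Bandwidth advantage: {100 * (352 / 256 - 1):.1f}%")               # 37.5%
```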
  • Cleffy Member RarePosts: 6,412
    My bad, I actually didn't look up the specific gigaflops; I just knew the 1080 was in the 8 gigaflop range and the 1080 Ti around 12.
  • AmazingAvery Age of Conan Advocate Member UncommonPosts: 7,188
    If the gains being talked about are in gaming, then let's be really clear if you're looking at TFLOPS. An 8.9 TFLOPS 1080 beats the Fury X at 8.5 TFLOPS, and does so in DX11 by up to 40%, and even in DX12 in AMD "devolved" titles it still wins.

    AMD cards have always had significantly more teraflops and memory bandwidth than Nvidia cards of the same price bracket and tier, and often of the tier above, too.

    Real gaming performance is all that matters.
    980 Ti 6.5 TFLOPS vs Fury X 8.5 TFLOPS
    780 Ti 5.04 TFLOPS vs 290X 5.6 TFLOPS
    680 3.1 TFLOPS vs 7970 3.8 TFLOPS
    These examples clearly show more TFLOPS doesn't always mean more performance in games.
    AMD has always had more FP32 performance, and it sounds good on paper or when someone quotes it, but in real-world situations for gamers the performance of their flagships was always about equal.
    This is why I feel the 1080 Ti with 11.5 TFLOPS will beat a Vega 10.

    An overclocked 1080 Ti at around 14 TFLOPS will be a beast. We could see Vega reaching 15 TFLOPS of FP32 with some high end card. 4096 shaders running at 1500 MHz will give about 12.5 TFLOPS of FP32. I believe AMD will keep the clock at 1500 MHz base, 1600 MHz boost. They will give their partners the advantage of clocking the card between 1600 MHz and 1900 MHz with better PCB designs, and selling it for a higher price.

    Vega will have a big advantage against Nvidia Pascal cards in DX12 and Vulkan; there's nothing Nvidia can do to fix that. I also think Vega's AMD reference design will be around 12.5 TFLOPS of FP32 for $599, with AMD partners at higher clock speeds offering around 14.5 TFLOPS of FP32 on better PCB designs for $699, the same price as the 1080 Ti.
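    To make the shader-count arithmetic in that speculation explicit, here is a minimal sketch; the 4096-shader figure and the 1500-1900 MHz clock range are the post's own assumptions, not confirmed Vega specs.

```python
# Shaders x clock arithmetic for the hypothetical Vega configurations above.

def fp32_tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1e6     # 2 FLOPs per shader per clock (FMA)

print(fp32_tflops(4096, 1500))   # ~12.3 TFLOPS, the "about 12.5" reference figure
print(fp32_tflops(4096, 1770))   # ~14.5 TFLOPS, the hypothetical partner card

def clock_for_tflops(target_tflops, shaders=4096):
    # Clock (MHz) needed to hit a given FP32 figure with this many shaders.
    return target_tflops * 1e6 / (shaders * 2)

print(clock_for_tflops(14.5))    # ~1770 MHz, inside the 1600-1900 MHz range mentioned
```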



  • Quizzical Member LegendaryPosts: 25,351
    AmazingAvery said:
    If the gains being talked about are in gaming, then let's be really clear if you're looking at TFLOPS. An 8.9 TFLOPS 1080 beats the Fury X at 8.5 TFLOPS, and does so in DX11 by up to 40%, and even in DX12 in AMD "devolved" titles it still wins.

    AMD cards have always had significantly more teraflops and memory bandwidth than Nvidia cards of the same price bracket and tier, and often of the tier above, too.

    Real gaming performance is all that matters.
    980 Ti 6.5 TFLOPS vs Fury X 8.5 TFLOPS
    780 Ti 5.04 TFLOPS vs 290X 5.6 TFLOPS
    680 3.1 TFLOPS vs 7970 3.8 TFLOPS
    These examples clearly show more TFLOPS doesn't always mean more performance in games.
    AMD has always had more FP32 performance, and it sounds good on paper or when someone quotes it, but in real-world situations for gamers the performance of their flagships was always about equal.
    This is why I feel the 1080 Ti with 11.5 TFLOPS will beat a Vega 10.

    An overclocked 1080 Ti at around 14 TFLOPS will be a beast. We could see Vega reaching 15 TFLOPS of FP32 with some high end card. 4096 shaders running at 1500 MHz will give about 12.5 TFLOPS of FP32. I believe AMD will keep the clock at 1500 MHz base, 1600 MHz boost. They will give their partners the advantage of clocking the card between 1600 MHz and 1900 MHz with better PCB designs, and selling it for a higher price.

    Vega will have a big advantage against Nvidia Pascal cards in DX12 and Vulkan; there's nothing Nvidia can do to fix that. I also think Vega's AMD reference design will be around 12.5 TFLOPS of FP32 for $599, with AMD partners at higher clock speeds offering around 14.5 TFLOPS of FP32 on better PCB designs for $699, the same price as the 1080 Ti.
    For many years, the situation has been that AMD offered more brute force hardware available to do work, while Nvidia offered a more clever scheduler that was better able to exploit the hardware available.  They've both moved toward each other quite a bit over the years, most notably with Kepler adding massively more shaders than before and GCN greatly improving AMD's scheduling.

    But just because something has historically been the case doesn't mean that it has to be that way forever.  If Vega is just GCN/Polaris plus HBM2, then yeah, Nvidia is still going to be ahead for gaming, and a Radeon RX Vega would probably be a little faster than a GeForce GTX 1080 and shy of a GTX 1080 Ti.  But if it's a very new architecture that can match what Maxwell did for scheduling three years ago, then the question isn't so much whether Vega will beat a GTX 1080 Ti but by how much.  And on that, we just don't know yet.
  • Cleffy Member RarePosts: 6,412
    edited March 2017
    I used TFLOPS (miswrote it as gigaflops) because they are the same architecture. The difference in TFLOPS should track pretty closely with the difference in FPS.
  • Hrimnir Member RarePosts: 2,415

    Just to add a car example, which I've used before.  I had a buddy who has an '03 Mustang Cobra with a Kenne Bell supercharger kit.  He dyno'd around 540 WHP at 6000' altitude.  He raced a guy with a 670 WHP Toyota Supra from a rolling 40 mph and walked 3 car lengths on him by the time they hit 100.

    Now, if you know anything about cars, it's more than just peak HP figures that determine performance. Yes, the 'Stang had a lower peak HP; however, it had a higher HP figure at literally every other point on the dyno chart, meaning it had more area under the curve. So from, say, 2000-5000 RPM he was putting significantly more power to the ground up to his peak, whereas the Supra was anemic under about 5500 RPM.

    Now, how is this relevant?

    The numbers don't always tell the whole picture. Even things as simple as drivetrain types can make a difference; different transmissions are more or less efficient.  Different gear ratios matter, etc., etc.

    So just looking at the "raw power" of a video card, like TFLOPS or memory bandwidth, generally isn't going to tell you everything you need to know.

    Basically, like Quizzical said, AMD typically had more "horsepower" but had trouble putting that horsepower to good use, and was getting beaten by a "less powerful" card that had more tricks up its sleeve to utilize the horsepower it did have.  Think rear-wheel drive vs. all-wheel drive.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche
