
What is beyond Nvidia's Volta 11xx? 12xx of course, but what is it?

centkin Member Rare, Posts: 1,526
We knew quite a bit about Volta long before Pascal hit the market.  Pascal is now aging and Volta is about to be released, and ... 

There is remarkably little being said, from what I have seen.  I mean, there is some talk of real-time ray tracing, but certainly no talk of a 12xx series, what the cards will look like, or when.  We had all of that far earlier in the prior generations. 

Should I take it that the 11xx series will be it for quite a long time, as opposed to a significant new 12xx series arriving a couple of years out?

Comments

  • Ridelynn Member Epic, Posts: 7,057
    edited May 2018
    Wasn’t Tesla in the mix somewhere?

    *edit*: Turing
  • centkin Member Rare, Posts: 1,526
    Turing is merely the name of the architecture of the consumer-level Volta cards -- assuming they even use that.  Things are just so opaque right now.
  • Vrika Member Epic, Posts: 6,411
    I think the reason is that NVidia is dominating AMD in the high-end market. They don't want to make any noise about their future-generation GPUs, so that those who want a high-end GPU will buy now and choose NVidia.
  • Cleffy Member Rare, Posts: 6,246
    edited May 2018
    This summer may be interesting. Leaked benchmarks of AMD's next GPUs are showing 1080 performance at a $250 price point. For both Volta and Vega 2, there is speculation about the memory adopted and the production process. Will it be on 7nm or 10nm? In any case, you won't be able to buy any until late this year, if at all.


    Real-time ray tracing has been a thing for a decade. The problem is that you need an architecture like AMD's, and it will decrease the quality of the visuals for little benefit.
  • Torval Member Legendary, Posts: 19,934
    We're at this weird point where 1080 and 1440 don't need the highest-end video cards anymore. Soon integrated graphics will tackle that. 4K is a few years away, or at least until 4K monitors are $300 - $500 for 144Hz G-Sync IPS or better panels. Right now a good 4K panel that roughly meets those requirements is $2K.

    I'm gaming at 1080/60Hz now with a nice ASUS 29" IPS monitor. My next upgrade will be to 1440/144Hz. For me, increased power is less interesting than size and power reductions. 4K is too expensive overall: expensive bandwidth, expensive storage, expensive to produce, expensive to render, and for what? A small increase in texture fidelity? For me, 1440 at a higher refresh rate on a higher-quality panel is a better experience, so more powerful features aren't as attractive anymore.

    I'm looking for smaller, lighter, and lower power consumption at the same or slightly better performance. My next desktop build is centered around PC power in a console form factor. That means mITX and, ideally, integrated graphics with the power of a 12xx or Zen. I want to do that without creating a space heater as a side effect.

    So ray tracing, fancy features, or huge power bumps aren't going to grab my attention as much as shrinking the size and power rating.
    Fedora - A modern, free, and open source Operating System. https://getfedora.org/

    traveller, interloper, anomaly, iteration


  • Ridelynn Member Epic, Posts: 7,057
    edited May 2018
    centkin said:
    Turing is merely the name of the architecture of the consumer-level Volta cards -- assuming they even use that.  Things are just so opaque right now.
    Or Turing is the mining-specific architecture. Or Turing is an AI-oriented card.

    Or whatever other rumor wants to be floated next week.

    Volta is out in the compute-oriented cards (Tesla/Titan V), so there's a good case to be made for 1100 being Volta. But it's also still at insane price points (maybe yields, maybe because Pascal is still selling out)... so maybe you just refresh Pascal into an 1100 to keep people from waiting for Volta (at least for a while).

    Pascal right now is still king of the hill. The Intel model in this situation would be to milk it for everything it's worth - don't release big jumps, because that only makes it harder to one-up yourself in the future. Do small but measurable performance increases on a steady cadence to keep the sales numbers rolling. That served Intel well for a long time, and AMD has only just now caught up. I'm not saying I like it, but it makes smart business sense.

    nVidia as a company, though, has a reputation for making those big leaps, and their stock price has been rewarded because they have been able to deliver for a long while now.

    If I were Jensen Huang... 
    The computer graphics SKU market right now is a mess, largely because of the influence of mining. That needs to be addressed. nVidia is "sort of" doing that: they have a new 1050 SKU that shouldn't appeal to miners. They need to come out with some product that specifically does appeal to miners, so the rest of the GPU lineup can become unconstrained. Or just wait for Bitmain to do it for them and watch their mining market dry up and blow away (which may already be happening).

    A mining-specific SKU would probably go a long way. No video out required, optimized clocks and firmware, datacenter-grade high density cooling. Get them to where they are competitive in price per hash, cut out what miners don't need, and add in what they do need. Either release as a reference card for the AIBs to manufacture, or get your first-party OEM to crank them out and make them available for bulk order.

    Then "soft lock" the drivers, so that consumer cards can mine, but only with 1 or 2 cards at a time (similar to SLI). That lets Joe Miner keep mining in his basement on his gaming rig, but keeps the folks with 200 8-card risers who are turning their basements into 1MW datacenters from buying up all your stock out of Best Buy. Sure, there are ways around that, but with a mining-specific SKU that goes around the limit and offers other benefits, there's no incentive to put in that work.

    RAM is going to be an issue no matter what. Supply is constrained for everything right now. There's not a lot you can do about that, apart from getting those smart engineers out there to keep optimizing your memory-use algorithms and figure out ways to do more with less, or mix different types and create even more cache levels, or something. It's like hard drive space in the early '90s. It was very expensive per MB - most programs were only a few hundred kB at most, very optimized for space, and we had programs that would compress/decompress on the fly (those still exist, and are still used on SSDs). Not that GPU-grade RAM has ever been cheap, or that there has ever been a lot of it... but there's a big difference between something being merely expensive and it not being available at any cost.

    Whatever comes after Pascal doesn't really matter, until you get this situation with mining and RAM resolved. Without that, it will just be more extremely high prices with almost no availability... you would be just as well served going out and getting the Titan-V, which is available now.
  • Vrika Member Epic, Posts: 6,411
    Cleffy said:
    This summer may be interesting. Leaked benchmarks of AMDs next GPUs are showing 1080 performance at $250 price point.
    Where did you find those benchmarks? I didn't think we'd have any benchmarks of AMD's next GPUs yet, or even know whether it's going to be Vega 12, Navi, or something else.
  • Vrika Member Epic, Posts: 6,411
    Ridelynn said:

    Pascal right now is still King of the Hill. The Intel model in this situation was to milk it for everything that it's worth - don't release big jumps, because that only makes it harder to one-up yourself in the future. Do small but measurable performance increases on a steady cadence to keep the sales numbers rolling. That served Intel well for a long time, and AMD has just now caught up. Not saying I like it, but it makes for smart business sense.

    nVidia as a company, though, has a reputation for making those big leaps, and their stock price has been rewarded because they have been able to deliver for a long while now.
    I think NVidia would like to make as big a leap as possible. With GPUs, a big leap is easier to translate into a smaller chip size and thus lower manufacturing costs.

    NVidia might want to decrease their chip size if they feel they're beating AMD by a wide margin, but I don't think they'd want to withhold any performance.
  • centkin Member Rare, Posts: 1,526
    7nm is like 29 atoms wide -- it is amazing anything works at all at that scale.
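As a rough sanity check on that figure (assuming a Si-Si bond length of about 0.235 nm, a standard value for crystalline silicon; the bond length is an assumption, not from the post):

```python
# Back-of-envelope: how many silicon atom spacings fit across 7 nm?
# Assumption: Si-Si bond length of ~0.235 nm in crystalline silicon.
si_si_bond_nm = 0.235
feature_nm = 7.0

atoms_across = feature_nm / si_si_bond_nm
print(f"~{atoms_across:.0f} atom spacings across {feature_nm:.0f} nm")
```

which lands right around the "29 atoms" quoted above.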
  • Quizzical Member Legendary, Posts: 22,078
    centkin said:
    7nm is like 29 atoms wide -- it is amazing anything works at all at that scale.
    The top-line process node number is the smallest dimension of anything in the entire process.  Individual transistors are actually much, much larger than that.  For example, for Nvidia's GV100, if you do some back-of-the-envelope arithmetic from the announced numbers of 21.1 billion transistors and 815 mm^2, you get about 26 transistors per um^2.  If you assume that transistors are square and that the entire chip is covered by transistors of some sort, that comes to about 197 nm on a side for the size of a transistor.

    Now, there are a variety of reasons why those simplifying assumptions are wrong, but being off by 20% here and 50% there isn't the difference between 197 nm on a side and a 12 nm process node.
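That arithmetic is easy to reproduce; the numbers below are the announced GV100 specs cited in the post, and the square-transistor tiling is the same simplification:

```python
# Reproduce the back-of-envelope GV100 transistor-size estimate:
# 21.1 billion transistors on an 815 mm^2 die.
transistors = 21.1e9
die_area_um2 = 815 * 1e6                      # 815 mm^2 in square micrometers

density_per_um2 = transistors / die_area_um2  # ~26 transistors per um^2

# Simplification from the post: square transistors tiling the whole die.
side_nm = (1 / density_per_um2) ** 0.5 * 1000
print(f"{density_per_um2:.0f} transistors/um^2, ~{side_nm:.0f} nm on a side")
```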
  • Ozmodan Member Epic, Posts: 9,726
    Personally, I would not look for Nvidia to make a huge jump with the next graphics card series.  Just like Intel, they still own the market, but the competition is catching up.  Yields on 7nm are poor at the moment, and they do not seem to be improving fast enough for any sort of volume production this year.

    As others have suggested, you might just see an improved Pascal for the next release.

    I completely agree with Torval: 4K is still a ways off for most gamers.  I am using a 40" 4K TV as my main monitor, running it at 1440 resolution, and have few issues.  My expectation is that I will not bother to upgrade my GPU this year unless something like Cleffy surmises happens with AMD.
  • Ozmodan Member Epic, Posts: 9,726
    edited May 2018
    Looks like we will get the info on Nvidia's new GPU on August 20th.

    https://www.pcgamesn.com/nvidia-next-gen-gpu-hot-chips

    That means you will be lucky to find one of these cards for Xmas.  

  • Vrika Member Epic, Posts: 6,411
    Ozmodan said:
    Looks like we will get the info on Nvidia's new GPU on August 20th.

    https://www.pcgamesn.com/nvidia-next-gen-gpu-hot-chips

    That means you will be lucky to find one of these cards for Xmas.  
    If you look closely, they are also giving "the info" on AMD's Raven Ridge APU, which was already released.

    It's good to know that we're getting some info on NVidia's next gen GPUs, but that doesn't necessarily mean a product reveal or release date announcement.
     
  • Ozmodan Member Epic, Posts: 9,726
    Vrika said:
    Ozmodan said:
    Looks like we will get the info on Nvidia's new GPU on August 20th.

    https://www.pcgamesn.com/nvidia-next-gen-gpu-hot-chips

    That means you will be lucky to find one of these cards for Xmas.  
    If you look closely they are also giving "the info" on AMD's Raven Ridge APU that was already released.

    It's good to know that we're getting some info on NVidia's next gen GPUs, but that doesn't necessarily mean a product reveal or release date announcement.
    Maybe I am interpreting it wrong, but it sure looks like an Nvidia GPU product reveal to me.
  • Vrika Member Epic, Posts: 6,411
    Ozmodan said:
    Vrika said:
    Ozmodan said:
    Looks like we will get the info on Nvidia's new GPU on August 20th.

    https://www.pcgamesn.com/nvidia-next-gen-gpu-hot-chips

    That means you will be lucky to find one of these cards for Xmas.  
    If you look closely they are also giving "the info" on AMD's Raven Ridge APU that was already released.

    It's good to know that we're getting some info on NVidia's next gen GPUs, but that doesn't necessarily mean a product reveal or release date announcement.
    Maybe I am interpreting it wrong, but it sure looks like an Nvidia GPU product reveal to me.
    It could be a product reveal. But the problem is that it could also be something else.

    Tom's Hardware writes:
       "Nvidia's presentation doesn't directly signify that the new GPUs will come to market soon. We first learned about AMD's Zen microarchitecture at the event in August 2016, but those processors didn't land in desktop PCs until March 2017. Conversely, many companies provide more detail on shipping products, so there's a chance that Nvidia's latest GPUs could already be on the market when the presentation takes place"

    https://www.tomshardware.com/news/nvidia-gtx-1180-gtx-2080-hot-chips-30,37152.html
  • Ozmodan Member Epic, Posts: 9,726
    I very much doubt that the new architecture will be available to consumers by August.  We certainly would have heard leaks before now if they were going to have boards out by then.

    We might have more info on the architecture by then, but shipping products? Sorry if I scoff at that.
  • AmazingAvery Age of Conan Advocate, Member Uncommon, Posts: 7,179
    August or just prior will be the announcement; I agree there will be no shipping then. I'm thinking Founders Edition cards might be available in Sept/Oct and 3rd-party cards in Nov/Dec. We'll possibly see some BFGDs by then too. GDDR6 is in production and ramping, which is more or less the reason for the wait.
    It'll still be a year+ for AMD's high end, and with Vega being a disappointing failure, unable to capture any of the high-end market, history will repeat itself: a long wait for a later-to-market offering. 





  • Ridelynn Member Epic, Posts: 7,057
    edited June 2018

    AmazingAvery said:
    It'll still be a year+ for AMD's high end, and with Vega being a disappointing failure, unable to capture any of the high-end market, history will repeat itself: a long wait for a later-to-market offering. 


    I don't know that I would call Vega a failure. I do agree it's disappointing, though not in the same manner.

    Vega is just now starting to show up near MSRP. It's been sold out/limited availability, almost constantly.

    Now, those on Team Green will say it's because of production problems - AMD can't make enough, yields are bad, HBM availability sucks, whatever. (Let's ignore the fact that nVidia is in almost the same boat.) And they do have some evidence: the Steam Hardware Survey (however much stock you put in that) hasn't shown a big uptick in AMD ownership (it is up, but it's still not a big number).

    Those on Team Red will say "WTF, miners suck". And they have a point there. AMD's Vega is among the most desirable cards for mining, and miners are buying them all. That would be one logical explanation as to why the Steam Survey numbers haven't budged much.

    AMD is selling every card they can make, and almost all of them are selling for more than AMD's recommended price point. From AMD's point of view, it's hardly a failure. It is disappointing, though, because the intended audience - gamers - can hardly get hold of them.
  • Quizzical Member Legendary, Posts: 22,078
    In terms of Vega cards selling, thus far, it hasn't even mattered how good they are at graphics.  They're good at mining, so the miners buy them all.  That seems to be dying down now, but you still can't get them anywhere near MSRP, at least apart from the professional cards.

    I don't think it's a yield problem, either--or at least not GPU yields.  (There could be issues with HBM2 yields.)  If the problem were yields, we'd expect to see more cut-down versions of them.  Instead, there are only two bins, one of which is fully functional, and the other not really that far from it.  Besides, the Polaris 10 cards were in stock just fine for a number of months before the miners started grabbing them all, and to have good yields for a while, then suddenly have yield problems that last an entire year, just doesn't happen.  (Shorter-lived yield problems because the fab goofed and wrecked a batch can happen.)
  • Quizzical Member Legendary, Posts: 22,078
    https://www.hardocp.com/news/2018/06/04/dont_expect_new_geforce_gpu_for_long_time

    Nvidia's CEO just said that the next GeForce GPU launch will be "a long time from now".
  • Torval Member Legendary, Posts: 19,934
    I wanted to just click WTF but I didn't want to shoot the messenger. Seriously. WTF


  • Quizzical Member Legendary, Posts: 22,078
    Torval said:
    I wanted to just click WTF but I didn't want to shoot the messenger. Seriously. WTF
    Would Volta be better than Pascal at graphics?  Given that Volta puts more emphasis on compute while Pascal is more narrowly focused on graphics, it could easily be worse.  Transistor density makes it look like TSMC "12 nm FFN" isn't really a die shrink from "16 nm".  If they don't have improvements available from either a substantially better process node or a substantially better architecture, then what's the point of a new generation?  An expensive respin to improve clock speeds by 5%?  It's not like they're badly trailing AMD and desperately trying to catch up.
  • Ridelynn Member Epic, Posts: 7,057
    Quizzical said:
    Torval said:
    I wanted to just click WTF but I didn't want to shoot the messenger. Seriously. WTF
    Would Volta be better than Pascal at graphics?  Given that Volta puts more emphasis on compute while Pascal is more narrowly focused on graphics, it could easily be worse.

    Titan V is available today, if you want to pay for it ($3,000). So there have been comparisons of Volta versus Pascal.

    https://www.hardocp.com/article/2018/03/20/nvidia_titan_v_video_card_gaming_review

    There’s no denying that NVIDIA TITAN V smashes GeForce GTX 1080 Ti FE gaming performance at 4K and 1440p. We experienced performance differences up to 40%, most were around the 30% mark, with varying games below or above that average.

    Yeah, Volta will be a decent bump over Pascal, if we are just looking at Architecture vs Architecture and not specific card versus specific card.

  • Quizzical Member Legendary, Posts: 22,078
    So basically, it's an 815 mm^2 die on a supposedly better process node offering 30% more performance than a 471 mm^2 die.  If it takes 73% more die area to offer 30% more performance, that's not an architectural advantage.  Scale it down to the same die size (which might still cost more for Volta because of the process node) and would it still be faster?
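Putting numbers on that comparison (die sizes from the posts above: 815 mm^2 for GV100 vs 471 mm^2 for GP102, and the ~30% performance delta from the HardOCP review):

```python
# Performance per unit of die area: Titan V (GV100) vs GTX 1080 Ti (GP102).
volta_area_mm2 = 815
pascal_area_mm2 = 471
perf_ratio = 1.30                                # ~30% faster per the review

area_ratio = volta_area_mm2 / pascal_area_mm2    # ~1.73x the silicon
perf_per_area = perf_ratio / area_ratio          # ~0.75x Pascal's perf per mm^2
print(f"{(area_ratio - 1) * 100:.0f}% more area for "
      f"{(perf_ratio - 1) * 100:.0f}% more performance "
      f"-> {perf_per_area:.2f}x perf per mm^2")
```

In other words, by this rough measure Volta delivers only about three quarters of Pascal's performance per square millimeter, which is the point being made.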