GPU vendors decide that ray-tracing is their next excuse to make you need a faster video card

Quizzical Member Legendary Posts: 24,436
Microsoft, Nvidia, and AMD have now all announced new ray-tracing initiatives within a day of each other.  This is not a coincidence.

https://techreport.com/news/33392/directx-12-dxr-and-nvidia-rtx-bring-real-time-ray-tracing-to-3d-engines
https://techreport.com/news/33399/amd-casts-a-light-on-its-radeon-rays-real-time-ray-tracing-tools

There are a lot of ways to do 3D graphics, and in the early days, it wasn't immediately obvious which would win out.  Rasterization basically won because it can run fast, and all but the earliest of 3D GPUs are heavily optimized to do rasterization efficiently.  But there are other approaches, and ray-tracing is one of them.

The idea of rasterization is that all of the models in a scene get broken up into a bunch of triangles.  For each triangle, you figure out which pixels on the screen it covers.  For each of those pixels, you figure out whether anything else previously drawn is in front of it, and if not, you color the pixel with an appropriate color from the model.  If you later discover a new triangle in front of a previously colored pixel, you can overwrite the previous color with the new one.  Repeat for all the rest of the triangles in all the rest of the models and you're done with the frame.
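
To make that loop concrete, here's a toy depth-buffered rasterizer sketch in Python (purely illustrative: the "triangles" here are pre-flattened into the pixels they cover, which is the part a real rasterizer would compute itself from vertex positions):

```python
# Toy depth-buffered rasterization loop (illustrative sketch, not real GPU code).
# Each "triangle" is given here as the pixels it covers, with a depth and a
# color per pixel; a real rasterizer computes that coverage from the vertices.

def rasterize(width, height, triangles):
    depth = [[float("inf")] * width for _ in range(height)]   # z-buffer
    color = [[None] * width for _ in range(height)]           # framebuffer
    for tri in triangles:
        for (x, y), (z, c) in tri.items():
            if z < depth[y][x]:          # closer than anything drawn so far?
                depth[y][x] = z          # overwrite the stored depth...
                color[y][x] = c          # ...and the stored color
    return color

# Two overlapping "triangles": the red one is closer at pixel (1, 1), so it
# overwrites the blue color that was drawn there first.
far_tri  = {(0, 0): (5.0, "blue"), (1, 1): (5.0, "blue")}
near_tri = {(1, 1): (2.0, "red")}
frame = rasterize(2, 2, [far_tri, near_tri])
print(frame[1][1])  # red wins the depth test at (1, 1)
```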

The downside of rasterization is that you can't do lighting effects.  Or perhaps rather, you can do lighting effects wrongly, but can't make them accurate except in the simplest of cases.  If you look closely at shadows in a game cast upon anything other than a large, flat object, they're guaranteed to be wrong.  They can make parts of the object upon which the shadow is cast look darker, and it looks like it might be a shadow if you don't look too closely, but it won't be the correct parts of it.  What you can do with reflections and transparency is likewise very restricted.

The idea of ray-tracing is that, for each pixel on the screen, you cast a ray from the camera and see what it hits.  Or maybe several rays and then average the colors if you want to do anti-aliasing.  When you do it this way, you can have a ray hit a reflective object, then cast the ray from there to see what else it hits.  That lets you draw very accurate reflections.  You can cast rays from what one hits toward light sources to see if it makes it there without running into anything else, and get highly accurate shadows that way.  You can do transparency by having the color chosen be some mix of a semi-transparent object with whatever the ray hits after passing through it.
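
Here's a minimal sketch of that per-pixel loop in Python: one sphere floating over a ground plane, with a shadow ray cast from each floor hit toward the light. The scene and all the names are made up for illustration, and reflection bounces are omitted to keep it short:

```python
import math

# Minimal ray-tracing sketch: one sphere over a ground plane, one directional
# light. Per pixel: cast a ray, find the nearest hit, then cast a shadow ray
# toward the light to decide if the hit point is shadowed.

SPHERE_C, SPHERE_R = (0.0, 1.0, 5.0), 1.0   # sphere center and radius
LIGHT_DIR = (0.0, 1.0, 0.0)                 # light straight up, unit length

def hit_sphere(origin, d):
    # Solve |origin + t*d - C|^2 = R^2 for the nearest t > 0 (d is unit length).
    oc = [origin[i] - SPHERE_C[i] for i in range(3)]
    b = 2 * sum(oc[i] * d[i] for i in range(3))
    c = sum(x * x for x in oc) - SPHERE_R ** 2
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-6 else None

def hit_floor(origin, d):
    # Ground plane y = -1.
    if abs(d[1]) < 1e-9:
        return None
    t = (-1.0 - origin[1]) / d[1]
    return t if t > 1e-6 else None

def norm(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def shade(origin, d):
    ts = hit_sphere(origin, d)
    tf = hit_floor(origin, d)
    if ts is not None and (tf is None or ts < tf):
        return "sphere"
    if tf is not None:
        p = [origin[i] + tf * d[i] for i in range(3)]
        # Shadow ray: does anything block the path to the light?
        if hit_sphere(p, LIGHT_DIR) is not None:
            return "floor-in-shadow"
        return "floor-lit"
    return "sky"

eye = (0.0, 1.0, 0.0)
print(shade(eye, norm((0.0, 0.0, 1.0))))    # hits the sphere
print(shade(eye, norm((0.0, -1.0, 2.5))))   # floor point directly under the sphere
print(shade(eye, norm((0.0, -1.0, 0.5))))   # floor point away from the sphere
```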

The downside of ray-tracing is that it's slow.  Really, really slow.  Especially on GPUs.  GPUs are built to do things in a very heavily SIMD manner, where many different threads (32 on Nvidia, 64 on AMD) do the same thing at the same time, each on their own data.  Nvidia calls the collections of threads a "warp", while AMD calls them a "wavefront".  That data could be a vertex of a model, a pixel from a triangle, or various other things.  But it tremendously simplifies the scheduling.

GPU memory controllers also rely very heavily on coalescence in order to work well.  Whenever you touch GDDR5 or HBM2 memory (and probably GDDR5X, though I'm not 100% sure about that), you have to access a 128-byte chunk.  Ideally, you have 32 threads in a warp each grab 4 bytes from the same 128-byte chunk so that the memory controllers can do all the reads at once just by pulling the 128 bytes in and distributing each thread's requested portion to it.  Or maybe 8 of the 32 threads in a warp each grab 16 bytes out of a 128-byte chunk, or several different threads grab the same memory, or whatever.  But you want a whole lot of cases of different threads grabbing data from the same 128-byte cache line at the same time.
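
A quick back-of-envelope illustration of why that matters (toy Python, nothing vendor-specific): count how many distinct 128-byte chunks the 32 threads of a warp touch under two different access patterns. One chunk means one memory transaction; 32 chunks means 32 separate transactions for the same amount of useful data:

```python
# Coalescence back-of-envelope: which 128-byte chunks does a warp touch?

def chunks_touched(addresses):
    """Set of distinct 128-byte chunks covered by the given byte addresses."""
    return {addr // 128 for addr in addresses}

warp = range(32)

# Coalesced: thread i reads the 4-byte word at byte offset 4*i.
# All 32 reads land in one 128-byte chunk -> one transaction.
coalesced = [4 * i for i in warp]
print(len(chunks_touched(coalesced)))   # 1

# Scattered: thread i reads 4 bytes at a 512-byte stride (think divergent
# rays after a bounce). Every read lands in its own chunk -> 32 transactions.
scattered = [512 * i for i in warp]
print(len(chunks_touched(scattered)))   # 32
```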

Ray-tracing completely breaks this.  After reflection or transparency or hitting the edge of an object, adjacent pixels might have their rays go off in wildly different directions.  Many optimizations that GPUs have been able to do to make rasterization efficient simply don't work for ray-tracing.  That will make ray-tracing massively slower than rasterization for a given complexity of scene.

From the perspective of the GPU vendors, that's the attraction of it.  They need to give you some reason why your current video card isn't good enough and you need to buy a new one.  They've been pushing higher resolutions, higher frame rates, and VR, but that only goes so far before it gets kind of ridiculous.

Just don't expect ray-tracing to come to MMORPGs anytime soon.  If you thought standing around in town with 50 other players nearby was bad with rasterization, just wait until you see the number of seconds per frame that you'd get from a comparably messy scene with proper ray-tracing and all the lighting effects enabled that are the point of using ray-tracing in the first place.

Comments

  • Vrika Member Legendary Posts: 7,375
    edited March 2018
    There's some speculation that the Tensor Cores in NVidia's Volta could be used to enhance ray-tracing speed
      https://www.pcgamer.com/nvidia-talks-ray-tracing-and-volta-hardware/

    If it turns out to be true, then NVidia might be about to launch a line of very expensive ray-tracing-enabled GPUs, and a cheaper product line of GPUs not meant for ray-tracing.

    This post is only wild guesswork, but with all the money NVidia is investing in their AI/deep learning hardware, I bet they'd love any excuse to also sell it to high-end gamers.
     
  • Scot Member Legendary Posts: 18,686
    With ray tracing we get better shadows and reflections? That's it? The word gimmick springs to mind.
  • Phry Member Legendary Posts: 11,004
    If Nvidia or AMD wanted to help consumers, then one way would be to enable measures that prevented their GPUs from being used to mine e-coins etc. That way the things wouldn't be overpriced or in short supply. :/
  • jusomdude Member Rare Posts: 2,706
    I'm wondering if they released this later than they could have and are limiting its performance on purpose to sell more units. I remember 4+ years ago seeing videos of real-time GPU-based ray tracers while messing around with trying to create my own. Look up the Octane and Brigade engines. They have videos from around 4 years ago showing real-time GPU ray tracing. Although I think they were using multiple GTX Titans to do it.
  • jusomdude Member Rare Posts: 2,706
    edited March 2018
    Scot said:
    With ray tracing we get better shadows and reflections? That's it? The word gimmick springs to mind.
    I don't think you realize how much photo realism this can add to computer graphics. It doesn't just work with highly reflective surfaces but it reflects the subtle light from every surface, improving lighting quality.
  • jusomdude Member Rare Posts: 2,706
    I've seen the Ray Tracing debate being tossed around the 3D Art and programming community since back in 2003. Is it better, will it be the future, or will it all end in tears? Hell if I know.
    If it can be done at acceptable speeds with a good number of bounces I'd imagine it would win, since it closely simulates how light actually works.. or how we think it works.
  • Scot Member Legendary Posts: 18,686
    edited March 2018
    jusomdude said:
    Scot said:
    With ray tracing we get better shadows and reflections? That's it? The word gimmick springs to mind.
    I don't think you realize how much photo realism this can add to computer graphics. It doesn't just work with highly reflective surfaces but it reflects the subtle light from every surface, improving lighting quality.

    Fair enough, but do you think our top-end graphics games need more photorealism? I would be on the fence there: on the one hand it's better-quality graphics, on the other, maybe not being able to make graphics any better would turn game designers' minds to gameplay? :)
  • jusomdude Member Rare Posts: 2,706
    I think Nintendo already has that ground covered. Probably some indie developers too.
  • Vrika Member Legendary Posts: 7,375
    edited March 2018
    Scot said:
    jusomdude said:
    Scot said:
    With ray tracing we get better shadows and reflections? That's it? The word gimmick springs to mind.
    I don't think you realize how much photo realism this can add to computer graphics. It doesn't just work with highly reflective surfaces but it reflects the subtle light from every surface, improving lighting quality.

    Fair enough, but do you think our top-end graphics games need more photorealism? I would be on the fence there: on the one hand it's better-quality graphics, on the other, maybe not being able to make graphics any better would turn game designers' minds to gameplay? :)
    It's not about what we think, it's about what we buy.

    And we're definitely sucking up all those new GPUs and games with better graphics.
  • ET3D Member Uncommon Posts: 320
    Quizzical said:
    Microsoft, Nvidia, and AMD have now all announced new ray-tracing initiatives within a day of each other.  This is not a coincidence.
    Of course it's not a coincidence. Microsoft doesn't release an API without making sure there's industry support for it. I'm sure it's been working with AMD and NVIDIA (and probably also Intel) for a while.

    Ray tracing on GPU has been around for a while, but with a Microsoft API it's obviously getting a big push.
  • Quizzical Member Legendary Posts: 24,436
    Scot said:
    With ray tracing we get better shadows and reflections? That's it? The word gimmick springs to mind.
    Transparency, too.  I don't think you realize just how bad lighting effects are now with rasterization.  In a lot of games, I turn a lot of lighting effects off entirely because I think they make the game look worse than not using them at all.
  • Quizzical Member Legendary Posts: 24,436
    Vrika said:
    There's some speculation that the Tensor Cores in NVidia's Volta could be used to enhance ray-tracing speed
      https://www.pcgamer.com/nvidia-talks-ray-tracing-and-volta-hardware/

    If it turns out to be true, then NVidia might be about to launch a line of very expensive ray-tracing-enabled GPUs, and a cheaper product line of GPUs not meant for ray-tracing.

    This post is only wild guesswork, but with all the money NVidia is investing in their AI/deep learning hardware, I bet they'd love any excuse to also sell it to high-end gamers.
    I would regard that as extremely unlikely.  In my original post, I said that the performance problem with ray-tracing is that it breaks the SIMD and memory coalescence optimizations that work so well with rasterization.  Using the tensor cores isn't just SIMD, but a very, very restricted version of SIMD that hardly anything can use.

    GPUs are pretty heavily optimized for floating-point FMA operations, where fma(a, b, c) = a * b + c, as a single instruction, with all of the variables being floats.  The tensor cores can basically do that same operation, except that a, b, and c are half-precision 4x4 matrices, and with 1/8 of the throughput of doing the same thing with floats.  That's a huge win if you need massive amounts of it, as doing the matrix multiply-add naively would be 64 instructions.  Being able to do that with 1/8 of the throughput of one instruction is an 8x speed improvement.
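
The arithmetic behind that 8x figure can be checked directly. This is just the operation counting, nothing tensor-core-specific: a naive 4x4 matrix multiply-add takes 4*4*4 = 64 scalar FMAs, and one matrix instruction at 1/8 the throughput of a scalar FMA costs the equivalent of 8 scalar-FMA slots, so 64/8 = 8x:

```python
# Counting the scalar FMAs in a naive 4x4 matrix multiply-add D = A*B + C.

def naive_matrix_fma(a, b, c):
    """D = A*B + C on 4x4 matrices; also counts the scalar FMAs used."""
    n, fmas = 4, 0
    d = [row[:] for row in c]              # start from C (the "+ C" part)
    for i in range(n):
        for j in range(n):
            for k in range(n):
                d[i][j] = a[i][k] * b[k][j] + d[i][j]   # one scalar fma
                fmas += 1
    return d, fmas

identity = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
zeros = [[0.0] * 4 for _ in range(4)]
d, fmas = naive_matrix_fma(identity, identity, zeros)
print(fmas)      # 64 scalar FMAs for one 4x4 multiply-add
print(fmas / 8)  # matrix op costs ~8 scalar-FMA slots -> 8.0x speedup
```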

    The problem is that basically nothing fits that setup.  Pretty much nothing in graphics does.  Pretty much nothing in non-graphical compute does.  Nvidia thinks that machine learning will, which strikes me as plausible, though I haven't had a look at the fine details.  But I'd think of the tensor cores as being special-purpose silicon to do one dedicated thing (i.e., machine learning), kind of like the video decode block or tessellation units in a GPU, or the AES-NI instructions in a CPU.

    What's far more likely is that Nvidia is getting considerable mileage out of their beefed-up L2 cache in Maxwell and later GPUs.  So long as the scene is simple enough that most memory accesses can go to L2 cache rather than off-chip DRAM, the memory bandwidth problems wouldn't be as bad.
  • Quizzical Member Legendary Posts: 24,436
    jusomdude said:
    I'm wondering if they released this later than they could have and are limiting the performance of it on purpose to sell more units. I remember 4+ years ago seeing videos of real time GPU based ray tracers while messing around with trying to create my own. Look up octane and brigade engines. They have videos from around 4 years ago showing real time GPU ray tracing. Although I think they were using multiple GTX Titans to do it.
    It's not that hard to make a real-time ray-tracer if you're willing to accept not that complicated of scenes.  I'll bet that the demo that you're referring to didn't feature 50 people running around in town like you see in some MMORPGs.  Reducing the resolution and frame rate can also help a lot.  More subtly, it's probable that it had no anti-aliasing at all.
  • Sovrath Member Legendary Posts: 30,630
    Quizzical said:
    They've been pushing higher resolutions, higher frame rates, and VR, but that only goes so far before it gets kind of ridiculous.


    Great post up to there.

    That's very much opinion. Where should they stop? I mean, I'm sure there were people who thought things were fine years ago. Suddenly new breakthroughs and we get absolutely beautiful images/vistas/characters.

    While I'm very much capable of enjoying a game that has dated graphics or is only "so good" I'm all for them pushing the bounds of technology to bring me breathtaking worlds.

    I say "bring it".
    Like Skyrim? Need more content? Try my Skyrim mod "Godfred's Tomb." 

    Godfred's Tomb Trailer: https://youtu.be/-nsXGddj_4w


    Original Skyrim: https://www.nexusmods.com/skyrim/mods/109547

    Try the "Special Edition." 'Cause it's "Special." https://www.nexusmods.com/skyrimspecialedition/mods/64878/?tab=description

    Serph toze kindly has started a walk-through. https://youtu.be/UIelCK-lldo 
  • Quizzical Member Legendary Posts: 24,436
    Sovrath said:
    Quizzical said:
    They've been pushing higher resolutions, higher frame rates, and VR, but that only goes so far before it gets kind of ridiculous.


    Great post up to there.

    That's very much opinion. Where should they stop? I mean, I'm sure there were people who thought things were fine years ago. Suddenly new breakthroughs and we get absolutely beautiful images/vistas/characters.

    While I'm very much capable of enjoying a game that has dated graphics or is only "so good" I'm all for them pushing the bounds of technology to bring me breathtaking worlds.

    I say "bring it".
    I'm not saying that resolutions and frame rates are high enough today that nothing more will ever matter.  I'm saying that there are sharply diminishing returns at some point, and eventually, it doesn't really matter anymore.  For example, most people would agree that 60 frames per second is better than 30.  I don't think it's that hard to make a case that 120 is better than 60.  Maybe you could argue that 240 is a little better than 120.  Is 480 frames per second really better than 240?  Even if you say it is, the real-world difference between 240 and 480 frames per second is massively smaller than the difference between 30 and 40.
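
The frame-time arithmetic makes the diminishing returns concrete (trivial Python, just the math): each doubling of frame rate saves half as many milliseconds per frame as the one before it.

```python
# Milliseconds per frame saved by each frame-rate doubling.

def frame_ms(fps):
    return 1000.0 / fps

for lo, hi in [(30, 60), (60, 120), (120, 240), (240, 480)]:
    saved = frame_ms(lo) - frame_ms(hi)
    print(f"{lo:>3} -> {hi:>3} fps saves {saved:.2f} ms per frame")
```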

    There used to be a thriving, competitive market for sound cards.  Then they got good enough, and then integrated sound chips got plenty good enough for most people.  Now hardly anyone buys a discrete sound card anymore.  The GPU vendors really, really want for that to not happen to GPUs.
  • MadFrenchie Member Legendary Posts: 8,505
    I wish I understood the OP.  I read it, though.  Is there a test I can take to check my understanding? :D
  • Sovrath Member Legendary Posts: 30,630
    MadFrenchie said:
    I wish I understood the OP.  I read it, though.  Is there a test I can take to check my understanding? :D
    Here you go ...


  • Ridelynn Member Epic Posts: 7,341
    ET3D said:
    Quizzical said:
    Microsoft, Nvidia, and AMD have now all announced new ray-tracing initiatives within a day of each other.  This is not a coincidence.
    Of course it's not a coincidence. Microsoft doesn't release an API without making sure there's industry support for it. I'm sure it's been working with AMD and NVIDIA (and probably also Intel) for a while.

    Ray tracing on GPU has been around for a while, but with a Microsoft API it's obviously getting a big push.
    DX10. DX12. UWP. Games for Windows Live. Moving past just APIs used in and around gaming: Zune, Bob, RT, ME, Mobile, Vista, 8. The list goes on for miles.

    Not saying that ray tracing is bad, just saying that having MS's backing isn't necessarily a guarantee of success. It's a large company and they aren't afraid to shotgun a lot of different approaches, knowing that not all of them are going to succeed.
  • Ridelynn Member Epic Posts: 7,341
    Quizzical said:

    There used to be a thriving, competitive market for sound cards.  Then they got good enough, and then integrated sound chips got plenty good enough for most people.  Now hardly anyone buys a discrete sound card anymore.  The GPU vendors really, really want for that to not happen to GPUs.
    It is getting awfully close though. And not a moment too soon, with the Great Mining Shortage and all that. 

    For some people, the best is never enough. But for most people, good enough is good enough. I think we are almost there with 1080p/60Hz - nearly every discrete card from the last couple of generations can run nearly every title at that metric with moderate visuals, and IGP/APU is almost there (maybe it is there for the Vega APUs). For most people, yeah, they can tell the difference between the Medium and Ultra notches on visuals, but it's probably not worth the extra $500-700 it takes to get there. 

    I don't know what level it takes before we just say good enough - other than we are getting closer and closer all the time.

    (Side Note: Maybe we should stop worrying about pixel count and start worrying about PPI instead)
  • Vrika Member Legendary Posts: 7,375
    Ridelynn said:

    (Side Note: Maybe we should stop worrying about pixel count and start worrying about PPI instead)
    Pixels per inch is a meaningful measurement for hand-held devices because the length of your arm doesn't change. It's a less meaningful measurement for a monitor that you can move closer to or further away from your eyes depending on your personal preference.
  • ConstantineMerus Member Epic Posts: 3,338
    Scot said:
    With ray tracing we get better shadows and reflections? That's it? The word gimmick springs to mind.
    If you ever get to work with 3D software, mate, you'd realize it's all about shadows and reflections. 
    Constantine, The Console Poster

    • "One of the most difficult tasks men can perform, however much others may despise it, is the invention of good games and it cannot be done by men out of touch with their instinctive selves." - Carl Jung
  • Scot Member Legendary Posts: 18,686
    Quizzical said:
    Sovrath said:
    Quizzical said:
    They've been pushing higher resolutions, higher frame rates, and VR, but that only goes so far before it gets kind of ridiculous.


    Great post up to there.

    That's very much opinion. Where should they stop? I mean, I'm sure there were people who thought things were fine years ago. Suddenly new breakthroughs and we get absolutely beautiful images/vistas/characters.

    While I'm very much capable of enjoying a game that has dated graphics or is only "so good" I'm all for them pushing the bounds of technology to bring me breathtaking worlds.

    I say "bring it".
    I'm not saying that resolutions and frame rates are high enough today that nothing more will ever matter.  I'm saying that there are sharply diminishing returns at some point, and eventually, it doesn't really matter anymore.  For example, most people would agree that 60 frames per second is better than 30.  I don't think it's that hard to make a case that 120 is better than 60.  Maybe you could argue that 240 is a little better than 120.  Is 480 frames per second really better than 240?  Even if you say it is, the real-world difference between 240 and 480 frames per second is massively smaller than the difference between 30 and 40.

    There used to be a thriving, competitive market for sound cards.  Then they got good enough, and then integrated sound chips got plenty good enough for most people.  Now hardly anyone buys a discrete sound card anymore.  The GPU vendors really, really want for that to not happen to GPUs.
    The rate of growth of computing power is slowing down. There are still years of good growth ahead, but what is being done now to achieve it has been compared to "tweaking". The leaps forward we have seen in graphics are slowing down as well; once we get to photorealism, still many years off, what then for video cards?

    https://www.davincicoders.com/codingblog/2017/2/28/exponential-growth-of-computing-power

  • jusomdude Member Rare Posts: 2,706
    Quizzical said:
    jusomdude said:
    I'm wondering if they released this later than they could have and are limiting the performance of it on purpose to sell more units. I remember 4+ years ago seeing videos of real time GPU based ray tracers while messing around with trying to create my own. Look up octane and brigade engines. They have videos from around 4 years ago showing real time GPU ray tracing. Although I think they were using multiple GTX Titans to do it.
    It's not that hard to make a real-time ray-tracer if you're willing to accept not that complicated of scenes.  I'll bet that the demo that you're referring to didn't feature 50 people running around in town like you see in some MMORPGs.  Reducing the resolution and frame rate can also help a lot.  More subtly, it's probable that it had no anti-aliasing at all.
    There was one video from one of the two engines I mentioned with 50-100 animated but kind of low-res characters in a city scene randomly running around, but that video seems to have disappeared. At least I can't find it anymore. Anyway, their other videos aren't really that simple. They are moderate-sized city scenes ray traced in real time.

    I'm sure with enough interest, the API/hardware developers will come up with something for large animated scenes that will work well with their cards.

  • Vrika Member Legendary Posts: 7,375
    edited March 2018
    jusomdude said:
    Quizzical said:
    jusomdude said:
    I'm wondering if they released this later than they could have and are limiting the performance of it on purpose to sell more units. I remember 4+ years ago seeing videos of real time GPU based ray tracers while messing around with trying to create my own. Look up octane and brigade engines. They have videos from around 4 years ago showing real time GPU ray tracing. Although I think they were using multiple GTX Titans to do it.
    It's not that hard to make a real-time ray-tracer if you're willing to accept not that complicated of scenes.  I'll bet that the demo that you're referring to didn't feature 50 people running around in town like you see in some MMORPGs.  Reducing the resolution and frame rate can also help a lot.  More subtly, it's probable that it had no anti-aliasing at all.
    There was one video from one of the two engines I mentioned with 50-100 animated but kind of low rez characters in a city scene randomly running around, but that video seems to have disappeared. At least I can't find it anymore. Anyways their other videos aren't really that simple. They are moderate sized city scenes ray traced in real time.

    I'm sure with enough interest, the API/hardware developers will come up with something for large animated scenes that will work well with their cards.
    It's not about interest. The problem is that ray tracing requires so much from the hardware that it would either
     a) be so costly that no one buys it, or
     b) leave so little hardware for other graphic effects that the game looks like crap

    There's no conspiracy and the devs aren't being lazy. They just haven't managed to fit enough hardware into a cheap enough package, so they've had to compromise and leave some of the most hardware-demanding effects out.



    Here's one of those engine demos with the ray tracing you were talking about. If you take another look, it does use ray tracing, but it looks bad compared to games from the same era.


  • AnOldFart Member Rare Posts: 554
    Thank you for an enlightening post. I would personally love to see games that look extremely realistic, but I won't be pushing for a new GPU now unless I have to.