  • GPU vendors decide that ray-tracing is their next excuse to make you need a faster video card

    Microsoft, Nvidia, and AMD have now all announced new ray-tracing initiatives within a day of each other.  This is not a coincidence.


    There are a lot of ways to do 3D graphics, and in the early days it wasn't immediately obvious which approach would win out.  Rasterization basically won because it can run fast, and all but the earliest 3D GPUs are heavily optimized to do rasterization efficiently.  But there are other approaches, and ray-tracing is one of them.

    The idea of rasterization is that all of the models in a scene get broken up into a bunch of triangles.  For each triangle, you figure out which pixels on the screen it covers.  For each of those pixels, you figure out whether anything else previously drawn is in front of it, and if not, you color the pixel with an appropriate color from the model.  If you later discover a new triangle in front of a previously colored pixel, you can overwrite the previous color with the new one.  Repeat for all the rest of the triangles in all the rest of the models and you're done with the frame.
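
    To make that loop concrete, here's a rough CPU-side sketch in C++.  It's my own toy illustration, not how real GPU hardware does it; it uses one flat depth and one flat color per triangle instead of the per-pixel interpolation a real pipeline does.

        #include <algorithm>
        #include <limits>
        #include <vector>

        // Toy types for illustration only: screen-space triangles with a single
        // depth value and a single color each (real rasterizers interpolate both).
        struct Vec2     { float x, y; };
        struct Triangle { Vec2 a, b, c; float depth; float color[3]; };

        // Which side of the edge (p -> q) the point (x, y) falls on.
        static float edge(const Vec2& p, const Vec2& q, float x, float y) {
            return (q.x - p.x) * (y - p.y) - (q.y - p.y) * (x - p.x);
        }

        // color holds 3 floats per pixel, depth holds 1 float per pixel.
        void rasterize(const std::vector<Triangle>& tris,
                       std::vector<float>& color, std::vector<float>& depth,
                       int width, int height)
        {
            // Nothing drawn yet: every pixel starts out infinitely far away.
            std::fill(depth.begin(), depth.end(), std::numeric_limits<float>::infinity());

            for (const Triangle& t : tris) {                   // every triangle of every model
                for (int y = 0; y < height; ++y) {
                    for (int x = 0; x < width; ++x) {
                        float px = x + 0.5f, py = y + 0.5f;    // pixel center
                        // Covered only if the pixel center is inside all three edges
                        // (assumes a consistent winding order for all triangles).
                        if (edge(t.a, t.b, px, py) < 0 ||
                            edge(t.b, t.c, px, py) < 0 ||
                            edge(t.c, t.a, px, py) < 0) continue;
                        int i = y * width + x;
                        if (t.depth < depth[i]) {              // closer than anything drawn so far?
                            depth[i] = t.depth;                // remember the new closest depth
                            color[3*i + 0] = t.color[0];       // overwrite the old color
                            color[3*i + 1] = t.color[1];
                            color[3*i + 2] = t.color[2];
                        }
                    }
                }
            }
        }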

    The downside of rasterization is that you can't do lighting effects.  Or perhaps rather, you can do lighting effects wrongly, but can't make them accurate except in the simplest of cases.  If you look closely at shadows in a game cast upon anything other than a large, flat object, they're guaranteed to be wrong.  They darken parts of the object that the shadow is cast upon, and it can pass for a shadow if you don't look too closely, but the darkened parts won't be the correct ones.  What you can do with reflections and transparency is likewise very restricted.

    The idea of ray-tracing is that, for each pixel on the screen, you cast a ray from the camera and see what it hits.  Or maybe several rays and then average the colors if you want to do anti-aliasing.  When you do it this way, you can have a ray hit a reflective object, then cast a new ray from there to see what else it hits.  That lets you draw very accurate reflections.  You can also cast rays from whatever a ray hits toward the light sources to see if the light reaches that point without running into anything else, and get highly accurate shadows that way.  You can do transparency by having the color chosen be some mix of a semi-transparent object with whatever the ray hits after passing through it.
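
    Here's a rough sketch of that idea in C++, with a single light and a single reflection bounce.  The scene-intersection helpers (closestHit, blocked, reflect) are assumed rather than real, and a proper renderer would do actual shading instead of this all-or-nothing shadow test.

        #include <optional>

        struct Vec3 { float x, y, z; };
        struct Ray  { Vec3 origin, dir; };
        struct Hit  { Vec3 point, normal, color; float reflectivity; };

        // Hypothetical helpers, assumed to exist for this sketch:
        std::optional<Hit> closestHit(const Ray& ray);        // nearest thing the ray hits, if anything
        bool blocked(const Vec3& from, const Vec3& toLight);  // does anything sit between the point and the light?
        Vec3 reflect(const Vec3& dir, const Vec3& normal);    // mirror a direction about a surface normal

        Vec3 trace(const Ray& ray, const Vec3& lightPos, int bouncesLeft)
        {
            auto hit = closestHit(ray);
            if (!hit) return {0, 0, 0};                       // ray escaped the scene: background color

            // Shadow ray: only light the point if the path to the light is clear.
            Vec3 color = blocked(hit->point, lightPos) ? Vec3{0, 0, 0} : hit->color;

            // Reflection: cast a new ray from the hit point and mix in whatever it sees.
            // (Real code would nudge the origin off the surface to avoid self-hits.)
            if (bouncesLeft > 0 && hit->reflectivity > 0) {
                Ray bounce{hit->point, reflect(ray.dir, hit->normal)};
                Vec3 seen = trace(bounce, lightPos, bouncesLeft - 1);
                float r = hit->reflectivity;
                color = {color.x * (1 - r) + seen.x * r,
                         color.y * (1 - r) + seen.y * r,
                         color.z * (1 - r) + seen.z * r};
            }
            return color;                                     // transparency would mix colors the same way
        }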

    The downside of ray-tracing is that it's slow.  Really, really slow.  Especially on GPUs.  GPUs are built to do things in a very heavily SIMD manner, where many different threads (32 on Nvidia, 64 on AMD) do the same thing at the same time, each on their own data.  Nvidia calls the collections of threads a "warp", while AMD calls them a "wavefront".  That data could be a vertex of a model, a pixel from a triangle, or various other things, but having every thread in the group run the same instruction at the same time tremendously simplifies the scheduling.

    GPU memory controllers also rely very heavily on coalesced accesses in order to work well.  Whenever you touch GDDR5 or HBM2 memory (and probably GDDR5X, though I'm not 100% sure about that), you have to access a 128-byte chunk.  Ideally, you have 32 threads in a warp each grab 4 bytes from the same 128-byte chunk, so that the memory controllers can do all the reads at once just by pulling the 128 bytes in and distributing each thread's requested portion to it.  Or maybe 8 of the 32 threads in a warp each grab 16 bytes out of a 128-byte chunk, or several different threads grab the same memory, or whatever.  But you want a whole lot of cases of different threads grabbing data from the same 128-byte cache line at the same time.
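
    To put numbers on that, here's a toy C++ illustration (plain CPU code, nothing GPU-specific) that counts how many 128-byte chunks one warp of 32 threads touches under a coalesced access pattern versus a scattered one:

        #include <cstddef>
        #include <iostream>
        #include <set>

        // How many distinct 128-byte chunks do 32 threads touch if thread t
        // reads 4 bytes starting at byte offset t * strideBytes?
        int chunksTouched(std::size_t strideBytes)
        {
            constexpr std::size_t kWarpSize   = 32;
            constexpr std::size_t kChunkBytes = 128;
            std::set<std::size_t> chunks;
            for (std::size_t t = 0; t < kWarpSize; ++t)
                chunks.insert((t * strideBytes) / kChunkBytes);
            return static_cast<int>(chunks.size());
        }

        int main()
        {
            // Adjacent 4-byte reads: 32 * 4 = 128 bytes, all in one chunk -> one transaction.
            std::cout << "4-byte stride:   " << chunksTouched(4) << " chunk(s)\n";
            // Reads 128 bytes apart: 32 different chunks -> up to 32 transactions,
            // which is the kind of scattered access that diverging rays tend to produce.
            std::cout << "128-byte stride: " << chunksTouched(128) << " chunk(s)\n";
        }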

    Ray-tracing completely breaks this.  After reflection or transparency or hitting the edge of an object, adjacent pixels might have their rays go off in wildly different directions.  Many optimizations that GPUs have been able to do to make rasterization efficient simply don't work for ray-tracing.  That will make ray-tracing massively slower than rasterization for a given complexity of scene.

    From the perspective of the GPU vendors, that's the attraction of it.  They need to give you some reason why your current video card isn't good enough and you need to buy a new one.  They've been pushing higher resolutions, higher frame rates, and VR, but that only goes so far before it gets kind of ridiculous.

    Just don't expect ray-tracing to come to MMORPGs anytime soon.  If you thought standing around in town with 50 other players nearby was bad with rasterization, just wait until you see the number of seconds per frame that you'd get from a comparably messy scene with proper ray-tracing and all the lighting effects enabled that are the point of using ray-tracing in the first place.
  • Nvidia trying to force companies to stop selling AMD GPUs


    Basically, the claim there is that if a GPU board partner (e.g., Asus, Gigabyte, MSI) or an OEM (e.g., Dell, HP) sells GPUs from both AMD and Nvidia, then Nvidia is threatening that they won't get any Nvidia GPUs until the partners that are Nvidia-only have already been able to get all that they want.  When supplies are plentiful several months after launch, everyone will be able to get and sell all of the Nvidia GPUs that they want.  But when there's a short supply at launch, companies that also sell GPUs from AMD will get nothing from Nvidia for a while.

    Intel was forced to pay over $1 billion several years ago for illegally pressuring companies not to sell AMD CPUs.  It looks like Nvidia wants to repeat that with GPUs, even though they're in a far less dominant position in the GPU market than Intel was in CPUs.

    From a consumer perspective, this is unambiguously terrible.  I don't know if it's illegal, but it sounds like it could be.  Nvidia has proven many times in the past that they don't particularly care about making their customers or business partners hate them, and if the story is accurate, this is one more example of that.
  • OH God ... Can one of these upcoming titles go into beta already?

    If there isn't a single computer game out that you like, then I'd suggest that you just don't like computer games.  Future games won't fix that for you.

    If you've had quite a few games that you liked over the course of the last decade, but have merely played them enough that you're tired of them, then that's a different problem entirely.  If that's the case, then there are probably other games out that you'd like but just aren't aware of.  If you were to say what you liked, that might help.
  • Latest PTR Patch Reveals Increased Minimum Graphics API & General Requirements - World of Warcraft

    For those who don't know what the jargon means, you basically need a video card released in September 2009 or later, apart from Intel graphics, which would require 2012 or later. The Radeon HD 5000 series or GeForce 400 series would still be supported. Note that that is a weaker requirement than asking you to have a video card for which driver support isn't discontinued.
  • Major security flaw in ALL intel chip......

    gervaise1 said:
    The base links:


    Reading these indicates that the fundamental issue stems from how computing has developed in the last few years.  As the Spectre paper concludes, it comes down to the drive to maximise performance.

    Both papers make clear that, as all manufacturers / developers have gone in the same general direction, this is a cross-hardware, cross-operating-system issue. The fact that something they did worked on one combo and not another, in their opinion, doesn't suggest a given combo is "immune", simply that they hadn't got the "attack" right. 

    What is comforting is that this stuff is pursued by e.g. the EU and fully supported by Intel/Qualcomm/AMD/ARM/MS/Google etc. 

    And it's why people should keep their software up-to-date!  (Yes, yes, it's the Martian conspiracy.) 
    There are plenty of gradations between vulnerable and immune.  "I believe it's possible" is a long way from "I've demonstrated how to do it".  And a proof of concept is itself a long way from cyber-criminals being able to use it to steal from you.  You can't make computers 100% immune to all possible exploits, but if one system is considerably harder to attack than another, that's a big deal.