Threadripper CPU is pretty awesome, except for gaming

AmazingAvery, Age of Conan Advocate, Member Uncommon, Posts: 7,188

Some quotes from review "conclusion" pages and also a list of reviews in one spot.
These quotes focus only on the gaming aspect. If you do other things, though, Threadripper should be a consideration if you can afford it, because it chews through anything multi-threaded.
Source: Ryzen Threadripper gaming performance:
For all the good stuff we saw on the CPU-centric testing, if you've read all the earlier Ryzen coverage, you can probably already guess that Threadripper doesn't really change the formula. Other than a few games now having tuned builds that run better on Ryzen, in general Intel wins the gaming tests. Of course, this is at 1080p ultra quality, using a GTX 1080 Ti—which honestly isn't that far off what I'd expect any gamer considering Threadripper or Core i9 to be running. Anyway, a slower GPU would show far less difference, and running at 1440p or 4K would also narrow the gap. But if you want maximum gaming performance, AMD still has some work to do.
Source: Despite Threadripper's design arguably being better tuned to highly threaded workstation-like workloads, the fact that it still has high clocks compared to Ryzen 7 means that gaming is going to be a big part of the equation too. In its default Creative Mode, Threadripper’s gaming performance is middling at best: very few games can use all those threads and the variable DRAM latency means that the cores are sometimes metaphorically tripping over themselves trying to talk to each other and predict when work will be done. To solve this, AMD is offering Game Mode, which cuts the number of threads and focuses memory allocations to the DRAM nearest to the core (at the expense of peak DRAM bandwidth). This has the biggest effect on minimum frame rates rather than average frame rates, and affects 1080p more than 4K, which is perhaps the opposite end of the spectrum to what a top-level enthusiast would be gaming on. In some games, Game Mode makes no difference, while in others it can open up new possibilities.
If I were to turn around and say that Threadripper CPUs were not pure gaming CPUs, it would annoy a fair lick of the tech audience. The data is there – it’s not the best gaming CPU. But AMD would spin it like this: it allows the user to game, to stream, to watch and to process all at the same time
Source: As you may have already guessed, these massive thread-happy processors represent a poor value for any gaming-focused system. I’ve said this once and I will say it again: this isn’t an AMD-exclusive issue but rather one that’s endemic to every capable yet low-clocked 8+ core processor. When compared against the likes of Ryzen 5 / 7 or Intel’s i7 series, the i9 and Threadripper CPUs suffer from a serious case of framerate envy due to their very nature. Games love low latency and high frequencies, and neither HEDT lineup really offers that combination.
Where the other shoe drops for Threadripper is that its framerates suffer more than Ryzen 7 did, to the point where we see simple Ryzen 5 processors matching or surpassing AMD’s $1000 wunderkind in several games. We’ll be testing some theories about this shortfall in the coming weeks, but it seems like AMD’s dual die design just doesn’t benefit the serial nature of many game engines.

List of different review sites to peruse.



Comments

  • azurrei, Member Uncommon, Posts: 332
    edited August 2017
    I usually have multiple MMOs launched at once with youtube / twitch going as well... I'm guessing TR would likely trounce the i9 in my case. Though I wouldn't buy a TR, my next build will have Ryzen unless Intel is actually competitive for my needs.
  • Ridelynn, Member Epic, Posts: 7,383
    The interesting part of the review - there were no big surprises. It's pretty much exactly what everyone was expecting based on how Ryzen 7 has turned out.

    That's good news for AMD, honestly.

    Yeah, it's not the best choice for gaming. Not everyone plays games with their computer. Not everyone needs 16 cores and 32 threads. And not everyone is going to go out and spend $1,000 on just the CPU.

    For that niche that TR is targeting, it does so very well. But it's very much a niche, and apart from that, the TR marketing and buzz effectively serve to just make it a halo product. It's certainly gotten Intel to dance to the tune.
  • Quizzical, Member Legendary, Posts: 25,348
    Torval said:
    One thing I wondered is how much of a pain switching between X and G modes is going to be. What if you have a game that performs better in non-game mode? If you have to manage that per app, or put your system into a mode, then I think that will sap a lot of enthusiasm for that feature. On the other hand, if it's seamless and easy, then it could take off.
    It takes a reboot.  I see "game mode" more as a way to partially salvage the product for people who buy it and shouldn't have than as a more directly desirable feature to offer.  People who genuinely need the extra cores a lot of the time would probably be more willing to just take a modest hit when gaming than to want to fuss with game mode.

    Threadripper doesn't have much of a point unless you need more than 8 CPU cores, more than two channels of DDR4 memory, or a whole lot of PCI Express lanes.  But for those who do, and would prefer not to spend a fortune, it's likely to be a compelling product.  That's not to say that it's a bad product; there's nothing wrong with a product being good for its intended uses and not for other purposes where it was never meant to be all that great.

    There are pretty sharply diminishing returns to adding more cores for a lot of purposes, including gaming.  You can argue for benefits to more than four cores in some cases.  But a game that won't run well on Ryzen 7 because 8 cores isn't enough is not going to have much of a market in the near future.
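    Those diminishing returns are just Amdahl's law in action. A quick sketch (the 60% parallel fraction below is a made-up illustrative number, not a measurement from any actual game engine):

    ```python
    # Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
    # fraction of each frame's work that can actually run in parallel.
    def speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    p = 0.6  # hypothetical parallel fraction -- purely illustrative
    for n in (2, 4, 8, 16):
        print(f"{n:2d} cores -> {speedup(p, n):.2f}x")
    ```

    With p = 0.6, going from 8 to 16 cores only lifts the theoretical speedup from about 2.11x to 2.29x (and it can never exceed 1/(1-p) = 2.5x no matter the core count), which is why doubling cores past 8 barely moves framerates.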
  • Ridelynn, Member Epic, Posts: 7,383
    edited August 2017
    Gaming mode is just a marketing thing to directly combat the "Oh but it doesn't game as well at 1080" comments. Because most people who use $1000 CPUs and $2500+ computer builds ~just~ game at 1080.

    That's probably why AMD didn't put a lot of time into making it a seamless transition - maybe 3 people in the world will actually use it and care that it takes a reboot, while the other thousands of TR customers don't want to interrupt the uptime on their server processes to play a round of Quake and won't care that it's 2FPS lower.
  • Ozmodan, Member Epic, Posts: 9,726

    AmazingAvery said: [full set of review quotes snipped; see the opening post]

    Yep, some games will play better on Intel. Try some strategy games; Threadripper will blow away anything Intel can throw at it.
  • Ridelynn, Member Epic, Posts: 7,383
    edited August 2017
    Ozmodan said:

    [quote of the opening post's review excerpts snipped]

    Yep, some games will play better on Intel. Try some strategy games; Threadripper will blow away anything Intel can throw at it.
    You would think so, but it isn't.

    https://www.hardocp.com/article/2017/08/10/amd_ryzen_threadripper_1950x_1920x_cpu_review/6

    https://arstechnica.com/gadgets/2017/08/amd-threadripper-review-1950x-1920x/

    Look at Civ6 & Ashes of the Singularity specifically - two games which typically dominate on core count.

    Looks like even the "heavily threaded" gaming engines kind of choke or hit severe diminishing returns once you get much beyond 8T. It could be that some engine optimizations turn that around, hard to say, but as it sits right now...

    TR is competitive, but it doesn't blow anything away in any of the gaming benches.
  • AmazingAvery, Age of Conan Advocate, Member Uncommon, Posts: 7,188
    Wanted to provide an update from anandtech - http://www.anandtech.com/show/11726/retesting-amd-ryzen-threadrippers-game-mode-halving-cores-for-more-performance/17

    They re-tested using Game Mode (GM) on Threadripper.

    A simpler way to imagine Game Mode is this: enabling Game Mode brings the top tier Threadripper 1950X down to the level of a Ryzen 7 processor for core count at around the same frequency, but still gets the benefits of quad channel memory and all 60 PCIe lanes for add-in cards. In this mode, the CPU will preferentially use the lower latency memory available first, attempting to ensure a better immediate experience. You end up with an uber-Ryzen 7 for connectivity.



  • Ridelynn, Member Epic, Posts: 7,383
    I did some more reading on the article that AmazingAvery links to. This is an interesting read.

    Game mode disables one of the two dies on a TR - turning it essentially into a Ryzen with more PCIe lanes and memory channels. It also shifts to NUMA memory access - meaning that the lower-latency RAM channels attached to the enabled die get used before the other channels, to help keep data from having to cross the Infinity Fabric link between dies.
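    On Linux you can actually see which memory domains the OS thinks it has. A minimal sketch (assumes the Linux sysfs layout and just returns an empty dict elsewhere; the node layout in the comment is illustrative, not measured):

    ```python
    from pathlib import Path

    def numa_nodes():
        """Map each NUMA node the kernel exposes to its CPU list."""
        base = Path("/sys/devices/system/node")
        if not base.is_dir():
            return {}  # non-Linux, or sysfs not mounted
        return {d.name: (d / "cpulist").read_text().strip()
                for d in sorted(base.glob("node[0-9]*"))}

    # In the default distributed-memory mode a Threadripper reports a single
    # node; with NUMA/Game Mode's local-memory policy each die's memory shows
    # up as its own node, e.g. {'node0': '0-15', 'node1': '16-31'}.
    print(numa_nodes())
    ```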

    Some interesting takeaways from the article:

    Anandtech misunderstood Game Mode at first, and assumed it disabled SMT (going to a 16C/16T chip - you can manually do this, but it is not what Game Mode does) rather than disabling a die (going to an 8C/16T chip - this is what Game Mode does).

    The second die isn't really disabled; its CPU core power setting is turned down to something really low so the cores never get loaded. That keeps the PCIe lanes and memory channels active, although all access to them from the active die has to cross the inter-die Infinity Fabric, for additional latency hits.

    In their testing, Anandtech finds that disabling SMT provided a bigger benefit than enabling Game Mode (although the gains were still pretty unremarkable overall).

    It is theorized that disabling SMT could essentially be done without requiring a reboot, just by setting core affinities.
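    That affinity trick can be sketched in a few lines on Linux. Big caveat: this assumes even-numbered logical CPUs map to distinct physical cores, which holds on many SMT systems but should be checked with lscpu before relying on it:

    ```python
    import os

    # Emulate "SMT off" for one process: restrict it to every other logical
    # CPU, so each physical core runs at most one of our threads.
    allowed = sorted(os.sched_getaffinity(0))   # logical CPUs we may run on
    physical_only = set(allowed[::2])           # one sibling per core (assumed layout)
    os.sched_setaffinity(0, physical_only)      # takes effect immediately, no reboot

    print(sorted(os.sched_getaffinity(0)))
    ```

    A game launcher or process manager could apply the same idea per-game, which is why no reboot would be needed, unlike the BIOS-level Game Mode switch.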
  • Ozmodan, Member Epic, Posts: 9,726
    G.Skill has come out with some amazing DDR4 kits designed specifically for the Intel X series:

    http://www.anandtech.com/show/11830/gskill-announces-16gb-ddr4-4600-15v-kit

    Kind of find it odd that it is specifically designed for the Intel chips as they do not seem to derive much benefit from faster memory while the AMD chips thrive on it.
  • Kyleran, Member Legendary, Posts: 43,498
    I consider myself a common, salt of the earth gamer.  

    I buy gaming laptops which contain Intel processors,  nVidia video cards and heck, creative labs sound cards with big SSD drives.

    You guys knock my head off with the depth of these analyses on processors and such.

    Heck, usually I just ask Quizzical what I should buy, and he always asks me, "why a gaming laptop?", and we go from there.

    ;)

    "True friends stab you in the front." | Oscar Wilde 

    "I need to finish" - Christian Wolff: The Accountant

    Just trying to live long enough to play a new, released MMORPG, playing New Worlds atm

    Fools find no pleasure in understanding but delight in airing their own opinions. Pvbs 18:2, NIV

    Don't just play games, inhabit virtual worlds™

    "This is the most intelligent, well qualified and articulate response to a post I have ever seen on these forums. It's a shame most people here won't have the attention span to read past the second line." - Anon

  • Ridelynn, Member Epic, Posts: 7,383
    Ozmodan said:
    G.Skill has come out with some amazing DDR4 kits designed specifically for the Intel X series:

    http://www.anandtech.com/show/11830/gskill-announces-16gb-ddr4-4600-15v-kit

    Kind of find it odd that it is specifically designed for the Intel chips as they do not seem to derive much benefit from faster memory while the AMD chips thrive on it.
    Intel has had "specifically designed RAM" for a long time now. It may have started with RAMBUS, but it's been around for at least the last 10 years with DDR3 and DDR4, in the form of XMP profiles.

    I am not surprised that AMD hasn't pushed their Radeon Memory, but I am surprised that they haven't at least revived their version of XMP, called AMP.