
Intel officially launches Sky Lake Refresh Refresh Refresh

13 Comments

  • Quizzical (Member Legendary, Posts: 22,642)
    edited November 2018
    Cleffy said:
     If you are picking a CPU for a specific software, always base your judgement on how a CPU performs with that software.
    This, exactly.  What matters is not how good a CPU is at the programs that reviewers use.  What matters is how good it is at the programs you use.  Gamers don't know what games we will play years from now, but enterprise users sometimes do know that it's for some particular program.
  • Jean-Luc_Picard (Member Legendary, Posts: 8,461)
    Cleffy said:
    If I was buying a CPU for 3D rendering and video editing. I would definitely go AMD. However, if I wasn't the one buying it then the 9700K seems like a solid choice.
    IPC is dependent on workload. If you are picking a CPU for a specific software, always base your judgement on how a CPU performs with that software. Where a CPU might perform quickly in consumer applications, it might be overwhelmed with enterprise applications.
    Exactly. And since I'm using my computer for a wide range of activities, the 9700k (and the 8700k before) are perfect. Best in gaming, great for rendering, great for video encoding (but there the GPU does most of the job now), and great for compiling/programming.
    I could have gotten a 9900k by paying the difference and have the best of the best, but that would be overkill for my usage, what I have is just the right balance.
    "The ability to speak doesn't make you intelligent" - Qui-gon Jinn in Star Wars.
    After many years of reading Internet forums, there's no doubt that neither does the ability to write.
    CPU: Intel Core I7 9700k (4.90ghz) - GPU: ASUS Dual GeForce RTX 2070 SUPER EVO 8GB DDR6 - RAM: 32GB Kingston HyperX Predator DDR4 3000 - Motherboard: Gigabyte Z390 Aorus Ultra - PSU: Antec TruePower New 750W - Storage: Kingston KC1000 NVMe 960gb SSD and 2x1TB WD Velociraptor HDDs (Raid 0) - Main display: Samsung U32J590 32" 4K monitor - Second display: Philips 273v 27" monitor - VR: Pimax 8K headset - Sound: Sony STR-DH550 AV Receiver HDMI linked with the GPU and the TV, with Jamo S 426 HS 3 5.0 speakers and Pioneer S-21W subwoofer - OS: Windows 10 Pro 64 bits.


  • Torval (Member Legendary, Posts: 20,627)
    Here is the rumor source for Intel production cuts. https://digitimes.com/news/a20181115PD206.html

    Fedora - A modern, free, and open source Operating System. https://getfedora.org/

    traveller, interloper, anomaly, iteration


  • Ozmodan (Member Epic, Posts: 9,726)
    edited November 2018
    Not sure about you, but when I game, I use ALL 8 cores.  I have a bunch of peripheral programs running.  So you can take your Intel CPUs and shove them where the sun does not shine.  That is why so many of the workstations I build use AMD CPUs; those machines also run many programs at the same time.

    I also tell the people I build computers for that when AMD comes out with its 7nm CPUs, I can upgrade their systems with a new CPU without changing anything else.  Good luck trying that with an Intel processor.
  • Quizzical (Member Legendary, Posts: 22,642)
    Ozmodan said:
    Not sure about you, but when  I game, I use ALL 8 cores.  I have a bunch of peripheral programs that I have running.
    Why do you do that?  I mean, this seems like an easily fixable problem.  Most programs while running in the background will take only a trivial processing load.

    Though if you genuinely do need a ton of stuff running in the background that is a heavy CPU load, you might want to look at a Threadripper 2950X.
  • Asm0deus (Member Epic, Posts: 3,342)
    Torval said:
    Here is the rumor source for Intel production cuts. https://digitimes.com/news/a20181115PD206.html

    "Sorry, the page you are trying to open is available only for our paid subscribers."

    Brenics ~ Just to point out I do believe Chris Roberts is going down as the man who cheated backers and took down crowdfunding for gaming.





  • gervaise1 (Member Epic, Posts: 6,919)
    Quizzical said:
    gervaise1 said:

    You took Skylake - why? There was no die shrink involved with Skylake - it was a Broadwell refresh! And some people pointed this out at the time. No real gain; just another 14nm cpu; the big improvement is the motherboard. All true of course.
    Sky Lake was a new architecture, not just Broadwell on a new process node.  Sky Lake is heavily derivative of Broadwell, so it's not a huge overhaul like going from Clarkdale to Sandy Bridge.  Sky Lake improved some caches and made certain things wider.  But it is still a new architecture, and not just Broadwell on a different process node.

    One way that you can tell this is with benchmarks that normalize everything.  For example, get a Sky Lake CPU and a Broadwell CPU, disable all but one CPU core, and fix the clock speed at 3 GHz with turbo disabled.  The Sky Lake CPU will tend to perform several percent faster than the Broadwell CPU.  Meanwhile, under that test, the Sky Lake CPU will perform identically (at least up to rounding) to Kaby Lake, Coffee Lake, or the latest "Coffee Lake Refresh", at least if you can pick models with exactly the same L3 cache capacity.

    The last time Intel offered a mainstream consumer desktop CPU that improved the functionality of the cores on a per clock basis was Sky Lake.  Everything since then is just a refresh.  Incidentally, Sky Lake-X cores are a little different from Sky Lake, with a different L2 cache.

    Refreshes aren't necessarily a bad thing.  They can mean that a CPU vendor makes something a little better after a year rather than having to wait two years for an update.  But three new generations just being a refresh like this is unprecedented.  The reason for so many refreshes is that Intel's 10 nm process is severely delayed, and that's what is breaking everything for Intel right now.  The problems might not look that bad right now because AMD is also still on 12/14 nm, and on an inferior process node to Intel's 14++ nm, at that.  But once AMD has numerous products out on 7 nm, Intel is going to be in a world of hurt until they can catch up.
    We agree - your last paragraph is the key. To some extent we are talking about two different aspects: design and manufacture.

    A product, however, is the combination of its design plus its manufacture. And not only was Skylake a new architecture, it also marked the maturation of Intel's 14nm manufacturing.

    And it's the manufacturing side at 10nm that Intel is struggling with now. A new killer 10nm architecture won't change that. Leading - as you say - to three design refreshes. And on the manufacturing side ... probably four? Broadwell to Skylake was, manufacturing-wise, a refresh, but in truth we only see the design side, not how the chips are made.
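Quizzical's normalized comparison above (one core, fixed 3 GHz, turbo off) boils down to simple arithmetic once the clocks are equalized. A minimal sketch, using made-up benchmark scores since the real numbers depend on the workload:

```python
# Sketch of the "normalize everything" test described above: same clock,
# one core, turbo off -- any remaining score difference is a per-clock
# (IPC) improvement. All scores below are hypothetical.
def per_clock_speedup(score_new: float, score_old: float) -> float:
    """Fractional per-clock speedup between two CPUs benchmarked at
    identical clock speed and core count."""
    return score_new / score_old - 1.0

# Hypothetical single-core scores at a fixed 3 GHz:
broadwell = 100.0
skylake = 105.5      # several percent faster per clock (new architecture)
coffee_lake = 105.5  # identical per clock (refresh of Skylake)

print(f"Broadwell -> Skylake: {per_clock_speedup(skylake, broadwell):+.1%}")
print(f"Skylake -> Coffee Lake: {per_clock_speedup(coffee_lake, skylake):+.1%}")
```

Under these assumed scores the first comparison shows a +5.5% per-clock gain and the second shows +0.0%, which is exactly the "refresh" signature Quizzical describes.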
  • Vrika (Member Epic, Posts: 6,692)
    Quizzical said:
    Ozmodan said:
    Not sure about you, but when  I game, I use ALL 8 cores.  I have a bunch of peripheral programs that I have running.
    Why do you do that?  I mean, this seems like an easily fixable problem.  Most programs while running in the background will take only a trivial processing load.

    Though if you genuinely do need a ton of stuff running in the background that is a heavy CPU load, you might want to look at a Threadripper 2950X.
    Considering Ozmodan's post history, I think he just goes out of his way to present a situation that's unfavourable to Intel.

    That's not to say he couldn't have real reason this time, but if it's unfavourable to Intel or NVidia's RTX cards, you can be sure Ozmodan will present it on these forums.
     
  • Jean-Luc_Picard (Member Legendary, Posts: 8,461)
    Vrika said:
    Quizzical said:
    Ozmodan said:
    Not sure about you, but when  I game, I use ALL 8 cores.  I have a bunch of peripheral programs that I have running.
    Why do you do that?  I mean, this seems like an easily fixable problem.  Most programs while running in the background will take only a trivial processing load.

    Though if you genuinely do need a ton of stuff running in the background that is a heavy CPU load, you might want to look at a Threadripper 2950X.
    Considering Ozmodan's post history, I think he just goes out of his way to present a situation that's unfavourable to Intel.

    That's not to say he couldn't have real reason this time, but if it's unfavourable to Intel or NVidia's RTX cards, you can be sure Ozmodan will present it on these forums.
    I guess he's running heavy 3D rendering as well as 4K video encoding in the background while he's gaming - that guy is a pro, he does things we mortals can't understand ;)


  • Ozmodan (Member Epic, Posts: 9,726)
    Vrika said:
    Quizzical said:
    Ozmodan said:
    Not sure about you, but when  I game, I use ALL 8 cores.  I have a bunch of peripheral programs that I have running.
    Why do you do that?  I mean, this seems like an easily fixable problem.  Most programs while running in the background will take only a trivial processing load.

    Though if you genuinely do need a ton of stuff running in the background that is a heavy CPU load, you might want to look at a Threadripper 2950X.
    Considering Ozmodan's post history, I think he just goes out of his way to present a situation that's unfavourable to Intel.

    That's not to say he couldn't have real reason this time, but if it's unfavourable to Intel or NVidia's RTX cards, you can be sure Ozmodan will present it on these forums.
    No, Intel is a good buy if money is no object.  My big dispute is with people who think they need the top-end processor for gaming.  For most of us who game, it is important to watch the cost, and for most uses the CPU is secondary to the GPU.  So it becomes a practical choice to buy an AMD CPU instead of an Intel one and put the extra $100+ toward a better GPU, where the performance gain is bigger.  I just don't think you will see noticeable differences in most games.
  • Jean-Luc_Picard (Member Legendary, Posts: 8,461)
    OG_Zorvan said:
    Hell, my I7 4790k keeps up with most of the "modern" cpus and it's what, 5 "generations" old now?
    The 4790k is an awesome CPU for gamers, it will last you many more years.
    I'd still have mine if I didn't get free upgrades.


  • Ozmodan (Member Epic, Posts: 9,726)
    Interesting article here.  While it only deals with a specific retailer, the volume is somewhat indicative of the market.

    https://www.techradar.com/news/amd-is-now-selling-twice-as-many-processors-as-intel

    Most of the people I build PCs for these days don't want Intel and prefer AMD, but if you need top end single thread power, Intel is still the leader.
  • 13lake (Member Uncommon, Posts: 718)
    edited January 2019
    Seems I was right about the chiplet design, and I said it before the AdoredTV leaks. YaY ME :) *pats self on the back*

    Quizzical said:
    There's also no guarantee that third generation Ryzen will go with the chiplet design.  In fact, it probably won't, at least excluding Threadripper. 

    So much for the chiplet design not being guaranteed, and for it being Threadripper-only xD
  • Quizzical (Member Legendary, Posts: 22,642)
    RIP integrated memory controller.

    Now the question is what AMD did to solve the latency problem of needing multiple hops to reach system memory.  Or whether they solved it.  Because if the latency to reach DDR4 on this part looks like it did going through another die on Threadripper, then it's really not going to be that good of a part.

    At this point, I think that we can say that the supposed leak passed around of a bunch of 7 nm Ryzen processors being announced at CES was completely fake.
  • 13lake (Member Uncommon, Posts: 718)
    edited January 2019
    Quizzical said:
    At this point, I think that we can say that the supposed leak passed around of a bunch of 7 nm Ryzen processors being announced at CES was completely fake.
    The timing of it, yeah. I still think the actual details of the leak are correct, except the price and the exact clocks, which will be finalized once they finish binning the chips.
    Oh, and the leaks accurately predicted more than 8 cores on top of the chiplets as well.

    Also, AMD confirmed that the ES sample in the 9900K test was running up to [email protected]~75W, compared to ~125W for the 9900K, so that's 3 for 3 for the leaks atm.
  • Avanah (Member Rare, Posts: 1,551)
    As a first-time Ryzen PC builder (my two previous builds were Intel PCs), I LOVEEEEEE my 6-month-old 2700X.
    The King of Multitasking and Gaming... at the same time! But that's just me. :)

    "My Fantasy is having two men at once...

    One Cooking and One Cleaning!"

    ---------------------------

    "A good man can make you feel sexy,

    strong and able to take on the whole world...

    oh sorry...that's wine...wine does that..."





  • 13lake (Member Uncommon, Posts: 718)
    edited January 2019
    Quizzical said:
    RIP integrated memory controller.

    Now the question is what AMD did to solve the latency problem of needing multiple hops to reach system memory.  Or whether they solved it.  Because if the latency to reach DDR4 on this part looks like it did going through another die on Threadripper, then it's really not going to be that good of a part.
    This chiplet design is actually supposed to solve the latency problem; that's the main reason for its existence.
    The way I understood it: where the old design could incur a latency penalty from multiple hops across CCXes, the new chiplet design guarantees exactly one additional hop before the RAM - the hop to the I/O die - and that's it, no cross-hopping like before. So at least the first 8 cores avoid the extra hops :), I don't know what the impact will be when they add the second 8-core die.

    And there are also rumors swirling that RAM speed will not impact the Infinity Fabric as much now, and that AMD is switching to a highly clocked uncore with Ryzen 3.
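The hop argument above can be put into a toy latency model. Every number here is made up for illustration; neither Zen 2's base DRAM latency nor its per-hop cost was public at the time:

```python
# Toy model: memory latency as a base cost plus a penalty per
# die-to-die hop. All figures are hypothetical placeholders.
def memory_latency_ns(base_ns: float, hops: int, hop_penalty_ns: float) -> float:
    return base_ns + hops * hop_penalty_ns

BASE = 60.0      # hypothetical on-die path to DRAM, in ns
PENALTY = 10.0   # hypothetical cost of each die-to-die hop, in ns

# Old design (e.g. Threadripper's far cores): up to 2 hops to reach a
# memory controller on another die. Chiplet design: always exactly one
# hop, through the I/O die -- worse than zero, but uniform and bounded.
old_worst_case = memory_latency_ns(BASE, 2, PENALTY)
chiplet_always = memory_latency_ns(BASE, 1, PENALTY)
print(old_worst_case, chiplet_always)  # 80.0 70.0
```

The point is not the exact numbers but the shape: the chiplet layout trades a variable worst case for a fixed, predictable one.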
  • Quizzical (Member Legendary, Posts: 22,642)
    13lake said:
    Quizzical said:
    RIP integrated memory controller.

    Now the question is what AMD did to solve the latency problem of needing multiple hops to reach system memory.  Or whether they solved it.  Because if the latency to reach DDR4 on this part looks like it did going through another die on Threadripper, then it's really not going to be that good of a part.
    This chiplet design is actually supposed to solve the latency problem; that's the main reason for its existence.
    The way I understood it: where the old design could incur a latency penalty from multiple hops across CCXes, the new chiplet design guarantees exactly one additional hop before the RAM - the hop to the I/O die - and that's it, no cross-hopping like before. So at least the first 8 cores avoid the extra hops :), I don't know what the impact will be when they add the second 8-core die.

    And there are also rumors swirling that RAM speed will not impact the Infinity Fabric as much now, and that AMD is switching to a highly clocked uncore with Ryzen 3.
    Nonsense.  Yields and production cost are the reason for the chiplet design, not latency.  Moving from one chip to another physical chip is always going to be slower than moving the same distance within a chip.  The only questions are how much slower it is, how much more power it consumes, and how much worse it is by any other metric you can think of.

    If you want to build a 64-core server CPU with 8 DDR4 channels and 128 PCI Express 4.0 lanes on a cutting edge process node by making an enormous, monolithic die, then your yields are going to be terrible.  Break that into a bunch of small chips and when something is defective, you can throw that one chiplet into the garbage without having to throw away the entire, enormous die.

    Hopefully AMD has found a way to make it so that the chiplet approach adds not very much latency and not very much power.  Adding 5 ns to your global memory latency is not the same problem that adding 50 ns would be.  If they're going chiplets everywhere, then optimizing for this was surely a major focus of the design.  But there's only so much you can do before physics stops you, and I'm not really sure what is possible here.
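The yield argument is usually captured with a simple exponential (Poisson) defect model: the fraction of defect-free dies falls off exponentially with die area. The defect density and die sizes below are hypothetical, chosen only to show the shape of the curve, not real 7 nm figures:

```python
import math

# Poisson yield model: yield = exp(-area * defect_density).
# Both parameters are illustrative, not actual foundry data.
def yield_rate(area_mm2: float, defects_per_mm2: float) -> float:
    return math.exp(-area_mm2 * defects_per_mm2)

D = 0.002  # hypothetical defects per mm^2

monolithic = yield_rate(700.0, D)       # one enormous monolithic die
chiplet = yield_rate(700.0 / 8, D)      # one of eight small chiplets

print(f"700 mm^2 monolithic die: {monolithic:.1%} defect-free")
print(f"87.5 mm^2 chiplet:       {chiplet:.1%} defect-free")
```

Under these assumed numbers the monolithic die comes out roughly 25% good while each chiplet is roughly 84% good, and a defective chiplet wastes only 1/8 of the silicon. That is the economic case Quizzical is making, independent of the latency cost.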
  • Slapshot1188 (Member Legendary, Posts: 12,539)
    OG_Zorvan said:
    Hell, my I7 4790k keeps up with most of the "modern" cpus and it's what, 5 "generations" old now?
    I'm rocking an overclocked 3570k :)
    Paired with a 1060
    Can play all of today's games... maybe not with all the bells and whistles but everything runs fine... 

    "I should point out that no other company has shipped out a beta on a disc before this." - Official Mortal Online Lead Community Moderator

    Starvault's reponse to criticism related to having a handful of players as the official "test" team for a supposed MMO: "We've just have another 10ish folk kind enough to voulenteer added tot the test team" (SIC) This explains much about the state of the game :-)

    Proudly wearing the Harbinger badge since Dec 23, 2017. 

    Coined the phrase "Role-Playing a Development Team" January 2018

    "Oddly Slap is the main reason I stay in these forums." - Mystichaze April 9th 2018

    My ignore list finally has one occupant after 12 years. I am the strongest supporter of free speech on here, but free speech does not mean forced listening. Have fun my friend. Hope you find a new stalking target.

  • Torval (Member Legendary, Posts: 20,627)
    OG_Zorvan said:
    Hell, my I7 4790k keeps up with most of the "modern" cpus and it's what, 5 "generations" old now?
    I'm rocking an overclocked 3570k :)
    Paired with a 1060
    Can play all of today's games... maybe not with all the bells and whistles but everything runs fine... 
    the rig: i7/4790, GTX970, 16GB RAM. I have an ASROCK Z97 but my CPU isn't unlocked.

    My biggest annoyance is the memory limit of the 970. Everything else feels pretty even. I want to upgrade my graphics card, but I don't need to upgrade anything at all. My system is also really stable and I like that. When upgrading that matters as much to me as the performance bump does. So far I have no reason to upgrade.

    I'm pretty sure my next base system will be AMD because I'm at the place where locking CPU clocking behind SKUs pisses me right the hell off.

    The hardware landscape is becoming weird to me because I'm also watching what Intel tries to offer for a graphics card later this year or maybe next. Intel graphics traditionally work well on Linux and BSD, and a more powerful dedicated desktop graphics card with open drivers could be huge. On the other hand, the card offering could be a joke, the drivers could be closed, and it might only support Intel boards. Who knows.


  • Ridelynn (Member Epic, Posts: 7,143)
    Torval said:
    OG_Zorvan said:
    Hell, my I7 4790k keeps up with most of the "modern" cpus and it's what, 5 "generations" old now?
    I'm rocking an overclocked 3570k :)
    Paired with a 1060
    Can play all of today's games... maybe not with all the bells and whistles but everything runs fine... 
    the rig: i7/4790, GTX970, 16GB RAM. I have an ASROCK Z97 but my CPU isn't unlocked.

    My biggest annoyance is the memory limit of the 970. Everything else feels pretty even. I want to upgrade my graphics card, but I don't need to upgrade anything at all. My system is also really stable and I like that. When upgrading that matters as much to me as the performance bump does. So far I have no reason to upgrade.

    I'm pretty sure my next base system will be AMD because I'm at the place where locking CPU clocking behind SKUs pisses me right the hell off.

    The hardware landscape is becoming weird to me because I'm also watching what Intel tries to offer for a graphics card later this year or maybe next. Intel graphics traditionally work well on Linux and BSD, and a more powerful dedicated desktop graphics card with open drivers could be huge. On the other hand, the card offering could be a joke, the drivers could be closed, and it might only support Intel boards. Who knows.
    Almost the same situation here - 4790k but not heavily overclocked, 980, 32G RAM.

    Everything I play runs well - I have more issues with games supporting interface scaling at 4K than I do with FPS at 4K. The computer I have is overkill for 90% of what I do with it. For the other 10%, it's still good enough to not need an upgrade.

    I'd like to upgrade, but I don't need to upgrade. I'm thinking my next big purchase will be some nice monitors, and then I'll build a new rig around them. Threadripper looks like it would be a really fun watercooled build, but I don't want to throw that kind of money at a PC again.
  • Ozmodan (Member Epic, Posts: 9,726)
    I built a really nice 2700X system, but my son needed it more, so I am back to my old 4670K system running at 4GHz with a 960, and most games still run fine.  I only have 8GB of DDR3 memory, but since I still intend to build another AMD system, I won't waste money adding to that.  I think I will wait until AMD's 3000 series debuts to build my next one.  I hope to avoid Intel and Nvidia in my next build if possible.

  • Ozmodan (Member Epic, Posts: 9,726)
    This is the reason I am waiting for the Zen 2, 7nm chips from AMD:

    https://www.hardocp.com/news/2019/01/11/amd_cinebench_benchmark_demo_at_ces_2019_buries_current_intel_lineup/

    Read it and weep, you Intel people.
  • Ozmodan (Member Epic, Posts: 9,726)
    DMKano said:
    Ozmodan said:
    This is the reason I am waiting for the Zen 2, 7nm chips from AMD:

    https://www.hardocp.com/news/2019/01/11/amd_cinebench_benchmark_demo_at_ces_2019_buries_current_intel_lineup/

    Read it and weep, you Intel people.

    I fall into "buy the best gaming CPU" - don't care if it's Intel/AMD.

    There are definitely people who stick to one brand.

    As far as "Intel" people go - I guess there are many who stick to Intel because that's all they've ever used - there are simply way more people like that than AMD people who only use AMD CPUs, especially in the desktop market.

    Intel's desktop market share is over 85% - AMD has their work cut out for them in that segment.
    Yep, Intel has lots of contracts that tie PC makers to its hardware.  I was looking at laptops the other day and could only find two with the AMD 2400G chip.  Most of the others had Intel CPUs with HD 630 graphics, which sucks at about anything that requires graphics.
  • Quizzical (Member Legendary, Posts: 22,642)
    Ozmodan said:
    DMKano said:
    Ozmodan said:
    This is the reason I am waiting for the Zen 2, 7nm chips from AMD:

    https://www.hardocp.com/news/2019/01/11/amd_cinebench_benchmark_demo_at_ces_2019_buries_current_intel_lineup/

    Read it and weep, you Intel people.

    I fall into "buy the best gaming CPU" - don't care if it's Intel/AMD.

    There are definitely people who stick to one brand.

    As far as "Intel" people go - I guess there are many who stick to Intel because that's all they've ever used - there are simply way more people like that than AMD people who only use AMD CPUs, especially in the desktop market.

    Intel's desktop market share is over 85% - AMD has their work cut out for them in that segment.
    Yep, Intel has lots of contracts that tie PC makers to its hardware.  I was looking at laptops the other day and could only find two with the AMD 2400G chip.  Most of the others had Intel CPUs with HD 630 graphics, which sucks at about anything that requires graphics.
    AMD has two big problems in laptops:

    1)  Their chips use more power at idle than Intel's, which hurts battery life.  That makes them a non-starter for a lot of purposes, and at minimum a big problem if you weren't going to use the integrated GPU heavily.
    2)  AMD doesn't release drivers for their laptop GPUs.  That's a showstopper if you were hoping to make heavy use out of that integrated GPU.

    AMD has promised to fix the latter problem sometime in Q1 of this year.  It's not clear when they'll address the former.  Neither problem is relevant to desktops, however.
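The battery-life cost of higher idle power is plain arithmetic: watt-hours divided by average draw. The wattages below are hypothetical placeholders, not measured figures for either vendor; only the shape of the comparison is the point:

```python
# Back-of-the-envelope battery life. A seemingly small gap in idle
# platform power compounds into hours of runtime on a fixed battery.
# All wattages are hypothetical.
def battery_life_h(battery_wh: float, avg_draw_w: float) -> float:
    return battery_wh / avg_draw_w

BATTERY = 50.0  # Wh, typical for a thin-and-light laptop

print(f"{battery_life_h(BATTERY, 5.0):.1f} h")  # 5 W idle platform -> 10.0 h
print(f"{battery_life_h(BATTERY, 7.0):.1f} h")  # 7 W idle platform -> 7.1 h
```

Two extra watts at idle costs nearly three hours here, which is why idle power alone can make a chip a non-starter for laptops.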