
2013 is set to be a boring year in desktop hardware

Quizzical Member Legendary Posts: 25,348

On the processor side, Intel is set to launch Haswell, which will greatly reduce idle power consumption as compared to Ivy Bridge.  This is a huge deal in tablets, and very nice in laptops.  But it's irrelevant to desktops.  Haswell should bring a variety of small performance improvements, but not really a big deal in desktops.  And without a new process node, it won't likely bring much in the way of improved energy efficiency at load, either.

Intel is also set to launch Ivy Bridge-E.  Have you seen the prices of Sandy Bridge-E?  Enough said.

And then there is Silvermont Atom, which might be a nifty chip for tablets or netbooks.  But not desktops, as Atom is low end.

Meanwhile, AMD will launch Kaveri, the successor to Trinity.  Kaveri also brings a die shrink from 32 nm to 28 nm, and AMD is promising considerable performance improvements.  Kaveri might well be a great laptop chip.  But in desktops?  As it won't go over four cores, it has no real hope of catching a Core i5-2500K in single-threaded performance, nor an FX-6300 in highly-threaded performance.  And those aren't exactly today's high end, either.

Moving down the chain, AMD will launch Kabini, based on Jaguar cores, and basically the successor to Brazos.  But like Silvermont Atom, Kabini is low end.  It might be a nifty chip for netbooks, and the Temash variant probably will be a nifty chip for tablets.  But Kabini isn't going to matter for desktops unless your overriding goal is an ultra-small form factor, in which case, Kabini will probably get you something functional in a case smaller than a lot of books.

And then, at AMD's high end, there is the unannounced successor to Vishera.  The reason it isn't announced is that it won't launch in 2013.  Boring year, eh?

Then there are video cards.  With no new process node to move to, neither AMD nor Nvidia will be able to improve performance per watt much.  As they're already limited by power, this means neither will offer much in the way of performance increases.  Both will launch new cards in 2013, but they'll be highly derivative of this year's cards.

-----

But that's not to say that desktops are dead.  2014 brings Intel Broadwell, which is a more-than-a-full-node die shrink of Haswell.  It will also bring AMD Excavator cores, and likely a real successor to Vishera.  Hopefully AMD will be able to move to a 20 nm process node in 2014, rather than being stuck at 28 nm and two full process nodes behind Intel, though that hasn't been announced yet.

On the video card side, 20 nm process nodes should be ready in 2014.  That lets both AMD and Nvidia do a full node die shrink, which gets you about 40% more performance in the same power envelope.  Or maybe more if they decide to burn die size to save power, since transistors will be getting awfully small.
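
As a back-of-the-envelope sketch of that die-shrink arithmetic: the ~40% figure is the post's estimate, and the 250 W envelope is an illustrative assumption, not vendor data.

```python
# If a full-node shrink improves performance per watt by ~40%, then at a
# fixed power envelope, performance scales by the same ~40%.
# Both figures here are illustrative, not vendor data.

def perf_at_budget(perf_per_watt: float, power_budget_w: float) -> float:
    """Achievable performance when the card is power-limited."""
    return perf_per_watt * power_budget_w

old_ppw = 1.0     # normalized perf/W on the current 28 nm node
new_ppw = 1.4     # ~40% better perf/W after a full-node shrink to 20 nm
budget_w = 250.0  # same high-end power envelope before and after

uplift = perf_at_budget(new_ppw, budget_w) / perf_at_budget(old_ppw, budget_w)
# uplift is ~1.4: about 40% more performance in the same power envelope
```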

DDR4 memory will likely also be ready in 2014.  That lets you scale up system memory bandwidth without needing to add memory channels or go with high-voltage variants of DDR3.  That's great if you want to feed integrated graphics more effectively.
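
The bandwidth point can be sketched with the usual peak-bandwidth formula (channels x transfer rate x bus width); the DDR4-2400 speed grade below is an assumption for illustration, since final speed bins weren't settled yet.

```python
# Peak DRAM bandwidth = channels x transfers/s x bus width (bytes).
# Speed grades below are illustrative (DDR4 bins weren't final yet).

def peak_bandwidth_gbs(channels: int, mt_per_s: int, bus_bits: int = 64) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return channels * mt_per_s * 1e6 * (bus_bits // 8) / 1e9

ddr3_1600_dual = peak_bandwidth_gbs(2, 1600)  # 25.6 GB/s
ddr4_2400_dual = peak_bandwidth_gbs(2, 2400)  # 38.4 GB/s: more bandwidth,
                                              # still just two channels
ddr3_1600_quad = peak_bandwidth_gbs(4, 1600)  # 51.2 GB/s: the extra-channels
                                              # route that DDR4 helps avoid
```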

-----

The upshot is that if you're looking to get a new desktop, then there's no need to wait for hardware.  You might want to wait for Windows 8 to launch tomorrow, but there isn't a bunch of important hardware coming soon that is worth waiting for.

Unless you're interested in a tablet, that is.  Google just launched Chromebooks with Cortex A15 cores, so those should be coming to tablets shortly.  Intel Silvermont Atom and AMD Temash will vastly improve over current generation products for tablets, too.

Comments

  • Classicstar Member Uncommon Posts: 2,697


    Originally posted by Quizzical
    On the processor side, Intel is set to launch Haswell, which will greatly reduce idle power consumption as compared to Ivy Bridge.  [...]  Intel Silvermont Atom and AMD Temash will vastly improve over current generation products for tablets, too.

    I really hope you're right about desktops; I am a huge fan of desktops. But as I see it now, desktops are losing ground in favor of tablets/smartphones, and with Win8 (I am no fan of Win8; I stick to Win7) I see it all go towards tablets and phones.

    I bought a whole new system (Ivy Bridge/AMD 79xx, and very pleased with it) this year and will stick with it at least until 2014.

    Games that fully implement DX11 are still not really being made, and I'm very disappointed that I am playing a lot of DX9 games released in 2012 for PC. :(

    Unless, as a gamer, they seriously improve on games towards DX11, I'm not sure why I should upgrade in the near future.

    I hope in 2014 desktops are still alive.

    Hope to build full AMD system RYZEN/VEGA/AM4!!!

    MB:Asus V De Luxe z77
    CPU: Intel Core i7 3770K
    GPU: AMD Fury X(waiting for BIG VEGA 10 or 11 HBM2?(bit unclear now))
    MEMORY:Corsair PLAT.DDR3 1866MHZ 16GB
    PSU:Corsair AX1200i
    OS:Windows 10 64bit

  • Caldrin Member Uncommon Posts: 4,505

    Lol to the poster above.. we won't see desktops die. They will be around forever in some form or another..

     

  • jdnewell Member Uncommon Posts: 2,237
    I am happy with my i5-2500K and 7870. From the looks of it I will remain happy until mid/late 2014, which is good news for my budget.
  • gordiflu Member Uncommon Posts: 757
    Originally posted by jdnewell
    I am happy with my i5-2500K and 7870. From the looks of it I will remain happy until mid/late 2014, which is good news for my budget.

    Haha, +1. No need to upgrade in 2013 is actually good news with my present budgeting. Sigh, crisis.

     

  • Arakazi Member Uncommon Posts: 911
    I'm worried about the long-term future of desktops. Windows 8, the lack of new hardware coming out, and the focus on mobile and tablet technology are signs that companies are moving away from desktops. I love my desktop PC, but in the long term I don't know if it will be the platform of choice for gamers like myself who play MMOs and RTS. I hope I'm wrong, but perhaps the market for people like me simply isn't big enough.
  • dreamsofwar Member Posts: 468

    I can't see developers creating new things on tablets and phones, so I think the desktop will always be the thing for the hardcore developers.

    However, the industry does seem to be constantly leaning towards mobile and laptops.

  • miguksaram Member Uncommon Posts: 835

    I don't know, call me drunk, which would be an accurate statement, but I don't see developers stopping production for the one "system" that allows the flexibility of the PC just to concede to the console market.  Don't get me wrong, it's FAR easier to build for a single platform like a console than it is for a multi-specced system like a PC, but that has never stopped companies in the past.  Please point me to the evidence that such a change is in the works as a whole, outside a few companies.

     

    As a matter of fact, recent PC gaming has given rise to a whole new kind of PC gamer which only existed in a few gamers' basements in the past: independent developers!  In much the same way as YouTube or any other online streaming service has created the newest celeb, the availability of indie games on PC (via Steam or other services) has empowered PC gamers to explore gaming in a way that console gamers only wish they could (I'm looking at you, Skyrim).

     

    I'm sure in a few years, as home-brewed console programming becomes more readily available, I'll be proven wrong, but I dare say only in the way that the iPhone still owns the consumer cellphone market today.  By which I mean that people tend to blindly follow that which they know, but then again, the unknown only stays so for so long.  Mark my words, PC gaming is here to stay for a long time to come!

  • Ridelynn Member Epic Posts: 7,383

    How can you really say that power use doesn't matter in desktops?

    Power means a great deal in desktops. Remember the P4 architecture prior to Core? It wasn't until Intel took a step back, punted, and refocused their efforts on performance-per-watt, rather than just performance, that they started to make headway again. Anytime silicon gets TDP-limited (and that is quite often), power reduction is just as good as a performance increase.

    Now Haswell in and of itself may not be terribly exciting, because most of it is aimed at power reduction and the marketing emphasis is towards mobile devices, but I wouldn't go so far as to say irrelevant. We didn't say Ivy was irrelevant just because it was just a process node.

  • Quizzical Member Legendary Posts: 25,348
    Originally posted by Ridelynn

    How can you really say that power use doesn't matter in desktops?

    Power means a great deal in desktops. Remember the P4 architecture prior to Core? It wasn't until Intel took a step back, punted, and refocused their efforts on performance-per-watt, rather than just performance, that they started to make headway again. Anytime silicon gets TDP-limited (and that is quite often), power reduction is just as good as a performance increase.

    Now Haswell in and of itself may not be terribly exciting, because most of it is aimed at power reduction and the marketing emphasis is towards mobile devices, but I wouldn't go so far as to say irrelevant. We didn't say Ivy was irrelevant just because it was just a process node.

    Haswell will bring idle power consumption way down.  The difference between a processor burning 3 W versus 0.2 W at idle is a huge deal for a tablet, and nifty for a laptop.  But in a desktop, when other components are already using 100 W at idle, taking a few watts off of that doesn't particularly matter.

    For desktop processors, most of the idle power consumption savings that were available some years ago are already in place (Intel since Bloomfield, AMD since Llano, or Zambezi if you refuse to call Llano a desktop processor).  You can cut idle power consumption from 10 W to 3 W once, but you can't subsequently cut it by another 7 W to get -4 W.

    But idle power consumption is irrelevant to TDP.  For performance, load power consumption is what matters.  In a desktop, if one processor uses 65 W at load, while another gives the same performance but needs 95 W, that's a substantial advantage for the first processor.  But if one processor uses 1 W at idle and 95 W at load, and another uses 5 W at idle and 95 W at load, that difference doesn't matter.
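
    A quick sketch of the arithmetic behind that comparison, using the illustrative wattages from this post (the 2 W tablet rest-of-system figure is an assumption):

```python
# Why a CPU idle-power improvement matters in a tablet but not a desktop:
# compare the savings as a fraction of total system draw.
# All wattages are illustrative figures from the post.

def total_draw(cpu_w: float, rest_of_system_w: float) -> float:
    """Total wall draw is the CPU plus everything else."""
    return cpu_w + rest_of_system_w

# Tablet: the CPU is a large share of a tiny power budget
# (2 W assumed here for screen/memory/radios).
tablet_savings = 1 - total_draw(0.2, 2.0) / total_draw(3.0, 2.0)       # 56%

# Desktop: the rest of the system already idles around 100 W.
desktop_savings = 1 - total_draw(1.0, 100.0) / total_draw(5.0, 100.0)  # ~3.8%
```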

    The big power consumption advantage for Haswell basically consists of, if you're not using something, turn it off, and leave it off as much of the time as possible.  Bloomfield could do this at a level of shutting off individual cores, but Haswell will offer far more fine-grained power-gating, with ways to turn off various caches and so forth.  Much of that won't provide any power savings at all in load situations that restrict performance, precisely because the hardware won't be turned off when it's needed, and anything that does will basically amount to a rounding error.

    That's not to say that Haswell will be useless.  Once Haswell is out, it will probably be preferable to Ivy Bridge, just as Ivy Bridge today is preferable to Sandy Bridge.  But you don't want to wait 6 or 7 months for a mere 10% performance increase.  Or at least, I don't.

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Quizzical

    Originally posted by Ridelynn [...]

    Haswell will bring idle power consumption way down.  [...]  But you don't want to wait 6 or 7 months for a mere 10% performance increase.  Or at least, I don't.


    It's more than just idle power. Yes, it's easiest to measure using that metric, but when you have that level of control over the power usage in a chip, you can do neat things with it, like drastically raise the limit on Turbo Boost. Right now, we have "If you're not using many cores, we'll up your clock speed" - this can be expanded to "If you're not using many cores, or the FPU, or MMX2 or whatever, we'll shut off whatever you aren't using and crank the clock up". It applies not only to idling.

    Sure, if you have every circuit on every core cranking, the TDP will be about the same - but let's face it, even among synthetic benchmarks, how much software can do this? Most will stress one or two things at a time.

    It probably will be only about a 10-15% total performance increase. At stock.

    But power certainly matters in a desktop. As disappointing as Ivy Bridge was in overclocking, I am hoping that Haswell makes up for it. All these little power optimizations can go a long way. The more you're able to shut off, the less extra heat you're generating doing no useful work. The more power you can push towards doing work. The better you can boost what you're doing - either via an approved and autonomous process such as a very aggressive Turbo Boost, or as rudimentary as just manually cranking up the clock/multiplier.

    Would I wait for it over, say, Ivy Bridge? No. There's no telling when it will actually ship. That, and my current Nehalem is still perfectly adequate. Would I use that as a reason to claim that power is irrelevant - because my Nehalem is fast enough that it doesn't need to be any more power efficient? Not at all.

  • Grunty Member Epic Posts: 8,657
    Boring means lower prices on competitive products while manufacturers try to keep market share.
    "I used to think the worst thing in life was to be all alone.  It's not.  The worst thing in life is to end up with people who make you feel all alone."  Robin Williams
  • Quizzical Member Legendary Posts: 25,348

    There's a big difference between "it's theoretically possible to do this" and "Intel will ship commercial products that do this".  Yes, if you know that certain things are power-gated off, that means you have power headroom to crank the clocks upward on other things.  But we've already seen quite a bit of this in previous generations, and I'd expect Haswell's gains over the previous generation to be much less than Bloomfield, which could turn entire cores off for the first time.  The difference that shutting 3/4 of your cores off entirely makes is surely much bigger than the difference that briefly shutting down selected portions of otherwise active hardware makes.

    So let's look at how much Intel was able to crank up the clock speeds.  In your Bloomfield processor, the difference in max turbo between all cores active and only one core active was 133 MHz.  In my Lynnfield processor, Intel moved that up to 533 MHz.  Impressive, no?

    But then as Intel got better at turbo boost, they went in the other direction.  On Sandy Bridge, the difference in the max steady-state turbo is exactly 0 MHz.  There is a difference of up to 300 MHz for short periods of time.  In Ivy Bridge, again, it's 0 MHz.

    And we've seen the same thing play out with laptops, where greater power sensitivity means you want a more aggressive turbo boost.  For Clarksfield, the laptop version of Lynnfield, the difference in max turbo between one core active and four was 1.067 GHz.  That's huge.  For Sandy Bridge, it came down to 300 MHz, and Ivy Bridge left it at 300 MHz.

    AMD has scaled down the max turbo difference in successive generations, too.  Thuban and Zambezi had a max turbo core considerably higher with only one core/module active than with all of them.  But Llano, Trinity, and Vishera got away from that, just going with whatever there is the power headroom for.

    Huge improvements in idle power consumption haven't always corresponded to better load power consumption in the past, either.  Intel's huge drop in idle power consumption came with Bloomfield, which was also notable for being their biggest power hog in load power consumption (compared to what you'd expect from a given process node) since they moved away from NetBurst.  Lynnfield was able to offer comparable performance with much lower power consumption, even on exactly the same process node.

    On the AMD side, the big drop in idle power consumption for desktops came with the move from Thuban to Zambezi.  And Zambezi was also quite the power hog, with performance per watt comparable to the older Thuban product, in spite of a full node die shrink.  Vishera was a huge improvement over Zambezi in performance per watt, even on the same process node.  Llano also brought idle power consumption way down, and brought load power consumption way down at the same time.  But it came at the expense of bringing performance way down, too.

    I'm not saying that Haswell will be a power hog at load, nor that it will perform poorly in desktops.  After all, correlation does not prove causation.  But I am saying that reduced idle power consumption doesn't necessarily mean reduced load power consumption.  The correlation in past products is in the wrong direction from what one would expect if reduced idle power consumption meant reduced load power consumption.

    Neither of us knows exactly what Haswell will be just yet.  It's far enough away that even Intel might not know, though they surely have a pretty good idea by now.  So maybe Haswell will be a super awesome product for desktops.  But I'd bet against it.  A little better than Ivy Bridge?  Sure.  A lot better?  I'm skeptical.

    -----

    And I wouldn't count on Haswell being a massive overclocker, either.  I think it's a process node issue, not an architecture issue.  Process nodes can be tweaked to perform better at high voltages, clock speeds, and power consumption.  Or they can be tweaked to perform better at low voltages, clock speeds, and power consumption.  There are trade-offs, and you can't have a process node simultaneously optimized for everything.

    I think that Intel's 22 nm process node is optimized for relatively high voltages, clock speeds, and power consumption, but less so than older generation process nodes, and less so than they could have if so inclined.  Remember this graph, from when Intel announced tri-gate transistors?

    http://images.anandtech.com/reviews/cpu/intel/22nm/power.jpg

    That claims enormous improvements at 0.7 V over what they could have had with a 22 nm planar process.  It claims only small improvements from tri-gate at 1.0 V.  And the graph conveniently stops at 1.0 V and does not go any higher.  Meanwhile, desktop processors tend to run at something like 1.2 V at stock and 1.3 V with turbo, or maybe 1.4 V if you want to push a pretty big overclock.  Is Intel's tri-gate process really any better at high voltages than a planar one would have been?  Is tri-gate even as good as planar would have been at higher voltages?  Maybe, but it's not obvious.
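
    As a rough illustration of why the high-voltage end of that graph is what matters to overclockers: to first order, dynamic switching power scales as C*V^2*f, so raising voltage and clocks together compounds quickly. The voltages and clocks below are typical illustrative values, not Intel data:

```python
# First-order CMOS dynamic power model: P is proportional to C * V^2 * f.
# Normalized to an assumed stock point of 1.2 V at 3.5 GHz.

def relative_dynamic_power(volts: float, ghz: float,
                           stock_volts: float = 1.2,
                           stock_ghz: float = 3.5) -> float:
    """Dynamic power relative to the stock voltage/frequency point."""
    return (volts / stock_volts) ** 2 * (ghz / stock_ghz)

stock = relative_dynamic_power(1.2, 3.5)      # 1.0 by construction
overclock = relative_dynamic_power(1.4, 4.5)  # 1.75x: a ~29% clock bump at
                                              # 1.4 V costs ~75% more power
```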

    Now, there's a big risk in reading too much into a marketing slide that was probably made in Photoshop or Paint or some such.  It's not hard data, and someone simply drew some smooth curves.  But we've seen an actual product made on Intel's 22 nm tri-gate process node:  Ivy Bridge.  And it's a great product at low voltages in laptops, but less impressive in desktops, and not impressive for overclockers.

    Yes, this is just my speculation, and should be taken with a grain of salt.  But I'll be surprised if Haswell is a really great overclocker akin to Sandy Bridge rather than Ivy Bridge.

  • Quizzical Member Legendary Posts: 25,348
    Originally posted by Arakazi
    I'm worried about the long-term future of desktops. Windows 8, the lack of new hardware coming out, and the focus on mobile and tablet technology are signs that companies are moving away from desktops. I love my desktop PC, but in the long term I don't know if it will be the platform of choice for gamers like myself who play MMOs and RTS. I hope I'm wrong, but perhaps the market for people like me simply isn't big enough.

    One major contributing factor in tablet and phone sales being higher than they would otherwise be as compared to desktops is that the products go obsolete much sooner.  My desktop is over three years old, and still pretty nice for a desktop.  Indeed, it's probably better than most of the desktops that people will buy brand new today.

    But three years old in the tablet world is ancient, and not remotely competitive with modern products.  Ditto for phones.  Any tablet you can buy today will be woefully obsolete in a year.

    If you have to upgrade more often, then you buy more products in order to keep just one active.  That drives tablet and phone sales higher than they would otherwise have been.  And if you're selling chips for various product categories, what matters is not how many are in use, but how many people are buying now.  That's part of the reason for an emphasis on tablets.
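
    The replacement-cycle effect can be put in simple steady-state arithmetic; the installed-base and lifetime numbers below are made up purely for illustration:

```python
# In steady state, yearly unit sales = installed base / replacement interval,
# so shorter-lived devices sell disproportionately more units.
# Both inputs are hypothetical round numbers.

def yearly_sales(installed_base: float, replacement_years: float) -> float:
    """Units sold per year just to keep the installed base constant."""
    return installed_base / replacement_years

desktop_sales = yearly_sales(100e6, 5.0)  # 20M/yr on a 5-year cycle
tablet_sales = yearly_sales(100e6, 2.0)   # 50M/yr on a 2-year cycle
# Equal installed bases, but tablets generate 2.5x the yearly chip sales.
```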

    Another part is that there aren't yet any good processors for tablets.  I made another thread about this a while back.  Whoever gets the first one out will have an enormous advantage over the competition for a while, until the competition catches up.  Tablets are about to get a huge jump in either performance or power consumption (depending on which older products you compare to), and it's going to be a one-time thing when the first chips actually built from scratch with tablets in mind show up.

    -----

    There are a lot of things that are best done on a desktop.  Desktops aren't going to go away anytime soon, precisely for that reason.  Barring some radical new way to input data or some revolutionary technology that makes performance not scale with power consumption, desktops will likely be prevalent until the collapse of human civilization.

  • AxiosImmortal Member Uncommon Posts: 645

    nvm

    Looking at: The Repopulation
    Preordering: None
    Playing: Random Games

  • Ridelynn Member Epic Posts: 7,383

    I would like to point out that Turbo may have seemed to "go down", but that stock clocks went up significantly.

    The base Bloomfield i7 stock clock was 2.66GHz at 130W TDP (or, for a better comparison, the base Lynnfield 1156 at 2.8GHz at 95W TDP). The first Ivy stock i7 clock is 3.4GHz at 77W TDP.

    It's not that "Turbo" went down, it's that clocks across the board got more aggressive. My "hypothesis" (because I'm not a silicon engineer, I can't answer it authoritatively, I'm making an educated speculation) is that gains in efficiency largely made this possible.
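
    A crude way to put numbers on that efficiency point: stock GHz per watt of TDP, using the figures quoted above (this ignores IPC gains, so it understates the real generational improvement):

```python
# Stock clock per watt of TDP as a crude efficiency proxy, using the
# figures from the post. IPC improvements are ignored, so the real
# generational gain is larger than these ratios suggest.

def ghz_per_watt(stock_ghz: float, tdp_w: float) -> float:
    """Stock clock speed delivered per watt of rated TDP."""
    return stock_ghz / tdp_w

bloomfield = ghz_per_watt(2.66, 130)  # ~0.020 GHz/W
lynnfield = ghz_per_watt(2.8, 95)     # ~0.029 GHz/W
ivy_bridge = ghz_per_watt(3.4, 77)    # ~0.044 GHz/W, over 2x Bloomfield
```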
