
Micron: DDR4 by the end of the year, to be used initially in servers

Quizzical Posts: 14,779 Member Uncommon

http://www.anandtech.com/show/6619/crucial-demonstrates-ddr42133-modules

At CES, Micron (aka Crucial or Lexar) showed off working DDR4 memory.  The advantages of DDR4 over DDR3 are pretty straightforward: higher clock speeds at lower voltages.  The latter means lower power consumption, and since dynamic power scales roughly with the square of voltage, the drop from 1.5 V to 1.2 V at stock voltages should mean about 36% less power consumption for DDR4 if all else were equal (which it isn't).  DDR4 will start at 2133 MHz (a real clock speed of 1066 MHz, doubled because DDR transfers data on both clock edges, and memory sellers like to quote the bigger number) and clock speeds will go up from there.
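Here's a quick sanity check on those two numbers.  The voltages and the 2133 figure come from the announcement; the square-law power scaling is only the usual first-order approximation (and DDR4 changes more than just the voltage), so treat this as a rough sketch rather than a measurement.

```python
# Rough arithmetic behind the power and clock-speed claims above.
# Assumption: dynamic power scales with voltage squared, all else equal.

DDR3_VOLTAGE = 1.5  # volts, standard DDR3
DDR4_VOLTAGE = 1.2  # volts, standard DDR4

relative_power = (DDR4_VOLTAGE / DDR3_VOLTAGE) ** 2
print(f"DDR4 dynamic power vs DDR3: {relative_power:.2f}x "
      f"(~{(1 - relative_power) * 100:.0f}% lower)")  # 0.64x, i.e. ~36% lower

# "DDR4-2133" is a transfer rate, not a clock: DDR moves data on both
# clock edges, so the actual clock is half the labeled number.
transfer_rate = 2133            # MT/s, the number on the label
real_clock = transfer_rate / 2  # about 1066 MHz actual clock
print(f"DDR4-{transfer_rate}: ~{real_clock:.0f} MHz actual clock")
```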

DDR4 has been greatly delayed, however.  For example, AMD's Vishera chips that are out now (the FX-x3xx series, e.g. the FX-8350) were supposed to be paired with DDR4 memory.  Or rather, a different chip was supposed to use Piledriver cores paired with DDR4 memory and Vishera wasn't supposed to exist, but DDR4 was nowhere near ready, so AMD canceled the planned new platform and just dropped the updated cores onto the old platform.

It's also interesting that Micron is saying that DDR4 will first be used in servers.  Power consumption is a huge deal in data centers, and that's the area most willing to pay a premium price for reduced power consumption.  But that leads me to wonder which chips will be the ones that use DDR4.  Most Intel and AMD server chips are also used in desktops and/or laptops.  But those chips usually launch for desktops and laptops at the same time as servers, if not earlier, since servers have higher reliability demands that require a longer validation process.  And memory controllers aren't interchangeable: if you make a chip with a DDR4 memory controller, you can't just plug DDR3 into it and expect it to work.

The notable exception to x86 server chips also appearing in desktops and laptops is Intel's super high end whatever-EX platforms.  But I don't think that DDR4 makes sense there, as those are for servers that need a huge amount of performance in a single server with coherent memory and so forth, and that burns a lot of power.

This is my speculation, but I think the most logical candidate is microservers, with many low-power cores.  That's the product targeted at places that are very sensitive to power consumption.  Intel's Avoton Atom-based servers would make sense.  So might the Jaguar-based Opteron chips that AMD has promised.  I don't think that the ARMv8 server chips will be ready by the end of the year, but Calxeda, Marvell, and so forth will probably have ARMv7-based servers soon.  (Same ISA as Cortex A7 and A15.)

DDR4 for desktops and laptops will get there when it gets there, and in many computers, the extra bandwidth doesn't really matter.  Consumers may wish to shy away from the higher initial prices, too.  But eventually, DDR4 will be a big deal in APUs, since if you're trying to feed integrated graphics, you need all of the memory bandwidth that you can get.  Starting at 2133 MHz and going up from there means a lot more bandwidth than DDR3 offers, and much higher-binned modules should be available even early on.
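To put a rough number on that bandwidth gap, here's a back-of-envelope peak-bandwidth comparison.  The dual-channel configuration, 64-bit channel width, and DDR3 baseline speeds are my assumptions for the sake of illustration, and real sustained bandwidth is always lower than the theoretical peak.

```python
# Theoretical peak bandwidth = transfer rate x bytes per transfer x channels.
# The DDR3 speeds below are common bins, chosen only for comparison.

def peak_bandwidth_gbs(transfer_rate_mts, channels=2, bus_width_bits=64):
    """Theoretical peak bandwidth in GB/s for a given module transfer rate."""
    bytes_per_transfer = bus_width_bits // 8
    return transfer_rate_mts * 1e6 * bytes_per_transfer * channels / 1e9

for name, rate in [("DDR3-1600", 1600), ("DDR3-1866", 1866), ("DDR4-2133", 2133)]:
    print(f"{name}: {peak_bandwidth_gbs(rate):.1f} GB/s dual-channel peak")

# DDR3-1600: 25.6 GB/s, DDR3-1866: 29.9 GB/s, DDR4-2133: 34.1 GB/s
```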

Micron says that the initial DDR4 modules will be on a 32 nm process node.  That surely means they'll have a large die size and be expensive, though they'll do a die shrink as soon as they can.  My guess is that Micron wanted a mature process node so that they wouldn't have to fight process node problems while getting the hang of DDR4 production.

Comments

  • Torval (Oregon Country) Posts: 7,205 Member Uncommon
    So does this have any real meaning for the desktop users in the next year?  How will the hardware advances affect desktop users?  Will this make integrated gpus more viable?  Will it improve laptop / portable memory through reduced power consumption or is that not a factor?
  • Quizzical Posts: 14,779 Member Uncommon
    Originally posted by Torvaldr
    So does this have any real meaning for the desktop users in the next year?  How will the hardware advances affect desktop users?  Will this make integrated gpus more viable?  Will it improve laptop / portable memory through reduced power consumption or is that not a factor?

    DDR4 is only relevant in places that need more memory bandwidth than you'd get from LPDDR2, or soon, LPDDR3.  Those are more optimized for low power consumption than DDR4, while DDR4 is more optimized for high bandwidth.  So I don't see DDR4 being used in cell phones, and while it probably will show up in some tablets, it's never going to be the universal standard there.

    DDR4 is going to be expensive on a $/GB basis at first.  While it would be great for Kaveri, expensive memory doesn't fit the budget price tag that a system with integrated graphics needs.  The first consumer chip where DDR4 really matters will probably be a still-unannounced successor to Kaveri sometime around late 2014.

    Intel Silvermont Atom will also release around the same time as DDR4, but that's a low end product that needs a low price tag, and doesn't need that much memory bandwidth, anyway.

    Intel Broadwell might use DDR4, as an expensive CPU chip can afford to require expensive memory.  They might make an ultrabook with single-channel DDR4 memory, but that only matters if you believe ultrabooks matter, which I don't.  They'll likely use a dual-channel DDR4 memory bus for Broadwell laptops, and may finally try to compete with AMD in integrated graphics performance there.

    Well, "finally" unless you believe that Intel has been trying to compete with AMD in integrated graphics performance for years, and just failing miserably.  Credible rumors that Intel isn't even going to support recent versions of OpenGL in Haswell graphics lead me to believe that Haswell graphics aren't going to be any more competitive than previous generations of Intel graphics.

    But that's all in laptops.  You don't want Intel graphics in a desktop unless you don't care about graphics performance.  Even if they do manage to make the top end Broadwell graphics competitive with AMD Kaveri, it surely won't be competitive on price or driver quality.  There are rumors that Broadwell won't even have a socketed version at all for desktops, so there's a good chance that Broadwell won't particularly matter for desktops, whether it has DDR4 or not.

    Apart from integrated graphics, DDR4 is also a way to feed many CPU cores.  It won't be ready in time for Ivy Bridge-E, but it could conceivably be used in a successor Haswell-E chip if it exists--which it might not.  I don't think there's any reason to make a successor to the Ivy Bridge-E coming later this year until you're going to use DDR4 memory.  Ultra high end products can afford expensive memory, and the quad-channel DDR3 on Sandy Bridge-E is itself rather expensive.

    If AMD makes a successor to Vishera using Steamroller cores in 2014, that will almost certainly use DDR4.  They might well want to stick 10 or 12 cores on such a chip, and even 1866 MHz DDR3 just isn't enough memory bandwidth to feed that many cores (see the rough per-core numbers at the end of this post).  Remember that the canceled chip with Piledriver cores and DDR4 memory would have had 10 cores.  That, like Zambezi and Vishera (and, for that matter, Sandy Bridge-E), will primarily be a server chip that AMD will also offer in desktops.

    While DDR4 will be expensive when it first launches, it won't stay that way forever.  Memory manufacturers realize that DDR4 is the future and DDR3 is not.  Once they ramp up DDR4 production, they'll stop moving DDR3 to new process nodes and focus on DDR4 instead.  DDR4 will eventually be cheaper than DDR3, and possibly even by the end of 2014.
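
    As a rough illustration of the bandwidth-per-core point above, here's the arithmetic for a hypothetical 10-core chip.  The channel counts and module speeds are assumptions for the comparison, not specs of any announced product, and theoretical peaks overstate what real workloads see.

    ```python
    # Back-of-envelope bandwidth per core for a hypothetical 10-core chip.
    # Channel counts and module speeds are illustrative assumptions only.

    def per_core_gbs(transfer_rate_mts, channels, cores, bus_width_bits=64):
        """Theoretical peak memory bandwidth per core, in GB/s."""
        total_gbs = transfer_rate_mts * 1e6 * (bus_width_bits // 8) * channels / 1e9
        return total_gbs / cores

    CORES = 10
    print(f"Dual-channel DDR3-1866: {per_core_gbs(1866, 2, CORES):.1f} GB/s per core")
    print(f"Dual-channel DDR4-2133: {per_core_gbs(2133, 2, CORES):.1f} GB/s per core")
    print(f"Quad-channel DDR4-2133: {per_core_gbs(2133, 4, CORES):.1f} GB/s per core")
    # Roughly 3.0, 3.4, and 6.8 GB/s per core, respectively.
    ```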
