
Intel announces 7* W* Ivy Bridge* dual* cores

Quizzical Member Legendary Posts: 25,351

Okay, so the asterisks on the "Ivy Bridge" and "dual cores" aren't necessary.  But they sure are necessary on the 7 W bit.

Basically, Intel has had a 17 W bin of Ivy Bridge dual core chips out for quite a while.  They're used in high priced, low performance laptops such as Ultrabooks.  Actually, they might just be used in Ultrabooks, as the market for high price, low performance devices that are far less portable than tablets isn't terribly large.  Oh, also the Razer Edge, but that just repeats my point about "not terribly large" markets.

But if a bunch of chips can keep inside of a 17 W TDP, then you probably get a decent number of them that can stay inside of 13 W at further reduced clock speeds and voltages.  And there might even be a market of people willing to pay extra for less performance so that they can get a reduced power chip that would have great CPU performance for a tablet except that it runs way too hot for a tablet and has broken video drivers.  So naturally, Intel binned out some 13 W chips.
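The binning math here follows from the usual rule of thumb that dynamic CPU power scales roughly with voltage squared times frequency. A minimal sketch, with made-up voltage and clock numbers purely for illustration, shows how a 17 W part can plausibly be rebinned down to around 13 W:

```python
# Illustrative only: dynamic power scales roughly as P ∝ C · V² · f.
# All voltage/frequency figures below are hypothetical, not Intel specs.

def scaled_power(base_power_w, base_volts, base_ghz, new_volts, new_ghz):
    """Estimate dynamic power after a voltage/frequency change."""
    return base_power_w * (new_volts / base_volts) ** 2 * (new_ghz / base_ghz)

# A hypothetical 17 W bin at 1.8 GHz / 0.95 V, rebinned to 1.5 GHz / 0.90 V:
print(round(scaled_power(17, 0.95, 1.8, 0.90, 1.5), 1))  # ≈ 12.7 W
```

This ignores static (leakage) power, which doesn't drop with frequency, so real savings are somewhat smaller than the formula suggests.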

And then called them 7 W chips.  Apparently it has an "SDP" of 7 W.  The technical definition of SDP is that whatever numbers the marketing department decides on are correct.  It's kind of like the nominal wattage on a power supply in that way.  Intel says it stands for "Scenario Design Power".

I think it's more appropriate to call it "Sometimes Design Power".  In other words, sometimes the chip will stay below 7 W at load.  So if you put it in a tablet that can only handle 7 W of heat dissipation, sometimes it won't overheat and fry.  But only sometimes.  The real TDP is 13 W, so you'd better have 13 W of cooling capability on hand.  Actually, you'd better just not put it in a tablet.

In fairness to Intel, they also have a real 10 W chip, the Pentium 2129Y, which also gets a Sometimes Design Power of 7 W.  That's a 1.1 GHz dual core with no hyperthreading and Intel HD Graphics--not HD 4000 or even HD 2500.  Oh, and the graphics are also clocked really low, and they also lack working video drivers.  But it's a budget chip, so it's $150.  The only somewhat budget-friendly Core i3 version is $250.  Anyone get the idea that Intel doesn't actually want to sell very many of them?

Give it a few months and AMD will launch Kabini dual cores with a real TDP of 9 W, probably clocked somewhere around 1.7 GHz, which would leave them competitive with--and likely faster than--the Pentium 2129Y.  Also, GCN graphics (probably 1 CU clocked around 500-600 MHz) with video drivers that actually work.  If AMD wanted to play the Sometimes Design Power game, they could claim that sometimes the chip only uses 5 W.  Or 3 W.  Or whatever number they decide on, since AMD's marketing department is definitely aware of numbers smaller than 7.  But they'll probably just call it a TDP of 9 W and leave it at that, and point to Temash for lower power consumption.

Comments

  • grndzro Member Uncommon Posts: 1,162

    Are those the ones that were actually more like 11-15 watts?  I think there was an article stating that the 7 W figure was at a paltry 800 MHz when throttled down.

    *edit* Yeah, I continued reading.  You already said that :)

    AMD's IGPs could probably come close to that if they were throttled all the way down.

  • proxy42086 Member Uncommon Posts: 30
    AMD also stated they are pulling out of the CPU race with Intel to concentrate on wireless devices like phones
  • Quizzical Member Legendary Posts: 25,351
    Originally posted by proxy42086
    AMD also stated they are pulling out of the CPU race with Intel to concentrate on wireless devices like phones

    Actually, AMD isn't making any chips targeted at phones.  Even the dual core version of Temash will be around 4 W, which is very tablet-friendly, but way too much for a phone.  I think AMD is waiting to see if Intel can get a foothold for x86 in the cell phone market.  If Intel fails (as is likely), then AMD wouldn't likely have succeeded either and will be glad they didn't waste the money on it.  If Intel succeeds in getting everything ported to x86, then AMD will probably make their own phone chips to compete.

    AMD is putting more of a focus on low power cores such as their upcoming Jaguar cores.  But they could hardly put less of a focus on them than they have traditionally, when the first such small x86 AMD cores only launched about two years ago.  AMD says that they aren't trying to compete with Intel for top end x86 CPU performance, but that's really only because they can't.  From a marketing view, it sounds much better to say, "We didn't want to do X, but we did Y instead" than "We tried to do X and failed."  AMD is trying to convince people that their CPU performance is good enough even though it's not as good as Intel's, because if you decide that AMD's CPUs are good enough, then you're probably going to buy from AMD, as they have much better graphics than Intel, and also charge less.

  • Quizzical Member Legendary Posts: 25,351
    Originally posted by Torvaldr

    What poked you about the crappy low end chips this morning?  It's good info to know though I suppose since marketing will tell us otherwise.

    So apart from the junk Tegra 4 and these tablet chips is there anything interesting and exciting on the horizon?

    CES was last week, so a lot of companies announced a lot of stuff.

    AMD's upcoming Kabini will be a nifty chip if you're in the market for a netbook, as will Temash if you want a Windows-based tablet.  Just switching from ARM Cortex A9 cores to Cortex A15 will be a huge jump in CPU performance in tablets, regardless of what graphics you pair it with, and many Android tablets will make that jump this year.  If you want a budget gaming laptop or desktop running integrated graphics, then AMD's Richland should arrive any day now and will basically be Trinity with faster integrated graphics; Kaveri should arrive around the end of the year and have faster CPU cores as well.

    If you want a super high end desktop video card, then cards based on Nvidia's upcoming GK110 chip should easily be the fastest on the market.  I expect the top bin to have an honest TDP of 300 W and cost about $800, though.

    Apart from that, we'll be waiting until 2014 for interesting new stuff to come to gaming desktops or laptops--and not necessarily early 2014.

  • Phry Member Legendary Posts: 11,004
    the whole 'they charge less' thing is the reason why all my computers have been AMD ones, at least as far as CPUs are concerned, but then my desktop gaming system is still running on an AMD Phenom I, a 2.6 GHz quad core
  • grndzro Member Uncommon Posts: 1,162
    AMD could get into the mobile ARM race very easily and slaughter Intel. Their GCN is modular and would pair up with their ARM Cortex A15 license to make an APU very quickly. I wouldn't be surprised if they already had a complete package waiting to be sent to the foundry if they saw the need to do so.
  • Quizzical Member Legendary Posts: 25,351
    Originally posted by grndzro
    AMD could get into the mobile ARM race very easily and slaughter Intel. Their GCN is modular and would pair up with their ARM Cortex A15 license to make an APU very quickly. I wouldn't be surprised if they already had a complete package waiting to be sent to the foundry if they saw the need to do so.

    A given architecture will only scale up or down so far and still be any good.  The dual core version of Temash will probably be 1 CU clocked around 200-300 MHz, and that's about as low as their GCN architecture can reasonably go.  If AMD wanted to make a 2 W chip out of Jaguar+GCN, they probably could, but it would be a terrible 2 W chip and no cell phone manufacturer would want it.

    But that's a lot lower than Nvidia's Kepler architecture will go.  Try to make a chip with 1 SMX and clock it low and you might end up pulling 10 W for the GPU alone.  You can do that just fine in a desktop or laptop, but it's a very bad idea in a tablet.

    That's actually the problem that Intel is facing in trying to make ultra-low wattage Ivy Bridge chips.  Ivy Bridge is a great architecture if you want a 70 W chip or 50 W or 30 W.  But squeeze it down to 17 W and it isn't so great anymore.  At 10 W, it's downright mediocre.  10 W will be right in Kabini's wheelhouse, but if you want a 50 W chip, Jaguar cores will be mediocre at best and likely downright awful.

    That's why companies make more than one architecture.  If you want a 50 W or 100 W chip, then AMD has Piledriver cores, which are decent.  If you want a 2 W chip, Intel offers Atom cores.  Intel actually doesn't have any good architectures for a 10 W chip, as that's higher than Atom can reasonably scale.  You can clock and volt an Atom chip higher to burn that much power, but performance trails far behind the older AMD Brazos, let alone Kabini.

    We think of ARM as low power, but they play this game, too.  There are different degrees of low power.  Cortex A15 cores are nifty if you want a 5 W chip.  If you want 1 W, you can kind of do it with Cortex A15 cores, but Cortex A7 cores would be much better.  If you want 0.1 W, ARM would point you to their Cortex A5 cores.

    The problem is that when designing a chip, there are a bunch of choices that you have to make.  How many cores, what memory standards, how wide of schedulers, which supported instructions, how much out-of-order scheduling, which process node, and so forth.  You can have choices A and B, and choice A is clearly better than choice B in a 50 W chip, and choice B is clearly better than choice A in a 5 W chip.  An architecture has to make one choice and go with it, as you can't have it both ways.
