Seems it is launching on June 2, new socket and all.
Think it will be great? From what I hear it's not that much better, though I'm admittedly a bit ignorant on the details myself.
I'm hoping for maybe a price drop on Ivy Bridge this summer, since I'd like to build a new machine...
Discuss freely.
Comments
For most purposes, it should be a little better than Ivy Bridge, but not a lot. There will be some corner cases that can make extensive use of AVX2, FMA3, or TSX, in which case, you'll see far larger gains. Of course, in the sort of program that makes extensive use of FMA3, a simple FX-4300 today is likely to beat a Core i5-3570K that costs twice as much, which should give you some idea of how much of a corner case that is.
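(To make "extensive use of FMA3" concrete, here's a minimal sketch using the standard x86 intrinsics - the function name fma_dot_accumulate is just mine for illustration, but _mm256_fmadd_ps is the real FMA3 intrinsic, and you'd compile with -mavx2 -mfma. Code that's wall-to-wall with this kind of loop is exactly the corner case I mean; most programs have almost none of it.)

    #include <immintrin.h>  /* AVX2/FMA3 intrinsics; build with -mavx2 -mfma */

    /* Multiply-accumulate across two float arrays: acc[i] += a[i] * b[i].
       On a Haswell (or FX) chip each _mm256_fmadd_ps compiles to a single
       fused multiply-add instruction; Ivy Bridge has no FMA at all, so the
       compiler falls back to a separate multiply and add. */
    void fma_dot_accumulate(const float *a, const float *b, float *acc, int n)
    {
        int i;
        for (i = 0; i + 8 <= n; i += 8) {   /* eight floats per iteration */
            __m256 va = _mm256_loadu_ps(a + i);
            __m256 vb = _mm256_loadu_ps(b + i);
            __m256 vc = _mm256_loadu_ps(acc + i);
            _mm256_storeu_ps(acc + i, _mm256_fmadd_ps(va, vb, vc));
        }
        for (; i < n; ++i)   /* scalar tail for leftover elements */
            acc[i] += a[i] * b[i];
    }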
The main place where Haswell will be a big advance is if you want good battery life on a fairly capable laptop, in which case, the greatly reduced idle power consumption of Haswell will be a big deal. But that's irrelevant to desktops.
Don't count on Ivy Bridge dropping in price. Rather, Intel will leave Ivy Bridge the same price as before and discontinue it. LGA 1155 motherboards meant for Ivy Bridge have already dropped in price, however, while motherboards that support Haswell will be more expensive at first.
Frankly, I'm happy with my Phenom II 955 BE, but I think I might like to have another desktop around.
I tend to lean toward AMD, as I've had good luck with them in the past and their prices are easier to swallow. My current hang-up with them, though, is the lack of PCIe 3.0 support, but I suspect that will come with Steamroller, which I think I heard is due this summer.
I thought about building a machine around a Phenom II 965, as you can get them for under a hundo these days, but the FX chips aren't that much more, and if you're building a machine you can probably squeeze another $50 in somewhere.
There just seem to be more pro-Intel reviews and benchmarks around these days, and some not-so-happy business chatter where AMD is concerned. I had hoped a new chip would drop the Ivy Bridge price and open a few more options.
I'm still using a first-generation i7 (a 930 clocked at 4.0 GHz), so I keep peeking at new iterations but haven't found a reason to upgrade yet. I use my computer for gaming and Photoshop, with a bit of light work (Word, Excel, etc.) thrown in. I usually build with an eye to gaming, and everything else works well.
Maybe I'll finally upgrade with Haswell, but definitely not at first. I want to see some extensive real-world game benchmarks before I decide if it's worth it. Of course, sometimes I upgrade even when the benefit isn't great, just for the fun of building... yes, I am a HW geek.
“Microtransactions? In a single player role-playing game? Are you nuts?”
― CD PROJEKT RED
If Intel wanted the console business, it would have taken it. This tock release IS targeted at mobile platforms; they already own the desktop.
Anyone who buys an AMD processor for a high-end gaming rig is just plain silly. I'd love to see AMD back competing on the high end to drive Intel to a faster schedule. The pace of CPU development isn't what it was even two years ago.
And how exactly do you propose for Intel to have done that without a competitive graphics architecture?
Somewhat in Intel's defense (and mostly to play Devil's Advocate): discrete graphics as a whole is a niche market. You have gamers and some professional fields (CAD, graphic design, programs that can use OpenCL, etc.) - the vast majority of desktops are either sold with discrete cards so inferior they're just token cards, or run perfectly fine off integrated graphics.
Consoles aren't PCs, and while the contracts to provide the parts are fairly lucrative when you're nVidia/AMD, they're a rounding error to Intel.
Intel focused on integrated graphics. They have done so since 1998. They are the single largest graphics distributor - because most people just need enough to browse the web for Facebook, and most businesses just need enough to drive PowerPoint - and those tasks don't take a lot of GPU power. I can't find any current numbers, but I've never seen a report where, taking into account ~total~ PC sales (all laptops, desktops, notebooks, etc.), Intel was not the undisputed leader in providing the graphics technology - usually somewhere in the 40%+ range.
So, if you're already leading the pack in sales, and you're putting in the bare minimum investment to get there, and it takes an extreme amount of engineering, research, and development to serve the corner cases in niche markets (as Intel found out with Larrabee) - why bother?
You just need "good enough" - and that's what Intel graphics are for the vast majority of people. And apparently, that's what Intel CPU power has been as well for the past 3-4 generations, since we haven't seen that dial move much either.