New Rig - Suggestions Wanted

emistz Member Posts: 54

I'm putting together a new setup and I could use some suggestions.  This is what I have in mind:

  1. 2 x 2TB WD Red WD20EFRX
  2. Corsair Vengeance 64GB (8x8GB) DDR3 1600 MHz (PC3 12800) Desktop Memory (CMZ64GX3M8A1600C9)
  3. KingWin Lazer Power Supply 1000 Watts with Universal Modular Connectors/3 Way LED Switch/80 Plus Bronze ATX 1000 Power Supply LZ-1000
  4. Creative Sound Blaster Recon3D THX PCIE Sound Card SB1350
  5. NZXT Aperture M Internal 5.25-Inch Mesh Card Reader with 2x USB 3.0 (8c-aper000-w0b)
  6. Gigabyte GeForce GTX 680 OC 4GB GDDR5 DVI-I/DVI-D/HDMI/Displayport PCI-Express 3.0 SLI Ready Graphics Card GV-N680OC-4GD
  7. NZXT Crafted Series ATX Full Tower Steel Chassis - Phantom White with Red Trim Computer Case CS-NT-PHAN-WNR White/Red
  8. Gigabyte LGA 2011 DDR3 2133 Intel X79 SATA 6Gb/s USB 3.0 ATX Motherboard GA-X79-UP4
  9. Intel Core i7-3930K Hexa-Core Processor 3.2 Ghz 12 MB Cache LGA 2011 - BX80619I73930K
  10. 3 x ViewSonic VA2451M-LED 24-Inch Screen LED-Lit Monitor
  11. 1 x Samsung MD32B 32 LED Monitor 16:9 8ms 1920x1080 5000:1 DVI/HDMI/VGA/USB Speaker
Primarily I'll use it for programming, gaming, video editing, and running multiple VMs with automating software.
 
I'm open to suggestions: what do you think I should add, remove, or change, and why?


Comments

  • Quizzical Member Legendary Posts: 25,355

    64 GB of system memory is completely ridiculous for gaming purposes.  It's probably completely ridiculous for the rest, too, but if you're going to run enough VMs with your automating software, it might make sense.  It would be a lot cheaper to get 32 GB now (as four 8 GB modules), and if you later discover that you actually need more, you can add another 8 GB later, probably for a lot less than it would cost now.  And even 32 GB is completely ridiculous for most consumer uses.

    You're completely missing an SSD.  You don't want to run real programs off of a hard drive, as that's slow.  You seem to have a big budget, which should make it readily possible to get a good SSD, like this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16820249025

    I would go higher end on the power supply with something like this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16817151118

    http://www.newegg.com/Product/Product.aspx?Item=N82E16817121094

    You don't need ridiculous wattage for what you're doing, as you'll probably never pull 400 W from the power supply.

    If you're going to use four monitors, then you probably want to go AMD on the video card.  Nvidia Kepler cards can do 3 monitors fine, but can be finicky about what they'll do if you want four.  AMD GPU chips can do six monitors, and while many particular cards can only do four or five, what they can do with four monitors is a lot more versatile than what Nvidia cards can do.

    This will work:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814125413

    Or if you're willing to pay more for Gigabyte to clock it higher, there's this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814125439

    Either of those comes with four monitor ports, though you'll probably have to buy Mini DisplayPort to something else adapters.  Once you want more than two monitors, you're likely looking at adapters anyway.  This card is specifically built to do six monitors at once, and with normal DisplayPort for four of them rather than Mini DisplayPort, which may eliminate the need for adapters depending on what monitors you're using:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814121560

    A stock Radeon HD 7970 tends to perform in the same ballpark as a GeForce GTX 680 anyway, and the 7970 GHz Edition is faster.

  • jdnewell Member Uncommon Posts: 2,237

    With that kind of budget you need an SSD for sure. If it were me, I would probably drop down to 32 GB of RAM and add a nicely sized SSD for your OS and programs.

    Get the 7970 for a multi-monitor setup; they are designed with that in mind. And I would also go with a better PSU, even dropping down in wattage for a better-rated unit, like a 750-800 W Gold-rated one.

  • emistz Member Posts: 54

    Thanks for the feedback guys.

    I'm going to look at dropping down to a 750 W or 800 W PSU with a better rating, since I think the GPU will only pull about 500 W at max load, and with the other components it should not come close to 800 W.

    About SSD drives, the problem is that it will screw up my RAID 1 setup.  The way I envision it now is 2 x 2 TB HDs set up in RAID 1 for redundancy.  If I get an SSD, say 250 GB or whatever, then I'm stuck getting a 250 GB drive to mirror it and then buying another 2 drives to be able to set up the storage portion of the rig in RAID 1.  I don't know if that makes sense, but the way I'm seeing it, it looks like I'd have to invest in 4 drives for the rig.

    One thing about the video card: I'm a Linux fan and I intend to run a dual-boot setup with occasional Linux gaming.  So I'm looking for a card that has decent Linux support, which is why I stick to Nvidia.  I'm aware that with current Nvidia setups you can only have 3 monitors acting as a single surface (if set up that way), and the 4th monitor is only an auxiliary monitor or whatever.  But that's fine; that's why one of the monitors is 32 inches and the other 3 are 27: I intend to have the three of them set up next to each other and the bigger one hanging on the wall above them.

    Anyway, I lost my train of thought there.  What has your experience been like with running Linux games on AMD chipsets?


  • Quizzical Member Legendary Posts: 25,355

    A Radeon HD 7970 has a PowerTune cap of 250 W, meaning that if the card would have used more than 250 W, it will figure that out and throttle back clock speeds within a fraction of a second.  That rarely happens in real games (as opposed to artificial stress tests), but it does mean that you can assume that the card won't use more than 250 W.  A GeForce GTX 680 will tend to use even less power.  The CPU you're looking at has a TDP of 130 W.  So at stock speeds, you'll probably never pull 400 W from the power supply.  The only real reasons to go over about 650 W on the power supply are if you're looking at an SLI/CrossFire rig, extreme overclocking (e.g., liquid nitrogen), or you happen to see a higher wattage power supply that day that is actually cheaper than an equivalent lower wattage version.
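    For a rough sanity check on those numbers, here's a quick back-of-the-envelope power budget. The 250 W and 130 W figures are the caps mentioned above; the allowances for the rest of the system are my own ballpark assumptions:

        # Rough worst-case power budget; everything except the 250 W PowerTune cap
        # and the 130 W CPU TDP is an assumed ballpark figure.
        components_w = {
            "GPU (Radeon HD 7970, PowerTune cap)": 250,
            "CPU (Core i7-3930K, TDP)": 130,
            "Motherboard + RAM (assumed)": 40,
            "Drives, fans, sound card, USB (assumed)": 30,
        }
        total = sum(components_w.values())
        print(f"Absolute worst case at the DC side: ~{total} W")
        # ~450 W if everything peaks at once, which real games essentially never do,
        # so typical draw sits well under 400 W.

    That's why a quality 550-650 W unit still leaves plenty of headroom for a single-GPU build like this.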

    Anecdotally, Nvidia's Linux video drivers used to be better than ATI's.  I'm not sure if AMD has now caught up or not.  Phoronix periodically reviews an AMD video card in Linux and it always seems to work just fine.  I'd be shocked if AMD's video drivers handle GLSL differently depending on what OS you're using, so that side of things should be fine on drivers.  I'd think that a large chunk of OpenGL would be done about the same way regardless of OS, too, but I don't really know how that works at a low level, and the "create a window and display stuff on it" side of things could easily be very different from one OS to the next.

    If you're planning on playing DirectX games using Wine or some such, then that's hit and miss no matter what you're using, and I'm not sure how many of the "misses" are due to video driver problems as opposed to Wine being unable to properly convert a DirectX command to an OpenGL equivalent.  While DirectX and OpenGL do about the same things (and OpenGL has largely caught up in functionality after trailing far behind for several years), there isn't a perfect one-to-one correspondence between commands in one and commands in the other, so to me, the surprising thing about Wine is that it can be made to work at all.

    AMD's multi-monitor support in Windows is much better than Nvidia's, simply because AMD's hardware is capable of doing a lot more things with multiple monitors well.  I don't know how that plays out in Linux, but if something is dicey in Windows, I wouldn't expect it to work better in Linux.

    Is there some reason why you couldn't have both a lone SSD for the OS and main programs (especially games) and also a RAID 1 setup for hard drives?  You could put all of the important data on the hard drives, and perhaps do an image backup of the SSD periodically so that you wouldn't lose much if the SSD failed.  If you're so concerned about reliability that you're merely unwilling to have anything without redundancy, then I hope you have an uninterruptible power supply.

  • emistz Member Posts: 54

    I'll ask around on the Linux boards and see how the support is these days.  TBH I haven't looked in about 5 years, but back then Nvidia was ahead of the game when it came to Linux support, with some ATI cards not even supporting Linux at all, so hardware acceleration didn't work, etc.

    I'll take a look at some AMD cards depending on what I hear the Linux support is like.  I'm not really concerned about running stuff in Wine, mainly about Linux driver support for the cards so I can run native Linux games.

    About drives though, I'm not sure I want to keep manually backing things up just for the minor added benefit of faster lookups that an SSD brings to the table.  Redundant PSUs aren't really something I'm looking at, mainly because I'm not concerned with uptime (it's not a server); I'm just concerned with data redundancy.


  • Quizzical Member Legendary Posts: 25,355
    Originally posted by emistz

    I'll ask around on the Linux boards and see how the support is these days.  TBH I haven't looked in about 5 years, but back then Nvidia was ahead of the game when it came to Linux support, with some ATI cards not even supporting Linux at all, so hardware acceleration didn't work, etc.

    I'll take a look at some AMD cards depending on what I hear the Linux support is like.  I'm not really concerned about running stuff in Wine, mainly about Linux driver support for the cards so I can run native Linux games.

    About drives though, I'm not sure I want to keep manually backing things up just for the minor added benefit of faster lookups that an SSD brings to the table.  Redundant PSUs aren't really something I'm looking at, mainly because I'm not concerned with uptime (it's not a server); I'm just concerned with data redundancy.

    http://support.amd.com/us/gpudownload/linux/Pages/radeon_linux.aspx?type=2.4.1&product=2.4.1.3.42&lang=English

    That's the link to AMD's Linux drivers.  Laptop video card driver support can sometimes be finicky because laptop vendors (e.g., Toshiba or Sony, not AMD or Nvidia) make it so, and discrete switchable graphics also introduces complications.  I wouldn't be surprised if the Radeon HD 7790 isn't supported yet just because it's so new, but the release notes say that basically everything else is supported.

    -----

    An uninterruptible power supply is not at all the same thing as redundant power supplies.  Rather, think of it as a battery backup for your computer.  In case of an unwanted power event (e.g., a power outage), rather than the computer shutting down or crashing, the UPS will disconnect the power incoming from the wall and keep your computer running on the battery backup.  In the case of an extended power outage, that gives you several minutes to save what you're doing and shut down the computer properly.

    -----

    The improved speed of an SSD is not a "minor benefit".  It's the difference between your computer promptly doing what you ask versus making you wait and wait and wait and then finally doing what you asked.  There are also many little things where it reduces delays of a fraction of a second to a much smaller fraction of a second, which is hard to quantitatively measure, but qualitatively makes the computer feel much more responsive.

    If you don't want to lose irreplaceable data in case of SSD failure, then don't put the data on an SSD.  SSDs are for performance sensitive things such as an operating system and programs.  Bulk data usually doesn't benefit meaningfully from an SSD, so you'd put that on your hard drives instead.  If the SSD fails, then at worst, you'd have to reinstall the OS and your various programs the same way you installed them the first time.  But you don't lose any irreplaceable data, as that's always on the hard drives instead.

  • emistz Member Posts: 54

    Thanks for the heads up.  I think you've convinced me to go with an SSD.  I'll do a 3-drive setup, with the SSD for the OS and related files and 2 x 2 TB HDs set up in RAID 1 for data redundancy.

    I misread what you said earlier about the power supply.  I'm going for a CyberPower CP1500PFCLCD as far as the UPS goes.  It's a bit on the expensive side and I probably could get away with something cheaper.


  • Quizzical Member Legendary Posts: 25,355
    Originally posted by emistz

    I misread what you said earlier about the power supply.  I'm going for a CyberPower CP1500PFCLCD as far as the UPS goes.  It's a bit on the expensive side and I probably could get away with something cheaper.

    You seem to have a big budget and value reliability, so that UPS looks appropriate to your needs.

    Cyber Power Systems' PFC line of UPSes is unusual for consumer ones in that it outputs a sine wave, which is what computer power supplies want.  Most consumer UPSes will output a "simulated sine wave", which really just means a step function, and some power supplies don't handle that very well.  For example, a 650 W power supply that works fine when outputting 650 W under normal power from the utility may shut down if you try to pull more than 300 W from it while handing it a given "simulated sine wave".

    Now, a step function with several intermediate steps is a lot better than a square wave (which is what some cheap UPSes will do), so not all UPSes that offer a "simulated sine wave" are equivalent.  And different power supplies may handle a given input very differently, too.  This isn't just a problem of junk power supplies, either, as they're only rated as being able to give a given output assuming that they get an appropriate input.  What happens when there is a "simulated sine wave" isn't very well documented for particular units on either the UPS or the power supply end, and there isn't much independent testing of it, either.
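    If it helps to picture what a "simulated sine wave" actually is, here's a tiny sketch comparing a true sine to a crude stepped approximation; the number of steps and the peak voltage are arbitrary assumptions for illustration, since real UPSes differ:

        # Compare a true sine wave to a stepped ("simulated sine") approximation.
        # STEPS_PER_CYCLE and PEAK_VOLTS are arbitrary illustrative assumptions.
        import math

        STEPS_PER_CYCLE = 6     # a cheap UPS may only output a handful of voltage levels
        PEAK_VOLTS = 170        # roughly the peak of 120 V RMS mains

        def true_sine(t):       # t is a fraction of one cycle, 0.0 to 1.0
            return PEAK_VOLTS * math.sin(2 * math.pi * t)

        def stepped_sine(t):    # hold the value sampled at the start of each step
            step_start = math.floor(t * STEPS_PER_CYCLE) / STEPS_PER_CYCLE
            return PEAK_VOLTS * math.sin(2 * math.pi * step_start)

        for i in range(12):
            t = i / 12
            print(f"t={t:.2f}  true={true_sine(t):7.1f} V  stepped={stepped_sine(t):7.1f} V")

    The more intermediate levels the UPS can produce, the closer the stepped output gets to the real thing, which is the difference between the cheap square-wave units and the better "simulated sine" ones described above.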

    For what it's worth, I have one of the older AVR UPSes from Cyber Power Systems, which does use a "simulated sine wave".  I've probably had 20 or so power events since I got it, and the computer has functioned perfectly right through all of them.

  • emistz Member Posts: 54
    That's a lot of useful insight man.  Thanks a lot for the help, I really appreciate it.  My setup is going to be that much better because of it.  Now I just gotta figure out whether I'll end up going AMD or Nvidia for the graphics.


  • Ridelynn Member Epic Posts: 7,383

    AMD Linux drivers have traditionally blown, badly. I can't count the number of Linux installations I've had get completely botched by a bad AMD driver push to the repo that took a lot of time and patience to recover.

    Maybe they have improved in the last couple of years, I don't know, but I have a lot of dead AMD cards from 2000-2010 that hit the Linux driver graveyard, and I gave up on them for Linux support (although I haven't had any problems with their Windows drivers for the past 5 years or so) - at least until proven otherwise. Not that nVidia or Intel are perfect, but they seem to at least not break as often.

    For your drive setup:

    RAID 1 is good. You don't need Red drives though: those have specific firmware for NAS boxes, and they don't perform quite as well in PCs. If you want WDs, go with either Blue (cheaper) or Black (better performing) - I'd steer clear of Greens, they can be a pain to work with on a PC on a daily basis. If you're worried about reliability more than about using consumer drives, get RE drives (not Red, RE). I'm not as familiar with other brands' models at the moment, although I wouldn't be too afraid to deviate from WD based on price - all of the hard drive vendors are pretty much similar when it comes to reliability and speed (at least for 7200 RPM drives).

    To add an SSD effectively to that:
    Your RAID 1 holds all your VMs and the data you really want to protect.
    Your SSD holds your host OS(es). It also holds any frequently used games/programs, and it holds your hypervisor.

    We do a lot of Linux installs like this:
    The boot drive is a single fast (often SSD) drive: it just has the base installation.
    /home may have its own RAID 10.
    /var/http may have its own RAID 1.
    /var/lib/mysql may have its own RAID 1 or RAID 10, depending on the server.

    The boot drive is easily replaceable, and we can roll out new distributions on a clean hard drive install, then just swap it out in the main server, fix up the fstab entries and /etc configs, and we're rolling with all the same data as before without having to even touch the data itself.

    It's similar with your VM setup - the host OSes and hypervisor live on the boot drive. Your VMs all live on the RAID array. You can swap out the hypervisor - you can even boot alternative OSes on the computer and access the same VMs with different hypervisors running on different hosts.

    As far as the RAM goes - that largely depends on how many VMs you want to run at once. You need enough for the host OS, and then enough for each VM. If you skimp on any of it, you're hitting the hefty swap penalty. 64 GB is enough for 15-20 32-bit clients or 8-12 64-bit clients (taking into account that your host OS will automatically be 64-bit), and possibly a lot more (if you run lean Linux distros without X, for instance).
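    A crude way to budget that RAM, with per-VM figures that are purely assumptions on my part (adjust them to whatever your guests actually need):

        # Rough RAM budget for host + VMs; the per-VM allocations are assumptions.
        TOTAL_GB = 64
        HOST_GB = 8                      # assumed: 64-bit host OS plus hypervisor overhead
        PER_VM_GB = {
            "lean 32-bit guest": 3,      # assumed allocation per guest
            "64-bit desktop guest": 5,   # assumed allocation per guest
        }

        for kind, gb in PER_VM_GB.items():
            count = (TOTAL_GB - HOST_GB) // gb
            print(f"{kind}: roughly {count} at once before you start swapping")

    With those assumed allocations you land right around the 15-20 and 8-12 figures above; lean headless Linux guests would stretch it much further.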

    VMs in and of themselves won't use a lot of CPU power; they share CPU resources very well. Sure, if you decide to crank up Prime95 on a VM it will use everything you have allocated to it, but if your VMs are largely just sitting there (automating whatever - even if it's game "automating"/Chinese-style farming), they probably aren't going to be all that CPU-intensive. Most VMs just get provisioned with access to a single core anyway, and with a typical VM that just sits there doing light tasks, you can run 20-30 on a single core and still have room on that core for all the VMs to be responsive. You could save a lot of money ($200+) dropping to a quad core. If you're doing a lot of video editing and rendering and such, then maybe the six-core is worthwhile. If you're thinking about doing that on your VMs though, you may want to rethink this entire thing and look at distributed processing (like XGrid/gc3d or what's built into several premier-level products).

  • Ridelynn Member Epic Posts: 7,383

    To amplify the AMD driver situation:

    Phoronix.com does a lot of graphics comparisons for Linux. They look at both closed source (released by the vendor) and open source drivers (since a lot of distributions won't/can't include software that isn't open source).

    Their latest driver comparison is from just a couple of weeks ago. They don't test every card; rather, the emphasis is on the drivers, and in particular vendor-provided versus open source drivers.

    http://www.phoronix.com/scan.php?page=article&item=amd_nvidia_15way&num=1

    More or less the conclusion:

    The nVidia closed source driver is head and shoulders better than everything else, to the point that low-end nVidia cards often beat AMD's higher end cards. There are some tests where an 8800GT beats a 6950 with the commercial drivers. Or cases where the best performing AMD card is a 6450 on the open source driver, beating out a 6950 on the commercial driver (which happens in several benchmarks across other reviews as well - not a testing fluke).

    AMD's open source driver beats nVidia's open source driver, but it's closer competition there. Newer cards beware, though: open source development depends largely on reverse engineering and a lot of trial and error; the AMD driver still barely supports 69xx cards, and doesn't officially support 7000 series cards yet. nVidia's open source drivers struggle with Kepler as well, so that's not a Blue/Red thing - open source just takes time.

    But most people will run the vendor drivers if they can get them to work with their distro and they don't have to stick with open source for some other reason. The nVidia drivers are just there, whereas AMD's aren't quite as much there for Linux.

    AMD's official Linux support has been embarrassingly bad for as long as I can remember, and it appears it's still that way. They were trying to make strides at one point (shortly after the AMD/ATI acquisition, iirc -- Some info), but I think they walked back a lot of their Linux development in order to double down on Windows, and their recent rounds of layoffs can't have helped either.

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by Ridelynn

    To amplify the AMD driver situation:

    Phoronix.com does a lot of graphics comparisons for Linux. They look at both closed source (released by the vendor) and open source drivers (since a lot of distributions won't/can't include software that isn't open source).

    Their latest driver comparison is from just a couple of weeks ago. They don't test every card; rather, the emphasis is on the drivers, and in particular vendor-provided versus open source drivers.

    http://www.phoronix.com/scan.php?page=article&item=amd_nvidia_15way&num=1

    More or less the conclusion:

    The nVidia closed source driver is head and shoulders better than everything else, to the point that low-end nVidia cards often beat AMD's higher end cards. There are some tests where an 8800GT beats a 6950 with the commercial drivers. Or cases where the best performing AMD card is a 6450 on the open source driver, beating out a 6950 on the commercial driver (which happens in several benchmarks across other reviews as well - not a testing fluke).

    AMD's open source driver beats nVidia's open source driver, but it's closer competition there. Newer cards beware, though: open source development depends largely on reverse engineering and a lot of trial and error; the AMD driver still barely supports 69xx cards, and doesn't officially support 7000 series cards yet. nVidia's open source drivers struggle with Kepler as well, so that's not a Blue/Red thing - open source just takes time.

    But most people will run the vendor drivers if they can get them to work with their distro and they don't have to stick with open source for some other reason. The nVidia drivers are just there, whereas AMD's aren't quite as much there for Linux.

    AMD's official Linux support has been embarrassingly bad for as long as I can remember, and it appears it's still that way. They were trying to make strides at one point (shortly after the AMD/ATI acquisition, iirc -- Some info), but I think they walked back a lot of their Linux development in order to double down on Windows, and their recent rounds of layoffs can't have helped either.

    Interesting graphs, but I'm not really sure what to make of them.  For starters, they're all older cards, including some that are very old.  There isn't a single current generation card from either AMD or Nvidia in the article, largely because they were more interested in comparing open versus closed source drivers than AMD versus Nvidia.  Furthermore, all of the decently capable gaming cards performed well in every single game.  In some games, several AMD cards performed about the same, in spite of some cards ordinarily being much faster than others, which makes me wonder if the card was throttling back frame rates to "only" deliver 100 or so.

    In some OpenGL testing, I've caught my own video card (Radeon HD 5850 in Windows 7) claiming to have finished a frame and moved to the next unreasonably fast, so much so that I'm pretty sure that the video card saw the next frame coming in and decided it could skip the previous one and claim to have done it because it wouldn't matter anyway.  I wonder if there's a fair bit of that going on in some of those tests, which would explain some of the crazily high numbers, especially for the open source drivers on the Radeon HD 6450.

    Their most recent article that focused on AMD's current high end versus Nvidia's is here:

    http://www.phoronix.com/scan.php?page=article&item=linux_hd7950_gtx680&num=1

    There, the Radeon HD 7950 did better relative to the GeForce GTX 680 in Linux than you'd expect based on their Windows results.  Their latest video card review is here:

    http://www.phoronix.com/scan.php?page=article&item=amd_radeon_hd7850&num=1

    That puts the GeForce GTX 680 in a more favorable light.  But the GeForce GTX 460 performs terribly, and the GTX 550 Ti sometimes does quite badly as well.

    -----

    But all of those tests are with a single monitor.  You might want to ask about gaming across three monitors in Linux, and see if you can find anyone who has done it.  It's not hard to imagine that a game would work fine on a single monitor but then choke on multiple monitors.

  • koboldfodder Member Uncommon Posts: 447

    Um, where is teh mouse pad?

     

    That is a critical component that is missing.

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by koboldfodder

    Um, where is teh mouse pad?

     

    That is a critical component that is missing.

    If you have a mouse that won't work properly without a mousepad, then either you're putting it on a very strange surface (e.g., glass) or the mouse is defective.  If it's the latter, then the solution is to get a new mouse.

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Quizzical
    Originally posted by koboldfodder Um, where is teh mouse pad?   That is a critical component that is missing.
    If you have a mouse that won't work properly without a mousepad, then either you're putting it on a very strange surface (e.g., glass) or the mouse is defective.  If it's the latter, then the solution is to get a new mouse.

    Over the head joke shot.

  • emistz Member Posts: 54
    Ok, so this is my tentative setup now taking in the advice given here:
    1. Intel Core i7-3820 Quad-Core Processor 3.6 GHz 10 MB Cache LGA 2011 - BX80619I73820
       
    2. 250 GB Samsung 840 Series Solid State Drive (SSD) with Desktop and Notebook Installation Kit, SATA 6.0 Gb/s, 2.5-Inch, MZ-7TD250KW
    3. 2 x Seagate Barracuda 2 TB HDD SATA 6 Gb/s NCQ 64MB Cache 3.5-Inch Internal Bare Drive ST2000DM001

    4. Corsair Vengeance 32GB (4x8GB) DDR3 1600 MHz (PC3 12800) Desktop Memory (CMZ32GX3M4X1600C10)

    5. Corsair Professional Series  HX 750 Watt ATX/EPS Modular 80 PLUS Gold (HX750)

    6. CyberPower CP1500PFCLCD PFC Sinewave UPS 1500VA 900W PFC Compatible Mini-Tower

    7. Samsung MD32B 32 LED Monitor 16:9 8ms 1920x1080 5000:1 DVI/HDMI/VGA/USB Speaker

    8. 3 x ViewSonic VA2451M-LED 24-Inch Screen LED-Lit Monitor

    9. Gigabyte GeForce GTX 680 OC 4GB GDDR5 DVI-I/DVI-D/HDMI/Displayport PCI-Express 3.0 SLI Ready Graphics Card GV-N680OC-4GD

    10. Gigabyte LGA 2011 DDR3 2133 Intel X79 SATA 6Gb/s USB 3.0 ATX Motherboard GA-X79-UP4

    11. NZXT Crafted Series ATX Full Tower Steel Chassis - Phantom White with Red Trim Computer Case CS-NT-PHAN-WNR White/Red

    12. Zalman CNPS9900A LED Ultra Quiet CPU Cooler

    13. ASUS Xonar DGX PCI-E GX2.5 Audio Engine Sound Cards

     

     


  • Quizzical Member Legendary Posts: 25,355

    Western Digital's Red line is designed with NAS and RAID use in mind.  I don't know how much that affects the drives themselves or if it's just a marketing thing.

    The Samsung 840 uses TLC NAND flash, which makes it very much a budget model.  That's fine if you're saving money by getting it as opposed to a good MLC SSD, but the 840 doesn't seem to be cheaper than the competition, at least on Newegg.  I've pooh-poohed write endurance worries in the past, but using TLC and only getting 1000 program-erase cycles does give me some pause (there's a rough endurance estimate after the links below).  It would probably be fine, but if you can get MLC for the same price, you might as well.  And you can, at least if you're looking at Newegg prices:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16820249025

    http://www.newegg.com/Product/Product.aspx?Item=N82E16820226226

    Or for $10 more, there are also these:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16820233403

    http://www.newegg.com/Product/Product.aspx?Item=N82E16820148443
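    On the write-endurance point above, a rough back-of-the-envelope; the daily write volume and the write-amplification factor are assumptions, not measurements:

        # Rough TLC endurance estimate; WRITES_GB_PER_DAY and WRITE_AMP are assumptions.
        CAPACITY_GB = 250
        PE_CYCLES = 1000            # the TLC program-erase figure mentioned above
        WRITES_GB_PER_DAY = 20      # assumed fairly heavy desktop usage
        WRITE_AMP = 2.0             # assumed; real values depend on the controller and workload

        total_host_writes_gb = CAPACITY_GB * PE_CYCLES / WRITE_AMP
        years = total_host_writes_gb / (WRITES_GB_PER_DAY * 365)
        print(f"~{years:.0f} years before the NAND wears out")   # ~17 years with these numbers

    So even 1000 cycles is probably fine for a desktop, which is why it's more a matter of not paying extra for TLC than of it being unusable.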

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by emistz
    Ok, so this is my tentative setup now taking in the advice given here:
    1. Intel Core i7-3820 Quad-Core Processor 3.6 GHz 10 MB Cache LGA 2011 - BX80619I73820  

    If you're going to get a quad core processor, then why go with a Sandy Bridge-E setup rather than the newer Ivy Bridge?  Before, you were looking at a 6-core processor, and for that, yeah, you do need to go Sandy Bridge-E.

  • Quizzical Member Legendary Posts: 25,355

    More generally, where are you looking to buy parts?  Apparently the Samsung 840 is much cheaper on Amazon at the moment, and if that's what you were looking at, then have at it:

    http://www.amazon.com/Samsung-Electronics-sata_6_0_gb-2-5-Inch-MZ-7TD250BW/dp/B009NHAEXE?tag=hardfocom-20

  • cybertrucker Member Uncommon Posts: 1,117
    Just wait for the PS4
  • emistz Member Posts: 54
    Originally posted by Quizzical
    Originally posted by emistz
    Ok, so this is my tentative setup now taking in the advice given here:
    1. Intel Core i7-3820 Quad-Core Processor 3.6 GHz 10 MB Cache LGA 2011 - BX80619I73820  

    If you're going to get a quad core processor, then why go with a Sandy Bridge-E setup rather than the newer Ivy Bridge?  Before, you were looking at a 6-core processor, and for that, yeah, you do need to go Sandy Bridge-E.

    Mostly because clock speed, cache, etc. are higher for the price.  I'm not a big fan of the newest and latest architectures just for cutting-edge's sake. I can't really see many benefits from Ivy Bridge over Sandy Bridge-E that would affect performance enough to make up for a 300 MHz difference in clock speed and a 2 MB difference in cache at the same price.  But I haven't been on the hardware side of things in a while, so I could be misinformed.

    About the SSD, I looked at the performance difference and it looks like the Corsair Neutron Series CSSD-N256GB3-BK outperforms the Samsung 840, so I'll take your advice and go with that for about $30 extra (Amazon).

    I'm looking at Newegg and Amazon primarily for parts, whichever has the best deal.


  • Quizzical Member Legendary Posts: 25,355
    Originally posted by emistz
    Originally posted by Quizzical
    Originally posted by emistz
    Ok, so this is my tentative setup now taking in the advice given here:
    1. Intel Core i7-3820 Quad-Core Processor 3.6 GHz 10 MB Cache LGA 2011 - BX80619I73820  

    If you're going to get a quad core processor, then why go with a Sandy Bridge-E setup rather than the newer Ivy Bridge?  Before, you were looking at a 6-core processor, and for that, yeah, you do need to go Sandy Bridge-E.

    Mostly because clock speed, cache, etc. are higher for the price.  I'm not a big fan of the newest and latest architectures just for cutting-edge's sake. I can't really see many benefits from Ivy Bridge over Sandy Bridge-E that would affect performance enough to make up for a 300 MHz difference in clock speed and a 2 MB difference in cache at the same price.  But I haven't been on the hardware side of things in a while, so I could be misinformed.

    About the SSD, I looked at the performance difference and it looks like the Corsair Neutron Series CSSD-N256GB3-BK outperforms the Samsung 840, so I'll take your advice and go with that for about $30 extra (Amazon).

    I'm looking at Newegg and Amazon primarily for parts, whichever has the best deal.

    Ivy Bridge cores run faster than Sandy Bridge at a given clock speed.  Assuming that you're not into overclocking, a Core i7-3770 is faster at stock speeds than a Core i7-3820.  If you were to disable turbo boost, 3.4 GHz Ivy Bridge cores for the former and 3.6 GHz Sandy Bridge cores for the latter would perform about the same.  But with turbo, you're looking at 3.9 GHz and 3.8 GHz, respectively, so not only does the 3770 clock higher, but it would be faster even at the same clock speed.  The 2 MB difference in L3 cache won't make much of a difference.

    And then the Core i7-3770 is cheaper than the -3820, in addition to letting you go with a much cheaper motherboard because Intel charges a lot less for a Z77 chipset than X79.

    For what it's worth, Sandy Bridge-E is primarily meant for servers (where it is branded as Xeon E5), but Intel also offers some of the relatively lower end versions as their high end desktop chips.  Intel charges a lot for the consumer version because they don't actually want to sell a lot of them.

    Sandy Bridge-E does have four memory channels rather than two for Ivy Bridge, but that's because the former needs to feed up to eight CPU cores (in some Xeon E5-26** versions) and the latter only four.  The extra memory channels don't matter much if you only have four cores unless you're doing something that is extremely memory bandwidth intensive.  There are some HPC situations where this happens, but not for typical consumer or professional programs.

    Sandy Bridge-E also has the X79 chipset with something like 36 PCI Express 3.0 lanes rather than 16 or so.  That can be marginally useful if you're looking at a CrossFire/SLI rig, but with a single video card, it's just a bunch of extra PCI Express lanes that won't have anything plugged into them.

  • emistz Member Posts: 54
    Thanks for the feedback, dude.  One thing though: the Core i7-3770 is actually more expensive than the 3820 on Amazon.  It's $316 for the 3.5 GHz i7-3770 but $289 for the 3.6 GHz i7-3820.


  • Quizzical Member Legendary Posts: 25,355
    Originally posted by emistz
    Thanks for the feedback, dude.  One thing though: the Core i7-3770 is actually more expensive than the 3820 on Amazon.  It's $316 for the 3.5 GHz i7-3770 but $289 for the 3.6 GHz i7-3820.

    http://www.newegg.com/Product/Product.aspx?Item=N82E16819116502

    Now that I look it up, Intel charges the same price for both processors.  The Core i7-3770K tends to be more expensive, but that's the overclockable version of it.  If you care enough about reliability to use RAID 1 and a UPS, then overclocking is presumably out.  (Not that I'm against caring about reliability; I have a UPS, too, and do daily backups of the data that I care about in addition to periodically imaging my SSD onto an external hard drive that is rarely plugged into the computer.)

    Note also that the nominal speed only means that Intel promises that all four cores can run at that speed all of the time.  Recent Intel architectures have turbo boost, which means that the processor will automatically clock itself above the nominal clock speed if temperatures and power draw permits.  For example, if a program is pushing two cores very hard while leaving the others alone, you can clock those two cores higher without risking overheating or excessive power draw to damage a motherboard or power supply because the total heat and power for the entire CPU will still be low because the other two cores are idle.  The maximum turbo boost speeds are 3.9 GHz for the Core i7-3770 as compared to 3.8 GHz for the Core i7-3820, and those are the speeds that you'll typically see in demanding games.

    And still, the Core i7-3770 would mean you want an LGA 1155 motherboard, not LGA 2011, and those tend to be cheaper because Intel charges less for the chipset, among other things.
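    To make the "turbo depends on how many cores are busy" behavior concrete, here's a tiny sketch; the per-core bin values below are illustrative assumptions rather than Intel's published numbers, so check Intel's spec pages for the real bins:

        # Illustrative turbo-bin lookup; the bin values are assumptions, not Intel specs.
        BASE_GHZ = 3.4
        TURBO_GHZ = {1: 3.9, 2: 3.9, 3: 3.8, 4: 3.7}   # active cores -> assumed max turbo

        def target_clock(active_cores: int) -> float:
            """Highest clock the chip will aim for with this many loaded cores."""
            return TURBO_GHZ.get(active_cores, BASE_GHZ)

        for n in range(1, 5):
            print(f"{n} active core(s): up to {target_clock(n)} GHz")

    The idea is simply that fewer busy cores means less total heat and power, so the busy ones are allowed to run a few bins above the nominal clock; with all cores loaded, the chip stays closer to its rated speed.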

     

  • emistz Member Posts: 54
    Originally posted by Quizzical
    Originally posted by emistz
    Thanks for the feedback, dude.  One thing though: the Core i7-3770 is actually more expensive than the 3820 on Amazon.  It's $316 for the 3.5 GHz i7-3770 but $289 for the 3.6 GHz i7-3820.

    http://www.newegg.com/Product/Product.aspx?Item=N82E16819116502

    Now that I look it up, Intel charges the same price for both processors.  The Core i7-3770K tends to be more expensive, but that's the overclockable version of it.  If you care enough about reliability to use RAID 1 and a UPS, then overclocking is presumably out.  (Not that I'm against caring about reliability; I have a UPS, too, and do daily backups of the data that I care about in addition to periodically imaging my SSD onto an external hard drive that is rarely plugged into the computer.)

    Note also that the nominal speed only means that Intel promises that all four cores can run at that speed all of the time.  Recent Intel architectures have turbo boost, which means that the processor will automatically clock itself above the nominal clock speed if temperatures and power draw permits.  For example, if a program is pushing two cores very hard while leaving the others alone, you can clock those two cores higher without risking overheating or excessive power draw to damage a motherboard or power supply because the total heat and power for the entire CPU will still be low because the other two cores are idle.  The maximum turbo boost speeds are 3.9 GHz for the Core i7-3770 as compared to 3.8 GHz for the Core i7-3820, and those are the speeds that you'll typically see in demanding games.

    And still, the Core i7-3770 would mean you want an LGA 1155 motherboard, not LGA 2011, and those tend to be cheaper because Intel charges less for the chipset, among other things.

     

    I think you pasted the wrong link by mistake; that one shows a 3.4 GHz base clock for $299 as opposed to a 3.6 GHz base for $289.

    While on the subject of turbo boost, what's the deal with those graphs showing that the maximum turbo boost speed achieved depends on the number of active CPU cores? I can't find one for the newer models, but for some older models they show that if 4 cores are active, the chip cannot reach as high a clock speed as when only 1 core is active while boosting.

    Does anyone know anything about that?

