
New Rig - Suggestions Wanted


Comments

  • Quizzical Member Legendary Posts: 25,351
    Originally posted by emistz
    Originally posted by Quizzical
    Originally posted by emistz
    Thanks for the feedback, dude.  One thing though: the Core i7-3770 is actually more expensive than the 3820 on Amazon.  It's $316 for the 3.5 GHz i7-3770 but $289 for the 3.6 GHz i7-3820.

    http://www.newegg.com/Product/Product.aspx?Item=N82E16819116502

    Now that I look it up, Intel charges the same price for both processors.  The Core i7-3770K tends to be more expensive, but that's the overclockable version of it.  If you care enough about reliability to use RAID 1 and a UPS, then overclocking is presumably out.  (Not that I'm against caring about reliability; I have a UPS, too, and do daily backups of the data that I care about in addition to periodically imaging my SSD onto an external hard drive that is rarely plugged into the computer.)

    Note also that the nominal speed only means that Intel promises that all four cores can run at that speed all of the time.  Recent Intel architectures have turbo boost, which means that the processor will automatically clock itself above the nominal clock speed if temperatures and power draw permit.  For example, if a program is pushing two cores very hard while leaving the others alone, those two cores can clock higher without overheating or drawing enough power to damage a motherboard or power supply, because with the other two cores idle, the total heat and power for the entire CPU will still be low.  The maximum turbo boost speeds are 3.9 GHz for the Core i7-3770 as compared to 3.8 GHz for the Core i7-3820, and those are the speeds that you'll typically see in demanding games.

    Also, the Core i7-3770 would mean you want an LGA 1155 motherboard rather than LGA 2011, and those tend to be cheaper because Intel charges less for the chipset, among other things.

     

    I think you pasted the wrong link by mistake; that link shows a 3.4 GHz base for $299 as opposed to a 3.6 GHz base for $289.

    While on the subject of turbo boost, what's the deal with those graphs showing that the maximum turbo boost speed achieved depends on the number of active CPU cores?  I can't find one for the newer models, but for some older models, they show that if 4 cores are active, the chip cannot reach as high a clock speed as when only 1 core is active while boosting.

    Does anyone know anything about that?

    No, I gave the right link.  Newegg's prices bounce around.

    The original idea of turbo boost dates back to the early days of multicore processors.  At a given point in time, you could, for example, get a Core 2 Duo at 3.33 GHz or a Core 2 Quad at 3.0 GHz.  You were faced with a choice of fewer, faster cores or more, slower cores.  The former was better in a workload that couldn't scale to use several cores, but the latter was better in a workload that could put all of the cores to good use.

    Meanwhile, the individual CPU cores in the Core 2 Duo were exactly the same as in the Core 2 Quad.  They were likely the same dies, even, with Intel putting two Core 2 Duo dies in the same package to make a Core 2 Quad.  So if the cores could clock higher in the Core 2 Duo incarnation, why couldn't the same cores clock higher as a Core 2 Quad?  People didn't like this.

    Well, the answer was, the Core 2 Quad could clock higher, too.  The problem was power draw.  Four cores clocked high will pull a lot more power than two cores clocked high.  If two cores clocked high has a TDP of 65 W, and Intel isn't willing to have a Core 2 Quad with a TDP over 95 W, then you can't clock all four cores as high.  You could overclock the chips, of course, and that worked fine if you had a motherboard and power supply that could handle it.  But Intel wasn't willing to go there on stock speeds for fear that people would stick the chips into a motherboard or power supply that couldn't handle the power draw and fry things.
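    The arithmetic behind that TDP argument can be sketched with a toy power model.  All of the numbers below are made up for illustration (they are not Intel's figures); the point is only that package power grows roughly linearly with the number of loaded cores, so a clock speed that fits a dual-core power budget can blow past a quad-core one.

```python
# Toy model: total package power grows with the number of loaded cores.
# UNCORE_WATTS and WATTS_PER_CORE are illustrative guesses, not real specs.

UNCORE_WATTS = 15.0    # assumed fixed overhead (caches, memory controller, I/O)
WATTS_PER_CORE = 25.0  # assumed draw of one fully loaded core at the high clock

def package_power(active_cores):
    """Estimate total package power with the given number of cores fully loaded."""
    return UNCORE_WATTS + active_cores * WATTS_PER_CORE

# Two loaded cores fit a 65 W dual-core envelope (15 + 2*25 = 65 W),
# but four cores at the same clock would need 115 W, well past a 95 W cap,
# which is why the quad-core part had to ship at a lower stock clock.
print(package_power(2), package_power(4))
```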

    The solution that Intel came up with when Bloomfield launched (2008) was to have more cores, but clock them higher when only some of the cores are active.  That meant that you could dynamically have more cores clocked lower when you had a workload that could push all of the cores, and switch within a fraction of a second to fewer cores clocked higher when some cores weren't in use.  That gets you the best of both worlds.

    But of course, if you have more cores and every core can clock higher, why can't you clock more cores higher at once?  There is the fear of power consumption if you get a program that pushes all of the cores very hard.  But some instructions put more load on a CPU core than others, so a typical program that maxes out all of the CPU cores typically won't burn nearly as much power as a program that is designed with the specific intent of maximizing power draw.  While you have to make stock speeds such that the latter is safe, why not clock all of the cores higher when running the former type of program?

    Intel got there with Sandy Bridge (2011).  In desktops, they're often willing to give all of the cores the max turbo boost indefinitely provided that temperatures and power draw remain suitable.  The chips do monitor power draw and temperatures and if either gets too high, they'll throttle back the clock speeds.
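    The monitor-and-throttle behavior described above amounts to a simple feedback loop: boost toward the max turbo bin while power and temperature are in budget, and step back down when either limit is exceeded.  Here's a rough sketch of that loop; the limits, step size, and clocks are illustrative (loosely based on the i7-3770 figures in this thread), not how the actual hardware controller is implemented.

```python
# Sketch of a turbo boost feedback loop: one call per control interval.
# All constants are illustrative, not real firmware values.

MAX_TURBO_MHZ = 3900   # top turbo bin (i7-3770 figure from the thread)
BASE_MHZ = 3500        # nominal clock; never throttle below this here
STEP_MHZ = 100         # how far to move the clock per interval
POWER_LIMIT_W = 77.0   # TDP (i7-3770 figure from the thread)
TEMP_LIMIT_C = 100.0   # assumed thermal ceiling

def next_clock(current_mhz, power_w, temp_c):
    """Return the clock for the next interval given current sensor readings."""
    if power_w > POWER_LIMIT_W or temp_c > TEMP_LIMIT_C:
        # Out of budget: throttle back, but not below the base clock.
        return max(BASE_MHZ, current_mhz - STEP_MHZ)
    # Within budget: boost toward the maximum turbo bin.
    return min(MAX_TURBO_MHZ, current_mhz + STEP_MHZ)
```

    In this sketch a sustained power excursion walks the clock back down step by step, which matches the behavior the post describes: all cores can hold max turbo indefinitely, provided power and temperature stay in bounds.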

    In laptops, however, they're still far more cautious.  A Core i7-3770 has a TDP of 77 W and a max turbo boost of 3.9 GHz.  A Core i7-3840QM has a TDP of 45 W and a max turbo boost of 3.8 GHz.  Those two processors are just the same chip clocked differently.  I can assure you that 100 MHz isn't the difference between 77 W and 45 W all by itself.  While the chip with all four cores active at 3.9 GHz will usually stay inside of 77 W, with all four cores active at 3.8 GHz, it will commonly go over 45 W, and often way over.  So in a laptop, Intel won't allow all cores to clock at 3.8 GHz.  If all four cores are active, they cap it at 3.5 GHz so that if it does go over 45 W, it won't go that far over and fry things before the chip figures out that power draw is out of hand and throttles it back.  One core at 3.8 GHz while the rest are idle is perfectly safe, however.

    In desktops, if you want to build around it, you can pretty easily make a rig that can dissipate 200 W from the CPU.  But you can't do that in laptops, which is why Intel needs to be more cautious with turbo boost in laptops.
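    Those per-core-count caps are just a lookup table: the number of active cores selects the highest allowed turbo bin.  A minimal sketch, using the single-core and all-core figures quoted in the post (3.9 GHz for the i7-3770, 3.8/3.5 GHz for the i7-3840QM); the intermediate bins are guesses for illustration.

```python
# Turbo bins: active core count -> highest allowed clock (MHz).
# Single-core and all-core laptop figures are from the post above;
# the intermediate bins are illustrative assumptions.

TURBO_BINS_MHZ = {
    "i7-3770 (desktop, 77 W)":  {1: 3900, 2: 3900, 3: 3800, 4: 3700},
    "i7-3840QM (laptop, 45 W)": {1: 3800, 2: 3700, 3: 3600, 4: 3500},
}

def max_clock(cpu, active_cores):
    """Look up the highest turbo clock allowed for the given active core count."""
    return TURBO_BINS_MHZ[cpu][active_cores]
```

    The laptop part caps all-core turbo well below its single-core maximum for exactly the reason the post gives: a brief excursion past 45 W at 3.5 GHz won't go far enough over to fry anything before the chip throttles back.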

  • 190100 Member Uncommon Posts: 52

    I would ditch the sound card (this coming from an audiophile, not an audiophool).  That aside, Creative has the worst drivers of all, and is a downright evil, vile company that should go die.

     

    Most integrated audio these days is audibly transparent. Most sound cards are transparent. The tricks your mind plays on you!

    I do, however, think laptop integrated audio is typically bad - it always sounds worse than my desktop for some reason, especially on expensive IEMs.

  • Quizzical Member Legendary Posts: 25,351
    Originally posted by 190100

    I do, however, think laptop integrated audio is typically bad - it always sounds worse than my desktop for some reason, especially on expensive IEMs.

    I assumed that was the speakers rather than the audio chip itself.  But that's an assumption on my part, so it could be wrong.
