
Prediction: future desktop CPUs will be less overclockable


Comments

  • g0m0rrah Member Uncommon, Posts: 325
    Originally posted by Quizzical
    Originally posted by g0m0rrah

     

     I believe for home use, CPUs are going to go one way...

    1.  The CPU is simply good enough that it's not the bottleneck.

           This, I believe, is what AMD is trying to reach: a CPU that is just fast enough that it isn't what's slowing you down.  With gaming specifically, balancing the CPU with the correct GPU seems core to budget builds.  You can always tell when someone is new to building a PC, because they buy a monster CPU or GPU but pair it with something so inferior that they aren't getting any real performance out of the expensive part.  With my wife's PC, I tried to pick a CPU that's good enough to survive one future GPU upgrade before I might need to replace it.  New GPUs and CPUs are coming out slowly enough that either should last a couple of years.

       It seems that software developers either aren't good enough to create code that runs efficiently on the hardware we have today, or they simply code for the lowest common denominator to make the most money possible.  I am not saying coding for the high end should be the priority, but honestly, if software released today can't use 4+ cores when they're available, those are the people holding computers back.  Skyrim is a good example of shit software: they ported it to the PC and, I believe, it was limited to addressing 1 GB of RAM.  Seriously, 1 GB.  Of course, modders took over and fixed a problem that shouldn't have existed in the first place.

       I am sure that, due to physics, the CPU in its current form is limited in speed by heat or simply by size restrictions.  How small can you possibly make an insulator between two conductors without some sort of breakdown?  At some point, smaller is no longer going to be better.  It seems that adding cores, and actually using them efficiently, will be the interim step until some technology leap, or at least some change in insulator material.

       Every time I see a decent-looking game on an Xbox One or PS4, I begin to wonder how they managed to leverage the pretty mediocre hardware available.  Then I look at a mediocre PC with a 1090T, a Radeon 6950, and 8 GB of 1600 MHz RAM, and I wonder why the fuck they can't leverage the hardware in that system.

    Let's not be quick to praise games for requiring higher-end hardware by being inefficient.  In the case of Skyrim, if you can do everything you want inside of 1 GB, what's the advantage of requiring more memory?  The Xbox 360 and PS3 each had 512 MB of memory, for system memory and video memory combined.

     

    One of the big problems is that the PC version addresses memory poorly.  It starts with a 256 MB block of RAM, and once that is filled, it tries to allocate another 256 MB block, which for some odd reason leads to crashes and infinite loading screens.  So granting it a bigger block of memory at the start helped curb the crashing.  I think the mod worked so well that Bethesda took the fix and put it in their own patch...
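The allocation behavior described above can be sketched roughly as follows. The block sizes and the growth rule are illustrative assumptions for the sake of the sketch, not Skyrim's actual engine code:

```python
# Toy sketch of a two-block allocation scheme: a fixed first block,
# with a second block appended only once the first fills up.  The
# reported crashes happened on that second-block growth path, so the
# community fix amounted to starting with a bigger first block.

class BlockHeap:
    def __init__(self, first_block_mb=256, second_block_mb=256):
        self.blocks = [first_block_mb]        # capacity (MB) per block
        self.second_block_mb = second_block_mb
        self.used = 0

    def alloc(self, mb):
        capacity = sum(self.blocks)
        if self.used + mb > capacity:
            # The risky path: growing into a second block at runtime
            # is where crashes / infinite loads were reported.
            if len(self.blocks) == 1:
                self.blocks.append(self.second_block_mb)
            else:
                raise MemoryError("out of blocks")
        self.used += mb
        return self.used

# The fix: a larger initial block means the second-block path is
# rarely (or never) taken during a play session.
patched = BlockHeap(first_block_mb=512)
```

Under this sketch, any workload that fits in the enlarged first block never exercises the buggy growth path at all.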

      This just shows my big gripe with PC gaming.  They made damn sure it worked well on the consoles, but they wet the bed on the PC version.

  • SupaAPE Member, Posts: 100
    Originally posted by grndzro

    It's a matter of transistor density vs. heat dissipation.  With node shrinks, the heat transistors generate does not decrease in proportion to the chip's ability to dissipate it.

    So eventually you are left with performance gains strictly from lower transistor costs and density.

    this is the answer
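The quoted point can be illustrated with back-of-envelope dynamic-power arithmetic (P ≈ C·V²·f). All numbers below are made-up illustrative values, not measurements of any real process node:

```python
# Rough sketch of why heat *density* rises with node shrinks once
# voltage stops scaling (the end of Dennard scaling).  Every number
# here is an assumption chosen for illustration.

def power_density(cap_f, voltage, freq_hz, area_mm2):
    """Dynamic power P = C * V^2 * f, divided by die area (W/mm^2)."""
    power_w = cap_f * voltage**2 * freq_hz
    return power_w / area_mm2

# "Old" node: 2 nF switched capacitance, 1.2 V, 3 GHz, 100 mm^2 die.
old = power_density(2e-9, 1.2, 3e9, 100.0)

# Shrunk node: area halves and capacitance drops ~30%, but voltage
# can only fall a little (say 1.1 V) at the same clock.
new = power_density(1.4e-9, 1.1, 3e9, 50.0)

assert new > old              # heat per unit area went UP...
assert new * 50.0 < old * 100.0   # ...even though total watts fell
```

So each shrink packs more watts into each square millimeter, which is exactly the density-vs-dissipation squeeze the post describes, and why stock clocks leave ever less thermal headroom.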

  • SupaAPE Member, Posts: 100

    @OP: Intel and others are already looking into using different materials for their CPU dies, so we do not know the future of overclocking.

     

    However, given that CPU manufacturers set frequencies within a certain safe margin to maintain integrity and stability, there is always a little headroom to push frequencies above stock specifications.

     

    But the poster I quoted is right in his assumptions about current dies and materials.

  • SupaAPE Member, Posts: 100
    We will also be moving away from the 32/64-bit era and into the quantum computing era, which means processors designed differently from what we have in this day and age.
  • Ridelynn Member Epic, Posts: 7,383


    Originally posted by SupaAPE
    We will also be moving away from the 32/64-bit era and into the quantum computing era, which means processors designed differently from what we have in this day and age.

    That is farther into the future than is really practical to discuss.

    Right now, I don't know if you can really even talk very credibly about anything past x64, unless you look at ARM (which is really the only competition on the horizon right now).

  • Quizzical Member Legendary, Posts: 25,355
    Originally posted by Ridelynn

     


    Originally posted by SupaAPE
    We will also be moving away from the 32/64-bit era and into the quantum computing era, which means processors designed differently from what we have in this day and age.

     

    That is farther into the future than is really practical to discuss.

    Right now, I don't know if you can really even talk very credibly about anything past x64, unless you look at ARM (which is really the only competition on the horizon right now).

    You don't need to go off to far-future, pie-in-the-sky stuff to find interesting things upcoming.  FinFETs, 3D NAND, silicon interposers, EUV lithography, adaptive sync, and DirectX and OpenGL allowing lower-level access to hardware (now that GPU architectures have converged somewhat) will all be widespread soon.

    If quantum computers can be made, they'll be very good at some things that classical computers are bad at, but the converse is also true.  Don't think of a quantum computer as just meaning a really fast computer.  Quantum computers will also be extremely expensive if they need to be kept at a temperature extremely close to absolute zero.

    And even the things that quantum computers will be good at can be worrisome.  If quantum computers are out there, then both RSA and Diffie-Hellman are dead, immediately.  Those are the widely used public-key algorithms, and if there aren't others to replace them, Internet commerce will be dead, too.
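Why RSA in particular dies can be seen in a toy sketch: the private key falls out immediately once the public modulus is factored, and factoring is exactly what a quantum computer running Shor's algorithm would do efficiently. The tiny textbook primes below are for illustration only; real keys use moduli of 2048 bits or more:

```python
# Toy RSA with tiny primes, purely to show that recovering the
# private key reduces to factoring n.  Never use numbers this small.

p, q = 61, 53
n = p * q                    # public modulus (3233)
phi = (p - 1) * (q - 1)      # Euler's totient of n (3120)
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent, modular inverse of e

msg = 42
cipher = pow(msg, e, n)      # encrypt with the public key (e, n)
plain = pow(cipher, d, n)    # decrypt with the private key (d, n)
assert plain == msg

# An attacker who factors n back into 61 * 53 recomputes phi and d
# instantly; a quantum computer makes that factoring step fast even
# for huge n, which is why RSA is "dead" in that scenario.
```

(The three-argument `pow(e, -1, phi)` modular inverse needs Python 3.8+.) Diffie-Hellman falls the same way, since Shor's algorithm also solves the discrete logarithm problem it rests on.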
