Impact of FPS on power consumption.

Malabooga, Member Uncommon, Posts: 2,977
edited January 2017 in Hardware
Since AMD introduced Radeon Chill, I tested it to see the results, and I can say it's quite ingenious, as it exploits the fact that lower FPS equals lower power consumption for the whole system.

It works on the premise that if there's no player input, there's no need for high frame rates, which is an opportunity to lower power consumption. The algorithm even goes further and looks at what kind of player input is present to match the frame rate accordingly. While you're only using the keyboard, it might display just 60 FPS, assuming you're only running around and not doing much else; once you start using the mouse as well, it can go to the maximum attainable FPS, since you're now fighting and want the best performance. (That's just an example of the principle; it works on a game-by-game basis and there's only a list of supported games for now. Of course, they're focusing on the most popular games first.)
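
To make the principle concrete, here's a toy sketch in Python. This is not AMD's actual algorithm; the cap values and the idle timeout are made-up numbers for illustration only.

import time

# Toy illustration of an input-aware frame cap, NOT AMD's actual Radeon Chill
# logic. The cap values and the idle timeout below are invented for the example.
IDLE_FPS = 40        # cap while no input is coming in
KEYBOARD_FPS = 60    # cap while only the keyboard is in use
MAX_FPS = 144        # "max attainable FPS" once the mouse is also active
IDLE_AFTER_S = 0.5   # seconds without input before the player counts as idle

def choose_fps_cap(last_key_time: float, last_mouse_time: float) -> int:
    """Pick a frame-rate cap from the most recent keyboard/mouse activity."""
    now = time.monotonic()
    if now - last_mouse_time < IDLE_AFTER_S:
        return MAX_FPS        # actively aiming/fighting: don't throttle
    if now - last_key_time < IDLE_AFTER_S:
        return KEYBOARD_FPS   # just running around: moderate cap
    return IDLE_FPS           # no input at all: drop to the floor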

So I put it to the test in The Witcher 3. With a very small delay after I stopped using the mouse/keyboard, FPS dropped to 40 (this can be configured on a game-by-game basis if for some reason you find 40 FPS too low while you're not doing anything). The moment I started using the mouse or keyboard, FPS went back up to the maximum (The Witcher 3 on max settings doesn't really go much beyond 60 FPS, especially in the area I'm testing in, which is graphically the most demanding).

The effect on power consumption is quite drastic. Settings on the GPU were 1280 MHz @ 1.055 V:

input -> ~62 FPS -> power ~125 W
no input -> 40 FPS -> power ~85 W
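
As a rough sanity check on those two readings (same clocks and voltage in both cases), the implied watts-per-frame cost is almost identical at both frame rates, which is what you'd expect if GPU power tracks the amount of rendering work:

# Rough watts-per-FPS implied by the two GPU-only readings above.
active_fps, active_w = 62, 125
idle_fps, idle_w = 40, 85

print(f"~{active_w / active_fps:.1f} W per FPS with input")      # ~2.0 W
print(f"~{idle_w / idle_fps:.1f} W per FPS without input")       # ~2.1 W
print(f"Cutting the cap to 40 FPS saves {active_w - idle_w} W on the GPU alone")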

This isn't only about that particular AMD feature, but about the general impact of FPS on power consumption (the feature just made it very easy to test that, while also testing the feature itself, how well it works, and whether it has any effect at all).

Regardless of MHz/voltage, the more work the GPU has to do, the more power it uses (and the primary function of a GPU is to send images X times per second to your monitor). Not only that, your CPU works on the same principle: the CPU is the "brain" that has to tell your GPU what to do, so a lower frame rate means lower CPU power consumption as well, since the CPU also has less work to do (preparing 100 images vs. preparing 40 images, for example). Also, the test above covers ONLY the GPU's power reduction; the CPU saves some power on top of that, but I can't really measure it, nor is it anywhere close to the GPU's reduction. Still, it's worth noting there is some.

So limiting your FPS can have a big impact on power consumption. Using VSync was the first solution, but it's not the most elegant one: it syncs your FPS to your monitor's refresh rate, and as long as your FPS is above that refresh rate it works great, but if your FPS falls below it, you get FPS jumps (it will halve your FPS, so 60 Hz monitor -> 30 FPS) and stutter in the process of "syncing" to the new FPS. The next moment you can jump back above the refresh rate (again 60 Hz, for the sake of argument) and go through the same process in reverse, 30 -> 60 FPS.
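
Here's an idealized model of that halving behaviour, for classic double-buffered VSync only (it ignores triple buffering and the transition stutter described above):

import math

# Idealized double-buffered VSync: a finished frame is only shown on a refresh
# boundary, so the effective rate snaps to refresh_hz / n for a whole number n.
def vsynced_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    if render_fps >= refresh_hz:
        return refresh_hz
    refreshes_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / refreshes_per_frame

print(vsynced_fps(75))   # 60.0 -> capped at the refresh rate
print(vsynced_fps(55))   # 30.0 -> falls straight to half
print(vsynced_fps(28))   # 20.0 -> then to a third, and so on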

Next in line are frame-rate limiters. They can be found in drivers, in programs like RivaTuner Statistics Server (also used for the OSD; it's part of the MSI Afterburner package but can be used standalone), or nowadays in the games' own settings (as it's a very useful feature). A limiter caps your FPS at your desired value, but if FPS drops below the monitor's refresh rate you'll have to deal with standard screen tearing, because your FPS and the monitor's refresh rate are out of sync. A frame limiter works best when set to your monitor's refresh rate, so 60 Hz monitor -> 60 FPS limit. It works especially well if you have a Freesync/GSync monitor, as those monitors sync their refresh rate to your FPS within their respective ranges, and it's as smooth as it gets, since it's the monitor seamlessly syncing to your FPS rather than the other way around as with VSync.
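
Stripped down, a frame limiter boils down to something like this (a simplified sketch; real limiters such as RTSS pace frames more precisely than a plain sleep):

import time

# Simplified frame-rate limiter: if a frame finishes under budget, wait out the
# remainder. During that wait the GPU and CPU sit idle, which is where the
# power saving comes from.
def run_capped(render_frame, fps_cap: float = 60.0, frames: int = 120) -> None:
    budget = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                                  # the game's work for this frame
        remaining = budget - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)

# Example: "rendering" that would otherwise run at ~1000 FPS gets held to 60.
run_capped(lambda: time.sleep(0.001))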


So there you have it: pushing hundreds of FPS does nothing but increase your power consumption, for no reason at all. If you have a 60 Hz monitor, it will only ever be able to display 60 FPS; all the frames beyond that are lost and redundant, but your GPU will still process them and waste power just for you never to see them. And from using higher-refresh monitors I can say it's definitely a pretty worthless "feature"; a 144 Hz monitor is more of a gimmick than anything else AND introduces more headaches, as 60 FPS on a 60 Hz monitor looks better than 90 FPS on a 144 Hz monitor (unless you have Freesync/GSync, in which case it looks pretty much identical, almost no difference lol), and you'll have to push FPS and reduce image-quality settings to get there for very little to no improvement in experience (even a decreased experience because of the lowered image-quality settings), and in addition that will increase your system's power consumption.



*CPUs and GPUs don't work the way some people think, as a 0/1 affair where there's only minimum power consumption at idle or maximum power consumption under load; power consumption is directly tied to the amount of work they are doing. Most tests found online only state max power consumption in a worst-case scenario, which rarely if ever happens in everyday use. Even worse are those that only note peak power consumption, as that is completely worthless for determining anything: GPUs especially can have large peaks when they change power states, for very short durations measured in milliseconds, that are irrelevant otherwise (some GTX 950/960 cards had peaks up to 400 W while their average power consumption under full load is ~120 W, as an extreme example).
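
To see why millisecond peaks barely move the average, fold an assumed amount of spiking into it (the 5 ms per second figure below is purely illustrative; the 120 W / 400 W numbers are the 950/960 example above):

# Why millisecond-scale spikes barely change the average power draw.
avg_w, peak_w = 120, 400     # the GTX 950/960 example above
spike_ms_per_second = 5      # assumed amount of spiking per second (illustrative)

spike_fraction = spike_ms_per_second / 1000
effective_avg = avg_w * (1 - spike_fraction) + peak_w * spike_fraction
print(f"Average with spikes folded in: {effective_avg:.1f} W")   # ~121.4 W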

Comments

  • Phry, Member Legendary, Posts: 11,004
    While the amount of CPU a game uses is rarely 100%, when it comes to gaming, unless you are playing something that doesn't need many GPU resources (older games spring to mind here), chances are you are using as much of your GPU's resources as is available, unless you deliberately downscale/reduce resolution in order to reduce power/heat generation.
    Though the point about having higher than 60 fps on a 60 Hz monitor is valid, there is nothing 'gimmicky' about 144 Hz monitors; for gaming they are pretty awesome, even though the downside is that you are then also restricted to 1080p resolutions. Comparing 90 fps on a 144 Hz monitor to 60 fps on a 60 Hz monitor is purely a matter of personal opinion: those who can, notice the difference, and honestly I don't know why some people can't tell the difference. Though if there are any 144 Hz monitors out there that don't have either GSync or Freesync, I would be very surprised.
    When it comes to FPS, more is better, within the limits of the monitor in use and the capability of the GPU installed, of course. Frankly, not all GPUs can support over 60 fps; some even struggle to get 60 fps in the first place without having to reduce graphics resolution/features. But if you have a system and a monitor that can support it, then 90 fps+ does indeed look better, at least in my opinion, and I suspect in the opinion of others who have such systems. :o
  • Malabooga, Member Uncommon, Posts: 2,977
    edited January 2017
    It's a gimmick from an industry that wants to convince you to buy more expensive products they have higher margins on, but it has no foothold in reality. Same with 4K. Same with "super ultra", "hyper", "nightmare" and whatever other game settings (those settings are generally there to take screenshots (stills), not really to play on, as they have huge performance impacts for very little image-quality return that you can only notice if you stick your nose into the monitor and go over every smallest detail. I don't know about you, but I don't play games like that lol).

    With these things there are severe diminishing returns, and past a point, no returns whatsoever. In street terms, you're paying a large premium for nothing but epeen.

    Of course they want you to buy a $700 GPU with a $1500 monitor, but you only marginally improve your experience over buying a $200 GPU and a $180 monitor. That's the reality of it lol

    And Freesync and GSync are there exactly to make gameplay smooth below those 60 FPS. The magic "60 FPS" was only there because the monitor standard is 60 Hz and the only way to get smooth gameplay was to have those 60 FPS (otherwise you either play at 30 FPS with VSync or get screen tearing without VSync). That "need" completely disappeared with Freesync/GSync, and you'll get the same smooth gameplay within their respective VRR range as you would with a 60 Hz monitor at 60 FPS.
  • filmoret, Member Epic, Posts: 4,906
    Yes, the GPU doesn't always draw 100% power. It draws what it needs. As for your theory of imaging: the amount of detail a game has per frame makes a huge difference when you are looking at 60 fps vs 120 fps. Take a flowing cape, for example: thousands of ripples in the cape, all flowing in the wind and moving constantly. You cannot fully capture that at 60 fps. You just can't.

    The reason you don't notice it as much is that most games don't have animations that detailed. A full animation is just a bunch of frame captures. You can have a running animation that is only 60 frames for one step, or you can make it over 200 frames for that one-step animation. If the software doesn't have the detail, then running at 120 fps doesn't help.
    Are you onto something or just on something?
  • holdenhamlet, Member Epic, Posts: 3,772
    edited January 2017
    144 Hz is pretty sweet. The only problem is that games limited to 60 fps look worse now after being used to more (like Dark Souls 3).

    Freesync is free. There's no extra cost associated with getting a 144 Hz Freesync monitor and an AMD GPU. GSync is ridiculous, though; you're right about that.

    You're also right that there's nothing magical about 60fps, but I don't see why that's a good argument for limiting your display to 60fps.

    You don't need it though if you're running a game at 144 fps, which is easily doable in a game like Overwatch.

    As for power consumption, I'm a good American so I don't really care about that.
  • Malabooga, Member Uncommon, Posts: 2,977
    edited January 2017
    144 Hz is sweet but not really useful lol. And some games even have their whole gameplay tied to FPS, and changes in FPS induce all sorts of problems (like Street Fighter or Fallout 3/4/Skyrim, which all had actual game speed tied to FPS, or Dark Souls, which has animations tied to FPS...).

    IMO, a 75 Hz monitor is as far as someone should go to get an actual benefit from it; above that the difference is so minuscule you're really paying for nothing. 60 FPS was the example for a 60 Hz monitor, as you'll get the smoothest gameplay out of it; 75 Hz, 75 FPS, and so on. It's just a fact that 60 Hz is the standard and the overwhelming majority of monitors out there are 60 Hz ones (with no variable refresh rate capability, aka Freesync/GSync). That's why "60 FPS" is still the magic number.

    You can run a game at more than 144 FPS even on a 60 Hz monitor; that's not really the issue. The issue is that it isn't all that noticeable or mind-blowing if you do it on a 144 Hz monitor instead.

    Well, for the past few years (since Nov 2014, to be more precise) it seems that somehow power consumption is the most important spec of a card (that's the date NVidia released Maxwell and became a bit more power efficient than AMD, btw ;) ). Up until then it wasn't important at all and didn't matter at all lol.

    But anyway, I gave two guides on how to reduce power consumption drastically; it also reduces heat and fan noise. If I keep my RX 480 at those tweaked 1280 MHz (for example) and set the frame limiter to 70 FPS (I have a 75 Hz Freesync monitor with a 40-75 Freesync range), my card is mostly passively cooled in a lot of games (with the occasional fan spin-up for 20-30 s at minimum RPM now and then) lol
  • filmoret, Member Epic, Posts: 4,906
    Again, you won't notice anything beyond a certain point because the 3D imaging doesn't account for 100% of the movement. It's like watching a movie that was only captured at 120 fps: you cannot increase that no matter what you do. Yes, you can run the movie at 400 fps, but since it was only captured at 120 you will never see a difference.
    Are you onto something or just on something?
  • 13lake, Member Uncommon, Posts: 719
    filmoret said:
    Again, you won't notice anything beyond a certain point because the 3D imaging doesn't account for 100% of the movement. It's like watching a movie that was only captured at 120 fps: you cannot increase that no matter what you do. Yes, you can run the movie at 400 fps, but since it was only captured at 120 you will never see a difference.
    motion blur, ... and overall latency of the monitor, ...
  • holdenhamlet, Member Epic, Posts: 3,772
    edited January 2017
    Malabooga said:
    144 Hz is sweet but not really useful lol. And some games even have their whole gameplay tied to FPS, and changes in FPS induce all sorts of problems (like Street Fighter or Fallout 3/4/Skyrim, which all had actual game speed tied to FPS, or Dark Souls, which has animations tied to FPS...).

    IMO, a 75 Hz monitor is as far as someone should go to get an actual benefit from it; above that the difference is so minuscule you're really paying for nothing. [...]
    Um, no, 144 Hz is very noticeable. The reason DS looks bad is that 60 fps seems almost choppy now that I'm used to 144 Hz. DS is an unusual situation because it's a console port.

    120 is probably enough but 75 is certainly not anywhere near maxing out the quality of fps.

    Talk to anyone with a 144 Hz monitor, including me, and they will tell you it's very noticeable.


  • Malabooga, Member Uncommon, Posts: 2,977
    edited January 2017
    I've used a 144 Hz monitor extensively, and most people I talked to agree with me; many are even returning to 60 Hz monitors because of all the additional headaches that 144 Hz causes for very little to no impact.
  • holdenhamlet, Member Epic, Posts: 3,772
    edited January 2017
    Malabooga said:
    I've used a 144 Hz monitor extensively, and most people I talked to agree with me; many are even returning to 60 Hz monitors because of all the additional headaches that 144 Hz causes for very little to no impact.

    Well, personally I'd never return to a 60 Hz monitor if I could at all help it. If you and your friends can't discern a difference between 60 and 144 actual fps, I'd suggest you all get your eyes checked.

    Strange you can tell a difference between 75 and 60 but not 144 and 60.
  • laserit, Member Legendary, Posts: 7,591
    FPS is not only about what you see; it's also about what you feel (responsiveness). This is totally noticeable in applications such as flight sims, where my 4K screen is showing me 30 frames but my system is processing 60 frames. The smoothness/responsiveness of the controls and the aircraft is night and day.

    "Be water my friend" - Bruce Lee

  • Malabooga, Member Uncommon, Posts: 2,977
    edited January 2017
    holdenhamlet said:
    Well, personally I'd never return to a 60 Hz monitor if I could at all help it. If you and your friends can't discern a difference between 60 and 144 actual fps, I'd suggest you all get your eyes checked.

    Strange you can tell a difference between 75 and 60 but not 144 and 60.
    1. There are a lot of games where you can't theoretically get 144 FPS even if you buy the most expensive hardware available, due to various bottlenecks, and many people plunged into 144 Hz without Freesync or GSync. And after they've spent a fortune for such marginal improvements, they say it's definitely not worth it. A smaller number claim it is worth it. My eyes are just fine.

    2. Where have I said I can see a difference between 60 and 75 FPS? When I bought my monitor I had an NVidia GPU, and without Freesync I could only use 60 Hz. And now I lock it to 70 FPS, not 75.

    You want to convince yourself, that's fine. For many people such marginal improvements don't remotely warrant all the hassle that goes along with it.
  • Shinami, Member Uncommon, Posts: 825
    Malabooga said:
    Since AMD introduced Radeon Chill, I tested it to see the results, and I can say it's quite ingenious, as it exploits the fact that lower FPS equals lower power consumption for the whole system. [...]

    The effect on power consumption is quite drastic. Settings on the GPU were 1280 MHz @ 1.055 V:

    input -> ~62 FPS -> power ~125 W
    no input -> 40 FPS -> power ~85 W

    [...]
    I know far too much about computers and power consumption. 
    I shall simulate a second grade math student in General Education in the U.S under the Common Core State Standard. 

    For now let's take the part of your post that matters: the numbers.

    Input = 125w
    No input = 85w

    So now let's put them to work.

    First we will start with 2nd Grade Common Core Subtraction 
    125 - 85 = 40 

    This 40 is what we call a Range.

    Now we make a new number line from 0 to 40.
    Now we can also use 2nd Grade Common Core Fractions in which a student learns and works with 1/2 to 1/6th. 

    We identify that 20 is one-half of 40 (notice I am not using percentages, just the words "one-half", because the concept of percentages is not introduced until the 5th or 6th grade).

    Now we move to the 2nd grade Common Core Math Standard Expectation of Mathematical Practices:

    The maximum power saved is 40 W. However, that is only when no input exists. Since half of 40 is 20, it means that if you are giving input half the time, the power saved is really 20 W.

    /* Simulating a 2nd-grade math student learning under the CCSS means I can't multiply up to 1000 yet! Crud! This means I can't tell you that 50 hours x 20 watts = 1 kilowatt-hour, and that you will save about 15 cents of electricity every 50 hours on average! */
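
    Spelled out without the role-play (the 15 cents assumes a rough average electricity rate, not a measured one):

    # Checking the arithmetic above; the electricity rate is an assumed average.
    saving_w = 20          # watts saved if input is absent half the time
    hours = 50
    price_per_kwh = 0.15   # assumed USD per kWh

    kwh = saving_w * hours / 1000
    print(f"{kwh} kWh saved -> about ${kwh * price_per_kwh:.2f} every {hours} hours")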
  • Malabooga, Member Uncommon, Posts: 2,977
    edited January 2017
    Now point to the part of my post that mentions how you'll save a fortune. I mean, you quoted my whole post, so it MUST be somewhere in there, RIGHT?

    Now I guess I would have to write something about the first four grades of primary school, where children learn something called reading comprehension.
  • filmoret, Member Epic, Posts: 4,906
    Malabooga said:
    Now point to the part of my post that mentions how you'll save a fortune. I mean, you quoted my whole post, so it MUST be somewhere in there, RIGHT?

    Now I guess I would have to write something about the first four grades of primary school, where children learn something called reading comprehension.
    Well, unless you are using those awesome AMD cards with which you can decrease power and get more fps. Everyone grab some of those AMD cards before Malabooga tells the world.
    Are you onto something or just on something?
  • holdenhamlet, Member Epic, Posts: 3,772
    edited January 2017
    Malabooga said:
    [...]

    1. There are a lot of games where you can't theoretically get 144 FPS even if you buy the most expensive hardware available, due to various bottlenecks, and many people plunged into 144 Hz without Freesync or GSync. And after they've spent a fortune for such marginal improvements, they say it's definitely not worth it. A smaller number claim it is worth it. My eyes are just fine.

    2. Where have I said I can see a difference between 60 and 75 FPS? When I bought my monitor I had an NVidia GPU, and without Freesync I could only use 60 Hz. And now I lock it to 70 FPS, not 75.

    You want to convince yourself, that's fine. For many people such marginal improvements don't remotely warrant all the hassle that goes along with it.

    Why do you lock it to 70 if 60 fps is a magic number?

    Displaying over double the fps is hardly a "marginal improvement".

    As for headaches: I don't know about when they first introduced 144 Hz, but I got mine recently and I've had zero problems using it.
  • Godeau, Member Uncommon, Posts: 83
    edited January 2017
    Off topic: there is actually a noticeable difference going from 60 to 120 Hz (that is, if your GPU can get to 120 fps in games in the first place); from 120 to 144 Hz, not so much, to be honest. From 60 to 120 Hz it's obvious even on the desktop: your mouse cursor moves far more smoothly than on a 60 Hz screen, from when I tested it on my Win7 PC years ago. It's hardly what I would call marginal.

    Back to topic: if I spend upwards of 2000 USD on a rig for maximum performance, why would I care about a couple of dollars of savings per month in a trade-off that limits my gameplay experience in terms of fps? If we talk about power saving alone, there are far worse culprits of electricity consumption and wastage than shaving 50-odd watts off gaming on your PC: namely, your air conditioner, an oven, your electric shower heater, or your electric kettle, to name a few. Just an hour of usage from one of those equals what would probably take 20 to 30 days of this gaming 'trade-off' to break even (rough numbers at the end of this post).

    So, back to the question: why would you sacrifice your gaming experience by limiting fps when you already paid so much for a rig? I know I wouldn't, because there are far better and more efficient ways to cut power consumption.
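
    Rough numbers behind that comparison (the appliance wattage is a ballpark assumption):

    # Ballpark comparison: one hour of air conditioning vs. the FPS-cap saving.
    gaming_saving_w = 50    # the ~50 W shaved off by capping FPS
    ac_w = 1500             # assumed draw of a typical air conditioner

    hours_to_match = (ac_w / 1000) / (gaming_saving_w / 1000)
    print(f"One hour of A/C ~= {hours_to_match:.0f} hours of capped gaming")   # ~30 hours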
  • Shinami, Member Uncommon, Posts: 825
    Malabooga said:
    Now point to the part of my post that mentions how you'll save a fortune. I mean, you quoted my whole post, so it MUST be somewhere in there, RIGHT?

    Now I guess I would have to write something about the first four grades of primary school, where children learn something called reading comprehension.
    I do not have to, simply because when you post that you have a power-saving feature, the idea is not whether it would save you a fortune; the idea is whether writing about the power-saving feature yields something significant.

    The question then becomes "Is this power saving feature significant enough to post about it?"

    You posted with enthusiasm at the fact that such a feature exists, which means you are subject to the scrutiny and interpretation that come with introducing the information itself. You were brave enough to present the information; now just stand by it.

    It's not about saving a fortune.
    It's about discussing the critical nature behind your finding...

    In this case it is the idea of sacrificing a nominal framerate in order to save an insignificant amount of power, which might get your character (or party) killed in an online game, but might be sound for an offline single-player game.

    The number line of 40 dealt a critical blow to the assertion: if we are sending input to our games 50% of the time, then perhaps 20 W is what would be saved in the long run. Economically that amounts to around 15-20 cents per kilowatt-hour, which in this case means around every 50 hours of gaming.

    The question is not "Would it save a fortune?" but rather "Is it worth it, or even significant?"

    I already save more power than most people, because my gaming platform runs on a power supply with 3% voltage regulation that runs at 88-90% efficiency under moderate loads. I also run profiles that underclock and undervolt my CPU and GPU outside of my main game. Why do I need 100% to play a game if max settings and performance can be achieved at 10% of my power? I don't like having excess power flowing around unused, thanks to the second law of thermodynamics.
  • Malabooga, Member Uncommon, Posts: 2,977
    Really? Not having 90 FPS but 60 FPS might get your character or party killed? Don't get me wrong, but you're writing complete nonsense lol. People played perfectly well even at 30 FPS in the good old days of completely unoptimized games. And 60 FPS IS the normal framerate for a 60 Hz monitor and will look and play smoother than 90 FPS on a 144 Hz monitor.

    Yes, you'll save some money, but it's insignificant. I NEVER mentioned it; you did, and I completely agree with you on that one, it's insignificant. In fact I've said as much on many occasions: when people claimed that card A uses X fewer watts and "huge" savings are involved, I said that's nonsense.

    And, for instance, people will spend a fortune to get a few percent more "efficient" PSU, which will have MUCH less effect on power savings than just limiting your FPS; in fact you yourself brag about it. (Any PSU above $40 today will have at least the Bronze standard, close to Silver, while Silver and Gold PSUs often cost twice as much; Platinum... Titanium... the price goes up exponentially.) And I've said it's much better to pay attention to the PSU efficiency curve and buy a PSU that puts you inside its most efficient area than to buy a monster PSU that puts you completely outside of it.
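
    To put some numbers on it (the load and the mid-load efficiencies below are assumed, ballpark 80 Plus figures, not measurements):

    # PSU efficiency tiers vs. simply capping FPS, with assumed ballpark numbers.
    dc_load_w = 250                    # what the components actually draw
    bronze_eff, gold_eff = 0.85, 0.90  # rough mid-load efficiencies

    wall_bronze = dc_load_w / bronze_eff   # ~294 W at the wall
    wall_gold = dc_load_w / gold_eff       # ~278 W at the wall
    print(f"Bronze vs Gold: {wall_bronze:.0f} W vs {wall_gold:.0f} W "
          f"(~{wall_bronze - wall_gold:.0f} W difference)")
    # Compare that ~16 W to the ~40 W the 40 FPS cap alone saved in the test above.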
  • Malabooga, Member Uncommon, Posts: 2,977
    edited January 2017
    Godeau said:
    [...] Back to topic: if I spend upwards of 2000 USD on a rig for maximum performance, why would I care about a couple of dollars of savings per month in a trade-off that limits my gameplay experience in terms of fps? [...]

    So, back to the question: why would you sacrifice your gaming experience by limiting fps when you already paid so much for a rig? I know I wouldn't, because there are far better and more efficient ways to cut power consumption.
    Why are you discussing saving money? That's a red herring introduced by Shinami.

    You are NOT sacrificing gaming experience, and in fact many people are going back to regular 60 Hz monitors because there's very little to no gain in gaming experience (or anything else).
  • holdenhamlet, Member Epic, Posts: 3,772
    edited January 2017
    90 fps on my 144 Hz monitor looks exactly how I would expect 90 fps to look with freesync enabled on my 480, and it looks exactly like you'd imagine 30 more fps would look over 60 fps (i.e. much smoother).

    144 fps in Overwatch (or any game; Overwatch is just a popular game that is very easy to get 144 fps in; my last card, a 950, could do it on the lowest settings) looks immensely better than 60 fps, and you don't need freesync or gsync for that.

    I find myself going with lower settings in games because higher fps makes games look better than high settings (not because of your theory, but simply because more fps makes games look better).

    I'm not sure who these "many people" are that are ditching 144hz for 60hz, but they aren't here and they aren't anywhere else on the internet that I've seen.

    I think they may be in your head.

    I believe you just refuse to go AMD and refuse to get gsync, so you're building a theory around 60 fps monitors being ideal (which can be argued, but only if you refuse to go AMD and refuse to get gsync; still, games like Overwatch that are easy to cap out look much better, so it's not even a universal thing).

    A decent argument for 60 fps monitors is that the panels usually display colors better, which is better for things like movies or browsing the internet, but that's not an argument you're making.

    As for extra fps having no effect on player performance, that's not true. My aim markedly improved in Overwatch when I switched. Aim is mostly a thing you get good at with practice, but as a pro player pointed out, a 144 Hz monitor is the one piece of equipment you can buy that actually does give you a little boost, and it's easy to understand why.
  • Malabooga, Member Uncommon, Posts: 2,977
    Sorry, but a lot of games cannot even theoretically go to 144 FPS, and 90 FPS looks pretty much identical to 60 FPS.

    I think they may be in your head.

    "I find myself going with lower settings in games because higher fps makes games look better than high settings"


    "but as a pro player pointed out"


    By no means; if you're a pro player and your sponsor offers you such a monitor, it sure won't hurt.


    I think you need to stop trolling now lol



  • holdenhamlet, Member Epic, Posts: 3,772
    "and 90 FPS looks pretty much identical to 60 FPS."

    You're high.

    I honestly doubt any pro fps player plays on a 60 fps monitor.
  • holdenhamlet, Member Epic, Posts: 3,772
    edited January 2017
    At a certain point, it's hard for the eye to discern further increases in fps. I assure you that point doesn't occur at 60 fps, though.
  • Malabooga, Member Uncommon, Posts: 2,977
    edited January 2017
    "and 90 FPS looks pretty much identical to 60 FPS."

    You're high.

    I honestly doubt any pro fps player plays on a 60 fps monitor.
    Of course; they all get sponsored for equipment and serve as marketing for said equipment lol