Influence of frequency vs voltage on power consumption

24 Comments

  • Malabooga Member Uncommon Posts: 2,977
    edited January 2017
    filmoret said:
    Malabooga said:
    27w? You can't even do basic math rofl

    I won't educate you (as we have all learned that it's completely pointless to even try), and I've already told you not to embarrass yourself, but you insist on doing so. Well, it's hilarious anyway, so we can all get a good laugh at least lol
    Dude, what is your problem? You went from using 157w down to 125w. That is 27w less power you used.
    This is the first thing I ever wanted to put in my sig in my life LOL
  • Ridelynn Member Epic Posts: 7,383
    edited January 2017
    Malabooga said:
    filmoret said:
    Malabooga said:
    27w? You can't even do basic math rofl

    I won't educate you (as we have all learned that it's completely pointless to even try), and I've already told you not to embarrass yourself, but you insist on doing so. Well, it's hilarious anyway, so we can all get a good laugh at least lol
    Dude, what is your problem? You went from using 157w down to 125w. That is 27w less power you used.
    This is the first thing I ever wanted to put in my sig in my life LOL
    /shrug, it was close enough for me, but we've already proven that I don't care about "embarrassing" myself, or using hyperbole, or knowing math, or what have you.

    I guess for a "real engineer" such as yourself, or a real mathematician, you guys need all the precision you can get. I'm sure you're not compensating for anything.

    The guy didn't break out his calculator, but he was in the ballpark. He had a lot of things you could have keyed in on to discredit him (most of which were at least somewhat technical in nature, and which a lot of people probably don't know), but drilling down on this one irrelevant mistake that didn't even affect the crux of the discussion - really?
  • Malabooga Member Uncommon Posts: 2,977
    edited January 2017
    Yeah, I noticed a few of you have no problem writing complete nonsense, and that's fine, it proves something lol

    And you want me to explain the intricacies of semiconductors to a guy who can't do 157-125 and INSISTS it's 27. Sorry, this is not a differential equation, but 1st grade math, and, contrary to your belief, this "advanced" knowledge of mathematics is definitely not limited to "real engineers" or "mathematicians" but extends to everyone who successfully passed the 1st grade of primary school.

    And this was not supposed to be an in-depth analysis of semiconductors (since the vast majority simply don't care) but a simple guide for AMD users who are willing to tinker a bit and get a quite measurable and noticeable impact, using values/tools that a layman user has at his disposal and procedures he can follow without any underlying technical knowledge.
  • Quizzical Member Legendary Posts: 25,355
    There is a difference between a mistake that you correct when it is pointed out and doubling down by insisting that an obvious arithmetic error is, in fact, correct.
  • filmoret Member Epic Posts: 4,906
    Ridelynn said:
    Malabooga said:
    filmoret said:
    Malabooga said:
    27w? You can't even do basic math rofl

    I won't educate you (as we have all learned that it's completely pointless to even try), and I've already told you not to embarrass yourself, but you insist on doing so. Well, it's hilarious anyway, so we can all get a good laugh at least lol
    Dude, what is your problem? You went from using 157w down to 125w. That is 27w less power you used.
    This is the first thing I ever wanted to put in my sig in my life LOL
    /shrug, it was close enough for me, but we've already proven that I don't care about "embarrassing" myself, or using hyperbole, or knowing math, or what have you.

    I guess for a "real engineer" such as yourself, or a real mathematician, you guys need all the precision you can get. I'm sure you're not compensating for anything.

    The guy didn't break out his calculator, but he was in the ballpark. He had a lot of things you could have keyed in on to discredit him (most of which were at least somewhat technical in nature, and which a lot of people probably don't know), but drilling down on this one irrelevant mistake that didn't even affect the crux of the discussion - really?
    lol I just noticed that it was incorrect. Bleh, it happens sometimes. But you are also ignoring one major fact that makes you look equally as dumb. The card only uses a certain amount of amps. It limits itself on how many amps it can use. That is why when you increase the volts the watts also increase: because the amps are not going to break a certain barrier set by the manufacturers.

    Notice I said in this situation, not all situations. You decrease the volts and the watts decrease. In this situation you cannot get an increase in watts unless you somehow get into the driver itself and make it use more amps, and it's just not going to do that without special coding. And it's possible the hardware itself limits the amps. Volts are directly related to watts in this situation because the amps aren't going to change that much.
    Are you onto something or just on something?
  • filmoret Member Epic Posts: 4,906
    Ridelynn said:
    filmoret said:

    Watts = amps x volts

    You decrease the volts and you decrease the watts. Watts is the amount of power used. Volts limit how much power can be used in this situation. He is changing the volts, not the amps, in this equation.
    That is true, but Amps = Volts / Resistance. So you aren't just changing one variable when you change voltage.

    It gets a lot more complicated in semiconductor math though, because although it's a DC voltage, there are a lot more variables than just Ohm's and Kirchhoff's laws. There's an alternating component because of the clock frequency, and a capacitive component because of the nature of semiconductors.

    In general in semiconductors, power is directly proportional to frequency, and proportional to the square of voltage, with a constant based on the specifics of the process node.

    http://www.ti.com/lit/an/scaa035b/scaa035b.pdf

    The speed of the semiconductor won't change based on voltage alone, unless you change the clock frequency. Adjusting the voltage downward will result in lower power consumption (which is measurable both via a power monitor and via heat production), provided there is at least enough voltage there to make the physics work.

    So when you say that all the power that's missing (34 or 27 or whatever Watts) has to slow down the GPU, that's not true, because the clock didn't necessarily change. You will see a lower amount of heat production, and that's where all that missing power has gone. You have to have enough voltage to make the physics work, but beyond that, you're just cranking up the thermostat on your silicon heating elements.
    I didn't say it slowed down the GPU chip itself. But something took a hit and is running slower. The only real way to know is if he actually tested it, looking at his fps and fan speed. Something isn't getting the power it was using. So yeah, the GPU chip is probably running at 100%, but the cooling system or the RAM has taken a loss. If we were talking about a CPU then I would agree with his findings. But we are talking about an entire graphics card, which includes more items using power than just the GPU chip itself.
    Are you onto something or just on something?
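For anyone who wants to sanity-check the quoted claim that power scales linearly with frequency and with the square of voltage, here is a minimal sketch of that first-order dynamic-power model. All of the constants (effective capacitance, voltages, clock) are made-up illustrative values, not measurements from any real card, and the model covers only the chip's switching power, not the VRM, RAM, or fan.

```python
# First-order CMOS dynamic power model: P ~ C_eff * V^2 * f
# (the relationship described in the TI app note linked above).
# All constants below are illustrative placeholders.

def dynamic_power(c_eff, volts, freq_hz):
    """Approximate switching power in watts."""
    return c_eff * volts ** 2 * freq_hz

C_EFF = 1.0e-7      # assumed effective switched capacitance (farads)
FREQ = 1.2e9        # clock frequency in Hz, held constant

stock_v = 1.15      # assumed stock voltage
under_v = 1.02      # assumed undervolted setting

p_stock = dynamic_power(C_EFF, stock_v, FREQ)
p_under = dynamic_power(C_EFF, under_v, FREQ)

print(f"stock:     {p_stock:.1f} W")
print(f"undervolt: {p_under:.1f} W")
print(f"savings:   {p_stock - p_under:.1f} W, clock (and performance) unchanged")
```

With the capacitance constant picked so the stock case lands near the 157w figure argued about above, a ~0.13 V undervolt at the same clock drops the model to roughly 125w, which is the point Ridelynn and Quizzical are making: the saved power shows up as less heat, not as a lower clock.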
  • craftseeker Member Rare Posts: 1,740
    @filmoret You are sure giving me a lot of laughs with your posts on this topic. Keep up the good work! 
  • filmoret Member Epic Posts: 4,906
    @filmoret You are sure giving me a lot of laughs with your posts on this topic. Keep up the good work! 
    Same to yourself.  Anyone can look smart when they never speak.
    Are you onto something or just on something?
  • Quizzical Member Legendary Posts: 25,355
    filmoret said:
    Ridelynn said:
    filmoret said:
    Watts = amps x volts

    You decrease the volts and you decrease the watts. Watts is the amount of power used. Volts limit how much power can be used in this situation. He is changing the volts, not the amps, in this equation.
    That is true, but Amps = Volts / Resistance. So you aren't just changing one variable when you change voltage.

    It gets a lot more complicated in semiconductor math though, because although it's a DC voltage, there are a lot more variables than just Ohm's and Kirchhoff's laws. There's an alternating component because of the clock frequency, and a capacitive component because of the nature of semiconductors.

    In general in semiconductors, power is directly proportional to frequency, and proportional to the square of voltage, with a constant based on the specifics of the process node.

    http://www.ti.com/lit/an/scaa035b/scaa035b.pdf

    The speed of the semiconductor won't change based on voltage alone, unless you change the clock frequency. Adjusting the voltage downward will result in lower power consumption (which is measurable both via a power monitor and via heat production), provided there is at least enough voltage there to make the physics work.

    So when you say that all the power that's missing (34 or 27 or whatever Watts) has to slow down the GPU, that's not true, because the clock didn't necessarily change. You will see a lower amount of heat production, and that's where all that missing power has gone. You have to have enough voltage to make the physics work, but beyond that, you're just cranking up the thermostat on your silicon heating elements.
    I didn't say it slowed down the GPU chip itself. But something took a hit and is running slower. The only real way to know is if he actually tested it, looking at his fps and fan speed. Something isn't getting the power it was using. So yeah, the GPU chip is probably running at 100%, but the cooling system or the RAM has taken a loss. If we were talking about a CPU then I would agree with his findings. But we are talking about an entire graphics card, which includes more items using power than just the GPU chip itself.
    I even explained to you exactly what it is that slowed down and why in my pair of consecutive long posts.  Try reading it.
  • filmoret Member Epic Posts: 4,906
    Ridelynn said:
    filmoret said:

    Watts = amps x volts

    You decrease the volts and you decrease the watts. Watts is the amount of power used. Volts limit how much power can be used in this situation. He is changing the volts, not the amps, in this equation.
    That is true, but Amps = Volts / Resistance. So you aren't just changing one variable when you change voltage.

    It gets a lot more complicated in semiconductor math though, because although it's a DC voltage, there are a lot more variables than just Ohm's and Kirchhoff's laws. There's an alternating component because of the clock frequency, and a capacitive component because of the nature of semiconductors.

    In general in semiconductors, power is directly proportional to frequency, and proportional to the square of voltage, with a constant based on the specifics of the process node.

    http://www.ti.com/lit/an/scaa035b/scaa035b.pdf

    The speed of the semiconductor won't change based on voltage alone, unless you change the clock frequency. Adjusting the voltage downward will result in lower power consumption (which is measurable both via a power monitor and via heat production), provided there is at least enough voltage there to make the physics work.

    So when you say that all the power that's missing (34 or 27 or whatever Watts) has to slow down the GPU, that's not true, because the clock didn't necessarily change. You will see a lower amount of heat production, and that's where all that missing power has gone. You have to have enough voltage to make the physics work, but beyond that, you're just cranking up the thermostat on your silicon heating elements.
    So we are assuming that it drew extra power that it didn't need and just fizzled it away as heat. That is possible, but without a proper analysis we cannot simply assume that is what happened based on the fact that the chip is using less power. Remember, we aren't talking about a CPU. Graphics cards use power for various things, not just the chip itself.
    Are you onto something or just on something?
  • filmoret Member Epic Posts: 4,906
    You realize also that you guys are saying that AMD didn't properly set up their flagship graphics card. That's really a shame.
    Are you onto something or just on something?
  • Quizzical Member Legendary Posts: 25,355
    filmoret said:
    You realize also that you guys are saying that AMD didn't properly set up their flagship graphics card. That's really a shame.
    Have you considered reading the replies to a thread before creating more of them?  Or at least getting around to reading what others have said before you make your 12th reply to the same thread?  I think it would help.  I've already explained in great detail why that is wrong, but you're ignoring the explanations and just reasserting random nonsense.
  • filmoret Member Epic Posts: 4,906
    Quizzical said:
    filmoret said:
    You realize also that you guys are saying that AMD didn't properly set up their flagship graphics card. That's really a shame.
    Have you considered reading the replies to a thread before creating more of them?  Or at least getting around to reading what others have said before you make your 12th reply to the same thread?  I think it would help.  I've already explained in great detail why that is wrong, but you're ignoring the explanations and just reasserting random nonsense.
    It's like I say something and it's completely ignored. So AMD releases a card and places it somewhere around 150w power usage. They decide for some strange reason not to test this, as the very brilliant malabooga has demonstrated that all they needed to do was decrease the volts a little and voila, their cards are now running at 125w and not losing any processing power whatsoever.

    You realize that the 32w decrease is a huge thing for such a card? This kind of mistake, especially on a flagship card, is just outright ridiculous. Again you are mistaking this for a CPU, which has different variables. The card runs on a driver which tells it what to do. The driver is made by AMD, which controls everything. Yet it's not optimizing the flagship because they just set it to run everything? Again, this isn't a CPU.
    Are you onto something or just on something?
  • craftseeker Member Rare Posts: 1,740
    filmoret said:
    @filmoret You are sure giving me a lot of laughs with your posts on this topic. Keep up the good work! 
    Same to yourself.  Anyone can look smart when they never speak.
    Better than opening your mouth and demonstrating your lack of knowledge and understanding.

    Besides, you already have two people giving you accurate explanations of how you are wrong.
  • filmoret Member Epic Posts: 4,906
    What is being described really shows a severe lack of quality on AMD's part. How hard is it to include in the GPU driver something that tests the voltage of the card so it can run at optimal performance? I mean, you guys are saying every single card is different, so it's not really that hard to set up cards even if their parameters are slightly different. It's clear you guys are missing something here, because I'm not about to believe that AMD screwed up so badly that they can't write a simple program to optimize their cards.

    There is some kind of misunderstanding, or AMD just has poor quality drivers. As malabooga clearly pointed out, the Nvidia cards do not have this problem. Not because of some strange option that AMD has given us, but because Nvidia cards are already optimized, and doing something like decreasing voltage isn't making them better. Once you reach that sweet spot it will only hurt, and because it only hurts the Nvidia cards, it's fairly obvious that the Nvidia cards are optimized and the AMD ones are not.
    Are you onto something or just on something?
  • Malabooga Member Uncommon Posts: 2,977
    Quit the trolling. Although it's funny as hell, stop the spam lol
  • Quizzical Member Legendary Posts: 25,355
    filmoret said:
    Quizzical said:
    filmoret said:
    You realize also that you guys are saying that AMD didn't properly set up their flagship graphics card. That's really a shame.
    Have you considered reading the replies to a thread before creating more of them?  Or at least getting around to reading what others have said before you make your 12th reply to the same thread?  I think it would help.  I've already explained in great detail why that is wrong, but you're ignoring the explanations and just reasserting random nonsense.
    It's like I say something and it's completely ignored. So AMD releases a card and places it somewhere around 150w power usage. They decide for some strange reason not to test this, as the very brilliant malabooga has demonstrated that all they needed to do was decrease the volts a little and voila, their cards are now running at 125w and not losing any processing power whatsoever.

    You realize that the 32w decrease is a huge thing for such a card? This kind of mistake, especially on a flagship card, is just outright ridiculous. Again you are mistaking this for a CPU, which has different variables. The card runs on a driver which tells it what to do. The driver is made by AMD, which controls everything. Yet it's not optimizing the flagship because they just set it to run everything? Again, this isn't a CPU.
    If you can find a way to magically determine the hottest ambient temperature at which a given chip will ever be run--which varies wildly from one chip to the next, even for nominally identical chips cut out of the very same wafer--then I can assure you that AMD would be very interested in such a thing.  As are Nvidia, Intel, and a whole lot of other companies, as they all have the same problem.
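A rough way to picture Quizzical's point: the vendor has to ship one default voltage that works for every die, at the worst temperature any of them might ever see, so the factory value ends up set by the worst sample plus a guard band, and a lucky individual card is left with undervolting headroom. The sketch below is purely illustrative; the per-die numbers and the size of the guard band are invented.

```python
import random

random.seed(0)

# Hypothetical minimum stable voltages (V) for 20 nominally identical dies
# at worst-case temperature. Invented numbers, purely for illustration.
min_stable = [round(random.uniform(0.95, 1.10), 3) for _ in range(20)]

GUARD = 0.05  # assumed margin for aging, hot cases, marginal PSUs

default_v = max(min_stable) + GUARD   # one value that must work on every die
lucky_v = min(min_stable) + GUARD     # what the luckiest die could get away with

print(f"factory default voltage:   {default_v:.3f} V")
print(f"luckiest die could run at: {lucky_v:.3f} V")
print(f"undervolting headroom:     {(default_v - lucky_v) * 1000:.0f} mV")
```

Because dynamic power scales with the square of voltage, even a few tens of millivolts of headroom on a lucky card translates into the kind of wattage drop reported in this thread, while an unlucky card tuned the same way would simply crash.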
  • Quizzical Member Legendary Posts: 25,355
    filmoret said:
    @filmoret You are sure giving me a lot of laughs with your posts on this topic. Keep up the good work! 
    Same to yourself.  Anyone can look smart when they never speak.
    Have you ever heard the saying, "Better to keep quiet and be thought a fool than to open your mouth and remove all doubt?"

    But even that is too cynical, really.

    "Even fools are thought wise if they keep silent, and discerning if they hold their tongues." -Proverbs 17:28 (NIV)
  • Ridelynn Member Epic Posts: 7,383
    filmoret said:

    lol I just noticed that it was incorrect. Bleh, it happens sometimes. But you are also ignoring one major fact that makes you look equally as dumb. The card only uses a certain amount of amps. It limits itself on how many amps it can use. That is why when you increase the volts the watts also increase: because the amps are not going to break a certain barrier set by the manufacturers.
    Ok. I tried sticking up for you out there because I'm a sucker for the underdog. But you're a big boy and can swim just fine all on your own.
  • 13lake Member Uncommon Posts: 719
    edited January 2017
    What makes me think he is running damage control, or just trolling, is how after every proper explanation he responds with: "Is this really how AMD would do it? Did they really do it? If they did, well, that is..."

    Those kinds of answers would imply that he acknowledges it is possible that he is wrong, but he tries to prove that he in fact isn't wrong by using basic logical fallacies ("There's no way I'm wrong, because then AMD would be wrong; there's no way AMD is wrong, so you guys are wrong").

    And the kicker is that it's basic physics, thermodynamics and the specifics of the manufacturing process that are the issue, which is out of anyone's hands atm.

    Of course it's also possible he is only in it to win the argument for the sake of argument, with complete disregard for any specifics of the actual argument.
  • Malabooga Member Uncommon Posts: 2,977
    edited January 2017
    "157-125=27 and all of you are wrong"

    Thats pretty much where discussion with filmoret starts....and ends lol

    so try to refrain yourself from feeding the troll
  • SomethingUnusual Member Uncommon Posts: 546
    Ridelynn said:
    filmoret said:

    lol I just noticed that it was incorrect. Bleh, it happens sometimes. But you are also ignoring one major fact that makes you look equally as dumb. The card only uses a certain amount of amps. It limits itself on how many amps it can use. That is why when you increase the volts the watts also increase: because the amps are not going to break a certain barrier set by the manufacturers.
    Ok. I tried sticking up for you out there because I'm a sucker for the underdog. But you're a big boy and can swim just fine all on your own.
    It wouldn't be possible (Warning: Physics incoming) for a device to lock and regulate current draw without regulating voltage and resistance at the exact same time. I explained in very minor detail earlier in this very thread how zener diode configurations are used as voltage regulators. This is what I'm assuming is occurring in the device itself. Lowering or increasing the voltage at the device input would therefore be irrelevant, as the regulator circuit will simply "fix" what you changed so the device functions as intended. Thus you've done nothing for power consumption or performance.

    An example: http://www.electronics-tutorials.ws/diode/diode_7.html
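For reference, the arithmetic behind the zener shunt regulator in that link: the series resistor drops the difference between the supply and the zener voltage, and the zener shunts whatever current the load does not take. The values below are generic textbook numbers, not anything measured from a graphics card (and, as noted in the replies that follow, a GPU's core VRM is a switching regulator rather than a zener shunt).

```python
# Simple zener shunt regulator arithmetic, as in the linked tutorial.
# Generic textbook values; nothing here describes an actual GPU VRM.

v_supply = 12.0    # unregulated input (V)
v_zener = 5.1      # zener breakdown voltage (V)
r_series = 100.0   # series resistor (ohms)
i_load = 0.020     # load current (A)

i_series = (v_supply - v_zener) / r_series   # total current through the resistor
i_zener = i_series - i_load                  # current the zener must shunt

p_resistor = (v_supply - v_zener) * i_series
p_zener = v_zener * i_zener

print(f"series current: {i_series * 1000:.1f} mA")
print(f"zener current:  {i_zener * 1000:.1f} mA ({p_zener:.2f} W dissipated)")
print(f"resistor dissipates {p_resistor:.2f} W")
```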


  • filmoret Member Epic Posts: 4,906
    edited January 2017
    Yeah, I know, it's obviously the chip burning up unnecessary energy until malabooga fixed it, and now it's running smoothly. There is no possible way he took energy from the RAM or anything else, although they probably run on the same circuit. That's impossible, but it's much more probable that AMD just didn't optimize their chips properly, because, well, that's impossible to do.

    Yet somehow they cannot write a simple program that optimizes their cards by checking power consumption vs volts vs clock speed. Yeah, that's a really hard program to make.
    Are you onto something or just on something?
  • Ridelynn Member Epic Posts: 7,383
    Ridelynn said:
    filmoret said:

    lol I just noticed that it was incorrect. Bleh, it happens sometimes. But you are also ignoring one major fact that makes you look equally as dumb. The card only uses a certain amount of amps. It limits itself on how many amps it can use. That is why when you increase the volts the watts also increase: because the amps are not going to break a certain barrier set by the manufacturers.
    Ok. I tried sticking up for you out there because I'm a sucker for the underdog. But you're a big boy and can swim just fine all on your own.
    It wouldn't be possible (Warning: Physics incoming) for a device to lock and regulate current draw without regulating voltage and resistance at the exact same time. I explained in very minor detail earlier in this very thread how zener diode configurations are used as voltage regulators. This is what I'm assuming is occurring in the device itself. Lowering or increasing the voltage at the device input would therefore be irrelevant, as the regulator circuit will simply "fix" what you changed so the device functions as intended. Thus you've done nothing for power consumption or performance.

    An example: http://www.electronics-tutorials.ws/diode/diode_7.html


    My guess is they use PWM in the VRM rather than a zener to finely control power via software/firmware, but I have to admit I don't know that for certain.
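For contrast with the zener example above, the first-order relation for the kind of switching (buck) regulator Ridelynn is guessing at is simply that the output voltage is roughly the input voltage times the PWM duty cycle. A real GPU VRM is multi-phase and feedback-controlled, so this is only the basic idea, with assumed numbers.

```python
# Ideal buck converter: V_out ~ D * V_in, where D is the PWM duty cycle.
# Illustrative numbers only; a real GPU VRM is far more elaborate.

v_in = 12.0              # 12 V rail from the PSU
targets = [1.15, 1.05]   # assumed stock vs undervolted core voltage (V)

for v_out in targets:
    duty = v_out / v_in
    print(f"V_out = {v_out:.2f} V  ->  duty cycle ~ {duty * 100:.1f}%")
```

This is broadly why a software voltage offset can work at all: the firmware asks the VRM controller to regulate to a different setpoint, rather than anything being "locked" at the silicon level.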
  • Ridelynn Member Epic Posts: 7,383
    filmoret said:
    Yeah, I know, it's obviously the chip burning up unnecessary energy until malabooga fixed it, and now it's running smoothly. There is no possible way he took energy from the RAM or anything else, although they probably run on the same circuit. That's impossible, but it's much more probable that AMD just didn't optimize their chips properly, because, well, that's impossible to do.

    Yet somehow they cannot write a simple program that optimizes their cards by checking power consumption vs volts vs clock speed. Yeah, that's a really hard program to make.
    So, yeah... PowerTune.

    You gotta give the chips enough voltage to work - each chip is a little bit different, so you default to the minimum voltage that allows all chips to work. For many individual cards, that voltage may be higher than needed, and may be able to be tuned down, but it's very much a per-card thing. Yeah, it may result in extra power being drawn in some cases, but it guarantees that every card will work at default settings, and you have PowerTune to make sure you don't exceed your design maximum.

    Could you do that automagically? Maybe you could devise a process that gives each card a custom firmware that declares what the optimum voltage for each individual die is. And then you need to track each firmware revision versus serial number of the GPU, so you can make sure firmware upgrades in the future contain the optimum custom voltage setting.

    But is it worth the hassle and expense of doing so, or do you just burn a few extra watts that's well within the engineering capacity of the chip and cooler, and make it a generic number across the entire line?
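A toy sketch of the power-cap idea described above: this is not AMD's actual PowerTune algorithm, just an illustration of a limiter that backs off the clock whenever the estimated board power would exceed the design maximum. The constants and the simple P ~ C*V^2*f power estimate are invented for the example.

```python
# Toy power limiter in the spirit of a board power cap like PowerTune.
# Not AMD's algorithm; all constants are illustrative.

POWER_CAP_W = 150.0
C_EFF = 1.0e-7         # assumed effective switched capacitance (farads)
VOLTAGE = 1.15         # assumed default core voltage (V)
CLOCK_STEP = 25e6      # throttle granularity (Hz), assumed

def estimated_power(freq_hz, volts):
    """First-order dynamic power estimate in watts."""
    return C_EFF * volts ** 2 * freq_hz

freq = 1.25e9  # requested boost clock (Hz)
while estimated_power(freq, VOLTAGE) > POWER_CAP_W:
    freq -= CLOCK_STEP  # back off the clock until we fit under the cap

print(f"settled clock: {freq / 1e6:.0f} MHz, "
      f"estimated power: {estimated_power(freq, VOLTAGE):.1f} W")
```

Run with these made-up numbers, the limiter settles a little below the cap; lower the voltage and the same cap allows a higher clock, which is the undervolting win the thread started with.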