GeForce RTX 3080 reviews are out.

Quizzical Member Legendary Posts: 25,353
The card is fast.  My ballpark estimate is 40% faster than a GeForce RTX 2080 Ti in situations where you're limited by the GPU.

The die is huge, at 628 mm^2.  That's not really a shock, but it is a reason why the card is expensive and always will be.

To get that jump in performance, there's also a considerable jump in power consumption.  Using 30% more power than an RTX 2080 Ti in order to deliver 40% more performance is only a modest increase in energy efficiency.  That's still enough to make it the most efficient GPU ever, but not by that large of a margin, and you'd have hoped that Nvidia would have gotten more efficiency out of a die shrink than that.  Then again, AMD didn't get an enormous jump in efficiency out of the die shrink to 7 nm, either, so die shrinks probably don't do as much for you as they used to.
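As a quick sanity check on that claim, here is the back-of-the-envelope arithmetic (the 40% and 30% figures are my ballpark estimates above, not measured numbers):

    # Rough perf-per-watt comparison using the ballpark figures above
    # (estimates, not measurements): ~40% faster at ~30% more power.
    perf_ratio = 1.40    # RTX 3080 performance relative to RTX 2080 Ti
    power_ratio = 1.30   # RTX 3080 power draw relative to RTX 2080 Ti
    efficiency_gain = perf_ratio / power_ratio - 1
    print(f"Perf/watt improvement: {efficiency_gain:.1%}")   # ~7.7%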

What the card is not, however, is available.  Micron says that GDDR6X is still only sampling, not in mass production:

https://www.micron.com/products/ultra-bandwidth-solutions/gddr6x/part-catalog

That certainly lets you make review samples, and lets you have a handful of cards show up at retail.  But don't expect to see the RTX 3080 widely available at retail at the MSRP of $700 this year.

The RTX 3070 is far more likely to have widespread availability soon, however, as it relies on the now mature GDDR6 memory, rather than the upcoming GDDR6X.  From some back of the envelope arithmetic, it is probably a little slower than the RTX 2080 Ti.  However, at $500, it's going to be massively cheaper than the latter card.  That's definitely something good that we got out of the die shrink.

There are two other cards that loom large in this analysis, of course.  One is Nvidia's own GeForce RTX 3090.  That may or may not be a fully functional GA102 die, while the RTX 3080 uses a considerably cut down die.  From the paper specs, the RTX 3090 could be about 20% faster than an RTX 3080.  It will also cost more than twice as much.
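For what it's worth, the paper-spec arithmetic behind that estimate looks roughly like this (shader counts and MSRPs are the published launch figures; this ignores the small clock-speed differences):

    # Published launch specs: RTX 3090 (10496 shaders, $1499) vs RTX 3080 (8704 shaders, $699).
    rtx3090_shaders, rtx3090_price = 10496, 1499
    rtx3080_shaders, rtx3080_price = 8704, 699
    print(f"Shader ratio: {rtx3090_shaders / rtx3080_shaders:.2f}x")   # ~1.21x
    print(f"Price ratio:  {rtx3090_price / rtx3080_price:.2f}x")       # ~2.14x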

And there is also Navi 2X, which AMD has said will be in the Radeon RX 6000 series.  Hopefully that heralds a return to sane naming after the RX Vega 64, the Radeon VII, and the RX 5000 series being the successor to the RX 500 series.  AMD has promised that Navi 2X will offer a 50% energy efficiency improvement over Navi.  I'm guessing that it doubles the performance of a Radeon RX 5700 XT, which would put it as slightly slower than an RTX 3080, while using significantly less power.  Of course, if that's where the card should land, then AMD might decide to just clock it higher and make it match an RTX 3080 in both metrics.
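That guess amounts to the following arithmetic (the 225 W figure is the RX 5700 XT's official board power; the performance doubling is pure speculation on my part, not anything AMD has announced):

    # Speculative Navi 2X estimate: 2x the performance of an RX 5700 XT
    # at AMD's claimed 1.5x perf/watt.
    rx5700xt_power = 225                 # W, official board power
    perf_scale = 2.0                     # assumed performance doubling
    efficiency_scale = 1.5               # AMD's claimed perf/watt gain
    navi2x_power = rx5700xt_power * perf_scale / efficiency_scale
    print(f"Estimated board power: {navi2x_power:.0f} W")   # ~300 W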

Comments

  • Vrika Member Legendary Posts: 7,888
    Quizzical said:

    And there is also Navi 2X, which AMD has said will be in the Radeon RX 6000 series.  Hopefully that heralds a return to sane naming after the RX Vega 64, the Radeon VII, and the RX 5000 series being the successor to the RX 500 series.  AMD has promised that Navi 2X will offer a 50% energy efficiency improvement over Navi.  I'm guessing that it doubles the performance of a Radeon RX 5700 XT, which would put it as slightly slower than an RTX 3080, while using significantly less power.  Of course, if that's where the card should land, then AMD might decide to just clock it higher and make it match an RTX 3080 in both metrics.
    50% increase is just what AMD claims. Meanwhile NVidia claims that Ampere offers up to 1.9 times the performance per watt of Turing.

    Those numbers should be taken to mean that it's more energy efficient than the previous product, but one should not make the mistake of believing that it's that much more efficient in normal use.
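    A made-up illustration of why an "up to" perf/watt figure and the full-power figure can both be true at once: vendors usually measure the big number at matched performance, where the newer card can sit low on its voltage/frequency curve. All numbers below are invented for the example, not taken from either vendor:

        # Hypothetical numbers only, to show the two ways of measuring perf/watt.
        old_perf, old_power = 100, 250          # older card at full power
        new_perf, new_power = 140, 320          # newer card at full power
        new_power_at_old_perf = 170             # assumed power needed to merely match old_perf

        full_power_gain = (new_perf / new_power) / (old_perf / old_power)
        iso_perf_gain = (old_perf / new_power_at_old_perf) / (old_perf / old_power)
        print(f"At full power:          {full_power_gain:.2f}x perf/watt")   # ~1.09x
        print(f"At matched performance: {iso_perf_gain:.2f}x perf/watt")     # ~1.47x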
     
  • botrytis Member Rare Posts: 3,363
    edited September 2020
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.


  • Quizzical Member Legendary Posts: 25,353
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.
    I'm skeptical that that is power for the video card alone, at least at stock settings.  Most likely, that's power draw at the wall, which is easier to measure, but includes a bunch of other things besides the video card.
  • Quizzical Member Legendary Posts: 25,353
    remsleep said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.

    Yep I am used to it at this point - which is why I am rocking a 1600W PS

    Will be interesting to see super versions of 3000s next year

    The best benchmark I saw was basically this

    3080 can run most games at 2k at same FPS that 2080TI runs at 1080p - that is pretty huge
    You assume that there will be "super" versions of the cards.  Nvidia has only used that nomenclature once.  Most of the time, they just increment the first digit and call it a new generation.
  • botrytis Member Rare Posts: 3,363
    edited September 2020
    Quizzical said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.
    I'm skeptical that that is power for the video card alone, at least at stock settings.  Most likely, that's power draw at the wall, which is easier to measure, but includes a bunch of other things besides the video card.
    This was based on software measurements from NVidia itself. That is just for the video card alone. NVidia stated the 3080/3090 will need a minimum of an 850 watt PS.


  • Quizzical Member Legendary Posts: 25,353
    botrytis said:
    Quizzical said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.
    I'm skeptical that that is power for the video card alone, at least at stock settings.  Most likely, that's power draw at the wall, which is easier to measure, but includes a bunch of other things besides the video card.
    This was based on software measurements from NVidia itself. That is just for the video card alone. NVidia stated the 3080/3090 will need a minimum of an 850 watt PS.
    But was that at stock settings or with a heavy overclock?  I could believe the latter.  For the former, that's running the power delivery way out of spec.  It's not just about what the power supply can deliver in total.  An 8-pin PCI-E connector is only supposed to deliver 150 W, and if you're pulling 200 W through it, that's not good.
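    A rough in-spec ceiling for a card fed by two 8-pin connectors, going by the PCI Express power limits (75 W from the slot, 150 W per 8-pin), shows why a 523 W stock figure for the card alone doesn't add up:

        # PCI Express power budget for a card with two 8-pin connectors.
        slot_limit = 75            # W from the PCIe slot
        eight_pin_limit = 150      # W per 8-pin connector
        connectors = 2
        max_in_spec = slot_limit + connectors * eight_pin_limit
        print(f"In-spec ceiling: {max_in_spec} W")                    # 375 W
        print(f"523 W would exceed that by {523 - max_in_spec} W")    # 148 W
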
  • AmazingAvery Age of Conan Advocate Member Uncommon Posts: 7,188
    botrytis said:
    Quizzical said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.
    I'm skeptical that that is power for the video card alone, at least at stock settings.  Most likely, that's power draw at the wall, which is easier to measure, but includes a bunch of other things besides the video card.
    This was based on software measurements from NVidia itself. That is just for the video card alone. NVidia stated the 3080/3090 will need a minimum of an 850 watt PS.
    Total System Power Consumption - it is 21% more total system power on this particular reviewer's test rig vs the 2080.

    The PCAT tool measures just the graphics card power also at the link above.




  • botrytis Member Rare Posts: 3,363
    edited September 2020
    Quizzical said:
    botrytis said:
    Quizzical said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.
    I'm skeptical that that is power for the video card alone, at least at stock settings.  Most likely, that's power draw at the wall, which is easier to measure, but includes a bunch of other things besides the video card.
    This was based on software measurements from NVidia itself. That is just for the video card alone. NVidia stated the 3080/3090 will need a minimum of an 850 watt PS.
    But was that at stock settings or with a heavy overclock?  I could believe the latter.  For the former, that's running the power delivery way out of spec.  It's not just about what the power supply can deliver in total.  An 8-pin PCI-E connector is only supposed to deliver 150 W, and if you're pulling 200 W through it, that's not good.
    Stock. It was the card they were testing. The 3080/3090 are using a new Molex 12-pin connector that's about the size of the existing 8-pin.


  • Vrika Member Legendary Posts: 7,888
    botrytis said:
    Quizzical said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.
    I'm skeptical that that is power for the video card alone, at least at stock settings.  Most likely, that's power draw at the wall, which is easier to measure, but includes a bunch of other things besides the video card.
    This was based on software measurements from NVidia itself. That is just for the video card alone. NVidia stated the 3080/3090 will need a minimum of an 850 watt PS.
    They require a 750 W power supply, not 850 W.
  • mgilbrtsn Member Epic Posts: 3,430
    I'm actually waiting for this card to build my new system. I hope it's as good as it sounds.


  • Quizzical Member Legendary Posts: 25,353
    botrytis said:
    Quizzical said:
    botrytis said:
    Quizzical said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.
    I'm skeptical that that is power for the video card alone, at least at stock settings.  Most likely, that's power draw at the wall, which is easier to measure, but includes a bunch of other things besides the video card.
    This was based on software measurements from NVidia itself. That is just for the video card alone. NVidia stated the 3080/3090 will need a minimum of an 850 watt PS.
    But was that at stock settings or with a heavy overclock?  I could believe the latter.  For the former, that's running the power delivery way out of spec.  It's not just about what the power supply can deliver in total.  An 8-pin PCI-E connector is only supposed to deliver 150 W, and if you're pulling 200 W through it, that's not good.
    Stock. It was the card they were testing. The 3080/3090 are using a new Molex 12-pin connector that's about the size of the existing 8-pin.
    It's still fed by two 8-pin connectors.  If you're pulling 400 W through a 12-pin connector, you're pulling at least 200 W through some 8-pin connector, and that's bad.
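    In other words, assuming the adapter splits the load roughly evenly across the two 8-pin feeds:

        # If ~400 W went through the 12-pin, each of the two 8-pin feeds
        # would carry roughly half of it, well past the 150 W spec limit.
        total_12pin_load = 400     # W, hypothetical draw through the 12-pin
        per_8pin = total_12pin_load / 2
        print(f"{per_8pin:.0f} W per 8-pin vs the 150 W spec limit")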

    Could you just give me the reference to what you're talking about so that I don't have to guess?
  • botrytis Member Rare Posts: 3,363
    edited September 2020
    Quizzical said:
    botrytis said:
    Quizzical said:
    botrytis said:
    Quizzical said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.
    I'm skeptical that that is power for the video card alone, at least at stock settings.  Most likely, that's power draw at the wall, which is easier to measure, but includes a bunch of other things besides the video card.
    This was based on software measurements from NVidia itself. That is just for the video card alone. NVidia stated the 3080/3090 will need a minimum of an 850 watt PS.
    But was that at stock settings or with a heavy overclock?  I could believe the latter.  For the former, that's running the power delivery way out of spec.  It's not just about what the power supply can deliver in total.  An 8-pin PCI-E connector is only supposed to deliver 150 W, and if you're pulling 200 W through it, that's not good.
    Stock. It was the card they were testing. The 3080/3090 are using a new Molex 12-pin connector that's about the size of the existing 8-pin.
    It's still fed by two 8-pin connectors.  If you're pulling 400 W through a 12-pin connector, you're pulling at least 200 W through some 8-pin connector, and that's bad.

    Could you just give me the reference to what you're talking about so that I don't have to guess?
    https://www.techpowerup.com/271444/nvidia-ampere-12-pin-power-connector-pictured-some-more

    https://www.tweaktown.com/news/74775/this-is-an-even-better-look-at-nvidias-new-12-pin-power-connector/index.html

    NVIDIA GeForce RTX 3080 Unboxing  Preview  TechPowerUp

    Nvidia GeForce RTX 3090 release date price specs and performance

    BIG ASS CARD!!


  • Quizzical Member Legendary Posts: 25,353
    botrytis said:
    Quizzical said:
    botrytis said:
    Quizzical said:
    botrytis said:
    Quizzical said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.
    I'm skeptical that that is power for the video card alone, at least at stock settings.  Most likely, that's power draw at the wall, which is easier to measure, but includes a bunch of other things besides the video card.
    This was based on software measurements from NVidia itself. That is just for the video card alone. NVidia stated the 3080/3090 will need a minimum of an 850 watt PS.
    But was that at stock settings or with a heavy overclock?  I could believe the latter.  For the former, that's running the power delivery way out of spec.  It's not just about what the power supply can deliver in total.  An 8-pin PCI-E connector is only supposed to deliver 150 W, and if you're pulling 200 W through it, that's not good.
    Stock. It was the card they were testing. The 3080/3090 are using a new Molex 12-pin connector that's about the size of the existing 8-pin.
    It's still fed by two 8-pin connectors.  If you're pulling 400 W through a 12-pin connector, you're pulling at least 200 W through some 8-pin connector, and that's bad.

    Could you just give me the reference to what you're talking about so that I don't have to guess?
    https://www.techpowerup.com/271444/nvidia-ampere-12-pin-power-connector-pictured-some-more

    https://www.tweaktown.com/news/74775/this-is-an-even-better-look-at-nvidias-new-12-pin-power-connector/index.html

    NVIDIA GeForce RTX 3080 Unboxing  Preview  TechPowerUp

    Nvidia GeForce RTX 3090 release date price specs and performance

    BIG ASS CARD!!
    That doesn't say that they measured an RTX 3080 as using 523 W.
  • Quizzical Member Legendary Posts: 25,353
    mgilbrtsn said:
    I'm actually waiting for this card to build my new system I hope it's as good as it sounds 
    In that case, you're likely to have to keep waiting until 2021.  The official launch is tomorrow, but it's difficult to buy a card when very few of them exist.
  • Vrika Member Legendary Posts: 7,888
    Quizzical said:
    botrytis said:
    Quizzical said:
    botrytis said:
    Quizzical said:
    botrytis said:
    Quizzical said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.
    I'm skeptical that that is power for the video card alone, at least at stock settings.  Most likely, that's power draw at the wall, which is easier to measure, but includes a bunch of other things besides the video card.
    This was based on software measurements from NVidia itself. That is just for the video card alone. NVidia stated the 3080/3090 will need a minimum of an 850 watt PS.
    But was that at stock settings or with a heavy overclock?  I could believe the latter.  For the former, that's running the power delivery way out of spec.  It's not just about what the power supply can deliver in total.  An 8-pin PCI-E connector is only supposed to deliver 150 W, and if you're pulling 200 W through it, that's not good.
    Stock. It was the card they were testing. The 3080/3090 are using a new Molex 12-pin connector that's about the size of the existing 8-pin.
    It's still fed by two 8-pin connectors.  If you're pulling 400 W through a 12-pin connector, you're pulling at least 200 W through some 8-pin connector, and that's bad.

    Could you just give me the reference to what you're talking about so that I don't have to guess?
    https://www.techpowerup.com/271444/nvidia-ampere-12-pin-power-connector-pictured-some-more

    https://www.tweaktown.com/news/74775/this-is-an-even-better-look-at-nvidias-new-12-pin-power-connector/index.html

    NVIDIA GeForce RTX 3080 Unboxing  Preview  TechPowerUp

    Nvidia GeForce RTX 3090 release date price specs and performance

    BIG ASS CARD!!
    That doesn't say that they measured an RTX 3080 as using 523 W.
    Guru3D measured the RTX 3080 using a maximum of 338 W.

    https://www.guru3d.com/articles-pages/geforce-rtx-3080-founder-review,7.html
     
  • Vrika Member Legendary Posts: 7,888
    botrytis said:
    It says total system power consumption.

    That's for the system, not for the card. Also I've had to correct you now twice that they need 750W PSU, not 850W PSU.

    I think you're just trolling us.
  • Quizzical Member Legendary Posts: 25,353
    edited September 2020
    botrytis said:
    That link says that the RTX 3080 used 327 W.  That is not particularly close to 523 W.  The 523 W number is for the total system power consumption, which includes the CPU, motherboard, memory, power supply, and some other stuff.
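    Splitting the two numbers quoted in this thread makes the point clearer (these are the reviewer's figures, not my own measurements):

        # Total system draw vs. card-only draw, from the figures above.
        total_system = 523     # W, whole test system under load
        card_only = 327        # W, video card alone
        rest_of_system = total_system - card_only
        print(f"CPU, motherboard, memory, PSU losses, etc.: {rest_of_system} W")   # 196 W
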
  • AmazingAvery Age of Conan Advocate Member Uncommon Posts: 7,188
    Vrika said:
    botrytis said:
    It says total system power consumption.

    That's for the system, not for the card. Also I've had to correct you now twice that they need 750W PSU, not 850W PSU.

    I think you're just trolling us.
    Yes that much is obvious. Everything he is saying is incorrect.



  • AmazingAvery Age of Conan Advocate Member Uncommon Posts: 7,188
    Quizzical said:
    remsleep said:
    botrytis said:
    Problem is the 3080, in testing, used 523 watts of power. That is compared to the 2080 which used 416 watts. Either way, a big PS is in order to run them.

    Yep I am used to it at this point - which is why I am rocking a 1600W PS

    Will be interesting to see super versions of 3000s next year

    The best benchmark I saw was basically this

    3080 can run most games at 2k at same FPS that 2080TI runs at 1080p - that is pretty huge
    You assume that there will be "super" versions of the cards.  Nvidia has only used that nomenclature once.  Most of the time, they just increment the first digit and call it a new generation.
    It is either going to be Super or Ti. Either way, the cut-down versions are already referenced in engineering-sample nomenclature in the PCI ID Repository database.



  • Asm0deus Member Epic Posts: 4,405
    edited September 2020
    Looking like the 3080 is about 20 to 30% faster than an RTX 2080 Ti at 4K; the gap shrinks quite a bit in some cases as the resolution goes down.

    This tells me the 3070 will be more or less the same as an RTX 2080 Ti, maybe a bit slower in some cases, contrary to what was expected.










  • cheyane Member Legendary Posts: 9,100
    My Alienware has an 875 watt power supply, but the card won't fit into the Aurora R4, plus it is almost 6 years old. I am probably going to have to get a new machine for the card.
  • Asm0deus Member Epic Posts: 4,405
    edited September 2020
    cheyane said:
    My Alienware has 875 watts power supply but the card won't fit into the Aurora R4 plus it is almost 6 years old. I am probably going to have to get a new machine for the card.

    Just get a new PC case and put w/e is in your aurora r4 into it? 

    No need to buy a new "machine" if what you have works and isn't that old.  That said, if it's 6 years old it might be a good time to plan a new build.

    If you just transfer the "guts" of your Aurora you don't even have to reformat or do anything other than putting in the proper driver for your new GPU.






  • cheyane Member Legendary Posts: 9,100
    edited September 2020
    Yes, I use 1080. Ahh no, it is 1920 x 1200; that is probably rounded up to 1080, right?
  • The user and all related content has been deleted.
