
GeForce GTX 1660 Ti launches: Turing at its most efficient

Quizzical Member LegendaryPosts: 25,353
I have a longstanding theory that the first part of a major new architecture is far more important than derivative parts.  I don't mean from a commercial perspective, but in terms of predicting how the company is going to do over the next few years, as a stock trader might wish to do.  The reason the FX-8150 was so disastrous for AMD is not that it was merely a single bad SKU; it meant that, outside of the cat cores at the low end, AMD's CPUs would be rather bad for the next 5 1/2 years.

Sometimes the first part of a new architecture is an oddball that makes it hard to judge the architecture as a whole.  That was the case with Maxwell and the GeForce GTX 750 Ti, for example.  That bottom-of-the-line Maxwell part had what was at the time the largest L2 cache of any GPU ever.  That took a lot of die space and made the card cost far too much for a lower end card, making it hard to gauge how efficient the architecture was otherwise.  It wasn't until the launch of the GeForce GTX 980 later that year that it became clear that Nvidia had a winner on their hands.

The launch of Volta/Turing may well be the most extreme case of this ever.  It started with GV100 in the Titan V, which wasn't optimized for graphics.  The TU102, TU104, and TU106 dies of the higher end Turing cards were bloated by ray-tracing and tensor cores, with the former somewhat dubious for gaming in the near future and the latter completely worthless to gamers.  On a performance per watt basis, or performance per model number (which is a stupid way to analyze cards), they were a fine advance.

The problem was the price tag.  $1200 for a consumer card is an awfully tough sell, no matter what it can do.  For a $700 card, gamers expect the top of the line, not something far removed from it.  The prices were high because the die size was large, in part because it was bloated by unnecessary junk.  That made Turing look bad up front, but held out the hope that it could be a fine architecture once you chopped out the bloat.

Well, Nvidia just did exactly that.  The TU116 die of the GeForce GTX 1660 Ti doesn't have ray tracing logic, nor does it have tensor cores.  It does feature a stupid name that is completely out of whack with Nvidia's longstanding naming scheme, though it's less bad than the erratic names that AMD has been giving their Vega cards.  Some back-of-the-envelope arithmetic indicates that ray tracing and tensor cores account for about 15%-20% of the die size of the earlier Turing cards.

So now we see what Turing can really do.  The GTX 1660 Ti is about on par in performance with the GTX 1070, while using less power.  That makes it a fine choice for consumers at $280, considering the rest of today's market.

But I'm not so interested in a particular SKU as in what this says about what Nvidia will have to offer for the next few years.  And on that count, the news is decidedly bad.

The key point of reference is the GP104 die of the GeForce GTX 1080.  The latter had a 314 mm^2 die of the Pascal architecture, as compared to 284 mm^2 for the TU116 die of the GTX 1660 Ti.  So the new GPU has over 90% of the die size of one that Nvidia paper launched about 33 months earlier--an eternity in technology.  The problem is that it doesn't offer 90% of the performance of the GTX 1080.

Over the course of 33 months, performance per mm^2 actually went down.  That's awful, and almost never happens, at least if you exclude top end compute parts bloated by non-graphical stuff.  And it's in spite of moving to a new, better process node.  To be fair, moving from 16 nm to 12 nm isn't a full node die shrink, or even a half node.  It's not really a shrink at all, so much as a more mature version of the node, tuned for how Nvidia wanted it.  But with the analogous move from Polaris 10 (Radeon RX 480) to Polaris 20 (Radeon RX 580) to Polaris 30 (Radeon RX 590), AMD at least saw performance per mm^2 go up, not down.  It wasn't anything earth-shattering, but it sure beats going in the wrong direction entirely.
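
To put rough numbers on that, here's a quick back-of-the-envelope sketch.  The die sizes are the ones above; the relative performance figure is an assumed placeholder (somewhere around GTX 1070 territory), not a measured benchmark, so plug in whatever review numbers you trust.

```python
# Back-of-the-envelope performance per mm^2, GTX 1080 (GP104) vs GTX 1660 Ti (TU116).
# Die sizes are the figures cited above; the relative performance number is an
# ASSUMED placeholder, not a benchmark result.

gp104_die_mm2 = 314    # GeForce GTX 1080, Pascal, launched mid-2016
tu116_die_mm2 = 284    # GeForce GTX 1660 Ti, Turing, launched Feb 2019

gtx1080_perf = 1.00    # normalize the GTX 1080 to 1.0
gtx1660ti_perf = 0.85  # assumption: roughly GTX 1070-class, i.e. well short of a 1080

print(f"Die size ratio (TU116 / GP104): {tu116_die_mm2 / gp104_die_mm2:.0%}")
print(f"Perf per mm^2 ratio (1660 Ti / 1080): "
      f"{(gtx1660ti_perf / tu116_die_mm2) / (gtx1080_perf / gp104_die_mm2):.0%}")

# With the placeholder above this prints roughly 90% and 94%: the new die is a
# little smaller, but performance per mm^2 still comes out behind a chip that
# launched 33 months earlier, and it gets worse if the real gap is wider.
```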

Now, the GP104 die of the GTX 1080 was a terrific chip, far more so than we realized at the time.  AMD wouldn't launch Polaris until later that year, and even that had rather small dies; they wouldn't launch Vega until about a year after Pascal.  And it still isn't entirely clear how GlobalFoundries' 14 nm node compares to TSMC's 16 nm.  GP104 also had the disadvantage of being Nvidia's lead chip on a new and then-immature process node, while TU116 is launching on what is now a very mature process node.

That doesn't mean that Turing is catastrophic for Nvidia.  This isn't a Bulldozer-level catastrophe that will threaten the company's viability.  It's not a bad architecture on an absolute scale; it's not like Fermi for graphics or Kepler for compute.  But it might be more like a GPU version of Kaby Lake:  nice at launch, and fine in its own right, but merely treading water while your competitor is greatly advancing.  Depending on how good AMD's upcoming Navi architecture is, it's also very possible that the better comparison for Turing will be Broadwell:  too expensive and not much of an advance, but still ahead of the competition.

The GTX 1660 Ti launching now is also bad news for Nvidia in another sense.  AMD already has a 7 nm GPU out.  Navi is coming, with various rumors putting it in July or October, and AMD promising to say a lot more about it sometime this year.  That Nvidia is launching a new $280 GPU today means that they surely aren't going to have a $280 GPU on 7 nm anytime soon.  If they had a new 7 nm lineup coming by the middle of the year, they'd have canceled this part long ago.  That lends credibility to the rumors that Nvidia won't have anything on 7 nm until 2020.

Comments

  • Ozmodan Member EpicPosts: 9,726
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6 GB of memory, though.  While 6 GB is enough for today, what about next year or the year after?  I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
  • Vrika Member LegendaryPosts: 7,888
    Quizzical said:

    The key point of reference is the GP104 die of the GeForce GTX 1080.  The latter had a 314 mm^2 die of the Pascal architecture, as compared to 284 mm^2 for the TU116 die of the GTX 1660 Ti.  So the new GPU has over 90% of the die size of one that Nvidia paper launched about 33 months earlier--an eternity in technology.  The problem is that it doesn't offer 90% of the performance of the GTX 1080.
    I think the GTX 1080 is a bad comparison, because with it Nvidia was able to pick the top-binned parts and use the rest to make the GTX 1070.

    But the GTX 1660 Ti doesn't compare well against the GTX 1060 either, if you compare die size or transistor count. They've managed to bring power consumption per unit of performance down, but they need a few more transistors to deliver the same performance.
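
    A rough sketch of that comparison (the die sizes and transistor counts are the commonly published figures; the performance ratio is an assumed placeholder, not a benchmark):

```python
# GTX 1060 6GB (GP106) vs GTX 1660 Ti (TU116): transistors and area per unit of
# performance.  Die sizes and transistor counts are the commonly published
# figures; the performance ratio is an ASSUMED placeholder, not a measurement.

gp106 = {"die_mm2": 200, "transistors_b": 4.4, "perf": 1.00}  # GTX 1060 6GB
tu116 = {"die_mm2": 284, "transistors_b": 6.6, "perf": 1.35}  # assumed ~35% faster

for name, chip in (("GP106 / GTX 1060", gp106), ("TU116 / GTX 1660 Ti", tu116)):
    print(f"{name}: {chip['transistors_b'] / chip['perf']:.1f}B transistors and "
          f"{chip['die_mm2'] / chip['perf']:.0f} mm^2 per unit of performance")

# Under these assumptions TU116 spends slightly more transistors (~4.9B vs 4.4B)
# and a little more area per unit of performance than GP106 did, even though it
# wins comfortably on power.
```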
  • Ridelynn Member EpicPosts: 7,383
    Ozmodan said:
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6 GB of memory, though.  While 6 GB is enough for today, what about next year or the year after?  I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
    You may not want to buy a new video card next year, or the year after... but guess who wants you to?

    Once you answer that question it makes a lot more sense.
  • Vrika Member LegendaryPosts: 7,888
    edited February 2019
    Ridelynn said:
    Ozmodan said:
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6 GB of memory, though.  While 6 GB is enough for today, what about next year or the year after?  I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
    You may not want to buy a new video card next year, or the year after... but guess who wants you to?

    Once you answer that question it makes a lot more sense.
    But would removing some processing power so that you could instead add 2 GB of additional RAM, while keeping the price the same, make the GTX 1660 Ti faster or slower?

    I think some people are too focused on the RAM when it's always a trade-off between computing power vs. RAM vs. price vs. heat, noise, power used and reliability.

     
  • Ridelynn Member EpicPosts: 7,383
    Why would you remove processing power to add RAM?
  • Vrika Member LegendaryPosts: 7,888
    edited February 2019
    Ridelynn said:
    Why would you remove processing power to add RAM?
    Because you can't just add something without increasing costs, unless you also remove something of equal price.
  • Quizzical Member LegendaryPosts: 25,353
    Vrika said:
    Quizzical said:

    The key point of reference is the GP104 die of the GeForce GTX 1080.  The latter had a 314 mm^2 die of the Pascal architecture, as compared to 284 mm^2 for the TU116 die of the GTX 1660 Ti.  So the new GPU has over 90% of the die size of one that Nvidia paper launched about 33 months earlier--an eternity in technology.  The problem is that it doesn't offer 90% of the performance of the GTX 1080.
    I think the GTX 1080 is a bad comparison, because with it Nvidia was able to pick the top-binned parts and use the rest to make the GTX 1070.

    But the GTX 1660 Ti doesn't compare well against the GTX 1060 either, if you compare die size or transistor count. They've managed to bring power consumption per unit of performance down, but they need a few more transistors to deliver the same performance.
    You can make salvage parts out of nearly any GPU die, and it's decently likely that salvage parts of TU116 are coming.  The proper comparison is the top bin of one die to the top bin of another.  I went with GP104 for the comparison rather than GP106 because it's the nearest in die size.  If the die size gets too far apart, the comparison is more muddled for a variety of reasons.
  • Quizzical Member LegendaryPosts: 25,353
    Vrika said:
    Ridelynn said:
    Ozmodan said:
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6 GB of memory, though.  While 6 GB is enough for today, what about next year or the year after?  I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
    You may not want to buy a new video card next year, or the year after... but guess who wants you to?

    Once you answer that question it makes a lot more sense.
    But would removing some processing power so that you could instead add 2 GB of additional RAM, while keeping the price the same, make the GTX 1660 Ti faster or slower?

    I think some people are too focused on the RAM when it's always a trade-off between computing power vs. RAM vs. price vs. heat, noise, power used and reliability.

    It's probable that they could pair exactly the same die with 12 GB or 24 GB of memory if so inclined.  Even if they couldn't, the die size adjustment to be able to do so would be nearly trivial.  For that matter, it's also decently likely that they will make a Quadro version of the same card with 12 GB of memory.  That they capped it at 6 GB for consumer cards is for reasons of cost of production and market segmentation.
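
    The capacity arithmetic behind that is simple.  The bus width and the 32-bit GDDR6 chip interface are standard figures; whether 16 Gb chips are actually available to board partners is the part to treat as an assumption.

```python
# Why 6 GB, 12 GB, or 24 GB can all hang off the same 192-bit TU116 die.
# GDDR6 chips have a 32-bit interface, so a 192-bit bus takes six of them,
# or twelve in clamshell mode (two chips sharing each 32-bit channel).
# 8 Gb was the common chip density at launch; 16 Gb is assumed availability.

BUS_WIDTH_BITS = 192
CHIP_WIDTH_BITS = 32

configs = {"normal": BUS_WIDTH_BITS // CHIP_WIDTH_BITS,         # 6 chips
           "clamshell": 2 * BUS_WIDTH_BITS // CHIP_WIDTH_BITS}  # 12 chips

for density_gbit in (8, 16):  # per-chip density in gigabits
    for mode, n_chips in configs.items():
        print(f"{n_chips} x {density_gbit} Gb ({mode}): {n_chips * density_gbit // 8} GB")

# Prints 6, 12, 12, and 24 GB -- so capping the consumer card at 6 GB is a cost
# and market-segmentation decision, not a limit of the die.
```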
  • Quizzical Member LegendaryPosts: 25,353
    DMKano said:
    I hope 2 things happen:

    1. Navi turns out to be really good
    2. Navi comes out in July

    Disregarding Navi completely - IMO right now I wish Nvidia had an 8 GB version of the RTX 2060 for 2K gaming, priced at $250-$300

    1080p gaming is already covered by much cheaper cards, and 4K gaming is IMO impractical.

    So 2K is where it's at - whoever can deliver the best 2K performance for lowest cost is the winner for me - don't care if it's team red or green
    Don't bet on getting a GPU based on a 445 mm^2 die for under $300.  That only ever happens if it's due to extenuating circumstances, such as having to slash prices to compete when getting crushed by the competition (the GeForce GTX 260 comes to mind) or having built way too many of the cards and needing to get rid of them at clearance prices (the Radeon R9 290 had a case of this after a mining bust, but even that was a smaller die).
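
    For a sense of why die size matters that much, here's a rough cost sketch.  The dies-per-wafer and yield formulas are the standard back-of-the-envelope versions, and the wafer price and defect density are pure placeholders, not anything Nvidia or TSMC has published.

```python
import math

# Back-of-the-envelope silicon cost: a 284 mm^2 TU116-class die vs a 445 mm^2
# TU106-class die on a 300 mm wafer.  The wafer price and defect density are
# ASSUMED placeholders for illustration, not quoted foundry numbers.

WAFER_DIAMETER_MM = 300
WAFER_COST_USD = 8000          # assumption
DEFECT_DENSITY_PER_CM2 = 0.1   # assumption

def dies_per_wafer(die_area_mm2: float) -> float:
    """Standard approximation: usable wafer area minus edge losses."""
    radius = WAFER_DIAMETER_MM / 2
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2: float) -> float:
    # Simple Poisson-style yield model.
    yield_fraction = math.exp(-DEFECT_DENSITY_PER_CM2 * die_area_mm2 / 100)
    return WAFER_COST_USD / (dies_per_wafer(die_area_mm2) * yield_fraction)

for name, area_mm2 in (("TU116, 284 mm^2", 284), ("445 mm^2 die", 445)):
    print(f"{name}: ~{dies_per_wafer(area_mm2):.0f} dies per wafer, "
          f"~${cost_per_good_die(area_mm2):.0f} per good die")

# With these placeholder inputs the 445 mm^2 die costs roughly twice as much
# silicon per good chip, before memory, board, cooler, and margin -- which is
# why dies that big don't show up at $300 without extenuating circumstances.
```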
  • Ridelynn Member EpicPosts: 7,383
    I can understand what DMKano is asking for - he doesn’t care what size the die is, or for all the bloat features that came with it. He wants that level of performance and that is what he’s willing to pay for it.

    Apparently it would have been possible from nv; now we wait to see if AMD can do it.
  • Mendel Member LegendaryPosts: 5,609
    Quizzical said:
    <snip>
    The GTX 1660 Ti launching now is also bad news for Nvidia in another sense.  AMD already has a 7 nm GPU out.  Navi is coming, with various rumors putting it in July or October, and AMD promising to say a lot more about it sometime this year.  That Nvidia is launching a new $280 GPU today means that they surely aren't going to have a $280 GPU on 7 nm anytime soon.  If they had a new 7 nm lineup coming by the middle of the year, they'd have canceled this part long ago.  That lends credibility to the rumors that Nvidia won't have anything on 7 nm until 2020.
    I really think you're probably spot on about this.  Releasing an 'improved' 12nm/16nm die now means that there's not anything better in the immediate future from Nvidia in the sub-$300 range.  I just got a 1050 Ti last year.  It looks like it will need to last until 2020 (or beyond) before I will be looking at an upgrade/replacement.  Good breakdown, @Quizzical.

    Now, I just really hope those 2020 rumors are accurate and not a 2022 reality.




  • Quizzical Member LegendaryPosts: 25,353
    Ridelynn said:
    I can understand what DMKano is asking for - he doesn’t care what size the die is, or for all the bloat features that came with it. He wants that level of performance and that is what he’s willing to pay for it.

    Apparently it would have been possible from nv; now we wait to see if AMD can do it.
    Getting performance on par with an RTX 2060 from a much smaller, cheaper die on a 7 nm process node should be very doable.  You'll just have to wait until those GPUs show up.  Die shrinks are what drive Moore's Law, and with it, long-term performance improvements.
  • Quizzical Member LegendaryPosts: 25,353
    Mendel said:
    Quizzical said:
    <snip>
    The GTX 1660 Ti launching now is also bad news for Nvidia in another sense.  AMD already has a 7 nm GPU out.  Navi is coming, with various rumors putting it in July or October, and AMD promising to say a lot more about it sometime this year.  That Nvidia is launching a new $280 GPU today means that they surely aren't going to have a $280 GPU on 7 nm anytime soon.  If they had a new 7 nm lineup coming by the middle of the year, they'd have canceled this part long ago.  That lends credibility to the rumors that Nvidia won't have anything on 7 nm until 2020.
    I really think you're probably spot on about this.  Releasing an 'improved' 12nm/16nm die now means that there's not anything better in the immediate future from Nvidia in the sub-$300 range.  I just got a 1050 Ti last year.  It looks like it will need to last until 2020 (or beyond) before I will be looking at an upgrade/replacement.  Good breakdown, @Quizzical.

    Now, I just really hope those 2020 rumors are accurate and not a 2022 reality.
    If the rumors are accurate and Nvidia doesn't launch any 7 nm GPUs until 2020, that would be bad news.  AMD has already launched a GPU on 7 nm, and has promised plenty more this year.

    This isn't like game development, where an isolated product can readily be delayed for years for reasons specific to that product.  It's possible for an entire process node to be delayed, canceled, or terrible enough that everyone ignores it.  But once it works, it works for everyone.  It's not just that Apple launched new iPhones using the process node last year.  AMD has already launched a high-power GPU with a decently big die on it.

    For Nvidia not to launch any 7 nm or smaller GPUs until 2022 would be extremely shocking, and the most plausible way that I could see that happening is if some cataclysmic event destroys the fabs where they were going to build the chips.  For example, a nuclear war.  In that case, product launches being delayed would be the least of our worries.

    In the absence of some earth-shattering event, Nvidia not launching any 7 nm or smaller GPUs until 2022 would be "is Nvidia going out of business?" levels of badness.  That's extremely unlikely to happen so soon.
  • Forgrimm Member EpicPosts: 3,059
    Ozmodan said:
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6 GB of memory, though.  While 6 GB is enough for today, what about next year or the year after?  I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
    It's benchmarking at about 25% faster than the 590 while having quite a bit less power draw. I don't think the 590's extra 2 GB of memory is going to offset that. https://www.techspot.com/review/1797-nvidia-geforce-gtx-1060-ti/
  • Quizzical Member LegendaryPosts: 25,353
    Forgrimm said:
    Ozmodan said:
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6 GB of memory, though.  While 6 GB is enough for today, what about next year or the year after?  I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
    It's benchmarking at about 25% faster than the 590 while having quite a bit less power draw. I don't think the 590's extra 2 GB of memory is going to offset that. https://www.techspot.com/review/1797-nvidia-geforce-gtx-1060-ti/
    Good thing that the Radeon RX 590 is a cheaper card.  I'd expect to see it drop in price in response to the new competition, too, as it's a tough sell at $260, but would make a lot more sense at $220.
  • Ozmodan Member EpicPosts: 9,726
    Forgrimm said:
    Ozmodan said:
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6 GB of memory, though.  While 6 GB is enough for today, what about next year or the year after?  I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
    It's benchmarking at about 25% faster than the 590 while having quite a bit less power draw. I don't think the 590's extra 2 GB of memory is going to offset that. https://www.techspot.com/review/1797-nvidia-geforce-gtx-1060-ti/
    I have not seen any benchmark that shows that much of a difference; most I have seen are 10-15%. I would still take the extra RAM.

  • Forgrimm Member EpicPosts: 3,059
    Ozmodan said:
    Forgrimm said:
    Ozmodan said:
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6 GB of memory, though.  While 6 GB is enough for today, what about next year or the year after?  I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
    It's benchmarking at about 25% faster than the 590 while having quite a bit less power draw. I don't think the 590's extra 2 GB of memory is going to offset that. https://www.techspot.com/review/1797-nvidia-geforce-gtx-1060-ti/
    I have not seen any benchmark that shows that much of a difference; most I have seen are 10-15%. I would still take the extra RAM.

    It's right in the link I posted. Across several games it's running on average 24% faster at 1080p and 25% faster at 1440p.
  • Forgrimm Member EpicPosts: 3,059
    Quizzical said:
    Forgrimm said:
    Ozmodan said:
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6 GB of memory, though.  While 6 GB is enough for today, what about next year or the year after?  I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
    It's benchmarking at about 25% faster than the 590 while having quite a bit less power draw. I don't think the 590's extra 2 GB of memory is going to offset that. https://www.techspot.com/review/1797-nvidia-geforce-gtx-1060-ti/
    Good thing that the Radeon RX 590 is a cheaper card.  I'd expect to see it drop in price in response to the new competition, too, as it's a tough sell at $260, but would make a lot more sense at $220.
    At 8% more expensive and 25% faster performance, it's not a bad tradeoff. But yeah, the 590 is going to need to come down in price for it to make sense for anyone to buy it at this point.
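
    Put in perf-per-dollar terms (prices as discussed in this thread, the 25% from the linked review, both approximate):

```python
# Rough perf-per-dollar comparison at the prices discussed in this thread.
# The +25% performance figure is the linked review's average; both prices are
# approximate and move around with sales.

rx590 = {"price": 260, "perf": 1.00}
gtx1660ti = {"price": 280, "perf": 1.25}

premium = gtx1660ti["price"] / rx590["price"] - 1
value_ratio = (gtx1660ti["perf"] / gtx1660ti["price"]) / (rx590["perf"] / rx590["price"])

print(f"Price premium of the 1660 Ti: {premium:.0%}")        # ~8%
print(f"Perf per dollar vs the RX 590: {value_ratio:.2f}x")  # ~1.16x
print(f"RX 590 repriced to $220: "
      f"{(rx590['perf'] / 220) / (gtx1660ti['perf'] / gtx1660ti['price']):.2f}x")  # ~1.02x

# At $260 the RX 590 loses on perf per dollar as well as on power; at roughly
# $220 its perf per dollar about catches up, which is the price-cut argument above.
```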
  • SlyLoK Member RarePosts: 2,698
    Didn't AMD already drop the price of the Vega 56 in response to this release? Seems like it would be the card to get if you wanted one right now.
  • Forgrimm Member EpicPosts: 3,059
    edited February 2019
    SlyLoK said:
    Didn't AMD already drop the price of the Vega 56 in response to this release? Seems like it would be the card to get if you wanted one right now.
    I think it's just a temporary drop though. But yes, for someone looking to buy right now it's a good option.
  • Mendel Member LegendaryPosts: 5,609
    Quizzical said:
    Mendel said:
    Quizzical said:
    <snip>
    The GTX 1660 Ti launching now is also bad news for Nvidia in another sense.  AMD already has a 7 nm GPU out.  Navi is coming, with various rumors putting it in July or October, and AMD promising to say a lot more about it sometime this year.  That Nvidia is launching a new $280 GPU today means that they surely aren't going to have a $280 GPU on 7 nm anytime soon.  If they had a new 7 nm lineup coming by the middle of the year, they'd have canceled this part long ago.  That lends credibility to the rumors that Nvidia won't have anything on 7 nm until 2020.
    I really think you're probably spot on about this.  Releasing an 'improved' 12nm/16nm die now means that there's not anything better in the immediate future from Nvidia in the sub-$300 range.  I just got a 1050 Ti last year.  It looks like it will need to last until 2020 (or beyond) before I will be looking at an upgrade/replacement.  Good breakdown, @Quizzical.

    Now, I just really hope those 2020 rumors are accurate and not a 2022 reality.
    If the rumors are accurate and Nvidia doesn't launch any 7 nm GPUs until 2020, that would be bad news.  AMD has already launched a GPU on 7 nm, and has promised plenty more this year.

    This isn't like game development, where an isolated product can readily be delayed for years for reasons specific to that product.  It's possible for an entire process node to be delayed, canceled, or terrible enough that everyone ignores it.  But once it works, it works for everyone.  It's not just that Apple launched new iPhones using the process node last year.  AMD has already launched a high-power GPU with a decently big die on it.

    For Nvidia not to launch any 7 nm or smaller GPUs until 2022 would be extremely shocking, and the most plausible way that I could see that happening is if some cataclysmic event destroys the fabs where they were going to build the chips.  For example, a nuclear war.  In that case, product launches being delayed would be the least of our worries.

    In the absence of some earth-shattering event, Nvidia not launching any 7 nm or smaller GPUs until 2022 would be "is Nvidia going out of business?" levels of badness.  That's extremely unlikely to happen so soon.
    Minor correction:  I meant 2022 for 7 nm *in the sub-$300 category*.  I agree that having *no 7 nm at all by 2022* would indeed be disastrous news.

    Shoulda stayed asleep today.




  • Quizzical Member LegendaryPosts: 25,353
    SlyLoK said:
    Didn't AMD already drop the price of the Vega 56 in response to this release? Seems like it would be the card to get if you wanted one right now.
    https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709 601302833 4814&IsNodeId=1&bop=And&Order=PRICE&PageSize=36

    That doesn't look like a price cut.  That looks like a discontinued part where the overpriced ones haven't sold out yet.

    A Radeon RX 590 isn't that expensive of a card to build.  AMD could probably make a decent profit selling them for $200 each.  A Radeon RX Vega 56 for $300 might not be so profitable.

    At minimum, AMD is going to want to discontinue and liquidate their Vega cards (with the possible exception of the Radeon VII) in advance of the Navi launch.  AMD surely knows just how good Navi is (or isn't), and likely knows pretty well when it's going to launch by now.
  • Forgrimm Member EpicPosts: 3,059
    Quizzical said:
    SlyLoK said:
    Didn't AMD already drop the price of the Vega 56 in response to this release? Seems like it would be the card to get if you wanted one right now.
    https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709 601302833 4814&IsNodeId=1&bop=And&Order=PRICE&PageSize=36

    That doesn't look like a price cut.  That looks like a discontinued part where the overpriced ones haven't sold out yet.

    A Radeon RX 590 isn't that expensive of a card to build.  AMD could probably make a decent profit selling them for $200 each.  A Radeon RX Vega 56 for $300 might not be so profitable.

    At minimum, AMD is going to want to discontinue and liquidate their Vega cards (with the possible exception of the Radeon VII) in advance of the Navi launch.  AMD surely knows just how good Navi is (or isn't), and likely knows pretty well when it's going to launch by now.
    It was reduced to $279 but sold out fast https://www.notebookcheck.net/AMD-briefly-drops-Vega-56-to-US-279-to-take-on-the-GTX-1660-Ti-sells-out-within-hours.410646.0.html
  • Quizzical Member LegendaryPosts: 25,353
    edited February 2019
    Forgrimm said:
    Quizzical said:
    SlyLoK said:
    Didn't AMD already drop the price of the Vega 56 in response to this release? Seems like it would be the card to get if you wanted one right now.
    https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709 601302833 4814&IsNodeId=1&bop=And&Order=PRICE&PageSize=36

    That doesn't look like a price cut.  That looks like a discontinued part where the overpriced ones haven't sold out yet.

    A Radeon RX 590 isn't that expensive of a card to build.  AMD could probably make a decent profit selling them for $200 each.  A Radeon RX Vega 56 for $300 might not be so profitable.

    At minimum, AMD is going to want to discontinue and liquidate their Vega cards (with the possible exception of the Radeon VII) in advance of the Navi launch.  AMD surely knows just how good Navi is (or isn't), and likely knows pretty well when it's going to launch by now.
    It was reduced to $279 but sold out fast https://www.notebookcheck.net/AMD-briefly-drops-Vega-56-to-US-279-to-take-on-the-GTX-1660-Ti-sells-out-within-hours.410646.0.html
    That looks like liquidating your inventory before it becomes too hard to do so, not offering longer term competition at that price point.
  • Ridelynn Member EpicPosts: 7,383
    The web is calling it an AMD price drop on Vega 56, but it seems it was just one SKU from one vendor, and one crappy website that wanted to make a story out of nothing.