
GeForce GTX 1660 Ti launches: Turing at its most efficient

13 Comments

  • RidelynnRidelynn Member EpicPosts: 7,061
    edited February 2019
    Here's the last thing I'll say about Steam surveys, then I'll let everyone have their last word

    I don't care if your Steam survey was inaccurate, or if you only submitted it once 12 years ago, or if you only hit submit while gaming on your ancient Gateway 2000 laptop so you could giggle at uploading bad information. It's still a sizable cross-section of an applicable market, and your personal experience with it doesn't dilute that.

    The Steam hardware survey was accurate enough to pick up a change in AMD/Intel market share on the CPU side. It seems completely implausible to me that a significant market share jump in GPUs (and we are talking about AMD's share doubling) would go completely undetected over the same period of time.

    Mining on PC is done; from my understanding, in many markets you can't even break even on electricity costs today. Even for those people who did mine, a lot of those cards ended up sold as used and are now in gaming rigs (who else would be buying them used?). That hurt both AMD and nVidia, and I believe it's directly related to the inventory overstock they have both recently discussed in their earnings statements. So I don't believe ~all~ the AMD market share went to mining rigs and nowhere else... and I don't believe the cards that did go to mining rigs are still out there mining away.

    I could believe it was all mining market share if we were still seeing mining going strong and those cards were still tied up in mining rigs crunching away. But now that the bubble has popped and a lot of that used inventory has started to flush through the system, we still aren't seeing any change at all in market share.

    Now, maybe Steam is wrong... that's certainly plausible, and there's a lot of anecdotal evidence here to suggest that. But I don't believe it would pick up the CPU change, as small a fraction and as infrequently as CPUs get upgraded, while completely missing a GPU shift of the supposed magnitude, given the much faster GPU upgrade cycle.

    If you believe the market share has changed, the Seeking Alpha article is a good source, but so far it's the only one I've seen linked here that indicates as much, and it's attempting to extrapolate from revenue, earnings reports, or other investor-related information (or at least that's my understanding; I may be wrong). The Steam survey, at least, is counting actual physical installations, which I won't claim is entirely accurate but will say is representative. If you have another source, let's see the numbers.
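    For a sense of scale on the sampling-noise side of this argument, here's a back-of-envelope sketch of the margin of error on a surveyed market-share figure. The sample sizes are hypothetical (Valve doesn't publish the survey's n, though it is widely assumed to be very large), and note this only covers random noise, not the selection-bias objections raised above:

    ```python
    import math

    # 95% confidence margin of error for a surveyed proportion,
    # using the standard normal approximation.
    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        """Half-width of the confidence interval for proportion p with n responses."""
        return z * math.sqrt(p * (1 - p) / n)

    for n in (10_000, 1_000_000):  # hypothetical survey sizes
        m = margin_of_error(p=0.15, n=n)  # e.g. a 15% GPU market share
        print(f"n={n:>9,}: +/-{m * 100:.2f} percentage points")
    ```

    With a sample in the millions, pure sampling noise is a small fraction of a percentage point, so a doubling of anyone's share would not be lost in the noise; the real debate is whether the sample is representative.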
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,183
    edited February 2019
    Ozmodan said:
    Ridelynn said:

    The thing with the 1660 Ti is that there are so many variations. With MSRP cards at $280 ranging to mass market cards up to $320-$330 with more premium build quality, more fans, some OC boosting and larger heat sinks for temperature headroom when OC.
    This is true of nearly every GPU though... I mean, just look at RTX 2060 - on Newegg there are 6 different SKUs just from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of various SKUs
    True, but so far with the 1660 Ti there are 25 known SKU's, which is pretty nuts! sure some are regional markets but at higher end pricing a RTX 2060 is a better buy.
    Please explain to me how the RTX 2060 is a better buy?  The ray tracing and DLSS on a 2060 is practically useless.  Hence it is not worth one penny more than the 1660.
    Sure - MSRP on the 1660 Ti is $280; on the 2060 it's $350. Those are the lower ends, so it varies depending on which card has your eye; i.e. some 1660 Ti models go for $330. Nevertheless, there is a basic $70 difference. So that's the price piece understood; on the other side, what about performance? At 1440p the 1660 Ti is 14% slower over 33 games (see the link below, near the end). At 1080p there is a 23% difference according to AnandTech.
     
    https://www.techspot.com/review/1799-geforce-gtx-1660-mega-benchmark/

    So excluding RTX features like DLSS and ray tracing (which I agree are in their infancy, but getting better), on performance alone plus the price difference it's worth considering. As I mentioned, if you are looking at a higher-end 1660 Ti model at $330 and compare that to a $350 2060, I'd pay the $20 extra for a 2060 and take the performance boost with it.
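    A rough perf-per-dollar sketch using the figures quoted above (the MSRPs and the ~14% 1440p gap from the TechSpot numbers; purely illustrative, street prices vary):

    ```python
    # Compare performance-per-dollar of two cards.
    def relative_value(price_a: float, perf_a: float,
                       price_b: float, perf_b: float) -> float:
        """Perf-per-dollar of card A relative to card B (1.0 = equal value)."""
        return (perf_a / price_a) / (perf_b / price_b)

    # Normalize the RTX 2060 to 100; the 1660 Ti is ~14% slower at 1440p.
    v_msrp = relative_value(price_a=280, perf_a=86, price_b=350, perf_b=100)
    print(f"1660 Ti vs 2060 at MSRP: {v_msrp:.2f}x")  # > 1: 1660 Ti better value

    # At the $330 high-end 1660 Ti models, the value advantage flips.
    v_high = relative_value(price_a=330, perf_a=86, price_b=350, perf_b=100)
    print(f"1660 Ti at $330 vs 2060: {v_high:.2f}x")  # < 1: 2060 better value
    ```

    Which is exactly the point being made: at $280 the 1660 Ti wins on value, but at $330 the extra $20 for a 2060 buys proportionally more performance.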







  • xD_GamingxD_Gaming Member EpicPosts: 2,686
    edited February 2019
    Makes buying a VII a lot easier. 4096-bit memory bus, LOL, compared to nVidia's 256-bit, LOL. Only delusional people still buy dated tech.

    Vega 56: memory bus = 2048 bits; bandwidth 409.6 GB/s

    Not even a contest anymore.
    There is a multiverse inside our minds which millions live.
    Twitter : @xD_Gaming_Merch
    xD Merch : https://bit.ly/2v13MT8
    "Dragons are tilly folly !"
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,183
    makes buying an VII a lot easier. 4096 bit memory bus LOL  compared to nvidia's 256 bit LOL . Just dellusional people still buy dated tech.

    Vega 54  Memory bus = 2048bits ;Bandwidth 409.6 GB/s

    Not even a contest anymore.
    Huh?
    There is absolutely no meaningful reason, unless you mine, to get a Radeon VII at MSRP. In AMD's internal review-guide numbers, by their own admission, a 2080 will still be faster at the same price.



  • RidelynnRidelynn Member EpicPosts: 7,061
    I thought the 2060 was faster than a 1660Ti. 2060 was 1070Ti level, 1660 is 1070 level - or so I thought.

    I mean, yeah, that isn't a very wide margin of performance there, but still, it's also not "the same except for RayTracing and DLSS". And there was about a $70 delta between the 1070 and 1070Ti MSRP (street price is a different matter), so the price difference between a 1660Ti and 2060 is historically appropriate (regardless of how otherwise "appropriate" we may feel about it).

    Also, I can think of a few reasons where very large memory bandwidth is a good benefit. But by and large, memory bandwidth isn't the ultimate indicator of gaming performance, which is what I care about, and the intended purpose of most of these GPU cards we are discussing.

    Apart from that, I can think of one debatable reason and one legit purpose for buying a Radeon VII at MSRP over a 2080. The legit purpose is 4K video editing, where the 16 GB of HBM shines. Here is a review; it focuses on gaming performance, but it mentions video editing specifically if you read the article. If you frequently edit a lot of video and want something cheaper than a $5,000+ Pro card, the VII is a steal.

    The debatable reason would be a protest purchase. The performance is slower than a 2080, but only marginally so, so it's not like you are buying something entirely inferior or unsuitable... it's still the same class of performance. Every R7 purchase sends a pretty clear message to nVidia, and supports AMD.

    I don't necessarily recommend that - I think you should get the best card, regardless of manufacturer, for the amount you have budgeted. But it's a plausible reason.
  • xD_GamingxD_Gaming Member EpicPosts: 2,686
    edited February 2019
    makes buying an VII a lot easier. 4096 bit memory bus LOL  compared to nvidia's 256 bit LOL . Just dellusional people still buy dated tech.

    Vega 54  Memory bus = 2048bits ;Bandwidth 409.6 GB/s

    Not even a contest anymore.
    Huh?
    There is absolutely no meaningful reason, unless you mine, to get a Radeon VII at MRSP. In AMD's internal review guide numbers, by their own admission a 2080 will still be faster at the same price.


    Fast "now"? I know no one who builds PCs for "now". What about when nVidia can't fabricate their designs on 7nm, and developers start taking advantage of HBM and the bandwidth it can produce, specifically for the console markets? Like I stated, nVidia is playing for "now", and they have no real future unless it is licensed through AMD.


    Price point: the RX 580/590 will play most games at 1080p, which is where the mid-level market sits. For the price of one 1660 I can CrossFire two PowerColor Red Dragons, with a $99.00 AMD game bundle on top... lol. The 1660 is a joke, period.

    https://www.newegg.com/Product/Product.aspx?Item=N82E16814131720&Description=rx580&cm_re=rx580-_-14-131-720-_-Product

    The last part: when nVidia gets caught with unfair trade practices, it will pretty much tank them, and Intel will be picking them up to further Intel's push into the GPU market.
  • xD_GamingxD_Gaming Member EpicPosts: 2,686
    edited February 2019
    Ridelynn said:
    I thought the 2060 was faster than a 1660Ti. 2060 was 1070Ti level, 1660 is 1070 level - or so I thought.

    I mean, yeah, that isn't a very wide margin of performance there, but still, it's also not "the same except for RayTracing and DLSS". And there was about a $70 delta between the 1070 and 1070Ti MSRP (street price is a different matter), so the price difference between a 1660Ti and 2060 is historically appropriate (regardless of how otherwise "appropriate" we may feel about it).

    Also, I can think of a few reasons where very large memory bandwidth is a good benefit. But by and large, memory bandwidth isn't the ultimate indicator of gaming performance, which is what I care about, and the intended purpose of most of these GPU cards we are discussing.

    Apart from that, I can think of one debatable reason to buy a Radeon VII at MSRP over a 2080, and one legit purpose. The legit purpose is 4K video editing, where the 16GB of HBM shines. Here is a review, it focuses on gaming performance, but it mentions video editing specifically if you read the article. If you frequently edit a lot of video, and want to buy something cheaper than a $5,000+ Pro card, the VII is a steal.

    The debatable reason would be a protest purchase. The performance is slower, but only marginally so, than a 2080, so it's not like you are buying something that is entirely inferior or unsuitable... it's still same class of performance. Every R7 purchase sends a pretty clear message to nVidia, and supports AMD. 

    I don't necessarily recommend that - I think you should get the best card, regardless of manufacturer, for the amount you have budgeted. But it's a plausible reason.
    Multimedia encoding of any kind, which includes streaming.

    One other thing: drivers. AMD's Adrenalin platform is hands down the best set of drivers I've used in all of my computing days.

    From what I've seen of nVidia, they are using the same driver formula from 2000....

    Radeon VII out of stock :| // RTX 2080 in stock // RTX 2070 price slashed ......
  • VrikaVrika Member EpicPosts: 6,437
    Ridelynn said:
    I thought the 2060 was faster than a 1660Ti. 2060 was 1070Ti level, 1660 is 1070 level - or so I thought.

    I mean, yeah, that isn't a very wide margin of performance there, but still, it's also not "the same except for RayTracing and DLSS". And there was about a $70 delta between the 1070 and 1070Ti MSRP (street price is a different matter), so the price difference between a 1660Ti and 2060 is historically appropriate (regardless of how otherwise "appropriate" we may feel about it).

    Also, I can think of a few reasons where very large memory bandwidth is a good benefit. But by and large, memory bandwidth isn't the ultimate indicator of gaming performance, which is what I care about, and the intended purpose of most of these GPU cards we are discussing.

    Apart from that, I can think of one debatable reason to buy a Radeon VII at MSRP over a 2080, and one legit purpose. The legit purpose is 4K video editing, where the 16GB of HBM shines. Here is a review, it focuses on gaming performance, but it mentions video editing specifically if you read the article. If you frequently edit a lot of video, and want to buy something cheaper than a $5,000+ Pro card, the VII is a steal.

    The debatable reason would be a protest purchase. The performance is slower, but only marginally so, than a 2080, so it's not like you are buying something that is entirely inferior or unsuitable... it's still same class of performance. Every R7 purchase sends a pretty clear message to nVidia, and supports AMD. 

    I don't necessarily recommend that - I think you should get the best card, regardless of manufacturer, for the amount you have budgeted. But it's a plausible reason.
    Multimedia encoding of any kind which includes streaming.

    One other thing, drivers, AMD Adrenline platform is hands down the best set of drivers i've used in all of my computing days. 

    From what I've seen of \/d@ , they are using the same driver formula from 2000....



    RVII out of stock :|  // RTX 2080 instock // RTX 2070 price slashed ......
    Newegg lists only two different Radeon VII models; it doesn't look like AMD has any intention of making more than just a few Radeon VII cards.

    There's no reliable sales data, but if you count Newegg reviews, then nVidia has sold 70 RTX 2080 Tis for each Radeon VII sold by AMD.
  • RidelynnRidelynn Member EpicPosts: 7,061
    I've had a suspicion, ever since the day AMD cranked out way too many Southern Islands cards (and ended up with something like 3 generations of rebadges), that they've been extremely conservative with production runs.

    You don't see nearly as many different SKUs from AIBs, you don't see many overstock sales, and you do see an awful lot of "sold out".

  • QuizzicalQuizzical Member LegendaryPosts: 22,127
    Ozmodan said:
    Ridelynn said:

    The thing with the 1660 Ti is that there are so many variations. With MSRP cards at $280 ranging to mass market cards up to $320-$330 with more premium build quality, more fans, some OC boosting and larger heat sinks for temperature headroom when OC.
    This is true of nearly every GPU though... I mean, just look at RTX 2060 - on Newegg there are 6 different SKUs just from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of various SKUs
    True, but so far with the 1660 Ti there are 25 known SKU's, which is pretty nuts! sure some are regional markets but at higher end pricing a RTX 2060 is a better buy.
    Please explain to me how the RTX 2060 is a better buy?  The ray tracing and DLSS on a 2060 is practically useless.  Hence it is not worth one penny more than the 1660.
    Sure - MSRP on the 1660ti is $280 on the 2060 is $350. Those are the lower ends so it varies on what card has your eye ie. Some 1660 Ti models go for $330. Nevertheless there is a basic $70 difference. So that is one piece understood on the other side what about performance? Well @1440p the 1660 Ti is 14% slower over 33 games see below - near the end. At 1080p there is 23% difference according to anandtech.
     
    https://www.techspot.com/review/1799-geforce-gtx-1660-mega-benchmark/

    so excluding RTX features like DLSS and Ray Tracing which I agree are in infancy stages but getting better and will get better on performance alone and the price difference it’s worth considering. As I mentioned if you are looking at a 1660 Ti higher end model at $330 and you compare that to a $350 2060 I’d take the $20 extra for a 2060 and the performance boost with it.
    $350 is in no way a "lower end" card except in a relative sense with much of the new lineup not yet released.  $350 is mid-range to upper mid-range.  $100 is a lower end card.

    DLSS is garbage and will always be garbage.  It's dead, and the only question is whether Nvidia knows it yet.  They probably do, but don't want to let their fanboys in on that secret until they come up with the next gimmick.

    Real-time ray tracing probably has a future.  DLSS doesn't.
  • QuizzicalQuizzical Member LegendaryPosts: 22,127
    makes buying an VII a lot easier. 4096 bit memory bus LOL  compared to nvidia's 256 bit LOL . Just dellusional people still buy dated tech.

    Vega 54  Memory bus = 2048bits ;Bandwidth 409.6 GB/s

    Not even a contest anymore.
    There's no reason to care about memory bus width for its own sake.  What matters is memory bandwidth, capacity, the power it takes to get that, and occasionally space.  Bus width is an input into those things, as is memory clock speed, but it's not something that you should care about in isolation.

    But if you do want to play that game, then the memory on the Radeon VII is clocked at 1 GHz, while that of the GeForce GTX 1060 is clocked at 2 GHz, and 2 GHz is a lot more than 1 GHz.  Right?
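    To make the point concrete: peak memory bandwidth is just bus width times effective data rate, so neither number means much in isolation. A quick sketch using commonly cited specs for the cards in this thread (figures are illustrative, not official):

    ```python
    # Peak memory bandwidth = (bus width in bits / 8 bytes) * effective data rate.
    def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
        """Peak memory bandwidth in GB/s from bus width and data rate (GT/s)."""
        return bus_bits / 8 * data_rate_gtps

    cards = {
        "Radeon VII (HBM2, 4096-bit @ 2 GT/s)": bandwidth_gbs(4096, 2.0),
        "RTX 2080 (GDDR6, 256-bit @ 14 GT/s)":  bandwidth_gbs(256, 14.0),
        "Vega 56 (HBM2, 2048-bit @ 1.6 GT/s)":  bandwidth_gbs(2048, 1.6),
        "GTX 1060 (GDDR5, 192-bit @ 8 GT/s)":   bandwidth_gbs(192, 8.0),
    }
    for name, bw in cards.items():
        print(f"{name}: {bw:.1f} GB/s")
    ```

    The Radeon VII's wide bus at a low clock and the 2080's narrow bus at a high clock both land at usable bandwidth figures (1024 vs 448 GB/s here); what a given workload actually needs is the question, not either input on its own.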
  • QuizzicalQuizzical Member LegendaryPosts: 22,127
    makes buying an VII a lot easier. 4096 bit memory bus LOL  compared to nvidia's 256 bit LOL . Just dellusional people still buy dated tech.

    Vega 54  Memory bus = 2048bits ;Bandwidth 409.6 GB/s

    Not even a contest anymore.
    Huh?
    There is absolutely no meaningful reason, unless you mine, to get a Radeon VII at MRSP. In AMD's internal review guide numbers, by their own admission a 2080 will still be faster at the same price.
    There are other compute purposes besides mining.  If you're doing something where you know that you need 16 GB, then the RTX 2080 (or even 2080 Ti) is a non-starter, but the Radeon VII might be quite nice.  Similarly if it's a compute thing that needs a ton of memory bandwidth.

    There are also weird corner cases.  The reason I bought a Vega 64 over a GTX 1080 Ti is the idle power consumption with three 144 Hz monitors attached.  I'm not sure if that is fixed in Turing yet, but if not, the same issue would justify buying a Radeon VII over an RTX 2080 Ti if you're using my monitor setup.

    I would agree that for most gamers, an RTX 2080 makes more sense than a Radeon VII.  But "absolutely no meaningful reason" greatly overstates the case.
  • QuizzicalQuizzical Member LegendaryPosts: 22,127
    Vrika said:
    Ridelynn said:
    I thought the 2060 was faster than a 1660Ti. 2060 was 1070Ti level, 1660 is 1070 level - or so I thought.

    I mean, yeah, that isn't a very wide margin of performance there, but still, it's also not "the same except for RayTracing and DLSS". And there was about a $70 delta between the 1070 and 1070Ti MSRP (street price is a different matter), so the price difference between a 1660Ti and 2060 is historically appropriate (regardless of how otherwise "appropriate" we may feel about it).

    Also, I can think of a few reasons where very large memory bandwidth is a good benefit. But by and large, memory bandwidth isn't the ultimate indicator of gaming performance, which is what I care about, and the intended purpose of most of these GPU cards we are discussing.

    Apart from that, I can think of one debatable reason to buy a Radeon VII at MSRP over a 2080, and one legit purpose. The legit purpose is 4K video editing, where the 16GB of HBM shines. Here is a review, it focuses on gaming performance, but it mentions video editing specifically if you read the article. If you frequently edit a lot of video, and want to buy something cheaper than a $5,000+ Pro card, the VII is a steal.

    The debatable reason would be a protest purchase. The performance is slower, but only marginally so, than a 2080, so it's not like you are buying something that is entirely inferior or unsuitable... it's still same class of performance. Every R7 purchase sends a pretty clear message to nVidia, and supports AMD. 

    I don't necessarily recommend that - I think you should get the best card, regardless of manufacturer, for the amount you have budgeted. But it's a plausible reason.
    Multimedia encoding of any kind which includes streaming.

    One other thing, drivers, AMD Adrenline platform is hands down the best set of drivers i've used in all of my computing days. 

    From what I've seen of \/d@ , they are using the same driver formula from 2000....



    RVII out of stock :|  // RTX 2080 instock // RTX 2070 price slashed ......
    Newegg lists only two different Radeon VII models, it doesn't look like AMD has any intention of making more than just a few Radeon VII cards.

    There's no reliable data of sales numbers, but if you count Newegg reviews then NVidia has sold 70 RTX 2080 Ti's for each Radeon VII sold by AMD.
    It's a soft launch, as AMD wanted to get the cards out there as soon as they possibly could, rather than waiting a couple of months until they had enough inventory to keep them in stock forever.  There weren't a whole lot of GTX 1080s available shortly after launch, either, but that didn't mean it was a bad card.
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,183
    Quizzical said:
    Ozmodan said:
    Ridelynn said:

    The thing with the 1660 Ti is that there are so many variations. With MSRP cards at $280 ranging to mass market cards up to $320-$330 with more premium build quality, more fans, some OC boosting and larger heat sinks for temperature headroom when OC.
    This is true of nearly every GPU though... I mean, just look at RTX 2060 - on Newegg there are 6 different SKUs just from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of various SKUs
    True, but so far with the 1660 Ti there are 25 known SKU's, which is pretty nuts! sure some are regional markets but at higher end pricing a RTX 2060 is a better buy.
    Please explain to me how the RTX 2060 is a better buy?  The ray tracing and DLSS on a 2060 is practically useless.  Hence it is not worth one penny more than the 1660.
    Sure - MSRP on the 1660ti is $280 on the 2060 is $350. Those are the lower ends so it varies on what card has your eye ie. Some 1660 Ti models go for $330. Nevertheless there is a basic $70 difference. So that is one piece understood on the other side what about performance? Well @1440p the 1660 Ti is 14% slower over 33 games see below - near the end. At 1080p there is 23% difference according to anandtech.
     
    https://www.techspot.com/review/1799-geforce-gtx-1660-mega-benchmark/

    so excluding RTX features like DLSS and Ray Tracing which I agree are in infancy stages but getting better and will get better on performance alone and the price difference it’s worth considering. As I mentioned if you are looking at a 1660 Ti higher end model at $330 and you compare that to a $350 2060 I’d take the $20 extra for a 2060 and the performance boost with it.
    $350 is in no way a "lower end" card except in a relative sense with much of the new lineup not yet released.  $350 is mid-range to upper mid-range.  $100 is a lower end card.

    DLSS is garbage and will always be garbage.  It's dead, and the only question is whether Nvidia knows it yet.  They probably do, but don't want to let their fanboys in on that secret until they come up with the next gimmick.

    Real-time ray tracing probably has a future.  DLSS doesn't.
    I didn't say that $350 is low end in the sense you're saying.
    I meant this:
    1660 Ti range: $280 (low) to $330 (high)
    2060 range: $350 and up
    i.e. within the price ranges of the products.

    I 100% agree $350 is mid-range, and I'd also add that Nvidia is shifting the pricing segments up this generation, which is not good at all.

    As for DLSS, it works nearly perfectly on Metro Exodus. I appreciate the innovation and implementation. It is a new take on the tech and has to start somewhere, then iterate. It's a great example of an agile approach: progress over perfection. Get something out the door and iterate on it, which is the same approach game devs will take with it. And at least they are financially backing the tech.



  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,183

    Quizzical said:
    makes buying an VII a lot easier. 4096 bit memory bus LOL  compared to nvidia's 256 bit LOL . Just dellusional people still buy dated tech.

    Vega 54  Memory bus = 2048bits ;Bandwidth 409.6 GB/s

    Not even a contest anymore.
    Huh?
    There is absolutely no meaningful reason, unless you mine, to get a Radeon VII at MRSP. In AMD's internal review guide numbers, by their own admission a 2080 will still be faster at the same price.
    There are other compute purposes besides mining.  If you're doing something where you know that you need 16 GB, then the RTX 2080 (or even 2080 Ti) is a non-starter, but the Radeon VII might be quite nice.  Similarly if it's a compute thing that needs a ton of memory bandwidth.

    There are also weird corner cases.  The reason I bought a Vega 64 over a GTX 1080 Ti is the idle power consumption with three 144 Hz monitors attached.  I'm not sure if that is fixed in Turing yet, but if not, the same issue would justify buying a Radeon VII over an RTX 2080 Ti if you're using my monitor setup.

    I would agree that for most gamers, an RTX 2080 makes more sense than a Radeon VII.  But "absolutely no meaningful reason" greatly overstates the case.
    Sure, there are a few niche use cases. I had no idea about the multi-monitor setup; that is interesting!

    For several generations Adobe's video software has favoured Nvidia, and it still does. The extra memory is very helpful on the Radeon, but the difference in testing between the two is minimal from what I've read. At places like Video Copilot you can read how much more favourable Nvidia has been on hardware acceleration since Polaris.

    Personally I was hoping the Radeon VII would be great, but it is just too loud and hot, and I'm disappointed, as I expected more, especially with the die shrink. But I'm equally disappointed with the RTX high-end pricing shenanigans. Very.



  • QuizzicalQuizzical Member LegendaryPosts: 22,127
    I didn’t say that $350 is low end in the sense you’re saying.
    I was meaning this -
    1660 Ti range $280 (low) to $330 (high)
    2060 range $350 and up
    meant within the ranges of the products.

    I 100% agree $350 is mid range and I’d also add Nvidia is shifting the pricing distribution segments this gen up, and that is not good at all

    As for DLSS it works near perfect on metro exodus. I appreciate the innovation and implementation. It is new take on tech and has to start somewhere then iterate on it. It’s totally a great example of an agile approach, progress over perfection. Get something out the door and iterate on it which is the same approach game devs will have with it. And at least they are fiscally supporting in the tech. 
    Ah, sorry for misunderstanding your point: "lower end" meaning the price range of SKUs for a given card.

    As for DLSS in Metro Exodus, I'd really want to see a comparison of it with traditional upscaling akin to what the video linked from this thread did with Battlefield 5:

    https://forums.mmorpg.com/discussion/479496/apparently-dlss-is-as-bad-as-we-thought-it-would-be#latest

    Depending on the resolution at which a game is rendered for DLSS, it could be tuned for relatively higher frame rates at worse image quality, or lower frame rates at better image quality. What you have to do is compare it to simple upscaling at the same frame rate in the same portion of the same game, and then compare the image quality. If DLSS can't beat traditional upscaling in image quality at the same frame rate, in spite of simple upscaling having more samples to work with, then it's garbage.
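    The comparison being described here can be made concrete with a standard full-reference quality metric such as PSNR: upscale the low-resolution render, then score it against the native-resolution frame. A toy sketch, using NumPy only, a synthetic gradient image, and nearest-neighbour upscaling (not real game frames, and real comparisons would use captured frames and likely a perceptual metric too):

    ```python
    import numpy as np

    def psnr(ref: np.ndarray, img: np.ndarray) -> float:
        """Peak signal-to-noise ratio in dB for 8-bit images (higher = closer)."""
        mse = np.mean((ref.astype(float) - img.astype(float)) ** 2)
        return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

    # Synthetic "native resolution" test image: a smooth diagonal gradient.
    h = w = 256
    native = np.fromfunction(lambda y, x: (x + y) / (h + w) * 255, (h, w))

    # Simulate rendering at half resolution, then nearest-neighbour upscaling.
    low = native[::2, ::2]
    upscaled = np.repeat(np.repeat(low, 2, axis=0), 2, axis=1)

    print(f"PSNR of 2x nearest-neighbour upscale: {psnr(native, upscaled):.1f} dB")
    ```

    Running the same scoring against DLSS output at a matched frame rate is exactly the test proposed above: whichever reconstruction scores closer to the native frame wins.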
  • RidelynnRidelynn Member EpicPosts: 7,061
    I just watched a Metro comparison video:

    https://m.youtube.com/watch?v=5JczNqpqwfI&time_continue=2

    RT + DLSS "worked", but the image was much softer, almost blurry, and the colors muted. And there was a 20-50% FPS hit. I thought the standard rasterized image looked much better.

    But apart from the FPS, everything else is my subjective opinion and may differ from your own. The tech did work; I didn't notice any shimmering or other obvious glitches.
  • xD_GamingxD_Gaming Member EpicPosts: 2,686
    I watched some video of ray tracing as well; it looks more like a blur effect than anything.
  • xD_GamingxD_Gaming Member EpicPosts: 2,686
    edited February 2019
    Quizzical said:
    makes buying an VII a lot easier. 4096 bit memory bus LOL  compared to nvidia's 256 bit LOL . Just dellusional people still buy dated tech.

    Vega 54  Memory bus = 2048bits ;Bandwidth 409.6 GB/s

    Not even a contest anymore.
    There's no reason to care about memory bus width for its own sake.  What matters is memory bandwidth, capacity, the power it takes to get that, and occasionally space.  Bus width is an input into those things, as is memory clock speed, but it's not something that you should care about in isolation.

    But if you do want to play that game, then the memory on the Radeon VII is clocked at 1 GHz, while that of the GeForce GTX 1060 is clocked at 2 GHz, and 2 GHz is a lot more than 1 GHz.  Right?

    I do multimedia encoding and also work that includes After Effects from Adobe; it is a huge advantage to have as much bandwidth as possible when rendering. In your own statement you are looking at 2 GHz on a 256-bit memory bus vs 1 GHz on a 4096-bit one. You think a faster memory clock with a narrow bus width is better? It's a weird thing to think you can simply increase clocks forever and never widen the memory bus. GDDR6 is Nvidia's answer to the future? That future looks bleak, IMHO.
  • ForgrimmForgrimm Member EpicPosts: 2,995
    Looks like the 1660 and 1650 will be coming out over the next couple of months. Prices are expected to be $229 and $179. https://www.techspot.com/news/78960-nvidia-reportedly-launch-geforce-gtx-1660-1650-march.html
• Ozmodan Member EpicPosts: 9,726
    edited February 2019
    Ozmodan said:
    Ridelynn said:

    The thing with the 1660 Ti is that there are so many variations. With MSRP cards at $280 ranging to mass market cards up to $320-$330 with more premium build quality, more fans, some OC boosting and larger heat sinks for temperature headroom when OC.
    This is true of nearly every GPU though... I mean, just look at RTX 2060 - on Newegg there are 6 different SKUs just from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of various SKUs
True, but so far with the 1660 Ti there are 25 known SKUs, which is pretty nuts! Sure, some are for regional markets, but at the higher-end pricing an RTX 2060 is a better buy.
Please explain to me how the RTX 2060 is a better buy. The ray tracing and DLSS on a 2060 are practically useless, hence it is not worth one penny more than the 1660 Ti.
Sure: MSRP on the 1660 Ti is $280 and on the 2060 it's $350. Those are the lower ends, so it varies depending on which card has your eye; some 1660 Ti models go for $330. Nevertheless there is a basic $70 difference. So that is one piece understood; on the other side, what about performance? At 1440p the 1660 Ti is 14% slower over 33 games (see below, near the end). At 1080p there is a 23% difference according to AnandTech.
     
    https://www.techspot.com/review/1799-geforce-gtx-1660-mega-benchmark/

So excluding RTX features like DLSS and ray tracing, which I agree are in their infancy but getting better, on performance alone versus the price difference it's worth considering. As I mentioned, if you are looking at a higher-end 1660 Ti model at $330 and you compare that to a $350 2060, I'd pay the $20 extra for a 2060 and take the performance boost with it.
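For what it's worth, the value trade-off described above can be put into numbers. Using the MSRPs and the 14% 1440p gap quoted in this thread (the normalization to "performance units" is just an illustration, not a benchmark):

```python
# Price-per-performance sketch from figures quoted in the thread:
# 1660 Ti at $280 MSRP, RTX 2060 at $350, 1660 Ti ~14% slower at 1440p.

def dollars_per_perf(price: float, relative_perf: float) -> float:
    """Lower is better: dollars paid per unit of relative performance."""
    return price / relative_perf

# Normalize the RTX 2060 to 1.00; the 1660 Ti then sits at ~0.86.
gtx_1660_ti    = dollars_per_perf(280, 0.86)  # ~$325.6 per perf unit
rtx_2060       = dollars_per_perf(350, 1.00)  # $350.0 per perf unit
premium_1660ti = dollars_per_perf(330, 0.86)  # ~$383.7 per perf unit

print(gtx_1660_ti, rtx_2060, premium_1660ti)
```

On these numbers, a $280 1660 Ti is the better value, but a $330 premium 1660 Ti works out worse per frame than a $350 2060, which matches the argument above.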

You cannot do ray tracing or DLSS on a 2060, unless of course you want FPS in the single digits. My friend at Microcenter tells me they are selling very few 2060s; there just does not seem to be a market for them. They already canceled a backup order because the stock is just not moving. It is a card that fits few buyers. Few gamers play at 1440p. From what I have seen in benchmarks, your percentages are quite high; I very much doubt there is that much difference.

As to the new Metro, I have seen the game with ray tracing and DLSS turned on on my friend's 2080. First off, it majorly kills performance, and to both of us the images looked more blurry than with it off. So far my friend has not found a game where turning ray tracing on is worth it. He is rather disappointed in Nvidia at this point; he got caught up in their gimmick.

• xD_Gaming Member EpicPosts: 2,686
    edited February 2019
I think the bottom line is that Nvidia still doesn't have a comparable card in any price range. They are throwing mud at the wall at this point.

Look at the cards they have: the technology is dated, their driver packages are dated, and that guy's leather motorcycle jacket is dated. Look for a CEO change in 2019, because Dr. Lisa Su is kicking ass right now :) <3 Dr. Su
• Ridelynn Member EpicPosts: 7,061
    I think both the 1660Ti and 2060 are very competitive for nVidia. 1660Ti forces AMD to move the RX590 and the 2060 forces AMD to move the Vega 56 to be competitive -- at least until they can show Navi off. Sure, I wish the nVidia cards were cheaper, but that's speaking as a consumer. AMD still has the <$250 market wrapped up with the 570/580, but that only lasts so long and even those cards have been out for a while (being minor bump refreshes of the 470/480, Polaris has been around since June 2016).

If I were an nVidia investor, I would say they are marketed about right, and the only thing I would be concerned about is that the 2060 includes all the additional cost of RTX without being able to deliver the benefits... they could have put out a chip with the same rasterizing performance, minus RTX, and sold it for about the same price at a much lower production cost.

    Rumor is that a 1660 and 1650 will come out later in March/April, and hit the upper $100 and lower $200 price points, and that will fill out the lineup for nVidia on this generation.

    I think the 2070 is a ridiculous card standing next to the 2060, and it didn't make a lot of sense in the lineup before the 2060 was announced anyway. I think the 2080 is priced to the upper end of what I would ever consider reasonable for a top tier card, and the 2080Ti/Titan are just out of any ballpark I plan to play in. 

    I think nVidia's overall strategy is to skew the entire GPU cost lineup higher - so that people no longer think of a "budget" GPU as the <$200 market, but rather somewhere north of $300, and the mid and upper tiers significantly higher than that even. I can see that making sense - they have a large majority marketshare so they can push the market around, it's before a major release by AMD, and well before a push by Intel to get into the market. If they can push the price points all higher before the competition can catch up, any resulting "price war" from competition is softened significantly, and margins can stay higher long term.
• Torval Member LegendaryPosts: 20,003
    edited February 2019
    Ridelynn said:
I thought the 2060 was faster than a 1660 Ti. The 2060 is at 1070 Ti level, the 1660 Ti at 1070 level - or so I thought.

    I mean, yeah, that isn't a very wide margin of performance there, but still, it's also not "the same except for RayTracing and DLSS". And there was about a $70 delta between the 1070 and 1070Ti MSRP (street price is a different matter), so the price difference between a 1660Ti and 2060 is historically appropriate (regardless of how otherwise "appropriate" we may feel about it).

    Also, I can think of a few reasons where very large memory bandwidth is a good benefit. But by and large, memory bandwidth isn't the ultimate indicator of gaming performance, which is what I care about, and the intended purpose of most of these GPU cards we are discussing.

Apart from that, I can think of one debatable reason and one legit purpose for buying a Radeon VII at MSRP over a 2080. The legit purpose is 4K video editing, where the 16GB of HBM shines. Here is a review; it focuses on gaming performance, but it mentions video editing specifically if you read the article. If you frequently edit a lot of video and want to buy something cheaper than a $5,000+ pro card, the VII is a steal.

The debatable reason would be a protest purchase. The performance is slower than a 2080, but only marginally so; it's not like you are buying something entirely inferior or unsuitable... it's still the same class of performance. Every Radeon VII purchase sends a pretty clear message to nVidia and supports AMD.

    I don't necessarily recommend that - I think you should get the best card, regardless of manufacturer, for the amount you have budgeted. But it's a plausible reason.
The Radeon VII on Linux is a reason for those users to choose AMD over nV. Phoronix has extensive benchmarks that put the Radeon VII squarely on par with the 2080/1080 Ti, and even the 2080 Ti in some tests. Granted, Linux users are an incredibly small share of the gaming market, but the OS is used extensively in universities, science, industry, and servers, where Windows holds the micro-minority share.

    Things started loading. Here is the article. https://www.phoronix.com/scan.php?page=article&item=radeon-vii-linux&num=1

    It's probably not interesting to a Windows user, but for Linux users this signals good things ahead for us.
    Fedora - A modern, free, and open source Operating System. https://getfedora.org/

    traveller, interloper, anomaly, iteration


• Torval Member LegendaryPosts: 20,003
    Ridelynn said:
    I think both the 1660Ti and 2060 are very competitive for nVidia. 1660Ti forces AMD to move the RX590 and the 2060 forces AMD to move the Vega 56 to be competitive -- at least until they can show Navi off. Sure, I wish the nVidia cards were cheaper, but that's speaking as a consumer. AMD still has the <$250 market wrapped up with the 570/580, but that only lasts so long and even those cards have been out for a while (being minor bump refreshes of the 470/480, Polaris has been around since June 2016).
    <snip>
Not for me. I'm not paying that much money for a 6GB card, especially one that will struggle to hold its own at 1440p. It's a shitty deal and a 970 all over again, in my opinion. The xx70 and xx60 series cards are pretty bunk compared to their previous-generation namesakes.