
Nvidia unveils cut-down GeForce GTX 1060 3GB at £190/$199 price point

The user and all related content have been deleted.

The turtle doesn't move when it sticks its neck out.
Comments

  • Forgrimm Member Epic Posts: 3,059
    If my 960 dies I'll probably get a 1060 as a replacement. Not sure if I'd go with this one or the 6GB one though. I was impressed with the 1060 6GB version's improvement over the 960: http://www.geforce.com/whats-new/articles/nvidia-geforce-gtx-1060 Not sure how this 3GB version holds up.
  • Quizzical Member Legendary Posts: 25,348
    I say boo to the naming scheme.  If you make two different cards, they should have two different names.  Different cards with the same name is not cool.

    How long before someone makes a 6 GB version of the 3 GB version of the GTX 1060, with a compute unit disabled, resulting in a card slower than an RX 480 but with the same name as another card that's typically faster than an RX 480?
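    (For scale, here's a quick back-of-the-envelope sketch in Python. The 128-cores-per-SM figure is the published Pascal consumer layout, and the 10-vs-9 SM counts match Nvidia's own 1060 specs; the RX 480 comparison isn't modeled here.)

    ```python
    # Quick shader math for the two GTX 1060 configs.
    # Pascal consumer chips (GP106 included) have 128 CUDA cores per SM.
    CORES_PER_SM = 128

    full_1060 = 10 * CORES_PER_SM  # GTX 1060 6GB as reviewed: 1280 shaders
    cut_1060 = 9 * CORES_PER_SM    # one SM disabled (the 3GB config): 1152

    print(full_1060, cut_1060)  # 1280 1152
    print(f"{100 * (1 - cut_1060 / full_1060):.0f}% fewer shaders")  # 10% fewer
    ```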
  • Ridelynn Member Epic Posts: 7,383
    It's the GTX 460 all over again.
  • [Deleted User] Posts: 12,263
    The user and all related content have been deleted.

    The turtle doesn't move when it sticks its neck out.
  • Quizzical Member Legendary Posts: 25,348
    edited August 2016
    For the benefit of those who don't know what Ridelynn is talking about, in 2010, Nvidia released the GeForce GTX 460.  At launch, there were two versions of it:  one with 1 GB of memory and a 256-bit memory bus, and one with 768 MB of memory and a 192-bit memory bus.  People said, get the 1 GB version as it's better, because it has more memory bandwidth and more ROPs.

    So Nvidia later quietly launched a 1 GB version of the GeForce GTX 460 with a 192-bit memory bus.  It had mismatched memory channels, so you could only get the full use of 768 MB, and it also had the ROPs disabled.  But the Internet comments about the difference between the cards focused on memory capacity, not bus width, so people looking for the right GTX 460 to buy could easily buy the newly available wrong one.

    Proof of the claim is here:

    http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-460/specifications

    The "v2" version is the one that came later.

    On that basis, I'm inclined to say, assume that any GTX 1060 you find is the cut down version, even if it has 6 GB, unless you can find that it explicitly says 1280 shaders.  And expect the cut-down version to typically be slower than a Radeon RX 480, which makes the card completely irrelevant unless it's substantially cheaper than an RX 480.
  • filmoret Member Epic Posts: 4,906
    So the question is: is NVIDIA resorting to parlor tricks so they don't lose completely to AMD?
    Are you onto something or just on something?
  • Quizzical Member Legendary Posts: 25,348
    filmoret said:
    So the question is: is NVIDIA resorting to parlor tricks so they don't lose completely to AMD?
    Nah, Nvidia is pulling stunts like this because they always pull stunts like this.  They were behind on gaming efficiency in the day of their GTX 460 and GTX 550 Ti stunts, but being ahead at the time didn't stop them from the GTX 970 shenanigans.
  • Forgrimm Member Epic Posts: 3,059
    Quizzical said:
    On that basis, I'm inclined to say, assume that any GTX 1060 you find is the cut down version, even if it has 6 GB, unless you can find that it explicitly says 1280 shaders.  And expect the cut-down version to typically be slower than a Radeon RX 480, which makes the card completely irrelevant unless it's substantially cheaper than an RX 480.
    Every version of the 6GB 1060 that I've seen so far has 1280 CUDA cores.
  • Quizzical Member Legendary Posts: 25,348
    Forgrimm said:
    Quizzical said:
    On that basis, I'm inclined to say, assume that any GTX 1060 you find is the cut down version, even if it has 6 GB, unless you can find that it explicitly says 1280 shaders.  And expect the cut-down version to typically be slower than a Radeon RX 480, which makes the card completely irrelevant unless it's substantially cheaper than an RX 480.
    Every version of the 6GB 1060 that I've seen so far has 1280 CUDA cores.
    Today, yes.  But the basic strategy is that you put out the good parts first to much fanfare, then the cut down stuff shows up at retail quietly later.  Then people mistakenly buy the cut down parts thinking they're getting the higher end parts that they saw in reviews.  Right now, only the top GTX 1060 is available, but give it several months and there will probably be cut down 6 GB versions showing up at retail.

    I'm basically saying, don't buy a GTX 1060 thinking it's going to perform like in the reviews unless you check that particular card that you're about to buy and it very explicitly says 1280 shaders.  Shader count is what you're looking for, not just memory total.
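    The box or product page is the thing to check before you buy, but if you want to verify a card already in your machine, here's a minimal sketch; it assumes PyCUDA is installed, and the 128-cores-per-SM multiplier is the Pascal consumer layout:

    ```python
    # Read the SM count off the installed card (assumes PyCUDA is installed).
    import pycuda.driver as cuda

    cuda.init()
    dev = cuda.Device(0)
    sm_count = dev.get_attribute(cuda.device_attribute.MULTIPROCESSOR_COUNT)

    # Pascal consumer chips pack 128 CUDA cores per SM, so a full GTX 1060
    # reports 10 SMs (1280 shaders) and the cut-down one reports 9 (1152).
    print(f"{dev.name()}: {sm_count} SMs, {sm_count * 128} CUDA cores")
    ```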

    It's not that the cut down GTX 1060 is a bad card, any more than the RX 470 is a bad card.  I'd be totally fine with Nvidia launching the cut down GTX 1060 and calling it a GTX 1050 or GTX 1055 or some such instead.  But two different cards with the same name is bad.
  • Ridelynn Member Epic Posts: 7,383
    edited August 2016
    There were way too many different versions of the GTX 460.

    When it first came out, it was an awesome card. It was one of the few times when a pair of lower-range cards beat the top-tier card in performance, and it was significantly cheaper to boot (this was back when I still had hope that SLI would go somewhere). It hit a nice price point (about $225 iirc), it was pretty much the only first-gen Fermi chip that didn't suck wholesale in terms of noise/heat/power, and it was pretty peppy for its day. I used to recommend it in a lot of mid-budget builds (at least until AMD released Northern Islands / Barts).

    The first two models were announced roughly together: the 1G version and its little brother, a 768M version. It wasn't exactly hidden, but not widely made known either, that the 768M version wasn't exactly the same chip - it was a slightly cut-down bin. But no one really cared; everyone who "knew" went out and got the 1G version anyway. You had to look really closely though, as the VRAM wasn't exactly printed very prominently on the front of the box in most cases.

    Then about 6 months later, they released an overclocked 1G edition (dubbed the GTX 460 v2). It had much better clocks than the previous 1G edition, but it got there by using the lower-binned chips from the 768M model. And shortly after that is when the fourth version hit: an even further cut-down 460 with 1G of RAM. It was called the "GTX 460 SE" (fondly referred to as the "Slow Edition"), so technically it did have a different name. But if you weren't paying attention, that SE was written in very small font on the side of the box, while GTX 460 was plastered all over the place with the CGI dragon and well-endowed Lara Croft on the box top, right when they were touting their new superclocked v2 460s.

    And to top it off, if you got a 460 from an OEM, it had different clocks and a different chip yet again - it was the lower-binned 768M chip, but with 1G of RAM - pretty much the same hardware as the v2, but not the same thing you'd get if you bought a retail 460 1G, and not nearly the same clocks as the v2 had.

    So nVidia had 5 different SKUs all named GTX 460. Four of them had 1G of RAM. Only one of them was the one you really wanted.

    I don't really blame nVidia for it - not a lot of consumers really understand what a GPU is in the first place, and the more confusion you can keep in the market, the more money you can make on it. But that doesn't mean I have to respect them for it. And it's not like AMD is blameless here either; they have recently been infamous for rebadging, selling the same chip across many different generations under a different name - not the same problem, but one with similar consequences.

    And so the 1060 saga starts: they release what appears to be a decent midrange card, and then a little while later comes the "not quite" edition. We'll see whether nVidia continues down this same slippery slope, or whether this is it for the naming shenanigans (at least for this model).