  • Public service announcement: LOL does not stand for "disagree".

    Clicking LOL on someone else's post increases his score just like Agree, Awesome, or Insightful.
  • The endless "illogical" arguments...

    Kopogero said:
    Others are welcome to add their examples.
    Since you asked, I've got a good example:

    "I don't pay money for new games anymore.  This is in protest of how they don't live up to my standards.  Refusing to support anything will get the attention of developers who will throw enormous amounts of money at making games that cater to exactly what I want.  And in particular, they won't dismiss me as not part of the gaming market and ignore me."
  • Welcome to the era of pre-ordered video cards

    Last March, with the launch of the Titan X, Nvidia did something really insidious.  I didn't catch it at the time, but yesterday's GTX 1080 announcement clarified it for me.  When they launched the Titan X, it wasn't just that it was a paper launch.  Nvidia let you buy the card off their web site immediately, with a promise to send it to you when it was available.  That would end up being several weeks later.

    At the time, I thought they were just trying to get out in front of an imminent Fiji launch.  I was mistaken on that, as the Fury X wouldn't launch until June, and wouldn't have wide availability until about September.  Or at least if I was right, then Nvidia was also mistaken on the Fury X launch timing, which probably isn't the case.

    And then came yesterday.  Micron claims to be the first vendor to ramp up production of GDDR5X memory, and announced that they had started sampling it on March 29, 2016.  They hope to enter mass production of GDDR5X this summer.  Hynix and Samsung don't dispute Micron's claim to be first on GDDR5X, but would parry instead that it doesn't matter because HBM2 is better and they're ahead of Micron on HBM2 production.

    Last night, Nvidia announced that the GeForce GTX 1080 will launch on May 27, 2016 and use GDDR5X memory.  Putting the year on both of those dates is necessary because otherwise, one would reasonably assume that they must be different years.  If you really believe that completed video cards with GDDR5X can be widely available less than two months after the first GDDR5X chips started sampling, then you know nothing about where hardware comes from.  It's not going to happen.  Even a six month gap is unlikely.

    So if the May 27 date isn't when you can get a card, then what is it?  Probably a date that you can pre-order a card.  Pay money now and get the card eventually.  The problem is that "eventually" isn't going to be a few days or even a few weeks.  It likely won't even be a few months.  It's going to be long enough that, by the time you get the card, the market will be radically different from when you placed the order.

    From the specs, the GTX 1070 is likely to offer performance in the ballpark of a GTX 980 Ti or a Fury X.  Those are roughly $600 cards, so if you can get that performance with less power consumption in a $449 "Founders Edition" card (yes, Nvidia really called it that), what's not to like?  So you shell out your money.

    Then AMD launches Polaris 10, and gives that same performance for $300.  Maybe it's a soft launch, but it will probably have widespread availability before the GTX 1070.  Now you just paid $449 to eventually get a card that you could have had for $300 immediately if only you had waited.  Still such a good deal?

    It's plausible that the GTX 1070 might have wide availability much sooner than the GTX 1080, as it's not waiting on GDDR5X memory.  But the GTX 1080 is a long way off still.  Want to pay $699 later this month to get a card around the end of this year that would be the fastest on the market if you had it today?  By the time you have your card, Vega might well be close--and it's likely that Vega will blow away GTX 1080 performance, and also likely that it will cost under $699.

    Is that pre-order that pays a premium for the fastest card on the market still such a good deal if it gets beaten badly a couple months after you get the card?  With Vega and GP100 on the way, it's not likely that a GTX 1080 will cost $699 for all that long after the cards are available.  In fact, it's not hard to see the price dropping below $400 within months.  Or $300.

    The price it takes to get a given level of performance tends to drop as time passes.  If Nvidia can get you to make decisions based on today's prices for a card that will be delivered several months later and be worth much less then, then they can get you to massively overpay.  That's the goal here, and so I hope it flops miserably.

    Just like games that try to get you to pre-purchase or pay for alpha access or Kickstarter or whatever.  Oh wait.  Some gamers apparently love paying now for the hopes of getting something well in the future.  Now I see why Nvidia is going that route, and to be fair, people who pre-order a GTX 1080 will almost certainly get one eventually, which is more than one could say for backing a Kickstarter.

    So my advice is simple.  Buy a card that is in stock when you buy it.  Don't treat "you'll get it eventually" as being the same as "we'll ship it to you today".  Otherwise, AMD will be forced to follow suit and the pre-orders will start coming earlier and earlier, before Nvidia or AMD has any real idea of what the final specs will even be.  Want to choose today which vendor you'll buy a card from in 2018, without waiting to see what the cards are like?  I sure don't.
  • This is the greatest era of PC gaming

    I agree.  1975-2025 has been the greatest era of PC gaming by far.  It's so much better than 1700-1750.
  • Blizzard DDoSers caught by the FEDs

    The punishment should fit the crime.  They should only be allowed to use dialup to get online for the next decade.
  • Have you ever seen this level of tyranny in your gaming life?

    Wait, wait, you're complaining about being banned from an illegal private server?  Shouldn't the fact that you were playing on an illegal private server be enough to justify the ban?
  • In light of huge hype for BDO and shocking truth from players that actually tried it now.

    So you see why I tend to ignore pre-release hype and get around to looking into games some months after launch.  Or years.
  • FCC killed net neutrality. What does it mean for gamers?

    Considering that this only reverts to the rules as they were in early 2015, freaking out only makes sense if you thought the Internet was some dystopian wasteland in 2014 and has gotten massively better since then.

    Either the sky will fall or else it won't.  Most likely, returning to the light-touch regulatory regime that facilitated the rise of the Internet over the course of nearly 20 years preceding the FCC's arbitrary switch to Title II regulations in 2015 will similarly help facilitate future Internet improvements that we don't foresee today.

    But it's also possible that ISPs will commonly roll out abusive and predatory business practices and block legitimate sites that they don't like or some such.  If that happens, then the view that heavier regulation of the Internet is necessary will become prevalent all across the political spectrum, rather than the Internet being just another domain in which the left wants more regulation and the right wants less.  In that case, heavier regulations will come, hopefully in the form of Congress passing a bill properly authorizing heavier regulations.

    And don't think that Congress is incapable of acting when there's overwhelming public support for an issue.  It's hard to pass laws when half of the public is in favor and half against, and that's by design, but it's much easier to pass laws when there is broad popular support and few people opposed.  For example, consider the CAN-SPAM Act of 2003, which passed the Senate unanimously and the House by a vote of 392-5.
  • About That Vocal "Minority" Against P2W

    Because nothing measures the true preferences of the mostly indifferent masses like an Internet poll.  Internet polls measure passion, not breadth of support.  Remember all of those Internet polls that gave huge majorities to Ron Paul?

    I don't think there's any real doubt that most of the people with really strong opinions about making item mall goods tradeable are against it.  But that's no way to measure majority opinion.
  • Don't buy the recent reference cards

    There are really several intersecting thoughts here.  I don't think I can structure this post without burying the lede somehow, so let's get the main thoughts up front.

    1.  Whatever happened to PowerTune?
    2.  Why does the reference RX 480 only have a single 6-pin PCI-E connector?
    3.  What if a driver update reduces clock speeds?
    4.  It's a good thing that third-party board partners are around.

    Returning to the title, I'd include the GeForce GTX 1080, GeForce GTX 1070, and Radeon RX 480 in that.  For the GeForce cards, it's really just a question of price.  Do you really want to pay $700 (or $800 or $900, depending on how much you're gouged) for a product that you know will soon be $600?

    But with the RX 480, it's something much worse.  At least Nvidia was up front about pricing, if not timing.  And the delays were incandescently obvious to those who understood the tech.  But I've complained enough about Pascal, so I want to spend most of this post going after Polaris.

    Some reviews noticed the Radeon RX 480 pulling 160 W.  Now, there's nothing wrong with a desktop video card burning 160 W.  But there's something very wrong with a PCI Express card with only a single 6-pin PCI-E power connector burning 160 W.

    The PCI Express slot is rated as being able to deliver 75 W, a 6-pin connector also 75 W, and an 8-pin connector 150 W.  If all you've got is the slot and a single 6-pin, that's 75+75 = 150 W.  Pulling 160 W through that is running something out of spec.
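    The arithmetic above can be sketched in a few lines.  This is an illustrative helper, not anything from AMD or Nvidia tooling; the constants are the rated limits from the PCI Express spec as described here.

```python
# Rated power delivery limits (watts) per the PCI Express spec.
PCIE_SLOT_W = 75   # x16 slot
PIN6_W = 75        # 6-pin auxiliary connector
PIN8_W = 150       # 8-pin auxiliary connector

def rated_power_budget(six_pin=0, eight_pin=0):
    """Total rated delivery for a card: slot plus auxiliary connectors."""
    return PCIE_SLOT_W + six_pin * PIN6_W + eight_pin * PIN8_W

# Reference RX 480: slot + one 6-pin = 150 W rated, but it draws ~160 W.
rx480_budget = rated_power_budget(six_pin=1)
print(rx480_budget)        # 150
print(160 > rx480_budget)  # True -> out of spec
```

    The same function also covers the GTX 590 case mentioned later: two 8-pin connectors plus the slot gives a 375 W rated budget, against a real draw that could blow past 400 W.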

    Now, the Radeon RX 480 isn't the first card to do this.  The GeForce GTX 470 had only two 6-pin PCI-E power connectors, and it routinely pulled more than 225 W.

    No, I didn't just say that Polaris is as bad as Fermi.  But I suspect that the reasons are the same:  someone decided late in the game that the stock clock speed needed to increase.  Rather than going back to the drawing board to beef up power delivery and make a card that could handle it, they just took the cards they had and clocked them higher.

    And the problem is completely fixable simply by adding more power delivery circuitry.  Give the RX 480 a second 6-pin connector and suitable corresponding VRMs and such on the board and you're set.  It's not at all like the GeForce GTX 480 burning 300 W inside a radiator-like cooler that dared you to try frying an egg on it.

    Back in the bad old days, video cards had fixed clock speeds that didn't adjust well for the particular workload, beyond clocking down at idle.  Power viruses (e.g., FurMark, OCCT, or the StarCraft 2 title screen) that pushed a card harder than the company expected could fry things.  But if you throttle clock speeds way back to handle the power viruses, you give up a bunch of gaming performance and people don't buy your cards.

    Fortunately, AMD solved this in 2010 with PowerTune.  It tracks power consumption in real time and throttles back clock speed by just enough to stay inside the desired power envelope.  You don't get performance that obviously tanks as with the severe throttling from overheating.  But you also don't need to know ahead of time everything that can push a card too hard.  AMD demonstrably had it working way back in 2010 on the Radeon HD 6970.

    If you set the PowerTune cap to 150 W, it shouldn't be possible for the card to pull 160 W for thermally significant periods of time.  Did AMD drop PowerTune entirely?  Is it malfunctioning?  Did they increase the PowerTune cap to 160 or 170 W to try to score better reviews?  Isn't it remarkable how these accidents tend to increase performance?
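    The control loop PowerTune implements is simple in outline.  The sketch below is purely illustrative: the real mechanism lives in hardware/firmware, and the clock numbers and step size here are made up.

```python
# Illustrative PowerTune-style sketch -- the real implementation is in
# hardware/firmware; interface, step size, and clock limits are invented.
def powertune_step(clock_mhz, measured_watts, cap_watts=150,
                   step_mhz=10, min_clock_mhz=300, max_clock_mhz=1266):
    """One control iteration: nudge the clock down when over the power
    cap, and let it recover toward the boost clock when under it."""
    if measured_watts > cap_watts:
        return max(clock_mhz - step_mhz, min_clock_mhz)  # throttle just enough
    return min(clock_mhz + step_mhz, max_clock_mhz)      # reclaim headroom
```

    Run every few milliseconds, a loop like this keeps the card pinned near its power cap instead of frying on a power virus or tanking under a fixed conservative clock.  That's exactly why a card shipping with a 150 W cap shouldn't be able to sit at 160 W.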

    And now reports are coming in that the out-of-spec power draw from the RX 480 is damaging motherboards.  It's probably only a tiny handful, and probably further restricted to cheap junk motherboards, likely backed by a mediocre or worse power supply, and possibly egged on by power weirdness coming from the wall.  High quality components can handle running a little out of spec.

    But you shouldn't rely on that.  Even if you're going to overclock, you shouldn't run anything other than the component you're overclocking out of spec.  If you want to overclock a CPU to the moon and have it burn 200 W, you should get a motherboard, power supply, case, and cooler that can handle a CPU putting out 200 W so that everything but the CPU itself is running in spec.

    Running things out of spec unnecessarily is bad.  Doing it intentionally on commercial hardware without telling anyone is worse.

    Remember the GeForce GTX 590?  It was a "365 W" card (already outside of the PCI Express specification, but not really bad in a desktop built around it) with two 8-pin PCI-E power connectors.  That means that the rated power delivery was 375 W.  That's quite a lot, but it didn't help that the 365 W TDP was a total lie and the card could easily blow well past 400 W.  Some of them didn't survive the review process.

    No, I didn't just say that Polaris is as bad as Fermi.  But it isn't a good sign that that's the comparison I have to reach for.

    Now, handling 400 W in a two-slot cooler is just plain hard.  AMD finally got it right with the Radeon R9 295 X2 that liquid-cooled them both.  But there's no excuse for not being able to handle 160 W.

    So this is fixable with cards from board partners.  And we should be thankful that AMD and Nvidia let partners such as MSI, Asus, Sapphire, and EVGA design and build cards.  AMD and Nvidia don't always seem competent at it, and the reference cards mentioned above are far from the only clunkers in their histories.  Remember the GeForce FX 5800 "dustbuster"?

    But you know how else the running out of spec problem is fixable?  Throttling back clock speeds more aggressively so that the card doesn't burn more than 150 W.  That can be done with a driver update, and don't be surprised if AMD does exactly that.

    The problem with cutting back clock speeds is that you lose performance.  To stay inside of 150 W, maybe you lose 3% of your performance in this game and 5% in that one.  And people notice lower numbers on bar graphs.  If the performance losses don't come until after reviews are safely up and no one bothers to update them later, then they don't count, right?  After all, the only people who suffer from that are your customers.  They probably won't notice if they lose 3% of their performance, but they'll sure notice hardware failures.

    So companies play various shenanigans to try to win reviews.  Clock higher when you detect a canned benchmark running, and lower when you detect a power usage benchmark running.  Both Nvidia and Intel have on various occasions said "look how fast it is" and "look how low power it is" for a part, trying to imply you could have both at once even though it wasn't even close to true.  Make short-lived, small volume parts like the Radeon X800 XT PE.  And launch just such a part as the claimed competitor to a competitor's real, volume part.  Remember the EVGA GeForce GTX 460 FTW?

    No, I didn't just say that Polaris is as bad as Fermi.  But this is the third time I've had to assert that, and in comparison to four different cards from that architecture.  The problems with the Radeon RX 480 are fixable by beefing up the power delivery, even without changing the clock speed.  Third-party cards will do exactly that, and if history is any guide, probably at MSRP.  Even if it adds $5 to the bill of materials, that shouldn't add $50 to the retail price tag.  So I say, if you want a Radeon RX 480, you should wait for that.  It probably won't be long.