Quizzical

About

Username: Quizzical
Joined:
Visits: 7,037
Last Active:
Roles: Member
Points: 3,945
Rank: Epic
Favorite Role: Tank
Posts: 18,325
Badges: 51
  • OH God ... Can one of these upcoming titles go into beta already?

    I'm sure that some aspiring developer will see this thread and say, you know what, let's push the game into beta early so that someone who made an all caps thread title can play it.  Because people who make all caps thread titles are the most profitable demographic to target.
  • Miners make it harder to find budget video cards

    Bitcoin miners haven't bought GPUs to mine since 2011.  Back then, the Radeon HD 6970 and 6950 were the GPUs of choice, and they ignored Nvidia entirely because Nvidia was terrible at integer and logical operations used for bitcoin mining.
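    To make "integer and logical operations" concrete: bitcoin's proof of work is two rounds of SHA-256 over a block header, and SHA-256's inner loop is nothing but 32-bit adds, rotates, and XORs.  A minimal Python sketch of the nonce search (toy header and difficulty, not the real block format):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin hashes block headers with SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, difficulty_bits: int) -> int:
    """Try nonces until the double-SHA-256 digest has at least
    difficulty_bits leading zero bits.  The work is dominated by the
    32-bit integer adds, rotates, and XORs inside SHA-256 -- exactly
    the operations early AMD GPUs were fast at and pre-Maxwell Nvidia
    GPUs were not."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = double_sha256(header + nonce.to_bytes(8, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

nonce = mine(b"toy block header", 16)
```

    Real miners run billions of these hashes per second, which is why fixed-function ASICs crushed GPUs at this particular workload.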

    Then bitcoin mining moved to FPGAs, and later to ASICs.  Once you've got ASICs for bitcoin mining, no one cares about GPUs anymore, as they can't compete with ASICs.  The Cayman-based GPUs being hard to find for gaming only lasted a few months.

    The next round of this came in 2014, when litecoin miners bought Radeon R9 290 and 290X cards.  They again ignored Nvidia, as Nvidia GPUs were still terrible at integer and logical operations of the sort used in cryptocurrency hashing functions.

    Then other cryptocurrencies also moved off of GPUs and onto FPGAs and ASICs, and buying GPUs for mining again became inefficient.  The miners buying AMD's Hawaii GPUs likewise only lasted a few months.  In addition, AMD had ramped up production of those GPUs, leaving them with a glut of chips intended for miners and no one buying them, so gamers got good deals for quite a while.

    The launch of the GeForce GTX 980 in September 2014 was important here for two reasons.  First, Maxwell was the first Nvidia GPU architecture to be any good at the integer and logical operations used in cryptocurrency mining.  Second, it was the first Nvidia architecture to be notably good at compute more generally, even apart from cryptocurrency mining: it fixed a number of things that were blatantly broken in Kepler, and was the first GPU architecture from any vendor not to be finicky about weird things for no good reason.

    Today, it's the ethereum miners who are bothering you.  Ethereum's designers decided that bitcoin being mined by ASICs, such that some guy in China controls half of the world's bitcoin mining capacity, is a bad thing, and set out to build a cryptocurrency hashing algorithm that you couldn't build an ASIC for.  Rather than raw computation, Ethereum mining is mostly based on random lookups into a large table of around 3 GB.  You're not fitting 3 GB of cache onto an ASIC.
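    That design can be sketched in a few lines.  This is a toy stand-in for Ethash (the real algorithm uses Keccak, FNV mixing, and a dataset grown to ~3 GB), but it shows the key property: each lookup's address depends on the previous result, so the work is serial random reads into a table too big to cache:

```python
import hashlib

def toy_memory_hard_hash(header: bytes, dataset: list, rounds: int = 64) -> int:
    """Toy memory-hard hash (not real Ethash): each round reads a
    pseudo-random table slot chosen by the previous round's result,
    so throughput is bounded by memory latency and bandwidth, not ALUs."""
    acc = int.from_bytes(hashlib.sha256(header).digest(), "big")
    for _ in range(rounds):
        index = acc % len(dataset)  # serial, data-dependent lookup
        acc = ((acc * 0x100000001B3) ^ dataset[index]) & ((1 << 256) - 1)
    return acc

# Real Ethash used a ~3 GB dataset; a tiny table stands in for it here.
dataset = [int.from_bytes(hashlib.sha256(i.to_bytes(4, "big")).digest()[:8], "big")
           for i in range(4096)]
digest = toy_memory_hard_hash(b"candidate block header", dataset)
```

    An ASIC can pack enormous amounts of hashing logic onto a die, but it can't make a 3 GB table lookup any faster than the memory feeding it, which is the whole point.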

    Nvidia's GDDR5X controller chokes on random lookups into global memory, so it's not very good at ethereum mining.  It doesn't take 3 GB to make this happen; even at 16 MB, a GeForce GTX 1080 is substantially slower than a Radeon RX 480.  Nvidia's GPUs with GDDR5 or HBM2 memory controllers handle it fine, but AMD tends to go heavier on memory bandwidth than Nvidia, making AMD GPUs mostly superior here.  The GP100 chip of a Tesla P100 or Quadro GP100 is markedly better at ethereum mining than anything AMD has, but those also cost $6000+, so miners ignore them for cost reasons.

    Because ethereum can't be mined more efficiently on FPGAs or ASICs, the ethereum mining disruption has lasted much longer than the previous blips.  The days when it was hard to find a Radeon RX 580 even at double its MSRP are behind us, but AMD's Polaris 10 GPUs got bid up so high by ethereum miners that it was worthwhile to buy a GeForce GTX 1060 instead, even though it wasn't as good at mining, at least if you got a 6 GB version.

    It's not over entirely, but prices are a lot closer to MSRP than they were six months ago.  It's decently likely that eventually people will figure out that mining cryptocurrencies is a bad idea and they'll all be worthless or nearly so, but that could take quite a while.

    It's also notable that the previous mining spikes targeted high end GPUs, while ethereum went after the mid-range.  That was because Nvidia's high end consumer GPUs had a GDDR5X controller that was bad at ethereum mining, while AMD basically didn't have a high end at the time.  So it's kind of a fluke and probably won't be repeated, but it was a nuisance for people who wanted to buy a $200 GPU.
  • Major security flaw in ALL intel chip......

    Wizardry said:
    What will be interesting is to see if someone, a lawyer, will prove they knew of these flaws for years and so were knowingly abusing customers and their security.

    This could lead to massive lawsuits, especially from any firm that can prove being hacked and suffering financial losses.

    My gut feeling is that it has always been planned that way as part of Microsoft spying and embedding and just all-around corrupt ways of doing business.  Anyone that can straight up say they trust Microsoft is super naive.

    Definitely more news to come on this by end of month, and what happens to all the systems that are basically downgraded because of the hit we will take after the patch?  They sell their hardware based on numbers and various marketing schemes; well, if all of that is smashed, then again it could create some lawful refunds or lawsuits over misleading information, but again it has to be proved that they knew.

    This leads to another problem and how the law screws us over; I bet the most important "in the know" employees inside of Microsoft that could leak out information are under strict contracts/oaths to not say anything or land in jail.


    If you want to go crazy conspiracy theorist on us, I'd like to request that you at least limit yourself to something coherent and plausible.  Among other issues:

    1)  Why would Microsoft want to create a hardware security flaw in Intel processors?
    2)  How would Microsoft create a security flaw in Intel processors even if they wanted to?
    3)  How would Microsoft manage to make all OSes affected by the flaw, including Linux and OS X?

    It's also extremely implausible that Intel has known of the flaw for years.  If they knew of it several years ago, they'd have fixed it in newer generations of CPUs.
  • Apple now requires games with paid loot boxes to publish the odds

    A lot of gamers hate loot boxes.  People have called for various regulations of them, up to and including banning them entirely.  But one of the simplest regulations is to merely require the games that have them to disclose the odds of getting the drop you want.  Apple has apparently done exactly that.

    http://toucharcade.com/2017/12/21/apple-quietly-updated-the-app-store-review-guidelines-to-require-disclosure-of-loot-box-iap-odds/

    Here's the document in question from Apple's web site:

    https://developer.apple.com/app-store/review/guidelines/#in-app-purchase

    And the critical quote:

    "Apps offering “loot boxes” or other mechanisms that provide randomized virtual items for purchase must disclose the odds of receiving each type of item to customers prior to purchase."

    Apple can impose that unilaterally, as they have the power to kick games off of their platform entirely.  Game console manufacturers could do the same thing if so inclined.  I'd like to see that mandated more widely.  While I'm often skeptical of the benefits of additional regulations, merely requiring companies to publish the odds of winning the thing they're asking you to pay for a chance at, without restricting what they can sell, is a pretty weak requirement that isn't likely to impose much in the way of compliance costs.

    I'm not saying that nothing beyond mandating disclosure should be done.  But I am saying that the law should at least mandate disclosure of the odds of whatever you're buying a chance at winning.
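    Disclosed odds also let buyers do the arithmetic themselves.  If an item drops with probability p per box, the number of boxes to the first copy follows a geometric distribution, so the expected count is 1/p.  A sketch with hypothetical numbers (a 1% drop rate at $3 per box, not figures from any real game):

```python
def expected_boxes(p: float) -> float:
    """Expected boxes opened before the first copy of an item that
    drops with probability p per box (geometric distribution)."""
    return 1.0 / p

def chance_within(p: float, n: int) -> float:
    """Probability of getting at least one copy within n boxes."""
    return 1.0 - (1.0 - p) ** n

p, price = 0.01, 3.00  # hypothetical disclosed odds and box price
print(f"expected cost: ${expected_boxes(p) * price:.2f}")   # expected cost: $300.00
print(f"chance in 100 boxes: {chance_within(p, 100):.1%}")  # chance in 100 boxes: 63.4%
```

    Note that spending the "expected" $300 still leaves you with only about a 63% chance of the item; that gap between expected cost and guaranteed cost is exactly what undisclosed odds hide.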
  • FCC killed net neutrality. What does it mean for gamers?

    k61977 said:
    There are really only a handful of providers for most of the country already.  Comcast and Time Warner pretty much have 2/3 of the country on lockdown.  They bought up most of the competition, let them keep their names, but all the money goes to them.  Just try going into an area where either of those is the major provider for landline internet and getting someone else if you don't live in a major city.  So yeah, there is already a huge monopoly in this country for this service.
    I'd submit that one of the most important considerations in any proposed regulations is what will it do to competition.  If you've got multiple good ISP options where you live, then your ability to switch to a competitor will do more to push them to offer you better service than any regulations ever could.

    Let's also not forget that lighter regulation doesn't mean no regulation.  Ajit Pai has said that the main thrust will be requiring ISPs to inform customers of what they're doing.  It will remain very illegal for an ISP to throttle or block sites while claiming that they aren't.  On another net neutrality thread, someone linked to a list of bad things that various ISPs did, and were fined or otherwise sanctioned for and forced to stop, before the Title II regulations were implemented in the first place.