
AMD Vega GPUs will launch "over the next couple of months".


Comments

  • OzmodanOzmodan Member EpicPosts: 9,726
    Unless you have fiber, any 4K streaming from any service is a crapshoot. Sure, you can do it on a weekday in non-prime time, but evenings or weekends, forget it. I have tried four different ISPs and all fail during those periods; at times it degrades to 720p or less.

    So streaming beyond 1080p is highly subject to ISP congestion, and they all have it. So I will just laugh at anyone talking about streaming 4K.
  • laseritlaserit Member LegendaryPosts: 7,591
    Ozmodan said:
    Unless you have fiber, any 4K streaming from any service is a crapshoot. Sure, you can do it on a weekday in non-prime time, but evenings or weekends, forget it. I have tried four different ISPs and all fail during those periods; at times it degrades to 720p or less.

    So streaming beyond 1080p is highly subject to ISP congestion, and they all have it. So I will just laugh at anyone talking about streaming 4K.
    I have no real problems at my location so far. I have had a couple of minor hiccups on Netflix, but I can't really conclude that it was the provider.

    "Be water my friend" - Bruce Lee

  • time007time007 Member UncommonPosts: 1,062
    Quizzical said:
    Late to market... it doesn't matter whether this was planned or not. The GTX 1070 and 1080 presented a great reason to upgrade, and they will have been out for more than 13 months when Vega arrives.
    For the GTX 1080 Ti it's 4 months. Will Vega offer enough over the 1070, 1080 and 1080 Ti that owners would upgrade? Or will they wait for Volta? I mean, those GP104 and GP102 cards will offer decent performance even after Vega has launched. Too little, too late; most people have already moved on to Nvidia. AMD should give up on the higher end of the market; it's pretty obvious that they're shit at it.
    Why would someone who already has a GeForce GTX 1080 Ti care about either Vega or Volta?  One would hope that the useful life of the GTX 1080 Ti will extend past the launch of Navi and whatever comes after Volta.
    So is everyone done buying a new PC or upgrading? It is a continuous process. We will have to see whether Vega is anything interesting for people who are looking for a new PC after it has been released. Hopefully it will start some competition between NVIDIA and AMD in the high end.
    The RX cards are selling fast, and when they were released you heard similar comments: "too little, too late". But now it turns out they sell very well. Finally, much-needed competition in the lower segment.

    I also notice that many here never take pricing into account as a reason to buy a card. It is possible that AMD will release cards priced between the 1070 and 1080, aiming at buyers for whom the 1070/1080 cards were just out of budget.
    Yeah man, I agree completely. I'm waiting for the dust to settle a bit more to see what kind of price cuts come up around Christmas. As long as these cards keep rolling out among competitors, this could result in more bang for your buck, which is good for the consumer.

    IMPORTANT:  Please keep all replies to my posts about GAMING.  Please no negative or backhanded comments directed at me personally.  If you are going to post a reply that includes how you feel about me, please don't bother replying & just ignore my post instead.  I'm on this forum to talk about GAMING.  Thank you.
  • VrikaVrika Member LegendaryPosts: 7,888
    edited July 2017
    Ozmodan said:
    Unless you have fiber, any 4K streaming from any service is a crapshoot. Sure, you can do it on a weekday in non-prime time, but evenings or weekends, forget it. I have tried four different ISPs and all fail during those periods; at times it degrades to 720p or less.

    So streaming beyond 1080p is highly subject to ISP congestion, and they all have it. So I will just laugh at anyone talking about streaming 4K.
    There are countries in Europe and Asia where a connection capable of 4K streaming is available to the majority of the population.
     
  • CleffyCleffy Member RarePosts: 6,412
    4K streaming is not that bad. It's basically like streaming four 1080p videos at once. I haven't had an issue streaming 4K personally.
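    To put rough numbers on that (a back-of-the-envelope sketch of my own, not measurements from anyone in this thread; the 5 and 25 Mbit/s figures are the commonly cited Netflix recommendations for 1080p and UHD, treat them as ballpark assumptions):

    ```python
    # Why "4K is basically four 1080p streams" holds for pixels, and roughly
    # what it means for sustained bandwidth. Illustrative figures only.

    def pixels(width, height):
        return width * height

    p1080 = pixels(1920, 1080)   # 2,073,600 pixels per frame
    p2160 = pixels(3840, 2160)   # 8,294,400 pixels per frame

    print(f"4K / 1080p pixel ratio: {p2160 / p1080:.0f}x")  # exactly 4x

    # Bandwidth doesn't scale with raw pixel count because 4K streams typically
    # use the more efficient HEVC codec, but the headline numbers still tell the story:
    hd_mbps, uhd_mbps = 5, 25
    print(f"Sustained bandwidth needed: ~{hd_mbps} Mbit/s (1080p) vs ~{uhd_mbps} Mbit/s (4K)")
    print(f"That is {uhd_mbps / hd_mbps:.0f}x the sustained throughput, every second of the show")
    ```

    Which is exactly why congested evening hours hit 4K streams first: it is not the average speed that matters, it is whether the line can hold that rate without dipping.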
  • RidelynnRidelynn Member EpicPosts: 7,383
    edited July 2017
    What does a video card have to do with 4K streaming?

    My TV can stream 4K - it probably has some crappy ARM CPU with bottom-of-the-barrel Imagination graphics cores. My ISP definitely cannot handle 4K, so I've never tried it, but I do know that it's a Netflix-certified, approved 4K streaming device.

    Are we saying that the GPU can accelerate streaming in some way, or that you need a certain amount of VRAM to perform 4k streaming?

    I assume this is coming from this @AmazingAvery comment:

    Streaming assist - NVIDIA & Netflix Now Previewing 4K Streaming - 4K Netflix support should fully work on all Pascal GPUs with 3GB or more of VRAM (think short term future from now and what the needs will be)


  • HrimnirHrimnir Member RarePosts: 2,415
    Ridelynn said:
    What does a video card have to do with 4K streaming?

    My TV can stream 4K - it probably has some crappy ARM CPU with bottom-of-the-barrel Imagination graphics cores. My ISP definitely cannot handle 4K, so I've never tried it, but I do know that it's a Netflix-certified, approved 4K streaming device.

    Are we saying that the GPU can accelerate streaming in some way, or that you need a certain amount of VRAM to perform 4k streaming?

    I assume this is coming from this @AmazingAvery comment:

    Streaming assist - NVIDIA & Netflix Now Previewing 4K Streaming - 4K Netflix support should fully work on all Pascal GPUs with 3GB or more of VRAM (think short term future from now and what the needs will be)


    So basically because of, *gasp*, you guessed it, DRM!!! you can only watch 4K shit with certain GPUs. It has nothing to do with the power of the hardware or any of that; it's purely a bullshit move, from my understanding.

    Anyway, correct me if I'm wrong.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    Older GPUs probably just don't have a big enough video decode block to handle decoding 4K video.  That's done in fixed-function hardware, so adding more shaders and texture units and such doesn't help.
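    To put the "big enough decode block" point in rough numbers (illustrative figures of my own, not vendor specs):

    ```python
    # A video decode block is fixed-function hardware rated for a certain pixel
    # throughput and set of codecs. 4K60 asks for ~4x the pixel rate of the
    # 1080p60 streams older blocks were sized for, and Netflix 4K also uses HEVC,
    # which many older decode blocks don't support at all.

    def pixel_rate(width, height, fps):
        """Pixels that must be decoded per second at a given resolution/framerate."""
        return width * height * fps

    rate_1080p60 = pixel_rate(1920, 1080, 60)
    rate_2160p60 = pixel_rate(3840, 2160, 60)

    print(f"1080p60: {rate_1080p60 / 1e6:.0f} Mpixels/s")
    print(f"2160p60: {rate_2160p60 / 1e6:.0f} Mpixels/s "
          f"({rate_2160p60 / rate_1080p60:.0f}x the fixed-function workload)")
    # Extra shaders and texture units don't help here: if the dedicated decoder
    # tops out below this rate, or doesn't handle the codec, the GPU can't do it.
    ```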
  • RidelynnRidelynn Member EpicPosts: 7,383
    edited July 2017
    Hrimnir has the right of it - nothing to do with hardware, everything to do with DRM. Just about anyone can stream 4K from YouTube if they want to, after all.

    https://www.extremetech.com/computing/239860-4k-netflix-finally-coming-pcs-still-probably-cant-watch
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    https://hardforum.com/threads/amd-rx-vega-freesync-vs-nvidia-gtx-1080-ti-g-sync-coming-soon.1940447/



    [H]ard|OCP "Just a little tease here as to what is coming this week. RX Vega FreeSync vs. GTX 1080 Ti G-Sync Blind Gaming test video being edited now. All done at my house, with gamers with a couple hundred years of twitch gaming experience. All system UEFI and OS set up by me personally."

    So HardOCP gets a little blind testing in.

    I'm not comfortable with a few things here -
    1. "Because I wanted to. AMD wanted me to use a 1080." - I think that says it all really on the performance of Vega.
    2. This whole marketing spin about comparing costs with FreeSync vs G-Sync and visuals - do you know why? It is because AMD's new display driver coming out includes something called "Enhanced Sync", which is basically what Nvidia has had on the market for a few years now. So cue the marketing-spin comparisons. Folks buying the card will already have a monitor and won't specifically be looking to buy a "FreeSync one enabled for Enhanced Sync" because it is $300 less than a comparable Nvidia one. Just compare the cards without the monitors.
    3. Oh dear - a blower fan, this is not good. Vega's power draw is insane; there is no way that cooler can handle the heat it gives off at a reasonable operating temperature. There is a reason why the Fury X came with a liquid cooler, and Vega is even worse in that regard. The Vega FE has the same design as this - but expect clocks to be around 1400 MHz and temps hovering at around 80C with a higher fan profile. I still wouldn't touch a reference design for anything. The custom AIB solutions are what you wait for, or get the water-cooled version instead. Unfortunately, the AIB solutions are for low-end Vega only... Not being able to reach the card's boost clock, hitting the power limit all the time, and power throttling is not an ideal setup. That thing is going to be a heater and probably very noisy too if you want boost clocks. The "R" stands for rocket..




  • RidelynnRidelynn Member EpicPosts: 7,383
    Kyle also talks about how AMD was very specific in installing & setting up the Ryzen driver, and wouldn't let him play around with it.

    It reminds me of... I can't remember if it was the Fury rollout or the Vishera/Bulldozer rollout - where they would put an AMD computer next to an Intel/nVidia computer and try to get people to notice the difference. AMD would be running at like 70-80 fps, the IntelVidia machine at like 100-120 fps, on a monitor locked to 60 Hz.

    They do have a point - if you can't tell the difference, then what does it really matter? But performance does matter: you won't always be playing today's games, at today's resolutions and refresh rates, or with variable refresh technology available. It especially matters when performance on a given API is ~the~ defining characteristic of a GPU.

    I think we can all infer where the performance metric is going to land, based on this from HardOCP and the previous FE testing - everything is looking like another release identical to Fiji. That being said, budget also matters, and nothing has really been said about that. If you can't tell the difference in performance today, and one setup costs $1200, and the other $900 - that's a big difference.
  • QuizzicalQuizzical Member LegendaryPosts: 25,355

    1. "Because I wanted to. AMD wanted me to use a 1080." - I think that says it all really on the performance of Vega.


    That makes it sound like AMD thinks Vega will beat a GTX 1080 but lose to a GTX 1080 Ti.  If they thought they'd beat both or lose to both, it's better to have your card compared to the competition's best.  And if it's relying on "they're both fast enough that you can't tell the difference", then you'd want your card to be compared to the opposition's best.

    If a Radeon RX Vega clearly beats a GTX 1080 but loses to a GTX 1080 Ti, that's a loss for AMD, but not the sort of catastrophe that also losing to a GTX 1080 would be.  Being able to sell your top end card as a good value at $600 is not at all similar to having to price it at $400 to be a reasonable value.
  • QuizzicalQuizzical Member LegendaryPosts: 25,355

    AmazingAvery said:
    2. This whole marketing spin about comparing costs with FreeSync vs G-Sync and visuals - do you know why? It is because AMD's new display driver coming out includes something called "Enhanced Sync", which is basically what Nvidia has had on the market for a few years now. So cue the marketing-spin comparisons. Folks buying the card will already have a monitor and won't specifically be looking to buy a "FreeSync one enabled for Enhanced Sync" because it is $300 less than a comparable Nvidia one. Just compare the cards without the monitors.


    http://anandtech.com/show/11664/amd-radeon-software-crimson-relive-edition-1772/6

    Enhanced sync is more comparable to the adaptive v-sync that Nvidia introduced several years ago.  It's a nice option to have, but inferior in every way to a proper implementation of adaptive sync.  Both AMD's enhanced sync and Nvidia's adaptive v-sync would be disabled if you're using FreeSync or G-sync.
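    A rough sketch of the distinction (pseudocode of my own, not any vendor's actual driver logic; the refresh rates and VRR range are assumed values): what each scheme does when a frame finishes rendering.

    ```python
    # Driver-side "adaptive"/"enhanced" v-sync toggles v-sync based on framerate;
    # FreeSync/G-Sync style variable refresh lets the monitor wait for the GPU.

    REFRESH_HZ = 60
    REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ

    def plain_vsync(frame_time_ms):
        # Always wait for the next refresh tick: smooth above 60 fps, but a late
        # frame waits a whole interval (stutter plus added latency).
        return "wait for next refresh tick"

    def adaptive_style_vsync(frame_time_ms):
        # Keep v-sync only while the GPU outruns the display; otherwise turn it
        # off and accept tearing instead of a big stutter.
        if frame_time_ms <= REFRESH_INTERVAL_MS:
            return "wait for next refresh tick (v-sync on)"
        return "present immediately, tearing allowed (v-sync off)"

    def variable_refresh(frame_time_ms, vrr_min_hz=40, vrr_max_hz=144):
        # Inside the panel's supported range the display simply refreshes when
        # the frame is ready - no tearing and no forced wait.
        hz = 1000 / frame_time_ms
        if vrr_min_hz <= hz <= vrr_max_hz:
            return "display refreshes the moment the frame is ready"
        return "outside VRR range: fall back to fixed-refresh behaviour"

    if __name__ == "__main__":
        for ft in (8.0, 14.0, 22.0):  # fast, borderline, and slow frames
            print(f"{ft:5.1f} ms  v-sync: {plain_vsync(ft)}")
            print(f"          adaptive-style: {adaptive_style_vsync(ft)}")
            print(f"          variable refresh: {variable_refresh(ft)}")
    ```

    That is the sense in which the driver-side modes are a compromise: they only choose between the two bad options of waiting or tearing, while variable refresh removes the conflict.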
  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    AmazingAvery said:
    https://hardforum.com/threads/amd-rx-vega-freesync-vs-nvidia-gtx-1080-ti-g-sync-coming-soon.1940447/



    [H]ard|OCP "Just a little tease here as to what is coming this week. RX Vega FreeSync vs. GTX 1080 Ti G-Sync Blind Gaming test video being edited now. All done at my house, with gamers with a couple hundred years of twitch gaming experience. All system UEFI and OS set up by me personally."

    So HardOCP gets a little blind testing in.

    I'm not comfortable with a few things here -
    1. "Because I wanted to. AMD wanted me to use a 1080." - I think that says it all really on the performance of Vega.
    2. This whole marketing spin about comparing costs with FreeSync vs G-Sync and visuals - do you know why? It is because AMD's new display driver coming out includes something called "Enhanced Sync", which is basically what Nvidia has had on the market for a few years now. So cue the marketing-spin comparisons. Folks buying the card will already have a monitor and won't specifically be looking to buy a "FreeSync one enabled for Enhanced Sync" because it is $300 less than a comparable Nvidia one. Just compare the cards without the monitors.
    3. Oh dear - a blower fan, this is not good. Vega's power draw is insane; there is no way that cooler can handle the heat it gives off at a reasonable operating temperature. There is a reason why the Fury X came with a liquid cooler, and Vega is even worse in that regard. The Vega FE has the same design as this - but expect clocks to be around 1400 MHz and temps hovering at around 80C with a higher fan profile. I still wouldn't touch a reference design for anything. The custom AIB solutions are what you wait for, or get the water-cooled version instead. Unfortunately, the AIB solutions are for low-end Vega only... Not being able to reach the card's boost clock, hitting the power limit all the time, and power throttling is not an ideal setup. That thing is going to be a heater and probably very noisy too if you want boost clocks. The "R" stands for rocket..


    Now that I've watched the video, you've buried the lede. There were four people who said that the Vega system was better, and only one who said that the GTX 1080 Ti system was better. Only three people said that one system was worth $300 more than the other, and all three of them preferred the Vega system. You made it sound like the only question was how much AMD would lose that comparison by, when they won it outright on performance alone while ignoring cost, in addition to also being cheaper.
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    edited July 2017
    ^ I was going off the Reddit thread I had read about the Budapest and Portland events, and the views and feelings mentioned by attendees who went there.

    The pitch from AMD is clear and it goes like this: please tell us the difference in your gaming experience - one is on a G-Sync monitor and the other on FreeSync. And by the way, the cost difference between the systems can be a few hundred dollars.

    Their focus has been on overall system cost.

    And I can see why (potentially) with this article today - https://www.nordichardware.se/nyheter/radeon-rx-vega-prislapp-sverige.html
    (You'll have to translate to read)

    It's clear Vega is going to be quite expensive if this turns out to be true. For me, it's looking like high-end Vega can't fully beat an AIB 1080. Why would you buy a card that performs a little better than a 1080 for a 1080 Ti price? And at twice the power? It doesn't make sense, so we will have to wait and see for more info.



  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    edited July 2017
    Here is HardOCP's write up -

    https://www.hardocp.com/article/2017/07/26/blind_test_rx_vega_freesync_vs_gtx_1080_ti_gsync

    Look, there's that pitch again... - "When we are asking questions about value in our interviews, this is where the "$300 difference in cost" comes from, however it is a bit more than that. AMD did not tell us what the RX Vega video card's MSRP would be, so we did not consider this in our value question."

    I would rather see no FreeSync or G-Sync, and instead Fast Sync vs. Enhanced Sync on the same monitor.

    Video -




  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    AmazingAvery said:
    ^ I was going off the Reddit thread I had read about the Budapest and Portland events, and the views and feelings mentioned by attendees who went there.

    The pitch from AMD is clear and it goes like this: please tell us the difference in your gaming experience - one is on a G-Sync monitor and the other on FreeSync. And by the way, the cost difference between the systems can be a few hundred dollars.

    Their focus has been on overall system cost.

    And I can see why (potentially) with this article today - https://www.nordichardware.se/nyheter/radeon-rx-vega-prislapp-sverige.html
    (You'll have to translate to read)

    It's clear Vega is going to be quite expensive if this turns out to be true. For me, it's looking like high-end Vega can't fully beat an AIB 1080. Why would you buy a card that performs a little better than a 1080 for a 1080 Ti price? And at twice the power? It doesn't make sense, so we will have to wait and see for more info.
    The test in question has Vega beating a GTX 1080 Ti outright.  Not just a GTX 1080, but a 1080 Ti.  And that's if we ignore cost entirely.  It's possible that that's an outlier, but let's not use it as evidence that Vega is going to be slower than a GTX 1080.

    If a Radeon RX Vega costs $1000, then yeah, that would dampen the value proposition a lot.  But that's what a Frontier Edition card costs, and AMD has said the RX Vega will be a lot cheaper than that.
  • RidelynnRidelynn Member EpicPosts: 7,383
    The issue I have with the HardOCP video - now that it's out and I've been able to watch:

    I think the testing was as thorough as could be allowed - you're not going to get anything scientific or concrete in a setting like that. They chose a game at a resolution that wasn't particularly demanding, but it's something.

    The part that I find funny - they kept coming back to the cost delta of $300 - based largely on the difference in price between GSync and Freesync monitors if I'm not mistaken.

    Some people on HardOCP forums think the cost difference between the panels was a good bit more than $300, and that there may be some signaling there of Vega pricing. I don't know about that, but it's interesting speculation.

    My thought is, if the value difference of the GPU is driven by your monitor:
    a) Who is buying new monitors every time they buy a new GPU? Most of the time I don't even buy a new monitor when I build a new computer.
    b) It would take near-0 effort for nVidia to just enable Freesync in their driver and totally eliminate that advantage, if that is the advantage that AMD wants to market primarily
  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    Ridelynn said:
    b) It would take near-0 effort for nVidia to just enable Freesync in their driver and totally eliminate that advantage, if that is the advantage that AMD wants to market primarily
    If Nvidia does that, I would call that progress.  If your competitor is sabotaging their own product, why not make a big fuss about it until they stop doing so?
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    I don't like that they chose an AMD-favoured benchmark game. It's one they have chosen to use before with cards. I wonder if they demanded it of him.

    Ridelynn said:
    They chose a game at a resolution that wasn't particularly demanding, but it's something.






  • CleffyCleffy Member RarePosts: 6,412
    G-Sync is better than FreeSync at achieving consistent frames. I don't think that will ever be debatable, due to nVidia's hardware-based solution. But the perceived benefit vanishes the higher the refresh rate of the monitor. At 144 Hz it doesn't matter which adaptive sync you are using; you really don't need it (see the quick numbers after this post).
    What does matter is the price of these monitors, and the more pressure AMD puts on nVidia, the more likely they will be to drop G-Sync entirely, as any new monitor will support FreeSync. We aren't really talking about most of nVidia's proprietary bullshit now because the need for it was removed from the marketplace. G-Sync will just be another one of nVidia's proprietary solutions that disappears.

    On perceived framerate, the goal could be to show off the benefits of HBM2. When you are doing a side-by-side comparison at frame rates beyond what the eye can pick out, it all comes down to latency and micro-stutter. HBM will handle micro-stutter better than GDDR5X. Traditionally AMD handled micro-stutter better, until G-Sync.
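    The quick numbers behind the 144 Hz point above (my own arithmetic, not measurements):

    ```python
    # The worst-case penalty for just missing a refresh is roughly one full
    # refresh interval, and that window shrinks fast as the refresh rate climbs.

    for hz in (60, 144):
        interval_ms = 1000 / hz
        print(f"{hz:>3} Hz: refresh every {interval_ms:.1f} ms "
              f"-> a just-missed frame waits up to ~{interval_ms:.1f} ms extra")
    # At 60 Hz that extra ~16.7 ms is a visible hitch; at 144 Hz it's ~6.9 ms,
    # which is why adaptive sync buys much less on a high-refresh panel.
    ```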
  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    edited July 2017
    AmazingAvery said:
    I don't like that they chose an AMD-favoured benchmark game. It's one they have chosen to use before with cards. I wonder if they demanded it of him.

    Ridelynn said:
    They chose a game at a resolution that wasn't particularly demanding, but it's something.

    That's one reason why it might be an outlier.  Still, most DX12 and Vulkan games thus far seem to be fairly pro-AMD, and that's probably the future as APIs go.

    Even so, to beat your competitor's best in a legitimate test with only mild cherry-picking of which game to use sure beats losing to your competitor even with that cherry-picking.  It's not like it was a benchmark of closed-source code written by AMD, of the sort that Nvidia pushes with GameWorks, or in another era, GPU PhysX.

    This could easily be a rerun of Fiji or Ryzen, where AMD released early results showing their product winning certain cherry-picked but otherwise reasonable tests (4K in several games for Fiji, Cinebench for Ryzen), and the other benchmarks, once the full reviews arrived, were somewhat less favorable to AMD's product. Even so, that heralded Fiji and Ryzen being competitive products. A similarly modest amount of cherry-picking wouldn't have made Bulldozer look good, and this gives reason to doubt the claims that a Radeon RX Vega was only going to be competitive with a GeForce GTX 1070, which would be a genuine Bulldozer-level fiasco.
  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    Cleffy said:
    HBM will handle micro stutter better than GDDR5X.
    Why do you say that?  The high-level, macroscopic phenomenon of micro-stutter is so far away from the minutiae of memory bandwidth and latency timings that I'd find it awfully hard to trace a causal relationship from the latter to the former even if there were one.
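    To be concrete about what "micro-stutter" even measures, here is a minimal sketch (made-up frame times, not benchmark data) of how it is usually quantified: frame-time consistency rather than average fps.

    ```python
    # Two runs with the same ~60 fps average; only the frame-time spread
    # (stdev, high-percentile frame times) exposes the stutter.

    import statistics

    def stutter_report(frame_times_ms):
        avg = statistics.mean(frame_times_ms)
        p99 = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]
        jitter = statistics.pstdev(frame_times_ms)
        return avg, p99, jitter

    smooth = [16.7] * 100            # steady 60 fps
    stuttery = [12.0, 21.4] * 50     # same average, alternating fast/slow frames

    for name, times in (("smooth", smooth), ("stuttery", stuttery)):
        avg, p99, jitter = stutter_report(times)
        print(f"{name:9s} avg {avg:.1f} ms | 99th pct {p99:.1f} ms | stdev {jitter:.1f} ms")
    # Whether the frames are uneven depends on the whole pipeline (CPU submission,
    # driver pacing, sync mode), which is why pinning it on memory type alone is a stretch.
    ```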
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    edited July 2017
    Doom is AMD's strongest game. There's no evidence it beat anything, because we don't have raw data. A pointless test because of the game choice. The problem with Doom is that it already runs so fast even on Maxwell that a blind challenge wouldn't show any difference. Using a more taxing game where fps matters would certainly be more realistic, since these are the best cards from both companies. We basically already know it won't be competing with Nvidia's high end, and they will have to rely on subjective tests to make themselves seem competitive. The raw performance won't be there.

    It's interesting that AMD seems to be very intent on hiding the actual performance of Vega. All of this subjective nonsense. Smoke and mirrors.

    The conclusion is really that it's running on Vulkan, where AMD can get up to a 40% increase while NV can get up to 5%, and it plays better with AMD FreeSync. It seems one or two guys thought that the Vega system was a bit slower or dipped in FPS compared to the 1080 Ti, but overall the message was that they couldn't IMMEDIATELY sense any meaningful differences between the two systems. Was that the FreeSync monitor's doing, or was it the Vega card itself? That is why I find this approach worthless and a marketing spin.

    What in the hell is really going on with Vega?



  • QuizzicalQuizzical Member LegendaryPosts: 25,355
    We'll find out when it launches.  In the meantime, don't pre-order one, for the same reasons as all other pre-release hardware.