
4k gaming and MMO are pointless.

245 Comments

• dreamscaper Member Uncommon Posts: 1,592
    4K video rendering is going to be fairly poor unless you're using something properly coded to take advantage of the GPU. Otherwise it's like trying to slice bread with a spoon.

    <3

• Quizzical Member Legendary Posts: 25,348
    Originally posted by CalmOceans
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is, the 4k video is heavily taxing the CPU.

    Only if the video decode block on the GPU can't handle it.  Which, if he's having problems, it probably can't.  The solution is doing 4K video decode in a way that the GPU can handle it.  That can mean different codecs, different frame rates, and so forth.  It can also mean getting a newer GPU.  I'd be mildly surprised if certain recent GPUs can't do 4K video decode, and very surprised if upcoming ones from AMD and Nvidia can't do it.  But until the industry settles on which encodings and such everyone is going to use (which may have already happened, though I don't know if it has), 4K video decode will be hit and miss as you can't build a video decode block to handle everything that anyone could possibly try.

    Of course, if you're using an older GPU such as Fermi or Northern Islands, then it's extremely unlikely that it can decode 4K video at all.  I'm not sure when AMD and Nvidia started adding 4K video support, though it's likely that it was around the time they started adding 4K monitor support.
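If you want to see what your decode block would actually be asked to handle, the codec, resolution, and frame rate of the clip are the things to check. Here's a minimal sketch that reads them out with ffprobe; it assumes Python 3 plus a working ffmpeg/ffprobe install, and the file name is only a placeholder:

```python
# Minimal sketch: report the codec, resolution and frame rate of the first video
# stream in a local clip, since those determine whether a GPU's fixed-function
# decode block can handle it. Assumes ffprobe (part of ffmpeg) is on the PATH.
import json
import subprocess

def video_stream_info(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_streams", "-of", "json", path],
        capture_output=True, text=True, check=True).stdout
    stream = json.loads(out)["streams"][0]
    return {
        "codec": stream["codec_name"],               # e.g. "h264" or "vp9"
        "resolution": f'{stream["width"]}x{stream["height"]}',
        "avg_frame_rate": stream["avg_frame_rate"],  # e.g. "30000/1001"
    }

if __name__ == "__main__":
    print(video_stream_info("some_4k_clip.mp4"))     # placeholder file name
```

If that reports, say, VP9 at 3840x2160 and your card's decode block only knows H.264, the work falls back to the CPU, which is exactly the hit-and-miss situation described above.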

• CalmOceans Member Uncommon Posts: 2,437
    Originally posted by Quizzical
    Originally posted by CalmOceans
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is, the 4k video is heavily taxing the CPU.

    Only if the video decode block on the GPU can't handle it.

    It's going to tax every single CPU. Do the test, run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4k, do CTRL-ALT-DEL and check your CPU. It's going to be taxed heavily.

    That's the decoding on the CPU.

    People severely overestimate what is being hardware accelerated, it's less than you think.
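For anyone who wants a slightly more repeatable version of this test than eyeballing Task Manager, here's a rough sketch (assuming Python 3 with the third-party psutil package installed): start it, click play on the 4k clip, and compare the numbers against a 1080p run.

```python
# Rough sketch: sample system-wide CPU usage once per second while the video
# plays in the browser, then print the average and the peak.
# Assumes the third-party psutil package is installed (pip install psutil).
import psutil

def sample_cpu(duration_s=30, interval_s=1.0):
    samples = []
    for _ in range(int(duration_s / interval_s)):
        # cpu_percent blocks for interval_s and returns utilisation over that window
        samples.append(psutil.cpu_percent(interval=interval_s))
    return samples

if __name__ == "__main__":
    samples = sample_cpu()
    print(f"avg: {sum(samples) / len(samples):.1f}%  peak: {max(samples):.1f}%")
```

It measures the whole system rather than just the browser's decode thread, so close whatever else you can before running it.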

• krondin Member Uncommon Posts: 106

     < Bad Comedy Day Mwhahaha >

     

     

    4k is Pointless?  You are so RIGHT!  

     

    Just keep on making riding straps, no one is Really going to give up a perfectly good Horse to ride in those new Automobiles!

• Quizzical Member Legendary Posts: 25,348
    Originally posted by CalmOceans
    Originally posted by Quizzical
    Originally posted by CalmOceans
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is, the 4k video is heavily taxing the CPU.

    Only if the video decode block on the GPU can't handle it.

    It's going to tax every single CPU. Do the test, run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4k, do CTRL-ALT-DEL and check your CPU. It's going to be taxed heavily.

    That's the decoding on the CPU.

    People severely overestimate what is being hardware accelerated, it's less than you think.

    Well of course it's not going to use the video decode block on my video card.  I bought my video card in 2009, back when 4K was a long way off and any sort of DisplayPort at all was still so new that virtually no monitors used it.

    But offloading 4K video decode to GPUs is coming, if it's not here already.  It takes time for the industry to coalesce around standards of exactly how the encoding will be done, and then after that, for GPU vendors to implement it in silicon.  I'm not sure how far along that process is, but it's going to finish eventually.

• Quizzical Member Legendary Posts: 25,348
    Originally posted by Jemcrystal
    What's 4k?  What does it have to do with gaming?

    4K is shorthand for the monitor resolution 3840x2160.

• CalmOceans Member Uncommon Posts: 2,437
    Originally posted by Quizzical

    It's going to tax every single CPU. Do the test, run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4k, do CTRL-ALT-DEL and check your CPU. It's going to be taxed heavily.

    That's the decoding on the CPU.

    People severely overestimate what is being hardware accelerated, it's less than you think.

    Well of course it's not going to use the video decode block on my video card.  I bought my video card in 2009, back when 4K was a long way off and any sort of DisplayPort at all was still so new that virtually no monitors used it.

    It's going to tax your CPU regardless of what GPU you have. It has nothing to do with your GPU being old.

    Everyone who tests this video at 4k will see their CPU being taxed the minute they click play, regardless of what GPU they have.

• kujii Member Uncommon Posts: 190
How something runs on Youtube has nothing to do with how a game will run. Youtube is slow even at 1080. 4k on Vimeo runs much better than Youtube.
• Axehilt Member Rare Posts: 10,504
    Originally posted by Kiyoris

    So, I ran a test with that uniengine.

    From resolution 1024*640 to 1920*1057

    that is about 4 times as much.

    It should be close to going from 1080p to 4k.

    My FPS dropped to 1/3rd.

    But if you account for the fact that 1024*640 is still slightly more than a 4th of 1920*1057, then it is about 1/4th.

     

    So, going from 1080p to 4k...I think...should drop your FPS to 1/4th.

     

    I got 66.6 fps on the 1920x1080 fullscreen option.  Running fullscreen might get you quite a bit higher FPS.

Calling 4k gaming "pointless" is silly. Technology is always improving, and not everyone has a weak computer. Gaming typically doesn't push the boundaries that hard (you ideally want to run smoothly on at least 80% of your target audience's computers, and that includes some pretty weak computers), but that doesn't mean it shouldn't support the players at the higher end.
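The quoted arithmetic is easy to sanity-check. Below is a small sketch that just counts pixels, under the naive assumption that frame rate scales inversely with pixel count; real games are rarely purely pixel-bound, so treat the result as a worst-case estimate rather than a prediction.

```python
# Naive estimate: assume FPS scales inversely with the number of pixels rendered.
# Real workloads also depend on geometry, shaders and bandwidth, so this is only
# an upper bound on how far the frame rate can drop from resolution alone.
def pixels(width, height):
    return width * height

def scaled_fps(fps, old_res, new_res):
    return fps * pixels(*old_res) / pixels(*new_res)

print(pixels(1920, 1080))                           # 2,073,600
print(pixels(3840, 2160))                           # 8,294,400 -- exactly 4x 1080p
print(pixels(1920, 1057) / pixels(1024, 640))       # ~3.1x, the Unigine jump quoted above
print(scaled_fps(120, (1920, 1080), (3840, 2160)))  # 120 FPS at 1080p -> ~30 FPS at 4K
```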

    "What is truly revealing is his implication that believing something to be true is the same as it being true. [continue]" -John Oliver

• Bascola Member Uncommon Posts: 425
    Originally posted by Kiyoris

    eeeeeh,

I think so. So, I was trying some video on youtube that is 4k, and it lags like crazy on both of my PCs. They're very fast, one with an i5, another with an i7.

None of the browsers seem to use the GPU to render, but still, state of the art CPUs are not even fast enough to render 4k video. It shows how much extra power you would need to go from 1080p to 4k.

     

    4k would require rendering 4 times as many pixels, to get the same speed as 1080p, you would need 4 times as fast a GPU, 4 times as fast RAM to load 4 times as much texture data, 4 times as much bandwidth.

    If your PC now does 120FPS @ 1080P...it will do 30FPS @ 4k..at best, probably less because you will not even have the bandwidth for it....no sane person is going to accept such a hit in framerate.

    It's neeever gonna happen. At least not in the first couple of years. No one is going to accept 4 times less framerate for a few more pixels.

    My current Rig is as follows:

    • EVGA NV 970 AC2.0 SCC
• Intel i7-4790K
    • ASUS P97 Pro Gamer with 32 GB RAM
    • Asus PB287Q LED 4K Monitor
Latest game I tested was Dying Light: Very High, 30-50 FPS

    I run most games except Metro:LL with 40-60+ FPS. 4K Gaming is absolutely beautiful and if you have not seen it on a real 4K monitor you don't know anything at all. Watching a 4K Video on a 1K screen is a joke.

Dragon Age Inquisition looks absolutely stunning, so does FFXIV, FarCry4 or AC:Unity. Shadow of Mordor will blow your mind. Even older games look fantastic in 4K.

    You can turn off all the shitty AA because you simply don't need it any more. The games look so much crisper and better with AA off at 4K.

     

     

• Bascola Member Uncommon Posts: 425
    Originally posted by CalmOceans

    It's going to tax every single CPU. Do the test, run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4k, do CTRL-ALT-DEL and check your CPU. It's going to be taxed heavily.

    That's the decoding on the CPU.

    People severely overestimate what is being hardware accelerated, it's less than you think.

50-57% CPU while 7 Days to Die was running in the background. You people need to clean your PCs of all the malware.

    I had it running on my 4K monitor with the Task Manager on the second monitor.

• Mikeha Member Epic Posts: 9,196
    Originally posted by CalmOceans
    Originally posted by Quizzical
    Originally posted by CalmOceans
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is, the 4k video is heavily taxing the CPU.

    Only if the video decode block on the GPU can't handle it.

    It's going to tax every single CPU. Do the test, run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4k, do CTRL-ALT-DEL and check your CPU. It's going to be taxed heavily.

    That's the decoding on the CPU.

    People severely overestimate what is being hardware accelerated, it's less than you think.

     

    CPU 50%-60% at 4K

    CPU 12%-20% at 1080P

    Tested the video in Chrome running an AMD FX 8350 and Radeon 7970GHZ

• Thomas2006 Member Rare Posts: 1,152

ATM Nvidia does not support hardware VP9 decoding. That YouTube video is in VP9 and thus will not be decoded on your graphics hardware.

    http://www.phoronix.com/scan.php?page=news_item&px=VP9-Parallel-Decode

So I don't know how this test is worth anything other than showing that modern CPUs can not keep up with 4K video decoding in software.

    This is exactly why running games at 4k (using your GPU) does not result in the same issues as viewing a VP9 encoded video on youtube that decodes via your CPU.

Now if you find an h.264 encoded 4k video and view it, then it will decode via your GPU and that will provide you with some results. YouTube, ever since they switched over to HTML5, uses VP9 for their videos.
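If you want to test the GPU path directly rather than through the browser, decode a local 4k h.264 clip once in software and once with hardware acceleration and compare the CPU time used. This is a rough sketch only (POSIX systems, ffmpeg installed, and the file name is a placeholder); on a card whose decode block supports the codec, the hardware run should burn far less CPU time.

```python
# Rough sketch (POSIX-only): decode the same clip in pure software and then with
# hardware acceleration, and compare how much CPU time the ffmpeg child process uses.
# Assumes ffmpeg is installed; "clip_4k_h264.mp4" is just a placeholder file name.
import resource
import subprocess

def decode_cpu_seconds(extra_args, path):
    before = resource.getrusage(resource.RUSAGE_CHILDREN)
    subprocess.run(
        ["ffmpeg", "-v", "error", *extra_args, "-i", path, "-f", "null", "-"],
        check=True)
    after = resource.getrusage(resource.RUSAGE_CHILDREN)
    return (after.ru_utime - before.ru_utime) + (after.ru_stime - before.ru_stime)

if __name__ == "__main__":
    clip = "clip_4k_h264.mp4"
    print("software decode:", decode_cpu_seconds([], clip), "CPU-seconds")
    print("hardware decode:", decode_cpu_seconds(["-hwaccel", "auto"], clip), "CPU-seconds")
```

One caveat: with -hwaccel auto, ffmpeg quietly falls back to software if it can't find a usable hardware decoder, so two similar numbers usually mean the fallback kicked in rather than that hardware decode is slow.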

• Reklaw Member Uncommon Posts: 6,495
    Originally posted by Kiyoris
    Originally posted by Aori
    What videos are you watching in 4k? 

    https://www.youtube.com/watch?v=Zk9J5xnTVMA

    (make sure you turn on 4k video)

     

    I can see stutter lag on i5 and i7.

IE, Firefox and Chrome all seem to be using the CPU to render, even though everyone keeps saying the GPU is used to render video. In reality, I rarely see any program outside games use the GPU for much of anything.

    Task manager shows the CPU at 40%-60% load to render the video.

    Same video at 1080P is ONLY 5% load.

     

So going from 1080p to 4k, the CPU load goes from 5% to 45%.

4k in theory only requires 4 times as much power, but it is never that little in reality: you get bandwidth issues, components causing bottlenecks, etc. It will be the same with gaming.

The clip runs smooth on my 4k livingroom TV but stutters on my gaming desktop. Could it be because my monitor isn't 4k, even though my PC system can handle 4k?

• Thomas2006 Member Rare Posts: 1,152

    And to put this silly thread to rest.

    http://us.hardware.info/reviews/5609/7/the-best-htpc-platform-for-the-future-video-quality-tested-of-current-cpus-and-gpus-h264-in-ultra-hd-often-good-sometimes-not

When you actually use hardware decoding on a 4k video you're looking at around 5% to 10% CPU usage with silky smooth video playback.

• Kiyoris Member Rare Posts: 2,130
    Originally posted by VastoHorde
    Originally posted by CalmOceans
    Originally posted by Quizzical
    Originally posted by CalmOceans
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is, the 4k video is heavily taxing the CPU.

    Only if the video decode block on the GPU can't handle it.

    It's going to tax every single CPU. Do the test, run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4k, do CTRL-ALT-DEL and check your CPU. It's going to be taxed heavily.

    That's the decoding on the CPU.

    People severely overestimate what is being hardware accelerated, it's less than you think.

     

    CPU 50%-60% at 4K

    CPU 12%-20% at 1080P

    Tested the video in Chrome running an AMD FX 8350 and Radeon 7970GHZ

Same for me, 4k puts a massive amount of load on my CPU, and my GPU is brand new.

Not sure why the guy called me "confused"; when my CPU is loaded at 60% it's using the CPU. Can't be simpler than that.

• bentrim Member Uncommon Posts: 299
Yea, dual 680s are nothing to brag about... you don't have enough machine.
• Quizzical Member Legendary Posts: 25,348
    Originally posted by Kiyoris
    Originally posted by VastoHorde
    Originally posted by CalmOceans
    Originally posted by Quizzical
    Originally posted by CalmOceans
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is, the 4k video is heavily taxing the CPU.

    Only if the video decode block on the GPU can't handle it.

    It's going to tax every single CPU. Do the test, run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4k, do CTRL-ALT-DEL and check your CPU. It's going to be taxed heavily.

    That's the decoding on the CPU.

    People severely overestimate what is being hardware accelerated, it's less than you think.

     

    CPU 50%-60% at 4K

    CPU 12%-20% at 1080P

    Tested the video in Chrome running an AMD FX 8350 and Radeon 7970GHZ

Same for me, 4k puts a massive amount of load on my CPU, and my GPU is brand new.

Not sure why the guy called me "confused"; when my CPU is loaded at 60% it's using the CPU. Can't be simpler than that.

    Just because you've found a site that isn't compatible with your video card and/or video drivers doesn't mean that 4K is pointless.  Likewise, if I find a Linux executable and try to run it on Windows and it doesn't work, that doesn't mean that Windows is pointless.  It just means that you found something incompatible.

    It might be a problem in silicon with an older video decode block not able to handle 4K.  It might be a problem in video drivers with the drivers not yet designed to handle all of the slightly different ways that web sites do things.  It might be a problem with YouTube doing something proprietary and stupid to break compatibility.  It might be more than one of those.  But it's a compatibility problem, not something intrinsically wrong with 4K video.

    What GPU do you have, anyway?

• Boneserino Member Uncommon Posts: 1,768

    The point is, why waste processing power on what is basically an infinitesimal, if not negligible, improvement in graphic quality.

     

    ( This is obviously based on the size of your screen and the distance away that you sit.  Even then the difference will not be large.  The largest improvement will be sitting close to a very big screen.  If this is you then go for it! )
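You can put rough numbers on the screen-size-and-distance point. A back-of-envelope sketch, using the common rule of thumb that around 60 pixels per degree of visual angle is where normal (20/20) vision stops resolving extra detail; the 27-inch screen and 3-foot distance are just example values.

```python
# Back-of-envelope: pixels per degree of visual angle for a given screen size and
# viewing distance. Rule of thumb: beyond roughly 60 px/degree, extra resolution is
# hard for normal vision to see. The screen size and distance below are examples only.
import math

def pixels_per_degree(horizontal_px, diagonal_in, distance_in, aspect=(16, 9)):
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)               # physical screen width
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_px / fov_deg

for name, px in (("1080p", 1920), ("4K", 3840)):
    ppd = pixels_per_degree(px, diagonal_in=27, distance_in=36)  # 27" screen, ~3 feet away
    print(f"{name}: {ppd:.0f} px/degree")
# At this size and distance 1080p sits a little under the ~60 px/degree threshold and 4K
# is well past it; the gap shrinks as the screen gets smaller or the viewer sits further back.
```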

     

    FFA Nonconsentual Full Loot PvP ...You know you want it!!

• dreamscaper Member Uncommon Posts: 1,592
    Originally posted by Boneserino

    The point is, why waste processing power on what is basically an infinitesimal, if not negligible, improvement in graphic quality.

     

    ( This is obviously based on the size of your screen and the distance away that you sit.  Even then the difference will not be large.  The largest improvement will be sitting close to a very big screen.  If this is you then go for it! )

     

    From what I've seen in person, the difference between 4k and 1080p is about the same as the difference between 720p and 1080p. No, it's not huge, but when your monitor is only about 3'-4' from your face, which is the case for anyone sitting at a computer desk, the difference is quite noticeable.

    <3

• Thomas2006 Member Rare Posts: 1,152


    Originally posted by Boneserino
    The point is, why waste processing power on what is basically an infinitesimal, if not negligible, improvement in graphic quality. ( This is obviously based on the size of your screen and the distance away that you sit.  Even then the difference will not be large.  The largest improvement will be sitting close to a very big screen.  If this is you then go for it! ) 


I'm not sure how you're not going to notice nearly 4 times more pixels on the screen. You are going from around two million pixels to eight million pixels.

You're going to notice that jaggy edges appear a lot less, if not gone. When things start to make more use of the 4K space you're going to notice way more crisp textures, and more detail in the textures.

• Bascola Member Uncommon Posts: 425
    Originally posted by dreamscaper
    Originally posted by Boneserino

    The point is, why waste processing power on what is basically an infinitesimal, if not negligible, improvement in graphic quality.

     

    ( This is obviously based on the size of your screen and the distance away that you sit.  Even then the difference will not be large.  The largest improvement will be sitting close to a very big screen.  If this is you then go for it! )

     

    From what I've seen in person, the difference between 4k and 1080p is about the same as the difference between 720p and 1080p. No, it's not huge, but when your monitor is only about 3'-4' from your face, which is the case for anyone sitting at a computer desk, the difference is quite noticeable.

Absolutely not. You need to really go and see a game or video in 4K compared with one in 1K and you will see that the difference is much bigger than from 720 to 1080. You are able to read signs that are far in the distance in 4K, while in 1K they are just a blob of pixels.

    It's almost impossible to illustrate for a 1K monitor user but this is a comparison of a distant object in 4k and 1k:

[Screenshot comparison not preserved: the original 1K screenshot, then the highlighted area shown side by side in 1K and 4K.]

• sk8chalif Member Uncommon Posts: 666

4k video ran between 35 and 45 fps with my Nvidia GeForce GTX 770 OC 2GB and AMD FX 8350. The video looked great on my BenQ 27", 120hz.

     

Not really into 4k here, but I saw Devin's video about why he uploads in 4k: for the future. Those videos will be amazing once everyone gets 4k, just like when we all got 1080p back then lol

     

     

     

~The only opinion that matters is your own. Everything else is just advice.~

• SomeOldBloke Member Uncommon Posts: 2,167
Youtube is crap, or it could be the way it gets rendered and uploaded. I can watch two 1080p videos back to back; one stutters and buffers and the next runs smoothly all the way through.
• Trionicus Member Uncommon Posts: 498

    My 2 cents!

Tried the Road to Machu Picchu link first in Chrome with terrible results. Roughly 40% to 60% CPU usage in 4K.

Then tried it in Firefox @ 4K with silky smooth playback and under 30% CPU usage. Like, super silky baby-bottom smooth playback.

     

    /shrug
