
4K gaming and MMOs are pointless.

Kiyoris Member RarePosts: 2,130

eeeeeh,

I think so. I was trying some 4K video on YouTube and it lags like crazy on both of my PCs. They're very fast: one with an i5, the other with an i7.

None of the browsers seem to use the GPU to render, but even so, state-of-the-art CPUs aren't fast enough to render 4K video. It shows how much extra power you would need to go from 1080p to 4K.

4K requires rendering 4 times as many pixels. To get the same speed as at 1080p, you would need a GPU 4 times as fast, RAM 4 times as fast to load 4 times as much texture data, and 4 times as much bandwidth.

If your PC does 120 FPS @ 1080p now, it will do 30 FPS @ 4K... at best, probably less, because you won't even have the bandwidth for it. No sane person is going to accept such a hit in framerate.

It's never gonna happen, at least not in the next couple of years. No one is going to accept a quarter of the framerate for a few more pixels.
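A minimal sketch of that arithmetic, assuming 4K means 3840x2160 and that FPS scales inversely with pixel count (the worst case; posts further down argue real scaling is often better):

```python
# Worst-case sketch: FPS assumed to scale inversely with pixel count.
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_4k    = 3840 * 2160   # 8,294,400 pixels (exactly 4x as many)

fps_1080p = 120
fps_4k = fps_1080p * px_1080p / px_4k
print(fps_4k)            # 30.0 -- the "120 FPS drops to 30 FPS" claim
```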


Comments

  • Incomparable Member UncommonPosts: 1,138

    Considering the 970 fiasco, I'm not sure whether developers are selling low-frame-rate 4K video for PC rendering.

    But I haven't heard of this issue before. Video playback is just more pixels; video games have pixels behind pixels, polygons that make things 3D, and in-game physics.

    So if people are playing games in 4K already, I'm assuming watching a vid is a lot more doable. They obviously need top-end hardware. Also, I believe they use G-Sync monitors, which help with bad frame rates.

    Edit: found this clip of BF3 in 4K.

    The specs are an i5 at 3.3 GHz with two 680s. It runs at 30 FPS.

    Video runs at 30 FPS (I mean for a DVD player, mostly), so for watching vids that's fine, even though I prefer more than 30, even for vids.

    Edit 2:

    Most YouTube vids run at 30 FPS.

    http://www.extremetech.com/gaming/185454-will-60fps-youtube-videos-force-game-developers-to-prioritize-frame-rate

    “Write bad things that are done to you in sand, but write the good things that happen to you on a piece of marble”

  • Kiyoris Member RarePosts: 2,130
    Originally posted by Aori
    What videos are you watching in 4K?

    https://www.youtube.com/watch?v=Zk9J5xnTVMA

    (make sure you turn on 4K video)

    I can see stutter/lag on both the i5 and the i7.

    IE, Firefox, and Chrome all seem to be using the CPU to render. Everyone keeps saying the GPU is used to render video, but in reality I rarely see any program outside of games use the GPU for much of anything.

    Task Manager shows the CPU at 40-60% load rendering the video.

    The same video at 1080p is ONLY 5% load.

    So going from 1080p to 4K takes the CPU load from 5% to 45%.

    In theory 4K only requires 4 times as much power, but in reality it's never that little: you get bandwidth issues, components causing bottlenecks, and so on. It will be the same with gaming.
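    A quick check of those numbers (a sketch; the load figures are the ones quoted above, and 4K is assumed to be 3840x2160):

    ```python
    # Theoretical pixel ratio vs. the observed CPU-load ratio above.
    px_1080p = 1920 * 1080            # 2,073,600 pixels
    px_4k    = 3840 * 2160            # 8,294,400 pixels
    print(px_4k / px_1080p)           # 4.0 -- the "in theory" factor

    load_1080p, load_4k = 0.05, 0.45  # Task Manager loads from the post
    print(load_4k / load_1080p)       # ~9x observed -- worse than 4x
    ```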

  • Kiyoris Member RarePosts: 2,130
    Originally posted by bestever
    That comes down to your internet and youtube

    Can't be, because the video loads faster than I can play it.

  • Kiyoris Member RarePosts: 2,130
    Originally posted by bestever
    I did a test with TERA at 4K and it ran pretty well. That's with an i3 and a GTX 980.

    How much of a frame-rate hit do you get going from 1080p to 4K, though?

  • ferdmert Member Posts: 10
    I have no problem playing or viewing at 4K. Sounds like your PC isn't up to par for it.
  • Reizla Member RarePosts: 4,092
    Originally posted by Kiyoris
    Originally posted by Aori
    What videos are you watching in 4K?

    https://www.youtube.com/watch?v=Zk9J5xnTVMA

    (make sure you turn on 4K video)

    I can see stutter/lag on both the i5 and the i7.

    IE, Firefox, and Chrome all seem to be using the CPU to render. Everyone keeps saying the GPU is used to render video, but in reality I rarely see any program outside of games use the GPU for much of anything.

    Task Manager shows the CPU at 40-60% load rendering the video.

    The same video at 1080p is ONLY 5% load.

    So going from 1080p to 4K takes the CPU load from 5% to 45%.

    In theory 4K only requires 4 times as much power, but in reality it's never that little: you get bandwidth issues, components causing bottlenecks, and so on. It will be the same with gaming.

    I did a playback with 4K settings on my 1080p monitor and experienced little to no lag in Firefox. Next I ran the same video with 4K settings through Vivaldi (the new browser from the makers of Opera), and there I saw a lot of stuttering in the playback.

    I think 4K resolutions are a bit too early for gaming. I agree the technology to play games in 4K is already there, but the average gaming PC just can't handle it. Perhaps three years from now it'll be more common...

  • Kiyoris Member RarePosts: 2,130
    Originally posted by ferdmert
    I have no problem playing or viewing at 4K. Sounds like your PC isn't up to par for it.

    I'd believe you if there weren't hundreds of comments from people with brand-new i7s lagging too.

  • Kiyoris Member RarePosts: 2,130

    Maybe it's the compression; the CPU is probably handling the decompression.

    I don't have any raw 4K video, but maybe that wouldn't lag, since it wouldn't be compressed.

    It would require far more bandwidth but far less CPU power.
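    The bandwidth side of that trade-off is easy to put numbers on (a sketch; 4K is assumed to be 3840x2160 at 30 FPS with 24-bit color, and the compressed figure is a typical ballpark, not from this thread):

    ```python
    # Uncompressed 4K video: every pixel of every frame crosses the bus.
    width, height, fps, bytes_per_px = 3840, 2160, 30, 3
    raw_bytes_per_sec = width * height * fps * bytes_per_px
    print(raw_bytes_per_sec / 1e6)  # ~746 MB/s, with almost no decode work

    # A compressed 4K stream runs on the order of 15-50 Mbit/s
    # (a few MB/s) -- hundreds of times smaller, paid for in
    # CPU/GPU decode effort.
    ```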

  • evilized Member UncommonPosts: 576
    You guys do realize that unless you have a 4K TV/monitor, running anything at 4K is completely pointless, right?
  • TheGoblinKing Member UncommonPosts: 208

    No problems on any setting here, and I'm running a quad-core i7 with an Nvidia GTX 580.

  • CalmOceans Member UncommonPosts: 2,437

    There are times the video lags for me and times when it doesn't. It didn't lag the third time I played it.

    I have to say, though, it's seriously taxing on my system. I don't feel like using 4K video for the simple fact that it makes my computer's temperature skyrocket. It doesn't even get that hot when playing games.

  • Kiyoris Member RarePosts: 2,130
    Originally posted by evilized
    You guys do realize that unless you have a 4K TV/monitor, running anything at 4K is completely pointless, right?

    I know, but it still renders at 4K, so you can see how your PC handles it.

  • Kiyoris Member RarePosts: 2,130
    Originally posted by Aori

    I'm pretty sure this is more of a software-related problem.

    I mean, Netflix and Amazon are delivering UHD streaming to shitboxes, for goodness' sake.

    Yes, but the bitrate of those videos is abysmal. Netflix 1080p is sometimes barely better than DVD quality.

    Some people are saying that some 4K video on YouTube lags and some doesn't. My guess is that the ones that don't lag are 4K at a really low bitrate.

  • Kiyoris Member RarePosts: 2,130

    I'm always amazed at how good DVD quality still looks. Physical DVD, that is.

    Yes, it's low resolution, but the fact that it's not as heavily compressed, doesn't come from an online source, and doesn't have compression artefacts makes it enjoyable to watch.

    I mean, they talk about 8K video, but I would rather watch DVD quality without artefacts than 8K with artefacts, frame drops, and stutter.

  • Rzep Member UncommonPosts: 767

    I'll let you guys in on a little secret: just because YouTube is popular does not mean it is good.

    On the technology side, YouTube BLOWS. If a video lags for you, don't question your i5s or even i7s or your 970s. Both your CPU and GPU are fine.

    YouTube sucks, end of discussion. The encoding sucks, uploading sucks, their servers suck.

  • Loke666 Member EpicPosts: 21,441
    Originally posted by bestever
    Originally posted by TheGoblinKing

    No problems on any setting here, and I'm running a quad-core i7 with an Nvidia GTX 580.

    Strange, as I jumped on my desktop with my i7 and an Nvidia 680 and still have the same issue. My monitor is 2560x1080; if I drop it to 1440p it runs smooth as butter. Maybe it's my internet, which is 100 Mbps, so who knows what the deal is.

    I run 2560x1440 and I never get lag unless it's server lag. I mostly play GW2 right now, and I still run fine in DEs and PvE, with the graphics settings totally maxed out, even when some people in chat complain that they're just seeing a slideshow.

    My specs: AMD Phenom II, 3.3 GHz x6, Nvidia GTX 780, 8 GB RAM, Win 7.

    The OP is of course right that the higher the resolution, the better the graphics card you need. And at really high resolutions you actually need a lot of memory on your GPU as well as a high clock speed.

    But in your case you don't have 4K, so unless your CPU is really crappy or you are seriously low on RAM, I would say it's another problem. My guess (assuming you update your drivers now and then) is that there is a bunch of crap on your computer taking up precious resources.

    My recommendation is to either reinstall Windows and be careful about what programs you install, or do a serious cleaning, particularly of the programs that start with Windows. You can do that manually or use a good cleaner (I use YAC myself; it's free). And get rid of any malware that's mooching off you; most cleaners can do that as well, or you can use a specific program made for it, like Lavasoft's Ad-Aware.

    Also, always keep at least 10% free on your C: hard drive.

    I actually recommend the OP do this as well, and for that matter anyone who feels their computer is slower than when it was new. A clean PC is a fast one.

  • Akulas Member RarePosts: 3,006
    It didn't run any better or worse than HD on my machine, other than using 90% of my CPU instead of 5%, and I'm only on a GTX 660 with 4 GB RAM and an Intel Core i7-4820K @ 3.7 GHz with 8 CPUs.

    This isn't a signature, you just think it is.

  • Loke666 Member EpicPosts: 21,441
    Originally posted by Rzep

    I'll let you guys in on a little secret: just because YouTube is popular does not mean it is good.

    On the technology side, YouTube BLOWS. If a video lags for you, don't question your i5s or even i7s or your 970s. Both your CPU and GPU are fine.

    YouTube sucks, end of discussion. The encoding sucks, uploading sucks, their servers suck.

    We're in agreement there. With YouTube, what FPS you get is totally random, no matter what computer you own.

    And frankly, a 20-year-old computer can run videos fine, but rendering stuff is very different.

    If you seriously want to test your computer, you should benchmark it. Here is an acceptable free benchmark: https://unigine.com/products/heaven/download/

    Give us your score and I'll tell you if your computer is slow or not. :)

  • Loke666 Member EpicPosts: 21,441
    Originally posted by bestever

    We're talking about 4K YouTube videos, not games. I have no issues with my desktop or my Steam box. I already play games at 1440p, and some at 4K on my Steam box, as it has a GTX 980 4GB.

    It's just that the YouTube videos at 4K are really choppy.

    Duh, YouTube just sucks; heck, many low-res vids lag as well. And your ISP can mess things up too.

  • Kiyoris Member RarePosts: 2,130

    Loke, I reinstall my OS every couple of months, but I have software with limited activation keys from Corel and Dassault, so I don't want to just reinstall all the time.

    I'll probably wait until Windows 10 now, in the fall.

  • Kiyoris Member RarePosts: 2,130

    So, I ran a test with that Unigine benchmark.

    From 1024x640 (655,360 pixels) to 1920x1057 (2,029,440 pixels).

    That's about 3.1 times as many pixels.

    It should be fairly close to going from 1080p to 4K, which is exactly 4 times as many.

    My FPS dropped to about 1/3rd.

    And if you account for the fact that the pixel count only roughly tripled rather than quadrupled, the scaling works out close to linear, so a full 4x jump would cost about 3/4 of the FPS.

    So, going from 1080p to 4K... I think... should drop your FPS to about 1/4th.
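    Rough numbers behind that extrapolation (a sketch; the resolutions are the ones quoted above):

    ```python
    # Pixel ratio for the benchmark run vs. the 1080p -> 4K jump.
    low  = 1024 * 640        #   655,360 pixels
    high = 1920 * 1057       # 2,029,440 pixels
    print(high / low)        # ~3.1x more pixels; observed FPS ~1/3

    px_1080p = 1920 * 1080   # 2,073,600 pixels
    px_4k    = 3840 * 2160   # 8,294,400 pixels
    print(px_4k / px_1080p)  # 4.0x -> extrapolating linearly, ~1/4 the FPS
    ```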

  • Quizzical Member LegendaryPosts: 25,351

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.  Furthermore, it's not the entire GPU that is used for video decoding, but a dedicated video decode block.  Because video decoding needs don't vary wildly and the video decode block isn't that big, it's typically exactly the same decode block for all of the cards in a generation, from the top to the bottom.  Thus, if you get a GeForce GTX 650 or a GeForce GTX 780 Ti, you're probably getting exactly the same video decode block.

    Clock speeds can affect its performance a bit, but outside of the very low end (e.g., phones, tablets), you get about the same video decode performance no matter what card you get within a generation.  Indeed, I'd expect the GTX 650 to have a little better video decode performance than the GTX 780 Ti due to the higher clock speed, in spite of only offering about 15% as much performance for gaming.  But even if this is the case, there may or may not be any way to make that performance difference matter, even in synthetic benchmarks.

    One thing about fixed function hardware blocks is that, if you give them exactly what they're expecting, performance can be awesome.  This is why it's possible to watch decent resolution videos on a cell phone that doesn't have 1% of the gaming GPU performance of a desktop gaming card.  But if you give them something slightly different from what they're expecting, they completely choke.  The video decode block in video cards is built to handle certain encodings at certain resolutions and frame rates.  If you give it one of the things it is built for, you'll get flawless video decode.  And if you give it something else, bad things can happen--ranging from degraded performance to being completely unable to use the decode block at all.

    -----

    Increasing monitor resolution so that you have to render more pixels per frame does greatly increase the load on video cards.  But if you're just displaying the desktop, this means it goes from inconsequential to several times inconsequential, which is still pretty inconsequential.  The load in games varies wildly, too.  Anything that is playable on a lower end card at 1080p should be playable at the same settings (except the resolution) on a high end card with four times the performance at 4K.

    But quadrupling the number of pixels to draw tends not to quadruple the load on hardware.  Most CPU-side code doesn't care about the monitor resolution.  The only real exceptions are the bits of code that determine or let you change the resolution, and some culling code to skip drawing things that are known to be off the screen in the current frame.

    Even a lot of GPU code doesn't care about the monitor resolution.  There are five programmable pipeline stages in the modern APIs (six if you count compute shaders, but those can go anywhere, so I'll ignore them).  Four of those five stages neither know nor care what the monitor resolution is; they carry out the same computations regardless of the resolution.  Higher resolutions may make them process some extra data because less stuff gets culled as being obviously off the screen entirely, but at the lower resolution those stages still do far more than 1/4 of the work they would do at the higher one.  Only pixel/fragment shaders scale linearly with the monitor resolution.

    But you know what else the load on pixel/fragment shaders scales with?  Anti-aliasing.  Running 1080p at 4x SSAA is the same load as running 4K with anti-aliasing off.  So if you play games at 1080p and 4x SSAA, exactly the same hardware is capable of rendering the same game at the same frame rates at 4K with no anti-aliasing.
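    A quick way to sanity-check that equivalence is to count shaded samples per frame (a sketch; it counts only the pixel/fragment-shader work, the one stage that scales with resolution):

    ```python
    # 4x SSAA shades 4 samples per output pixel.
    shaded_1080p_4xssaa = 1920 * 1080 * 4  # 8,294,400 shaded samples
    shaded_4k_no_aa     = 3840 * 2160 * 1  # 8,294,400 shaded samples
    print(shaded_1080p_4xssaa == shaded_4k_no_aa)  # True: identical load
    ```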

  • CalmOceans Member UncommonPosts: 2,437
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is; the 4K video is heavily taxing the CPU.

  • Battlerock Member CommonPosts: 1,393
    It's time to focus on game quality rather than graphics.
  • Jemcrystal Member UncommonPosts: 1,984
    What's 4K? What does it have to do with gaming?

