
4K gaming and MMOs are pointless.


Comments

  • Kiyoris Member Rare Posts: 2,130

    I'm always amazed at how good DVD quality still looks. Physical DVD, that is.

    Yes, it is low resolution, but the fact that it is not as heavily compressed, doesn't come from an online source, and doesn't have compression artefacts makes it enjoyable to watch.

    I mean, they talk about 8K video, but I would rather watch DVD quality without artefacts than 8K with artefacts, frame drops, and stutter.
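    A rough way to see why that holds up: compare the bits each format can spend per pixel. A minimal sketch in Python, assuming a DVD's MPEG-2 ceiling of about 9.8 Mbps; the streaming bitrates are ballpark assumptions for illustration, not measured values:

        # Average encoded bits available per pixel per frame.
        def bits_per_pixel(bitrate_mbps, width, height, fps):
            return bitrate_mbps * 1_000_000 / (width * height * fps)

        sources = [
            ("DVD (MPEG-2, ~9.8 Mbps)",           9.8,  720,  480, 30),
            ("1080p stream (~8 Mbps, assumed)",   8.0, 1920, 1080, 30),
            ("4K stream (~35 Mbps, assumed)",    35.0, 3840, 2160, 30),
        ]

        for name, mbps, w, h, fps in sources:
            print(f"{name}: {bits_per_pixel(mbps, w, h, fps):.2f} bits/pixel/frame")

    The DVD comes out with several times more bits per pixel than either stream, which is why it can show fewer compression artefacts despite the low resolution.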

  • Rzep Member Uncommon Posts: 767

    I will let you guys in on a little secret: just because YouTube is popular does not mean it is good.

    On the technology side, YouTube BLOWS. If a video lags for you, don't question your i5s or even i7s or your 970s. Both your CPU and GPU are fine.

    YouTube sucks, end of discussion. Encoding sucks, uploading sucks, their servers suck.

  • bestever Member Uncommon Posts: 902
    Originally posted by Rzep

    I will let you guys in on a little secret: just because YouTube is popular does not mean it is good.

    On the technology side, YouTube BLOWS. If a video lags for you, don't question your i5s or even i7s or your 970s. Both your CPU and GPU are fine.

    YouTube sucks, end of discussion. Encoding sucks, uploading sucks, their servers suck.

    I have to agree. After testing a bunch of other videos, I found one that ran pretty well, but most are just choppy.

  • Loke666 Member Epic Posts: 21,441
    Originally posted by bestever
    Originally posted by TheGoblinKing

    No problems on any setting here, and I'm running a quad-core i7 with an Nvidia GTX 580.

    Strange, as I jumped on my desktop with my i7 and an Nvidia 680 and still have the same issue. My monitor is 2560x1080; if I drop it to 1440p it runs smooth as butter. Maybe it's my internet, which is 100 Mbps, so who knows what the deal is.

    I run 2560x1440 and I never get lag unless it is server lag. I mostly play GW2 right now, and I still run fine in DEs and PvE when some people in chat complain that they just see a slideshow, and that is with the graphics settings totally maxed out.

    My specs: AMD Phenom II X6 3.3 GHz, Nvidia GTX 780, 8 GB RAM, Windows 7.

    OP is of course right that the higher the resolution, the better the graphics card you need. And at really high resolutions you actually need a lot of memory on your GPU as well as a high clock speed on it.

    But in your case you don't have 4K, so unless your CPU is really crappy or you are seriously low on RAM, I would say it is another problem. My guess (assuming you update your drivers now and then) is that there is a bunch of crap on your computer that takes up precious resources.

    My recommendation is to either reinstall Windows and be careful with what programs you install, or do a serious cleaning, particularly of the programs that start with Windows. You can do that manually or use a good cleaner (I use YAC myself; it is free). And get rid of any malware that mooches off you; most cleaners can do that as well, or you can use a specific program made for it, like Lavasoft's Ad-Aware.

    Also, always keep at least 10% free on your C: hard drive.

    I actually recommend OP does this as well, and anyone who feels that their computer is slower than when it was new, for that matter. A clean PC is a fast one.
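    On the "programs that start with Windows" point: if you'd rather look for yourself than trust a cleaner, here is a minimal read-only sketch using Python's standard winreg module. It only lists the common registry Run keys; services, scheduled tasks, and the Startup folder are separate.

        # List programs registered to start with Windows (read-only).
        import winreg

        RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

        for hive_name, hive in [("HKCU", winreg.HKEY_CURRENT_USER),
                                ("HKLM", winreg.HKEY_LOCAL_MACHINE)]:
            try:
                with winreg.OpenKey(hive, RUN_KEY) as key:
                    num_values = winreg.QueryInfoKey(key)[1]  # count of values
                    for i in range(num_values):
                        name, command, _ = winreg.EnumValue(key, i)
                        print(f"{hive_name}: {name} -> {command}")
            except OSError:
                pass  # the key may not exist in this hive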

  • Akulas Member Rare Posts: 2,538
    Didn't run any better or worse than HD on my machine, other than it using 90% of my CPU instead of 5%, and I'm only on a GTX 660 with 4 GB RAM and an Intel Core i7-4820K @ 3.7 GHz with 8 CPUs.

    This isn't a signature, you just think it is.

  • Loke666 Member Epic Posts: 21,441
    Originally posted by Rzep

    I will let you guys in on a little secret: just because YouTube is popular does not mean it is good.

    On the technology side, YouTube BLOWS. If a video lags for you, don't question your i5s or even i7s or your 970s. Both your CPU and GPU are fine.

    YouTube sucks, end of discussion. Encoding sucks, uploading sucks, their servers suck.

    We are in agreement there. With YouTube, it is totally random what FPS you get, no matter what computer you own.

    And frankly, a 20-year-old computer can run videos fine, but when you render stuff it is very different.

    If you seriously want to test your computer, then you should benchmark it. Here is one acceptable free benchmark: https://unigine.com/products/heaven/download/

    Give us your score and I'll tell you if your computer is slow or not. :)

  • bestever Member Uncommon Posts: 902
    Originally posted by Loke666
    Originally posted by bestever
    Originally posted by TheGoblinKing

    No problems on any setting here, and I'm running a quad-core i7 with an Nvidia GTX 580.

    Strange, as I jumped on my desktop with my i7 and an Nvidia 680 and still have the same issue. My monitor is 2560x1080; if I drop it to 1440p it runs smooth as butter. Maybe it's my internet, which is 100 Mbps, so who knows what the deal is.

    I run 2560x1440 and I never get lag unless it is server lag. I mostly play GW2 right now, and I still run fine in DEs and PvE when some people in chat complain that they just see a slideshow, and that is with the graphics settings totally maxed out.

    My specs: AMD Phenom II X6 3.3 GHz, Nvidia GTX 780, 8 GB RAM, Windows 7.

    OP is of course right that the higher the resolution, the better the graphics card you need. And at really high resolutions you actually need a lot of memory on your GPU as well as a high clock speed on it.

    But in your case you don't have 4K, so unless your CPU is really crappy or you are seriously low on RAM, I would say it is another problem. My guess (assuming you update your drivers now and then) is that there is a bunch of crap on your computer that takes up precious resources.

    My recommendation is to either reinstall Windows and be careful with what programs you install, or do a serious cleaning, particularly of the programs that start with Windows. You can do that manually or use a good cleaner (I use YAC myself; it is free). And get rid of any malware that mooches off you; most cleaners can do that as well, or you can use a specific program made for it, like Lavasoft's Ad-Aware.

    Also, always keep at least 10% free on your C: hard drive.

    I actually recommend OP does this as well, and anyone who feels that their computer is slower than when it was new, for that matter. A clean PC is a fast one.

    We're talking about 4K YouTube videos, not games. I have no issues with my desktop or my Steam box. I already play games at 1440p, and some at 4K on my Steam box, since it has a GTX 980 4GB.

    It's just that the YouTube videos at 4K are really choppy.

  • bestever Member Uncommon Posts: 902
    Originally posted by emperorwings
    Didn't run any better or worse than HD on my machine, other than it using 90% of my CPU instead of 5%, and I'm only on a GTX 660 with 4 GB RAM and an Intel Core i7-4820K @ 3.7 GHz with 8 CPUs.

    Well, four cores and eight threads. Not really eight CPUs.
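    For what it's worth, the i7-4820K is 4 cores with Hyper-Threading, so Windows reports 8 logical CPUs. A quick sketch to check the split on your own machine (psutil is a third-party package):

        # Physical cores vs. logical CPUs (Hyper-Threading doubles the latter).
        import os
        import psutil  # pip install psutil

        print("logical CPUs:  ", os.cpu_count())                   # 8 on an i7-4820K
        print("physical cores:", psutil.cpu_count(logical=False))  # 4 on an i7-4820K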

  • Loke666 Member Epic Posts: 21,441
    Originally posted by bestever

    We're talking about 4K YouTube videos, not games. I have no issues with my desktop or my Steam box. I already play games at 1440p, and some at 4K on my Steam box, since it has a GTX 980 4GB.

    It's just that the YouTube videos at 4K are really choppy.

    Duh, YouTube just sucks; heck, many low-res vids lag as well. And your ISP can mess it up too.

  • Kiyoris Member Rare Posts: 2,130

    Loke, I reinstall my OS every couple of months, but I have software with limited activation keys from Corel and Dassault, so I don't want to just reinstall all the time.

    I'm probably waiting until Windows 10 now, in the fall.

  • Kiyoris Member Rare Posts: 2,130

    So, I ran a test with that Unigine benchmark.

    Going from 1024x640 to 1920x1057 is about 3.1 times as many pixels.

    That is close to the jump from 1080p to 4K, which is exactly 4 times as many pixels.

    My FPS dropped to roughly 1/3rd, so the drop was roughly proportional to the pixel count.

    So, going from 1080p to 4K... I think... should drop your FPS to about 1/4th.
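    The arithmetic behind that extrapolation, as a quick sketch (the 1/3rd FPS figure is from the test above; the starting frame rate is just an assumed example):

        # Pixel-count arithmetic behind the FPS extrapolation above.
        low  = 1024 * 640    # first test resolution
        high = 1920 * 1057   # second test resolution (windowed, hence 1057)
        print(f"pixel ratio in the test: {high / low:.2f}x")   # ~3.10x; FPS fell to ~1/3

        p1080 = 1920 * 1080
        p4k   = 3840 * 2160
        print(f"1080p -> 4K pixel ratio: {p4k / p1080:.2f}x")  # exactly 4.00x

        # If FPS keeps scaling inversely with pixel count, as the test suggests:
        fps_1080 = 60.0  # assumed starting frame rate
        print(f"extrapolated 4K FPS: {fps_1080 * p1080 / p4k:.1f}")  # 15.0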

     

  • Quizzical Member Legendary Posts: 21,287

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.  Furthermore, it's not the entire GPU that is used for video decoding, but a dedicated video decode block.  Because video decoding needs don't vary wildly and the video decode block isn't that big, it's typically exactly the same decode block for all of the cards in a generation, from the top to the bottom.  Thus, if you get a GeForce GTX 650 or a GeForce GTX 780 Ti, you're probably getting exactly the same video decode block.

    Clock speeds can affect its performance a bit, but outside of the very low end (e.g., phones and tablets), you get about the same video decode performance no matter what card you get within a generation.  Indeed, I'd expect the GTX 650 to have slightly better video decode performance than the GTX 780 Ti due to its higher clock speed, in spite of offering only about 15% as much gaming performance.  But even if that is the case, there may or may not be any way to make that performance difference matter, even in synthetic benchmarks.

    One thing about fixed function hardware blocks is that, if you give them exactly what they're expecting, performance can be awesome.  This is why it's possible to watch decent resolution videos on a cell phone that doesn't have 1% of the gaming GPU performance of a desktop gaming card.  But if you give them something slightly different from what they're expecting, they completely choke.  The video decode block in video cards is built to handle certain encodings at certain resolutions and frame rates.  If you give it one of the things it is built for, you'll get flawless video decode.  And if you give it something else, bad things can happen--ranging from degraded performance to being completely unable to use the decode block at all.
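    To put that in code: the decode block behaves roughly like a fixed lookup. Either the stream matches a profile the silicon was built for and decodes flawlessly, or everything falls back to software decode on the CPU. A minimal sketch of that decision; the capability table is invented for illustration, not any real GPU's spec sheet:

        # Toy model of a fixed-function decode block. The capability table
        # is hypothetical, not any real GPU's supported profiles.
        HW_DECODE_PROFILES = [
            # (codec, max_width, max_height, max_fps)
            ("h264", 1920, 1080, 60),
            ("h264", 3840, 2160, 30),
        ]

        def decode_path(codec, width, height, fps):
            """Return 'gpu' if the stream fits a hardware profile, else 'cpu'."""
            for c, w, h, f in HW_DECODE_PROFILES:
                if codec == c and width <= w and height <= h and fps <= f:
                    return "gpu"  # flawless fixed-function decode
            return "cpu"          # software fallback: heavy CPU load

        print(decode_path("h264", 1920, 1080, 30))  # gpu
        print(decode_path("vp9",  3840, 2160, 30))  # cpu: a codec the block wasn't built for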

    -----

    Increasing monitor resolution so that you have to render more pixels per frame does greatly increase the load on video cards.  But if you're just displaying the desktop, this means it goes from inconsequential to several times inconsequential, which is still pretty inconsequential.  The load in games varies wildly, too.  Anything that is playable on a lower end card at 1080p should be playable at the same settings (except the resolution) on a high end card with four times the performance at 4K.

    But quadrupling the number of pixels to draw tends not to quadruple the load on hardware.  Most CPU side code doesn't care about the monitor resolution.  The only real exceptions are the bits of code that determine or let you change the resolution and some culling code to skip drawing things that are known to be off the screen in the current frame.

    Even a lot of GPU code doesn't care about the monitor resolution.  There are five programmable pipeline stages in the modern APIs (six if you count compute shaders, but those can go anywhere, so I'll ignore them).  Four of those five stages neither know nor care what the monitor resolution is; they carry out the same computations regardless of the resolution.  Higher resolutions may make them process some extra data because less stuff gets culled as obviously off-screen, but at the lower resolution those stages still do vastly more than 1/4 of the work they would do at the higher one.  Only pixel/fragment shaders scale linearly with the monitor resolution.

    But you know what else the load on pixel/fragment shaders scales with?  Anti-aliasing.  Running 1080p at 4x SSAA is the same load as running 4K with anti-aliasing off.  So if you play games at 1080p and 4x SSAA, exactly the same hardware is capable of rendering the same game at the same frame rates at 4K with no anti-aliasing.
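    The pixel counts bear that out; a two-line check (pure arithmetic, nothing assumed):

        # 1080p with 4x supersampling shades exactly as many pixels as 4K with no AA.
        print(1920 * 1080 * 4)                  # 8294400 shaded pixels per frame
        print(3840 * 2160)                      # 8294400, an identical fragment load
        print(1920 * 1080 * 4 == 3840 * 2160)   # True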

  • CalmOceans Member Uncommon Posts: 2,437
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is; the 4K video is heavily taxing the CPU.

  • Battlerock Member Common Posts: 1,393
    It's time to focus on game quality rather than graphics.
  • Jemcrystal Member Uncommon Posts: 1,920
    What's 4K? What does it have to do with gaming?


  • dreamscaper Member Uncommon Posts: 1,592
    4K video rendering is going to be fairly poor unless you're using something properly coded to take advantage of the GPU. Otherwise it's like trying to slice bread with a spoon.

    <3

  • Quizzical Member Legendary Posts: 21,287
    Originally posted by CalmOceans
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is; the 4K video is heavily taxing the CPU.

    Only if the video decode block on the GPU can't handle it.  Which, if he's having problems, it probably can't.  The solution is doing 4K video decode in a way that the GPU can handle it.  That can mean different codecs, different frame rates, and so forth.  It can also mean getting a newer GPU.  I'd be mildly surprised if certain recent GPUs can't do 4K video decode, and very surprised if upcoming ones from AMD and Nvidia can't do it.  But until the industry settles on which encodings and such everyone is going to use (which may have already happened, though I don't know if it has), 4K video decode will be hit and miss as you can't build a video decode block to handle everything that anyone could possibly try.

    Of course, if you're using an older GPU such as Fermi or Northern Islands, then it's extremely unlikely that it can decode 4K video at all.  I'm not sure when AMD and Nvidia started adding 4K video support, though it's likely that it was around the time they started adding 4K monitor support.

  • CalmOceans Member Uncommon Posts: 2,437
    Originally posted by Quizzical
    Originally posted by CalmOceans
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is; the 4K video is heavily taxing the CPU.

    Only if the video decode block on the GPU can't handle it.

    It's going to tax every single CPU. Do the test: run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4K, press CTRL-ALT-DEL, and check your CPU. It's going to be taxed heavily.

    That's the decoding running on the CPU.

    People severely overestimate what is being hardware accelerated; it's less than you think.
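    If you want a number instead of eyeballing Task Manager, here is a minimal sketch that samples overall CPU load while the video plays (psutil is a third-party package; start playback first, then run it):

        # Sample overall CPU utilization once a second for 30 seconds.
        import psutil  # pip install psutil

        samples = [psutil.cpu_percent(interval=1) for _ in range(30)]
        print(f"avg CPU: {sum(samples) / len(samples):.1f}%  peak: {max(samples):.1f}%")
        # Software (CPU) decode of 4K video typically pushes these numbers way up;
        # working hardware decode keeps them low.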

  • krondin Member Uncommon Posts: 106

     < Bad Comedy Day Mwhahaha >

    4K is pointless? You are so RIGHT!

    Just keep on making riding straps; no one is really going to give up a perfectly good horse to ride in those new automobiles!

  • Quizzical Member Legendary Posts: 21,287
    Originally posted by CalmOceans
    Originally posted by Quizzical
    Originally posted by CalmOceans
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is; the 4K video is heavily taxing the CPU.

    Only if the video decode block on the GPU can't handle it.

    It's going to tax every single CPU. Do the test: run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4K, press CTRL-ALT-DEL, and check your CPU. It's going to be taxed heavily.

    That's the decoding running on the CPU.

    People severely overestimate what is being hardware accelerated; it's less than you think.

    Well of course it's not going to use the video decode block on my video card.  I bought my video card in 2009, back when 4K was a long way off and any sort of DisplayPort at all was still so new that virtually no monitors used it.

    But offloading 4K video decode to GPUs is coming, if it's not here already.  It takes time for the industry to coalesce around standards of exactly how the encoding will be done, and then after that, for GPU vendors to implement it in silicon.  I'm not sure how far along that process is, but it's going to finish eventually.

  • Quizzical Member Legendary Posts: 21,287
    Originally posted by Jemcrystal
    What's 4K? What does it have to do with gaming?

    4K is shorthand for the monitor resolution 3840x2160.
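    For anyone keeping score, the common resolutions compare like this (a quick sketch; MP is millions of pixels):

        # Common resolutions and how many pixels they push relative to 1080p.
        resolutions = {
            "720p":  (1280,  720),
            "1080p": (1920, 1080),
            "1440p": (2560, 1440),
            "4K":    (3840, 2160),
        }
        base = 1920 * 1080
        for name, (w, h) in resolutions.items():
            px = w * h
            print(f"{name}: {w}x{h} = {px / 1e6:.2f} MP ({px / base:.2f}x 1080p)")
        # "4K" refers to the roughly 4,000-pixel horizontal resolution; it also
        # happens to be exactly 4x the pixels of 1080p.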

  • CalmOceans Member Uncommon Posts: 2,437
    Originally posted by Quizzical

    It's going to tax every single CPU. Do the test: run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4K, press CTRL-ALT-DEL, and check your CPU. It's going to be taxed heavily.

    That's the decoding running on the CPU.

    People severely overestimate what is being hardware accelerated; it's less than you think.

    Well of course it's not going to use the video decode block on my video card.  I bought my video card in 2009, back when 4K was a long way off and any sort of DisplayPort at all was still so new that virtually no monitors used it.

    It's going to tax your CPU regardless of what GPU you have. It has nothing to do with your GPU being old.

    Everyone who tests this video at 4K will see their CPU being taxed the minute they click play, regardless of what GPU they have.

  • kujii Member Uncommon Posts: 190
    How something runs on YouTube has nothing to do with how a game will run. YouTube is slow even at 1080p. 4K on Vimeo runs much better than YouTube.
  • Axehilt Member Rare Posts: 10,504
    Originally posted by Kiyoris

    So, I ran a test with that Unigine benchmark.

    Going from 1024x640 to 1920x1057 is about 3.1 times as many pixels.

    That is close to the jump from 1080p to 4K, which is exactly 4 times as many pixels.

    My FPS dropped to roughly 1/3rd, so the drop was roughly proportional to the pixel count.

    So, going from 1080p to 4K... I think... should drop your FPS to about 1/4th.

    I got 66.6 fps on the 1920x1080 fullscreen option.  Running fullscreen might get you quite a bit higher FPS.

    Calling 4K gaming "pointless" is silly.  Technology is always improving, and not everyone has a weak computer. Gaming typically doesn't push the boundaries that hard (you ideally want to run smoothly on at least 80% of your target audience's computers, and that includes some pretty weak computers), but that doesn't mean games shouldn't support the players who are at the higher end.

    "What is truly revealing is his implication that believing something to be true is the same as it being true. [continue]" -John Oliver

  • Bascola Member Uncommon Posts: 425
    Originally posted by Kiyoris

    Eeeeeh, I think so. So, I was trying some 4K video on YouTube, and it lags like crazy on both of my PCs. They're very fast, one with an i5, another with an i7.

    None of the browsers seem to use the GPU to render, but still, state-of-the-art CPUs are not even fast enough to render 4K video. It shows how much extra power you would need to go from 1080p to 4K.

    4K requires rendering 4 times as many pixels, so to get the same speed as 1080p you would need a GPU 4 times as fast, RAM 4 times as fast to load 4 times as much texture data, and 4 times as much bandwidth.

    If your PC now does 120 FPS @ 1080p... it will do 30 FPS @ 4K... at best, and probably less, because you will not even have the bandwidth for it. No sane person is going to accept such a hit in framerate.

    It's neeever gonna happen. At least not in the first couple of years. No one is going to accept a quarter of the framerate for a few more pixels.

    My current rig is as follows:

    • EVGA GTX 970 ACX 2.0 SSC
    • Intel i7-4790K
    • ASUS Z97 Pro Gamer with 32 GB RAM
    • Asus PB287Q LED 4K monitor

    The latest game I tested was Dying Light: Very High, 30-50 FPS.

    I run most games except Metro: LL at 40-60+ FPS. 4K gaming is absolutely beautiful, and if you have not seen it on a real 4K monitor you have no idea. Watching a 4K video on a 1080p screen is a joke.

    Dragon Age: Inquisition looks absolutely stunning, and so do FFXIV, Far Cry 4, and AC: Unity. Shadow of Mordor will blow your mind. Even older games look fantastic in 4K.

    You can turn off all the shitty AA because you simply don't need it any more. The games look so much crisper and better with AA off at 4K.
