
4k gaming and MMO are pointless.

135 Comments

  • Boneserino (Member Uncommon, Posts: 1,768)
    Originally posted by Bascola
    Originally posted by dreamscaper
    Originally posted by Boneserino

    The point is, why waste processing power on what is basically an infinitesimal, if not negligible, improvement in graphic quality.

     

    ( This is obviously based on the size of your screen and the distance away that you sit.  Even then the difference will not be large.  The largest improvement will be sitting close to a very big screen.  If this is you then go for it! )

     

    From what I've seen in person, the difference between 4k and 1080p is about the same as the difference between 720p and 1080p. No, it's not huge, but when your monitor is only about 3'-4' from your face, which is the case for anyone sitting at a computer desk, the difference is quite noticeable.

    Absolutely not. You really need to go and see a game or video in 4K compared with one in 1K, and you will see that the difference is much bigger than from 720 to 1080. You are able to read signs far off in the distance in 4K, while in 1K they are just a blob of pixels.

    It's almost impossible to illustrate for a 1K monitor user, but this is a comparison of a distant object in 4K and 1K:

    Original 1K screenshot:

    The highlighted area in 1K and, beside it, in 4K:

    1K:    4K: 

     

    Good article here:   http://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/

     

    It's obviously better in the right situations.  Just not that many situations.  It's another shiny new toy for the TV makers to take more of your money with.  Like the article states, it's coming and it will be a standard.  You just might want to wait before jumping in, to save some money.

     

     

    FFA Nonconsensual Full Loot PvP ... You know you want it!!

  • Bascola (Member Uncommon, Posts: 425)
    Originally posted by Boneserino
    Originally posted by Bascola
    Originally posted by dreamscaper
    Originally posted by Boneserino

    The point is, why waste processing power on what is basically an infinitesimal, if not negligible, improvement in graphic quality.

     

    ( This is obviously based on the size of your screen and the distance away that you sit.  Even then the difference will not be large.  The largest improvement will be sitting close to a very big screen.  If this is you then go for it! )

     

    From what I've seen in person, the difference between 4k and 1080p is about the same as the difference between 720p and 1080p. No, it's not huge, but when your monitor is only about 3'-4' from your face, which is the case for anyone sitting at a computer desk, the difference is quite noticeable.

    Absolutely not. You really need to go and see a game or video in 4K compared with one in 1K, and you will see that the difference is much bigger than from 720 to 1080. You are able to read signs far off in the distance in 4K, while in 1K they are just a blob of pixels.

    It's almost impossible to illustrate for a 1K monitor user, but this is a comparison of a distant object in 4K and 1K:

    Original 1K screenshot:

    The highlighted area in 1K and, beside it, in 4K:

    1K:    4K: 

     

    Good article here:   http://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/

     

    It's obviously better in the right situations.  Just not that many situations.  It's another shiny new toy for the TV makers to take more of your money with.  Like the article states, it's coming and it will be a standard.  You just might want to wait before jumping in, to save some money.

     

    I sit about 3 feet away from my monitor and I can see a big difference between 1K and 4K. That article is talking about huge TV screens and very long viewing distances. The difference in a normal PC working/gaming environment is huge compared to a living room setup.

  • Akulas (Member Rare, Posts: 3,004)
    Yeah, I'm a nub with computers and was going off the dxdiag program. I assume it said 8 instead of 4 because each core can run 2 threads instead of 1 (hyper-threading), so it acts like 8 cores. So, 4 CPU cores would be correct based on that.

    This isn't a signature, you just think it is.
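
    (Aside: a small Python sketch, assuming the third-party psutil package is installed, that shows the physical-versus-logical core split Akulas is describing; tools like dxdiag report the logical count.)

        import psutil  # third-party package: pip install psutil

        physical = psutil.cpu_count(logical=False)  # physical cores
        logical = psutil.cpu_count(logical=True)    # hardware threads (what dxdiag-style tools count)
        print(f"{physical} physical cores, {logical} logical processors")
        # On a 4-core CPU with Hyper-Threading this prints "4 physical cores, 8 logical processors".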

  • Quizzical (Member Legendary, Posts: 25,347)
    Originally posted by Boneserino

    Good article here:   http://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/

     

    It's obviously better in the right situations.  Just not that many situations.  It's another shiny new toy for the TV makers to take more of your money with.  Like the article states, it's coming and it will be a standard.  You just might want to wait before jumping in, to save some money.

    On whatever your current gaming rig is, try turning anti-aliasing off and on.  See if you can tell the difference--and if having anti-aliasing on looks better.  If anti-aliasing makes your game look better, then a higher monitor resolution would make it look better yet.  After all, the entire point of anti-aliasing is to partially (but not entirely!) compensate for the monitor resolution not being as high as would be beneficial.

    I'm not sure what the largest resolution that people would benefit from is, but it's a whole lot higher than 4K.  The human eye has about 125 million rods and cones to detect light at particular points, which could be read as pointing toward 125 million pixels as being the maximum useful.  But they're not distributed uniformly; they're far more densely directed toward whatever you're focusing on than things far off in your peripheral vision.
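
    (Aside: the raw pixel counts behind that point, as a quick Python sketch. The ~125 million figure is the rod/cone count cited above, and, as the post says, receptor count is not the same thing as usable pixels, since the receptors are not spread uniformly.)

        resolutions = {
            "720p":  (1280, 720),
            "1080p": (1920, 1080),
            "4K":    (3840, 2160),
            "8K":    (7680, 4320),
        }
        EYE_RECEPTORS = 125_000_000  # approximate rods + cones in one eye, per the post above

        for name, (w, h) in resolutions.items():
            pixels = w * h
            print(f"{name:>5}: {pixels:>11,} pixels ({pixels / EYE_RECEPTORS:.1%} of ~125M receptors)")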

  • Bascola (Member Uncommon, Posts: 425)
    Originally posted by Quizzical
    Originally posted by Boneserino

    Good article here:   http://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/

    It's obviously better in the right situations.  Just not that many situations.  It's another shiny new toy for the TV makers to take more of your money with.  Like the article states, it's coming and it will be a standard.  You just might want to wait before jumping in, to save some money.

    On whatever your current gaming rig is, try turning anti-aliasing off and on.  See if you can tell the difference--and if having anti-aliasing on looks better.  If anti-aliasing makes your game look better, then a higher monitor resolution would make it look better yet.  After all, the entire point of anti-aliasing is to partially (but not entirely!) compensate for the monitor resolution not being as high as would be beneficial.

    I'm not sure what the largest resolution that people would benefit from is, but it's a whole lot higher than 4K.  The human eye has about 125 million rods and cones to detect light at particular points, which could be read as pointing toward 125 million pixels as being the maximum useful.  But they're not distributed uniformly; they're far more densely directed toward whatever you're focusing on than things far off in your peripheral vision.

    I turn AA off when I play my games in 4K. Not only does it cost a lot of GPU cycles, it also is not needed at this resolution. At the distance I am sitting, I can see aliasing only in very rare cases where there is an extremely high-contrast area. Even then it is so small that you do not notice it if you are not looking for it.

  • H0urg1ass (Member Epic, Posts: 2,380)

    If I need to turn AA off while using three monitors on current generation graphics hardware, then the technology just isn't ready for gaming yet.

    I'm what I like to call a "middle adopter". I never snatch up new technology while it's still in its infancy, and I don't wait until it's at the end of its lifespan either.

    In other words, I see myself picking up three 4k monitors in about 2-3 years when video cards have caught up to the demands of it and the monitors themselves have been tried and tested.

    I mean ffs, I'm just NOW, with current generation graphics cards, able to play Skyrim on three 24" monitors with Ugrid set to 7, nearly 5GB of texture mods and full-on ENB.  "Upgrading" to 4k monitors would put me back at square one.  No thanks, I'm enjoying what I have now.

  • CalmOceans (Member Uncommon, Posts: 2,437)
    Originally posted by Bascola

     

    The highlighted area in 1K and, beside it, in 4K:

    1K:    4K: 

     

    Zooming in on objects is a pointless argument. You can argue you need 8k and 16k and 200000k...as long as you keep zooming in.

    The argument makes no sense.

     

    Here is a video of ppl watching 1080P vs 4K.

    Same content. Same brand. Same environment.

    *Most people had to guess what they were watching

    *Most people didn't think the difference was significant

    *Most people didn't think spending the money on 4k would be worth it

    https://www.youtube.com/watch?v=pzw1D9dU6ts

     

  • CalmOceans (Member Uncommon, Posts: 2,437)
    Originally posted by Boneserino

    The point is, why waste processing power on what is basically an infinitesimal, if not negligible, improvement in graphic quality.

    Exactly, you're trading a massive amount of power for a pixel density advantage that is insignificant in most situations.

    I'll take smooth FPS and more detail in my games over a few more pixels.

    I also don't like the idea of stressing my computer and heating it up for an insignificant advantage. The higher the temperature of your CPU and GPU, the more prone they are to failing, especially when gates are only a few nm wide. It's just stupid to stress your machine like that if you don't gain a significant advantage, and with 4K you don't.

    The tradeoff you have to make with 4k makes absolutely no sense.

     

  • nariusseldon (Member Epic, Posts: 27,775)
    Originally posted by CalmOceans

    Here is a video of ppl watching 1080P vs 4K.

    Same content. Same brand. Same environment.

    *Most people had to guess what they were watching

    *Most people didn't think the difference was significant

    *Most people didn't think spending the money on 4k would be worth it

    https://www.youtube.com/watch?v=pzw1D9dU6ts

     

    I am not surprised.

    Personally, I have seen a 4K TV in a showroom, and I couldn't tell the difference unless I was about 5 inches away from the screen. It is a lot of marketing with very little real benefit, at least to me. So I am not paying good money for it.

     

  • Quizzical (Member Legendary, Posts: 25,347)
    Originally posted by Bascola
    Originally posted by Quizzical
    Originally posted by Boneserino

    Good article here:   http://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/

    It's obviously better in the right situations.  Just not that many situations.  It's another shiny new toy for the TV makers to take more of your money with.  Like the article states, it's coming and it will be a standard.  You just might want to wait before jumping in, to save some money.

    On whatever your current gaming rig is, try turning anti-aliasing off and on.  See if you can tell the difference--and if having anti-aliasing on looks better.  If anti-aliasing makes your game look better, then a higher monitor resolution would make it look better yet.  After all, the entire point of anti-aliasing is to partially (but not entirely!) compensate for the monitor resolution not being as high as would be beneficial.

    I'm not sure what the largest resolution that people would benefit from is, but it's a whole lot higher than 4K.  The human eye has about 125 million rods and cones to detect light at particular points, which could be read as pointing toward 125 million pixels as being the maximum useful.  But they're not distributed uniformly; they're far more densely directed toward whatever you're focusing on than things far off in your peripheral vision.

    I turn AA off when I play my games in 4K. Not only does it cost a lot of GPU cycles, it also is not needed at this resolution. At the distance I am sitting, I can see aliasing only in very rare cases where there is an extremely high-contrast area. Even then it is so small that you do not notice it if you are not looking for it.

    I'm not arguing against your approach.  My argument is that if anti-aliasing makes a game look better to you, then a higher monitor resolution with sufficient GPU power to drive it would look better yet.  And if the only difference you can tell from anti-aliasing is lower frame rates, then your monitor resolution is high enough and there's no need to go with anything higher.

  • Bascola (Member Uncommon, Posts: 425)
    Originally posted by CalmOceans
    Originally posted by Bascola

     

    The highlighted area in 1K and, beside it, in 4K:

    1K:    4K: 

     

    Zooming in on objects is a pointless argument. You can argue you need 8k and 16k and 200000k...as long as you keep zooming in.

    The argument makes no sense.

    I never said anything about zooming in; I said you can read the signs in the distance in 4K while you can't in 1K. WITHOUT ZOOMING. The argument is the 4x higher detail.

    Here is a video of ppl watching 1080P vs 4K.

    Same content. Same brand. Same environment.

    *Most people had to guess what they were watching

    *Most people didn't think the difference was significant

    *Most people didn't think spending the money on 4k would be worth it

    https://www.youtube.com/watch?v=pzw1D9dU6ts

     

    They watched the Spider-Man Blu-ray in 1K, LMAO. It was mastered in 4K but it is not 4K. What is really pointless is watching a 1K Blu-ray on a 4K display. The movie is not encoded in 4K, so how would you see a difference?

     

    This is a gaming site, who cares about movies and videos.

    You've obviously never seen a game in 4K. The difference is anything but small.

  • CalmOceans (Member Uncommon, Posts: 2,437)
    Originally posted by Bascola

    1K:    4K: 

     

    WITHOUT ZOOMING. The argument is the 4x higher detail.

    Don't lie, I'm not stupid.

    This is the actual pixel size of that part of your image cropped.

    This is what people would see on their screen. Not the picture you showed zoomed in 4 times.

    A lot smaller than the version you zoomed in on.

    I wasn't born yesterday.

    Your argument is completely flawed.

    You get pixelation because you increase the actual size of your image until it pixelates.

     

    You can use the same argument to prove you need 8k or 16k or 32k or 100000000000k.

    It's a completely false and dishonest argument.

    You could zoom in on an ant in the picture, increase the size, and make the argument you need a 10000000000000k screen.

     

    To prove I cropped it:

     

  • CalmOceans (Member Uncommon, Posts: 2,437)
    Originally posted by Bascola

    This is a gaming site, who cares about movies and videos.

    You've obviously never seen a game in 4K. The difference is anything but small.

    There's no reason to assume if people can't tell the difference in movies, they could see the difference in games.

    You could make the argument that gamers sit closer to their screen, but I doubt that difference would be enough to tip the balance.

  • Bascola (Member Uncommon, Posts: 425)
    Originally posted by CalmOceans
    Originally posted by Bascola

    1K:    4K: 

     

    WITHOUT ZOOMING. The argument is the 4x higher detail.

    Don't lie, I'm not stupid.

    This is the actual size of that part of your image cropped.

    A lot smaller than the version you zoomed in on.

    I wasn't born yesterday.

    You get pixelation because you increase the actual crop.

    If you look at them on a 1K and a 4K monitor of the exact same size (let's say 28"), then they take up the exact same amount of room on both monitors (let's say a 2x3" rectangle); it's just that the 1K version is pixelated while the 4K one is not. That is exactly what I showed with the comparison. You just don't get it or you are being stubborn.

    Even if you don't resize, it's pretty clear which one has more detail, which was all I wanted to show.
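
    (Aside: a rough Python sketch of the pixel-density numbers behind that 28" example; the screen size is just the one used above for illustration.)

        import math

        def ppi(width_px, height_px, diagonal_in):
            """Pixels per inch along the panel diagonal."""
            return math.hypot(width_px, height_px) / diagonal_in

        for label, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
            print(f'28" {label}: {ppi(w, h, 28):.0f} PPI')

        # Prints roughly 79 PPI versus 157 PPI: double the linear density, so a sign
        # covering the same 2x3" patch of screen is drawn with four times the pixels.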

  • Bascola (Member Uncommon, Posts: 425)
    Originally posted by CalmOceans
    Originally posted by Bascola

    This is a gaming site, who cares about movies and videos.

    You've obviously never seen a game in 4K. The difference is anything but small.

    There's no reason to assume if people can't tell the difference in movies, they could see the difference in games.

    You could make the argument that gamers sit closer to their screen, but I doubt that difference would be enough to tip the balance.

    Wow, that has to be the most ignorant statement ever made. The movie was a 1080p movie, so of course they can't see a difference.

    You have not even seen a game in 4K, but you come here arguing nonsense. Please go buy yourself a 4K rig or go to a store and take a look.

    You need to stop talking about things you have no idea about.

  • CalmOceans (Member Uncommon, Posts: 2,437)
    Originally posted by Bascola

     You just don't get it or you are being stubborn.

     

    I understand just fine what you did.

    You took a crop of your picture and increased its size through the MMORPG.COM pop-up until you could see pixelation.

    Your actual picture is much smaller.

    It's a dishonest way to try to prove that pixelation occurs at 1080p. YOU caused the pixelation, and you know very well that you did.

  • Bascola (Member Uncommon, Posts: 425)
    Originally posted by CalmOceans
    Originally posted by Bascola

     You just don't get it or you are being stubborn.

     

    I understand just fine what you did.

    You took a crop of your picture and increased its size through the MMORPG.COM pop-up until you could see pixelation.

    Your actual picture is much smaller.

    It's a dishonest way to try to prove that pixelation occurs at 1080p. YOU caused the pixelation, and you know very well that you did.

    Sorry, but that is how many pixels you have in a 1K picture; call it pixelation or whatever you want. It does not change the fact that that is the amount of DATA available at 1K.

    To compare them, THEY HAVE TO BE THE SAME SIZE, or the comparison is pointless. But since you are obviously not able to grasp a simple concept like this, there is no point in talking to you any more.

    Have a nice day.

  • nariusseldon (Member Epic, Posts: 27,775)
    Originally posted by Bascola
     

    This is a gaming site, who cares about movies and videos.

     

    I do. Gamers don't watch movies and videos?

  • Quirhid (Member Uncommon, Posts: 6,230)
    Originally posted by CalmOceans
    Originally posted by Bascola

    This is a gaming site, who cares about movies and videos.

    You've obviously never seen a game in 4K. The difference is anything but small.

    There's no reason to assume if people can't tell the difference in movies, they could see the difference in games.

    You could make the argument that gamers sit closer to their screen, but I doubt that difference would be enough to tip the balance.

    I have a 2.5k screen on my phone and I can see the difference. If you can't tell the difference when moving up to 4k you're just not paying attention.

    I skate to where the puck is going to be, not where it has been -Wayne Gretzky

  • fivoroth (Member Uncommon, Posts: 3,916)
    Originally posted by Kiyoris

    eeeeeh,

    I think so. So, I was trying some video on YouTube that is 4K, and it lags like crazy on both of my PCs. They're very fast, one with an i5, another with an i7.

    None of the browsers seem to use the GPU to render, but still, state-of-the-art CPUs are not even fast enough to render 4K video. It shows how much extra power you would need to go from 1080p to 4K.

     

    4K would require rendering 4 times as many pixels. To get the same speed as 1080p, you would need a GPU 4 times as fast, RAM 4 times as fast to load 4 times as much texture data, and 4 times as much bandwidth.

    If your PC now does 120 FPS @ 1080p, it will do 30 FPS @ 4K at best, probably less because you will not even have the bandwidth for it. No sane person is going to accept such a hit in frame rate.

    It's neeever gonna happen. At least not in the first couple of years. No one is going to accept 4 times less frame rate for a few more pixels.

    I think 4K gaming is meant for the highest-end CPUs and GPUs. An i5 is completely out of the question. You would probably need the best i7 and the best that Nvidia can offer, but that would be ridiculously expensive.

    It's funny how you say that no one is going to accept 4 times less frame rate for "a few more" pixels, yet in your previous paragraph you mention that it's 4 times more pixels. 4 times more pixels is certainly not a few more pixels; it's 4 times the detail of 1080p.

    A fun fact: there are phones which can shoot 4K and play it back. True, they only have a 1080p display, but they play back 4K just fine.

    Mission in life: Vanquish all MMORPG.com trolls - especially TESO, WOW and GW2 trolls.
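
    (Aside: the "4 times the pixels" arithmetic from the exchange above, as a tiny Python sketch. The frame-rate line assumes a purely pixel-bound game, so treat it as a worst case rather than a prediction.)

        p1080 = 1920 * 1080   # 2,073,600 pixels
        p4k = 3840 * 2160     # 8,294,400 pixels
        ratio = p4k / p1080   # 4.0

        fps_1080 = 120        # example frame rate from the quoted post
        print(f"4K renders {ratio:.0f}x the pixels of 1080p")
        print(f"{fps_1080} FPS at 1080p -> about {fps_1080 / ratio:.0f} FPS at 4K if purely pixel-bound")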

  • Bascola (Member Uncommon, Posts: 425)
    Originally posted by Quirhid
    Originally posted by CalmOceans
    Originally posted by Bascola

    This is a gaming site, who cares about movies and videos.

    You've obviously never seen a game in 4K. The difference is anything but small.

    There's no reason to assume if people can't tell the difference in movies, they could see the difference in games.

    You could make the argument that gamers sit closer to their screen, but I doubt that difference would be enough to tip the balance.

    I have a 2.5k screen on my phone and I can see the difference. If you can't tell the difference when moving up to 4k you're just not paying attention.

    Ignore CalmOceans; he/she has obviously never seen anything higher than 1K. The difference is so dramatic that it's a completely new experience, at least it was for me. Dragon Age: Inquisition is mind-blowing in detail in 4K.

  • timtrack (Member Uncommon, Posts: 541)
    Originally posted by Kiyoris
    Originally posted by VastoHorde
    Originally posted by CalmOceans
    Originally posted by Quizzical
    Originally posted by CalmOceans
    Originally posted by Quizzical

    You are confused in many ways.

    First, video decoding is generally done on the GPU, not the CPU.  The CPU load should typically be minimal for video decoding.

    You seem more confused than he is; the 4K video is heavily taxing the CPU.

    Only if the video decode block on the GPU can't handle it.

    It's going to tax every single CPU. Do the test: run https://www.youtube.com/watch?v=Zk9J5xnTVMA in 4K, press CTRL-ALT-DEL, and check your CPU. It's going to be taxed heavily.

    That's the decoding on the CPU.

    People severely overestimate what is being hardware accelerated; it's less than you think.

     

    CPU 50%-60% at 4K

    CPU 12%-20% at 1080P

    Tested the video in Chrome running an AMD FX-8350 and a Radeon HD 7970 GHz Edition.

    Same for me, 4K puts a massive amount of load on my CPU, and my GPU is brand new.

    Not sure why the guy called me "confused"; when my CPU is loaded at 60%, it's using the CPU. It can't be simpler than that.

    Idle: 4% CPU, 0% GPU.

    Playing the video in 4k: 10% CPU, 44% GPU.

    Early 2013 MacBook Pro.
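
    (Aside: a minimal Python sketch of the "play the video and watch the CPU" test being done by hand above, assuming the third-party psutil package is installed. Start the 4K video in a browser first, then run this, and compare against a run taken while the machine is idle.)

        import psutil  # third-party package: pip install psutil

        SAMPLE_SECONDS = 30
        samples = [psutil.cpu_percent(interval=1) for _ in range(SAMPLE_SECONDS)]  # whole-system CPU %

        print(f"average CPU over {SAMPLE_SECONDS}s: {sum(samples) / len(samples):.1f}%")
        print(f"peak CPU: {max(samples):.1f}%")
        # Little change versus idle suggests the GPU's video-decode block is doing the work;
        # a large jump means the decode is falling back to software on the CPU.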

  • Kiyoris (Member Rare, Posts: 2,130)

    I think a lot of people are trying very hard to justify their 4k purchase lol.

    All the claims from people who say they can tell 4K from 1080p are anecdotal.

    "Oh yes, I can tell the difference"... OK... I'm sure you can.

     

    The experts are saying something else!

    They are saying you can't tell!

    Let's look at the actual evidence.

    http://carltonbale.com/does-4k-resolution-matter/

    THX and Sony support this data.

     

    Experts with measuring equipment who have studied this are saying there's no way that a human can see the difference at a normal distance.

    Some people who bought 4K are saying, anecdotally, that you can.

    I'll go with the experts lol.
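
    (Aside: viewing-distance charts like the one linked above come from this kind of calculation. A rough Python sketch, assuming ~1 arcminute of visual acuity (the usual 20/20 figure) and a 16:9 panel; real perception is messier than a single cutoff.)

        import math

        ARCMIN = math.radians(1 / 60)  # ~1 arcminute of visual acuity, in radians

        def max_useful_distance_ft(diagonal_in, horizontal_px):
            """Distance beyond which adjacent pixels can no longer be told apart."""
            width_in = diagonal_in * 16 / math.hypot(16, 9)  # width of a 16:9 panel
            pixel_in = width_in / horizontal_px
            return pixel_in / math.tan(ARCMIN) / 12

        for desc, diag in [('55" 4K TV', 55), ('28" 4K monitor', 28)]:
            print(f"{desc}: individual pixels blur together beyond ~{max_useful_distance_ft(diag, 3840):.1f} ft")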

     

  • Nanfoodle (Member Legendary, Posts: 10,610)
    There is so little content that's in 4K that owning a 4K TV or monitor is an "I had it first" badge. Until you start seeing shows, movies, and other media produced in 4K as standard, it's just not worth it. In most places 720p is only just becoming the standard, and the 1080p transition is still taking place. We are years away from 4K being standard.
  • Flyte27 (Member Rare, Posts: 4,574)
    To me 4K is not worth it at the moment. If the prices come down and they are able to improve video card performance while reducing power consumption, maybe one day it will be worth it. I'd imagine you need one super video card that costs about $1,000 or more, or an SLI/CrossFire setup that produces the same. 1080p still looks pretty darn good. It's a far cry from the days of monochrome and super low resolutions. Sometimes having better graphics actually limits an MMO. For instance, EverQuest still has more complex dungeons, zones, and buildings than most modern-day MMOs, and much of that is down to its low-quality, blocky objects. The textures are still quite nice though. They give the game a real Forgotten Realms type of feeling.