
Various questions, 1070, monitor, cpu... +rambling!

Volgore Member Epic Posts: 3,872
edited December 2016 in Hardware
Hello awesome people of the mmorpg.coms!!1

I was thinking of replacing my trusty 970 OC with a 1070. At the moment this is really just "thinking", because I'm afraid I would run into some technical nonsense and would end up buying much more than just a video card.

1.
I recently bought this new 21:9 monitor (29" 2560x1080) and it is awesome. But it's only 60Hz, unfortunately! That means it can't display more than 60fps anyway, which my 970 provides easily in about every game at 2560x1080 so far.
No use in buying a 1070 for say 100fps if my monitor won't display them, right?

2.
My next idea was to buy another new monitor (120Hz+) together with dat 1070, but I love my current aspect ratio/screen size and would want to stay with that.
It seems there is NOT A SINGLE 29" 21:9 2560x1080 monitor on the market that does more than 60Hz (75Hz with FreeSync). Can anyone confirm or deny that?
There are however 34" 2560x1080 144Hz monitors, but I read that the ppi isn't that good and the image has some "grid" to it, as it is basically a stretched version of a 29".
Right or wrong?

3.
My CPU is an i5 [email protected]. It can go even beyond that, but so far it hasn't been necessary to overclock it further. Would it bottleneck a 1070?

Thanks for your input!







Comments

  • Quizzical Member Legendary Posts: 25,348
    There will always be something that is the bottleneck, as otherwise, you'd have infinite performance.  If you've got the frame rates you want at the graphical settings you want, it doesn't particularly matter what your bottleneck is.  So I wouldn't worry about it on your CPU.

    The video card rendering at higher frame rates than the monitor can display can slightly reduce the display latency, but it's not a big deal.  Perhaps the bigger deal is that you're more protected against dips when you go into busier areas of the game.  But really, there's no need to upgrade a CPU or GPU if you're happy with your current performance.

    I'd advise against upgrading from a GTX 970 to a GTX 1070, as it's not a big enough upgrade to justify the cost.  The GTX 1070 is faster, of course, but I generally advise against upgrading a video card to anything less than double your current performance.  If the old card wasn't fast enough, something only a little faster will barely be fast enough--and you'll likely be looking to replace the new card sooner rather than later.  A GTX 1080 roughly gets you to double your current GPU, but it's more expensive.  The coming arrival of AMD Vega is likely to shake up pricing at the high end, though.

    High resolution and high refresh rates simultaneously require enormous amounts of monitor bandwidth (rough numbers are sketched at the end of this post).  I think the latest version of DisplayPort is the only monitor port on the market with the bandwidth for what you're after, while HDMI, DVI, and older versions of DisplayPort can all support your current monitor.

    I'm also not sure what a GTX 970 supports; at minimum, you should look into it before buying a new monitor unless you're set on replacing the video card anyway.  If your particular card doesn't have a DisplayPort port, you'll be dead in the water.  It probably has one, but depending on which version it supports, it may or may not be able to handle the refresh rate you want.

    The sort of monitor you're looking for is likely to support FreeSync (which is really just AMD's implementation of the industry standard adaptive sync), but that doesn't mean you have to use it.  If you build a high quality monitor with modern components, it's likely to naturally support FreeSync, so the monitor vendor might as well claim support.  That's the advantage of open standards; in contrast, to support G-sync adds about $150 to the price tag.
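    To put rough numbers on the bandwidth point above, here is a back-of-the-envelope sketch in Python. The ~20% blanking overhead and the per-port data rates are approximate assumptions for illustration, not exact figures for any specific monitor, cable, or spec revision.

```python
# Rough uncompressed video bandwidth: pixels x refresh rate x bits per pixel,
# plus an assumed ~20% overhead for blanking intervals (approximation).

def required_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking_overhead=1.2):
    """Very rough uncompressed video bandwidth in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

# Approximate usable data rates in Gbit/s (assumed round numbers, not spec quotes).
ports = {
    "HDMI 1.4": 8.2,
    "HDMI 2.0": 14.4,
    "DisplayPort 1.2": 17.3,
    "DisplayPort 1.3/1.4": 25.9,
}

modes = {
    "2560x1080 @ 60 Hz": (2560, 1080, 60),
    "2560x1080 @ 144 Hz": (2560, 1080, 144),
    "3840x2160 @ 144 Hz": (3840, 2160, 144),
}

for label, (w, h, hz) in modes.items():
    need = required_gbps(w, h, hz)
    fits = [name for name, rate in ports.items() if rate >= need] or ["none of the above"]
    print(f"{label}: ~{need:.1f} Gbit/s needed -> fits: {', '.join(fits)}")
```

    By that rough math, 2560x1080 at high refresh rates is fairly modest; it's at 4K and above that the newest DisplayPort revisions really start to matter.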
  • Ridelynn Member Epic Posts: 7,383
    I would save your money for now if I were you.
  • Cleffy Member Rare Posts: 6,412
    It usually doesn't make sense to jump up 1 generation unless you are selling your old card for a decent amount. Too much money for a minor increase in performance. At least wait 2 generations before upgrading, and even then it's better to upgrade after 3~5 generations, especially if you are buying near the top end.
  • Malabooga Member Uncommon Posts: 2,977
    Why exactly do you need > 60 FPS at all? In fact, it's much better to have 60 FPS on a 60 Hz monitor than 80 on a 120/144 Hz monitor. Unless you go FreeSync/G-Sync, but G-Sync adds another hefty $150-200 "premium" on an already expensive monitor.
  • Volgore Member Epic Posts: 3,872
    Thank you everyone for your replies so far! :)

  • suckm3 Member Uncommon Posts: 187
    Yeah, because the 1070 isn't oversized at all. I haven't found any game I could run on full details yet. Maybe in 10 years it will happen.

    “Two things are infinite: the universe and human stupidity; and I'm not sure about the universe.” - Albert Einstein

    "The ability to speak doesn't make you intelligent" - Qui-Gon Jinn. After many years of reading Internet forums, there's no doubt that neither does the ability to write.
    So if you notice that I'm no longer answering your nonsense, stop trying... because you just joined my block list.

  • Cleffy Member Rare Posts: 6,412
    You want over 60 fps for the dips. You might get 2 frames on one refresh and get small microstutter.
  • rojoArcueid Member Epic Posts: 10,722
    Cleffy said:
    You want over 60 fps for the dips. 
    that makes sense




  • Reizla Member Rare Posts: 4,092
    Volgore said:
    1.
    I recently bought this new 21:9 monitor (29" 2560x1080) and it is awesome. But it's only 60Hz, unfortunately! That means it can't display more than 60fps anyway, which my 970 provides easily in about every game at 2560x1080 so far.
    No use in buying a 1070 for say 100fps if my monitor won't display them, right?

    Biggest BS ever. Your monitor refreshes the image 60 times per second (which is the definition of Hertz), but your FPS can be a lot higher than that. With your eyes not seeing more than about 25-30 images per second, the difference between 60, 144 and 200 (or even more) does not influence the quality of what you can see. The only thing it MIGHT influence is that when your light(bulb) flickers a bit, the monitor might flicker a bit as well when you have a 60Hz monitor.

    Then the 60FPS vs 60Hz thing. Yet another plain BS claim commonly made on forums (and even here). I have three 60Hz monitors and I can get 100+ FPS in games. Here the GPU (and to a certain degree your CPU as well, since mine is somewhat bottlenecking the GPU :s ) indeed comes into play. The more powerful your GPU is, the higher your FPS might become. And with a 21:9 monitor you indeed do need a more powerful GPU than on a 16:9 monitor to get the same FPS in games.
    Whether a 1070 would make that much of a difference compared to a 970 OC, I can't tell. From my older experience, I can say that a nicely OC'd 660 competed very nicely with a 760, but that's already half a decade ago and I don't know how much has changed in the architecture of the 970/1070 that'd influence the performance.
  • Vrika Member Legendary Posts: 7,888
    edited December 2016
    Volgore said:
    There are however 34" 2560x1080 144Hz monitors, but I read that the ppi isn't that good and the image has some "grid" to it
    It depends on how close to the monitor you want to be. If you're half a meter from a 34 inch monitor then yes, you will see pixels. On the other hand, if you're 5 meters away then you won't.

    I'd think twice before buying a 34" monitor for a computer because of the viewing distance. 34" is already such a large monitor that it's hard to position it at a comfortable distance (far enough to see the whole screen at once comfortably) on a normal computer desk.
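    To put rough numbers on the pixel-visibility question, here is a quick Python sketch. The "about 60 pixels per degree for 20/20 vision" threshold is a common rule of thumb I'm assuming here, not a hard limit, and the viewing distances are just example values.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(ppi_value, distance_in):
    """Pixels spanned by one degree of visual angle at a given viewing distance."""
    # One degree of visual angle covers roughly 2 * d * tan(0.5 deg) inches at distance d.
    return ppi_value * 2 * distance_in * math.tan(math.radians(0.5))

for diag in (29, 34):                      # the two 2560x1080 ultrawide sizes discussed
    density = ppi(2560, 1080, diag)
    for dist_in in (20, 28, 40):           # roughly 0.5 m, 0.7 m, 1.0 m
        ppd = pixels_per_degree(density, dist_in)
        # ~60 px/deg is a common 20/20-vision rule of thumb (assumption).
        verdict = "pixel grid hard to see" if ppd >= 60 else "pixel grid may be visible"
        print(f'{diag}" 2560x1080 ({density:.0f} ppi) at {dist_in} in: {ppd:.0f} px/deg -> {verdict}')
```

    By those rough numbers the 34" panel is only about 15-20% coarser than the 29", so the bigger question is how far back you can actually sit, which is the point about desk space above.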
     
  • Quizzical Member Legendary Posts: 25,348
    Reizla said:
    Volgore said:
    1.
    I recently bought this new 21:9 monitor (29" 2560x1080) and it is awesome. But it's only 60Hz, unfortunately! That means it can't display more than 60fps anyway, which my 970 provides easily in about every game at 2560x1080 so far.
    No use in buying a 1070 for say 100fps if my monitor won't display them, right?

    Biggest BS ever. Your monitor refreshes the image 60 times per second (which is the definition of Hertz), but your FPS can be a lot higher than that. With your eyes not seeing more than about 25-30 images per second, the difference between 60, 144 and 200 (or even more) does not influence the quality of what you can see. The only thing it MIGHT influence is that when your light(bulb) flickers a bit, the monitor might flicker a bit as well when you have a 60Hz monitor.

    Then the 60FPS vs 60Hz thing. Yet another plain BS claim commonly made on forums (and even here). I have three 60Hz monitors and I can get 100+ FPS in games. Here the GPU (and to a certain degree your CPU as well, since mine is somewhat bottlenecking the GPU :s ) indeed comes into play. The more powerful your GPU is, the higher your FPS might become. And with a 21:9 monitor you indeed do need a more powerful GPU than on a 16:9 monitor to get the same FPS in games.
    Whether a 1070 would make that much of a difference compared to a 970 OC, I can't tell. From my older experience, I can say that a nicely OC'd 660 competed very nicely with a 760, but that's already half a decade ago and I don't know how much has changed in the architecture of the 970/1070 that'd influence the performance.
    You are mistaken, as you confuse the threshold to interpret pictures as motion with the fastest that your eyes can see.  It typically takes about 20 frames per second for your brain to interpret it as motion rather than independent pictures, but that depends some on what the motion is.  Five frames per second looks like motion for a single object moving very slowly, while 50 frames per second won't look like motion for an object that darts across the screen so fast as to only be visible for two frames.

    I'm not aware of any evidence that the human eyes naturally see discrete frames rather than something more continuous.  One experiment with pilots found that with the image of a plane on the screen for only 1/220 of a second, they were able to not just see when a plane appeared, but identify the plane that was shown--which proved that they saw it clearly and weren't just guessing.

    Quite apart from what you can see, higher frame rates mean lower latency.  If you get 20 frames per second, then on average, you're seeing a frame that was first shown 25 ms ago, and that frame had to capture the state of the game 50 ms before it finished.  If you get 100 frames per second (and your monitor can show them!), on average, you see a frame that was first shown 5 ms ago, and draws the state of the game as of 10 ms before it finished.  There are some other factors in display latency that won't depend on your frame rate, but it's still a 60 ms difference in display latency, and that's a huge deal, even if 20 frames per second looks like motion to you.
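    The arithmetic in that last paragraph is easy to reproduce. Here is a small Python sketch of just the frame-rate-dependent part described above (half a frame interval since the current frame went up, plus a full interval between capturing the game state and finishing the frame); the other fixed sources of latency mentioned are deliberately ignored.

```python
def frame_rate_latency_ms(fps):
    """Average frame-rate-dependent display latency, per the reasoning above:
    half a frame interval (average age of the frame currently on screen)
    plus one full interval (from capturing game state to finishing the frame).
    Fixed sources of latency (monitor processing, input lag, etc.) are ignored."""
    interval_ms = 1000.0 / fps
    return 0.5 * interval_ms + interval_ms   # i.e. 1.5 frame intervals

for fps in (20, 60, 100, 144):
    print(f"{fps:>3} fps -> ~{frame_rate_latency_ms(fps):.0f} ms of frame-rate-dependent latency")

diff = frame_rate_latency_ms(20) - frame_rate_latency_ms(100)
print(f"Going from 20 fps to 100 fps saves ~{diff:.0f} ms")
```

    That matches the 25 + 50 ms and 5 + 10 ms breakdowns above and the 60 ms difference between them.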
  • Quizzical Member Legendary Posts: 25,348
    Vrika said:
    Volgore said:
    There are however 34" 2560x1080 144Hz monitors, but I read that the ppi isn't that good and the image has some "grid" to it
    It depends on how close to the monitor you want to be. If you're half a meter from a 34 inch monitor then yes, you will see pixels. On the other hand, if you're 5 meters away then you won't.

    I'd think twice before buying a 34" monitor for a computer because of the viewing distance. 34" is already such a large monitor that it's hard to position it at a comfortable distance (far enough to see the whole screen at once comfortably) on a normal computer desk.
    Get a bigger desk.  When I chose my current desk, ample space for lots of monitors was a major requirement.
  • holdenhamlet Member Epic Posts: 3,772
    edited December 2016
    I'm thinking since you're already happy with the performance, wait this cycle out.  The 970 can handle just about anything currently out just fine.

    Someone on here posted a general rule for upgrading Nvidia cards: do it every other cycle.  I think that makes sense considering the value you get for the price.  So I'd be thinking about an 1170 in the future (or whatever they're going to call them).

    I recently got a 144Hz monitor, but the extra Hz are only really useful for Overwatch imo and other FPS games where you really want to see as much as possible and have as little response time as possible.  Games like Dark Souls 3 are even capped at 60, so a new setup would do absolutely nothing for a game like that.

    The FreeSync with my new 480 "seems" nice, but it's hard to say.  If money is no issue then go ahead and get a 1080 with a G-Sync monitor, but the cost for that is basically insane.
  • Quizzical Member Legendary Posts: 25,348
    holdenhamlet said:
    I'm thinking since you're already happy with the performance, wait this cycle out.  The 970 can handle just about anything currently out just fine.

    Someone on here posted a general rule for upgrading Nvidia cards: do it every other cycle.  I think that makes sense considering the value you get for the price.  So I'd be thinking about an 1170 in the future (or whatever they're going to call them).

    I recently got a 144Hz monitor, but the extra Hz are only really useful for Overwatch imo and other FPS games where you really want to see as much as possible and have as little response time as possible.  Games like Dark Souls 3 are even capped at 60, so a new setup would do absolutely nothing for a game like that.
    You have to look at the performance difference, not just the number of generations.  This is especially the case due to the tendency to rebrand cards from one generation into the next.
  • Volgore Member Epic Posts: 3,872
    Again thanks for your input.

    I think I will take you guys' advice and stick with my current setup.
