
Nvidia GTX 465, 470, 480: totally confused


Comments

  • Barbarbar Member UncommonPosts: 271

    This recurring bashing of SLI is a myth by now. The 460 does excellently in SLI, not only in "some" games but across the plethora of new games tested.

    With SLI performance scaling to around 190% of a single card's, those who still won't embrace SLI are stuck in 2008. By going 460 SLI instead of a single 480 you get lower noise, lower temperatures, vastly increased performance, and you'll have saved $50.

    http://www.guru3d.com/article/geforce-gtx-460-sli-review/5
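
    To put rough numbers on the value claim, here's a quick sketch (Python). The ~1.9x scaling figure is from the guru3d review above; the prices are 2010 launch MSRPs, and the GTX 460's per-card performance relative to a GTX 480 is an assumed illustration:

    ```python
    # Rough perf-per-dollar comparison: GTX 460 SLI vs. a single GTX 480.
    # Assumes ~1.9x SLI scaling (guru3d's figure); prices are 2010 launch
    # MSRPs, and the 460's relative performance (0.70) is an assumption.
    CARDS = {
        "GTX 480":     {"price": 499, "relative_perf": 1.00},
        "GTX 460 1GB": {"price": 229, "relative_perf": 0.70},
    }
    SLI_SCALING = 1.9  # two cards deliver ~190% of one card

    single_480_perf = CARDS["GTX 480"]["relative_perf"]
    sli_460_perf = CARDS["GTX 460 1GB"]["relative_perf"] * SLI_SCALING
    sli_460_price = 2 * CARDS["GTX 460 1GB"]["price"]

    print(f"GTX 480: perf {single_480_perf:.2f} for ${CARDS['GTX 480']['price']}")
    print(f"460 SLI: perf {sli_460_perf:.2f} for ${sli_460_price}")
    print(f"Saved:   ${CARDS['GTX 480']['price'] - sli_460_price}")
    ```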

    What's there to not understand?

  • Shinami Member UncommonPosts: 825

    The world being shocked at how the 480 GTX turned out? You're kidding me, right?

     

    Thanks to EVGA, those cards come with EVGA Precision and OC utilities, and you get a LIMITED LIFETIME WARRANTY that even covers overclocking; no ATI card has a warranty that good, or one that actually gets honored. The GTX 480 is actually viewed as a step toward "freedom," because a lot of gamers who play casually and just care about holding 60 FPS or more moved to Linux when the card came out. It was the first card that, on moderate settings with a good processor, broke the 60 FPS barrier in games running under Ubuntu Linux + Wine...

     

    Thanks to this, Windows 7 is good for one thing and one thing only when it comes to gaming: triple-monitor gaming, and even there a lot of people are annoyed by the bezels blocking the view. Which reminds me: Nvidia released their own version of 3D Vision Surround and multi-monitor gaming that managed to work on practically every game, while ATI had many issues. Nvidia also allowed the technology to be activated on their 2XX series of cards.

     

    There was a 4th of July special and I got a SuperClocked GTX 480 for $50 less than its usual price. The cooler is a four-heatpipe design and the temperature threshold is 105°C. The card is expensive because it is a GPGPU: it handles full physics processing AND everything else the card does.

     

    GTX 480 SLI is extremely popular and has been selling like hotcakes. I've been able to test a lot of configurations; my cheap systems run ATI cards, but all my heavy systems run Nvidia. Right now GTX 480 SLI is the strongest video card configuration out there.

     

    The world was disappointed with the 470s, which didn't do much, but once the 480s and 460s came out, things reached the point where, unless AMD releases something quick, they will be screwed. Let's not forget that when ATI released their cards they talked up DirectX 11, but the Nvidia cards have a stronger, NON-BETA implementation of DirectX 11. I love the deception behind the ATI cards and how their actual demos are not FULL DirectX 11 demos...

     

    How do I know this? Simple! I develop a lot with DirectX 11, and there are parts of the DirectX 11 API that, on ATI cards, make the card HALT, fill up its memory, and then CRASH; before you know it your development program freezes and you're forced into a forced quit. On the Nvidia cards I can develop freely, and using those sections of the API does not cause the card to crash. DirectX 11 on Nvidia cards was well implemented; it wasn't some preliminary beta hash SOLD TO THE PUBLIC as DirectX 11 back when most major games were still running DirectX 10 and 10.1...

     

    I used to be a die-hard ATI fan back in the All-In-Wonder days, and I enjoyed what they did for a long time, but when I started doing heavy gaming and development and saw how ATI cuts corners, I kept their basic cards for basic machines and went Nvidia for the heaviest work.

     

    The reason Intel and Nvidia did not combine into one company when AMD and ATI merged was that a court ruled Nvidia and Intel together would constitute a monopoly and required them to remain separate companies.

  • Catamount Member Posts: 773

    Originally posted by Shinami

    The world being shocked at how the 480 GTX turned out? You're kidding me, right? [...]

    It's curious how you repeat these exact same arguments from another thread, most or all of which I already addressed (and which met with no further response from you).

    I'm still just as confused as to why someone as clearly educated as you would use these grasping-at-straws arguments to try desperately to find saving graces for mostly-inferior GPUs. I'll just briefly reiterate what I said before.

     

    1.) No one really cares about Linux. I'm sorry, but for the average end user this is just true, especially among gamers. The vast majority of gamers run Windows, because Linux does not give native access to 99% of new games that come out (and the Wine/Cedega route is hit or miss, and mostly miss). Hooray if Nvidia has better Linux drivers, but 99% of gamers really don't care.

     

    2.) I'd still like you to cite an actual source for the claim that the HD 5000 series doesn't fully support DX11, or better still, show us a DX11 game or benchmark that these cards can't run. If you have nothing but an unsubstantiated claim based on personal developer experience, it means little when I can still go out and buy a Radeon HD 5870 today and run every DX11 software title in existence.

    Furthermore, why does it even matter if all the games at the time were DX10/10.1 anyways? If there really is a feature that the HD 5000 series doesn't support, as you allege, then surely this won't be an issue for Southern Islands. Any DX11 title already out does nothing that's unsupported by the HD 5000 series, and by the time these other alleged features of the API are implemented, HD 6000 cards will be out and flooding the market, so even if what you say is true, it's literally 100% a non-issue, because the HD 5000 series will only see a real market in a DX10/10.1 world anyways.

    What's more, you're rooting for Nvidia on API support while conveniently neglecting the fact that they took forever to support DX10.1. If we all had taken your general attitude of "always go with Nvidia, they do APIs better" a generation of cards ago, we'd all be stuck without DX10.1 support. Criticizing Ati for something that Nvidia has done worse (partial support is still better than no support) without even mentioning Nvidia's side of this problem hardly demonstrates objectivity on your part.

     

    3.) Yes, the public was shocked at how the GTX 480 turned out, because it has to use vastly more transistors than Cypress to achieve a given level of performance, and it's nothing but an overengineered contraption that draws downright frightening amounts of power (generating proportionate heat and requiring monstrous heatsinks), all just to barely surpass the performance of an ATI card from eight months prior.
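
    To make the efficiency argument concrete, here's a back-of-the-envelope sketch (Python); the TDPs and transistor counts are the widely published specs, while the ~1.1x relative-performance figure is an assumption for illustration:

    ```python
    # Rough efficiency comparison, GTX 480 vs. Radeon HD 5870 (Cypress).
    # TDPs and transistor counts are the widely published specs; the
    # relative performance figure (~1.1x for the 480) is an assumption.
    gtx480 = {"tdp_w": 250, "transistors_b": 3.0,  "relative_perf": 1.10}
    hd5870 = {"tdp_w": 188, "transistors_b": 2.15, "relative_perf": 1.00}

    for name, card in [("GTX 480", gtx480), ("HD 5870", hd5870)]:
        perf_per_watt = card["relative_perf"] / card["tdp_w"]
        perf_per_bt = card["relative_perf"] / card["transistors_b"]
        print(f"{name}: {perf_per_watt:.4f} perf/W, {perf_per_bt:.3f} perf/Btransistor")

    # Under these assumptions the GTX 480 draws ~33% more power and spends
    # ~40% more transistors for roughly 10% more performance.
    ```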

    I understand if there are attributes of Nvidia GPUs that you like; lifetime warranties are nice, for example. Does that change the fact that Nvidia GPUs are intrinsically inferior, because they require more hardware (and a lot more juice) to achieve performance equivalent to what we see on ATI GPUs? Nope; it doesn't.

    It also doesn't change the fact that just as Nvidia releases its first competently designed Fermi card, the GTX 460, finally achieving rough PARITY with ATI's cards (in this case, the 5830), ATI is about to release Southern Islands, with Northern Islands not far behind, which Fermi will have to compete with (something that ought to work about as well as pitting G200 cards against Cypress did). Nvidia took a year to release a card that's finally on par, in every way, with ATI's GPUs, so if Nvidia is only just catching up with ATI's old news, it should be no surprise that ATI is getting ready to hammer the market with yet another GPU release (and then another right after).

     

     

    Nvidia's luck hasn't changed since the day the $300 Radeon HD 4870 came out and plausibly challenged the performance of Nvidia's $600 GeForce GTX 280. You can cite things like Linux drivers that 99% of us don't care about, or hypothetical DX11 issues that only you seem to run into and that none of us see in any released title, but here in the corporeal world, where gamers simply want a fast card at a good price with low heat and noise, the GTX 480 is a miserable product, and fixing it so late that ATI is about to obsolete its own present hardware anyway doesn't speak to any great capacity on Nvidia's part to compete in today's market.

    It really is shocking that someone with your level of expertise in computers (not just claimed, but demonstrated) would fall to what really seems to be nothing more than rampant fanboyism. It seems rather out of character for you, based on how well you address other topics.

  • Shinami Member UncommonPosts: 825

    @ Catamount

     

    Hello! After reading your post I have a few things to say about it. :)

     

    There is no rule that says I can't use the same argument if it fits the same context; in fact, that's common practice. I will go on to your points :)

     

    1) The average end user of an operating system is not a gamer. The Wine route actually is promising on Nvidia drivers; most games crash instantly on ATI drivers. However, ATI drivers beat Nvidia drivers on Mac OS X, while Nvidia wins in the workstation department. Your reference to "gamers" should be clarified: gamers are split between console gaming, 2D PC gaming, 3D PC gaming, and handheld gaming. A small example: Neopets at one point had 35 million unique members, fell to around 27-28 million players, and still maintains the highest female population percentage of any MMO. There are currently 167 million Sudoku players in the US alone. Perhaps you should have said that "99% of full-screen 3D gamers" prefer Windows, even though there are populations of Linux and Mac gamers who will not let go of their favorite OSes. The majority of console games today are 3D games too, and 3D gaming now exists on handhelds and is making its way to smartphones... I guess Windows really isn't "supreme," just another preferred medium among millions of gamers.

     

    2) The top two single-GPU ATI cards are non-reference designs: a 2GB ASUS model and a 1GB Gigabyte model, both 5870s. The 2GB version has the PCI Express slot (75W) + 8-pin (150W) + 8-pin (150W) = 375W maximum board power. The other has the slot (75W) + two 6-pin PEG connectors (75W each) = 225W maximum. Both being non-reference designs means they were created not by ATI itself but by the third party, which is why they both go for over $500.
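
    Those ceilings follow straight from the PCI Express power specs (75W from the x16 slot, 75W per 6-pin PEG connector, 150W per 8-pin); here's the arithmetic as a small sketch (Python):

    ```python
    # Maximum board power from the PCIe specs: the x16 slot supplies up to
    # 75W, a 6-pin PEG connector 75W, and an 8-pin PEG connector 150W.
    CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

    def max_board_power(connectors):
        """Sum the spec ceilings for a card's power inputs."""
        return sum(CONNECTOR_WATTS[c] for c in connectors)

    print(max_board_power(["slot", "8-pin", "8-pin"]))  # ASUS 2GB 5870 -> 375
    print(max_board_power(["slot", "6-pin", "6-pin"]))  # Gigabyte 1GB  -> 225
    ```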

     

    I believe the 2GB ASUS card is an amazing card if you are only playing games; it's the one I consider on par with a GTX 480 in gaming performance. Below I will explain why a 768MB 460 SLI configuration does not beat a SINGLE GTX 480, and why the 1GB models of any video card are a bad idea. Here is a screenshot with an argument :)

     

    http://www.smashmybrain.com/screenshots/1920x1080mw2.jpg

     

    I just uploaded this screenshot. I was playing Modern Warfare 2 at 1920x1080 resolution. The map was "Body Count," where you try to score 30,000 points as fast as possible; it's a small business district in town where all hell breaks loose, accessible through Spec Ops, and one of my favorite maps (the same map used for Homeland Security). Look at the memory consumption: after just 20 minutes of play I was using over 800MB of video RAM.

     

    In short, if I ran a 768MB 460 SLI configuration I would RUN OUT OF MEMORY, and once the driver starts paging textures into system RAM over the PCIe bus, my framerate would fall to dismal lows. The reason I disapprove of the 5870s out there, the 2GB models being the exception, is that with triple-monitor gaming, high resolutions like 1920x1080 and beyond, and the next generation of graphics and physics engines, 1GB is simply not enough memory to contain these games. HardOCP even stated in their review of the 1GB Gigabyte card that 1GB simply was not enough, and if they were smart they probably lowered the resolution.
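
    A rough sketch of why spilling past the frame buffer hurts so much (Python); the bandwidth figures are the published specs for a 768MB GTX 460 and PCIe 2.0 x16, and the 800MB working set is the number from the screenshot above:

    ```python
    # Why exceeding VRAM tanks framerate: overflow textures live in system
    # RAM and must cross the PCIe bus, which is an order of magnitude
    # slower than the card's own GDDR5.
    vram_mb          = 768    # GTX 460 768MB frame buffer
    working_set_mb   = 800    # observed in the MW2 screenshot
    gddr5_bw_gbs     = 86.4   # GTX 460 768MB memory bandwidth (192-bit)
    pcie2_x16_bw_gbs = 8.0    # PCIe 2.0 x16, per direction

    overflow_mb = max(0, working_set_mb - vram_mb)
    print(f"Overflow into system RAM: {overflow_mb} MB")
    print(f"Local VRAM is ~{gddr5_bw_gbs / pcie2_x16_bw_gbs:.0f}x faster to reach")
    ```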

     

    I am both an Nvidiot and a fanATIc, but in different scopes of things :) High end I go Nvidia, low end I tend to go ATI. Below is my reason for liking the GTX 480 :)

     

    3) The Nvidia 4XX series is more than a pure gaming GPU. It's true that many people are gamers, but I am not just a gamer: I am a developer, a programmer, and a heavy modder. ATI and Nvidia run neck and neck at times in single-GPU performance, but once you move from gaming into actual development, GPGPU calculation, and physics processing, the ATI cards rely on brute force and perform a lot lower. Nvidia needed a winner to stay afloat; instead of focusing on just gaming, they focused on a more integrated solution and the big guns of computer science. They even cap double-precision throughput until a driver uncaps it: FP64 runs at 12.5% of the FP32 rate instead of the 50% the silicon can do. I love how ATI's single-GPU flagship got decimated in GPGPU performance by a previous-generation Nvidia card.
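
    For scale, a quick sketch of what that cap means in theoretical throughput (Python), using the commonly published GTX 480 shader specs:

    ```python
    # Theoretical GTX 480 throughput and the GeForce FP64 cap.
    # 480 CUDA cores at 1401 MHz, FMA counted as 2 FLOPs per core per clock.
    cores, shader_mhz = 480, 1401
    fp32_gflops = cores * shader_mhz * 2 / 1000          # ~1345 GFLOPS

    fp64_capable = fp32_gflops * 0.50    # what the GF100 silicon can do
    fp64_geforce = fp32_gflops * 0.125   # the driver-enforced GeForce rate

    print(f"FP32 peak:            {fp32_gflops:7.0f} GFLOPS")
    print(f"FP64, uncapped (50%): {fp64_capable:7.0f} GFLOPS")
    print(f"FP64, capped (12.5%): {fp64_geforce:7.0f} GFLOPS")
    ```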

     

    Even capped like that, it decimates current cards and can hold its own in gaming. I see this when I'm working in development: for the first time in years I can run a physics engine and mod/develop while full 3D graphics are in place, rendering full DX11 at over 60 FPS. Today's programs don't do this card justice; it's already optimized for what the future will bring. What Fermi truly is, is a card that falls on the middle ground between a pure video card and a pure workstation card.

     

    It's an excellent strategy, and it got even better when Nvidia activated 3D Vision Surround through a driver on everything back to their 2XX series of GPUs! The cool thing is that many games worked right out of the box after installing the drivers. Nvidia is positioned very well for the future, not just in gaming but in many areas.

     

    What good is the reference design of ATI's flagship model if I am already breaking 1GB of VRAM usage in most games at my settings? You can give me a dual GPU, but 1024MB x2 also gets squashed, since CrossFire and SLI work by mirroring everything into both frame buffers and running them synchronized. Notice that I also said 460 SLI sucks! :) Right now I, and many gamers, are at the point of fearing that the majority of next-generation games will break the 1GB memory barrier at HD resolution, making every single 1024MB card, ATI or Nvidia, obsolete and useless. Unless of course you want to play with 0x AA and texture quality at low or medium settings; but we don't buy video cards to play at LOW settings, we buy them to kick butt at high resolution + high settings. :)
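
    The mirroring point in a nutshell (Python): with SLI or CrossFire the effective frame buffer is the per-card size, not the sum, so doubling cards doesn't double usable memory:

    ```python
    # SLI/CrossFire mirror the working set into every card's frame buffer,
    # so effective VRAM is the smallest card's buffer, not the total.
    def effective_vram_mb(cards_mb):
        return min(cards_mb)

    print(effective_vram_mb([768, 768]))    # 460 768MB SLI   -> 768, not 1536
    print(effective_vram_mb([1024, 1024]))  # 1GB CrossFire   -> 1024, not 2048
    print(effective_vram_mb([2048, 1024]))  # mixed 5870 pair -> 1024 usable
    ```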

  • Catamount Member Posts: 773

    @Shinami

    You definitely bring up some intriguing points. Actually, most people I know right now who do CrossFire setups with 5870s use a combination of a 2GB and a 1GB model, because 1GB of RAM actually is surprisingly easy to fill up. I remember running into the same problem last generation, when the first 4870s were 512MB, and a generation or two before that with the 512MB cards (and heaven forbid you should make the mistake of getting a 320MB GeForce 8800 GTS like I did).

    As time goes on, however, I do not believe this will be lost on either company. I can't speak to what ATI will do with Southern Islands, because it's little more than a stop-gap designed to fit in where a die shrink usually does (that's coming with Northern Islands, in this case), but I will be greatly surprised if ATI sticks to 1GB of RAM on the new chips early next year. As for the situation today, I think it would be a big mistake to put two 768MB GTX 460s in SLI, but at that level of performance 1GB should still be adequate most of the time. The same goes for a single HD 5870, which honestly isn't powerful enough to run many of the newest titles at triple-monitor resolutions anyway (and if you get two, you can afford for one of them to be a 2GB model).

     

     

    Now, as for the GPGPU capabilities of the GTX 480: that's precisely the problem, in my opinion. I know Nvidia cards perform vastly better in applications like Folding@home, but the GTX 480 is too much a GPGPU and not enough of a gaming card. It really does use more transistors, and produce vastly more heat (from drawing obscenely more power), to achieve the performance of ATI cards so old that ATI is already getting ready to obsolete them. If anything, that shows Nvidia is in trouble, just as they were with the G200 series, when they had to bottom out prices unrealistically just to maintain sales (while ATI made a killing on every HD 4000 series card).

     

    If I were to build a GPGPU box, say for biology research or to model climate impacts on biodiversity, I wouldn't use an ATI GPU setup. Qua GPGPUs, Nvidia GPUs are vastly better, because GPGPUs are what they are; but qua gaming cards, ATI cards are far better, because they match Nvidia's cards in performance per dollar but completely obliterate them in performance per watt, and this with GPUs eight months older! Nvidia has only managed to catch up in that area now that ATI's GPUs are a year older than the GF104 chips.

    I will admit that the frame buffer issue is not a non-issue, though your own inability to overload a 1GB frame buffer with a normal game, even when trying (I'm sure you hand-picked the application and circumstances), shows that the problem is far from critical (though I will complain alongside you if it persists in Northern Islands). The problem of Nvidia being so far behind ATI that they barely release inferior gaming GPUs just in time for ATI to obsolete them with something else is, on the other hand, a very critical problem.

     

    I don't know if you recall this, but Nvidia was in so much trouble getting real yields out of Fermi at the beginning of 2010 that they didn't even manage to get a card out for CES 2010, and got caught showing mockups. When a company is expected to bring a card to an electronics show to compete with four-month-old competitor cards, and even with that four-month deficit still has to bring mockups, that says to me that the company is in deep, deep trouble. Nvidia may make good GPGPUs, but that isn't holding their business together, and they certainly aren't staying afloat on their gaming cards. That's why the GTX 480 "shocked the world": eight months of catch-up, eight months of extra development time, and the best they could come up with for a gaming GPU was a GPGPU with three billion transistors (but not the gaming performance you'd expect from that many) that could double as a hotplate, because it consumed as much energy as dual-GPU setups from the other guys.

     

     

    As far as defining "gamers" in terms of Linux usage, you know what I meant :p

  • Shinami Member UncommonPosts: 825

    Hi Catamount :)

     

    I wasn't trying to overload a card beyond 1024MB. The entire Nvidia argument, the "buy a $400 768MB 460 SLI setup, it beats a GTX 480 by 10%" attitude, has led to a lot of deceptive problems. I'm sorry for not being clearer about that screenshot. :)

     

    Modern Warfare 2, "Body Count," 1920x1080, anti-aliasing 0x, AF x16, in-game texture settings at maximum. That alone used 800MB of VRAM. The reason I chose Modern Warfare 2 wasn't its popularity; it's that the game's engine descends from the Quake 3 engine, which, heavily modified, is considered the "minimum" for shooters today. It has great netcode and the smallest overhead out there. If these settings take 800MB of RAM, imagine a super-bloated, heavy, next-generation game :'(
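
    For a sense of where AA would push that number, a rough sketch of raw render-target sizes at 1920x1080 (Python); real drivers add padding and extra buffers, so treat these as lower bounds:

    ```python
    # Rough render-target memory at 1920x1080: MSAA stores N color and
    # depth samples per pixel, so AA multiplies the buffer cost on top of
    # whatever the textures already consume.
    width, height = 1920, 1080
    bytes_color, bytes_depth = 4, 4   # RGBA8 color, D24S8 depth

    def render_target_mb(msaa_samples):
        pixels = width * height
        color = pixels * bytes_color * msaa_samples
        depth = pixels * bytes_depth * msaa_samples
        return (color + depth) / 2**20

    for samples in (1, 2, 4, 8):
        print(f"{samples}x MSAA: {render_target_mb(samples):6.1f} MB")
    # 1x ~15.8 MB ... 8x ~126.6 MB; textures still dominate the 800MB total.
    ```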

     

    I mean, look at the screenshot :) Do you think I could muster over 200 FPS in MW2 on a single GTX 480 with anti-aliasing enabled? :) I was also using a Q6600 overclocked to 3GHz. That alone pretty much shows how badly a 768MB 460 SLI setup would fare. When HardOCP tested 1024MB 5870s, they ran out of memory in a current-generation game too. Here are two ATI reviews, so you can be happy :) Remember the two cards I spoke about last post? Here are both reviews. :)

     

    Gigabyte 1GB 5870 HardOCP review

     

    Just to show you how my mind works :) In two of the games, the author noted that this card had hit a memory limit. I didn't mind the first game so much, but in the second game, Battlefield: Bad Company, I considered it a "failure" on all three cards. I DO NOT LIKE playing any FPS, multiplayer or single-player, at under 60 frames per second. I can make exceptions for some single-player games, but I won't ever play multiplayer under 60 FPS. So I considered all three cards tested a "failure" at those settings for multiplayer; CrossFire and SLI performance would be a lot better there. As for the last two games, a memory limit was hit, which scares me because these are current games. Imagine next-generation titles? :(
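
    The 60 FPS line is really a frame-time budget; a quick sketch (Python) using the ~40 FPS averages quoted from the reviews:

    ```python
    # Frame-time view of the 60 FPS threshold: the renderer gets a fixed
    # time budget per frame, and average FPS maps directly onto it.
    def frame_time_ms(fps):
        return 1000.0 / fps

    print(f"60 FPS target:      {frame_time_ms(60):.1f} ms per frame")  # 16.7
    print(f"40 FPS (review avg): {frame_time_ms(40):.1f} ms per frame")  # 25.0
    # At ~40 FPS average the card is ~50% over budget every frame, which
    # is why those settings get called a "failure" for multiplayer here.
    ```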

     

    ASUS 2GB 5870 HardOCP review

     

    This is the strongest single-GPU ATI card out right now; 2GB is future-proof. They did review the two games that had reached a memory limit, Splinter Cell: Conviction and Aliens vs. Predator. In Splinter Cell the 2GB version beat the GTX 480 by 1 FPS, though it still averaged between 39 and 40 FPS (I don't care about max framerate, since you'll hardly ever see it :)). Even with 2GB of memory, the reverse occurred in Aliens vs. Predator: this time the GTX 480 performed 1 FPS better. It means both cards can handle those games :) If you look at Metro, the Radeon cards have a problem switching into 2560x1600, though I like that DiRT 2 performed better on the ATI cards :)

     

    What I see is that Nvidia and ATI are neck and neck in performance, but to reach that level I had to look at the non-reference board reviews, each card going for over $500; I'm not complaining. :) The two ATI cards are fantastic in games. But then comes a question:

     

    "How can I get more for my money? If Nvidia and ATI are running neck and neck in games, what about other features? and this is what got me to get a 480 GTX. Gaming is fine on both Single-GPU, there is no doubt about it. I do other things and at the end of the day found that in those development programs (even photoshop has CUDA support now), The 5870 matches the abilities of a 280 GTX.

     

    I like Fermi for the same reason I loved ATI's All-In-Wonder series: multifunction cards that are more complete. Back then the AIWs were decent development cards too, before CUDA and physics engines came along.

     

    My fear is that people are buying cards that already show memory limits in current-generation games, only to be left out in the rain on both sides when new games come out. :( It's a sad state of things. This is why I recommend the GTX 480 and the ASUS 2GB HD 5870, but don't recommend anyone buy a 1024MB video card if they're trying to future-proof themselves.

  • Khrymson Member UncommonPosts: 3,090

    Didn't read every post, but I would avoid the 465 and 470 for now: the 465 is on the verge of being discontinued since the 460 is far better, and the 470 has been dropping in price quite rapidly these past two weeks. It was around $429, and now you can get one for as low as $275-$310 if you look around.

     

    This tells me that a better version of the 470, more than likely named the 475, is going to be released in the next month. There is also the 450, announced for a late-August release. So I'd suggest you once again wait a little longer for the 475, or just get a 460 now. Even the 480 will be discontinued in the next 2-3 months for the 485 and/or 495, but those will be quite expensive.

  • Shinami Member UncommonPosts: 825

    Thank God :) If it's discontinued in two months, I can simply pay the difference and EVGA will let me trade up to the top card. EVGA allows you to trade up within 90 days of buying a card, or you can just ask their people; they tend to be lenient about the high-end cards :) They do this because they're also into folding, and they want people to keep buying their products and folding with them.

  • AmazingAvery Age of Conan Advocate, Member UncommonPosts: 7,188

    Originally posted by Shinami

    Thank God :) If it's discontinued in two months, I can simply pay the difference and EVGA will let me trade up to the top card. EVGA allows you to trade up within 90 days of buying a card, or you can just ask their people; they tend to be lenient about the high-end cards :) They do this because they're also into folding, and they want people to keep buying their products and folding with them.

    EVGA is really awesome here; they even have the try-before-you-buy stuff.



  • Khrymson Member UncommonPosts: 3,090

    Yup, I love EVGA; I've bought only their GPUs for at least 8 years now. Never had one go out on me or give me any problems, except that after I bought the GTX 275 last year, I found out by scouring their database that it's one of a few cards they make that doesn't support a DVI-to-HDMI cable, for some odd reason.

    So I've been driving my 40" HDTV with a VGA cable for the past year, and the GTX 275 also doesn't appear to support anything higher than 1600x1200 over it. I even powered up an old GeForce 2 in an old PC of mine, and it runs 1920x1080/1200 just fine on this HDTV, using a VGA cable too of course, since that's all it supports. Anyway, it's weird that I can't go higher with the GTX 275... ah well!

  • Shinami Member UncommonPosts: 825

    Have you tried setting up a custom resolution? Nvidia allows this through their drivers. See if 1920x1080 @ 60Hz can be set as one. :)

     

    Under the latest drivers, do the following:

     

    Go to the Nvidia Control Panel and select

     

    Change Resolution ----> Customize -----> Create Custom Resolution.

     

    Create a profile for 1920x1080 @ 60Hz

     

    See if that works :)
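
    As a sanity check that 1920x1080 @ 60Hz is within the card's output limits, a small sketch (Python); the timings are the standard CEA-861 1080p60 values, and the link ceilings are the usual single-link DVI and analog-DAC figures:

    ```python
    # Pixel clock check for a custom 1920x1080 @ 60Hz mode. Total timings
    # include blanking; these are the standard CEA-861 values for 1080p60.
    h_total, v_total, refresh_hz = 2200, 1125, 60
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6   # 148.5 MHz

    LIMITS_MHZ = {"single-link DVI": 165, "analog VGA DAC (typ.)": 400}
    for link, limit in LIMITS_MHZ.items():
        verdict = "fits" if pixel_clock_mhz <= limit else "exceeds"
        print(f"{pixel_clock_mhz:.1f} MHz {verdict} {link} ({limit} MHz)")
    ```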

     

  • Khrymson Member UncommonPosts: 3,090

    Originally posted by Shinami

    Have you tried setting up a custom resolution? Nvidia allows this through their drivers. See if 1920x1080 @ 60Hz can be set as one. :) [...]

    Interesting; I could have sworn I'd tried that in the past and it didn't work, but it did this time. However, there's now about a 1/4-inch black bar down the entire right side of my screen, blocking the scroll bar and some icons a little. It's weird, I don't see how to fix it yet, and it only happens at this resolution. 1920x1200 ~ wait, now it's there when I switch back to 1600x1200... doh!

     

    Ah, scratch that ^above: there were some advanced features on my HDTV I'd forgotten about that let me adjust and center the screen. However, after using 1600x1200 for so long I'm used to it; 1920x1200 was hard on my eyes, everything was just a little too small...
