
Yet again SOE made their game specs too high


Comments

  • Quizzical Member LegendaryPosts: 25,348
    Originally posted by Uhwop
    Originally posted by Quizzical
    Originally posted by Uhwop

    Notice how games nowadays will have those little blurbs "runs best on..."?  That's usually because they have some form of partnership with the GPU builders or because the game is specifically optimized for that brand of card. 

    Have you never noticed that people will play a game just fine with an Nvidia card, while tons of people playing the same game with an ATI card that is supposed to outperform those Nvidia cards can't seem to get the game to run at all? 

    It's very possible that Nvidia is working directly with SoE to get better performance out of what are lower-end cards compared to the ATI ones, and that SoE is specifically working to optimize the game towards Nvidia cards and not ATI.  ATI and Nvidia do not use the same architecture in their GPUs, and if a game is built with a specific brand in mind then it's going to run better on a lower-end card from that brand than it is on a higher-end card from the brand they didn't build the game for. 

    Scroll to the bottom of the page and you'll see the Nvidia logo listed there.  When you see a GPU manufacturer's logo, or any company logo outside of the developer and publisher of the game, it means they have a partnership with that company.  In this case, SoE has partnered with Nvidia, and both SoE and Nvidia are gearing PS2 to run better on Nvidia cards than on ATI. 

    This is really ATI's fault.  Nvidia does a good job of working with game developers to ensure that more games are built to run better on their cards than on ATI's.  Coincidentally, CPUs work very much the same way; after all, isn't Nvidia associated with Pentium and ATI with AMD?

    Oh my.  There is so much wrong there.  Where to begin.

    That Nvidia pays a company to put an Nvidia logo on the game is just a marketing expense.  Expecting a game to run massively better on Nvidia cards as a result of marketing expenses like that is like expecting a product to be awesome because you saw ads for it on television.  The marketing department might help with sales, but it doesn't make products better.

    Yes, Nvidia does work with some game developers to help them optimize code through their "The Way It's Meant to be Played" program.  AMD does the same, and calls their program "Gaming Evolved".  That's taken into account when comparing the "average" frame rates of different cards among a bunch of different games.

    There is some variance in which particular games run better on which cards for complicated architectural and driver reasons.  But for one particular game to favor Nvidia cards over AMD cards by even 20% more than you'd expect from an "average" game is definitely an outlier.  And remember that, by definition, the average is 0%.  The only games I've seen show up much in recent reviews that favor one GPU vendor over the other by substantially more than a 20% margin are DiRT Showdown (maybe a 40% advantage for AMD) and Civilization V (varies wildly with driver versions, as it was the first DirectX 11 game to implement multithreaded rendering and neither Nvidia nor AMD had drivers ready).

    If Planetside 2 really does run more than 500% better on Nvidia cards than you'd expect from comparable AMD cards, then the game engine is completely broken.  Apart from a simple bug (or several such bugs) that should soon be fixed, it's hard to imagine what could cause that short of deliberate sabotage.  Even implementing proprietary Nvidia stuff like GPU PhysX just means that you turn that particular feature off when you're running it on an AMD card, and then it still runs fine.

    And no, processors aren't the same way.  AMD and Intel don't partner with game companies to get them to optimize games for their processors.  Some games do favor one processor vendor over the other more than average due to complex architectural reasons, but for one game to be 50% more favorable to one vendor than "average" for a given number of cores would be an extreme outlier.

    And no, Nvidia isn't associated with Pentium.  That's an Intel brand name, and Nvidia and Intel don't get along with each other very well.  Remember how they were suing each other for billions until they settled a while ago?  The processors that Nvidia makes use the ARM architecture and the Tegra brand name, and are geared toward cell phones and tablets.  They're irrelevant to GeForce cards at the moment, though ARM is trying to move up into higher power, higher performance markets--and Nvidia's upcoming Maxwell architecture will reportedly have ARM cores on board doing who knows what.  The reason AMD is associated with ATI is that AMD bought ATI several years ago.  That's kind of like saying that AMD is associated with SeaMicro or that Nvidia is associated with 3dfx.

     No sir, it's not wrong.  Those are partner logos; in the case of Nvidia, it usually means they provided extra support for the specific game. 

    No, the game engine isn't broken; it's the way game development has worked for quite a while now.  The whole "plays best on" means that the GPU manufacturer provided added support to ensure that the game runs better on their cards than on their competitor's. 

    And yes, Nvidia has an association with Intel in the way of a 1.5 billion dollar licensing fee. 

    "Under the new agreement, Intel will have continued access to NVIDIA's full range of patents. In return, NVIDIA will receive an aggregate of $1.5 billion in licensing fees, to be paid in annual installments, and retain use of Intel's patents, consistent with its existing six-year agreement with Intel. This excludes Intel's proprietary processors, flash memory and certain chipsets for the Intel platform."

    Nvidia also has a history of partnering with SoE.  They did it with EQ2, SWG, and now PS2.

    http://www.geforce.com/games-applications/pc-games/planetside-2

    Nvidia was even giving away beta keys. 

    But please, feel free to tell people they're wrong about things you obviously didn't bother to actually look up first; it does wonders for your credibility. 

    A little bit of reading goes a lot further than behaving like a know-it-all.

    If a cross-licensing agreement is an "association", then AMD has an association with Intel, and AMD also has an association with Nvidia.  Does that mean that you'd expect having an AMD video card in a system to increase the performance of an Nvidia card or vice-versa?  Or that having an AMD processor in a system would increase the performance of an Intel processor or vice-versa?

    Cross-licensing agreements are ubiquitous because you need patents from a bunch of different companies (not just AMD and Nvidia!) in order to develop a modern GPU.  It's a mutually assured destruction scenario where if AMD and Nvidia don't cross-license each other's GPU patents, neither of them would be legally able to produce any GPUs.

    TWIMTBP and Gaming Evolved "partnerships" aren't about making a game run better on your company's GPUs than on your competitor's.  Or at least, usually they aren't.  The occasional exceptions of sabotage (anti-aliasing in Batman: Arkham Asylum is the only one that comes to mind) get a company so much public scorn as to not be worth the trouble.

    Rather, the point is to help a game company make the game run better on your company's hardware than it would otherwise.  It's one thing to make a game run better on your company's hardware, and then it runs however it runs on your competitor's hardware.  That's perfectly clean.  Sometimes it incidentally increases performance on the competitor's hardware because the programming is more efficient in general.  Often it means that a game has additional graphical features that the game company wouldn't have otherwise known how to implement (or in the case of GPU PhysX, wouldn't have bothered because it's stupid).  Sometimes a game that is sponsored by one GPU vendor incidentally runs better on the hardware of the other.

    It's entirely different to go out of your way to insert code that will intentionally sabotage performance on your competitor's hardware.  Speeding up performance on your own hardware by a factor of six is unheard of unless the program was previously incredibly inefficient.  And if it was so inefficient, then fixing the inefficiencies will probably speed up performance on the competitor's hardware, too.  If the game really does perform better on a GeForce GT 520 than on a Radeon HD 6850, then intentional sabotage is the probable culprit--and that's a huge story, and one that should be told to SOE's shame until they fix it.

    But does a GeForce GT 520 really perform better than a Radeon HD 6850?  I'd be strongly skeptical of that.  I'd likewise be strongly skeptical that a GeForce GT 520 outperforms a GeForce GTX 480.  Most likely, the official recommended system specs are simply stupid.  It's much easier and more plausible that writing the official system requirements was delegated to someone who doesn't know much about computer hardware than that SOE put some code into their system to intentionally sabotage performance on AMD video cards.  The latter would be suicidal, as it closes off your game to a substantial fraction of the people who would otherwise play it.

  • VirusDancer Member UncommonPosts: 3,649
    In regard to the discussion about GPU sponsorship, it leads me back to the only nVidia card I've ever bought.  It was for Neverwinter Nights...for the Aurora Toolset.  It did not work with anybody but nVidia at release.  No ATI, no S3, no etc.  So I ran it on a separate box.

    I miss the MMORPG genre. Will a developer ever make one again?

    Explorer: 87%, Killer: 67%, Achiever: 27%, Socializer: 20%

  • fadis Member Posts: 469

    The GPU spec is that they can support shader model 3.0. 

    For NVIDIA that's the 8800... For ATI... who cares, ATI is awful.  But... if it's something before the 4850, then you get a cookie.  Write a letter to Smedley and revel in your glory.

     

    In any case - we're still talking about rigs that are going to be 5 years old at launch.  I don't know what to tell you if that makes you unhappy/sad/mad.  SOE is making an MMOFPS... it's going to have higher requirements than your standard MMOs.... and that's the game they want to make.

     

     

  • Uhwop Member UncommonPosts: 1,791
    Originally posted by Quizzical
     

    But does a GeForce GT 520 really perform better than a Radeon HD 6850?  I'd be strongly skeptical of that.  I'd likewise be strongly skeptical that a GeForce GT 520 outperforms a GeForce GTX 480.  Most likely, the official recommended system specs are simply stupid.  It's much easier and more plausible that writing the official system requirements was delegated to someone who doesn't know much about computer hardware than that SOE put some code into their system to intentionally sabotage performance on AMD video cards.  The latter would be suicidal, as it closes off your game to a substantial fraction of the people who would otherwise play it.

     No, the 520 is a much lower grade of card than the 6850.  However:

    It doesn't matter if the ATI card is technically better or not.  When the GPU manufacturer works with the game developer to help optimize the game to run better on their card, no amount of "the ATI card is better, so the game should run better" will change that. 

    This is not new for either Nvidia or ATI, but Nvidia is much more active in doing this sort of stuff.  Nvidia in particular has been doing this for many, many years.  "Runs best on Nvidia" isn't just a blurb; it doesn't benefit the game developer in any way to simply put a splash screen up with the Nvidia logo on their games.  It has EVERYTHING to do with Nvidia providing support during development of the game to ensure that the game runs on their cards.

    If we both build a GPU and the biggest difference between your GPU and my GPU is that I work directly with developers to ensure that their game runs better on my card than on yours, guess who sells more cards. 

    There is a very good reason why Nvidia is the best-selling GPU on the market, and why ATI cards have a tendency to give their users problems. 

    AMD bought out ATI several years ago.  Prior to that, AMD and Nvidia worked very closely together.  This is what led to the cross-licensing deal between Intel and Nvidia.  It would be foolish of Intel and Nvidia not to work together when Intel's competitor buys out Nvidia's competitor, especially when Intel and Nvidia are both interested in doing exactly what AMD-ATI are now capable of doing.  Intel and Nvidia don't have to spend years developing their own technologies to do what AMD can with the buyout of ATI. 

    http://arstechnica.com/business/2011/01/intelnvidia-bombshell-look-for-nvidia-gpu-on-intel-processor-die/

    "The cross-licensing agreement allows Intel to integrate NVIDIA technologies and those that are covered by our patents into their CPUs, such as Sandy Bridge, for example," said Jen-Hsuan. "And a cross-license allows us to build processors and take advantage of Intel patents for the types of processor we're building—Project Denver, Tegra, and the types of processors we're going to build in the future."

    There's a very good reason why, when you buy an Intel-based PC, it usually comes with an Nvidia-based GPU or an integrated Intel GPU using Nvidia architecture. 

    And SOE has been partnered with Nvidia for many years now.  

      http://eq2wire.com/2011/07/07/live-welcome-reception/

    John Smedley back to the stage to talk about new things.

    Forge Light MMO Engine
    * Being used to build PlanetSide 2 and EverQuest Next.
    * Massively seamless worlds — NO ZONES
    * Real-time Radiosity
    * Designed to Scale to newer computers
    * Advanced atmospheric scattering
    * Volumetric fog
    * Advanced ambient occlusion — objects shadowing each other
    * Complex shaders
    * Advanced environmental lighting
    * Real-time physics system with nVIDIA PhysX
    * Partnered with nVidia

    It has EVERYTHING TO DO with Nvidia providing the support needed to ensure that their cards work better with the game. 

    Blizzard did the exact same thing with ATI:

    http://www.gamespot.com/news/blizzard-amd-forge-graphics-card-deal-6195824

    However, that isn't the only measure that AMD is taking to gain ground on graphics-chip rival Nvidia. The semiconductor company also said today that it has entered into a partnership with Blizzard Entertainment, purveyor of arguably the most popular PC game in the world, World of Warcraft. As part of the development-partnership deal, AMD will also begin bundling unspecified Blizzard Entertainment games with all ATI Radeon graphics cards.

    "Our collaboration with AMD is especially important to us because it provides us with early access to some of the latest graphics technology," commented Paul Sams, chief operating officer of Blizzard Entertainment. "Delivering a polished game experience is one of our top priorities, and this relationship helps us achieve that goal for Blizzard gamers who choose AMD graphics cards."

    Which is probably why ATI cards had fewer issues with WoW than Nvidia cards did.  Go figure; you would think WoW would run on a Voodoo card.  Yet ATI users had far fewer problems with performance than Nvidia users did.  This was also the one and only time I ever purchased an ATI card; I got tired of dealing with trying to run the game well on Nvidia cards. 

     

    Once again, this is why older, less powerful cards are able to outperform newer, more powerful cards in individual games, and why a game's minimum requirements will list a lower-performing card from one brand over a card from another brand that actually benchmarks higher.     

    This is not an opinion, it is not information I'm pulling out of my ass, THIS IS EXACTLY THE WAY THAT IT WORKS.   

     

    The 4850 is comparable to Nvidia's 9800/520 series. 

    The 8600 series, which is apparently what I'm seeing listed as the minimum for Nvidia cards, is actually comparable to the ATI 3650. 

    That huge gap has everything to do with the partnership and support that SoE receives from Nvidia.

    And the shader model 3 thing isn't right either; the ATI 4850 supports shader model 4.1.

     

  • Quizzical Member LegendaryPosts: 25,348

    "It doesn't matter if the ATI card is technically better or not. When the GPU manufacturer works with the game developer to help opitimize the game to run better on their card no amount of "the ATI card is better so the game should run better." "

    Except that that isn't how it has worked in basically every other game ever made--certainly including all recent games.  Sometimes one side will have fewer driver bugs relevant to a particular game, but not systematically run 6x as fast.

    ""Runs best on Nvidia" isn't just a blurb, it doesn't benefit the game developer in any way to simply put a splash screen up with the Nvidia logo on thier games."

    That's like saying that the EVE Online banner near the top of the screen right now doesn't benefit MMORPG.com in any way.  Don't you understand what advertising is?

    "If we both both build a GPU and the biggest difference between your GPU and my GPU is that I work directly with developers to ensure that their game runs on my card better then yours, guess who sells more cards. "

    To the contrary, if that's the biggest difference between our cards, then they're going to perform essentially identically in every single game.

    "There is a very good reason why Nvidia is the best selling GPU on the market, and why ATI cards have a tendency to give thier users problems. "

    Reputation from things that happened years ago, marketing, fanboys, and FUD.  Speaking of which...

    "This is what lead to the cross lisencing deal between Intel and Nvidia."

    You fundamentally have no clue what a cross-licensing deal is.  It's about patents, not working together.

    Say Nvidia figures out a slick way to increase ROP performance.  So they patent it.  AMD figures out how to make a hardware tessellator more efficient.  So they patent it.  And so forth.  All relevant parties patent tons of little innovations in their chips.

    In order to get a patent, you have to explain exactly how what you did works.  But then no one else can do the same thing unless they license your patent.

    One problem is that it's far from clear what is patentable and what is not.  You can't get a patent on things that are "obvious".  But hardware patents are so technical that judges and juries tend not to understand them, so if you go to court to sue someone for violating your patents, you risk having the other side convince a jury that your patent was obvious and is therefore invalid.

    But the bigger problem is that there are huge numbers of things that are patented.  In order to make a working, modern GPU at all, you need to license patents from Nvidia and AMD and Intel and a bunch of other companies.  If you don't license the patents, then it's illegal for you to build a modern GPU at all.

    Cross-licensing agreements are the way around this.  AMD and Nvidia sign a deal that says that they're both allowed to use all of the graphics patents of the other companies and won't sue each other for it.  If they don't, then neither are allowed to produce any GPUs, so they basically have to.

    Except that what sometimes happens is that the sides can't agree on the terms of the cross-licensing agreement.  One side might argue that it has a lot more patents than the other, so the other side should have to pay them some money to get the deal.  One side argues that it doesn't particularly need the other's patents, and so should receive monetary compensation rather than just getting to use the other side's patents.  Patent trolls are companies that buy up a bunch of patents, but don't produce anything, and just sue everyone to try to get them to pay money for a license to use their patents.

    For example, that's what Apple and Samsung are suing each other over.  Both have a bunch of patents relevant to tablets, cell phones, and so forth.  But they're unable to agree to the precise terms of a cross-licensing agreement, so they file a bunch of lawsuits against each other all over the world.  At some point, they might agree to settle on a cross-licensing agreement that lets each use the other's patents for some number of years and maybe one side pays the other a fixed amount of money.  Until that happens, both sides risk having judges rule that a bunch of their products cannot be sold in their particular country due to patent violations.

    In the case of Nvidia and Intel, that was hardly the first cross-licensing agreement that the companies had signed.  For example:

    http://www.nvidia.com/object/IO_17070.html

    Basically, Intel needed Nvidia's graphics patents in order to produce integrated graphics.  In exchange, they agreed to let Nvidia produce chipsets for Intel processors.  When Bloomfield processors launched in 2008, Intel refused to let Nvidia make chipsets, arguing that the licensing agreement only allowed Nvidia to make chipsets for processors that used FSB, while Bloomfield used QPI instead to do the same thing, and that was fundamentally different.  Nvidia argued that this was a breach of the previous cross-licensing agreement, which allowed Nvidia to make chipsets for any successors to FSB, even if Intel decided to call them something different.  There were other claimed violations of the previous agreement, too.

    They fought it out in court for a while, and finally agreed to a cross-licensing agreement in which Intel got to use Nvidia patents without Nvidia having the right to produce chipsets, but Intel would pay Nvidia $1.5 billion to compensate for this.  That's the cross-licensing agreement that you cited.  But you know who else Intel has a comparable cross-licensing agreement with?  AMD.

    http://www.amd.com/us/press-releases/Pages/amd-press-release-2009nov12.aspx

    So, incidentally, does Nvidia.

    "It would be foolish of Intel and Nvidia to not work together when Intels competitor buys out Nvidias competitor; especially when Intel and Nvidia are both interested in doing exactly what AMD-ATI are now capable of doing."

    Well yes, Intel and Nvidia both produce chips that have a CPU and GPU on the same chip.  Intel calls theirs Sandy Bridge and Ivy Bridge so far, with Haswell coming.  I think some version of Atom does the same.  Nvidia calls theirs Tegra.

    "Intel and Nvidia don't have to spend years developing their own technologies to do what AMD can with the buyout of ATI. "

    To the contrary, Intel has had to develop their own GPU and write drivers for it.  They have access to Nvidia's patents (and also AMD's, etc.), but that doesn't magically design the chip for them.  What works for chip design depends on the process node you're using, and Intel runs its own fabs with its own process nodes, which are different from the TSMC process nodes that Nvidia and AMD build GPUs on.

    Likewise, Nvidia is unable to get access to Intel's x86 patents.  Instead, to make a processor, they've had to license ARM cores.  But ARM basically exists to create low-power processor cores and related technologies and then license them to anyone and everyone.

    "There's a very good reason why when you buy an intel based PC it usually comes with an nvidia based GPU or integrated intel GPU using Nvidia architecture."

    You're wildly wrong.  If you buy a computer with an Intel CPU, more likely than not, it has only Intel integrated graphics in it.  There is no such thing as an "integrated intel GPU using Nvidia architecture", never has been, and probably never will be.  Intel's integrated graphics use their own architecture.

    "Once again, this is why older, less powerful cards are able to outperform newer, more powerful cards in individual games; as well as why you'll see a lower performing card listed for a game over another brand of card that actually benchmarks higher in the minimum requirements."

    Okay, so you've bought the stupid fanboy FUD on The Way It's Meant To Be Played.  Can you explain why a GeForce GT 520 meets the minimum requirements and a GeForce GTX 480 doesn't?  They're both Nvidia cards, and both the Fermi architecture, even.

    "The 4850 is coparible to nvidia's 9800/ 520 series. "

    A Radeon HD 4850 is roughly comparable to a GeForce 9800 GTX+/GTS 150/GTS 250/whatever else Nvidia decided to rename it.  Both cards are dramatically faster than a GeForce GT 520.

    -----

    You don't understand how graphics cards work or the purpose of industry standards.  Most of the code that makes a game run is done on the processor.  It's written in C++ or Java or C# or whatever.  I think C++ is by far the most common, but there's no rule that it's illegal to make games in anything else.  That portion of the code mostly neither knows nor cares what GPU you're using, and will run at exactly the same speed no matter what video card you have.

    In order to use the video card, there are graphics APIs.  Most games use DirectX, but OpenGL is also used by some.  An API has a bunch of different commands relevant to graphics.  In order to claim compliance with some particular version of DirectX or OpenGL or whatever, a video card vendor has to write drivers that can make their video cards exhibit certain behavior when any particular API command is called.  Drawing 3D graphics basically consists of sending a long sequence of API commands to draw this here, that there, and so forth.
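    To make "a long sequence of API commands" a little more concrete, here's a toy C++ sketch (my own illustration, nothing to do with PS2 or SOE's code) of roughly what one frame boils down to, using OpenGL through GLFW.  The driver is what turns each of these calls into work on whatever GPU happens to be installed:

    // toy_frame.cpp -- build with: g++ toy_frame.cpp -lglfw -lGL
    #include <GLFW/glfw3.h>

    int main() {
        if (!glfwInit()) return 1;
        GLFWwindow* window = glfwCreateWindow(640, 480, "draw calls", nullptr, nullptr);
        if (!window) { glfwTerminate(); return 1; }
        glfwMakeContextCurrent(window);      // from here on, GL calls go to the driver

        while (!glfwWindowShouldClose(window)) {
            glClear(GL_COLOR_BUFFER_BIT);    // API command: clear the framebuffer

            glBegin(GL_TRIANGLES);           // API commands: submit one triangle
            glVertex2f(-0.5f, -0.5f);        // (legacy immediate mode, used here only
            glVertex2f( 0.5f, -0.5f);        //  for brevity; a real engine keeps its
            glVertex2f( 0.0f,  0.5f);        //  vertices in GPU buffers and issues far
            glEnd();                         //  fewer, far bigger draw calls)

            glfwSwapBuffers(window);         // present the finished frame
            glfwPollEvents();
        }
        glfwTerminate();
        return 0;
    }

    The same source runs unchanged on an Nvidia, AMD, or Intel GPU; only the driver underneath differs.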

    Actually, the larger part of video card performance is probably shaders, which are programs that run on the video card written in HLSL (for DirectX) or GLSL (for OpenGL).  But there are a number of fixed-function portions of the graphics pipeline in both cases, too.
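    If you've never seen one, a "shader" is just a small program in source form that the game hands to the driver, which compiles it for whatever GPU is installed.  Here's a toy GLSL fragment shader, embedded as a C++ string the way a game might carry it around (a made-up example of mine, not anything out of Forgelight):

    // Trivial per-pixel diffuse lighting; the driver compiles this GLSL source
    // into whatever machine code the installed GPU actually executes.
    const char* kToyFragmentShader = R"(
        #version 330 core
        in vec3 vNormal;           // interpolated from the vertex shader
        uniform vec3 uLightDir;    // set from the CPU side each frame
        out vec4 fragColor;
        void main() {
            float diffuse = max(dot(normalize(vNormal), normalize(uLightDir)), 0.0);
            fragColor = vec4(vec3(diffuse), 1.0);
        }
    )";

    Both vendors' drivers have to accept the same source; how fast the compiled result runs is where the architectural differences show up.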

    In any case, in order to be compliant with the API at all, a video card has to be able to take any arbitrary API commands and shaders and draw something that complies with what the API specifies.  In some cases, the specification says that you have to draw exactly this.  In others, it merely says that it has to have several particular characteristics, but leaves some flexibility.  In some cases, the specifications even say that a video card can do whatever it wants, but these are things that programmers are well-advised to avoid, such as taking the square root of a negative number.

    Video card vendors are given flexibility in exactly how they implement various API calls and shader functions.  When new drivers improve performance in a bunch of games, it likely means that they've redone the implementation of various things to get the same end result faster.

    When AMD or Nvidia partners with a game company, they send someone to work at the company and help them code part of the game.  They may advise doing something this way instead of that way because it's more efficient on that company's cards--and often because it's just more efficient in general.  They may help the company implement graphical features that none of the company's own employees know how to implement.

    A particular API call or HLSL function may happen to run better on one video card than on another.  Some functions have performance mostly dependent on PCI Express bandwidth.  Others rely on shaders.  Still others rely on TMUs.  Video memory bandwidth is a factor in many things, though some things are more bandwidth-intensive than others.

    One game may be coded to do a lot of things by texture lookups that another game does by computing things in shaders.  If card A has better texture performance while card B has better shader performance, then the first game might run better on card A while the second runs better on card B.  That can easily explain why a game might perform 20% better as compared to another than you'd expect from an "average" game.  Different shader architectures between AMD and Nvidia can make it easier or harder to put shaders to good use.
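    As a made-up illustration of that trade-off (mine, not from any real game), the same falloff value can come from a baked lookup table or from arithmetic in the shader, and which version is cheaper depends on whether the card has TMUs or shader ALUs to spare:

    // Option A: bake the curve into a small texture on the CPU and sample it.
    // Leans on the TMUs and the texture cache.
    const char* kFalloffViaLookup = R"(
        float falloff = texture(uFalloffLUT, vec2(dist / uMaxDist, 0.5)).r;
    )";

    // Option B: compute the same curve directly in the shader.
    // Leans on shader ALU throughput instead.
    const char* kFalloffViaMath = R"(
        float x = clamp(dist / uMaxDist, 0.0, 1.0);
        float falloff = (1.0 - x) * (1.0 - x);
    )";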

    In the generations in question, AMD's shaders offered greater theoretical parallelism via VLIW5, but it was harder to exploit: filling all five slots meant the compiler and scheduler had to find five independent operations in the same shader to issue together, and to find and pack them without taking much work to do so.  Some vector operations make this fairly trivial, at least 3- or 4-wide, but often it can't be done.

    But when you're comparing a Radeon HD 6850 to a GeForce GT 520, it's not a case where one card is better at some things and the other is better at others.  The 6850 is vastly better at just about everything, and usually by a huge margin.  The 6850 has nearly 10 times the theoretical peak shader performance, and even in a worst-case scenario where it isn't able to exploit VLIW5 parallelism at all, it still would have nearly double the shader performance of a GT 520.  The 6850 has 6 times as many TMUs, clocked only slightly lower than the GT 520's.  It has 8 times as many ROPs, again clocked only slightly lower.  Parallelism to use all of the TMUs and ROPs is fairly trivial to find, too, so the 6850 can exploit them unless the game isn't meaningfully limited by TMU or ROP performance in the first place.  The 6850 also has about 9 times the memory bandwidth.
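    Those ratios are easy to sanity-check against the public spec sheets.  A quick back-of-the-envelope sketch (the clocks and unit counts below are quoted from memory, so treat the exact numbers as approximate):

    #include <cstdio>

    int main() {
        // Radeon HD 6850: 960 shaders @ ~775 MHz, 48 TMUs, 32 ROPs, ~128 GB/s
        // GeForce GT 520:  48 shaders @ ~1620 MHz shader clock, 8 TMUs, 4 ROPs, ~14 GB/s
        double hd6850_gflops = 960 * 2 * 0.775;   // ~1488 GFLOPS (2 ops per shader per clock)
        double gt520_gflops  =  48 * 2 * 1.620;   //  ~156 GFLOPS
        std::printf("peak shader ratio: ~%.1fx\n", hd6850_gflops / gt520_gflops);
        std::printf("TMU ratio: %dx  ROP ratio: %dx  bandwidth ratio: ~%.0fx\n",
                    48 / 8, 32 / 4, 128.0 / 14.0);
        return 0;
    }

    Even the worst case described above (no VLIW5 packing at all) only divides the first number by five, which still leaves the 6850 comfortably ahead of the GT 520.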

    Even if you were to try to cherry-pick API calls and shader functions that Nvidia's architecture handles much better than AMD's, you might have a hard time getting a GT 520 to outperform a Radeon HD 6850 outside of intentional sabotage (e.g., querying the GPU vendor and then calling a bunch of wait functions in the processor code if it's AMD).  Your only real hope would be to find a flagrant bug in AMD's drivers and trigger it as often as you can.  But if a game company figures out how to do that, the rational thing to do is to report the bug to AMD so that they'll fix it.
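    For what it's worth, the vendor check I'm describing is a one-liner.  Roughly what it looks like in OpenGL (glGetString is a real call that has to be made after a GL context exists; anything you'd hang off the result is the hypothetical part, not something PS2 is known to contain):

    #include <GL/gl.h>
    #include <cstring>

    // Returns true if the active OpenGL driver reports an AMD/ATI GPU.
    // Legitimate uses are driver-bug workarounds; the sabotage scenario
    // above would hang artificial stalls off this same check.
    bool running_on_amd() {
        const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
        return vendor && (std::strstr(vendor, "ATI") || std::strstr(vendor, "AMD"));
    }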

    But really, this shouldn't just be an abstract argument.  Claiming that the game runs better on a GeForce GT 520 than on a Radeon HD 6850 is an empirical claim.  The way to test it is to try running the game on both cards and seeing what happens.

  • Mothanos Member UncommonPosts: 1,910

    It still has serious performance issues no matter what system you run this game on.
    With release this close, this concern is getting bigger and bigger with each passing day.

    I own the minimum GPU needed to play PS2, the Asus 4850 1GB, and it just doesn't even look nice on high settings.
    Of course you cannot expect eye candy from this card anymore, but at low settings one should and can expect smooth gameplay.

    But even with every setting on low it stutters, it's unstable, with massive FPS dips out of nowhere.
    Some textures load horribly, and I mean they look like old Nintendo textures...


    Beta is beta and it does not represent the final release, but it's a very valid concern this close to release.

    Can't wait to buy the new ATI 8xxx series when they come out, but until then it seems my GPU fails to play this game and PS2 needs a major performance increase.

  • hercules Member UncommonPosts: 4,924
    Well, I preached on this for EQ2 and will do the same here: if the average gamer's PC fails to handle it, then it will struggle to do well.
  • TeknoBug Member UncommonPosts: 2,156

    In the past few years, games have been released with realistic specs, but when you install and run the game it tells a different story. Crysis needed a beefy system (and even then a $4,000 beefed-up PC couldn't run it at max).


    Devs should start reading the hardware survey from Steam; there's a surprising number of people still gaming on a 2-core CPU and 4GB of RAM, etc.


  • Aeolron Member Posts: 648
    Originally posted by TeknoBug

    In the past few years, games have been released with realistic specs, but when you install and run the game it tells a different story. Crysis needed a beefy system (and even then a $4,000 beefed-up PC couldn't run it at max).


    Devs should start reading the hardware survey from Steam; there's a surprising number of people still gaming on a 2-core CPU and 4GB of RAM, etc.

    More and more PC games now are starting to get into the quad-core stuff, and if you look at the specs for BF3 they are actually higher than PS2's, and yet a TON of players play BF3. Seems my system can handle PS2 with no issues whatsoever. My specs:

     

    Windows 7

    AMD 8150 CPU

    Asus GTX 670

    12 gigs of ram

    1 Terabyte mechanical hard drive

    1000 watt power supply

    Asus Sabertooth Main board

    You don't need a $4,000 system; hell, even with a budget of 1,200 bucks you could run this game no problem. What I do find funny is that EverQuest 2 runs like shit on my system, but PlanetSide 2 runs very, very smooth; even in huge battles with max settings I get 60-70 FPS.

    Game studios can't always hold back their progress because of a few people who won't upgrade their systems. I can understand where you're coming from if you don't have the money, but then it can be said: if you don't have the money, why would you buy more games in the first place that you can barely run because of your system specs?

  • zomard100 Member Posts: 228
    Are you serious about that? It is the most optimized engine I have seen in years. With an ATI 6850, 8GB DDR3, and a quad-core processor, I don't have a single frame rate issue even in big battles. It is time to gear up, my friend.
  • Aeolron Member Posts: 648
    Originally posted by zomard100
    Are you serious about that? It is the most optimized engine I have seen in years. With an ATI 6850, 8GB DDR3, and a quad-core processor, I don't have a single frame rate issue even in big battles. It is time to gear up, my friend.

     I agree with you, but I put it a little differently haha :P

    It is one of the best MMO engines I have ever seen, not just because I am playing the game but because I have played 95% of the MMOs out there, and not one comes to mind with an engine that can handle that many people in one tight area with almost no FPS issues at all. I was amazed that my screen didn't turn into a slide show!

    Now those damn Vanu weapons are powerful!

  • Xasapis Member RarePosts: 6,337

    Too high to play in a WoW capable toaster?

     

    It's been a while since developers actually developed with an eye to the future. I blame the lack of console hardware updates for that. Consoles seem to be the driving force in gaming these days, and the lack of serious hardware updates hurts long-term technological advancement.

    Which btw is another reason so many games look so similar nowadays.

  • Vhaln Member Posts: 3,159

    There's so much pressure on devs to make their games look good that how they actually play falls by the wayside.  It's just another symptom of the mass-appeal, mass-produced corporate industry that gaming has become.

     

    Ironically, the masses don't have great computers.  They're just too stupid to know that, and will buy whatever looks awesome, anyhow.

    When I want a single-player story, I'll play a single-player game. When I play an MMO, I want a massively multiplayer world.

  • Proton37 Member Posts: 7

    There's already enough games out there that run on a Tandy 1000 EX.

  • Painlezz Member UncommonPosts: 646

    Anything SOE makes has a stamp of fail on it in my eyes.

     

    EverQuest 2 ran like crap when it launched on top-of-the-line machines (at that time).  Here we are many... MANY years later and it runs "ok" on a top-of-the-line machine. 

    I guess I'll give the EQ2 team a little credit.  I believe they designed that game during the single-core era and expected that we would continue to get faster single-core processors.  Sadly for them, the industry took another route and started pumping out multi-core systems.

    Bla bla bla...

  • Vhaln Member Posts: 3,159
    This is a game that really thrives with lots of players being rendered onscreen at the same time.  From reports I've seen, such as previews by TotalBiscuit.. it just isn't doing that well.  It's not just about people with crappy old computers.  It's not even doing that well on the latest hardware.

    When I want a single-player story, I'll play a single-player game. When I play an MMO, I want a massively multiplayer world.

  • Aeolron Member Posts: 648
    Originally posted by Vhaln
    This is a game that really thrives with lots of players being rendered onscreen at the same time.  From reports I've seen, such as previews by TotalBiscuit.. it just isn't doing that well.  It's not just about people with crappy old computers.  It's not even doing that well on the latest hardware.

     Runs very, very well for me and for a few of my friends' systems on max settings right now, with hardly a dip in the FPS pool.

    Some people just need to upgrade their systems, is all. You can't expect hardware that is 6-7 years old to run a modern game. Most likely those people try to run the game on max settings even though their systems can't handle it and wonder why they experience crappy FPS; when they turn it down it looks ok but not as nice, so they moan and complain about the engine not being optimized properly. That said, there are some games out there that have poor optimization. Example: EQ2.

  • Mothanos Member UncommonPosts: 1,910

    My personal view is that no matter how you look at it, when developers choose one brand the other brand gets shafted, and with it all the potential players owning (in this case) an ATI card.

    Nvidia tries hard to fish for the biggest releases while AMD gets left behind in most cases when triple-A games are released.

    I think it's bad, and it has to be equal.

    ATI and Nvidia need to be present for any AAA game to smooth out the gameplay.

    Nothing worse than buying 2x 680s in SLI to see your FPS is crap.
    Nothing worse than buying 2x 7970s in CrossFire to see your FPS is crap.

    Drivers need to be optimized no matter what brand you buy.
    I'd even go so far as to say it should be punished by law or by our wallets.

  • TeknoBug Member UncommonPosts: 2,156


    Originally posted by Aeolron
    Originally posted by TeknoBug In the past few years, games have been released with realistic specs, but when you install and run the game it tells a different story. Crysis needed a beefy system (and even then a $4,000 beefed-up PC couldn't run it at max). Devs should start reading the hardware survey from Steam; there's a surprising number of people still gaming on a 2-core CPU and 4GB of RAM, etc.
    More and more PC games now are starting to get into the quad-core stuff, and if you look at the specs for BF3 they are actually higher than PS2's, and yet a TON of players play BF3. Seems my system can handle PS2 with no issues whatsoever. My specs:

     

    Windows 7

    AMD 8150 CPU

    Asus GTX 670

    12 gigs of ram

    1 Terabyte mechanical hard drive

    1000 watt power supply

    Asus Sabertooth Main board

    You don't need a $4,000 system; hell, even with a budget of 1,200 bucks you could run this game no problem. What I do find funny is that EverQuest 2 runs like shit on my system, but PlanetSide 2 runs very, very smooth; even in huge battles with max settings I get 60-70 FPS.

    Game studios can't always hold back their progress because of a few people who won't upgrade their systems. I can understand where you're coming from if you don't have the money, but then it can be said: if you don't have the money, why would you buy more games in the first place that you can barely run because of your system specs?



    When Crysis first came out, yes, you needed a good system, but even a good system still couldn't meet its match. BF3's Frostbite 2.0 engine is one of the most optimized, scales very well with cores, and is mostly GPU-bound; it runs decently on my older AMD system as well (@720p).


    I remember Vanguard and Age of Conan playing horribly even on newer PCs when they came out; even GTA4 ran like crap. Guild Wars 2 seems to be where that is now; the Halloween patches made things worse, but hopefully they'll come around to fixing the performance issues, because right now it isn't touching the GPU anymore (it used to before the October patches).

