Shaders=Game Industry Scam!

Gravarg, Member Uncommon, Posts: 3,424

I find it weird how my old computer can run games like Warhammer Online, WoW, Guild Wars, and some others, but some games tell me I can't play because they require Shader Model 2.0. I have a GeForce2 MX400 in it. I think it's a whole money thing with the gaming industry. Warhammer Online and Guild Wars both use shaders (there's a video option to adjust them). I tried Age of Conan, and it starts up, then crashes about five seconds after reaching the login screen due to lack of memory, but it still starts!

So why is it that some games that use shaders I can play, and others I can't? It's fishy. I know that shaders actually do something; they make the game look a lot more realistic, but requiring them is a scam. In the games listed above, an effect that uses a shader on my new computer just looks pixelated on my old computer, but it's still very playable; it just doesn't look as good.

 

Now I know you're asking, where's the scam part? Well, I'll tell ya! The industry comes out with these new technologies that soon become required to play their games (not all, but something like 90% of games in the past five years require shader support). Thus you have to spend money upgrading your GPU in order to play, when otherwise you could be using a 10-year-old computer (I got my old computer back in Feb '00) to play modern games.

Comments

  • HarkerTempes, Member Uncommon, Posts: 13

     It's not a scam to want your game to have decent graphics. Honestly they cannot hold back the progress of games just to cater to people who have not upgraded in years.

  • Gravarg, Member Uncommon, Posts: 3,424

    I'm not saying they shouldn't advance the graphics and capabilities, but making them required is BS when a 10-year-old computer can play the game anyway, as long as it has enough RAM, even if the game uses shaders. The only thing stopping my old computer is the system check most games run at startup: it says "hey, there's no pixel shader 2.0," and you don't get to play, when in fact you can still play the game without the shaders; things just look pixelated.

  • drbaltazar, Member Uncommon, Posts: 7,856

    You're right on the shader part; often, if it's deactivated, you'll still get very nice gameplay.

    Good luck trying to get game makers to deactivate those, though.

    On a side note, graphics cards are so cheap nowadays; check your local listings on eBay and you can find a Shader Model 3.0 card for $30.

    Hell, they haven't even moved to 64-bit, and that was released what, seven or eight years ago!

  • soap46, Member, Posts: 169

    The time and money involved for game devs to make sure their game runs on every 10-year-old POS Dell out there is just too great, as well as just plain impractical.  The fact is that most devs just expect gamers to have upgraded their rig in the last, I don't know, 5 years...  If you play PC games at all and have a machine that can't utilize Shader Model 2.0 at the very least, then it's time for a massive upgrade.  I shudder to think what else is in that machine...

  • SnarlingWolf, Member, Posts: 2,697

    It's not a scam, it's a video card limitation. If they are using features that require a current DirectX and a newer graphics card, then that's all it is. Companies don't always want to be held back in what they can do just because some people haven't updated their video cards in six years. Oftentimes the newer shader technology allows more options while being more efficient (for example, DirectX 10 finally made DirectX almost as efficient as OpenGL).

     

    Depending on the system, they could do one of two things: either fall back to software mode when the user's card lacks the feature (but a machine old enough to lack it would die trying to use all of its computing and memory power to mimic the feature), or allow the user to disable it. And depending on how deep into the engine the feature is programmed (if it's a low-level system that much of the graphics is derived from), the option to turn it off goes out the window. There's a rough sketch of that startup decision at the end of this post.

     

     

    No conspiracy, just the continued evolution of PC gaming. Most companies don't force their users onto a technology until research shows that 75-80% of the market already has it in their systems. So most developers actually do a good job of targeting slightly older technology to keep their games widely usable. You just stayed un-upgraded long enough that you fell into the small percentage that gets cut off until they upgrade.
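    To make that concrete, here is a minimal sketch (my own illustration, not code from any actual game) of the kind of startup capability check being described, assuming a Direct3D 9 title on Windows with the D3D9 SDK headers available; the messages and the fallback choice are made up for the example:

        // check_shader_caps.cpp - illustrative sketch of a startup pixel shader check.
        // Assumes d3d9.h and d3d9.lib are available; not any particular game's code.
        #include <d3d9.h>
        #include <cstdio>

        int main()
        {
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) {
                std::printf("Direct3D 9 runtime not available.\n");
                return 1;
            }

            // Ask the default adapter what capabilities it reports.
            D3DCAPS9 caps = {};
            d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

            if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
                std::printf("Pixel Shader 2.0 found - use the normal shader path.\n");
            } else {
                // This branch is where the developer has to pick: refuse to launch
                // (the check the OP is complaining about), or ship and test a
                // separate fixed-function/software fallback path.
                std::printf("No Pixel Shader 2.0 - refuse to launch or fall back.\n");
            }

            d3d->Release();
            return 0;
        }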

  • Xerith, Member, Posts: 970

    It's based on the engine the game is built upon; it's not some random decision the game makers reach one day in a vaulted room 100 feet below the surface of the Earth, dressed in their best villain costumes.

    Pixel shading is now supported by all graphics cards; the older ones will have 2.0 while the newer ones support up to 5.0. Buy a 30-dollar card off Newegg and I promise you it will have at least 2.0 on it.

    You can use programs like 3D Analyze to move the workload off your GPU onto your CPU; however, if your CPU is also lacking, all this will cause is massive bottlenecking and you will get constant lockups while trying to play. I'm not sure what card you have, but video cards as far back as Nvidia's original 6 series have 3.0 support, so I'm guessing you are running a toaster oven.

  • banthis, Member, Posts: 1,891

    I think it's BS that people expect things to progress but also expect technology from the stone ages to be supported.  It's actually far more challenging, not to mention taxing and a waste of money, to make games accept every single possible configuration.  Sure, some games could offer a way to turn it off, but honestly some games just look frigging terrible without the shaders turned on. And even when they do let you turn it off, your computer may choke anyway, since for SOME effects it's actually far more taxing on your system to run without shaders.

    Shader Model 2.0 has been around for years... I mean frigging years... it was new when I was still in college over five years ago.   Get a new video card, or at least something "newer"; you don't have to have the newest of the new for Shader 2.0.

    Your computer sux and you blame the game industry.   That's really frigging hilarious, considering games are not the only things that use shaders, support shaders, and drive the technology industry; they're just the one with the most eyes on it at all times.

    If you want to be a PC gamer, accept that technology changes and sometimes you have to update.   You don't have to spend a lot of money to stay up to date; you only have to spend a lot to stay at the bleeding edge, which is best left for the hardcore and those with cash and nothing better to do.

    Actually, if you want to be a gamer, period, you have to accept that.  Consoles are just as bad; they eventually stop supporting old games completely.

  • drbaltazar, Member Uncommon, Posts: 7,856

    And another thing: once in a while a technology comes along that's so versatile it's easy for everybody to adopt, and that speeds up adoption.

    Things like Windows 7 64-bit,

    or DX11, which works with ALL PREVIOUS ITERATIONS OF DX.

    Don't get me wrong though, very soon MMOs will be made to work on Linux. Why? China.

    It's one of the biggest markets for MMOs. Do you believe 1.5 billion people will buy Microsoft?

    Nope, not a chance. They'll support the free option, and Linux is so close to being as mass-market ready as any other OS

    that markets like China and India will tweak Linux the little bit it's missing and build their MMOs on it.

    So Microsoft and Mac only get a small respite, because Linux isn't MMO-ready yet,

    but I bet right now some engineers are working on a Linux version of MS Donnybrook.

  • Thomas2006, Member Rare, Posts: 1,152

     Wow, a GeForce2, and an MX card at that...  Oh come on man, time to get an upgrade. That's like me pulling out the old Voodoo2 and expecting it to run World of Warcraft or any other game made within the last seven years. It's not going to happen.

  • ForceQuit, Member, Posts: 350

    I'm sorry OP, but you do not have a clear understanding of PC technology.  There are sometimes crummy business practices in the industry, but hardware requirements are not a scam.  The fact is, every game is going to have certain minimum requirements, or its design objectives cannot be met.  Very few modern games can run well, if at all, on ten-year-old hardware, and that is a physical limitation, not a business one.

  • Kaelaan21, Member Uncommon, Posts: 349

    A shader is basically a small program, a function, that is applied as a processing effect. The objects used in the game world are pretty much static and cannot change their shape or properties without a severe performance hit, which is why older 3D games looked rigid and stiff. Shaders give the illusion that in-game objects can be modified in real time. Cloth simulation, lighting effects, weather, and spaceship shields are all examples of shader use. (There's a tiny sketch of the idea at the end of this post.)

    (EDIT: The above is oversimplified, of course.)

     

    Each DirectX shader generation adds new capabilities for programmers to take advantage of, giving even more flexibility. The trade-off is that a shader written for a newer model won't run on hardware that only supports an older one. So if you write your game's shaders for Shader Model 2.0 and you also want the game to work on older shader models, you have to write the exact same shaders again in the older format. And if you want the game to also work on older versions of DirectX without shader support at all, you then have to build a software-rendered path into your engine as well.

     

    So, basically, it comes down to a company weighing profit against development hours when deciding how wide a range of hardware to support. To support older hardware, the programmers need to write the same functions multiple times, once for each version they support. Not only is that a waste of resources on certain games, it also causes version issues, where some hardware gives a slightly different look than other hardware, and that makes debugging a nightmare.
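    To give a feel for what "a small function applied per pixel" means, here is a toy CPU-side sketch of my own (not from any real engine; actual shaders are written in HLSL or GLSL and run on the GPU), just to show the kind of work a pixel shader does:

        // shade_pixel.cpp - toy illustration of "a shader is a small per-pixel
        // function". A real pixel shader runs on the GPU for millions of pixels
        // per frame; this only demonstrates the idea.
        #include <algorithm>
        #include <cstdio>

        struct Vec3 { float x, y, z; };

        static float dot(const Vec3& a, const Vec3& b)
        {
            return a.x * b.x + a.y * b.y + a.z * b.z;
        }

        // Per-pixel inputs: surface normal, direction toward the light, base color.
        // Output: the lit color for that pixel (simple Lambert diffuse term).
        static Vec3 shadePixel(const Vec3& normal, const Vec3& toLight, const Vec3& baseColor)
        {
            float diffuse = std::max(0.0f, dot(normal, toLight));
            return { baseColor.x * diffuse, baseColor.y * diffuse, baseColor.z * diffuse };
        }

        int main()
        {
            Vec3 normal  = { 0.0f, 1.0f, 0.0f };       // surface facing straight up
            Vec3 toLight = { 0.0f, 0.7071f, 0.7071f }; // light at a 45-degree angle
            Vec3 base    = { 0.9f, 0.5f, 0.2f };       // orange-ish material

            Vec3 lit = shadePixel(normal, toLight, base);
            std::printf("lit pixel: %.2f %.2f %.2f\n", lit.x, lit.y, lit.z);
            return 0;
        }

    Supporting an older shader model (or no shaders at all) means writing a second version of the same effect under that model's limits and testing both, which is exactly the duplicated effort described above.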

  • Gravarg, Member Uncommon, Posts: 3,424

    I have no problem with them advancing graphics and all that.  The fact is that with a program like 3D Analyze or SwiftShader, my old computer can still run today's games.  The problem I have is that I have to use a third-party program just to get rid of the stupid shader check.  I have a new system with dual GeForce GTX 295 cards.  I just wondered why the heck shaders are a requirement to play, when on my old rig, with a program that disables their shader checks, I could play Age of Conan if I just added another 256 MB of RAM.  Granted, the graphics are pixelated and not even comparable to my new computer, but it's still playable.  I figured having a game that could be played on as many computers as possible would be a good upside for profits.

  • Gdemami, Member Epic, Posts: 12,342


    Originally posted by Gravarg
    I have no problem with them advancing graphics and all that.  The fact is that with a program like 3D Analyze or SwiftShader, my old computer can still run today's games.  The problem I have is that I have to use a third-party program just to get rid of the stupid shader check.  I have a new system with dual GeForce GTX 295 cards.  I just wondered why the heck shaders are a requirement to play, when on my old rig, with a program that disables their shader checks, I could play Age of Conan if I just added another 256 MB of RAM.  Granted, the graphics are pixelated and not even comparable to my new computer, but it's still playable.  I figured having a game that could be played on as many computers as possible would be a good upside for profits.

    It is not as simple as adding an on/off button.

    Supporting various shader models is the same as supporting different platforms - it requires additional coding and testing. At some point, this effort becomes inefficient.

  • Kaelaan21, Member Uncommon, Posts: 349
    Originally posted by Gravarg


    I have no problem with them advancing graphics and all that.  The fact is that with a program like 3D Analyze or SwiftShader, my old computer can still run today's games.  The problem I have is that I have to use a third-party program just to get rid of the stupid shader check.  I have a new system with dual GeForce GTX 295 cards.  I just wondered why the heck shaders are a requirement to play, when on my old rig, with a program that disables their shader checks, I could play Age of Conan if I just added another 256 MB of RAM.  Granted, the graphics are pixelated and not even comparable to my new computer, but it's still playable.  I figured having a game that could be played on as many computers as possible would be a good upside for profits.



     

    Yes, but once again, it comes down to how much investment it would take to rewrite a portion of their game engine (possibly introducing bugs on specific systems) versus how much additional revenue they would gain.

     

    Those third-party apps don't always work; in fact, sometimes they simply cause the game to crash. I have an old laptop with a Radeon integrated chipset that only supports DX7, which means that for most newer games I need to trick the OS into believing it supports a later version of DirectX and shader model. The problem is that some games simply crash anyway.

     

    Also, if the shaders are used to cover up model defects, or are combined with object transparencies to give a cool effect, disabling shaders altogether could create an exploit. This is similar to those third-party programs that let you disable textures of your choosing: you can use an overlay to select a wall, disable its texture, and then see through to the other side. The in-game physics still prevent you from walking or shooting through the wall, but it gives you the jump on your opponent when you can see him or her through it. These programs are usually treated as hacks and are targeted by protection programs such as GameGuard and PunkBuster.

  • Haegemon, Member Uncommon, Posts: 267

    If the new game was graphically stripped down, sure, almost anything could run on anything.

     

    But the real crux of this is that developers spec these things so they can present their game to the player looking as good as possible on the hardware available.

    Those min specs asking for a specific shader model are there because that is the level of graphics the developers designed for, intended, and optimized.  More so with PC hardware than with MMO playstyles, you design around a lowest common denominator. If you set it too low, the top end suffers; too high, and the low end suffers. Right now, that MX400 is well below that "mass-market min spec."

     

    This also feeds into how the game gets reviewed and presented to the public. Take a game built to run on more modern hardware, then force-disable a lot of the new graphical effects just to accommodate antiquity-grade hardware. Now the game looks and runs amazingly at its recommended specs, but because of those min specs, some media outlet will inevitably do a full review on the lowest settings. Then you'll get complaints about releasing such an ugly, messy, unplayable game because you didn't develop graphical replacements for the old-hardware version versus the standard one.

     

    Scam, no. There are perfectly legitimate reasons in the gaming market for forcing obsolescence of old hardware.

    All about the money either way? Yeah, they're a business after all. That's the endgame no matter how you cut it.

     

    Let's Push Things Forward

    I knew I would live to design games at age 7, issue 5 of Nintendo Power.

    Support games with subs when you believe in their potential, even in spite of their flaws.

  • Kruul, Member Uncommon, Posts: 482

    enlighten yourself

     

    http://en.wikipedia.org/wiki/Shader

  • Gravarg, Member Uncommon, Posts: 3,424

    They don't even need an on/off switch for shaders.  You can play a game without shaders even when the game uses them; that's the main point of this entire thread.  The graphics aren't the best, but you get what you pay for.  Shaders aren't like CPU power or RAM, where you must have enough or the game crashes.  Playing without shaders when the game uses them does nothing except make things look pixelated, kind of like playing a PS1 game.  If they took out the check for shader support on the GPU, that's actually less code for them to write.  I know that some companies don't want players to log in with below-requirement computers, because it affects the experience of playing (like the graphics).  However, I don't really care about graphics much.  Gameplay is key for me, and if you can't play the game because they added a simple shader check, well, that kills gameplay :)  The only thing making shaders required is the check installed by the game developers, and you can get around it with third-party programs.  That's why I'm able to play games like Warhammer Online, Atlantica Online, and Age of Conan (I dug around and found two 512 MB RAM sticks :) yay!) on a machine that's 10 years old.  I was surprised that I could run Age of Conan; AoC's graphics are amazing on my new computer.  The only problem is my CPU doesn't meet the requirements, so city areas and load screens are choppy and kind of long, but it's still playable (which is my whole point).

     

    I guess my language in the thread topic is kind of over the top.  It's not so much a scam as it is an unnecessary requirement.

     

    I should add that the only reason I actually care about this is that I'm using my old computer to dual-box with my new one, so the graphics could be black and white for all I care :)  I just need to be able to use my healbot :)

  • Kaelaan21, Member Uncommon, Posts: 349
    Originally posted by Gravarg


    They don't even need an on/off switch for shaders.  You can play a game without shaders even when the game uses them; that's the main point of this entire thread.  The graphics aren't the best, but you get what you pay for.  Shaders aren't like CPU power or RAM, where you must have enough or the game crashes.  Playing without shaders when the game uses them does nothing except make things look pixelated, kind of like playing a PS1 game.  If they took out the check for shader support on the GPU, that's actually less code for them to write.  I know that some companies don't want players to log in with below-requirement computers, because it affects the experience of playing (like the graphics).  However, I don't really care about graphics much.  Gameplay is key for me, and if you can't play the game because they added a simple shader check, well, that kills gameplay :)  The only thing making shaders required is the check installed by the game developers, and you can get around it with third-party programs.  That's why I'm able to play games like Warhammer Online, Atlantica Online, and Age of Conan (I dug around and found two 512 MB RAM sticks :) yay!) on a machine that's 10 years old.  I was surprised that I could run Age of Conan; AoC's graphics are amazing on my new computer.  The only problem is my CPU doesn't meet the requirements, so city areas and load screens are choppy and kind of long, but it's still playable (which is my whole point).
     
    I guess my language in the thread topic is kind of over the top.  It's not so much a scam as it is an unnecessary requirement.
     
    I should add that the only reason I actually care about this is that I'm using my old computer to dual-box with my new one, so the graphics could be black and white for all I care :)  I just need to be able to use my healbot :)



     

    I know exactly what you are getting at, but as I pointed out before, a lot of games will not work when you try disabling the shaders. This is simply because one part of the engine may be expecting results that the shader produces.

     

    Also, it can have other consequences in the form of cheating. Let's say you're playing a game and you happen to walk in front of a building with smoky plate-glass windows. The game doesn't allow you to break the glass, but there could be a player on the other side. The glass itself is an object with a nearly invisible texture (pure alpha channel, with maybe a hint of a dark color); by itself, you can clearly see through it. However, with a shader, the devs can add the effect of a semi-glossy reflection of yourself and the scenery behind you in the window. In addition, the shader could use an animated texture map so that as you walk by the glass, it has that smoky, realistic look you can only barely see through.

     

    If you use a program to disable shaders within the game, you can see directly through the glass, no problem. That was not the developers' intent, and the player on the other side of the glass doesn't know you can do it. You still can't break the glass, but you can sure as hell ambush your opponent with ease in that situation. If the developers were going to let you shut off parts of the engine in such a scenario, they would need to make another version of the same glass window and have it solid grey, so a player without shader support could play but not have a serious advantage. They would also need to do this for every similar effect throughout the entire game, and it's not worth it for the money it would bring in. (There's a toy example of the blending involved at the end of this post.)

     

    As mentioned before, the above tactic is used ALL THE TIME for cheating in FPS games. A lot of people stopped playing BF2 because of hacked video drivers that let players disable wall textures. You can't shoot through the walls, but it's much easier to compete if you have x-ray vision on all the time.

     

    I remember a cheat in Lineage 2 that would allow a player to zoom way out and look straight down. It let you pinpoint exactly where your enemies were by viewing them in something similar to a real-time overhead map.

     

    There are trade-offs, and if you are caught turning off DirectX features in any networked game, I would not be surprised if someone accused you of cheating, even if you weren't. (http://www.youtube.com/watch?v=1EJXTh4HMAE)
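    For anyone curious, here is a toy illustration of the glass example; the numbers are invented for the sketch and aren't from any actual game. The glass texture itself is nearly pure alpha, and it's the shader pass that layers a reflection over it so you can't see through:

        // glass_blend.cpp - toy illustration of why skipping the shader pass on
        // "smoky" glass lets you see straight through it. All values are made up.
        #include <cstdio>

        struct Color { float r, g, b, a; };

        // Standard "over" alpha blend of src on top of dst.
        static Color blendOver(const Color& src, const Color& dst)
        {
            float a = src.a;
            return { src.r * a + dst.r * (1.0f - a),
                     src.g * a + dst.g * (1.0f - a),
                     src.b * a + dst.b * (1.0f - a),
                     1.0f };
        }

        int main()
        {
            Color playerBehindGlass = { 0.8f, 0.2f, 0.2f, 1.0f };  // what's on the other side
            Color glassTexture      = { 0.1f, 0.1f, 0.1f, 0.15f }; // nearly pure alpha tint
            Color reflection        = { 0.6f, 0.6f, 0.7f, 0.75f }; // added by the shader pass

            // With the shader: the reflection is layered over the faint glass tint,
            // so the pixel is dominated by the reflection and the player is hidden.
            Color withShader = blendOver(reflection, blendOver(glassTexture, playerBehindGlass));

            // Shader pass disabled: only the nearly transparent base texture is drawn,
            // so whoever is behind the glass shows straight through.
            Color withoutShader = blendOver(glassTexture, playerBehindGlass);

            std::printf("with shader:    %.2f %.2f %.2f\n", withShader.r, withShader.g, withShader.b);
            std::printf("without shader: %.2f %.2f %.2f\n", withoutShader.r, withoutShader.g, withoutShader.b);
            return 0;
        }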

  • majinant, Member Uncommon, Posts: 418

    It's all a scam. It's a scam that I can't play this game with my 128 MB of SDRAM! It's a scam that I can't play with my 800 MHz P3! It's a scam that I can't play this game with my 64 MB GeForce MX440!

    Lol no. Hardware advances and so do games. They cannot continue to cater to people who haven't upgraded in years.



  • forinboy, Member Uncommon, Posts: 89
    Originally posted by Gravarg


    I find it weird how my old computer can run games like Warhammer Online, WoW, Guild Wars, and some others, but some games tell me I can't play because they require Shader Model 2.0. I have a GeForce2 MX400 in it. I think it's a whole money thing with the gaming industry. Warhammer Online and Guild Wars both use shaders (there's a video option to adjust them). I tried Age of Conan, and it starts up, then crashes about five seconds after reaching the login screen due to lack of memory, but it still starts!

    So why is it that some games that use shaders I can play, and others I can't? It's fishy. I know that shaders actually do something; they make the game look a lot more realistic, but requiring them is a scam. In the games listed above, an effect that uses a shader on my new computer just looks pixelated on my old computer, but it's still very playable; it just doesn't look as good.
     
    Now I know you're asking, where's the scam part? Well, I'll tell ya! The industry comes out with these new technologies that soon become required to play their games (not all, but something like 90% of games in the past five years require shader support). Thus you have to spend money upgrading your GPU in order to play, when otherwise you could be using a 10-year-old computer (I got my old computer back in Feb '00) to play modern games.

    Please lock this for utter stupidity or trolling, whichever works.

     

     
