
Anyone else think we've reached our limit on graphics?

wankydrakewankydrake Member Posts: 43

After looking at Crysis 2 screenshots, I don't see graphics really getting better than that in the next year and beyond. I know a lot of small details could get improved, but unless the whole 3D glasses thing catches on, I just can't see it getting higher than where we are now.


Comments

  • MehveMehve Member Posts: 487

    Short answer: No.

    What's limiting companies these days are 1) cost of development, and 2) average processing power of target customers. Better graphics cost more money/time to design, more money to optimize, and require more powerful equipment that reduces the number of people who can buy/use the product.

    There are plenty of tech demos out there that blow away even what's been coming out for Crysis, never mind what the professional animation/graphics studios have been churning out for motion pictures. That stuff just can't be generated anywhere near real time, which makes it infeasible for gaming purposes.

    A Modest Proposal for MMORPGs:
    That the means of progression would not be mutually exclusive from the means of enjoyment.

  • drbaltazardrbaltazar Member UncommonPosts: 7,856

    lol, the issue isn't graphics. As Intel showed recently in Asia, the limiting factor in MMOs is the fact that all that data has to go somewhere.

    Oh, the solutions are there (a version of remote differential compression applied to MMOs might help).

    Microsoft's Donnybrook will help, if devs get permission from the game maker to use it as part of the game design.

    But all of those are still band-aids. We had virtual reality in the '90s in net cafés (no, you're right, it wasn't online, it was local).

    But the idea is the same: put on a motorcycle helmet, imagine the chromed glass visor on it is the view screen, and that's pretty much what we had: full 3D motion sensing that detected when you turned your head (imagine it being the A & D keys to turn).

    If you put today's new small 3D glasses together with that 3D screen helmet we had back then, it would be very good tech, and fairly easy to do with today's technology. We just didn't have the tech available in the '90s.

    But one bottleneck remains: the internet! So the limit you perceive is only there because devs are asked to stay within it.

    Check: we're still fully in the DX9 era. Not many games will ever see DX10, even fewer will ever bother with DX11, Donnybrook, or remote differential compression. All these technologies help, but only if your application is designed to take advantage of them.

    If it's not designed with them in mind, everything is slower. Note that Microsoft suggests leaving remote differential compression on, on the off chance you meet a site that uses it. True, some sites are designed around it and take far fewer resources at both ends, but on average it's better disabled!
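The remote differential compression idea mentioned above boils down to hashing data in chunks and only sending the chunks the other side doesn't already have. A toy sketch of that principle (fixed-size chunks and SHA-256 are my own simplifications here; real RDC uses content-defined chunk boundaries):

```python
import hashlib

def chunk_hashes(data: bytes, size: int = 64) -> list:
    # Split the data into fixed-size chunks and hash each one.
    return [hashlib.sha256(data[i:i + size]).hexdigest()
            for i in range(0, len(data), size)]

def chunks_to_send(old: bytes, new: bytes, size: int = 64) -> list:
    # Indices of chunks in `new` that a receiver holding `old` lacks.
    have = set(chunk_hashes(old, size))
    return [i for i, h in enumerate(chunk_hashes(new, size)) if h not in have]
```

If only one chunk of a big game patch changed, only that chunk crosses the wire; that's the bandwidth saving being talked about.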

  • WickedjellyWickedjelly Member, Newbie CommonPosts: 4,990

    No

    1. For god's sake mmo gamers, enough with the analogies. They're unnecessary and your comparisons are terrible, dissimilar, and illogical.

    2. To posters feeling the need to state how f2p really isn't f2p: Players understand the concept. You aren't privy to some secret the rest are missing. You're embarrassing yourself.

    3. Yes, Cpt. Obvious, we're not industry experts. Now run along and let the big people use the forums for their purpose.

  • cpc71783cpc71783 Member Posts: 45

    We're not even close to having reached our limit. What we face right now is that companies like ATI and nVidia are in competition every year to see who can crank out more polys, and the problem with this is that it takes these monstrous, power-hungry video card processors to handle those polys. We're essentially limiting ourselves, and these companies are making a lot of money every year by requiring that gamers buy new graphics cards in order to keep up--in order to process those few extra polys.

    The main problem with this is that polygons are limited in the ways they can be processed and still LOOK good. We have to use things like anti-aliasing and anisotropic filtering to make graphics built from polygons look even halfway decent, and it's a total failure if you don't have a high-end graphics card.

    There's new technology in the works to do away with polys altogether, and the extremely expensive video cards along with them, but do you think nVidia and ATI will allow that to happen? Probably not until they figure out a way to make money off of the next technology.

    Graphics that LOOK ten times better and require very little processing could be run mostly in software and require next to nothing from a video processor standpoint. It's just like our current vehicles and the need for OIL. Oil companies aren't going to let alternative means of transportation ruin their current monopoly, all at the cost of hindering our technological advancement.

    Down with the man, the power, and the greed!

  • cpc71783cpc71783 Member Posts: 45

    Originally posted by zymurgeist

    Originally posted by cpc71783

    [quoted post snipped; see above]

    Still waiting on my 200 MPG carburetor.

    Trust me, if you could run decent graphics on an Intel GMA video chip, someone would be doing it.

    There are no miracle technologies. Everything is a development of something that already existed.

    You sure about that? Check out vexels (essentially pixels) and their use in Unlimited Detail: unlimited detail being produced without the need of a video card.

    Unlimited Detail uses pixels, or vexels, instead of polygons, and renders an unlimited amount of detail in a virtual world. It does this by only rendering what is visible on the screen, instead of rendering the entire world around you all at the same time.

    Eat your words.
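The "only render what is visible on the screen" claim above is, at its core, visibility culling. A minimal 2D sketch of that general idea (purely illustrative; this is not Unlimited Detail's actual, unpublished algorithm):

```python
import math

def visible(points, cam_x, cam_y, cam_angle, fov_deg):
    # Keep only the points inside the camera's field of view;
    # everything outside it is skipped and never processed.
    half = math.radians(fov_deg) / 2
    kept = []
    for x, y in points:
        ang = math.atan2(y - cam_y, x - cam_x)
        # Smallest signed difference between the two angles.
        diff = (ang - cam_angle + math.pi) % (2 * math.pi) - math.pi
        if abs(diff) <= half:
            kept.append((x, y))
    return kept
```

However detailed the world is, the per-frame work then scales with what survives the cull, not with the whole scene.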

  • SovrathSovrath Member LegendaryPosts: 32,014

    Originally posted by cpc71783

    [nested quotes snipped; see the exchange above]

    You sure about that? Check out Vexels (essentially pixels) and their use in UNLIMITED detail. Unlimited detail being produced without the need of a video card.

    Unlimited detail uses pixels, or vexels instead of polygons, and renders an unlimited amount of detail in a virtual world. It does this by only rendering what is visible on the screen, instead of rendering the entire world around you all at the same time:

    Eat your words.

    The problem I see with Unlimited Detail is that developers will need to support it. I think in one of the interviews they said they had approached some video card developers, but those developers didn't seem interested.

    They need to make it so good that developers couldn't imagine NOT including this tech.

    Like Skyrim? Need more content? Try my Skyrim mod "Godfred's Tomb." 

    Godfred's Tomb Trailer: https://youtu.be/-nsXGddj_4w


    Original Skyrim: https://www.nexusmods.com/skyrim/mods/109547

    Try the "Special Edition." 'Cause it's "Special." https://www.nexusmods.com/skyrimspecialedition/mods/64878/?tab=description

    Serph toze kindly has started a walk-through. https://youtu.be/UIelCK-lldo 
  • drbaltazardrbaltazar Member UncommonPosts: 7,856

    Originally posted by cpc71783

    [nested quotes snipped; see the exchange above]

    You sure about that? Check out Vexels (essentially pixels) and their use in UNLIMITED detail. Unlimited detail being produced without the need of a video card.

    Unlimited detail uses pixels, or vexels instead of polygons, and renders an unlimited amount of detail in a virtual world. It does this by only rendering what is visible on the screen, instead of rendering the entire world around you all at the same time:

    Eat your words.

    Hasn't Intel sent some of their top people to look at that tech? (See, if it was good enough, they were bent on buying it!)

    This would be a big thing for Intel, since they've got the manpower and the resources to push off ATI and nVidia (Intel being laughed at by those two didn't help either).

  • dzikundzikun Member Posts: 150

    Have we reached our limit in graphics? FAAAR from it, friend...

    I've been uplinked and downloaded, I've been inputted and outsourced. I know the upside of downsizing, I know the downside of upgrading.

    I'm a high-tech low-life. A cutting-edge, state-of-the-art, bi-coastal multi-tasker, and I can give you a gigabyte in a nanosecond.

    I'm new-wave, but I'm old-school; and my inner child is outward-bound.

    I'm a hot-wired, heat-seeking, warm-hearted cool customer; voice-activated and bio-degradable.

    RIP George Carlin.

  • drbaltazardrbaltazar Member UncommonPosts: 7,856

    Originally posted by Sovrath

    [nested quotes snipped; see the exchange above]

    The problem I see with Unlimited Detail is that developers will need to support it. I think in one of the interviews they said they had approached some video card developers, but those developers didn't seem interested.

    They need to make it so good that developers couldn't imagine NOT including this tech.

    Yeah, and the fact that the presentation isn't in HD doesn't help them either, if they want to be taken seriously!

  • Blackfire1Blackfire1 Member UncommonPosts: 116

    "Short answer: No.

    What's limiting companies these days are 1) cost of development, and 2) average processing power of target customers. Better graphics cost more money/time to design, more money to optimize, and require more powerful equipment that reduces the number of people who can buy/use the product.

    There are plenty of tech demos out there that blow away even stuff like what's been coming out for Crysis, nevermind what the professional animation/graphics studios have been churning out for motion pictures. The stuff just can't be generated anywhere near realtime, which makes them infeasible for gaming purposes."

     

    This is the BEST reply I've ever seen. This thread should be closed.

     

  • ZenNatureZenNature Member CommonPosts: 354

    Originally posted by wankydrake

    After looking at Crysis 2 screenshots, I don't see graphics really getting better than that in the next year and beyond. I know a lot of small details could get improved, but unless the whole 3D glasses thing catches on, I just can't see it getting higher than where we are now.

    Haven't you ever seen Star Trek: TNG? Not to get too nerdy here, but until we see that starship's power to recreate physical 3D food and interactive entities and environments on the holodeck, we're not even close to the limit of 'graphics' processing. 3D glasses are soon going to be a forgotten stepping stone to new technology, kinda like records, VCRs, pagers, etc.

    I think we'd have to be able to recreate the human eye and every synapse in the human brain that processes visual stimuli to reach our 'limit', and then we'd still probably find a way to enhance it further.

  • AercusAercus Member UncommonPosts: 775

    Consider this..

    Moore's Law suggests that computing power doubles roughly every 18 months. Last I saw, the average lifespan of a household computer was 3-4 years, depending on country. Major technology upgrades (as in, previous tech becoming redundant) happen about every 3-4 years. I don't know the increase in bandwidth per year, but I would assume it is growing too.

    Couple these things together, and in 3-4 years games can be designed with today's highest-end machines as the medium setting. There's still plenty of room before we hit maximum technical feasibility in graphics, and with increasing computing power we will see improved graphics in upcoming games, just as we have for the last 35 years.
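The compounding above is easy to make concrete. Assuming the commonly quoted doubling every 18 months (a rule of thumb, not a guarantee):

```python
def relative_power(years: float, doubling_period: float = 1.5) -> float:
    # Computing power relative to today after `years` of steady doubling.
    return 2 ** (years / doubling_period)
```

relative_power(3) gives 4x and relative_power(4.5) gives 8x, which is why today's high-end machine plausibly lands on the medium setting a few years out.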

  • drbaltazardrbaltazar Member UncommonPosts: 7,856

    Originally posted by Aercus

    [quoted post snipped; see above]

    Mm! BLIZZARD? HEY, BLIZZARD? YES, YOU, UP IN THE TOWER! DID YOU SEE THIS? YOU'RE LATE AS USUAL! Don't you think it's time you did better? You've been using this old dog for what, 6 years? Four generations of computers, and you still use that 2004 old dog! What's going on?!

    Grin! I guess someone forgot to e-mail Blizzard that it's been 4 generations of computers since they launched WoW!

    Man, 6 years already!

  • MeromorphMeromorph Member Posts: 75

    3D is just a gimmick, but there will always be room for graphics to improve as long as I can tell the difference between a monitor and a window.

  • AercusAercus Member UncommonPosts: 775

    Originally posted by Meromorph

    3D is just a gimmick, but there will always be room for graphics to improve as long as I can tell the difference between a monitor and a window.

     Some THC should fix that for ya... ;)

  • ForceQuitForceQuit Member Posts: 350

    Originally posted by cpc71783

    [nested quotes snipped; see the exchange above]

    You sure about that? Check out Vexels (essentially pixels) and their use in UNLIMITED detail. Unlimited detail being produced without the need of a video card.

    Unlimited detail uses pixels, or vexels instead of polygons, and renders an unlimited amount of detail in a virtual world. It does this by only rendering what is visible on the screen, instead of rendering the entire world around you all at the same time:

    Eat your words.

    You are correct in that polygons will go the way of the dodo.  You are incorrect, however, that the technology in the link you provided will be its replacement.  Unlimited Detail Technology is more snake oil than anything else.

    Future rendering technologies will abandon proprietary APIs like DirectX and OpenGL altogether and embrace technologies like sparse voxel octrees to replace polygons.  And in the interim, before we can properly abandon raster rendering and move over to ray tracing, there will likely be a hybrid approach.

    This article over at PC Perspective is an interview with John Carmack where he discusses the many different approaches to our future graphics technology.
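For a sense of what ray tracing actually computes per pixel, the core operation is an intersection test between a ray and scene geometry. The textbook ray-sphere case (a generic sketch, not taken from the Carmack interview):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for t; a hit needs a real, non-negative root.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                     # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearer root first
    if t < 0:
        t = (-b + math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None
```

A renderer fires one or more such rays per pixel and shades whatever they hit first, which is why ray tracing remains so expensive in real time.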

  • JuKuGJuKuG Member Posts: 14

    Have u ever smoked weed?

     

    And no... not even near the limit.

  • AercusAercus Member UncommonPosts: 775

    Originally posted by JuKuG

    Have u ever smoked weed?

     

    And no... not even near the limit.

     I assume the question was directed at me.. Luckily, Amsterdam is put to shame here, and it does help immersion a lot :)

  • Death1942Death1942 Member UncommonPosts: 2,587

    Originally posted by Aercus

    [quoted post snipped; see above]

    True, but I think the biggest thing will always be bandwidth.

    Sending data from one side of the world to the other will always reduce performance and force developers to cut back on things, and it won't ever be fixed (it will only get faster and better, never fixed).

    MMO wish list:

    -Changeable worlds
    -Solid non level based game
    -Sharks with lasers attached to their heads

  • AercusAercus Member UncommonPosts: 775

    Originally posted by Death1942

    Originally posted by Aercus

    [quoted post snipped; see above]

    true but i think the biggest thing will always be bandwidth.

    Sending data from one side of the world to the other will always reduce performance and force developers to cut back on things and it wont ever be fixed (it will only get faster and better, never fixed).

    True, to a certain extent it's always the slowest ship that dictates the speed of the convoy. Then again, the first time I played an MMO I was on ISDN dial-up; now I have 10 Mbit...

  • DaywolfDaywolf Member Posts: 749

    Originally posted by cpc71783

    [quoted post snipped; see above]

    For texturing? Hmmm... don't know about that. Good for image design, but I may be missing its application for real-time 3D rendering engines. We have LOD and anti-aliasing for smoothing. Also, some shaders work well for smoothing objects, like taking low-polygon models and making them look like high-polygon models. We need lighting improvements more than anything, imo.

    --

    Anyhoot, we're not at the wall for sure; high-resolution art/rendering systems are always improving. See that silly movie called Avatar? It's just a matter of making it work for real-time 3D gaming engines. And realism isn't so popular with gamers these days, it seems: all the clatter about how great the graphics in games like WoW are, and games take 3-5 years to produce, so we see the backlash of it now. Devs listen, so you get cartoony stuff, I guess, and less dev time goes into anything really cutting-edge for realism.

    M59, UO, EQ1, WWIIOL, PS, EnB, SL, SWG. MoM, EQ2, AO, SB, CoH, LOTRO, WoW, DDO+ f2p's, Demo’s & indie alpha's.

  • elusivexelusivex Member Posts: 86

    Originally posted by Daywolf

    [quoted post snipped; see above]

    If anyone ever said WoW has good graphics, they need a cold, hard dose of reality. The game's graphics are only just good enough to be playable for me. They do age well, but are very cartoony. I also agree we are not even close to the graphics wall.

    A man or "gamer" should look for what is, and not for what he thinks should be.

  • VhalnVhaln Member Posts: 3,159

    Originally posted by wankydrake

    After looking at Crysis 2 screenshots, I don't see graphics really getting better than that in the next year and beyond. I know a lot of small details could get improved, but unless the whole 3D glasses thing catches on, I just can't see it getting higher than where we are now.

    Seems some of the replies have gotten off track here, because I don't think the question is about what's viable, but rather: are graphics currently so awesome that they're as realistic as we can even imagine?

    I think we're just used to overlooking some imperfections, but 10 years from now we'll look back and those imperfections will seem glaring. It also depends on what's being rendered. Some things are easier to do realistically than others. The human face, for example, still has a ways to go before skin looks real, expressions look real, and animations include the elasticity of flesh, rather than just polygons attached to bones.

    There's some great stuff out there, but until you can't actually tell the difference between what's real and what's computer-generated (even for the most difficult things), there will still be room for improvement.

    When I want a single-player story, I'll play a single-player game. When I play an MMO, I want a massively multiplayer world.

  • negentropynegentropy Member Posts: 241

    Originally posted by wankydrake

    After looking at Crysis 2 screenshots, I don't see graphics really getting better than that in the next year and beyond. I know a lot of small details could get improved, but unless the whole 3D glasses thing catches on, I just can't see it getting higher than where we are now.

    Reached the limit on graphics? No

    Reached the limit on creativity? Yes

    A fanatic is one who can't change his mind and won't change the subject. -Winston Churchill
  • DaywolfDaywolf Member Posts: 749

    Originally posted by elusivex

    [quoted post snipped; see above]

    If anyone said WoW has good graphics ever they need a cold hard dose of reality.    The games graphics are only just good enough to be playable for me.  They do age well but are very cartoony.   I also agree we are not even close to the graphic wall.

    Hah, yeah, I know, but that hit about every discussion forum on the net back then. For publishers I'm sure that was a boon, since art like that could be cranked out much faster, meaning more profits. Considering that was 3-6 years ago, many of those comments made it into many design documents, no doubt, so it may not be as important to reach for realism today as it was before, at least for many games.



    Someone mentioned ray tracing; that would be very good. It's still very slow currently, but it has become faster on the application side as well as on the hardware side as computers get faster. HDRI/HDRR is also improving. The sad thing is that monitors still lack good lighting range, so even with the technology it takes quite a bit of power to work around those deficiencies. BrightSide's technology will help quite a bit when it is implemented.
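For context on HDRR: the renderer computes lighting in an unbounded range, then tone-maps it into what the monitor can actually show. The classic Reinhard operator is the simplest example (a standard textbook formula, not specific to any engine):

```python
def reinhard(luminance: float) -> float:
    # Compress [0, infinity) HDR luminance into the displayable [0, 1) range.
    return luminance / (1.0 + luminance)
```

Bright values compress smoothly toward 1 instead of clipping, which is exactly the workaround for monitors' limited lighting range.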

    M59, UO, EQ1, WWIIOL, PS, EnB, SL, SWG. MoM, EQ2, AO, SB, CoH, LOTRO, WoW, DDO+ f2p's, Demo’s & indie alpha's.
