
DirectX 11

135 Comments

  • botrytis Member Rare Posts: 3,363

    Well, A.Net is stopping support for Win XP in GW1. That suggests they are probably moving up the DX ladder.


  • seridan Member Uncommon Posts: 1,202
    Originally posted by botrytis

    Well, A.Net is stopping support for Win XP in GW1. That suggests they are probably moving up the DX ladder.

    Nope. Windows 95, 98, 98SE and Me only. They still support XP.

    Block the trolls, don't answer them, so we can remove the garbage from these forums

  • The_Korrigan Member Rare Posts: 3,459
    Originally posted by botrytis

    Well, A.Net is stopping support for Win XP in GW1. That suggests they are probably moving up the DX ladder.

    Nope, not XP. They are just stopping support for Windows 95, 98, 98SE and ME. XP remains fully supported.

    EDIT: bleh, Seridan beat me to it =P


  • athariel Member Posts: 91

    Rather than arguing DX9 vs DX10 vs DX11, the community should pressure ANet into implementing an OpenGL renderer. Not only would it improve backwards compatibility (OpenGL 4.2 works under Windows XP without problems; DX10/DX11 doesn't because "f*** you"), such a move would also enable easy Mac/Linux ports (Starcraft 2 and WoW have always had an OGL renderer, and that's why there are native versions for those operating systems).

     

    The disadvantage would be possible driver problems, but then, if no one pushes for driver quality (bug reporting), the drivers will never improve, creating a vicious circle.

     

    Also, DX11 won't make the graphics magically better. The Witcher 2 is a DX9 game, and it is one of the best-looking games of this generation.

  • Rasputin Member Uncommon Posts: 602
    Originally posted by Maggon
    Originally posted by Rasputin
    Taking your numbers at face value (links are nice), it means that if you make a DX11 version, you will accommodate only around half of the potential customer base. That means you are forced to support not only DX11, but also DX10 and/or DX9.

     

    I partly agree with this: DX9 obviously needs to be supported, but why DX10? Rather just have DX9 and DX11 supported, and not DX10 at all. You don't need all the graphical features from DX11 to benefit from it - read below.

    That is why I wrote "and/or" (see my highlight in the quote).

  • Rasputin Member Uncommon Posts: 602
    Originally posted by rav3n2
    Originally posted by Rasputin
    Originally posted by rav3n2
    Originally posted by XAPGames

    There's way too much hardware out there that still runs XP. The sweet spot, as I see it, is to support DX11 and DX9c until DX9c becomes obsolete. Now that 11 is out, I don't see any reason for 10... although there may be some.

     

    ArenaNet is quite strong on supporting old hardware. Only recently (within the last year) did they pull support for Guild Wars running on Windows 98 SE. Even now GW still supports DX 8.1 rendering hardware.

    When DX10 came out, everyone in the industry pretty much labelled it the API no one uses. Regarding your "too much hardware", I beg to differ. Not that Steam surveys are a global measure, but they are a pretty good one: last time I looked, 47% were running DX11-compliant hardware and 35-ish% DX10-compliant hardware.

    The gap between the bottom end and the high end is definitely evening out. Also, since DX10, Microsoft's move to force hardware manufacturers to be fully compliant or not supported at all has really helped push everyone toward fully compliant GPUs, which helps PC game developers greatly.

    Furthermore, because of the API updates, it is much harder to support DX9 and DX10 than DX10 and DX11; many of the interfaces are compatible between DX10 and DX11, which is not the case when you mix in DX9.

    Taking your numbers at face value (links are nice), it means that if you make a DX11 version, you will accommodate only around half of the potential customer base. That means you are forced to support not only DX11, but also DX10 and/or DX9.

    As a developer, no matter what you do, you MUST support DX9 as a baseline (unless you create the next Crysis, with a limited potential customer base). But when you must support DX9, why even go further? Why not just stay at DX9, when the quality is so good?

    Soon DX12 will come out, and one more version will have to be supported. When does it end? It must be tempting to support the baseline only, and then raise the baseline once the population has upgraded. Meaning, you make a DirectX 10 baseline when, say, 80% can use it (as 45-50% of PCs still run XP, it does not matter whether their hardware supports DX10+ or not).

    It is Microsoft's own fault. They could have designed DirectX differently, maybe more modularly, so a version could be expanded on the fly. At the very least they could have kept the number of DirectX versions down, instead of the current hailstorm of new versions, which is probably part of a planned obsolescence of hardware.

    Developers need a level of stability. A game project takes 2-5 years, and as development runs now, 2 new DX versions may come out in that timespan.

    Here is the link http://store.steampowered.com/hwsurvey/ and I was mistaken, it is actually 41% for DX11 and 36% for DX10. What I am saying is that if you support DX10, then you will still support the larger gaming population (going by the Steam survey).

    DirectX 11 can also run on DirectX 9 hardware with the lowest feature set (something that came in with DX 10.1), which means supporting DirectX 9 at all is a waste of time; it is almost a completely different paradigm from DX10 and DX11.

    It does not matter what the hardware can do. If people are still running XP (which 45-50% of PCs do), you can only run DX9.

    If you claim that DX11 automatically scales down to DX9, then you have to give me a link, because I have never heard of that.

    And in any case, how are you even going to install a game demanding DX11 - scaling or not - on a machine which does not have DX11 installed? Like all XP machines.

  • The_Korrigan Member Rare Posts: 3,459
    Originally posted by Rasputin

    It does not matter what the hardware can do. If people are still running XP (which 45-50% of PCs do), you can only run DX9.

    According to the Steam statistics, gamers seem to be mostly running Windows 7 (70%), with 10+% running Vista too. It also shows that most are already equipped for DX10, if not DX11.

    XP still has a strong following in companies, because you don't need more to run the latest Word or Excel, or even your company server. As I said, why change when it's working?

    Originally posted by Rasputin
    If you claim that DX11 automatically scales down to DX9, then you have to give me a link, because I have never heard of that.

    You are right here: DX11 does NOT "automatically" scale down to DX9 on DX9 hardware. You have to implement separate rendering paths in your software. My guess is that people who think it automatically downscales are using pre-made engines which actually do that job.


  • dellirious13 Member Posts: 205

    I don't know why this thread is a flame thread when it's so easily answered:

    Anet said that they are using DirectX 9 and 10 because they don't want to handcuff people into having to buy a completely new rig. They also said that they would probably keep the beta in DirectX 9, and probably would not have the graphics pumped up or optimized yet.

  • stragen001 Member Uncommon Posts: 1,720

    DX11 != better graphics than DX9.

    GW2 is already looking much better than many DX11 games.

    Also, The Witcher 2 was DX9 only, and that has some of the best graphics I have ever seen.

    Sorry you wasted your money on an overpowered rig/graphics card, OP :p


  • botrytis Member Rare Posts: 3,363

    TSW is DX11 and it is FUGLY compared to GW2. I realize it's a different style of art, but don't you want to put your best foot forward?

    Looking forward to seeing how good GW2 looks once they finish optimizing.


  • Raven Member Uncommon Posts: 2,005
    Originally posted by The_Korrigan
    Originally posted by Rasputin

    It does not matter what the hardware can do. If people are still running XP (which 45-50% of PCs do), you can only run DX9.

    According to the Steam statistics, gamers seem to be mostly running Windows 7 (70%), with 10+% running Vista too. It also shows that most are already equipped for DX10, if not DX11.

    XP still has a strong following in companies, because you don't need more to run the latest Word or Excel, or even your company server. As I said, why change when it's working?

    Originally posted by Rasputin
    If you claim that DX11 automatically scales down to DX9, then you have to give me a link, because I have never heard of that.

    You are right here: DX11 does NOT "automatically" scale down to DX9 on DX9 hardware. You have to implement separate rendering paths in your software. My guess is that people who think it automatically downscales are using pre-made engines which actually do that job.

    Hehe, it has nothing to do with automatic downscaling or with pre-made engines. It's about feature levels, an API feature which came in with DirectX 11: you can create a higher-level DirectX device with a lower feature level. For instance, you can create a D3D11 device with feature level 9, which lets you run the DirectX 9 feature set on DirectX 9 hardware with the DirectX 11 runtime. This still requires Windows Vista or 7, depending on whether you are using a D3D10 or a D3D11 device.

    You only need a different render path if you are specifically supporting the D3D9 runtime, which is pre-Windows Vista; then it becomes the same process as building a multi-platform renderer for DirectX and OpenGL, for instance.

     

    Research, people! Research before you hastily say something that is wrong :) ( Clicky below )

    http://msdn.microsoft.com/en-us/library/windows/desktop/ff476876(v=vs.85).aspx
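
    For illustration, device creation with a feature-level fallback might look roughly like this (a minimal sketch; error handling is omitted and the fallback list is just an example):

    // Sketch: create a D3D11 device that falls back to 9-level feature
    // levels on DX9-class GPUs. Error handling omitted for brevity.
    #include <d3d11.h>

    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,    // DX9-class hardware
        D3D_FEATURE_LEVEL_9_1,
    };

    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    obtained;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                    // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        levels, ARRAYSIZE(levels),  // try 11_0 first, fall back to 9_x
        D3D11_SDK_VERSION,
        &device, &obtained, &context);
    // 'obtained' reports the highest feature level the hardware supports;
    // the same D3D11 code then runs on DX9-class cards.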


  • The_Korrigan Member Rare Posts: 3,459
    Originally posted by rav3n2

    Hehe, it has nothing to do with automatic downscaling or with pre-made engines. It's about feature levels, an API feature which came in with DirectX 11: you can create a higher-level DirectX device with a lower feature level. For instance, you can create a D3D11 device with feature level 9, which lets you run the DirectX 9 feature set with the DirectX 11 runtime. This still requires Windows Vista or 7, depending on whether you are using a D3D10 or a D3D11 device.

    You only need a different render path if you are specifically supporting the D3D9 runtime, which is pre-Windows Vista; then it becomes the same process as building a multi-platform renderer for DirectX and OpenGL, for instance.

     

    Research, people! Research before you hastily say something that is wrong :) ( Clicky below )

    http://msdn.microsoft.com/en-us/library/windows/desktop/ff476876(v=vs.85).aspx

    Sorry man, but it is definitely not that simple... you can't simply write your whole engine using DX11 and have it magically work without performance issues on DX9 hardware. I deal with this almost every day, I can tell you. It's cute for some low-performance applications, but for real-time effects or games, it's total crap and useless.


  • Raven Member Uncommon Posts: 2,005
    Originally posted by The_Korrigan
    Originally posted by rav3n2

    Hehe, it has nothing to do with automatic downscaling or with pre-made engines. It's about feature levels, an API feature which came in with DirectX 11: you can create a higher-level DirectX device with a lower feature level. For instance, you can create a D3D11 device with feature level 9, which lets you run the DirectX 9 feature set with the DirectX 11 runtime. This still requires Windows Vista or 7, depending on whether you are using a D3D10 or a D3D11 device.

    You only need a different render path if you are specifically supporting the D3D9 runtime, which is pre-Windows Vista; then it becomes the same process as building a multi-platform renderer for DirectX and OpenGL, for instance.

     

    Research, people! Research before you hastily say something that is wrong :) ( Clicky below )

    http://msdn.microsoft.com/en-us/library/windows/desktop/ff476876(v=vs.85).aspx

    Sorry man, but it is definitely not that simple... you can't simply write your whole engine using DX11 and have it magically work without performance issues on DX9 hardware. I deal with this almost every day, I can tell you. It's cute for some low-performance applications, but for real-time effects or games, it's total crap and useless.

    Hmmm, that is simply not true. The D3D11 runtime interacts with the drivers for DirectX 9 hardware in the same way the D3D9 runtime does. Maybe what you are talking about is CPU emulation of DirectX 11 components that DirectX 9 doesn't support?

    That is what the feature level is for: you don't need to make different API calls, you just can't use features that aren't supported by the hardware, otherwise they need to be emulated, which is slow. But it is not a completely different render path at all; all of the API calls are the same, you just cannot use anything outside the feature set (without it being emulated).
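
    For instance (a sketch only; the engine functions here are hypothetical), a hardware-specific feature can be gated on the feature level the device actually obtained, rather than by maintaining a second render path:

    // Sketch: one render path, with hardware-specific features gated on
    // the feature level. EnableTessellation()/UseStaticLods() are made-up
    // engine functions, purely for illustration.
    if (device->GetFeatureLevel() >= D3D_FEATURE_LEVEL_11_0)
        EnableTessellation();   // hull/domain shaders available in hardware
    else
        UseStaticLods();        // fall back on 9/10-level hardware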


  • The_Korrigan Member Rare Posts: 3,459
    Originally posted by rav3n2
    Originally posted by The_Korrigan
    Originally posted by rav3n2

    Hehe, it has nothing to do with automatic downscaling or with pre-made engines. It's about feature levels, an API feature which came in with DirectX 11: you can create a higher-level DirectX device with a lower feature level. For instance, you can create a D3D11 device with feature level 9, which lets you run the DirectX 9 feature set with the DirectX 11 runtime. This still requires Windows Vista or 7, depending on whether you are using a D3D10 or a D3D11 device.

    You only need a different render path if you are specifically supporting the D3D9 runtime, which is pre-Windows Vista; then it becomes the same process as building a multi-platform renderer for DirectX and OpenGL, for instance.

     

    Research, people! Research before you hastily say something that is wrong :) ( Clicky below )

    http://msdn.microsoft.com/en-us/library/windows/desktop/ff476876(v=vs.85).aspx

    Sorry man, but it is definitely not that simple... you can't simply write your whole engine using DX11 and have it magically work without performance issues on DX9 hardware. I deal with this almost every day, I can tell you. It's cute for some low-performance applications, but for real-time effects or games, it's total crap and useless.

    Hmmm, that is simply not true. The D3D11 runtime interacts with the DirectX 9 drivers in the same way the D3D9 runtime does. Maybe what you are talking about is CPU emulation of DirectX 11 components that DirectX 9 doesn't support; that is what the feature level is for. You don't need to make different API calls, you just can't use features that aren't supported by the hardware, otherwise they need to be emulated, which is slow. But it is not a completely different render path at all; the API calls are the same, you just cannot use anything outside the feature set (without it being emulated).

    If you don't make a specialized DX11 render path, your application is going to perform like crap and/or look like crap. The Microsoft crap is just marketing; it's definitely not working in the "real world". If you want to fully use DX9 hardware when it's available, and fully use DX11 hardware equally, you need separate render paths.

    Jeez, I've been doing that since just after DX11 was released (and before that using both DX10 and 9 and OpenGL).


  • Raven Member Uncommon Posts: 2,005
    Originally posted by The_Korrigan
    Originally posted by rav3n2
    Originally posted by The_Korrigan
    Originally posted by rav3n2

    Hehe, it has nothing to do with automatic downscaling or with pre-made engines. It's about feature levels, an API feature which came in with DirectX 11: you can create a higher-level DirectX device with a lower feature level. For instance, you can create a D3D11 device with feature level 9, which lets you run the DirectX 9 feature set with the DirectX 11 runtime. This still requires Windows Vista or 7, depending on whether you are using a D3D10 or a D3D11 device.

    You only need a different render path if you are specifically supporting the D3D9 runtime, which is pre-Windows Vista; then it becomes the same process as building a multi-platform renderer for DirectX and OpenGL, for instance.

     

    Research, people! Research before you hastily say something that is wrong :) ( Clicky below )

    http://msdn.microsoft.com/en-us/library/windows/desktop/ff476876(v=vs.85).aspx

    Sorry man, but it is definitely not that simple... you can't simply write your whole engine using DX11 and have it magically work without performance issues on DX9 hardware. I deal with this almost every day, I can tell you. It's cute for some low-performance applications, but for real-time effects or games, it's total crap and useless.

    Hmmm, that is simply not true. The D3D11 runtime interacts with the DirectX 9 drivers in the same way the D3D9 runtime does. Maybe what you are talking about is CPU emulation of DirectX 11 components that DirectX 9 doesn't support; that is what the feature level is for. You don't need to make different API calls, you just can't use features that aren't supported by the hardware, otherwise they need to be emulated, which is slow. But it is not a completely different render path at all; the API calls are the same, you just cannot use anything outside the feature set (without it being emulated).

    If you don't make a specialized DX11 render path, your application is going to perform like crap. The Microsoft crap is just marketing; it's definitely not working in the "real world". If you want to fully use DX9 hardware when it's available, and fully use DX11 hardware equally, you need separate render paths.

    Jeez, I've been doing that since just after DX11 was released (and before that using both DX10 and 9 and OpenGL).

    I honestly don't know what you mean. The DX11 render path is just a set of API calls to a D3D11 device, and the feature level just allows you to use DirectX 9 features from a D3D11 device. So instead of:

    #if defined( D3D9_RENDERER )
        // D3D9: the call is IDirect3DDevice9::DrawIndexedPrimitive (no trailing 's')
        d3d9device->DrawIndexedPrimitive( D3DPT_TRIANGLELIST, 0, 0, vertexCount, 0, primCount );
    #elif defined( D3D11_RENDERER )
        // D3D11: draws are issued on the device context, not the device
        d3d11context->DrawIndexed( indexCount, 0, 0 );
    #endif
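
    under the feature-level approach you would just have the single D3D11 call (a sketch, same illustrative names as above):

        // Single path: the same call whether the GPU is DX9- or DX11-class;
        // the feature level handles the difference underneath.
        d3d11context->DrawIndexed( indexCount, 0, 0 );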

    You only have a single render path; the API call is the same, and the interaction at the driver level is what differs. There is no need for two render paths. You still need switches for unsupported features (that you do not wish to emulate on the CPU, which is something no one does because it's dead slow). But your render path will be the same: you do not need to worry about creating a D3D9 device at all, unless you want to support an OS that does not support the device you are initializing the feature set with (for a D3D11 device, that means anything older than Vista/Windows 7).

    What is your rationale for saying the application is going to perform like crap, i.e. running your renderer on a D3D11 device with feature level 9 vs running it on a D3D9 device?


  • The_Korrigan Member Rare Posts: 3,459

    Any real-time 3D graphics developer worth his salt knows you have to optimize your graphics path depending on which API you use. You simply can't use the same path for DX9 and DX11 and expect maximum performance. DX11 opens doors DX9 can't even dream of; you can't simply bypass those either, you have to do things differently.

    Sorry, not blaming anyone, I almost forgot I was on a gaming forum. But I have a hard time letting such misinformation pass when I do this professionally for a living.


  • Raven Member Uncommon Posts: 2,005
    Originally posted by The_Korrigan

    Any real-time 3D graphics developer worth his salt knows you have to optimize your graphics path depending on which API you use. You simply can't use the same path for DX9 and DX11 and expect maximum performance. DX11 opens doors DX9 can't even dream of; you can't simply bypass those either, you have to do things differently.

    Sorry, not blaming anyone, I almost forgot I was on a gaming forum. But I have a hard time letting such misinformation pass when I do this professionally for a living.

    And I just told you: unless it's a feature that is specific to hardware, like hardware tessellation, the runtime handles both systems pretty transparently. For unsupported features you still need switches, because obviously you cannot use hardware-specific features on hardware that does not have them; this is not the same thing at all as using another render path. Furthermore, the DirectX 11 runtime handles a lot of things much more efficiently than the DX9 runtime. For the record, I work as a graphics programmer for Microsoft Studios.

    Again, I would like to hear your rationale behind "DX9 and DX11 and expect maximum performance", besides "3D graphics developer worth his salt".

    This is getting pretty off-topic though. PM me or answer here; either way, I would like to hear the rationale behind your claim.


  • stragen001 Member Uncommon Posts: 1,720
    Originally posted by rav3n2

    For the record, I work as a graphics programmer for Microsoft Studios.

    This would explain why you buy into the Microsoft marketing line, even when people who use the product "in the real world" tell you differently.


  • The_Korrigan Member Rare Posts: 3,459
    Originally posted by rav3n2

    ... the runtime handles both systems pretty transparently. For unsupported features you still need switches, because obviously you cannot use hardware-specific features on hardware that does not have them; this is not the same thing at all as using another render path.

    That's where you are wrong. It takes more than just switching off unsupported features. Depending on the features available, the optimization path is totally different. You can't simply use the same functions and hope the API does all the work. That's an amateur developer's job, not a professional's.

    Just a simple example... depending on whether DX10 is available or not, you will manage occlusion queries in a totally different way. DX10 can do it better and faster (and in a different way) than DX9. If you're using the same code for both, you're wasting performance.
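
    To illustrate the kind of difference meant here (a sketch only; the query objects and draw helpers are illustrative): a D3D9 occlusion query result has to be read back by the CPU, while D3D10/11 offer predicated rendering that keeps the decision on the GPU.

    // D3D9: the CPU must poll the query result (stalls, or lags a frame).
    DWORD visiblePixels = 0;
    d3d9query->Issue(D3DISSUE_BEGIN);
    DrawBoundingBox();                  // hypothetical helper
    d3d9query->Issue(D3DISSUE_END);
    while (d3d9query->GetData(&visiblePixels, sizeof(visiblePixels),
                              D3DGETDATA_FLUSH) == S_FALSE) { /* spin */ }
    if (visiblePixels > 0) DrawObject();

    // D3D10/11: predicated rendering, no CPU readback required.
    // 'predicate' is an ID3D11Predicate created with
    // D3D11_QUERY_OCCLUSION_PREDICATE.
    d3d11context->Begin(predicate);
    DrawBoundingBox();
    d3d11context->End(predicate);
    d3d11context->SetPredication(predicate, FALSE);
    DrawObject();                       // skipped by the GPU if occluded
    d3d11context->SetPredication(nullptr, FALSE);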

    Originally posted by rav3n2

    For the record, I work as a graphics programmer for Microsoft Studios.

    And I'm Bill Gates (see how easy it is)... ;)


  • Quizzical Member Legendary Posts: 25,355
    Originally posted by Rasputin
    Originally posted by The_Korrigan
    Originally posted by Kabaal

    Makes no difference to me whether the game ends up being DX9 or 11. If it were single-player then maybe, but it's not, and a couple of bells and whistles being there or not won't be noticed while I'm PvPing.

    DX10 or DX11 can also give a performance improvement, even if you don't use any of the "bells and whistles".

    Originally posted by Rasputin

    50% of all PCs still run Windows XP, and XP can run at most DX9.

    No one in their right mind would cut out a potential audience of 50% for an MMO. For Crysis, yes, but not for an MMO.

     http://www.engadget.com/2011/10/15/windows-7-overtakes-xp-globally-vista-found-weeping-in-a-corner/

    And that was in October 2011; the trend has obviously continued. XP is also mostly used by professionals for their office work (why change, it works?), while "serious" gamers have for the most part moved to Windows 7.

    Strange, I have a completely different chart (from April 2012):

    http://news.cnet.com/8301-10805_3-57407968-75/windows-xp-wont-give-up-top-spot-without-a-fight/

    It depends on which computers get counted and which don't.  Should an office computer that will never allow installation of commercial games count?  How about an old backup computer that only gets turned on once every few months?  Netbooks that have hardware that simply can't handle modern games?  Computers only used by people who speak no English?

    And it gets worse if you consider the question of how to count.  Microsoft presumably knows how many licenses for each OS they've sold, but that won't pick up pirated software, and doesn't tell you when a computer is taken out of service.  A web site that queries the hardware of those who use it is largely a statement about who uses that web site.  Results from Steam will be different from results from systemrequirementslab.com or any other particular site--and none of them will coincide with the group of people you're looking for unless it's your own web site and you mainly want to know who uses your site.

  • Quizzical Member Legendary Posts: 25,355
    Originally posted by The_Korrigan

    You simply can't use the same path for DX9 and DX11 and expect maximum performance. DX11 opens doors DX9 can't even dream of; you can't simply bypass those either, you have to do things differently.

    Because there aren't any badly optimized games out there, or any other badly optimized software.  And game developers never settle for good enough, when another year of work could squeeze out another 10% increase in frame rates.  Right? </sarcasm>

    The point of a computer is to be a general purpose machine.  If you have one particular task that you want to handle, a general purpose machine assigned to do that task can never compete with having the same resources put into building a machine specific to exactly the task that you want.  But one general purpose computer may be able to do a lot of tasks well enough, and at a vastly lower cost than having to build many thousands of special purpose machines.

    Yes, you do lose some performance with abstraction to APIs like DirectX.  But you gain a massive reduction in coding work.  Rather than having to code things separately for every possible piece of hardware that a customer might use, there are some things that you do once, and trust that the video drivers and APIs can make that one section of code run well enough on a huge number of different pieces of hardware, even if it's not completely optimal for any of them.

  • Raven Member Uncommon Posts: 2,005
    Originally posted by The_Korrigan
    Originally posted by rav3n2

    ... the runtime handles both systems pretty transparently. For unsupported features you still need switches, because obviously you cannot use hardware-specific features on hardware that does not have them; this is not the same thing at all as using another render path.

    That's where you are wrong. It takes more than just switching off unsupported features. Depending on the features available, the optimization path is totally different. You can't simply use the same functions and hope the API does all the work. That's an amateur developer's job, not a professional's.

    Just a simple example... depending on whether DX10 is available or not, you will manage occlusion queries in a totally different way. DX10 can do it better and faster (and in a different way) than DX9. If you're using the same code for both, you're wasting performance.

    Originally posted by rav3n2

    For the record, I work as a graphics programmer for Microsoft Studios.

    And I'm Bill Gates (see how easy it is)... ;)

    Hehe, yes, and it again goes back to my point: occlusion queries are only slower because you rely on compute shaders, which are part of the hardware specification, not the runtime. Are we going around in circles here? Do you have any actual examples to support your claims, or are you going to keep giving me hardware-dependent implementations as examples?

     


  • mmoski Member Uncommon Posts: 282
    Originally posted by The_Korrigan
    Originally posted by rav3n2

    ... the runtime handles both systems pretty transparently. For unsupported features you still need switches, because obviously you cannot use hardware-specific features on hardware that does not have them; this is not the same thing at all as using another render path.

    That's where you are wrong. It takes more than just switching off unsupported features. Depending on the features available, the optimization path is totally different. You can't simply use the same functions and hope the API does all the work. That's an amateur developer's job, not a professional's.

    Just a simple example... depending on whether DX10 is available or not, you will manage occlusion queries in a totally different way. DX10 can do it better and faster (and in a different way) than DX9. If you're using the same code for both, you're wasting performance.

    Originally posted by rav3n2

    For the record, I work as a graphics programmer for Microsoft Studios.

    And I'm Bill Gates (see how easy it is)... ;)


    Well, neither of you is really wrong. The decisive factor is which functionality of the APIs you are going to use, in the context of the hardware, based on the application's performance requirements. The APIs' compatibility paths could be minimal or could require a lot of performance paths, mainly depending on what your target hardware market is and how you decided to build your application.

    Either way it's moot really, and as we are talking GW2 here, the main reason for them not implementing DX11 is simply time constraints. But if they do have a DX10 renderer implemented, then the jump to DX11 shouldn't be too huge, depending on how they have used the APIs.

     

  • The_Korrigan Member Rare Posts: 3,459
    Originally posted by Quizzical

    Yes, you do lose some performance with abstraction to APIs like DirectX.  But you gain a massive reduction in coding work.  Rather than having to code things separately for every possible piece of hardware that a customer might use, there are some things that you do once, and trust that the video drivers and APIs can make that one section of code run well enough on a huge number of different pieces of hardware, even if it's not completely optimal for any of them.

    "Abstraction" APIs like DX or OpenGL give you a guided route to the hardware, rather than being a total mess like before those norms existed. But not more. A 3D API is NOT a 3D engine. The API will provide an unified interface to the same type of hardware, but it's the engine's job to optimize the rendering path depending on what hardware the software is run on.

    Many neophytes confuse the graphics API (DX, OpenGL) with the graphics engine (CryEngine, Unreal, WoW, Guild Wars, etc...).

    Originally posted by mmoski

    Either way it's moot really, and as we are talking GW2 here, the main reason for them not implementing DX11 is simply time constraints. But if they do have a DX10 renderer implemented, then the jump to DX11 shouldn't be too huge, depending on how they have used the APIs.

    The point I was trying to make is that implementing DX10 (or DX11) is not just as simple as changing a version number in your source code's #define lines. It can seem simple when you use a premade engine (like Unreal), where the engine's developers do all the adaptations and optimizations, but ANet have their own engine and do all that work themselves. You can't just snap your fingers and say "we are now DX11". Well, some tried (like Turbine), and it was pathetic.


  • Raven Member Uncommon Posts: 2,005
    Originally posted by The_Korrigan
    Originally posted by Quizzical

    Yes, you do lose some performance with abstraction to APIs like DirectX.  But you gain a massive reduction in coding work.  Rather than having to code things separately for every possible piece of hardware that a customer might use, there are some things that you do once, and trust that the video drivers and APIs can make that one section of code run well enough on a huge number of different pieces of hardware, even if it's not completely optimal for any of them.

    "Abstraction" APIs like DX or OpenGL give you a guided route to the hardware, rather than being a total mess like before those norms existed. But not more. A 3D API is NOT a 3D engine. The API will provide an unified interface to the same type of hardware, but it's the engine's job to optimize the rendering path depending on what hardware the software is run on.

    Many neophytes confuse the graphics API (DX, OpenGL) with the graphics engine (CryEngine, Unreal, WoW, Guild Wars, etc...).

    Originally posted by mmoski

    Either way it's moot really, and as we are talking GW2 here, the main reason for them not implementing DX11 is simply time constraints. But if they do have a DX10 renderer implemented, then the jump to DX11 shouldn't be too huge, depending on how they have used the APIs.

    The point I was trying to make is that implementing DX10 (or DX11) is not just as simple as changing a version number in your source code's #define lines. It can seem simple when you use a premade engine (like Unreal), where the engine's developers do all the adaptations and optimizations, but ANet have their own engine and do all that work themselves. You can't just snap your fingers and say "we are now DX11". Well, some tried (like Turbine), and it was pathetic.

    I never claimed that at all though, did I? Go back and look at what I said. My premise was: you do not need to code for a D3D9 device anymore, granted you only care about targeting the Win7 crowd; dropping DirectX 9 support is not the same as dropping support for DirectX 9 compliant hardware. You responded saying that not using the DirectX 9 runtime will have a performance hit, which is ludicrous and not true.

    It is not as easy as flicking a switch, of course not, but it only gets harder the more features you want to support from the lower feature set. So you can decide to simplify your renderer massively by dropping certain features altogether, like only supporting tessellation for feature level 11 (and yes, I know this involves different shader paths; it should be implied). This of course does not stop you from supporting tessellation on DirectX 9 hardware; it will just make your life harder by a factor of 10, but the same could be said about making two render paths. The bottom line is that there is no performance impact in using D3D11 feature level 9 over creating two render paths, one with D3D9 and one with D3D11, which in turn is what I asked you to justify, because that is what you claimed.

    Which is also why I said it had nothing to do with "downscaling" when you answered someone, because that would imply there is some sort of automated process to make high-feature-level features work on lower-feature-level hardware, which is simply not the case.

