
DirectX 11.3 to add better support for voxels, transparency

Quizzical, Legendary Member, Posts: 25,489

http://anandtech.com/show/8544/microsoft-details-direct3d-113-12-new-features

This, I think, offers some insight into where graphics APIs are going.  If Direct3D offers something that anyone cares about, then OpenGL will probably offer the same thing not long afterward.

I figure that the voxel stuff will get the most attention, so I'll cover it first.  A texture, as used in graphics, is really just a lookup table.  The lookup table most commonly has color data, but it can theoretically be anything.  OpenGL has long supported 1-, 2-, and 3-dimensional textures, so DirectX probably has, too.  Most textures are 2D, and OpenGL ES actually requires all textures to be 2D.

If you have a 1024x1024 texture with 32 bits per pixel, that takes up 4 MB, at least before mipmapping.  So a game can have a lot of 1024x1024 textures loaded into video memory at once.  If you have a 1024x1024x1024 texture, that takes 4 GB, so you can't have many of them loaded simultaneously.
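To put concrete numbers on that, here is a quick back-of-the-envelope sketch (plain Java, nothing graphics-specific; 32 bits per texel is the assumption from above):

public class TextureMemory {
    public static void main(String[] args) {
        long bytesPerTexel = 4; // 32 bits per texel

        long tex2d = 1024L * 1024 * bytesPerTexel;        // 2D: 1024 x 1024
        long tex3d = 1024L * 1024 * 1024 * bytesPerTexel; // 3D: 1024 x 1024 x 1024

        System.out.println("2D: " + tex2d / (1024 * 1024) + " MB");         // 4 MB
        System.out.println("3D: " + tex3d / (1024L * 1024 * 1024) + " GB"); // 4 GB
    }
}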

I don't know how Minecraft or other voxel games implement voxels.  As I see it, the most natural way to do it would be with 3D textures.  But those would necessarily have to be low-resolution textures, and switching between many low-resolution textures would cause huge amounts of overhead.  What DirectX 11.3 is going to offer is a nice way to cut that overhead for 3D textures (volume tiled resources), similar to what OpenGL already offers (albeit only via extensions) for 2D textures.  This would likely give an enormous performance speedup if a voxel-based game wants to use it.

-----

The next topic of my title is order-independent transparency.  Graphics APIs have had transparency for a long, long time.  Unfortunately, it's a pretty naive implementation that basically says: when we draw a pixel that is supposed to be halfway transparent, we change the framebuffer value for that pixel to half of the new color plus half of the old color.  (Obviously, you can have fractions besides 1/2.)  That's exactly what you need if you know that you'll never have one partially transparent object on top of another: you can draw all of the solid objects first, then come back and draw all of the transparent objects.
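In code terms, that blend is just a weighted average per color channel.  A minimal sketch of the formula (the standard alpha blend, not any particular API's code):

public class NaiveBlend {
    // The framebuffer update for a partially transparent pixel, per channel:
    // result = alpha * newColor + (1 - alpha) * oldColor
    static int blend(int oldColor, int newColor, double alpha) {
        return (int) Math.round(alpha * newColor + (1.0 - alpha) * oldColor);
    }

    public static void main(String[] args) {
        // Half-transparent white (255) over black (0) gives 128.
        System.out.println(blend(0, 255, 0.5));
    }
}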

The existing implementation of transparency also works if you have a simple enough scene that you can sort all transparent objects, then draw the furthest one away, then the next furthest, and so forth.  The problem is that complex scenes can't be sorted that way.  Think, for example, of an arrow poking through a wall.  If you draw the wall first and then the arrow on top, the portion of the arrow that is "behind" the wall shows on top.  You get a similar problem if you draw the arrow first and then the wall.  For completely opaque objects, this is fine, as the depth buffer will sort it out, but for transparent objects, it's not.

Suppose, for example, that you start with an all-black scene, the arrow is red, the wall is blue, and both are halfway transparent.  If you draw the arrow first, then you change some pixels to half red, and then the wall changes some half-red pixels to 1/4 red, 1/2 blue.  If you draw the wall first, then you change some pixels to half blue, and then the arrow changes some pixels to 1/2 red, 1/4 blue.  Red-green-blue values of 128-0-64 look very different from 64-0-128, so you need some pixels to be one and some the other.  For a long time, you flatly couldn't do that.  If you rely on the depth buffer to prevent you from drawing the stuff in back, then you implicitly have a transparent arrow and a solid wall, or the other way around.
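You can check those numbers by applying the naive blend in both orders; a small sketch:

public class OrderDependence {
    // Naive framebuffer blend, per channel: alpha * src + (1 - alpha) * dst.
    static int[] blend(int[] dst, int[] src, double alpha) {
        int[] out = new int[3];
        for (int i = 0; i < 3; i++) {
            out[i] = (int) Math.round(alpha * src[i] + (1.0 - alpha) * dst[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        int[] black = {0, 0, 0};   // the background
        int[] red   = {255, 0, 0}; // the arrow
        int[] blue  = {0, 0, 255}; // the wall

        // Arrow first, then wall: 1/4 red, 1/2 blue.
        System.out.println(java.util.Arrays.toString(
                blend(blend(black, red, 0.5), blue, 0.5))); // [64, 0, 128]

        // Wall first, then arrow: 1/2 red, 1/4 blue.
        System.out.println(java.util.Arrays.toString(
                blend(blend(black, blue, 0.5), red, 0.5))); // [128, 0, 64]
    }
}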

DirectX 11 did bring a sort of order-independent transparency.  But it required a lot of messy hackery in coding, and it carried a huge performance hit.  So you could sort of have order-independent transparency in games, but it was such a mess that you probably shouldn't.  So games didn't.

What you really want is to be able to record that a blending operation will happen at this pixel at this depth, wait until the frame is finished, then sort the blending operations for each pixel by depth and apply them in the proper order at the end.  If DirectX 11.3 makes it possible to do this quickly and easily, then we could eventually see a lot more transparency in games.  In several years, once everyone is satisfied that Windows 7 is dead, that is.
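Here is a CPU-side sketch of that idea, with a hypothetical Fragment class standing in for one recorded blend operation (on a real GPU this would live in some per-pixel data structure, but the logic is the same):

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class DeferredBlending {
    // One recorded blend operation for a single pixel.
    static class Fragment {
        final double depth;
        final int[] rgb;
        final double alpha;
        Fragment(double depth, int[] rgb, double alpha) {
            this.depth = depth;
            this.rgb = rgb;
            this.alpha = alpha;
        }
    }

    public static void main(String[] args) {
        // Fragments arrive in whatever order the geometry happened to be drawn.
        List<Fragment> fragments = new ArrayList<>();
        fragments.add(new Fragment(2.0, new int[]{0, 0, 255}, 0.5)); // wall, nearer
        fragments.add(new Fragment(5.0, new int[]{255, 0, 0}, 0.5)); // arrow, farther

        // End of frame: sort far-to-near, then blend in that order.
        fragments.sort(Comparator.comparingDouble((Fragment f) -> f.depth).reversed());

        int[] pixel = {0, 0, 0}; // opaque background already in the framebuffer
        for (Fragment f : fragments) {
            for (int i = 0; i < 3; i++) {
                pixel[i] = (int) Math.round(f.alpha * f.rgb[i] + (1.0 - f.alpha) * pixel[i]);
            }
        }
        System.out.println(java.util.Arrays.toString(pixel)); // [64, 0, 128]
    }
}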

-----

There are two other features listed there, neither of which I see as being a big deal.  I'm not sure why you'd need conservative rasterization at all.  Maybe it will be useful for some types of anti-aliasing or something.

The last feature is typed UAV loads, and there, I'm not entirely sure what it's talking about.  Multiple threads on a GPU can already read the same texture and uniforms, so presumably that's not the new thing.  If it means allowing multiple threads in the same shader stage to pass data to each other, then that would be new, though it creates serious questions of race conditions.  If you've done any multi-threaded programming on a CPU, then you know that you need to pass data between threads a lot.  But I'm not sure why you would want to do that for graphics.  Maybe it's an effort at exposing some functionality in Direct3D that recent GPUs can do anyway, but that was initially meant for non-graphical compute purposes.

Comments

  • Phry, Legendary Member, Posts: 11,004

    In several years, once everyone is satisfied that Windows 7 is dead...

    Just how many years are we talking here?  Windows 7 is currently the most used Windows OS there is, with around 50% of the world using it, and another 25% still on XP.  Windows 8 still hasn't reached 10%, putting it at much the same level of distribution as Mac OS X and the various Linux distros.  Meanwhile, Windows 7 is still growing its user base, probably at the expense of 8 as XP is gradually phased out.  That is probably why they are already rushing out Windows 9 to replace 8, in much the same way they did with 7 and Vista.  But if it's more of the same thing they tried with 8, it will probably be no more popular than 8 was, which in real terms was probably one of the largest failures MS has had in OS history, although Windows ME is a very strong contender there too.

    I think it's pretty unlikely that 11.3 is going to have any impact in the next 5 years, no matter how much they promote it.  If anything, we are more likely to be looking at a move away from DirectX completely.  Given that games are still being made for the last-gen consoles, we are still seeing games with DirectX 9.0c support being made, which means that even though MS is trying to move people away from XP, developers are still creating games that support it.

    In real terms, unless Windows 9 eradicates all the shortcomings of Windows 8 - something of a tall order, but given that they will be demonstrating a version of it this October, let's see how that looks - Windows 7 is likely to still be the primary OS used worldwide in 5 years' time.

  • grndzro, Uncommon Member, Posts: 1,163

    I think the difference is Vista was a well-designed OS that was simply buggy as fck.  The retooled W7 that replaced it was just as well designed, but rock solid.

    W8 is both poorly designed and buggy.  Its successor will be either poorly designed or buggy.

    At this point I am simply downloading about 500 GB of audiobooks and all the SNES/Genesis/N64/PSX RPG ROMs, and sitting happy with Linux until it's ready for gaming.  Wayland is here, and all that's left is to wait for the drivers.

  • syntax42, Uncommon Member, Posts: 1,385
    Originally posted by Quizzical

    http://anandtech.com/show/8544/microsoft-details-direct3d-113-12-new-features

    I don't know how Minecraft or other voxel games implement voxels.  As I see it, the most natural way to do it would be with 3D textures.  But those would necessarily have to be low-resolution textures, and switching between many low-resolution textures would cause huge amounts of overhead.  What DirectX 11.3 is going to offer is a nice way to cut that overhead for 3D textures (volume tiled resources), similar to what OpenGL already offers (albeit only via extensions) for 2D textures.  This would likely give an enormous performance speedup if a voxel-based game wants to use it.

    Minecraft's voxels are one cubic meter in size.  That's a fairly low resolution for the world.  The textures are only loaded for visible faces of a 1m cube and are two-dimensional.  Early implementations of Minecraft used a single texture for the sides and allowed a different top and bottom texture.  Now, some blocks (like Jack-O-Lanterns) have unique textures on more than one side.


    The concept of three-dimensional textures sounds interesting.  Dirt could be rendered more realistically and at a finer resolution than Minecraft's cubic meter.  The only problem with a resolution like that is the amount of memory potentially required to keep track of everything.  

    A square kilometer in Minecraft contains 1000 x 1000 x 250 blocks.  Yes, I know, but I'm keeping the calculations simple.  That's 250 million blocks.  If each block is given 8 bits of memory for identification purposes (dirt, sand, water, air, etc.) then the square kilometer takes up 250 MB.  Worlds can easily become massive, requiring compression just for hard drive savings.  Transmitting data for a multiplayer game becomes bandwidth-limiting even at a 1m resolution.

    For a voxel game to get more detailed than Minecraft, some new and creative method of expressing the changes in the game world will have to be implemented.
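    To make the 250 MB arithmetic above concrete, here is a trivial sketch using the same simplified numbers:

    public class WorldMemory {
        public static void main(String[] args) {
            long blocks = 1000L * 1000 * 250; // a "square kilometer", 250 blocks deep
            long bytesPerBlock = 1;           // an 8-bit block id
            long totalBytes = blocks * bytesPerBlock;

            System.out.println(blocks + " blocks");                        // 250000000
            System.out.println(totalBytes / 1000000 + " MB of block ids"); // 250 MB
        }
    }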

  • Edli, Member, Posts: 941
    Originally posted by grndzro


    W8 is both poorly designed and buggy.

    Not my experience.  In terms of performance and stability, 8 has been much better than even 7.  The only problem with 8 has always been that start screen.

  • Quizzical, Legendary Member, Posts: 25,489
    Originally posted by Phry

    I think it's pretty unlikely that 11.3 is going to have any impact in the next 5 years, no matter how much they promote it.  If anything, we are more likely to be looking at a move away from DirectX completely.  Given that games are still being made for the last-gen consoles, we are still seeing games with DirectX 9.0c support being made, which means that even though MS is trying to move people away from XP, developers are still creating games that support it.

    In real terms, unless Windows 9 eradicates all the shortcomings of Windows 8 - something of a tall order, but given that they will be demonstrating a version of it this October, let's see how that looks - Windows 7 is likely to still be the primary OS used worldwide in 5 years' time.

    That actually leads to a topic that I've been meaning to make a post about for a while.  Suppose that DirectX 11.3 and DirectX 12 get a bunch of cool stuff in 2015, but it's Windows 8.1 or later only.  And then suppose that OpenGL gets the same stuff in 2016, and it runs on any OS, including Linux and Mac OS X, but more importantly, also Windows 7.  (Actually, given the way Apple writes drivers, features that come to Windows and Linux in 2016 will probably be available on Mac OS X sometime around 2020.)  If both AMD and Nvidia have to support it via DirectX, then they'll probably agree to implement it in OpenGL.  If you want to make a game using the new stuff to launch in 2017, why use Direct3D for graphics?  OpenGL would make more sense.  That sort of thing could easily kill off Direct3D, at least among games not content to mostly remain in the DirectX 9.0c era.

    Some people say that part of the point of DirectX is that it includes sound and networking, too, not just graphics.  But I'm not aware of any reason that would stop you from using OpenGL for graphics and DirectX for other stuff.  That doesn't mean that there isn't any reason, but only that I'm not aware of one.

    One might say, if that would push OpenGL, then why didn't it do so in the past, with DirectX 10 and 11?  The answer there is that OpenGL was way behind, mired mostly in controversy about what to do with really archaic stuff (e.g., color index mode, for use with monitors that supported so few colors that red-green-blue values basically wouldn't work) that some GPU vendors didn't want to support anymore.  DirectX 9.0c was available for Windows XP in 2005, and OpenGL didn't catch up to it for about two years.  By the time OpenGL was able to catch up to DirectX 10, DirectX 11 and Windows 7 were just around the corner.

    But now, OpenGL has not merely caught up, but is even in the lead in feature support.  It looks likely that DirectX 11.3 will change that, at least temporarily.  And there's still a lot of uncertainty in how the lower level access promised by DirectX 12 and a next generation OpenGL project will shake out.  But if the main difference between Direct3D and OpenGL is that the former ties you to Windows and the latter doesn't, the only real reason to use Direct3D would be inertia.

  • Quizzical, Legendary Member, Posts: 25,489
    Originally posted by syntax42
    Originally posted by Quizzical

    http://anandtech.com/show/8544/microsoft-details-direct3d-113-12-new-features

    I don't know how Minecraft or other voxel games implement voxels.  As I see it, the most natural way to do it would be with 3D textures.  But those would necessarily have to be low-resolution textures, and switching between many low-resolution textures would cause huge amounts of overhead.  What DirectX 11.3 is going to offer is a nice way to cut that overhead for 3D textures (volume tiled resources), similar to what OpenGL already offers (albeit only via extensions) for 2D textures.  This would likely give an enormous performance speedup if a voxel-based game wants to use it.

    Minecraft's voxels are one cubic meter in size.  That's a fairly low resolution for the world.  The textures are only loaded for visible faces of a 1m cube and are two-dimensional.  Early implementations of Minecraft used a single texture for the sides and allowed a different top and bottom texture.  Now, some blocks (like Jack-O-Lanterns) have unique textures on more than one side.


    The concept of three-dimensional textures sounds interesting.  Dirt could be rendered more realistically and at a finer resolution than Minecraft's cubic meter.  The only problem with a resolution like that is the amount of memory potentially required to keep track of everything.  

    A square kilometer in Minecraft contains 1000 x 1000 x 250 blocks.  Yes, I know, but I'm keeping the calculations simple.  That's 250 million blocks.  If each block is given 8 bits of memory for identification purposes (dirt, sand, water, air, etc.) then the square kilometer takes up 250 MB.  Worlds can easily become massive, requiring compression just for hard drive savings.  Transmitting data for a multiplayer game becomes bandwidth-limiting even at a 1m resolution.

    For a voxel game to get more detailed than Minecraft, some new and creative method of expressing the changes in the game world will have to be implemented.

    A texture is just a lookup table.  If you're a programmer, think of it as a read-only (from the GPU's perspective) array with high latency, but accessible simultaneously to all of the many thousands of GPU threads running at once.  A 2D texture is a 2D array, and a 3D texture is a 3D array.  3D textures have been available for a long time in OpenGL, so they have probably been available in DirectX as well.

    What's new is allowing textures to be used far more efficiently than before, in terms of CPU overhead.  OpenGL already has that, albeit only via extensions, for 2D textures; I'm not sure about 3D.  I don't think DirectX has it yet for any dimension of texture.

    But an array of which blocks go where doesn't draw itself.  Even if you know how the data is structured, that doesn't tell you how the game draws it.  Texture accesses are very high latency, so you don't want to overuse them or you'll kill your performance.  That's why I said I'm not sure how Minecraft does it.

  • Ridelynn, Epic Member, Posts: 7,383

    Maybe I don't know enough about Voxels or Minecraft, but at least in the case of Minecraft, why would the textures be 1024x1024x1024? Wouldn't they be 1024x1024x6, under the worst case for a cube-shaped voxel where each face has a distinct texture?

  • Quizzical, Legendary Member, Posts: 25,489
    Originally posted by Ridelynn

    Maybe I don't know enough about Voxels or Minecraft, but at least in the case of Minecraft, why would the textures be 1024x1024x1024? Wouldn't they be 1024x1024x6, under the worst case for a cube-shaped voxel where each face has a distinct texture?

    I don't know how Minecraft implements it, either.  I don't think there is an obvious way to implement voxels in OpenGL.  But I'd suspect that 3D textures have something to do with it, and having more flexibility with what you can do with 3D textures would likely help a lot.

  • grndzro, Uncommon Member, Posts: 1,163
    Originally posted by Edli
    Originally posted by grndzro


    W8 is both poorly designed and buggy.

    Not my experience.  In terms of performance and stability, 8 has been much better than even 7.  The only problem with 8 has always been that start screen.

    There is plenty of software that doesn't work with Windows 8.

    .NET 3.5 has to be installed manually from an obscure menu.

    If Windows 8 breaks and cannot fix itself, good luck getting to safe mode.  You're gonna need it.

    Need older IE compatibility?  Forget it.

    Have fun trying to keep IE from opening html links.


    Here is a list of bugs with more from users in the comments.

    http://www.askvg.com/windows-8-bug-report/


    Sure, you may be using it just fine, but there are plenty of people who have had problems.

    Just look at the sub-10% adoption rate.  People hate it.  It isn't just the start screen.

    Whether it's bugs or by design, W8 is FUBAR.  Saying it's not is like lucking out in a lemon year and rubbing it in everyone else's face.

  • Mikeha, Epic Member, Posts: 9,196
    Originally posted by Edli
    Originally posted by grndzro


    W8 is both poorly designed and buggy.

    Not my experience.  In terms of performance and stability, 8 has been much better than even 7.  The only problem with 8 has always been that start screen.


    Same here.  I can't wait for the next version of Windows, but Windows 8 will go down as one of my favorite versions of Windows ever.

  • Ridelynn, Epic Member, Posts: 7,383


    Originally posted by Quizzical
    Originally posted by Ridelynn
    Maybe I don't know enough about Voxels or Minecraft, but at least in the case of Minecraft, why would the textures be 1024x1024x1024? Wouldn't they be 1024x1024x6, under the worst case for a cube-shaped voxel where each face has a distinct texture?
    I don't know how Minecraft implements it, either.  I don't think there is an obvious way to implement voxels in OpenGL.  But I'd suspect that 3D textures have something to do with it, and having more flexibility with what you can do with 3D textures would likely help a lot.

    Well, in your 1024x1024x1024 example, each voxel would be a single pixel, and the voxel itself would be viewable (and destructible) down to a single pixel, not the 1m x 1m x 1m cube that Minecraft uses.

    Sure, you could get into some obscure shapes other than neat cubes, but still, I think taking your voxel down to the pixel is a bit extreme.

    3D texture isn't quite the same thing as a 3D object. You don't have to texture the skin under the clothes, the muscles under the skin, the organs under the muscles, and the bones under that, which is what a true perfect 3D texture would contain under your 1024x1024x1024 description.

    (Your physics and animation may model that stuff, but that's different than graphically texturing it.)

    From my understanding, a 3D texture is what it would take to wrap around a 3D object, not a true 3D array.


  • Ridelynn, Epic Member, Posts: 7,383

    I did a bit of research for my own education:

    3D textures are actually 3D data structures. They are primarily used for volumetric effects: a variable smoke or fog field, for instance, which would have varying levels of opacity as you move through it.

    So I was incorrect about 3D textures in my previous post.

    That being said, voxels could, but do not necessarily, use 3D textures.  You would avoid using a 3D texture unless it were absolutely necessary.  A "smoke" voxel (or anything else with transparency) might use one, but most voxels are opaque and can use a set of 2D textures in some sort of wrapping arrangement (like the 6-texture cube) and save significantly on storage space and access time.
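    To illustrate the volumetric idea, here is a toy sketch of a fog density field stored as a 3D array and sampled like a 3D texture (nearest-neighbor only; real texture hardware would interpolate between the eight surrounding texels):

    public class FogField {
        static final int N = 16;
        // A tiny 3D "texture": fog density varies through the volume.
        static final float[][][] density = new float[N][N][N];

        // Nearest-neighbor sample at normalized coordinates in [0, 1).
        static float sample(float u, float v, float w) {
            return density[(int) (u * N)][(int) (v * N)][(int) (w * N)];
        }

        public static void main(String[] args) {
            // Fill with a simple gradient: thicker fog lower in the volume.
            for (int x = 0; x < N; x++)
                for (int y = 0; y < N; y++)
                    for (int z = 0; z < N; z++)
                        density[x][y][z] = 1.0f - (float) y / N;

            System.out.println(sample(0.5f, 0.1f, 0.5f)); // near the bottom: dense
            System.out.println(sample(0.5f, 0.9f, 0.5f)); // near the top: thin
        }
    }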

  • Ridelynn, Epic Member, Posts: 7,383

    And some more research on Voxels.

    Minecraft is "voxel-based" in an extremely granular sense.

    If you look at an entire level/world and consider the entire thing as a structure - yes, it qualifies.  The "world" is a non-homogeneous 3D structure.  That differs from the generic 3D polygon-based objects we are used to seeing in our games, because 3D polygon models have no internal structure; they are more or less empty (or filled with a solid color).  You can make polygon-like structures out of voxels by filling their interiors with the same "void" voxel, but volumetric renders are unique and different from polygons in that they have a non-homogeneous volume.

    If you made a person out of polygons, if you "cut" them open it would be empty inside. But if you made a person out of voxels, it would have bones, and muscles, and organs, and all the stuff inside that you may not necessarily be able to see.

    A "voxel" is analogous to a pixel - it's your base unit of which nothing can be smaller than. Several voxels make up a volumetric rendering. In Minecraft terms, the voxel is the cube, the volumetric rendering is the entire world.

    You could have voxels in any arbitrary shape you chose.  Minecraft did cubes presumably because they are simple, stack nicely, and are easy for players to manipulate.  You could make a voxel be a pixel if you chose - you could make some extremely high-resolution pictures (and I suppose if you take Minecraft and zoom out extremely far, you can sort of simulate this effect).

    This was a bit hard for me to grasp because I had always associated Minecraft with voxels, and every time I've really interacted with or watched Minecraft, you're zoomed in so close to the voxels that they are huge.  But that's like playing on a display that's 40x80 - it's so extremely granular and blocky that it's hard to get perspective.

    So I go back to my disagreement with Quiz's discussion about 3D textures in relation to voxels.  I understand both a bit better now.  That being said, Quiz's case of 1024x1024x1024 would be possible in a volumetric rendering, if you decided to map out each and every voxel in a 1024x1024x1024 volumetric world and just throw it into one big texture.  But a voxel itself would not need a 3D texture map; since it's the base unit, it has no internal structure to define.

    Quiz isn't necessarily wrong (as usual); the terminology was a bit off (although "voxel" has become a bit of a buzzword lately, so in that context who knows what "they" really mean by it), but I needed to learn a bit more about it before I could really see what he was talking about.

    Textures already have a good deal of hardware-accelerated compression associated with them, and that's what we are seeing with Volume Tiled Resources (VTR: the Microsoft name for this 3D texture allocation). I found it interesting that it is based loosely on Megatextures (like those used in the idTech5 engine, such as Rage, that caused all kinds of texture pop) - in that if a developer wants to use one massive 3D texture for their voxel creation, they can, and then VTR sorts it out, compresses and culls what it can, and makes it manageable in the hardware. The article did note that this isn't for every voxel-based engine, just some specific voxel implementations (which could be some very arcane and niche cases, such as MRI rendering or highly detailed 3D CAD).

    It's extremely unlikely we'd see something like this in a game, at least for a while.  In a typical case you will have a lot of repeated voxels; the data set could still be large if there are a lot of unique voxels, but there are a multitude of off-the-shelf compression algorithms that work well on repeated data.  So for most implementations it will be more efficient to have a limited number of voxel "tiles" that you can plug in; there are some outlier cases where each and every voxel could be unique and distinct - just probably not in gaming.
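    As a toy illustration of the repeated-data point, run-length encoding a single column of 8-bit block ids collapses long runs of identical voxels (this is generic RLE, not any particular engine's format):

    import java.util.ArrayList;
    import java.util.List;

    public class VoxelRle {
        // Encode runs of identical block ids as (count, id) pairs.
        static List<int[]> encode(byte[] blocks) {
            List<int[]> runs = new ArrayList<>();
            int i = 0;
            while (i < blocks.length) {
                int j = i;
                while (j < blocks.length && blocks[j] == blocks[i]) j++;
                runs.add(new int[]{j - i, blocks[i]});
                i = j;
            }
            return runs;
        }

        public static void main(String[] args) {
            // A 256-block column: mostly stone (1), a little dirt (2), then air (0).
            byte[] column = new byte[256];
            for (int y = 0; y < 60; y++) column[y] = 1;
            for (int y = 60; y < 64; y++) column[y] = 2;
            // Blocks 64..255 stay 0 (air).

            System.out.println(encode(column).size() + " runs instead of 256 bytes"); // 3 runs
        }
    }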

  • Edli, Member, Posts: 941
    Originally posted by grndzro

    There is plenty of software that dosent work with Windows 8.


    Plenty?  Everything I use worked just fine.  Is there software that doesn't work with 8?  Sure, just like there was software that didn't work with 7, with Vista, with XP.  At work we use an older version of OS X because the software we rely on doesn't work with the newer version.  That is the case for every single OS out there, whether it's for PCs or phones.  Trying to pass this off as an 8-only problem just to prove your point is disingenuous as hell.

    As for older IE compatibility, why should I give a crap about that?  I'm assuming this is about IE6 and 7?  Yeah, they should die in a fire.

    As for that list of bugs, name any OS and I can do the same.  For an OS used by millions, I can bring you links where people had this or that problem.  I can do that for every OS, because there is no such thing as a 100% perfect OS.

    For me, 8 has been working better than 7.  It seems to manage resources better, crashes less, doesn't get bogged down, and is just snappier overall.  The problem with 8 has always been the design decision of that start screen.

  • Ridelynn, Epic Member, Posts: 7,383

    For the entire Windows 7 vs 8 derail on here:

    I side with the Win8 folks; 8 has its quirks, but if you haven't figured out 8 by now and still feel the need to complain about it, it's your own fault for not being able to adapt.

  • Ridelynn, Epic Member, Posts: 7,383


    Originally posted by Quizzical
    That actually leads to a topic that I've been meaning to make a post about for a while.  Suppose that DirectX 11.3 and DirectX 12 get a bunch of cool stuff in 2015, but it's Windows 8.1 or later only.  And then suppose that OpenGL gets the same stuff in 2016, and it runs on any OS, including Linux and Mac OS X, but more importantly, also Windows 7.  (Actually, given the way Apple writes drivers, features that come to Windows and Linux in 2016 will probably be available on Mac OS X sometime around 2020.)  If both AMD and Nvidia have to support it via DirectX, then they'll probably agree to implement it in OpenGL.  If you want to make a game using the new stuff to launch in 2017, why use Direct3D for graphics?  OpenGL would make more sense.  That sort of thing could easily kill off Direct3D, at least among games not content to mostly remain in the DirectX 9.0c era.

    Realistically - most people who are writing games aren't dealing a lot with DX or OGL code.  Sure, there's some level of it when you are really pushing the envelope.  For the most part, that's just the AAA guys and the studios that are big enough to write their own engines.

    Most everyone else - they are just taking an off-the-shelf engine (Unity, Unreal, CryEngine, Source, etc), they are plugging in their models and textures and AI, and they let the engine take care of the majority of the graphics calls. A few tweaks here and there for optimization, or to accomplish something that the engine can't do natively, but a 3rd party engine is doing the heavy lifting for the vast majority of games out there right now, and we even see that in a lot of MMOs today (Aion, Vindictus, Archeage, SWTOR, TERA, DCUO, ESO just to name a few bigger names).

    That means that developers aren't going to care much whether their code is running on OGL or DX underneath, so long as it looks and performs about the same.  With the engine taking care of the bulk of the work, they are free to port across OSes easily.  That's a big reason we now see a lot of cross-platform releases coming from the developers themselves on or near the original release date, rather than from cross-platform porting specialists (i.e. Aspyr) months or years after the initial release - and that can happen with or without a big war between DX and OGL on Windows.

  • Quizzical, Legendary Member, Posts: 25,489
    Originally posted by Ridelynn



    Originally posted by Quizzical

    Originally posted by Ridelynn
    Maybe I don't know enough about Voxels or Minecraft, but at least in the case of Minecraft, why would the textures be 1024x1024x1024? Wouldn't they be 1024x1024x6, under the worst case for a cube-shaped voxel where each face has a distinct texture?
    I don't know how Minecraft implements it, either.  I don't think there is an obvious way to implement voxels in OpenGL.  But I'd suspect that 3D textures have something to do with it, and having more flexibility with what you can do with 3D textures would likely help a lot.


    Well, in your 1024x1024x1024 example, each voxel would be a single pixel, and the voxel itself would be viewable (and destructible) down to a single pixel, not the 1m x 1m x 1m cube that Minecraft uses.

    Sure, you could get into some obscure shapes other than neat cubes, but still, I think taking your voxel down to the pixel is a bit extreme.

    3D texture isn't quite the same thing as a 3D object. You don't have to texture the skin under the clothes, the muscles under the skin, the organs under the muscles, and the bones under that, which is what a true perfect 3D texture would contain under your 1024x1024x1024 description.

    (Your physics and animation may model that stuff, but that's different than graphically texturing it.)

    From my understanding, a 3D texture is what it would take to wrap around a 3D object, not a true 3D array.


    The reason I brought up the 1024x1024x1024 texture above was to make the argument that you can't have a ton of 3D textures that are large in all three dimensions.

    You certainly don't want to implement voxels by having a separate texture for each face, so that a simple cube already requires 6 textures and more complicated objects require even more.  That will completely kill your performance with the overhead of switching back and forth between textures.  A shader program also uses a fixed set of textures no matter what vertex data you give it, which could easily make a huge mess.
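    One common workaround - and I'd guess, though it's only a guess, something like what GL 2.1-era blocky games do - is to pack all the block faces into one big 2D texture atlas and offset each face's texture coordinates into it, so a single bound texture covers every block type.  A toy sketch of the coordinate math:

    public class AtlasUv {
        // A 16x16 grid of block-face tiles packed into one square atlas texture.
        static final int TILES_PER_ROW = 16;

        // Map a tile index plus local (u, v) in [0, 1] to atlas coordinates,
        // so every face can share the one bound texture.
        static float[] atlasUv(int tileIndex, float u, float v) {
            float tile = 1.0f / TILES_PER_ROW;
            float u0 = (tileIndex % TILES_PER_ROW) * tile;
            float v0 = (tileIndex / TILES_PER_ROW) * tile;
            return new float[]{u0 + u * tile, v0 + v * tile};
        }

        public static void main(String[] args) {
            // Tile 17 sits at column 1, row 1 of the atlas.
            float[] uv = atlasUv(17, 0.5f, 0.5f);
            System.out.println(uv[0] + ", " + uv[1]); // 0.09375, 0.09375
        }
    }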

    OpenGL does have cube map textures, which are basically a collection of six textures, one corresponding to each face of a cube.  But a separate cube map per block type wouldn't fix the switching overhead, and cube map arrays, which would, were only added in OpenGL 4.0, while Minecraft uses 2.1 or something like that, so this isn't how Minecraft does it.

    A texture is just a lookup table that is read-only from the perspective of the GPU, and hence thread-safe.  GPUs also have some hardware to optionally interpolate between values or wrap around the texture at its edges, but that doesn't seem useful to me for voxels.  Think of the difference between a 2D and 3D texture as being the difference between:

    // a 2D lookup table, indexed by two coordinates
    int[][] data = new int[64][64];

    and

    // a 3D lookup table, indexed by three coordinates
    int[][][] data = new int[64][64][64];

  • Quizzical, Legendary Member, Posts: 25,489
    Originally posted by Ridelynn



    Originally posted by Quizzical
    That actually leads to a topic that I've been meaning to make a post about for a while.  Suppose that DirectX 11.3 and DirectX 12 get a bunch of cool stuff in 2015, but it's Windows 8.1 or later only.  And then suppose that OpenGL gets the same stuff in 2016, and it runs on any OS, including Linux and Mac OS X, but more importantly, also Windows 7.  (Actually, given the way Apple writes drivers, features that come to Windows and Linux in 2016 will probably be available on Mac OS X sometime around 2020.)  If both AMD and Nvidia have to support it via DirectX, then they'll probably agree to implement it in OpenGL.  If you want to make a game using the new stuff to launch in 2017, why use Direct3D for graphics?  OpenGL would make more sense.  That sort of thing could easily kill off Direct3D, at least among games not content to mostly remain in the DirectX 9.0c era.


    Realistically - most people who are writing games aren't dealing a lot with DX or OGL code.  Sure, there's some level of it when you are really pushing the envelope.  For the most part, that's just the AAA guys and the studios that are big enough to write their own engines.

    Most everyone else - they are just taking an off-the-shelf engine (Unity, Unreal, CryEngine, Source, etc), they are plugging in their models and textures and AI, and they let the engine take care of the majority of the graphics calls. A few tweaks here and there for optimization, or to accomplish something that the engine can't do natively, but a 3rd party engine is doing the heavy lifting for the vast majority of games out there right now, and we even see that in a lot of MMOs today (Aion, Vindictus, Archeage, SWTOR, TERA, DCUO, ESO just to name a few bigger names).

    That means that developers aren't going to care much whether their code is running on OGL or DX underneath, so long as it looks and performs about the same.  With the engine taking care of the bulk of the work, they are free to port across OSes easily.  That's a big reason we now see a lot of cross-platform releases coming from the developers themselves on or near the original release date, rather than from cross-platform porting specialists (i.e. Aspyr) months or years after the initial release - and that can happen with or without a big war between DX and OGL on Windows.

    Someone makes a decision of whether to use OpenGL or DirectX, whether it's the person programming the game or the company they licensed an engine from.  And if a game is available for platforms besides Windows and uses a GPU to do non-trivial work, it's nearly certain that it's using OpenGL or some subset of it.
