So apparently Nvidia Tegra 4 doesn't support any modern graphics APIs

Quizzical Member Legendary Posts: 25,350

Preceding the recent launch/announcement/press conference/whatever that was about Nvidia Tegra 4, there were rumors online that it would support DirectX 11, OpenGL 4, OpenCL, CUDA, and everything else under the sun.  Launch day coverage from some sites repeated those rumors as fact.

You know who didn't repeat them?  Nvidia.  Uh oh.  Hopes of a Tegra 4 with modern graphics akin to GeForce 600 series chips were dashed.  GeForce 6000 series is closer to the truth:  the new Tegra 4 graphics architecture is heavily based on the old GeForce 6000/7000 series cards.  Those are so old that Nvidia has cycled through all of the possibilities for a first digit, started over at 1, and made it back to 6 again, with 7 coming up soon.

Tegra 4 won't even support OpenGL ES 3.0, which is the latest version of the gimpy 3D graphics API aimed at smartphones and other devices not expected to be able to do much in the way of 3D graphics.

I was hoping that we'd see the new generation of tablet hardware support the modern APIs such as DirectX 11 and OpenGL 4.  At least one variant of the upcoming Imagination PowerVR Rogue series 6 graphics will.

At least we still have AMD Temash, which uses AMD's latest and greatest GCN architecture, with support for DirectX 11.1, OpenGL 4.3, and everything else you might want.  That's great for people who want a Windows 8 tablet, but now AMD says that they're not going to do the work necessary to make Android run on it.  I read that as "we talked to tablet vendors and couldn't convince any of them to pay what we want for a Temash-based Android tablet when they can get ARM chips for $30".

Incidentally, the quad core version of Temash will be four Jaguar cores clocked at 1 GHz with two GCN CUs in a TDP of around 8 W.  There will also be a dual core version that uses about half as much power, so I'm guessing that's two Jaguar cores at 1 GHz and 1 GCN CU.  Kabini is reportedly the same silicon as Temash, except clocked higher.

Comments

  • ShakyMo Member Common Posts: 7,207
    Dx 11.1 won't be important anytime soon. Win 8 is even less popular than Win Vista; it's possibly Microsoft's biggest disaster to date.
  • botrytis Member Rare Posts: 3,363
    Originally posted by ShakyMo
    Dx 11.1 won't be important anytime soon. Win 8 is even less popular than Win Vista; it's possibly Microsoft's biggest disaster to date.

    I don't think it is the biggest failure - that was Win ME.

    The biggest thing for Win 8 is that it is cohesive across types of platforms, i.e. phones, tablets, and PCs. Having one cohesive system will be important because it will be easier to move apps across platforms rather than having specific apps for each one.


  • Thorkune Member Uncommon Posts: 1,969
    I must be the only person that actually liked Win ME. I had zero issues with it.
  • TheLizardbones Member Common Posts: 10,910

    If the Tegra 4 doesn't really do anything more than the Tegra 3, and if it's not really comparable to or better than the current console graphics, then what is the compelling reason to use one? It's not like phones and tablets are struggling with the current crop of games.

    I can not remember winning or losing a single debate on the internet.

  • ShakyMo Member Common Posts: 7,207
    The sales figures for Win 8 are lower than the sales figures for Win Vista were when Vista was the same age; Win 8 is the lowest-selling version of Windows ever.
  • KenFisher Member Uncommon Posts: 5,035
    Originally posted by lizardbones

    If the Tegra 4 doesn't really do anything more than the Tegra 3, and if it's not really comparable to or better than the current console graphics, then what is the compelling reason to use one? It's not like phones and tablets are struggling with the current crop of games.

    I was wondering this myself.  If not a major improvement in API support, what changed?


    Ken Fisher - Semi retired old fart Network Administrator, now working in Network Security.  I don't Forum PVP.  If you feel I've attacked you, it was probably by accident.  When I don't understand, I ask.  Such is not intended as criticism.
  • Quizzical Member Legendary Posts: 25,350
    Originally posted by ShakyMo
    Dx 11.1 won't be important anytime soon. Win 8 is even less popular than Win Vista; it's possibly Microsoft's biggest disaster to date.

    It's not just that it won't support DirectX 11.1.  It won't support DirectX 11, either.  Or 10.  It's not entirely clear whether it will even support 9.0c.  It won't support any recent version of OpenGL, either.  It likely won't support any version of OpenGL at all.  It doesn't even support OpenGL ES 3.0, instead relying on the OpenGL ES 2.0 standard that was meant for cell phones in 2007.  And yes, I do mean cell phones as opposed to tablets, as the first iPad was still about three years away back then.  That's a problem whether you want to run Windows 8, Windows RT, Android, or whatever else you decide on.
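
    For what it's worth, here's roughly how an app finds out which ES version it actually got from the driver (a minimal C++ sketch of my own, not anything from Nvidia; it assumes an OpenGL ES context is already current on the calling thread):

        #include <GLES2/gl2.h>
        #include <cstdio>

        // Ask the driver what it actually implements.  On a Tegra 4 device
        // you'd expect something like "OpenGL ES 2.0", while Adreno 300 or
        // Mali T600 parts would report "OpenGL ES 3.0".
        void print_gles_version()
        {
            const char *renderer = reinterpret_cast<const char *>(glGetString(GL_RENDERER));
            const char *version  = reinterpret_cast<const char *>(glGetString(GL_VERSION));
            std::printf("Renderer: %s\nVersion:  %s\n", renderer, version);
        }

    Any game that wants ES 3.0 features has to do a check like this and carry a fallback path, which is exactly the fragmentation problem a new flagship chip shouldn't still be feeding.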

  • ShakyMo Member Common Posts: 7,207
    This isn't a comment on the quality of Win 8; it probably is better than Win Vista and some other dodgy Windows versions. It's a failure for Microsoft because they haven't been able to sell it, for a number of reasons:
    1. Offices don't want to retrain staff on the new version.
    2. Home desktop users don't need it, unlike Vista/7, where the upgrade let you move to 64-bit, use multiple monitors, use SSDs, etc. Not many people are going to see touchscreen support on a desktop as a must-have.
    3. Tech geeks haven't got it due to worries over the Win Store and what it might mean.
    4. The rise of tablets, the drop in laptop sales, and the near death of netbooks: no one's going to get a Win tablet when there's such a huge amount of apps for iOS and Android, and both OSes run leaner.
  • ShakyMo Member Common Posts: 7,207
    Oh quiz, no DX10 or DX11 sounds silly. No DX9 would be a disaster; what the hell would you run on it? How old is DX9 now? You're talking very old games that use older versions of DirectX.
  • GrayGhost79 Member Uncommon Posts: 4,775
    Originally posted by Quizzical

    Preceding the recent launch/announcement/press conference/whatever that was about Nvidia Tegra 4, there were rumors online that it would support DirectX 11, OpenGL 4, OpenCL, CUDA, and everything else under the sun.  Launch day coverage from some sites repeated those rumors as fact.

    You know who didn't repeat them?  Nvidia.  Uh oh.  Hopes of a Tegra 4 with modern graphics akin to GeForce 600 series chips were dashed.  GeForce 6000 series is closer to the truth:  the new Tegra 4 graphics architecture is heavily based on the old GeForce 6000/7000 series cards.  Those are so old that Nvidia has cycled through all of the possibilities for a first digit, started over at 1, and made it back to 6 again, with 7 coming up soon.

    Tegra 4 won't even support OpenGL ES 3.0, which is the latest version of the gimpy 3D graphics API aimed at smartphones and other devices not expected to be able to do much in the way of 3D graphics.

    I was hoping that we'd see the new generation of tablet hardware support the modern APIs such as DirectX 11 and OpenGL 4.  At least one variant of the upcoming Imagination PowerVR Rogue series 6 graphics will.

    At least we still have AMD Temash, which uses AMD's latest and greatest GCN architecture, with support for DirectX 11.1, OpenGL 4.3, and everything else you might want.  That's great for people who want a Windows 8 tablet, but now AMD says that they're not going to do the work necessary to make Android run on it.  I read that as "we talked to tablet vendors and couldn't convince any of them to pay what we want for a Temash-based Android tablet when they can get ARM chips for $30".

    Incidentally, the quad core version of Temash will be four Jaguar cores clocked at 1 GHz with two GCN CUs in a TDP of around 8 W.  There will also be a dual core version that uses about half as much power, so I'm guessing that's two Jaguar cores at 1 GHz and 1 GCN CU.  Kabini is reportedly the same silicon as Temash, except clocked higher.

    Still feel like giving any sort of praise to Nvidia's "Shield" console? lol...

    I tried to tell you it was junk lol.

  • adam_nox Member Uncommon Posts: 2,148
    Isn't DirectX a direct interface with the CPU, i.e. not interpreted or portable?  I thought that meant you had to build a version of DirectX for the architecture of the CPU.  Perhaps you can still put an API on a graphics chip to support it, but I would think that it wouldn't work with smartphone CPUs.  Mind you, this is based on a bunch of things I just kind of assumed from random posts and stuff.  That, and the ordeal Android had to go through just to get Flash on it (another thing that has to directly access the CPU and therefore was not portable between architectures).
  • Quizzical Member Legendary Posts: 25,350
    Originally posted by XAPGames
    Originally posted by lizardbones

    If the Tegra 4 doesn't really do anything more than the Tegra 3, and if it's not really comparable to or better than the current console graphics, then what is the compelling reason to use one? It's not like phones and tablets are struggling with the current crop of games.

    I was wondering this myself.  If not a major improvement in API support, what changed?

    Since when do all products need to have a compelling reason to use them?  There are more than a few for which the target market is "people who don't know any better".

    Tegra 4 will scale performance way up when using the very old API(s) that it supports.  It offers 72 shaders, while Tegra 3 only had 12 shaders.  It also offers four Cortex A15 cores, while Tegra 3 had four Cortex A9 cores, and Cortex A15 is massively faster than Cortex A9.  It's a full node die shrink from 40 nm to 28 nm, which means much better energy efficiency is possible.  But adding that much more performance means that power consumption is going to go way up unless you throttle clocks back severely.

    If you're not a gamer (in which case, why are you on this forum?), but need a tablet that offers a lot of CPU performance (for a tablet), then Tegra 4 might fit the bill nicely.  Though if you need a lot of CPU performance and are shopping for a tablet, you're likely doing something wrong.

    There will also likely be a lot of games launching in the coming years that still use OpenGL ES 2.0.  Look how many games still use DirectX 9.0c.  I haven't dug into the differences between OpenGL ES 2.0 and 3.0, but I do know that 3.0 still only offers two programmable pipeline stages, as compared to the six in DirectX 11.1 and OpenGL 4.3.
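
    To make "two programmable pipeline stages" concrete: in OpenGL ES 2.0, the entire programmable pipeline is one vertex shader plus one fragment shader, like the toy pair below (my own minimal example, written as the C++ string literals you'd embed in an app).  Everything else is fixed function.

        // Stage 1 of 2: runs once per vertex.
        static const char *kVertexShader =
            "attribute vec4 a_position;\n"
            "uniform mat4 u_mvp;\n"
            "void main() {\n"
            "    gl_Position = u_mvp * a_position;\n"
            "}\n";

        // Stage 2 of 2: runs once per fragment (pixel).
        static const char *kFragmentShader =
            "precision mediump float;\n"
            "uniform vec4 u_color;\n"
            "void main() {\n"
            "    gl_FragColor = u_color;\n"
            "}\n";

    There's simply no slot in that pipeline for tessellation or geometry work; anything of that sort has to happen on the CPU before the draw call is ever issued.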

    There is a minimum performance threshold for tessellation to really be useful, because if you try to use it and turn it down too far, you can see the jumps as it tessellates the model differently.  (You know how you can see when a game switches from one model to another as it gets closer?  Imagine that jump happening 20 separate times, and being visible every single one.)  But the threshold for having enough vertices that you can't see the jumps as it changes how a model is tessellated isn't really that high.  The Radeon HD 6310 integrated graphics in my E-350 based laptop/netbook has enough performance to use tessellation and make it look very smooth, and the quad core version of AMD Temash will have more performance yet.  If you've got enough performance to use tessellation, then it's a huge, huge performance optimization.

    Missing geometry shaders, though, is a killer problem at any performance level, and OpenGL ES 3.0 doesn't have those, either.  Geometry shaders mean you can do computations on the GPU that require you to see an entire triangle at once.  They also mean you can create new triangles or toss out old ones.  Not having geometry shaders means that any effects that require either of those have to be done on the CPU.  Taking GPU-friendly computations and running them on a CPU instead is not a recipe for energy efficiency.  That leads to removing effects entirely, or at least having to tone them way down, which is how you get rainstorms that have about 20 or 30 raindrops on the screen at a time, each of which is a six-foot-long spear that looks like it would impale and kill you if you touched it.
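
    For anyone who hasn't seen one, this is the sort of stage that's missing.  A desktop GLSL geometry shader (a toy example of my own, not from any particular game) sees a whole triangle at once and decides, per primitive, whether to pass it through, drop it, or emit extra geometry:

        // Desktop OpenGL 3.2+ geometry shader.  OpenGL ES 2.0/3.0 has no
        // equivalent stage at all.
        static const char *kGeometryShader =
            "#version 150\n"
            "layout(triangles) in;\n"
            "layout(triangle_strip, max_vertices = 3) out;\n"
            "void main() {\n"
            "    // The whole triangle is visible here at once; returning\n"
            "    // without emitting anything throws the triangle away.\n"
            "    for (int i = 0; i < 3; ++i) {\n"
            "        gl_Position = gl_in[i].gl_Position;\n"
            "        EmitVertex();\n"
            "    }\n"
            "    EndPrimitive();\n"
            "}\n";

    Raising max_vertices and calling EmitVertex() more times is how you spawn new triangles; that's the part that has to move back to the CPU on hardware like Tegra 4.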

  • Quizzical Member Legendary Posts: 25,350
    Originally posted by Torvaldr
    Originally posted by bigsmiff
    I must be the only person that actually liked Win ME. I had zero issues with it.

    I never had an issue with it either.  For that matter, I've never had a problem using any version of Windows (desktop or server) unless it was just getting long in the tooth and couldn't support modern features.  It's laughable when people claim Win8 is less popular than Vista.  It's the same old "let's hate everything" mentality that seems to be fostered by the blogosphere.

    Thanks Quizz for pointing out the Tegra 4 flaws.  What are going to be the competing options?

    ARM Mali T600 series and Qualcomm Adreno 300 series graphics support OpenGL ES 3.0, but not the full OpenGL in any form.  They also support DirectX 11 feature level 9_3, which is a convoluted way of saying essentially DirectX 9.0c and nothing later.  Imagination's upcoming PowerVR 600 series graphics will support all of that, and at least some (but apparently not all) of the chips in its new generation will reportedly also support OpenGL 4 and the real DirectX 11.  All of those will be paired with ARM cores of some sort.  Usually it will be ARM Cortex A15 cores, but Qualcomm makes their own Krait cores that are similar to Cortex A15, and whatever Apple decides to license will be paired with Apple Swift cores or some successor to them.

    There's also AMD Temash, which is x86, and therefore, Windows-only.  AMD could make it run on Android, but says they won't.  That will support everything that desktop Radeon HD 7000 series graphics support, which means everything that matters as far as APIs go.  It will just run it a lot slower than the desktop cards--though it shouldn't be too far behind a Radeon HD 5450.

  • KenFisher Member Uncommon Posts: 5,035
    Originally posted by adam_nox
    Isn't DirectX a direct interface with the CPU, i.e. not interpreted or portable?  I thought that meant you had to build a version of DirectX for the architecture of the CPU.  Perhaps you can still put an API on a graphics chip to support it, but I would think that it wouldn't work with smartphone CPUs.  Mind you, this is based on a bunch of things I just kind of assumed from random posts and stuff.  That, and the ordeal Android had to go through just to get Flash on it (another thing that has to directly access the CPU and therefore was not portable between architectures).

    Not sure on this, but I think Windows RT runs directly on ARM hardware and would have DirectX support, BUT only for programs specifically compiled to run on RT.  That would rule out 99.999% of current games.

    I haven't followed what MS is doing with all this, but to me it sounds like a nightmare for developers.


    Ken Fisher - Semi retired old fart Network Administrator, now working in Network Security.  I don't Forum PVP.  If you feel I've attacked you, it was probably by accident.  When I don't understand, I ask.  Such is not intended as criticism.
  • KenFisher Member Uncommon Posts: 5,035
    Originally posted by Quizzical

    Though if you need a lot of CPU performance and are shopping for a tablet, you're likely doing something wrong.

    *BIG GRIN*  That made me chuckle.

    Then again, I have FTP and Samba servers on my Kindle Fire, so who am I to talk?


    Ken Fisher - Semi retired old fart Network Administrator, now working in Network Security.  I don't Forum PVP.  If you feel I've attacked you, it was probably by accident.  When I don't understand, I ask.  Such is not intended as criticism.
  • Quizzical Member Legendary Posts: 25,350
    Originally posted by adam_nox
    Isn't DirectX a direct interface with the CPU, i.e. not interpreted or portable?  I thought that meant you had to build a version of DirectX for the architecture of the CPU.  Perhaps you can still put an API on a graphics chip to support it, but I would think that it wouldn't work with smartphone CPUs.  Mind you, this is based on a bunch of things I just kind of assumed from random posts and stuff.  That, and the ordeal Android had to go through just to get Flash on it (another thing that has to directly access the CPU and therefore was not portable between architectures).

    DirectX is a collection of a bunch of APIs.  By far the best known one is Direct3D, so people often say "DirectX" when they properly mean "Direct3D", which is what I did above.

    Direct3D basically gives programmers a way to tell video cards to do things.  That way, instead of making the CPU do all of the work to render a game (which would work, except that you'd likely measure frame rates in seconds (plural) per frame rather than frames per second), you can make the GPU do most of the work, while the CPU just has to do the work that isn't GPU-friendly.  OpenGL is the other graphics API with capabilities comparable to Direct3D.
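
    In code terms, a frame of rendering through one of these APIs boils down to the CPU queuing commands that the GPU then executes.  A rough OpenGL ES flavored sketch of my own (it assumes the shader program, vertex buffer, and attribute location were created elsewhere during setup):

        #include <GLES2/gl2.h>

        // The CPU's whole job here is to describe the work; the GPU does
        // the per-vertex and per-pixel heavy lifting the commands imply.
        void draw_frame(GLuint program, GLuint vbo, GLuint a_position, GLsizei vertex_count)
        {
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);   // reset the frame
            glUseProgram(program);                                // which shaders to run
            glBindBuffer(GL_ARRAY_BUFFER, vbo);                   // which vertex data to read
            glVertexAttribPointer(a_position, 3, GL_FLOAT, GL_FALSE, 0, 0);
            glEnableVertexAttribArray(a_position);
            glDrawArrays(GL_TRIANGLES, 0, vertex_count);          // GPU renders the triangles
        }

    Direct3D code looks different on the surface, but the division of labor is the same: the CPU issues the orders, and the GPU fills in the millions of pixels.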

  • TheLizardbones Member Common Posts: 10,910


    Originally posted by Quizzical
    Originally posted by XAPGames
    Originally posted by lizardbones
    If the Tegra 4 doesn't really do anything more than the Tegra 3, and if it's not really comparable to or better than the current console graphics, then what is the compelling reason to use one? It's not like phones and tablets are struggling with the current crop of games.
    I was wondering this myself.  If not a major improvement in API support, what changed?
    Since when do all products need to have a compelling reason to use them?  There are more than a few for which the target market is "people who don't know any better".

    Tegra 4 will scale performance way up when using the very old API(s) that it supports.  It offers 72 shaders, while Tegra 3 only had 12 shaders.  It also offers four Cortex A15 cores, while Tegra 3 had four Cortex A9 cores, and Cortex A15 is massively faster than Cortex A9.  It's a full node die shrink from 40 nm to 28 nm, which means much better energy efficiency is possible.  But adding that much more performance means that power consumption is going to go way up unless you throttle clocks back severely.

    If you're not a gamer (in which case, why are you on this forum?), but need a tablet that offers a lot of CPU performance (for a tablet), then Tegra 4 might fit the bill nicely.  Though if you need a lot of CPU performance and are shopping for a tablet, you're likely doing something wrong.

    There will also likely be a lot of games launching in the coming years that still use OpenGL ES 2.0.  Look how many games still use DirectX 9.0c.  I haven't dug into the differences between OpenGL ES 2.0 and 3.0, but I do know that 3.0 still only offers two programmable pipeline stages, as compared to the six in DirectX 11.1 and OpenGL 4.3.

    There is a minimum performance threshold for tessellation to really be useful, because if you try to use it and turn it down too far, you can see the jumps as it tessellates the model differently.  (You know how you can see when a game switches from one model to another as it gets closer?  Imagine that jump happening 20 separate times, and being visible every single one.)  But the threshold for having enough vertices that you can't see the jumps as it changes how a model is tessellated isn't really that high.  The Radeon HD 6310 integrated graphics in my E-350 based laptop/netbook has enough performance to use tessellation and make it look very smooth, and the quad core version of AMD Temash will have more performance yet.  If you've got enough performance to use tessellation, then it's a huge, huge performance optimization.

    Missing geometry shaders, though, is a killer problem at any performance level, and OpenGL ES 3.0 doesn't have those, either.  Geometry shaders mean you can do computations on the GPU that require you to see an entire triangle at once.  They also mean you can create new triangles or toss out old ones.  Not having geometry shaders means that any effects that require either of those have to be done on the CPU.  Taking GPU-friendly computations and running them on a CPU instead is not a recipe for energy efficiency.  That leads to removing effects entirely, or at least having to tone them way down, which is how you get rainstorms that have about 20 or 30 raindrops on the screen at a time, each of which is a six-foot-long spear that looks like it would impale and kill you if you touched it.

    People don't buy phones or tablets for their gaming needs. Even the people I know who are into mobile devices as a hobby don't seem to be in any need of a faster GPU, much less a more feature-filled GPU. For instance, why would I buy an Nvidia Shield and a Samsung phone when I could just get a Samsung phone and a Moga Pro Gamepad? All of the games will look the same and run the same, so there doesn't seem to be any particular reason to buy into the Tegra 4.

    A compelling reason would make the difference between people buying the devices they were going to buy anyway (just with the new chip, because that's all that's available) and people buying an additional device with the new chip, or a more expensive device with the new chip.

    It does sound like, from what you said, that the Tegra 4 will do more than the Tegra 3, though. It won't just be faster, it'll have more features. So that answers some of my wondering. The chip will do more than the previous generation, it won't just be faster. That's enough to be a compelling reason to buy into a Tegra 4-based device for some sort of gaming. Not a tablet, but maybe an Android MiniPC or the Shield.

    I can not remember winning or losing a single debate on the internet.

  • TheLizardbones Member Common Posts: 10,910


    Originally posted by XAPGames
    Originally posted by Quizzical
    Though if you need a lot of CPU performance and are shopping for a tablet, you're likely doing something wrong.

    *BIG GRIN*  That made me chuckle.

    Then again, I have FTP and Samba servers on my Kindle Fire, so who am I to talk?

    I've been toying with running a Minecraft server on my Nook. Not because it's a good idea, but because it's possible.

    I can not remember winning or losing a single debate on the internet.

  • Quizzical Member Legendary Posts: 25,350
    Originally posted by lizardbones

    It does sound like, from what you said, that the Tegra 4 will do more than the Tegra 3, though. It won't just be faster, it'll have more features. So that answers some of my wondering. The chip will do more than the previous generation, it won't just be faster. That's enough to be a compelling reason to buy into a Tegra 4-based device for some sort of gaming. Not a tablet, but maybe an Android MiniPC or the Shield.

     

    As compared to Tegra 3, I have the impression that Tegra 4 will have basically the same API support, but just has a lot more hardware to throw at it.  Sometimes that makes things viable that weren't before.  Radeon HD 6250 integrated graphics fully support DirectX 11, and if you run a demanding DirectX 11 game on them with all settings maxed, it likely will run the game.  (It's not guaranteed to run; sometimes slow hardware will cause crashes because it triggers corner cases that the developers didn't anticipate.)  But if you're only getting 2 frames per second and want to actually play the game rather than just testing to see if it runs, it can't do that--but a faster GPU based on exactly the same architecture could.

  • Zeblade Member Uncommon Posts: 931
    Nothing like taking a random thought and running with it. So where does all this info come from? "I read it from someone that heard someone talking that heard it from a phone call that was reading it off the internet". Try to remember where you're reading this.. just saying.. have fun.. haha
  • Quizzical Member Legendary Posts: 25,350
    Originally posted by Zeblade
    Nothing like taking a random thought and running with it. So where does all this info come from? "I read it from someone that heard someone talking that heard it from a phone call that was reading it off the internet". Try to remember where you're reading this.. just saying.. have fun.. haha

    http://www.anandtech.com/show/6666/the-tegra-4-gpu-nvidia-claims-better-performance-than-ipad-4

    It comes from reporting on Nvidia CEO Jen-Hsun Huang's presentation at CES.

  • TheLizardbones Member Common Posts: 10,910


    Originally posted by Quizzical
    Originally posted by lizardbones
    It does sound like, from what you said, that the Tegra 4 will do more than the Tegra 3, though. It won't just be faster, it'll have more features. So that answers some of my wondering. The chip will do more than the previous generation, it won't just be faster. That's enough to be a compelling reason to buy into a Tegra 4-based device for some sort of gaming. Not a tablet, but maybe an Android MiniPC or the Shield.
    As compared to Tegra 3, I have the impression that Tegra 4 will have basically the same API support, but just has a lot more hardware to throw at it.  Sometimes that makes things viable that weren't before.  Radeon HD 6250 integrated graphics fully support DirectX 11, and if you run a demanding DirectX 11 game on them with all settings maxed, it likely will run the game.  (It's not guaranteed to run; sometimes slow hardware will cause crashes because it triggers corner cases that the developers didn't anticipate.)  But if you're only getting 2 frames per second and want to actually play the game rather than just testing to see if it runs, it can't do that--but a faster GPU based on exactly the same architecture could.


    If a person has a Tegra 4 device, and a Tegra 3 device, and the experience is almost exactly the same, why would they dump their Tegra 3 device for the Tegra 4 device? Why would they add a Tegra 4 device? Games run fine on the current crop of hardware. I don't think they are even straining the edges of what the current hardware can do. Even regular applications aren't really pushing what the current crop of hardware can do.

    I don't think developers are going to try and push what the hardware can do unless something like the Tegra 4 is common. They would have a really limited market. Nvidia would have to pay developers to write games or applications that need the Tegra 4's power or people who buy Tegra 4 devices would have to be willing to pay a lot more for Tegra 4 games. It's a solution to a problem that doesn't exist yet.

    I can not remember winning or losing a single debate on the internet.

  • Quizzical Member Legendary Posts: 25,350
    Originally posted by lizardbones

    Originally posted by Quizzical

    Originally posted by lizardbones
    It does sound like, from what you said, that the Tegra 4 will do more than the Tegra 3, though. It won't just be faster, it'll have more features. So that answers some of my wondering. The chip will do more than the previous generation, it won't just be faster. That's enough to be a compelling reason to buy into a Tegra 4-based device for some sort of gaming. Not a tablet, but maybe an Android MiniPC or the Shield.
    As compared to Tegra 3, I have the impression that Tegra 4 will have basically the same API support, but just has a lot more hardware to throw at it.  Sometimes that makes things viable that weren't before.  Radeon HD 6250 integrated graphics fully support DirectX 11, and if you run a demanding DirectX 11 game on them with all settings maxed, it likely will run the game.  (It's not guaranteed to run; sometimes slow hardware will cause crashes because it triggers corner cases that the developers didn't anticipate.)  But if you're only getting 2 frames per second and want to actually play the game rather than just testing to see if it runs, it can't do that--but a faster GPU based on exactly the same architecture could.

    If a person has a Tegra 4 device, and a Tegra 3 device, and the experience is almost exactly the same, why would they dump their Tegra 3 device for the Tegra 4 device? Why would they add a Tegra 4 device? Games run fine on the current crop of hardware. I don't think they are even straining the edges of what the current hardware can do. Even regular applications aren't really pushing what the current crop of hardware can do.

    I don't think developers are going to try and push what the hardware can do unless something like the Tegra 4 is common. They would have a really limited market. Nvidia would have to pay developers to write games or applications that need the Tegra 4's power or people who buy Tegra 4 devices would have to be willing to pay a lot more for Tegra 4 games. It's a solution to a problem that doesn't exist yet.

    That's kind of like saying that there's no need for a faster console because Xbox 360 games run fine on an Xbox 360.  Or going back quite a while, that there was no need for a faster console than an NES because NES games ran fine on an NES.  The games that didn't run fine weren't created, precisely because they wouldn't have run well.

    If you're planning on tossing any product that you buy today in the garbage six months from now, then maybe you don't need a faster product.  But a faster product with better API support might well be able to run games smoothly that you want to pick up two or three years from now, while a slower product won't be able to.

  • TheLizardbones Member Common Posts: 10,910


    Originally posted by Quizzical
    Originally posted by lizardbones
    Originally posted by Quizzical
    Originally posted by lizardbones
    It does sound like, from what you said, that the Tegra 4 will do more than the Tegra 3, though. It won't just be faster, it'll have more features. So that answers some of my wondering. The chip will do more than the previous generation, it won't just be faster. That's enough to be a compelling reason to buy into a Tegra 4-based device for some sort of gaming. Not a tablet, but maybe an Android MiniPC or the Shield.
    As compared to Tegra 3, I have the impression that Tegra 4 will have basically the same API support, but just has a lot more hardware to throw at it.  Sometimes that makes things viable that weren't before.  Radeon HD 6250 integrated graphics fully support DirectX 11, and if you run a demanding DirectX 11 game on them with all settings maxed, it likely will run the game.  (It's not guaranteed to run; sometimes slow hardware will cause crashes because it triggers corner cases that the developers didn't anticipate.)  But if you're only getting 2 frames per second and want to actually play the game rather than just testing to see if it runs, it can't do that--but a faster GPU based on exactly the same architecture could.
    If a person has a Tegra 4 device, and a Tegra 3 device, and the experience is almost exactly the same, why would they dump their Tegra 3 device for the Tegra 4 device? Why would they add a Tegra 4 device? Games run fine on the current crop of hardware. I don't think they are even straining the edges of what the current hardware can do. Even regular applications aren't really pushing what the current crop of hardware can do.
    I don't think developers are going to try and push what the hardware can do unless something like the Tegra 4 is common. They would have a really limited market. Nvidia would have to pay developers to write games or applications that need the Tegra 4's power or people who buy Tegra 4 devices would have to be willing to pay a lot more for Tegra 4 games. It's a solution to a problem that doesn't exist yet.
    That's kind of like saying that there's no need for a faster console because Xbox 360 games run fine on an Xbox 360.  Or going back quite a while, that there was no need for a faster console than an NES because NES games ran fine on an NES.  The games that didn't run fine weren't created, precisely because they wouldn't have run well.

    If you're planning on tossing any product that you buy today in the garbage six months from now, then maybe you don't need a faster product.  But a faster product with better API support might well be able to run games smoothly that you want to pick up two or three years from now, while a slower product won't be able to.

    A new console wouldn't just be faster though. It would have more features. We ended up with the N64 because it added features and options that didn't exist until it was created. People weren't buying N64s to play NES games, they were buying N64s to play N64 games with N64 features. The Tegra 4 isn't doing this. I went and looked it up, and other analysts are saying what you're saying. The Tegra 4 is definitely faster, but it's not doing anything new.

    Now, the rest is my own analysis, such as it is. The Tegra 4 doesn't add any new features, and there is also no need for faster processors. Games like Pocket Legends and Minecraft PE that run fine on my Nook 16gb run fine on my old HTC phone. They also run fine on my brand new Motorola phone. My phone has a better processor than my Nook, and way better than my old HTC phone, but my experience is the same on all these devices. What's my incentive for picking a Tegra 4 device over another, cheaper device? Unless Tegra 4 is the only option in the device I'm picking, there's no particular reason to gravitate there. I would choose based on my experiences with HTC and Samsung, and choose between those vendors rather than looking for a Tegra 4 device. It seems to me that a lot of other consumers would do the same thing.

    I can not remember winning or losing a single debate on the internet.

  • Quizzical Member Legendary Posts: 25,350
    Originally posted by lizardbones

    A new console wouldn't just be faster though. It would have more features. We ended up with the N64 because it added features and options that didn't exist until it was created. People weren't buying N64s to play NES games, they were buying N64s to play N64 games with N64 features. The Tegra 4 isn't doing this. I went and looked it up, and other analysts are saying what you're saying. The Tegra 4 is definitely faster, but it's not doing anything new.

    Now, the rest is my own analysis, such as it is. The Tegra 4 doesn't add any new features, and there is also no need for faster processors. Games like Pocket Legends and Minecraft PE that run fine on my Nook 16gb run fine on my old HTC phone. They also run fine on my brand new Motorola phone. My phone has a better processor than my Nook, and way better than my old HTC phone, but my experience is the same on all these devices. What's my incentive for picking a Tegra 4 device over another, cheaper device? Unless Tegra 4 is the only option in the device I'm picking, there's no particular reason to gravitate there. I would choose based on my experiences with HTC and Samsung, and choose between those vendors rather than looking for a Tegra 4 device. It seems to me that a lot of other consumers would do the same thing.

    As graphics APIs go, features and performance are sometimes interchangeable.

    Any 3D image that you can create in a modern game, you could likely create on a Silicon Graphics workstation running OpenGL 1.0 twenty years ago (monitor resolution or system memory capacity would block some things, though I'm not sure exactly what--but the API on its own wouldn't).  If you're using graphical features today that didn't exist then, you could do the computations on the CPU instead of the GPU to work around it.  What you couldn't do then is to run something with graphics on par with, say, WoW, at 60 frames per second.  Or 1 frame per second.  Likely not 1 frame per minute, either.

    Having the modern APIs and hardware means you can render things much faster.  It likely also makes things much easier to code.  It doesn't make possible things that were previously impossible, but it does make them practical.

    But more brute force performance also makes more things practical.  Even if a 1 MHz GPU chip with 1 shader, 1 TMU, 1 ROP, and 1 of everything else that is essential to a GPU chip were fully OpenGL 4.3 and DirectX 11.1 compliant, you still wouldn't be able to do much with it.
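
    The back-of-envelope arithmetic on that toy chip makes the point (my own illustrative numbers, obviously): even granting it a wildly optimistic one pixel per clock, at 1 MHz it can barely fill one low-resolution frame per second.

        #include <cstdio>

        int main()
        {
            const double clocks_per_second = 1.0e6;           // our toy 1 MHz GPU
            const double pixels_per_frame  = 1024.0 * 768.0;  // ~786k pixels per frame
            // Assume a perfect 1 pixel per clock with zero overdraw:
            std::printf("max fps: %.2f\n", clocks_per_second / pixels_per_frame);
            return 0;                                         // prints about 1.27
        }

    Features and brute force both have to be there before either one matters.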

    I agree with you that it's a big problem for Tegra 4 that it is rather lacking in API compliance.  That's why I started this thread.  But given the choice between getting a new Tegra 3 system or a new Tegra 4 system, the latter is an easy call.  For that matter, given the choice between a Tegra 4 or any ARM-based chips that were on the market a year ago, the only major reasons to choose against Tegra 4 are price, power consumption, and OS.  But that is equally true of many other chips that we'll see shortly.
