
Intel, tilting at windmills again.


Comments

  • Quizzical (Member Legendary, Posts: 23,665)
    Quizzical said:
    Ridelynn said:
    I am sure they are not breaking any laws of physics. Just because you haven't figured out the specifics hardly makes it impossible. Just clever.

    No one thought streaming movies over the internet would work either - too much bandwidth, too many CPU cycles to compress, too many compression artifacts. Think about the AOL days, when we had thumbnails of GIFs so they could load in a timely fashion. Then we figured out better methods of compression, bandwidth got better, and CPU cycles got cheaper. 20 years later, Netflix proves not only that you can do it well, but that you can make a big business out of it. It took some time, and there was no magic bullet, but a confluence of a lot of factors.

    Given that video is more or less "working", I don't think interactive gaming is very far off. I also don't think it will be simply taking the video output of a remote renderer and stuffing it into the internet tubes... I know that's basically what has been tried in the past, and is similar to what is happening now, but I think people are much more clever than that and will figure out even better optimizations to reduce latency/bandwidth.
    Agreed. I mean, look at GeForce Now from Nvidia:
    https://www.nvidia.com/en-us/geforce/products/geforce-now/mac-pc/

    With a 50 Mbps internet speed you can stream and play Tomb Raider at 1080p at 60 frames a second. What will this service look like in 10 to 15 years?
    At a cost of how much latency and how many compression artifacts?  Compressing the amount of data by nearly 99% isn't going to be free.  Saying that they can do it at all does nothing to dispute my claims that it will be inferior to just rendering the game locally.

    Besides, stream the game for an hour and you've used over 20 GB of bandwidth--more than enough to just download most (not all) games and play them locally.  Do that every day and your ISP will probably dislike you and may try to throttle your connection somehow.
    CNET says it is like playing on a 1070 even though her game was running on a 1080. Seems a fair trade-off considering she was playing it on a $200 laptop.
    Besides, this is an 8-to-15-year projection, not something for right now. I think not even considering this a possibility is taking the Blockbuster stance toward Netflix. ;) We shall see how it plays out.
    I don't know what the demo consisted of.  For a single demonstration, you can do things that are far too expensive to scale out to millions of consumers.  See, for example, Intel's recent 28-core CPU clocked at 5 GHz, which they didn't initially even say was overclocked, much less that it relied on phase-change cooling.

    If the remote server is close enough, and you've got enough bandwidth connecting it to the client, streaming games can seem to work well.  But the cost of making it work well greatly exceeds the cost of rendering the game locally.  Scale it down to something that you can offer at a more affordable price and the experience deteriorates, too.

    Even so, it's fairly likely that if you had let someone play a game on the streaming demo, and had an actual system with a GTX 1070 and otherwise identical hardware (including the monitor) right next to it, and could go back and forth playing the same game at the same settings on both, there would be a large and obvious difference between the two.

    Even a latency difference of 40 ms is a night and day difference if you can readily compare with and without the latency.  This was something that I discovered by accident when programming my game: a bug added that much latency where there hadn't previously been any, and it was immediately obvious that something was broken.
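    A rough back-of-the-envelope check on the figures above. The 50 Mbps stream rate is the one quoted from the GeForce Now example; the uncompressed-video numbers (1080p, 60 fps, 24 bits per pixel) are illustrative assumptions, not anything measured from the service:

    ```python
    # Rough arithmetic behind the "nearly 99%" compression and "over 20 GB per hour"
    # claims. Assumes uncompressed 1080p at 60 fps with 24 bits per pixel.

    stream_mbps = 50                      # advertised GeForce Now stream rate, Mbit/s
    raw_bps = 1920 * 1080 * 24 * 60       # uncompressed 1080p60 video, bits/s
    raw_mbps = raw_bps / 1e6              # ~2986 Mbit/s

    compression_ratio = stream_mbps / raw_mbps
    print(f"stream is {compression_ratio:.1%} of raw video"
          f" -> ~{1 - compression_ratio:.0%} of the data is compressed away")
    # stream is 1.7% of raw video -> ~98% of the data is compressed away

    gb_per_hour = stream_mbps * 3600 / 8 / 1000   # Mbit/s -> GB per hour of play
    print(f"data used per hour of play: {gb_per_hour:.1f} GB")
    # data used per hour of play: 22.5 GB
    ```

    The result lines up with the "nearly 99%" compression and "over 20 GB" per hour figures in the comment above.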
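    And a minimal sketch of where the extra latency comes from when a frame has to make a round trip to a remote renderer. Every per-stage number below is an illustrative assumption chosen only to show the shape of the budget; none of them describes a specific service:

    ```python
    # Illustrative input-to-photon latency budget: local rendering vs. streaming.
    # All per-stage numbers are assumptions for the comparison, not measurements.

    frame_ms   = 1000 / 60   # ~16.7 ms to render one frame at 60 fps (both cases)
    display_ms = 10          # assumed input/display overhead (both cases)

    # Extra stages a streamed frame has to pass through:
    encode_ms  = 8           # assumed video encode time on the server
    network_ms = 20          # assumed round trip to a nearby data center
    decode_ms  = 5           # assumed decode time on the client

    local_ms    = frame_ms + display_ms
    streamed_ms = frame_ms + display_ms + encode_ms + network_ms + decode_ms

    print(f"local:    {local_ms:.0f} ms")                      # ~27 ms
    print(f"streamed: {streamed_ms:.0f} ms")                   # ~60 ms
    print(f"added latency: {streamed_ms - local_ms:.0f} ms")   # ~33 ms -- the same
    # ballpark as the 40 ms difference described in the comment above
    ```

    The point is only that encode, network, and decode stages stack on top of whatever the renderer itself takes, which is why even a well-provisioned stream starts tens of milliseconds behind local rendering.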
  • blueturtle13 (Member Legendary, Posts: 13,070)
    Quizzical said:
    Quizzical said:
    But that is assuming we are still 'plugging in' to get our internet. All signs point to a wireless existence. Also, I understand what the 'optic' part means, but there are still limitations in fiber-optic cable that have nothing to do with the actual transfer itself.
    I mean, Kyle Fujita of EA is at E3 now streaming Titanfall 2 using EA's new cloud game-streaming service with nothing other than a TV, a controller, and an internet connection.
    You have that exactly backwards.  Wireless is becoming more wired than ever.  That's how they can improve bandwidth.

    If you're using electromagnetic radiation to transmit data, then the rate at which you can transmit data is limited by the amount of spectrum you can use and your signal-to-noise ratio.  That's just the way the universe works.

    So how can successive generations of wireless communications improve bandwidth?  By having more, smaller cells.  For a given amount of spectrum, you can get some amount of bandwidth per base station.  Have more base stations that cover smaller areas and you get more cumulative bandwidth in the same total area.  And what happens to the data after it reaches those base stations?  It's sent off into the network via a wired connection.

    On another topic, making game streaming look fine to someone watching but not playing the game is easy.  Adding a steady 200 ms of latency doesn't even matter if you're only watching but not playing.  You don't feel the latency when you're not playing it yourself.
    So I guess EA, Ubisoft, Microsoft, Sony, and Nvidia are just throwing away millions of dollars and resources? For a service that is already on the market and working when it supposedly shouldn't?
    Let me turn this question around.  We've seen this movie before:  lots of big companies throwing a bunch of money at game streaming as an alternative to rendering the game locally.  They had commercially available products that people could buy and use.  AMD and Nvidia both made custom hardware for them.  They supported the latest games, not just a compatibility mode for things that you couldn't run locally.

    And then gamers stayed away in droves because it cost way too much money for a very low-end experience, and the company went bankrupt.  Basically, they tried to pick a fight with the laws of physics, and physics won.  The laws of physics haven't changed in the last eight years.  What makes you think that trying the same thing again will lead to a different result?  Other than that a diversified product line means that losing all the money invested isn't likely to lead to bankruptcy.

    As a compatibility mode for games that you can't run locally, streaming makes some sense.  But it's the backup plan, not the ideal approach.
    That was then. This is now, and in a decade, even more so. ;)

    And it may not be the best approach right now, but it may be in the future, which was the entire point. For the more casual gamer, who makes up the vast majority of the gaming population, this will be an ideal situation.

    A turtle doesn't move when it sticks its neck out.
  • Quizzical (Member Legendary, Posts: 23,665)
    Quizzical said:
    Quizzical said:
    But that is assuming we are still 'plugging in' to get our internet. All signs point to a wireless existence. Also, I understand what the 'optic' part means, but there are still limitations in fiber-optic cable that have nothing to do with the actual transfer itself.
    I mean, Kyle Fujita of EA is at E3 now streaming Titanfall 2 using EA's new cloud game-streaming service with nothing other than a TV, a controller, and an internet connection.
    You have that exactly backwards.  Wireless is becoming more wired than ever.  That's how they can improve bandwidth.

    If you're using electromagnetic radiation to transmit data, then the rate at which you can transmit data is limited by the amount of spectrum you can use and your signal-to-noise ratio.  That's just the way the universe works.

    So how can successive generations of wireless communications improve bandwidth?  By having more, smaller cells.  For a given amount of spectrum, you can get some amount of bandwidth per base station.  Have more base stations that cover smaller areas and you get more cumulative bandwidth in the same total area.  And what happens to the data after it reaches those base stations?  It's sent off into the network via a wired connection.

    On another topic, making game streaming look fine to someone watching but not playing the game is easy.  Adding a steady 200 ms of latency doesn't even matter if you're only watching but not playing.  You don't feel the latency when you're not playing it yourself.
    So I guess EA, Ubisoft, Microsoft, Sony, and Nvidia are just throwing away millions of dollars and resources? For a service that is already on the market and working when it supposedly shouldn't?
    Let me turn this question around.  We've seen this movie before:  lots of big companies throwing a bunch of money at game streaming as an alternative to rendering the game locally.  They had commercially available products that people could buy and use.  AMD and Nvidia both made custom hardware for them.  They supported the latest games, not just a compatibility mode for things that you couldn't run locally.

    And then gamers stayed away in droves because it cost way too much money for a very low-end experience, and the company went bankrupt.  Basically, they tried to pick a fight with the laws of physics, and physics won.  The laws of physics haven't changed in the last eight years.  What makes you think that trying the same thing again will lead to a different result?  Other than that a diversified product line means that losing all the money invested isn't likely to lead to bankruptcy.

    As a compatibility mode for games that you can't run locally, streaming makes some sense.  But it's the backup plan, not the ideal approach.
    That was then. This is now, and in a decade, even more so. ;)

    And it may not be the best approach right now, but it may be in the future, which was the entire point. For the more casual gamer, who makes up the vast majority of the gaming population, this will be an ideal situation.
    Do explain:  what makes paying more for an inferior product "ideal" from the customer's perspective?
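    To put a rough number on the spectrum and signal-to-noise point quoted above: a single radio link is bounded by the Shannon-Hartley limit, so once spectrum and SNR are fixed, the remaining lever is cell density. The bandwidth and SNR values in this sketch are illustrative assumptions, not figures for any particular wireless standard:

    ```python
    import math

    # Shannon-Hartley: the capacity of one radio link is fixed by the spectrum it
    # occupies (B, in Hz) and its signal-to-noise ratio. More bits/s requires more
    # spectrum, a better SNR, or more (smaller) cells.
    def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Illustrative numbers, not any specific wireless standard:
    bandwidth_hz = 100e6              # 100 MHz of spectrum per cell
    snr_db = 20                       # 20 dB signal-to-noise ratio
    snr_linear = 10 ** (snr_db / 10)

    per_cell = shannon_capacity_bps(bandwidth_hz, snr_linear)
    print(f"one cell: ~{per_cell / 1e6:.0f} Mbit/s shared by everyone in it")

    # The remaining lever is cell density: cover the same area with more, smaller
    # cells (each wired back into the network) and cumulative capacity scales
    # with the number of cells.
    for cells in (1, 4, 16):
        print(f"{cells:>2} cells -> ~{cells * per_cell / 1e9:.1f} Gbit/s total over the same area")
    ```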
  • Quizzical (Member Legendary, Posts: 23,665)
    Comparing streaming a game to rendering it locally is like comparing two CPUs or GPUs when one is four full process nodes ahead of the other.  Maybe the product that is four process nodes behind can eventually reach any given level of performance or efficiency, but it has absolutely no hope of ever being competitive.
  • blueturtle13 (Member Legendary, Posts: 13,070)
    Quizzical said:
    Quizzical said:
    Ridelynn said:
    I am sure they are not breaking any laws of physics. Just because you haven't figured out the specifics hardly makes it impossible. Just clever.

    No one thought streaming movies over the internet would work either - too much bandwidth, too many CPU cycles to compress, too many compression artifacts. Think about the AOL days, when we had thumbnails of GIFs so they could load in a timely fashion. Then we figured out better methods of compression, bandwidth got better, and CPU cycles got cheaper. 20 years later, Netflix proves not only that you can do it well, but that you can make a big business out of it. It took some time, and there was no magic bullet, but a confluence of a lot of factors.

    Given that video is more or less "working", I don't think interactive gaming is very far off. I also don't think it will be simply taking the video output of a remote renderer and stuffing it into the internet tubes... I know that's basically what has been tried in the past, and is similar to what is happening now, but I think people are much more clever than that and will figure out even better optimizations to reduce latency/bandwidth.
    Agreed. I mean, look at GeForce Now from Nvidia:
    https://www.nvidia.com/en-us/geforce/products/geforce-now/mac-pc/

    With a 50 Mbps internet speed you can stream and play Tomb Raider at 1080p at 60 frames a second. What will this service look like in 10 to 15 years?
    At a cost of how much latency and how many compression artifacts?  Compressing the amount of data by nearly 99% isn't going to be free.  Saying that they can do it at all does nothing to dispute my claims that it will be inferior to just rendering the game locally.

    Besides, stream the game for an hour and you've used over 20 GB of bandwidth--more than enough to just download most (not all) games and play them locally.  Do that every day and your ISP will probably dislike you and may try to throttle your connection somehow.
    CNET says it is like playing on a 1070 even though her game was running on a 1080. Seems a fair trade-off considering she was playing it on a $200 laptop.
    Besides, this is an 8-to-15-year projection, not something for right now. I think not even considering this a possibility is taking the Blockbuster stance toward Netflix. ;) We shall see how it plays out.
    I don't know what the demo consisted of.  For a single demonstration, you can do things that are far too expensive to scale out to millions of consumers.  See, for example, Intel's recent 28-core CPU clocked at 5 GHz, which they didn't initially even say was overclocked, much less that it relied on phase-change cooling.

    If the remote server is close enough, and you've got enough bandwidth connecting it to the client, streaming games can seem to work well.  But the cost of making it work well greatly exceeds the cost of rendering the game locally.  Scale it down to something that you can offer at a more affordable price and the experience deteriorates, too.

    Even so, it's fairly likely that if you had let someone play a game on the streaming demo, and had an actual system with a GTX 1070 and otherwise identical hardware (including the monitor) right next to it, and could go back and forth playing the same game at the same settings on both, there would be a large and obvious difference between the two.

    Even a latency difference of 40 ms is a night and day difference if you can readily compare with and without the latency.  This was something that I discovered by accident when programming my game: a bug added that much latency where there hadn't previously been any, and it was immediately obvious that something was broken.
    You are basing this opinion on previous attempts years ago. Not the here. Not the now. Not the future. I understand your point of view I just don't agree with it. We can agree to disagree. Time will tell. 

    A turtle doesn't move when it sticks its neck out.
  • Ridelynn (Member Epic, Posts: 7,277)
    I can’t wait for Steam In-Home Streaming to come to iOS (if it ever does).

    That may be only local WiFi now - but I am certain my phone won’t have the same rendering power as my PC any time soon. But my phone is a hell of a lot easier to dork around with while I’m just lying in bed.

    There are already VPN workarounds for folks who have robust enough ISPs.

    So it’s definitely not that far off.

    PS4 has been doing it since release (PS Now), and both Xbox and PS4 support streaming your console content to a local PC over a local network.

    Once upon a time people thought Apple was crazy for plunking a huge investment into Akamai, a content delivery network (CDN). Then iTunes rolled around and it started to make sense.

    I don’t see how a similar strategy wouldn’t also work for game streaming. A CDN connects you to the lowest-latency server, and it’s the job of the CDN to seed data centers in geographically strategic areas specifically to reduce latency.

    Can it work? Absolutely. Will it beat out local gaming? That depends on what your metric is. Never having to buy new hardware again sure would be awfully nice...
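    A minimal sketch of the "connect you to the lowest-latency data center" idea described above. The edge hostnames are hypothetical, and a plain TCP connect is used as a crude latency probe; a real CDN has its own discovery and routing machinery:

    ```python
    import socket
    import time

    # Hypothetical edge endpoints; a real CDN would supply and update this list itself.
    EDGE_SERVERS = [
        ("edge-us-east.example.com", 443),
        ("edge-us-west.example.com", 443),
        ("edge-eu-west.example.com", 443),
    ]

    def probe_ms(host: str, port: int, timeout: float = 1.0) -> float:
        """Measure one TCP connect time as a crude latency estimate."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                pass
        except OSError:
            return float("inf")  # unreachable edges sort last
        return (time.monotonic() - start) * 1000

    def pick_lowest_latency(servers):
        """Return the (host, port) pair with the smallest measured connect time."""
        return min(servers, key=lambda s: probe_ms(*s))

    if __name__ == "__main__":
        host, port = pick_lowest_latency(EDGE_SERVERS)
        print(f"streaming from {host}:{port}")
    ```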