
Intel, tilting at windmills again.


Comments

  • Myria Member Uncommon Posts: 699
    Quizzical said:

    Bandwidth isn't the only problem.  Latency is a huge problem, too, and probably more intractable than bandwidth.  It takes time to send your inputs to the remote server that is running the game.  It also takes time to send the completed image back.  It takes time to compress and decompress the image.  It also effectively backs you up by a frame for the time it takes to transmit that frame.  That could easily add an extra 100 ms of latency.
    This is the biggie people tend to ignore when discussing game streaming. And it's an issue technological advancement isn't going to solve, as it's mostly a physics problem.
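    To put rough numbers on that pipeline, here's a minimal back-of-the-envelope sketch; every figure in it is an illustrative assumption, not a measurement of any real service:

```python
# Added latency from remote rendering, stage by stage.
# All numbers are illustrative guesses for a 1080p60 stream.
input_uplink_ms   = 15  # controller input travels to the remote server
server_render_ms  = 16  # one frame rendered at ~60 fps
encode_ms         = 10  # hardware video encode of the finished frame
frame_downlink_ms = 20  # compressed frame transits the network back
decode_display_ms = 10  # client-side decode plus display scan-out

total = (input_uplink_ms + server_render_ms + encode_ms
         + frame_downlink_ms + decode_display_ms)
print(f"added round-trip latency: ~{total} ms")  # ~71 ms, before any jitter
```

    The uplink, encode, downlink, and decode stages simply don't exist when the game is rendered locally.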

    In a country like South Korea, with a population density sixteen times that of the US in a country one fifth the size of California, I could see cramming enough servers with enough redundant lines close enough to the majority of players to minimize the downsides. But in a country the size of the US, with the (lack of) population density thereof? Outside of major cities I don't see it ever being a very pleasant experience compared to local processing.

    And then there's the question of how you create this streaming infrastructure without making it a big-arsed target for every would-be hacker and script kiddie on the net. This is already increasingly an issue; games relying on server-side processing and rendering would be even more vulnerable than your average MMO is today.
  • Quizzical Member Legendary Posts: 25,348
    Torval said:
    Quizzical said:
    But PlayStation streams games now. EA showed off their streaming this E3, as did Microsoft. Seems the industry has already decided to move ahead in this direction regardless of the doubts surrounding it.
    They offer streaming as a way to run games built for older consoles that wouldn't otherwise be able to run at all.  That's basically a last-resort approach: at least it lets you play the game badly rather than not letting you play it at all.  But playing a PS3 game by streaming it to a PS4 is going to be a far inferior experience to playing the same PS3 game on a PS3.

    My argument isn't that there will never be any streaming of games at all.  Especially for purely turn-based games, the latency doesn't kill the game entirely.  My argument is that gamers will never accept it if someone tries to make streaming into the primary way to play a game.  Make streaming into the only way that a game can be played at all and you greatly diminish the market for that particular game.

    It's kind of like saying that if someone today tried to launch a new game console to compete with PS4 Pro and Xbox One X, but it offered less performance than the original Xbox One (non-X), cost $700, and didn't have any notable advantages to compensate, people wouldn't buy it.
    You're misrepresenting the service based on your opinion, not facts.

    PSNow, I've tried it, does stream games to your PC or PS4. It's not an inferior experience regarding the gameplay and latency. If there is any platform disparity it comes in features no longer supported or implemented in the same way.

    The games themselves play and look as good. They save. This is a marketing system by Sony, and most of the issues pertain to how they implement features and access, and to the available library, not to the fact that they're streamed.

    I stopped the service because they discontinued a game I wanted to play (actually all PS1 games) and the overall price was way too expensive for the value I found in it. They've improved the service a lot since I tried it though. It now supports multiplayer, syncing game saves between PCs and PS4 with your PS+.


    Streaming can be accomplished through a lot of paradigms. You're likely correct that streaming services working the way you envision would be clunky, but there are many ways to do things, like I said. PSNow is an example. I actually think Microsoft will do better.
    If I've misunderstood it, then please explain how it works that I've misunderstood.  If you're rendering a game on one system, then streaming it uncompressed to display on another, that can work well if you've got the bandwidth--which you might on a LAN, but probably not on anything else.

    If you're rendering it on one system, compressing it, streaming it to another, decompressing it, and then displaying it, that's still going to take a ton of bandwidth, in addition to adding a lot of latency.  That's a problem of physics, and there's no way around that.  That doesn't mean it doesn't work, any more than running a game on a $100 APU doesn't work.  It works, but let's not pretend that that's anything other than the low end.

    If that bandwidth is going over the Internet, then in addition to being a low end experience, the bandwidth is going to be very expensive, at least if you use it extensively rather than sparingly.  At that point, you've got a high end price tag (for enough bandwidth that your ISP doesn't throttle you somehow) for a low end experience.
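    To make the bandwidth gap concrete, here's a quick sketch for 1080p at 60 frames per second with 24-bit color; the 50 Mbps streaming bitrate is an assumed round number, not a figure from any particular service:

```python
# Raw bandwidth of an uncompressed 1080p60 video signal versus an
# assumed 50 Mbit/s compressed stream.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

raw_bps = width * height * bits_per_pixel * fps
print(f"uncompressed: {raw_bps / 1e9:.2f} Gbit/s")  # ~2.99 Gbit/s

stream_bps = 50e6  # assumed streaming bitrate
print(f"compression needed: {raw_bps / stream_bps:.0f}:1")  # ~60:1
```

    Squeezing out a ~60:1 reduction in real time is exactly where the encode/decode latency and the compression artifacts come from.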

    And if that's not what it's doing, then what is it doing?
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • Quizzical Member Legendary Posts: 25,348
    Quizzical said:
    Torval said:
    Quizzical said:
    But PlayStation streams games now. EA showed off their streaming this E3, as did Microsoft. Seems the industry has already decided to move ahead in this direction regardless of the doubts surrounding it.
    They offer streaming as a way to run games built for older consoles that wouldn't otherwise be able to run at all.  That's basically a last-resort approach: at least it lets you play the game badly rather than not letting you play it at all.  But playing a PS3 game by streaming it to a PS4 is going to be a far inferior experience to playing the same PS3 game on a PS3.

    My argument isn't that there will never be any streaming of games at all.  Especially for purely turn-based games, the latency doesn't kill the game entirely.  My argument is that gamers will never accept it if someone tries to make streaming into the primary way to play a game.  Make streaming into the only way that a game can be played at all and you greatly diminish the market for that particular game.

    It's kind of like saying that if someone today tried to launch a new game console to compete with PS4 Pro and Xbox One X, but it offered less performance than the original Xbox One (non-X), cost $700, and didn't have any notable advantages to compensate, people wouldn't buy it.
    You're misrepresenting the service based on your opinion, not facts.

    PSNow, I've tried it, does stream games to your PC or PS4. It's not an inferior experience regarding the gameplay and latency. If there is any platform disparity it comes in features no longer supported or implemented in the same way.

    The games themselves play and look as good. They save. This is a marketing system by Sony, and most of the issues pertain to how they implement features and access, and to the available library, not to the fact that they're streamed.

    I stopped the service because they discontinued a game I wanted to play (actually all PS1 games) and the overall price was way too expensive for the value I found in it. They've improved the service a lot since I tried it though. It now supports multiplayer, syncing game saves between PCs and PS4 with your PS+.


    Streaming can be accomplished through a lot of paradigms. You're likely correct that streaming services working the way you envision would be clunky, but there are many ways to do things, like I said. PSNow is an example. I actually think Microsoft will do better.
    If I've misunderstood it, then please explain how it works that I've misunderstood.  If you're rendering a game on one system, then streaming it uncompressed to display on another, that can work well if you've got the bandwidth--which you might on a LAN, but probably not on anything else.

    If you're rendering it on one system, compressing it, streaming it to another, decompressing it, and then displaying it, that's still going to take a ton of bandwidth, in addition to adding a lot of latency.  That's a problem of physics, and there's no way around that.  That doesn't mean it doesn't work, any more than running a game on a $100 APU doesn't work.  It works, but let's not pretend that that's anything other than the low end.

    If that bandwidth is going over the Internet, then in addition to being a low end experience, the bandwidth is going to be very expensive, at least if you use it extensively rather than sparingly.  At that point, you've got a high end price tag (for enough bandwidth that your ISP doesn't throttle you somehow) for a low end experience.

    And if that's not what it's doing, then what is it doing?
    So if you believe it will not happen after this next console generation, then why do Ubisoft and others believe it will? Is it that they know something all of us don't? Or is it just wishful thinking on their part?
    People believed that it was going to happen to PCs eight years ago, and there was a ton of money behind making it happen.  At one point, OnLive had a market capitalization of $1.8 billion.  But they weren't able to convince gamers to pay a high price tag for a very low end gaming experience, so two years later, they were sold for $3 million plus assumption of debt.

    Nothing about the physics problems that relegated OnLive to the low end has changed.  If anything, trying to compete that way today would be harder, because now you can get a cheap gaming rig that runs just about anything, so streaming wouldn't save you nearly as much on up-front hardware costs today as it did then.

    That doesn't mean that no one will try it again.  Ubisoft is probably right that someone will make a thin client "game console" that can't do anything other than streaming games that are rendered remotely.  But whoever does that will probably lose whatever money they invest into it, as gamers won't want to use it.

    It's also possible that the next generation of game consoles will be the last for an entirely different reason.  Moore's Law isn't going to last forever, and if technology doesn't allow you to build a new console that is far better than the previous one, why bother building a new console that isn't?  That doesn't mean that no new console will ever launch again, but it could mean that we don't see a clear generational leap.
  • Vrika Member Legendary Posts: 7,888
    edited June 2018
    blueturtle13 said:

    So if you believe it will not happen after this next console generation, then why do Ubisoft and others believe it will? Is it that they know something all of us don't? Or is it just wishful thinking on their part?
    Different people are making different predictions.

    You'll just have to look at VR sales predictions vs. VR sales to see how large the difference between predictions and reality often is.

    Ubisoft seems to be at one extreme end of the prediction spectrum, making a prediction that's not likely to happen. Whereas Quizzical is taking a view at the other end of the spectrum that I think is nearly as extreme.
  • Ridelynn Member Epic Posts: 7,383
    Not big on technical details, but streaming doesn't just mean video streaming in the Netflix sense.

    https://www.theverge.com/2018/3/15/17123452/microsoft-gaming-cloud-xbox-future
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • Quizzical Member Legendary Posts: 25,348
    Ridelynn said:
    Not big on technical details, but streaming doesn't just mean video streaming in the Netflix sense.

    https://www.theverge.com/2018/3/15/17123452/microsoft-gaming-cloud-xbox-future
    My basic claim is that rendering a game remotely and then streaming the completed images over the public Internet to a thin client that displays them for the person playing the game will never be more than a small niche.  Not five years from now.  Not fifty years from now.  The laws of physics will have their say and will not be overturned by technology advances.  Unless some compatibility problem makes it impractical to render the game locally, rendering it locally will nearly always be the superior option.

    That doesn't mean that no type of streaming will ever happen.  Streaming to watch someone else play a game already happens, but that isn't latency sensitive and thus doesn't run into the same problems.  Streaming games over a LAN is a small niche now, but could plausibly become much more widespread.  Having a game stream assets to the client while the game is running, with the client using those assets to render the game locally, could also become much more widespread.
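    A minimal sketch of that last model, with hypothetical asset names and a sleep standing in for the actual download; the point is just that the network feeds the client's memory while the render loop never waits on it:

```python
# "Stream assets, render locally": a background thread fetches assets
# ahead of need; the local render loop never blocks on the network.
import threading, queue, time

pending = queue.Queue()
assets = {}  # name -> asset bytes, filled in as downloads complete

def fetch_worker():
    while (name := pending.get()) is not None:
        time.sleep(0.05)  # stand-in for an actual network download
        assets[name] = b"<decoded asset data>"

threading.Thread(target=fetch_worker, daemon=True).start()

for name in ("zone2_textures", "zone2_models"):
    pending.put(name)  # prefetch before the player reaches the next zone

for frame in range(3):  # the local render loop
    have = ", ".join(assets) or "placeholders only"
    print(f"frame {frame}: rendering locally with: {have}")
    time.sleep(0.05)
pending.put(None)  # shut the worker down
```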
  • Quizzical Member Legendary Posts: 25,348
    Sometimes predictions are wrong in the same direction many times in a row:

    https://en.wikipedia.org/wiki/Itanium#/media/File:Itanium_Sales_Forecasts_edit.png

    If a product is bad, no one has to buy it, no matter how big and powerful the companies pushing it are.  Sometime around 2007, people figured out that Itanium was never going to catch on and started phasing out support.
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • ceratop001 Member Rare Posts: 1,594
    Netflix is also releasing Telltale Games' Minecraft: Story Mode this fall, which can be played while being streamed. It seems many tech companies are trying to really make this happen.
    Don't make me start playing minecraft again lol
  • Quizzical Member Legendary Posts: 25,348
    Quizzical said:
    Ridelynn said:
    Not big on technical details, but streaming doesn't just mean video streaming in the Netflix sense.

    https://www.theverge.com/2018/3/15/17123452/microsoft-gaming-cloud-xbox-future
    My basic claim is that rendering a game remotely and then streaming the completed images over the public Internet to a thin client that displays them for the person playing the game will never be more than a small niche.  Not five years from now.  Not fifty years from now.  The laws of physics will have their say and will not be overturned by technology advances.  Unless some compatibility problem makes it impractical to render the game locally, rendering it locally will nearly always be the superior option.

    That doesn't mean that no type of streaming will ever happen.  Streaming to watch someone else play a game already happens, but that isn't latency sensitive and thus doesn't run into the same problems.  Streaming games over a LAN is a small niche now, but could plausibly become much more widespread.  Having a game stream assets to the client while the game is running, with the client using those assets to render the game locally, could also become much more widespread.
    I'm not sure I buy that "physics will have its say" thing. I mean, the weak link in all this is the fiber optics, and if a solution is found for that, then the ability to achieve a higher rate of transfer will be a nonfactor.  I have a hard time believing that we are going to just settle for a capped-out internet speed at some point. Technology always finds a way. I mean, internet speeds have already increased by, what, 50X in ten years?
    The idea of a fiber optic cable is that you're using light to transmit data.  That's the "optic" part of it.  You're not sending anything faster than the speed of light without some major scientific revolution that overturns a whole lot of fundamental things that we thought we knew about physics.  We've only had one of those in all of history, early in the 20th century, though admittedly the modern notion of science only goes back several centuries or so.

    But the other problem is latency.  Getting ten times as much bandwidth or a hundred times as much bandwidth won't do anything to help your latency.  If you have to take a frame, compress the frame, send it over the Internet, and then decompress the frame at the other end, that all takes time.
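    Even setting the encode/decode steps aside, the propagation delay alone is easy to bound. Light in optical fiber travels at roughly two-thirds of its speed in a vacuum; the distances below are illustrative:

```python
# Round-trip propagation delay through optical fiber, before any
# routing, queuing, encoding, or decoding is added on top.
SPEED_IN_FIBER_KM_PER_S = 200_000  # roughly 2/3 of c

for one_way_km in (100, 1000, 3000):
    round_trip_ms = 2 * one_way_km / SPEED_IN_FIBER_KM_PER_S * 1000
    print(f"server {one_way_km:>4} km away: {round_trip_ms:>4.0f} ms round trip")
# 100 km -> 1 ms, 1000 km -> 10 ms, 3000 km -> 30 ms
```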

    The problem isn't that making game streaming work is impossible.  The problem is that for reasons of physics, rendering a game far away and then streaming it to the gamer is intrinsically much harder to make work well than rendering it much closer to the end user.  Game streaming will improve, but so will rendering the game locally, and there's no real chance that game streaming ever catches up.

    Let's make a simplified model in which we have only two variables:  cost and quality.  Quality includes graphical fidelity, latency, frame rates, and everything else about having a good gaming experience.  Cost includes the cost of hardware and the cost of an Internet connection viable for how you want to use it, pro-rated into a monthly average cost.

    Suppose that at one point in time, your choices are:

    Rendering locally:  cost: $80/month, quality: 5/10
    Streaming:  cost: $100/month, quality: 2/10

    A decade later, your choices are:

    Rendering locally:  cost: $70/month, quality: 7/10
    Streaming:  cost: $90/month, quality: 4/10

    And then another decade after that, your choices are:

    Rendering locally:  cost: $60/month, quality: 9/10
    Streaming:  cost: $70/month, quality: 6/10

    With those made-up numbers, streaming clearly gets better over time.  You get better quality at lower cost.  But rendering games locally also gets better.  And at every point in time, rendering your games locally gets you better quality for less money, and it's not even close.  At no point does streaming ever become a serious option unless rendering the game locally is impractical for one reason or another.
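    Reducing those same made-up numbers to quality per dollar makes the gap easy to see:

```python
# Quality per dollar for the made-up numbers above.
eras = {
    "today":     {"local": (80, 5), "streaming": (100, 2)},
    "+10 years": {"local": (70, 7), "streaming": (90, 4)},
    "+20 years": {"local": (60, 9), "streaming": (70, 6)},
}
for era, options in eras.items():
    for name, (cost, quality) in options.items():
        print(f"{era:>9} {name:>9}: {quality / cost:.3f} quality per $/month")
```

    Both columns improve every decade, but local rendering stays ahead at every point in time.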

    Maybe someday, streaming games at 1080p and 60 frames per second can look as good, with latency as low, as running the game locally does today.  And by the time that happens, your $300 Wal-Mart computer can deliver a 4K resolution at 120 frames per second, while the higher end gaming rigs can do VR at 8K per eye and 240 frames per second, and everyone says that the problems with VR from the 2010s have been solved.  Would that make game streaming into an interesting option?  I say no.
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • [Deleted User] Posts: 12,263
    edited June 2018
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • Ridelynn Member Epic Posts: 7,383
    I am sure they are not breaking any laws of physics. Just because you haven't figured out the specifics hardly makes it impossible. Just clever.

    No one thought streaming movies over the internet would work either - too much bandwidth, too many CPU cycles to compress, too many compression artifacts. Think about the AOL days, when we had thumbnails of GIFs so they could load in a timely fashion. Then we figured out better methods of compression, bandwidth got better, and CPU cycles got cheaper. 20 years later, Netflix proves not only that you can do it well, but that you can make big business out of it. It took some time, and there was no magic bullet, but a confluence of a lot of factors.

    Given that video is more or less "working", I don't think interactive gaming is very far off. I also don't think it will be simply taking the video output of a remote renderer and stuffing it into the internet tubes... I know that's basically what has been tried in the past, and is similar to what is happening now, but I think people are much more clever than that and will figure out even better optimizations to reduce latency/bandwidth.
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • Quizzical Member Legendary Posts: 25,348
    But that is assuming we are still 'plugging in' to get our internet. All signs point to a wireless existence. Also, I understand what the 'optic' part means, but there are still limitations in the fiber optic cable that have nothing to do with the actual transfer itself.
    I mean Kyle Fujita of EA is at E3 now streaming Titanfall 2 using EA's new cloud gaming streaming service with nothing other than a TV, a controller and an internet connection.
    You have that exactly backwards.  Wireless is becoming more wired than ever.  That's how they can improve bandwidth.

    If you're using electromagnetic radiation to transmit data, then the rate at which you can transmit data is limited by the amount of spectrum you can use and your signal to noise ratio.  That's just the way the universe works.

    So how can successive generations of wireless communications improve bandwidth?  By having more, smaller cells.  For a given amount of spectrum, you can get some amount of bandwidth per base station.  Have more base stations that cover smaller areas and you get more cumulative bandwidth in the same total area.  And what happens to the data after it reaches those base stations?  It's sent off into the network via a wired connection.
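    That spectrum-and-noise limit is the Shannon-Hartley theorem. A quick sketch with illustrative numbers shows why adding spectrum (or reusing it across smaller cells) beats boosting signal power:

```python
# Shannon-Hartley channel capacity: linear in spectrum, but only
# logarithmic in signal-to-noise ratio. Numbers are illustrative.
from math import log2

def capacity_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    return bandwidth_mhz * log2(1 + 10 ** (snr_db / 10))

print(f"{capacity_mbps(20, 20):.0f} Mbit/s")   # 20 MHz at 20 dB SNR: ~133
print(f"{capacity_mbps(20, 30):.0f} Mbit/s")   # 10x the signal power: ~199
print(f"{capacity_mbps(100, 20):.0f} Mbit/s")  # 5x the spectrum: ~666
```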

    On another topic, making game streaming look fine to someone watching but not playing the game is easy.  Adding a steady 200 ms of latency doesn't even matter if you're only watching but not playing.  You don't feel the latency when you're not playing it yourself.
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • Quizzical Member Legendary Posts: 25,348
    Ridelynn said:
    I am sure they are not breaking any laws of physics. Just because you haven't figured out the specifics hardly makes it impossible. Just clever.

    No one thought streaming movies over the internet would work either - too much bandwidth, too many CPU cycles to compress, too many compression artifacts. Think about the AOL days, when we had thumbnails of GIFs so they could load in a timely fashion. Then we figured out better methods of compression, bandwidth got better, and CPU cycles got cheaper. 20 years later, Netflix proves not only that you can do it well, but that you can make big business out of it. It took some time, and there was no magic bullet, but a confluence of a lot of factors.

    Given that video is more or less "working", I don't think interactive gaming is very far off. I also don't think it will be simply taking the video output of a remote renderer and stuffing it into the internet tubes... I know that's basically what has been tried in the past, and is similar to what is happening now, but I think people are much more clever than that and will figure out even better optimizations to reduce latency/bandwidth.
    Agreed. I mean, look at GeForce Now from Nvidia:
    https://www.nvidia.com/en-us/geforce/products/geforce-now/mac-pc/

    With a 50 Mbps internet speed you can stream and play Tomb Raider at 1080p at 60 frames a second. What will this service look like in 10 to 15 years?
    At a cost of how much latency and how many compression artifacts?  Compressing the amount of data by nearly 99% isn't going to be free.  Saying that they can do it at all does nothing to dispute my claims that it will be inferior to just rendering the game locally.

    Besides, stream the game for an hour and you've used over 20 GB of bandwidth--more than enough to just download most (not all) games and play them locally.  Do that every day and your ISP will probably dislike you and may try to throttle your connection somehow.
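    The arithmetic behind that figure, assuming the 50 Mbps bitrate quoted above runs continuously:

```python
# Data consumed by one hour of streaming at a constant 50 Mbit/s.
mbps, seconds_per_hour = 50, 3600
gigabytes = mbps * seconds_per_hour / 8 / 1000  # Mbit -> MByte -> GByte
print(f"{gigabytes:.1f} GB per hour of play")   # 22.5 GB
```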
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • Quizzical Member Legendary Posts: 25,348
    Quizzical said:
    But that is assuming we are still 'plugging in' to get our internet. All signs point to a wireless existence. Also, I understand what the 'optic' part means, but there are still limitations in the fiber optic cable that have nothing to do with the actual transfer itself.
    I mean Kyle Fujita of EA is at E3 now streaming Titanfall 2 using EA's new cloud gaming streaming service with nothing other than a TV, a controller and an internet connection.
    You have that exactly backwards.  Wireless is becoming more wired than ever.  That's how they can improve bandwidth.

    If you're using electromagnetic radiation to transmit data, then the rate at which you can transmit data is limited by the amount of spectrum you can use and your signal to noise ratio.  That's just the way the universe works.

    So how can successive generations of wireless communications improve bandwidth?  By having more, smaller cells.  For a given amount of spectrum, you can get some amount of bandwidth per base station.  Have more base stations that cover smaller areas and you get more cumulative bandwidth in the same total area.  And what happens to the data after it reaches those base stations?  It's sent off into the network via a wired connection.

    On another topic, making game streaming look fine to someone watching but not playing the game is easy.  Adding a steady 200 ms of latency doesn't even matter if you're only watching but not playing.  You don't feel the latency when you're not playing it yourself.
    So I guess EA, Ubisoft, Microsoft, Sony and Nvidia are just throwing away millions of dollars and resources? For a service that already is on the market and working when it shouldn't? 
    Let me turn this question around.  We've seen this movie before:  lots of big companies throwing a bunch of money at game streaming as an alternative to rendering the game locally.  They had commercially available products that people could buy and use.  AMD and Nvidia both made custom hardware for them.  They supported the latest games, not just a compatibility mode for things that you couldn't run locally.

    And then gamers stayed away in droves because it cost way too much money for a very low end experience, and the company went bankrupt.  Basically, they tried to pick a fight with the laws of physics, and physics won.  The laws of physics haven't changed in the last eight years.  What makes you think that trying the same thing again will lead to a different result?  Other than that a diversified product line means that losing all the money invested isn't likely to lead to bankruptcy this time.

    As a compatibility mode for games that you can't run locally, streaming makes some sense.  But it's the backup plan, not the ideal approach.
  • Quizzical Member Legendary Posts: 25,348
    Quizzical said:
    Ridelynn said:
    I am sure they are not breaking any laws of physics. Just because you haven't figured out the specifics hardly makes it impossible. Just clever.

    No one thought streaming movies over the internet would work either - too much bandwidth, too many CPU cycles to compress, too many compression artifacts. Think about the AOL days, when we had thumbnails of GIFs so they could load in a timely fashion. Then we figured out better methods of compression, bandwidth got better, and CPU cycles got cheaper. 20 years later, Netflix proves not only that you can do it well, but that you can make big business out of it. It took some time, and there was no magic bullet, but a confluence of a lot of factors.

    Given that video is more or less "working", I don't think interactive gaming is very far off. I also don't think it will be simply taking the video output of a remote renderer and stuffing it into the internet tubes... I know that's basically what has been tried in the past, and is similar to what is happening now, but I think people are much more clever than that and will figure out even better optimizations to reduce latency/bandwidth.
    Agreed. I mean, look at GeForce Now from Nvidia:
    https://www.nvidia.com/en-us/geforce/products/geforce-now/mac-pc/

    With a 50 Mbps internet speed you can stream and play Tomb Raider at 1080p at 60 frames a second. What will this service look like in 10 to 15 years?
    At a cost of how much latency and how many compression artifacts?  Compressing the amount of data by nearly 99% isn't going to be free.  Saying that they can do it at all does nothing to dispute my claims that it will be inferior to just rendering the game locally.

    Besides, stream the game for an hour and you've used over 20 GB of bandwidth--more than enough to just download most (not all) games and play them locally.  Do that every day and your ISP will probably dislike you and may try to throttle your connection somehow.
    CNET says it is like playing on a 1070 even though her game was running on a 1080. Seems a fair trade-off considering she was playing it on a $200 laptop.
    Besides, this is an 8-to-15-year projection, not now. I think not even considering this a possibility is taking the Blockbuster stance with regard to Netflix. ;) We shall see how it plays out.
    I don't know what the demo consisted of.  For a single demonstration, you can do things that are far too expensive to scale out to millions of consumers.  See Intel's recent 28-core CPU clocked at 5 GHz, for example, which they didn't initially even say was overclocked, much less that it relied on phase-change cooling.

    If the remote server is close enough, and you've got enough bandwidth connecting it to the client, streaming games can seem like it works well.  But the cost of making it work well greatly exceeds the cost of rendering the game locally.  Scale it down to something that you can offer at a more affordable price and the experience deteriorates, too.

    Even so, it's fairly likely that if you had let someone play a game on the streaming demo, and had an actual system with a GTX 1070 and otherwise identical hardware (including the monitor) right next to it, and could go back and forth playing the same game at the same settings on both, there would be a large and obvious difference between the two.

    Even a latency difference of 40 ms is a night and day difference if you can readily compare with and without the latency.  This was something that I discovered by accident when programming my game, as a bug added that amount of latency that hadn't previously been there, and it was immediately obvious that something was broken.
  • [Deleted User] Posts: 12,263
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • Quizzical Member Legendary Posts: 25,348
    Quizzical said:
    Quizzical said:
    But that is assuming we are still 'plugging in' to get our internet. All signs point to a wireless existence. Also, I understand what the 'optic' part means, but there are still limitations in the fiber optic cable that have nothing to do with the actual transfer itself.
    I mean Kyle Fujita of EA is at E3 now streaming Titanfall 2 using EA's new cloud gaming streaming service with nothing other than a TV, a controller and an internet connection.
    You have that exactly backwards.  Wireless is becoming more wired than ever.  That's how they can improve bandwidth.

    If you're using electromagnetic radiation to transmit data, then the rate at which you can transmit data is limited by the amount of spectrum you can use and your signal to noise ratio.  That's just the way the universe works.

    So how can successive generations of wireless communications improve bandwidth?  By having more, smaller cells.  For a given amount of spectrum, you can get some amount of bandwidth per base station.  Have more base stations that cover smaller areas and you get more cumulative bandwidth in the same total area.  And what happens to the data after it reaches those base stations?  It's sent off into the network via a wired connection.

    On another topic, making game streaming look fine to someone watching but not playing the game is easy.  Adding a steady 200 ms of latency doesn't even matter if you're only watching but not playing.  You don't feel the latency when you're not playing it yourself.
    So I guess EA, Ubisoft, Microsoft, Sony and Nvidia are just throwing away millions of dollars and resources? For a service that already is on the market and working when it shouldn't? 
    Let me turn this question around.  We've seen this movie before:  lots of big companies throwing a bunch of money at game streaming as an alternative to rendering the game locally.  They had commercially available products that people could buy and use.  AMD and Nvidia both made custom hardware for them.  They supported the latest games, not just a compatibility mode for things that you couldn't run locally.

    And then gamers stayed away in droves because it cost way too much money for a very low end experience, and the company went bankrupt.  Basically, they tried to pick a fight with the laws of physics, and physics won.  The laws of physics haven't changed in the last eight years.  What makes you think that trying the same thing again will lead to a different result?  Other than that a diversified product line means that losing all the money invested isn't likely to lead to bankruptcy this time.

    As a compatibility mode for games that you can't run locally, streaming makes some sense.  But it's the backup plan, not the ideal approach.
    That was then. This is now and in a decade more ;)

    And it may not be the best approach right now, but it may be in the future, which was the entire point. For the more casual gamer, who makes up the vast majority of the gaming population, this could be an ideal situation.
    Do explain:  what makes paying more for an inferior product "ideal" from the customer's perspective?