As predicted, PS4 will be ushering in streaming gaming for consoles

Comments

  • nbtscannbtscan Member UncommonPosts: 862

    If I remember correctly, it was mentioned during the Sony conference that they eventually want to bring the PS1 and PS2 libraries to the streaming service, so I think this is where they plan to utilize their cloud gaming in the short term.  PS1 games are CD-sized and PS2 games are DVD-sized, so the data download requirements on those would be very small compared to Blu-ray media.

    The biggest factor right now for cloud gaming is internet bandwidth availability.  The US, Japan and some countries in Europe have access to pretty good internet speeds, but the rest of the world is a bit behind in that aspect.  Either that or they actually have really low data transfer limits.

    I don't think having media-less games will be within the realm of possibility until at LEAST the next generation console after the PS4.

  • SmoeySmoey Member UncommonPosts: 599
    With the 8GB of RAM, it will also be ushering in more MMORPG ports.

    (\ /)
    ( . .)
    c('')('')

  • QuizzicalQuizzical Member LegendaryPosts: 25,351
    Originally posted by Ridelynn

    I think cloud gaming has a future.

    Cloud computing is already here. We've been using it for years, and it just gets more ubiquitous over time. More and more "services" will move to the cloud, the limiting factor being bandwidth availability more than anything. The more bandwidth we have available and accessible to the general public, the more services and functionality we will see out of cloud computing.

    Email moved to the cloud early on - AOL mail, Hotmail, Gmail - the clients live in the cloud, the mail stays in the cloud, we just didn't call it the cloud until recently. More and more "stuff" is moving there - word processors are already there, operating systems are moving there (Onlive Desktop, Windows Remote Desktop, VNC, Google Chromebooks).

    Gaming does fall into that generalization; however, it requires a lot more bandwidth than we currently have. Maybe in Japan/Korea/Eastern Europe, where the typical house gets 50+ Mb/s, they're on the cusp of where it's a practical possibility. In the US and most of the rest of the world, we're still struggling to provide the typical household with enough bandwidth to do video streaming. And before you say "Well, I have 100 Mb/s FiOS and I can stream 18 HD videos at the same time, so can everyone else": most of the world does not have access to FiOS, and most of the geography and more than a quarter of the population of the US doesn't have access to what the FCC defines as broadband, which is only around 760 kb/s.

    The question is: is it more economical for a service provider (and here I mean SaaS, not ISP) to provide the hardware (via a datacenter, hosting "in the cloud"), or to require their clients to provide their own hardware and bundle their service as an application? More and more, companies are deciding that eliminating the hurdle of hardware requirements broadens their audience and simplifies their application design. They know exactly what hardware it will run on: it's in their datacenter, so they just need to keep the front end lightweight and generic. They know when and how you will get your software updated: it's in their datacenter, so they don't have to worry about distributing patches. They can monitor security flaws in real time: the software lives in their datacenter, so they can see breaches and react immediately, with changes pushed to all users at once.

    There are a lot of benefits to SaaS, and I think gaming will get there. Our clients will be thin devices that don't require a lot of hardware power (think of what that will do for battery life); they just need to be rendering engines with interface mechanics (buttons, touchscreens, motion sensors, cameras, microphones, whatever), and everything else is handled in the cloud and transmitted to you. Something as small as Google Glass could be running something as hardware-punishing as Crysis on Ultra without an issue, because all the rendering and computation is done on a more powerful computer someplace else. The Wii U controller does this on a local basis: all the video for the controller screen is rendered on the console and transmitted to the controller via a dedicated Wi-Fi N connection (and even that gets stressed for bandwidth, even at that low 854x480 resolution). The nVidia Shield works this way too. It's starting on local networks; there still needs to be a lot of backbone work before we see it on a global scale, though.

    But that all requires bandwidth. Once the bandwidth gets there, you can bet cloud gaming will already be there. Today's bottleneck isn't CPU speed, or GPU power, it's bandwidth, and it's been bandwidth for the last decade, and will continue to be bandwidth until we have something that is completely transparent, wireless, and much higher capacity than anything we can currently imagine.

    Bandwidth problems are eventually fixable.  It will take a long time to get there, but we probably will.  But that's only the third biggest barrier to cloud gaming being viable for widespread use rather than narrow niches.

    Latency is a bigger problem, and one that is likely to never be fixable.  For the rendering side of cloud gaming, the Nvidia Grid K2 is basically the state of the art.  Nvidia promises that it can offer you display latency of around 160 ms.  Nvidia is known for wildly inflating claims about future products, to the degree that even products that end up being very good don't live up to the promises that Nvidia made ahead of time.  But for the sake of argument, let's suppose that Nvidia really can deliver 160 ms display latency cloud gaming.

    For comparison, let's suppose that you can render a game locally at 30 frames per second and have a 60 Hz monitor.  That's fairly low end for local rendering performance.  30 frames per second means that the time from when the game picks the moment in the game world that a frame will depict to when that frame is finished rendering is 1/30 of a second, or about 33 ms.  Actually, let's round that to 34 ms instead.

    In some game engines, post-processing effects and buffer updates between frames (e.g., loading new textures) mean that one frame is still being worked on after the CPU-side computations for the next frame have started, so let's add 10 ms for post-processing effects.  It takes a really bad game engine to add an extra 10 ms delay here, by the way.  If post-processing is going to take 10 ms, it's better to wait a few milliseconds before starting the next frame.  Adding a delay of a couple of milliseconds or so is desirable so that the video card can constantly be working rather than waiting a bit between frames, but not more than that.  Anyway, an extra 10 ms delay here brings us to 44 ms.

    A 60 Hz monitor means that you copy a new frame from the video card to the monitor 60 times per second.  Let's add 1/60 of a second, or about 17 ms, to our display latency for this time.  That brings us to 61 ms display latency.

    Finally, after the monitor has the new frame, it takes time for the pixels to transition to display it.  Let's suppose that your monitor is terrible at this, and takes a full additional frame time, or another 17 ms.  A lot of monitors will claim latency around 2 ms here, which is sometimes too optimistic, but 5 ms is more realistic, and even 10 ms is very slow.  If we call it 17 ms, we have 78 ms total display latency.

    We just beat state-of-the-art cloud streaming by over 80 ms, under assumptions chosen to put cloud gaming in the best possible light.  A more capable gaming machine would get your display latency closer to 30 or 40 ms, and beat the Nvidia Grid K2 by about 120-130 ms, even under Nvidia's wildly optimistic assumptions.  That's display latency alone, too; add a fair bit more for input latency.
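
    For anyone who wants to check the arithmetic, here's a minimal sketch adding up the numbers assumed above (the component values are this post's assumptions, not measurements):

    ```python
    # Display-latency budget for local rendering, using the assumptions above.
    local_pipeline_ms = {
        "frame render at 30 fps": 34,            # ~1/30 s from game-state snapshot to finished frame
        "post-processing / buffer updates": 10,  # deliberately pessimistic
        "scanout wait on a 60 Hz monitor": 17,   # ~1/60 s
        "pixel response on a slow panel": 17,    # most panels are far quicker
    }

    local_total = sum(local_pipeline_ms.values())
    grid_claim_ms = 160  # Nvidia's claimed Grid K2 display latency, per the above

    print(f"local display latency: {local_total} ms")                   # 78 ms
    print(f"cloud gaming deficit:  {grid_claim_ms - local_total} ms")   # 82 ms
    ```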

    -----

    Some people will say that 80 ms isn't noticeable.  They have no clue what they're talking about.

    Just yesterday, I was working on my game.  I was trying to make the ground hilly, unlike the completely flat ground that you may have seen in previous screenshots.  I had botched my tessellation degree computations, however, and there were cracks in the ground that you could see the sky through.  Naturally, that's bad.  Anyone who has worked with tessellation very much would probably immediately know the culprit here:  an edge that is common to two patches is being tessellated differently in the two patches.  So I tried turning the tessellation degree up and down to check whether that affected the cracks.

    As I wandered around the game world looking for cracks in the ground, I noticed that the controls weren't terribly responsive.  If I pushed a button to move forward, I'd move forward all right, but there was a clear delay, both between when I pressed the button and when I started moving, and between when I released it and when I stopped moving.  And it was clearly a worse delay than I was used to.

    I had done a bunch of work all over the game engine to add hills to the ground, and figured that I must have added a bug somewhere.  Finally I found a typo in the source code that caused some portions to be tessellated far more than I expected and others far less.  The "far more" tessellation was killing my frame rate, dropping it down to about 40 frames per second.  The animation still looked smooth to me, so a poor frame rate had not been my first thought as the culprit.

    That's about 25 ms display latency from how long it takes to compute a frame.  As compared to what I was used to, that's a difference of about 15-20 ms.  And not only did I think that little bit of extra display latency was noticeable, but I thought it was obviously a problem that urgently needed to be fixed.  And I noticed it when I wasn't looking for it, but rather, was intently looking for something else.  Adding even 20 ms to your display latency is very noticeable.

    -----

    If cloud gaming adds 15-20 ms latency in total as compared to rendering a game locally, it will never be accepted as just as good as local rendering.  It will forever be relegated to the low end, for situations when you don't have any good alternative.

    So rather than asking when we can get cloud gaming latency down to levels where it's not a big deal, let's ask when we can get it down to 15-20 ms, which is still a lot.  Cloud gaming adds input delay in sending your keyboard or mouse input to where the game is being rendered.  It adds delay in encoding the completed frame server-side.  It adds delay in decoding it client-side.  It adds delay in sending the completed frame back across the Internet.  It adds delay in copying the completed frame to the GPU on the client side.

    Let's ignore most of those delays for the sake of argument.  The one I want to focus on is the gap in time between when you start sending a frame across the Internet and when you finish sending that frame.  If we assume 60 frames per second and you're sending data continuously, that's 17 ms right there.  That alone is enough to make cloud gaming never a serious competitor to rendering locally if you can do the latter.
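
    Here's a quick sketch of that serialization delay, assuming a hypothetical 20 Mb/s stream that exactly fills a 20 Mb/s link (the particular rate cancels out; what matters is that the link is busy for the whole frame time):

    ```python
    # Serialization delay: how long it takes to push one frame's worth of data
    # onto the wire when the stream saturates the link at 60 fps.
    fps = 60
    link_bits_per_s = 20e6                    # hypothetical 20 Mb/s link and stream rate
    bits_per_frame = link_bits_per_s / fps    # the stream fills the link continuously

    serialization_ms = bits_per_frame / link_bits_per_s * 1000
    print(f"{serialization_ms:.1f} ms per frame")  # ~16.7 ms, i.e. 1/60 s
    ```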

    And that's assuming that we can transmit data over the Internet instantly.  The laws of physics have something to say about that:  you can't transmit data faster than the speed of light.  And light in a fiber optic cable travels noticeably slower than light in a vacuum, at roughly two-thirds of c.
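
    To put a rough number on that physical floor, here's a sketch assuming light in glass fiber travels at about two-thirds of c and a hypothetical data center 1000 km away, ignoring routing, queuing, and encode/decode time entirely:

    ```python
    # Propagation-delay floor set by the speed of light in fiber.
    c_vacuum_km_per_s = 299_792
    c_fiber_km_per_s = c_vacuum_km_per_s * 2 / 3   # ~200,000 km/s in glass fiber

    server_distance_km = 1000                      # hypothetical one-way distance
    round_trip_ms = 2 * server_distance_km / c_fiber_km_per_s * 1000
    print(f"{round_trip_ms:.1f} ms round trip")    # ~10 ms before anything else happens
    ```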

    -----

    So cloud gaming is only viable when you can't render the game locally.  That raises the other problem for cloud gaming:  local rendering capabilities will increase.

    In many places in a game engine, you can do things this way or that way, and have to consider which will perform better.  You can do rendering computations on a GPU, or you can do them elsewhere and pass them to the GPU.  A game engine has to do quite a bit of both, as there are some computations that GPUs just aren't very good at.  That's what you have a CPU for.

    Let's suppose that you have to choose between computing data on the CPU and passing it to the GPU or computing data on the GPU instead.  If you have a choice between passing one 32-bit floating point number from the CPU to the GPU or doing 10 32-bit floating point operations on the GPU to recompute the number, the latter will be faster, and by a huge margin.  If it's passing one floating point number or doing 100 floating point computations, the latter is likely to still be faster, though here you're in the realm where you'd have to at least stop to think about it.

    And that's comparing doing work on the GPU to passing data over a PCI Express bus.  The PCI Express bus is fast, and built for passing data to a video card.  It offers latency on the order of 50 ns, and the older PCI Express 2.0 x16 standard offers 64 Gb/s of bandwidth.  (I'm using bits here rather than bytes because that's what Internet connections typically use.)

    If you have to pass data over the Internet at large, you'll be doing pretty well if you're only about 3 orders of magnitude slower in throughput and 6 orders of magnitude slower in latency.  If you had to choose between passing a single 32-bit floating point value over the Internet and doing 100,000 32-bit floating point computations on the GPU, the latter is likely to be the superior option for performance reasons.  Make it passing one float versus doing 10,000 computations and it's pretty trivial that you want the latter.
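
    Here's a minimal sketch of that trade-off, using a placeholder 1 TFLOPS GPU and the rough Internet figures above (both are illustrative assumptions, not benchmarks):

    ```python
    # Recompute on the GPU vs. fetch one float over the Internet.
    gpu_flops_per_s = 1e12           # placeholder mid-range GPU throughput
    net_bandwidth_bits_s = 64e6      # ~3 orders of magnitude below PCIe 2.0 x16 (64 Gb/s)
    net_latency_s = 50e-3            # ~6 orders of magnitude above PCIe (~50 ns)

    send_one_float_s = net_latency_s + 32 / net_bandwidth_bits_s
    recompute_100k_s = 100_000 / gpu_flops_per_s

    print(f"send one 32-bit float over the Internet: {send_one_float_s * 1e3:.2f} ms")  # ~50 ms
    print(f"redo 100,000 FLOPs on the GPU:           {recompute_100k_s * 1e6:.2f} us")  # ~0.1 us
    ```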

    100,000 computations is a lot.  My point here is that passing data over the Internet is very expensive in a relative sense.  As we get more Internet bandwidth, we'll also get more local rendering capabilities.  Cloud gaming is only viable if local rendering is not--and only remains a serious option until local rendering becomes viable.  Unless you think that the relative cost between GPU power and Internet bandwidth is going to swing very dramatically in the direction of Internet bandwidth being relatively cheaper in the coming years, local rendering will be viable for tablets and ultraportable laptops long before cloud gaming is.  Local rendering might beat cloud gaming to viability in demanding 3D games on cell phones, too.

    That would leave cloud gaming relegated to a handful of narrow niches.  One of those is rendering games where you have plenty of local rendering capability, but can't use it properly because of some hardware incompatibility.  For example, playing PS3 games on a PS4.

  • QuizzicalQuizzical Member LegendaryPosts: 25,351
    Originally posted by GrayGhost79
    @Korrent1991 Sorry, I mean no offense by ignoring Quizzical. He's one of those that's just not open to change and has been wrong on every front in this discussion in the past few years so far. I told him consoles were going to need to look into streaming. He argued that they would not. Then of course Microsoft and Sony both looked into it. Sony purchased a company and Microsoft has looked into a few. I stated they would be used to stream games. He argued that it wouldn't, that they actually wanted these companies for other purposes. Yet here is Sony and Steam both looking to stream games. Sony so far has stated that they are looking to stream older titles and PS3 titles as well as free trials and etc. (This is what Sony has said) but I don't see it stopping there, I mean if you can stream PS3 titles why is it far fetched to consider the possibility to stream PS4 titles in the not so distant future? I have no problem discussing things, but not with someone closed minded and proven wrong numerous times on this specific subject.

    If you had good counter arguments, you wouldn't need to ignore the arguments and attack the poster.

    But if you want to talk about track record, then let's talk about OnLive, the best known cloud gaming service.  It launched in 2010.  At one point, it had a market capitalization of $1.8 billion.

    OnLive started with a subscription fee of $15/month.  But people wouldn't pay that.  So they tried $5/month.  And people wouldn't pay that.  So they tried no subscription fee at all, but you just pay for the games.  And people wouldn't pay that, either.  The problem was that the service was awful, for exactly the reasons that the naysayers predicted:  bad image quality and high latency, even on days when your Internet connection is excellent.

    OnLive ended up being sold for a mere $3 million, or less than 0.2% of what the company was once worth.  You could build a cloud gaming service, but you couldn't make it remotely competitive with local rendering.  Gamers figured that out and stayed away in droves.  Many people tried it, but few hung around for long.

    OnLive was founded way back in 2003.  Some people have thought cloud gaming would be the future of games for a long time.  It's now 2013, and that's a future that hasn't arrived and doesn't look likely to arrive anytime soon outside of narrow niches.

    -----

    Streaming PS4 titles to a PS4 would be astoundingly stupid.  I really hope that's not what you're proposing.  Streaming PS4 titles to a PS3 could make some sense for people who don't mind a very poor game experience if the PS3 has the appropriate video decode capabilities.

  • GrayGhost79GrayGhost79 Member UncommonPosts: 4,775
    Originally posted by Quizzical
    Originally posted by GrayGhost79
    @Korrent1991 Sorry, I mean no offense by ignoring Quizzical. He's one of those that's just not open to change and has been wrong on every front in this discussion in the past few years so far. I told him consoles were going to need to look into streaming. He argued that they would not. Then of course Microsoft and Sony both looked into it. Sony purchased a company and Microsoft has looked into a few. I stated they would be used to stream games. He argued that it wouldn't, that they actually wanted these companies for other purposes. Yet here is Sony and Steam both looking to stream games. Sony so far has stated that they are looking to stream older titles and PS3 titles as well as free trials and etc. (This is what Sony has said) but I don't see it stopping there, I mean if you can stream PS3 titles why is it far fetched to consider the possibility to stream PS4 titles in the not so distant future? I have no problem discussing things, but not with someone closed minded and proven wrong numerous times on this specific subject.

    If you had good counter arguments, you wouldn't need to ignore the arguments and attack the poster.

     

    I didn't attack any poster. You were simply wrong about them ignoring streaming, you were wrong about them not using it to stream games, you were wrong about the Shield, you were wrong about the Razer Edge, etc. I mean, you have a good head on your shoulders when it comes to current tech and upgrades, but discussing future gaming advances or changes simply isn't something I give you much credit for. That's not an insult, please don't take it as one. I was asked why I ignored your posts, and I simply answered.

  • GrayGhost79GrayGhost79 Member UncommonPosts: 4,775

    It's also interesting to note that Microsoft hopes to implement complete cloud streaming of games by 2015. One of the reasons cited was never having to upgrade equipment again. It's part of the 56-page roadmap that Microsoft forced sites to take down, which also included things like Microsoft's SmartGlass, shown not too long ago. Not to mention the fact that Microsoft snatched up OnLive employees laid off during its transition.

    So... it looks like Microsoft is on board with cloud gaming as well.
