Gaming on a thin client

Quizzical Member Legendary Posts: 25,355
Let's be clear what this thread is about.  This is about game streaming in the sense of OnLive or Google Stadia.  It has nothing to do with the physical thickness of any device involved.  There have been a number of threads about this over the years, and I've replied to a lot of comments, but hadn't created a thread to lay out the whole situation systematically until now.

The idea of a thin client is that you don't have the heavy processing power in the device that you're directly using.  Rather, you have to log in to some remote server that does the heavy lifting, and just sends you the result to display on your screen.  In a sense, thin clients are the descendants of dumb terminals for shared mainframes from an era when it wasn't practical to give everyone his own computer.

As applied to gaming, the real question is where rendering the game is done.  If your local machine does most of the computational work to generate an image and display it to you, that's a thick client, not a thin client.  If the game is rendered elsewhere and sent to you, and all your device has to do is decompress it and display it, that's a thin client.  Your cell phone is not a thin client, for example.  A device that is plenty capable of rendering a game on its own can be used as a thin client.  If a device is only going to be used as a thin client, it's also possible to use a small, cheap device that isn't capable of doing much else.

I don't use the terminology "game streaming" here, as that term is also used in the sense of Twitch, which is totally different.  Besides, for those who have had the misfortune of having to use a thin client in a corporate context, "thin client" works as an epithet, and it absolutely applies here.

The way that games are normally rendered is that the video card does various computations to generate frames, then sends those frames to a monitor to display.  There is a wired connection from the video card to the monitor, dedicated solely to moving the rendered frames from the former to the latter.  It's also a fairly high bandwidth connection, as it takes several Gb/sec to transfer 1920x1080 images at 60 Hz, for example.  That's how it works in everything from a gaming desktop to a laptop to a cell phone, though smaller devices tend to have an integrated GPU rather than a discrete card.
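
To put a number on "several Gb/sec", here's the back-of-the-envelope arithmetic for uncompressed 1920x1080 at 60 Hz, assuming 24-bit color (real cable signaling adds overhead on top of this):

```python
# Back-of-the-envelope bandwidth for uncompressed 1920x1080 @ 60 Hz.
# Assumes 24-bit color (3 bytes per pixel); HDMI/DisplayPort signaling
# overhead would push the raw link rate higher still.
width, height = 1920, 1080
bytes_per_pixel = 3          # 8 bits each for R, G, B
frames_per_second = 60

bytes_per_frame = width * height * bytes_per_pixel
bits_per_second = bytes_per_frame * 8 * frames_per_second

print(f"{bytes_per_frame / 1e6:.1f} MB per frame")        # ~6.2 MB
print(f"{bits_per_second / 1e9:.2f} Gb/sec uncompressed") # ~2.99 Gb/sec
```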

With the thin client approach, the game is rendered on a remote server.  It has to transmit the images to you before your device can display them on the screen.  Unlike local rendering, there generally isn't a dedicated connection.  There also tends to be massively less bandwidth available.  That means that transmitting the full, uncompressed images tends to be impractical.  You could send uncompressed images over a LAN with 10 Gbps Ethernet, but that is sadly still uncommon for consumer use.  And while it is possible to have a game running on your gaming desktop and stream it to your phone, the use case that gets a lot more attention is having the game running on a remote server that you connect to over the public Internet.

This has a variety of advantages and disadvantages.  The biggest disadvantage will be thoroughly familiar to anyone who has spent a lot of time using a thin client for non-gaming uses:  it doesn't work very well.  It's not that it doesn't work at all.  It's just really clunky in day-to-day use, and prone to hiccups, for a variety of reasons, that simply cannot happen when rendering the game locally.

One of the problems is bandwidth.  Even if you have a 1 Gbps fiber-to-the-home connection, your ISP isn't likely to let you actually use the full 1 Gbps for a few hours every day.  Even if they did, that's not enough to transmit uncompressed data.  Rather, it's going to take an awful lot of compression.

One approach to this is to compress each image on the remote server and send it, then decompress it locally.  If you do this with each frame in isolation, it's not going to work very well.  Good compression and decompression impose a lot of processing load, and getting the size down to something manageable would take very lossy compression that looks terrible.  So that's not what they actually do.

If you watch a video, such as on YouTube or Netflix, they compress across time.  Consecutive frames tend to be very similar to each other.  If you have the entire video in advance, you can look at what doesn't change from one frame to the next and make compressing a hundred consecutive frames that are all nearly identical not take that much more space than just one frame.

That's not possible for gaming, however.  They can't look at what they need to send you far in advance and compress it.  They have to send what they have now, even though they don't know what's going to come soon.  You can get better image quality with more bandwidth, but without the ability to have the entire video in advance to readily compress across time, the image quality at a given level of bandwidth is going to be far inferior to what you may expect from watching Internet videos.

That doesn't mean that they can't compress across time at all, however.  They can do a delta compression of sorts, where they tell you how one frame changed from the previous one.  This can look pretty good so long as not much changes from one frame to the next.  This would be fine for a lot of simple things, such as visual novels where the rendering is simple enough not to meaningfully tax your phone.
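
As a toy illustration of the idea--and I stress that real codecs like H.264 are vastly more sophisticated than this--you could send only the pixels that changed noticeably since the previous frame:

```python
import numpy as np

def delta_encode(prev_frame: np.ndarray, cur_frame: np.ndarray,
                 threshold: int = 4):
    """Toy inter-frame compression: send only pixels that changed.

    Frames are assumed to be HxWx3 uint8 arrays.  Returns the
    coordinates and new values of the pixels that changed noticeably.
    """
    diff = np.abs(cur_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.any(diff > threshold, axis=-1)   # per-pixel mask
    ys, xs = np.nonzero(changed)
    return ys, xs, cur_frame[ys, xs]

def delta_decode(prev_frame, ys, xs, values):
    """Apply the delta on the client to reconstruct the current frame."""
    frame = prev_frame.copy()
    frame[ys, xs] = values
    return frame

# Tiny usage example:
prev = np.zeros((4, 4, 3), dtype=np.uint8)
cur = prev.copy()
cur[1, 2] = [255, 0, 0]                           # one pixel changed
ys, xs, vals = delta_encode(prev, cur)
assert np.array_equal(delta_decode(prev, ys, xs, vals), cur)
```

When consecutive frames are nearly identical, the list of changed pixels is tiny; the moment most of the screen changes, the "delta" is as big as a full frame.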

The problem is that this falls apart when the image changes radically and abruptly.  Moving from a loading screen to displaying a game will do that, for example, as you move to an image completely independent of the previous one.  But that will just look bad momentarily before it's able to catch up over the course of many frames.  And making loading screens effectively last an extra fraction of a second isn't a big deal.  Far more problematic is that rotating the camera rapidly will cause the same sort of problem.  That's likely to be done precisely when you need to see what is off in some other direction immediately.

Another problem is latency.  Compressing an image, transmitting it, and decompressing it intrinsically takes more time than not doing that.  Additionally, sending input to the remote server also takes time.  This adds delay that makes controls feel sluggish, as it takes some extra tens of milliseconds between when you tell the game to do something and when it registers.
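
The numbers below are purely illustrative assumptions, not measurements, but they show how the extra stages stack up:

```python
# Illustrative round-trip latency budget for thin client gaming.
# Every number here is a made-up-but-plausible assumption.
stages_ms = {
    "input sent to remote server":  15,
    "game simulation + rendering":  17,   # ~one frame at 60 Hz
    "encode (compress) frame":       5,
    "frame sent back to client":    15,
    "decode (decompress) frame":     3,
    "display scanout":               8,
}
total = sum(stages_ms.values())
print(f"total input-to-photon latency: ~{total} ms")  # ~63 ms

# Local rendering skips the two network hops and the encode/decode
# steps entirely, saving roughly 38 ms of the total in this sketch.
```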

Some online games already impose something like this latency, by making it so that when you tell your character to do something, it doesn't actually do it until the server finds out about it and verifies that it's legal.  That's less latency than streaming adds, as it skips the compression and decompression steps.  But it's enough to be annoying--so much so that games commonly have your character immediately do whatever you said, then only override it (by rubber-banding) if the server later tells you that it was invalid.
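
A minimal sketch of that pattern, with hypothetical names and the simulation stripped down to bare movement:

```python
# Minimal sketch of client-side prediction with server reconciliation
# ("rubber-banding").  All class and method names here are hypothetical.
class PredictingClient:
    def __init__(self):
        self.position = (0.0, 0.0)
        self.pending_moves = []   # moves sent but not yet confirmed

    def move(self, seq: int, dx: float, dy: float):
        # Apply the move immediately so controls feel responsive,
        # without waiting for the server's verdict.
        x, y = self.position
        self.position = (x + dx, y + dy)
        self.pending_moves.append((seq, dx, dy))
        # ... also send (seq, dx, dy) to the server here ...

    def on_server_state(self, confirmed_seq: int, server_pos):
        # The server's authoritative position after applying moves up
        # to confirmed_seq.  Drop confirmed moves, snap to the server's
        # position (the rubber-band), then replay the unconfirmed rest.
        self.pending_moves = [m for m in self.pending_moves
                              if m[0] > confirmed_seq]
        x, y = server_pos
        for _, dx, dy in self.pending_moves:
            x, y = x + dx, y + dy
        self.position = (x, y)

# Usage: two moves, then the server nudges move 1 back slightly.
c = PredictingClient()
c.move(1, 1.0, 0.0)
c.move(2, 1.0, 0.0)
c.on_server_state(1, (0.9, 0.0))
print(c.position)   # (1.9, 0.0) after replaying move 2
```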

Comments

  • Quizzical Member Legendary Posts: 25,355
    It would be possible to reduce how apparent this latency is by having the client timestamp when you gave an input, then have the server accept it and warp your character slightly as if you had moved sooner.  That will make movement a little jumpy, though it can be smoothed out considerably by making characters accelerate slowly.  It will still feel a lot smoother than just making everything systematically laggy.  This requires modifications on a game by game basis, however, so it's not something that can automatically be applied to arbitrary games that a data center wants to host.

    Or at least, that's how it would play out in single-player games.  In a multiplayer game, it's a cheating scandal waiting to happen.  The fundamental rule of online game security is not to trust the client.  The client is in the hands of the enemy.  If you tell the server to trust whatever time the client says an input was recorded, then it's just a matter of time before someone modifies the client to backdate the timestamps and move sooner.  Make all your movements record as 50 ms sooner and it's nearly as good as having 50 ms better reflexes.  So of course I expect that we'll see exactly that sort of cheating scandal.
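
    If a server were going to accept client timestamps at all, the least it could do is refuse to rewind further than the network plausibly allows.  A sketch of that sanity check, with made-up names and bounds, and assuming the timestamp has already been converted to the server's clock:

    ```python
    import time

    MAX_REWIND_SECONDS = 0.100   # hard cap: never rewind more than 100 ms

    def plausible_input_time(client_timestamp: float,
                             measured_half_rtt: float) -> float:
        """Clamp a client-claimed input time to what's physically plausible.

        Assumes client_timestamp has already been converted into the
        server's clock domain.  A backdated timestamp gets clamped rather
        than trusted, which caps the benefit of the cheat described above.
        """
        now = time.monotonic()
        # The input can't honestly predate its own network trip by more
        # than one hop, and we refuse to rewind past a hard limit anyway.
        earliest = now - min(measured_half_rtt, MAX_REWIND_SECONDS)
        return max(min(client_timestamp, now), earliest)
    ```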

    But that leads to the security advantages of using thin clients.  For one, if you're playing a game on a remote server that you don't control, what you can do to the server is very limited.  That completely blocks all sorts of cheating methods where people run cheat programs or whatever on their local computer.  Anything that relies on reading or modifying memory is dead in the water.  Same with anything that relies on modifying local files.  This won't stop simple macros, but it will make a lot of forms of cheating impossible.

    Another advantage is that it's a perfect form of DRM.  Or at least, that's an advantage in the eyes of game publishers, even if you don't see it as such.  Still, you're not going to run into Denuvo problems when gaming on a thin client, as there wouldn't be anything for Denuvo to do.  People can't pirate game files that they never get access to.

    Something that is a real advantage to gamers is that you can completely skip the download times for games.  The remote data center already has the game installed, and just has to launch it and let you play.  No more waiting an hour to play.  Or to redownload a game that you deleted to make room on your SSD.  (You do run your games off of an SSD, right?)  Or even waiting five minutes to download the latest patch to a game that you just played yesterday.

    Another advantage for people who commonly run into PEBCAK problems is that a lot less can go wrong on a thin client.  One of the major reasons for their use in the corporate world is that they're much easier to administer.  If you have an idiot user who manages to break his computer every week, that's a major pain to fix with a thick client.  With a thin client, you just wipe it and restore it to the base image, and he's ready to go.  If you manage to break your computer by constantly doing dumb stuff ("but I had to disable my anti-virus to get this screen saver to work"), then a true thin client that you can't break as easily and is easier to repair if you do would be great for you.

    Some people expect cost to be a big advantage of the thin client approach, but that's far more of a mixed bag.  It really depends on how much you play games.  If you play for an hour per year, then yes, going the thin client route rather than buying a beefy gaming desktop will save you a lot of money.  If you play for a few hours per day, then no, it would be cheaper to buy or build your own computer and render games locally.

    People who expect the thin client approach to be cheaper tend to think only of the cost of buying the gaming computer, and either ignore or underestimate the other costs.  But there are two other big costs.  One is bandwidth.  It's decently likely that you spend over $1000 per year on bandwidth.  Between a cell phone plan that offers a lot of data and a good home Internet connection, you could easily spend several times that.  That's enough to pay for a decent gaming computer pretty quickly.  The actual gaming desktop is likely only a small fraction of your expense.
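
    To make that concrete, with prices that are assumptions rather than quotes:

    ```python
    # Illustrative annual connectivity spend (assumed prices, not quotes).
    home_internet_per_month = 60    # a good home connection
    cell_plan_per_month     = 70    # a plan with a lot of data

    annual = 12 * (home_internet_per_month + cell_plan_per_month)
    print(f"${annual:,} per year on bandwidth")   # $1,560 in this sketch

    # A couple of years of that exceeds the cost of a decent gaming
    # desktop, before streaming pushes usage (and the bill) up further.
    ```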

    If everyone suddenly used several times as much bandwidth as before to play games by streaming the video from some remote data center, you're not going to maintain the same quality of connection without any change to prices.  Expect to spend a lot more for bandwidth if everyone uses massively more of it.  That price difference could easily be a lot more than you spend on gaming hardware.

    The other problem is that hardware in "the cloud" is still physical hardware that costs real money.  And the cost for a given level of performance in a data center can easily be several times what it costs to get the same level of performance in consumer hardware.  People sometimes complain that a GeForce RTX 2080 Ti costs $1100.  But the professional version of the card, the Quadro RTX 8000, costs $5500, or five times that.  Even that is still a workstation card, not a server card.  The nearest server equivalent is a Tesla V100 that goes for $9000.  If you want to fill racks of servers with video cards for rendering games, those cards cost a lot more than the consumer equivalents.

    Even if you only play games for two hours per day, that doesn't mean that twelve people will share one of those server cards.  You probably play games at similar times to the other people who would subscribe to the service.  If 1/3 of the subscribers want to play games at peak times, then you only effectively share your hardware with two other people.  The only way that that works for huge savings is if you get a gym-like situation where a ton of subscribers basically never use the service.  Gyms can afford to lose money on their regulars because the many people who sign up so that they can feel like they're doing something to get in shape and then never show up are pure profit for the gym.  Cloud gaming seems unlikely to get similar effects.
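
    The arithmetic is simple, but worth spelling out, with the utilization numbers assumed:

    ```python
    # How many subscribers can actually share one server GPU?
    # It's governed by peak concurrency, not average hours played.
    hours_played_per_day = 2
    average_utilization  = hours_played_per_day / 24      # 1/12

    peak_fraction_playing = 1 / 3     # assumed share online at peak

    naive_sharing  = 1 / average_utilization              # 12 per card
    actual_sharing = 1 / peak_fraction_playing            # 3 per card

    print(f"naive: {naive_sharing:.0f} subscribers per card")
    print(f"sized for peak: {actual_sharing:.0f} subscribers per card")
    ```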

    Data centers also commonly have to pay a lot of money for cooling.  You probably have your desktop or laptop or whatever you're reading this on set to just spray hot air off into the room.  That works fine when you're only cranking out 300 W even under heavy loads.  It does not work fine when you've got racks full of servers in a data center cranking out over a megawatt of heat in total.  Data centers will tend to get electricity a lot cheaper than you do, but they can easily pay a lot more for cooling than they do to run the servers in the first place.

    What the data centers could do, and probably will do to save money, is share the hardware: even if your game is being rendered on a high end video card, other customers have their games rendered on the same card at the same time.  Even if your game is being rendered on a Quadro RTX 8000, only 1/4 of the card is reserved for your use, so the level of performance that you get is basically that of a GeForce GTX 1650 that you could have bought yourself for a lot less than $1100.

  • Quizzical Member Legendary Posts: 25,355
    Even if you're not directly paying for the data center hardware that renders the games, the companies that are will charge appropriate prices to make a profit.  You'll end up paying for it one way or another, and an avid gamer will likely pay more to go the thin client gaming route than it would cost to buy a nice gaming desktop.  That doesn't necessarily mean that people won't do it.  Rent-to-own furniture stores and payday loan outlets have a lot of customers, after all.

    One might hope that not having to upgrade your hardware would be an advantage.  To that I say, yes and no.  If you're mainly trying to avoid the hassle of physically replacing a box, then yes, the thin client approach will mean a lot less of that.  If the cost is your concern, then no: data centers have to upgrade hardware, too, and you'll pay for that, even if indirectly.  And don't get the idea that data centers will always have the latest, state of the art hardware for you to play on.  Replacing everything every generation would get very expensive, very fast, for the same reasons it would be expensive for you to do so with your own hardware.  The latest hardware may be available to those who pay a premium price, while those who don't get routed to older generation hardware instead; either way, you're going to have to pay to get it.

    Another real advantage is improved compatibility.  Your computer, whatever it is, can't directly run games programmed for Windows 10 and Android and iOS and PS3 and PS4 and Xbox One X and Switch.  But a single computer acting as a thin client could play games from a variety of different remote servers running whatever is appropriate and streaming it to you.  If a single platform has all or even most of the games you want to play, you'll still be better off buying whatever computer or console that is to play most of the games that you want.  But occasionally using it as a thin client to play games that are exclusive to other platforms could allow you to play games that you otherwise couldn't.  It could even give you a pretty good gaming experience in slower paced games if the only problem is compatibility, not how powerful the hardware is.  So if what you really want is to be able to play console exclusive, region locked visual novels on Apple's shiny, new cheese grater, this would at least make it technically possible.

    That doesn't necessarily mean that it will actually happen.  In this post, I'm focusing mainly on what is technically feasible, and to some degree on what is economically viable.  I'm ignoring marketing considerations of companies wanting to restrict who can play their games in various ways, even though that will surely happen.

    One final drawback of gaming on a thin client is reliability, or rather, the lack thereof.  When you have a dedicated monitor cable from your video card to your monitor that is carrying communications between those two devices and not doing anything else, the data that the video card sends tends to get there, and in a consistent, small amount of time, even at high bandwidth.  That's not a very good description of the Internet.  Some packets get dropped, or take an extra long time to arrive.  What happens if a server tries to tell you how the latest frame differs from its predecessor, but the information for the predecessor hasn't arrived yet and might not be coming at all?  There are a variety of imperfect ways to mitigate this, but whatever you do, something bad is definitely going to happen.
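
    Here's the dilemma sketched in code, with hypothetical names for the decoder and packet interfaces; every branch is one of those imperfect mitigations:

    ```python
    # Sketch of a client-side decoder facing a delta frame whose
    # reference frame never arrived.  All names here are hypothetical.
    def handle_frame(decoder, packet):
        if packet.is_keyframe:
            return decoder.decode_keyframe(packet)   # self-contained, safe

        if decoder.has_frame(packet.reference_id):
            return decoder.decode_delta(packet)      # normal path

        # Reference is missing: pick your poison.
        # 1) Apply the delta to the wrong reference -> visible corruption
        #    that smears until the next keyframe arrives.
        # 2) Freeze on the last good frame and ask the server for a fresh
        #    keyframe -> a stutter, plus a round trip of extra delay.
        decoder.request_keyframe()
        return decoder.last_good_frame()
    ```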

    I've commonly been skeptical of claims that game streaming is going to take over the world.  But does that mean that Google Stadia and any other would-be competitors are doomed from the outset?  Far from it.  There are several ways that thin client style gaming could offer real value.

    First and perhaps most important is free trials.  Suppose that a company makes a demo that lets you try the first ten minutes worth of a game.  They could make that completely free to play on some remote server, and then easy to buy the full game if you like it.  They could even transfer your saved game from the trial version to the full game if you're willing to buy it.  The "ten minutes" isn't set in stone; they could make it as generous as they like.  But skipping the download is a huge deal for a short trial, and that would make it easy for a company to let more people try their game.

    Second is compatibility issues.  If you can play most of the games you want directly, but need to stream a handful for compatibility reasons, that can beat not being able to play them at all.  It can also be a lot cheaper than buying a game console for the sake of playing only one exclusive game on it that you really want to play.

    A third good use case is light gamers, though this isn't for you.  Most gamers seem to think they're casual, on the basis that they can find someone else who plays twice as much as they do.  But really, if you're not just playing games, but coming to a web site about games and digging through the forums to read this far into a post, you're a hard-core gamer.  But some people aren't, and might want to pick up some particular game for a week, and then not play anything else that would overwhelm the rendering capabilities of a phone for the rest of the year.  Being able to play demanding games for those brief periods without having to shell out for a gaming desktop could save a lot of money.

    Fourth is the wealthy travelers who want to be able to play demanding games wherever they go without carrying around computers powerful enough to run them--and are willing to pay a lot of money for the privilege.  Carrying around some thin client, or perhaps a thin and light laptop that they can use as one, would make that possible.  I don't think that there are very many people in that category.  But there aren't very many whales in the "pay to win" sense, either.  If they pay enough money each, it can be worth catering to them.

    A fifth and final use case that I see as technically feasible, even if not necessarily commercially viable, is as a way to offer a high end graphical experience for the hardware version of whales.  Ray tracing looks a lot better than rasterization, but is really expensive on hardware.  It does scale easily to many GPUs or even multiple servers, which is not true of rasterization.  So what if maxing graphics on some future games requires not just one GPU, but a server full of them to turn on full ray tracing with all of the bells and whistles?  The $100k gaming desktop isn't going to have a lot of takers, but if you're implicitly sharing it with others because it's off in some data center, that's still expensive but less outlandish.  Would very many people be willing to pay $25/hour at peak times or $5/hour in the middle of the night to play games with beautiful graphics on such a server?  When you're looking at hardware that expensive, the cost of the Internet bandwidth is no longer a huge problem by comparison, so you could use a ton of bandwidth to get lightly compressed images to the end user, which could also help with latency.  I don't think there would be enough people willing to pay what it would cost to make this viable, but it will at least be technically possible.
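
    Back-of-the-envelope on what such a server would have to bill, with every figure assumed for illustration:

    ```python
    # What would a ray-tracing monster server have to bill per hour?
    # Every figure here is an assumption for illustration only.
    server_cost        = 100_000          # hardware, dollars
    amortization_years = 3
    overhead_multiple  = 2.0              # power, cooling, space, margin
    utilization        = 0.40             # fraction of hours actually sold

    hours_sold = amortization_years * 365 * 24 * utilization
    hourly_rate = server_cost * overhead_multiple / hours_sold
    print(f"~${hourly_rate:.0f} per hour just to break even")   # ~$19
    ```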
  • Quizzical Member Legendary Posts: 25,355
    Conspicuously missing from this list is the great mass of ordinary gaming that gamers do now with games rendered locally on their desktops or consoles or laptops or phones.  Moving more than a small fraction of that into thin client style gaming rendered on remote servers makes little technical or economic sense.  However hard some corporate suits are trying to make that happen, I don't see it ever happening.  Yes, Internet bandwidth will improve, but so will the capabilities of rendering games locally.  Plenty of cell phones today have far more processing power available than a high end gaming desktop from 15 years ago.

    If thin clients genuinely are the future of gaming, it will not be because it's what gamers wanted.  That dystopian future will only happen if game publishers or regulatory bureaucracies or something like that manage to force it upon us.  Of course, that's how other sorts of thin clients spread in the corporate world.  And gaming itself is hardly free of things that are almost universally loathed.  How many games have loot boxes, again?

    But sometimes, free markets work how you'd hope in theory, and I think this will be one of those cases.  The laws of physics and economics will have their say, and the technical challenges and costs of trying to move game rendering to remote servers while still providing a passable experience will restrict thin client gaming to a relatively small market share for the foreseeable future.  Probably the unforeseeable future, too, though that's less predictable.
  • Vrika Member Legendary Posts: 7,888
    edited June 2019
    I think that streaming games will become popular. It doesn't need to be as good an experience as playing on your own expensive hardware; for casual gamers, it just needs to be close enough to the experience provided by your own hardware, and effortless to use.
     
  • rojoArcueid Member Epic Posts: 10,722
    I'm ready to leave modern gaming behind the moment streaming replaces (IF it happens) traditional gaming and consoles. I already have enough tangible platforms (11 consoles and a PC) to last me a lifetime, and I still have a couple more to buy. Streaming as a third option is fine. Streaming as the only option is a gigantic no.

  • Asm0deus Member Epic Posts: 4,407
    edited June 2019
    Vrika said:
    I think that streaming games will become popular. It doesn't need to be as good an experience as playing on your own expensive hardware; for casual gamers, it just needs to be close enough to the experience provided by your own hardware, and effortless to use.
    Dunno, I am a casual gamer and I am not interested in this at all.

    Most gamers do not just game on their PC, I would think. My PC is the heart of my living room and provides all my TV/movie content as well as covering my gaming needs.

    As a casual gamer I am not interested in any subs. I cut the cord for a reason, so I don't see myself getting another one.

    Really, a gaming PC only needs a dedicated GPU. I am fairly certain the average casual gamer uses his PC for other things too, so he will still need a PC anyway, and there are some lower cost GPUs out there that let you run games on pretty much any low end PC.

    Brenics ~ Just to point out I do believe Chris Roberts is going down as the man who cheated backers and took down crowdfunding for gaming.

  • Mendel Member Legendary Posts: 5,609
    Philosophically, I'm against thin clients.  It puts the bulk of the work on the slowest, least reliable component of a computer (networking).  Millisecond speeds are simply pathetic compared to nanosecond speeds.  Thin clients worked well when the cost of local computing power was very high and the data sent to the client was static.  Neither of those conditions really applies to modern gaming.


    Logic, my dear, merely enables one to be wrong with great authority.
