It seems like whenever there is a thread about cloud gaming, some clueless person will come in and basically say that 5G will offer near-infinite bandwidth at essentially zero latency, which will fix all problems with the Internet forever. Obviously, that's not how it works. But I thought I'd explain some of what 5G will and won't bring.
If you're transmitting data using RF from one point and receiving it at one point, the maximum rate at which you can transmit data that the other end will actually pick up correctly is the amount of spectrum that you're using times a function of the signal-to-noise ratio (this is the Shannon-Hartley theorem: bandwidth times the base-2 log of one plus the SNR). You're not guaranteed to hit that maximum rate; for that matter, you're pretty much guaranteed not to hit it. But how clever you are about transmitting the data will affect how close to it you can get.
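To put a number on that ceiling, here's a back-of-the-envelope Python sketch of the Shannon-Hartley formula. The 20 MHz and 20 dB figures are made up purely for illustration, not taken from any real network:

    import math

    def shannon_capacity_bps(bandwidth_hz, snr_linear):
        # Theoretical ceiling on the error-free data rate of one
        # point-to-point link (Shannon-Hartley theorem).
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Made-up illustrative numbers: 20 MHz of spectrum, SNR of 100 (20 dB).
    print(shannon_capacity_bps(20e6, 100.0) / 1e6)  # about 133 Mbit/s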
That means that a cellular network operator has several paths it could pursue in order to increase bandwidth:
1) send data from more places and to more places
2) use more of the spectrum
3) be more clever about how you transmit and receive data
I'll come back to (1), as that's the big one. But with points (2) and (3), there's only so much you can do.
As I said above, for point (3), there is some best possible rate. You can make adjustments to get closer to it. But if you're already at 80% of the theoretical peak, the most you could ever gain is another 25%, so you're not going to double your throughput just by being more clever. The move from 2G to 3G picked up most of what was available to be had here.
Any protocol to send data reliably has some overhead for error detection and correction. For example, in SATA, this overhead is about 20% of the raw data sent. In PCI Express 3.0, it is about 1.5%. In GSM (2G), it was about 70%. 3G got rid of most of that overhead.
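To see what those overhead numbers mean in practice, here's a trivial sketch of usable rate versus overhead. The 100 Mbit/s raw rate is made up, and the percentages are just the rough figures quoted above:

    def goodput_bps(raw_rate_bps, overhead_fraction):
        # Usable data rate once the error detection/correction overhead
        # is taken out of the raw transmitted bits.
        return raw_rate_bps * (1 - overhead_fraction)

    raw = 100e6  # a made-up 100 Mbit/s raw link rate
    for name, overhead in [("GSM-like, ~70%", 0.70),
                           ("SATA-like, ~20%", 0.20),
                           ("PCIe 3.0-like, ~1.5%", 0.015)]:
        print(name, goodput_bps(raw, overhead) / 1e6, "Mbit/s usable")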
That's hardly the only regrettable thing about the 2G standard. The modulation couldn't adjust to send data faster when the signal to noise ratio made it possible. Switching from one cell tower to another was really clunky. The latency if something got missed and had to be resent was awful. The encryption wasn't very good.
And that all got fixed with the move from 2G to 3G. Yes, moving from 3G to 4G also improved things, as will moving from 4G to 5G. But that's tinkering around the edges, not a massive revolution.
Another approach is to just use more of the spectrum, which was my point (2) above. That's limited for different reasons. For starters, there is only 6 GHz of spectrum at a frequency of 6 GHz or less. That's axiomatic, of course, and not a statement about 6 GHz in particular. But cellular networks have commonly used frequencies around 1 GHz or 2 GHz or so because they propagate pretty well. Try to send data at a frequency of 1 THz and it will get absorbed by the air pretty quickly.
In a sense, claiming more spectrum is the big advance of 5G. But that won't do everything that you might hope for. The 5G standard will use two separate chunks of spectrum: one below 6 GHz, and the other above 24 GHz. The former is just an evolutionary advance over 4G and not really claiming more spectrum than before.
The latter is the new thing. It's sometimes loosely called "millimeter wave", though that portion of RF technically doesn't start until 30 GHz. Claiming chunks of spectrum above 24 GHz (the millimeter wave region runs up to 300 GHz) gives 5G massively more spectrum than 4G had, and thus the possibility of massively more bandwidth. Some portions of that spectrum may be reserved for various other things in some parts of the world, but still, that's going to be a lot more spectrum than before.
So why didn't previous cellular standards do this? Because it doesn't propagate very far. It's not just that it gets absorbed by the air, though some does. It's also that it gets blocked to some degree by buildings, vehicles, hills, trees, or whatever else might happen to be in the way. If you try to connect to a cell tower that uses the higher portions of spectrum from a mile away, you're not likely to think it's an improvement over 4G. Or 3G.
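For a rough sense of scale, even free-space path loss by itself gets worse as the frequency goes up, before you account for any absorption or blockage at all. A small sketch, where the one-mile distance and the 28 GHz band are just illustrative choices:

    import math

    def free_space_path_loss_db(distance_m, freq_hz):
        # Friis free-space path loss; ignores air absorption and anything
        # physically in the way, so it understates real mmWave losses.
        c = 3e8  # speed of light in m/s
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

    d = 1609.0  # roughly one mile, in meters
    for f_ghz in (1, 2, 28):  # 1 GHz, 2 GHz, and a typical mmWave band
        print(f_ghz, "GHz:", round(free_space_path_loss_db(d, f_ghz * 1e9)), "dB")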
That's why the millimeter wave form of 5G isn't going to be deployed that broadly, at least as measured by land area. It will get used in dense, urban areas. It will be a big improvement over public WiFi for the crowded areas that use it today. But it won't get deployed into more sparsely populated areas. If your nearest neighbor lives 500 feet away, you're not going to get millimeter wave 5G. You could still get the lower frequency version of 5G (and probably will if you have 4G now), but that's just an evolutionary improvement over 4G.
And that leads us back to option (1): send data from more places to more places. That really breaks into two cases:
1) have more cell towers that each cover a smaller region
2) have multiple transmit and receive points for a given connection between one cell tower and one cell phone
The latter is already done, but 5G networks will improve on it. Instead of a single transmit point, an antenna can have four transmit points right next to each other, each sending its own signal. Instead of a single receive point, it can have four right next to each other, with some computation across all four inputs to figure out what data was sent. But you could do beamforming on nearly any signal at all, and 4G already uses MIMO pretty heavily.
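For a back-of-the-envelope feel for why extra antennas help, here's a sketch using the standard log-det MIMO capacity formula with an idealized random channel. The 20 dB SNR and the independent-Gaussian channel are assumptions for illustration, not how a real channel behaves:

    import numpy as np

    def mimo_capacity_bits_per_hz(H, snr_linear):
        # Classic log-det MIMO capacity with power split evenly across the
        # transmit antennas: log2 det(I + (SNR / n_tx) * H H^H).
        n_rx, n_tx = H.shape
        gram = H @ H.conj().T
        return float(np.log2(np.linalg.det(np.eye(n_rx) + (snr_linear / n_tx) * gram).real))

    rng = np.random.default_rng(0)
    snr = 100.0  # 20 dB, made up for illustration
    for n in (1, 2, 4):
        # Idealized random channel: independent complex Gaussian entries.
        H = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
        print(f"{n}x{n}:", round(mimo_capacity_bits_per_hz(H, snr), 1), "bits/s/Hz")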
The improvements that 5G networks will offer here are not so much due to the 5G standard as to increased processing power from Moore's Law. Making heavy use of MIMO and beamforming comes at the cost of heavy computational requirements. To get n times the signal-to-noise ratio, you're going to have to do more than n times the computational work when receiving the signal. For complicated, technical reasons, it can sometimes be a lot more than n. Sometimes it can be more than n^3.
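As a rough illustration of where that kind of cost comes from, here's a simplified zero-forcing style detector: separating n streams involves a matrix inversion or solve that costs on the order of n^3 operations. This is a toy sketch, not what any carrier's equipment actually runs:

    import numpy as np

    def zero_forcing_detect(H, y):
        # Separate the per-antenna streams in y = H x + noise by applying
        # the pseudo-inverse of the channel matrix H. The inversion/solve
        # is the expensive step, roughly O(n^3) in the antenna count.
        return np.linalg.pinv(H) @ y

    rng = np.random.default_rng(1)
    H = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    x = np.array([1, -1, 1, 1], dtype=complex)        # transmitted symbols
    y = H @ x + 0.01 * rng.standard_normal(4)         # received, plus a little noise
    print(np.round(zero_forcing_detect(H, y).real))   # recovers roughly [1, -1, 1, 1]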
And then there is the option of more cell towers that each cover a smaller region. All of your "this is the best you can do" computations are on a per-cell-tower basis. If you have four times as many towers that each cover a quarter as much area as before, then your network can have four times the aggregate bandwidth. This doesn't even cause extra interference problems.
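In sketch form, reusing the made-up 20 MHz and 20 dB figures from earlier (and ignoring the messy edge effects real networks have to deal with):

    import math

    def aggregate_capacity_bps(num_cells, bandwidth_hz, snr_linear):
        # Every cell reuses the same spectrum, so total capacity is just
        # the per-cell Shannon capacity times the number of cells.
        return num_cells * bandwidth_hz * math.log2(1 + snr_linear)

    print(aggregate_capacity_bps(1, 20e6, 100.0) / 1e6)  # one big cell: ~133 Mbit/s
    print(aggregate_capacity_bps(4, 20e6, 100.0) / 1e6)  # four small cells: ~533 Mbit/s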
The move from 2G to 3G already did this to some degree, as did the move from 3G to 4G. Going from 4G to 5G will do it again. Indeed, 5G will be forced to go with smaller cells when using the millimeter wave chunk of the spectrum.
But the gains from improved MIMO, beamforming, transmit diversity, and whatever else aren't really due to the move from 4G to 5G. They're mostly due to having more computational power available because of improvements in chip fabrication. If the various carriers decided to put the same resources into building new networks that strictly followed the 3G specification today instead of 5G, they'd still get some huge bandwidth gains over the existing 3G networks.