afadeev said:
You are confusing 5G with mmWave-length spectrum. The latter is a subset of the former.
No confusion. mmWavelength is where the huge gains in speed will be seen. The rest (using legacy bands) is mainly an incremental upgrade that isn't going to mean jack to most people's use cases.
Agree on (coverage constrained) speed gains with mmWave-length, but how much speed does one really need?
Seriously, I get 200Meg up/down on my home internet, and probably use 10% of that during busy hour.
You can get 100-200Megs with MIMO LTE.
Today.
Would you pay more for that 100-200 Megs (vs. 80 Megs down / 38 Megs up I just observed on my LTE speed check) tomorrow?
More yet for 1+Gig?
I'm not sure I will.
Well, maybe a little, just for bragging rights!
I am already paying for way more bandwidth than what I can realistically consume at home, and that 200Megs router supports 15-25 devices, some of them concurrently streaming video.
1+ Gigs down over mmWave (24+ GHz) makes awesome commercials, but comes at the price of massive attenuation, as the signal is blocked by atmospheric gases, humidity, glass, trees, buildings, etc.
Even with clear line of sight, signal strength degrades fast, and practical range is ~500-1,000 feet.
And you can materially degrade that further by literally farting in the general direction of the cell tower
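To put a rough number on why mmWave range is so short: free-space path loss alone grows with the square of frequency, so moving from a classic LTE band to 28 GHz costs you ~23 dB before walls, trees, or rain even enter the picture. A minimal sketch (the 1.9 GHz / 28 GHz / 300 m figures are just illustrative picks, not anything from the spec):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Compare a classic LTE band against a mmWave band at the same distance.
for f in (1.9e9, 28e9):
    print(f"{f / 1e9:>5.1f} GHz @ 300 m: {fspl_db(300, f):.1f} dB")
```

Same distance, same transmit power: the 28 GHz link starts ~23 dB (about 200x in linear power) deeper in the hole, which is why carriers compensate with beamforming and dense small cells rather than tower spacing.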
Curious, I've not read much on this, so what kind of peak (and normally seen) bandwidth is expected for devices operating in those bands?
+30-50% over LTE on the same spectrum.
And in those frequency bands they're going to be at or below 4G bandwidth rates, right? Or does 5G also have improved efficiency in the spec's design that makes up for the lower theoretical maximum that using a lower radio frequency implies?
Re-farming (or dynamically "sharing") 4G spectrum would reuse the existing licensed spectrum that all US carriers own. It's up to each carrier to decide how much of what band to dynamically reallocate to 5G.
5G is slightly more efficient with RAN resource block management, so add ~15% spectrum utilization gains (aka cost savings to all the carriers).
A little cheaper for the carrier, and a bit faster for the end user: so a win-win.
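Just to make the numbers above concrete, here's the back-of-the-envelope version, projecting the thread's quoted +30-50% gain onto the 80 Mbps LTE downlink observed earlier. Both percentages come from the posts above and are rough estimates, not spec figures:

```python
# Illustrative arithmetic only: apply the quoted 5G-on-same-spectrum
# gains to the 80 Mbps LTE downlink measured earlier in the thread.
lte_down_mbps = 80

for label, gain in [("low estimate (+30%)", 0.30),
                    ("high estimate (+50%)", 0.50)]:
    projected = lte_down_mbps * (1 + gain)
    print(f"{label}: ~{projected:.0f} Mbps down")
```

So on re-farmed 4G spectrum you'd land somewhere around 104-120 Mbps down: a real improvement, but nothing like the mmWave headline numbers.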
Mid-band (3-6 GHz) might be the sweet spot of "better enough" speed with still "wide enough" coverage.
<edit> I ask because my understanding has been that the high frequency stuff (above 20GHz) is the payoff for going 5G, and the lower frequencies are just fallbacks there to allow very wide coverage within the same spec umbrella that lets the devices slide back and forth between those trade-offs very quickly and seamlessly. So without any of the above 20GHz frequencies (AKA "mmwave") available for use I didn't think it had much of a point?
5G is not restricted to mmWave (24+ GHz) spectrum.
Traditional spectrum bands are also covered, and benefit from 5G improvements. As does the packet core, which benefits from a lot of new capabilities.
It's just that those mmWave frequencies support "stupid fast" speeds that make for great commercials, so all carriers jumped on deploying those to maximize publicity from the 5G CapEx.
So 99+% of the world now thinks 5G == mmWave.
Which has its place, but with hard limitations that will prevent mmWave from going nationwide. Ever.
HTH,
a