There has been this “whack-a-mole” movement around 6G of late. From the White House to MWC to the FCC, the term keeps popping up in random places, waiting for someone to whack it with a mallet, only to pop up again somewhere else.
What is really amusing at the moment is that some of those talking really seem to think it is timely. We are 10 years away from maxing out 4G, and 20 years away from maxing out 5G, if the usual formula for maturing wireless technologies holds. That may or may not be the case with wireless going forward, but then nobody thought Moore’s Law would cap out either (and it is capping out in many segments).
Therefore, while 6G is certainly on the horizon, any serious talk about what it will be is simply conjecture. In that vein, I am going to point to the FCC’s recent announcement that it is beginning to draft plans for frequencies approaching, and even beyond, 100 GHz.
I find this interesting in the sense that the FCC is never on the bleeding edge of technology. Historically, it takes the agency forever to catch up with what is needed in spectrum management. Its model may have worked in the era of 3G, but with 4G and beyond, it needs a revolutionary model to keep up with the technology. Is it turning over a new leaf?
I am not sure what their mindset is with this vector. They are talking about allocating 21 gigahertz of spectrum for unlicensed use at 95 GHz and above. The details revolve around creating experimental licenses for those who want to dabble in these mmWave bands. Essentially, licenses will be granted for study between 95 GHz and 3,000 GHz (3 THz) and will be valid for 10 years.
It will be interesting to see who will apply for those licenses and what their motives are. Because this is unlicensed, shared spectrum, hoarding, at least, would not be an issue, as it is with licensed frequencies. Even more interesting is what technologies will be used to share bands that have the potential to be several hundred MHz wide.
The thinking behind this has validity. The current fear is the amount of bandwidth necessary to handle the data tsunami once the 5G umbrella is in place and the diversity and ubiquity of apps in that space come to fruition. I happen to agree with this thinking.
With the likes of new supercomputers, such as Aurora, and the advancing capabilities of AI and machine learning (ML), the potential for data analytics and manipulation goes over the top. Bandwidth will have to be over the top as well to accommodate the processing and data throughput capabilities. At this stage, that bandwidth does not exist below 100 GHz.
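To put rough numbers on why wider channels matter, here is a back-of-the-envelope Shannon-capacity sketch. The channel widths and the 10 dB SNR are my own illustrative assumptions, not FCC figures; the point is simply that, at a fixed SNR, capacity scales linearly with bandwidth, and multi-GHz channels only fit well above today’s bands:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon channel capacity C = B * log2(1 + SNR), in bits/s."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative comparison at the same assumed 10 dB SNR:
# a 100 MHz mid-band channel vs. a 10 GHz channel of the kind
# that could only be carved out of spectrum above 95 GHz.
for label, bw_hz in [("100 MHz channel", 100e6), ("10 GHz channel", 10e9)]:
    capacity = shannon_capacity_bps(bw_hz, 10)
    print(f"{label}: ~{capacity / 1e9:.2f} Gbit/s")
```

Under these assumptions the 100 MHz channel tops out around 0.35 Gbit/s, while the 10 GHz channel reaches roughly 34.6 Gbit/s, the kind of headroom the “data tsunami” argument leans on.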
However, the question is: how do we work around the limitations at such frequencies? While using such bandwidth is not technically an issue, making it work in real-world deployments is. The current thinking is to use these bands for ultra-dense, short-RF-footprint scenarios. That seems to be the niche because, unless we find some new, unknown RF phenomenon, that is about all they will be able to do. However, that niche has a lot of potential applications.
There are many challenges in working at these frequencies, and they will limit what 100+ GHz and THz bands can be used for, even in ultra-dense, small-footprint applications.
As well, propagation and channel properties are only the tip of the iceberg. Interference and noise behave by different metrics and are not well understood at THz frequencies.
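One simple way to see why these bands are confined to short-footprint niches is plain free-space path loss, which grows with the square of frequency. A minimal sketch of the Friis path-loss formula follows; the 3.5 GHz mid-band comparison point and the 100 m link distance are my own illustrative choices, and real THz links fare even worse once atmospheric absorption is added:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss (Friis): 20*log10(4*pi*d*f/c), in dB."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Same 100 m link, mid-band 5G vs. 100 GHz (illustrative frequencies).
loss_midband = fspl_db(100, 3.5e9)   # ~83 dB
loss_100ghz = fspl_db(100, 100e9)    # ~112 dB
print(f"3.5 GHz: {loss_midband:.1f} dB, 100 GHz: {loss_100ghz:.1f} dB")
print(f"Extra loss at 100 GHz: {loss_100ghz - loss_midband:.1f} dB")
```

Roughly 29 dB of extra loss at the same distance is close to a thousandfold drop in received power, which is why the footprint has to shrink, or the antenna gain has to rise, dramatically.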
However, back to the FCC action. Again, we are late to the table. Europe is leading the charge at the moment. The European Telecommunications Standards Institute (ETSI) has created the ISG mWT (millimetre Wave Transmission) working group, which is looking at how to make the 50 GHz – 300 GHz range work. There have been some trials at the lower end of this spectrum, but all of this is still in the embryonic stage.
Commissioner Michael O’Rielly made an interesting point when he said, “I understand that it may be a bit premature to establish exclusive-use licenses above 95 GHz when there is great uncertainty about what technologies will be introduced, what spectrum would be ideal, or what size channel blocks are needed.” Touché.
Some of this THz thinking is way outside the current RF wheelhouse. For example, Commissioner Geoffrey Starks made an interesting point when he talked about THz spectrum being used, together with AI and Big Data analytics, to better understand biological processes at the cellular level. This has a lot of potential because it fits into the short-range, dense-network shoebox. Its main application would be to pave the way for processes such as noninvasive cancer and other medical screenings.
Similarly, THz spectroscopy can be used to identify dangerous materials, weapons, drugs, and the like in transportation hubs, sports venues, and any number of other scenarios. There are other, more “out there” theoretical usage scenarios on the radar screen as well. There is talk that such wideband scenarios, along with next-generation hardware and software, could be used to approximate human thinking. Today, however, that is purely in the realm of science fiction. Nevertheless, it is fascinating to fantasize about some of this stuff, even if it is likely decades out.