Basics of 5G channel sounding
The Tuesday keynote included a short demo of NI test equipment used by AT&T Labs to do channel sounding for the upcoming 5G wireless standard. Here's a basic explanation.

If you wanted to find out how much signal loss there would be between a millimeter-wave transmitter and receiver, you could in theory solve Maxwell's equations for the space in question. The problem is that even supercomputers can't handle the calculations involved for real-life scenarios. So engineers use a two-part approximation to find path loss.

The first part is relatively simple. It treats an EM wave like a particle and calculates factors such as the angle of incidence and angle of reflection off the boundaries of the space in question. The second part of the approximation is statistical. It accounts for factors such as whether a pickup truck is briefly deflecting the signal, whether there are any Doppler effects from a moving transmitter or receiver, and so forth.

The only way to get this modeling right is to put transmitters and receivers in the field and run many, many tests. These tests, in a nutshell, are dubbed channel sounding.

The channel sounding apparatus visible here on the left is the NI-based transmitter sitting beneath an antenna that puts out a 360° signal with a 120° azimuth. The receiver is at right. According to AT&T's Dr. Arun Ghosh and NI's Dr. Tanim Taher (here standing to the left of the transmitter), the NI-based channel sounder can display data in real time that formerly required lots of post-processing to obtain.