Miniaturization and the integration of different technologies now characterize base station design, as base stations are being prepared for the simultaneous transmission of multiple wireless standards.
Following its introduction 20 years ago, GSM has established itself as the de facto global standard for mobile telephony. Since then, wireless communications has continued to develop. Whereas GSM was initially designed only for voice, the GPRS and EDGE extensions have also made it usable for data services. At the start of the new millennium, the new UMTS standard arrived and underwent further development via HSDPA, HSUPA, HSPA+ and HSPA+ Advanced. Operators have recently started deploying LTE, the latest wireless communications standard.
Traditionally, each of these standards was operated in a separate frequency band. In Europe these were the 900 MHz and 1800 MHz bands for GSM, the 2100 MHz band for UMTS and 800 MHz and 2600 MHz for LTE. Due to the increased demand for data services that can be operated more efficiently using UMTS, and better frequency utilization by GSM, for several years some countries have been operating UMTS in the 900 MHz band that was previously reserved for GSM. In Germany and Poland, LTE is being expanded to the former GSM-only 1800 MHz band. On the American continent, UMTS was introduced in the 850 MHz and 1900 MHz GSM bands right from the start.
The traditional practice of allocating wireless communications technologies to frequency bands has blurred – and it can be assumed that this trend will continue.
At the time UMTS was introduced, most countries already had an extensive GSM network. Part of the infrastructure that was set up for GSM was therefore also used for UMTS. At base stations this mainly consisted of the masts, the power supply and elements of the transport network.
New racks with UMTS communications equipment and controllers were added to existing hardware. New antennas were installed to cover the additional frequencies. Sometimes hardware from different suppliers was used for GSM and UMTS.
As a result, the GSM and UMTS networks remained separate even though they shared the same infrastructure. Network expansion, maintenance and network management still take place separately and are therefore duplicated. This means higher costs for network operators who have been trying to reduce costs, especially since starting to deploy LTE. There is a demand for universal, technology-independent hardware, which will significantly simplify the long-term migration from GSM to UMTS to LTE.
What are Multi-Standard Base Stations?
Multistandard base stations (multistandard BTS) have been developed to meet the demand for universal hardware for different technologies. These base stations can transmit and receive different standards, such as GSM and UMTS, simultaneously within one frequency band using the same active RF components. A multistandard BTS supports at least two different radio access technologies (RATs). This reduces network operators' expansion costs and also makes it easier to use the same frequency band for different technologies, e.g. GSM and UMTS at 900 MHz or GSM and LTE at 1800 MHz. Plus, multistandard BTS require less space and power, and are less expensive to install and maintain.
A multistandard BTS can of course also be operated using only one technology, which is why a distinction is made between single RAT and multi-RAT operation. As a result, operators have more flexibility during network operation and expansion.
The development of multistandard BTS was made possible by recent progress in the development of software defined radios (SDR) and amplifiers that demonstrate linear behavior under different peak to average power ratios (PAPR) such as occur with different standards.
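The PAPR mentioned above can be illustrated with a short sketch. The signal names and parameters below are hypothetical, chosen only to contrast a constant-envelope (GMSK-like, as in GSM) signal with a multicarrier (OFDM-like, as in LTE) signal; they are not taken from any standard or instrument.

```python
import numpy as np

def papr_db(samples: np.ndarray) -> float:
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(samples) ** 2
    return 10 * np.log10(power.max() / power.mean())

n = 4096
# Constant-envelope tone (GMSK-like): peak power equals average power.
gsm_like = np.exp(1j * 2 * np.pi * 0.01 * np.arange(n))
# Sum of many random-phase subcarriers (OFDM-like): occasional high peaks.
rng = np.random.default_rng(0)
ofdm_like = np.fft.ifft(np.exp(1j * 2 * np.pi * rng.random(n)))

print(round(papr_db(gsm_like), 1))   # 0.0 dB for a constant envelope
print(papr_db(ofdm_like) > 5)        # multicarrier PAPR is several dB higher
```

An amplifier in a multistandard BTS must remain linear across this whole range of envelope statistics, which is exactly the requirement noted above.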
Testing Multi-Standard Base Stations
The 3GPP global standardization body has standardized multistandard BTS in the TS 37.104 and TS 37.141 specifications for the GSM/EDGE, WCDMA/UMTS, TD-SCDMA and LTE technologies. TS 37.104 describes the minimum prerequisites for the air interface, and TS 37.141 defines the tests and test requirements. The number and the complexity of test scenarios have risen in comparison with traditional base stations, increasing the requirements on measuring equipment with respect to measuring time and adjustable parameters.
Spectrum analyzers such as the R&S FSW, R&S FSQ and R&S FSV can be used to perform the required transmitter measurements, such as spurious emissions, out-of-band emissions, and adjacent channel leakage ratio. These measurements are carried out in the sweep mode, the traditional spectrum analyzer mode of operation. Comprehensive measurement options also allow the analysis and demodulation of GSM, WCDMA, LTE FDD/TDD and TD-SCDMA signals, so that all TS 37.141 test scenarios are covered.
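As a rough illustration of one of these transmitter measurements, the adjacent channel leakage ratio compares the power transmitted in the wanted channel with the power leaking into a neighboring channel. The following is a minimal FFT-based sketch, not the measurement method of any analyzer; the signal, bandwidths and offsets are hypothetical.

```python
import numpy as np

def band_power(spectrum: np.ndarray, freqs: np.ndarray,
               f_lo: float, f_hi: float) -> float:
    """Integrated power of the FFT bins falling between f_lo and f_hi."""
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return np.sum(np.abs(spectrum[mask]) ** 2)

def aclr_db(samples, fs, f_center, ch_bw, offset):
    """ACLR: main-channel power over adjacent-channel power, in dB."""
    spec = np.fft.fftshift(np.fft.fft(samples))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(samples), d=1 / fs))
    main = band_power(spec, freqs, f_center - ch_bw / 2, f_center + ch_bw / 2)
    adj = band_power(spec, freqs,
                     f_center + offset - ch_bw / 2,
                     f_center + offset + ch_bw / 2)
    return 10 * np.log10(main / adj)

# Hypothetical test signal: an in-channel tone plus low wideband noise.
rng = np.random.default_rng(1)
fs, n = 30.72e6, 30720
t = np.arange(n) / fs
sig = np.exp(1j * 2 * np.pi * 1e6 * t) \
    + 1e-3 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
print(aclr_db(sig, fs, f_center=1e6, ch_bw=3.84e6, offset=5e6) > 30)  # True
```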
Since a multistandard BTS simultaneously transmits different standards in the same frequency band using the same RF components, the signals can influence each other. To optimize and troubleshoot multistandard BTS, it is therefore necessary to recognize dependencies between signals of different standards. In sweep mode, however, the spectrum is analyzed sequentially: the detector measures the level at a certain point in time at a certain frequency. Briefly occurring interference outside this range will not be detected. Rohde & Schwarz has developed the multistandard radio analyzer (MSRA) for such measurements.
Multistandard Radio Analyzer (MSRA)
The MSRA is a new operating mode for the R&S FSW signal and spectrum analyzer in which the signal is first fully captured over a set frequency and time interval, temporarily stored and then analyzed. This makes it easy to find interference between signals of different technologies.
The MSRA function allows the R&S FSW to capture 200 Msamples of signal data. At an analysis bandwidth of 160 MHz, data can be captured for up to one second. Previously, signals of different technologies had to be analyzed sequentially. This made it significantly more difficult to detect time-correlated dependencies caused by mutual interference between the signals.
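The one-second figure follows directly from memory depth divided by sample rate. A quick check, assuming (this factor is an assumption, not a published specification) that the analyzer samples at 1.25 times the analysis bandwidth:

```python
# Capture duration = memory depth / sample rate.
mem_samples = 200e6                      # 200 Msamples of capture memory
analysis_bw_hz = 160e6                   # 160 MHz analysis bandwidth
sample_rate_hz = 1.25 * analysis_bw_hz   # assumed 1.25x oversampling -> 200 MHz
duration_s = mem_samples / sample_rate_hz
print(duration_s)  # → 1.0 second
```

A narrower analysis bandwidth lowers the required sample rate and stretches the same memory over a proportionally longer capture.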
Pulsed, non-correlated signals are especially difficult to detect using the sweep mode because a complete pulse may be missed and not captured at all. Since the MSRA captures and analyzes an entire time and frequency interval at once, all signals in this interval are reliably captured.
The following example shows how MSRA mode helps us analyze errors on a multistandard radio signal. First, we look at the MSR signal in the MSRA view (Fig. 1). This signal consists of two GSM carriers, one UMTS carrier and an LTE carrier. The markers show the limits of the analysis range of the individual measurements. The individual measurements are called up using the tabs.
Next, we look at the UMTS measurement on the 3G FDD BTS tab (Fig. 2). In this example, the Composite EVM and EVM vs. Chip measurements are displayed.
It is evident that slot 1 is showing an unexpectedly high EVM value. The EVM value is a measure of the quality of the digitally modulated signal. EVM values that are too high lead to a higher error rate and therefore a slower data rate. Above a certain threshold, which depends on the type of modulation, data transmission is no longer possible. The EVM value is therefore a key figure of merit when developing and optimizing systems for digital wireless transmission. We can examine UMTS slot 1 more closely in the EVM vs. Chip display. Here it is evident that the high EVM value is caused by defective chip 1878. Since this display takes place in the time domain, the orange marker can be placed exactly on this chip, and is therefore positioned at 6.31 ms.
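The RMS EVM underlying such a display can be sketched in a few lines. The symbol sequence and the disturbed index below are hypothetical, chosen only to show how a single corrupted chip, like the one found in the example, drives up the composite EVM of an otherwise clean slot.

```python
import numpy as np

def evm_percent(measured: np.ndarray, reference: np.ndarray) -> float:
    """RMS error vector magnitude as a percentage of RMS reference power."""
    err = measured - reference
    return 100 * np.sqrt(np.mean(np.abs(err) ** 2)
                         / np.mean(np.abs(reference) ** 2))

# Hypothetical QPSK reference symbols, unit average power.
ref = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j] * 250) / np.sqrt(2)
meas = ref.copy()
meas[123] += 0.5  # one strongly disturbed symbol in an otherwise clean slot
print(round(evm_percent(meas, ref), 2))  # → 1.58
```

A single bad chip raises the composite EVM of the whole slot, which is why the EVM vs. Chip display is needed to localize it in time.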
When we switch to the MSRA view of the GSM measurement (Fig. 3) with the magnitude capture display (level vs. time), the offender is revealed. The marker at 6.31 ms is now positioned exactly on the rising edge of the GSM burst.
It is obvious that the edge of the GSM signal is the reason for the increased EVM value of the UMTS signal. Simultaneous analysis of the signals in MSRA mode enables users to find the cause of interference in the UMTS signal. Without the R&S FSW analyzer’s MSRA mode, such an analysis would be much more complicated. Either a second, time synchronized and triggered spectrum analyzer would be needed, or the captured data would need to be analyzed using complex external signal processing software. The MSRA mode makes troubleshooting much easier and faster, with just one instrument.
Base station manufacturers are following the example of smartphone manufacturers and focusing on miniaturization and a higher level of integration. Network operators can now use multistandard BTS to transmit different wireless standards using the same infrastructure, cutting the cost of installation, maintenance and management of their networks. However, multistandard BTS place higher requirements on measuring equipment with respect to measuring time, adjustable parameters and test scenarios as specified in 3GPP TS 37.141.
Optimization and troubleshooting that extends beyond the TS 37.141 specification requires methods for analyzing the time correlation of signals of different standards. With the MSRA, Rohde & Schwarz provides manufacturers of multistandard BTS with a new operating mode that makes it easier to detect interference between signals of different technologies. Combined measurement within a time and frequency range with the R&S FSW visualizes errors that were previously extremely difficult to detect.