Besides High Speeds, What Else Will 5G Bring Us?

Alibaba Cloud
26 min read · May 25, 2020


By Chu Peisi, nicknamed Zishuo at Alibaba.

5G is about revolutionizing our lifestyles, not just bringing faster network speeds. It will spur changes across many industries. 5G involves not only large bandwidth but also many other capabilities that will benefit enterprises across a wide range of industries. In this post, Alibaba Entertainment’s technical expert Chu Peisi shares some of the key technologies behind 5G.

The Core Technologies of 5G

5G involves a wide variety of core technologies. You might already know about the three scenarios defined for 5G:

1. eMBB (Enhanced Mobile Broadband): As its name suggests, this is designed for high-traffic mobile broadband services.
2. URLLC (Ultra-Reliable and Low Latency Communications): The target response time was 500 milliseconds in 3G and 50 milliseconds in 4G, and is required to be 1 millisecond in 5G, which makes it applicable to scenarios such as autonomous driving and telemedicine.
3. mMTC (Massive Machine Type Communications): This is for large-scale Internet of Things (IoT) services.

Enhanced Mobile Broadband (eMBB)

4G was record-breaking in terms of network speed, so how can 5G scale to new heights? Let’s take a look at the formula first.

Capacity = bandwidth × spectral efficiency × number of cells

According to this formula, there are three ways to increase capacity: increasing the spectrum bandwidth, increasing the spectral efficiency, and increasing the number of cells. Increasing the number of cells means building more base stations, which is currently too costly to realistically implement.
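To make the roles of the three factors concrete, here is a minimal back-of-the-envelope sketch in Python. All the numbers are illustrative assumptions, not measured values for any real network.

```python
# Illustrative numbers showing how each factor in
#   capacity = bandwidth x spectral efficiency x number of cells
# scales the total system capacity. None of these values describe a real network.

def system_capacity(bandwidth_hz, spectral_eff_bps_per_hz, num_cells):
    """Aggregate capacity in bit/s."""
    return bandwidth_hz * spectral_eff_bps_per_hz * num_cells

baseline   = system_capacity(20e6, 2.0, 100)    # a 4G-like starting point (assumed)
wider_band = system_capacity(100e6, 2.0, 100)   # 5x bandwidth -> 5x capacity
better_se  = system_capacity(20e6, 6.0, 100)    # 3x spectral efficiency -> 3x capacity
more_cells = system_capacity(20e6, 2.0, 300)    # 3x cells -> 3x capacity, but 3x the base stations

print(f"baseline:    {baseline / 1e9:.1f} Gbps")
print(f"wider band:  {wider_band / 1e9:.1f} Gbps")
print(f"better SE:   {better_se / 1e9:.1f} Gbps")
print(f"more cells:  {more_cells / 1e9:.1f} Gbps")
```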

For spectrum bandwidth, resources in the middle and low frequency bands are very limited. As a result, 5G has moved into the field of millimeter wave. As we will discuss later, millimeter wave offers a very high frequency band and abundant resources, so it has become the focus of spectrum development. In addition to developing new spectrum resources, another effective method is increasing the utilization of existing spectrum. For this purpose, cognitive radio, which has been progressing over the years, can be used to increase the utilization of radio and television white space.

“White space” refers to spectrum that can be used by wireless communication devices or systems during specific time periods and in specific regions without interfering with higher-priority services. Accordingly, “radio and television white space” refers to the white spectrum in the radio and television frequency band. This band offers very high signal quality and is suitable for wide-area coverage, so the application of cognitive radio in this band deserves attention.

Operators prefer to enhance network capacity by increasing spectral efficiency. Error-correction coding methods are used to approach the Shannon limit. Compared with the Turbo code used in 4G, the channel coding schemes in 5G are much more efficient.

Currently, 4G and Wi-Fi mainly use orthogonal frequency division multiplexing (OFDM) technology, which offers better performance than code division multiple access (CDMA) and other earlier modulation technologies. However, orthogonal frequency division multiple access (OFDMA) requires that all resource blocks be orthogonal, which limits the use of resources. If signals that are not orthogonal could also be correctly demodulated, system capacity could be greatly increased. Non-orthogonal multiple access (NOMA) technology was developed in response to this need. Once modulation technologies reach their limits in improving network capacity, we can resort to multi-antenna technology: massive multiple-input multiple-output (MIMO) can greatly improve capacity.

Channel Coding Technology

There are three main data coding schemes: LDPC code, which was proposed in the United States; Polar code, which was proposed by a professor at a university in Turkey; and Turbo 2.0 code, which originated in Europe.

At the RAN1#86bis meeting held by the 3rd Generation Partnership Project (3GPP) in Lisbon, Portugal, in October 2016 (the “86th meeting” in the following), Turbo code, which dominates 3G and 4G, received limited attention, while the LDPC and Polar codes were at the center of debate. LDPC code had the upper hand by virtue of its technological advantages and gained a large number of supporters, including Samsung, Qualcomm, Nokia, Intel, Lenovo, Ericsson, Sony, Sharp, Fuji, and Motorola Mobility. At that time, Polar code had only one staunch supporter, Huawei, and the outcome would not have changed even if Lenovo had intended to vote for it. In other words, LDPC code won an outright victory at the meeting and became the standard scheme for data transmission in 5G mobile broadband.

Later, at the RAN1#87 meeting held by 3GPP in November 2016 in the United States, short code schemes for 5G data channel and schemes for 5G control channel were discussed. According to the final results of the vote, for the application of 5G in eMBB scenarios, the LDPC code advocated by Qualcomm was defined as the data channel coding scheme for long code and uplink and downlink short code, while the Polar code advocated by Huawei was defined as the control channel coding scheme.

The 5G data channel prioritizes transmission speed because it is designed mainly for the transmission of large packets. LDPC code has obvious advantages in this regard, which is why it easily became the data channel coding scheme for long code. In contrast, the 5G control channel gives priority to reliability rather than speed, because the traffic on the control channel is low. Polar code offers important advantages in reliability. In addition, Polar code was widely supported by Chinese manufacturers including Lenovo, so it finally became the international coding standard for the control channel of 5G mobile broadband.

As shown in the following figure, for large message block lengths, the transmission efficiency of LDPC code clearly exceeds that of Turbo code and Polar code.

NOMA Technology

The 4G network uses the OFDMA technology, which both avoids multi-path interference and greatly increases the data transmission rate when combined with the MIMO technology. Since the signals of multiple users are orthogonal, distances from mobile phones to cells no longer matter. Consequently, fast power control is no longer required, and adaptive modulation and coding (AMC) is used to implement link adaptation.

From 2G and 3G to 4G, multi-user multiplexing technology only made improvements in the time domain, frequency domain, and code domain. By contrast, NOMA adds a new dimension on top of OFDM: the power domain.

In the power domain, multi-user multiplexing is implemented based on the different path losses of users.

NOMA is designed to integrate the non-orthogonal multi-user multiplexing of 3G into the OFDM technology of 4G.

NOMA can superpose multiple transmission signals based on the differences in their path losses, thereby improving the signal gain. It allows all mobile devices in the same cell to obtain the maximum available bandwidth, which can resolve the network challenges caused by large-scale connections.
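To make the power-domain idea more concrete, here is a minimal Python sketch of downlink NOMA with two users and successive interference cancellation (SIC). The power split, BPSK modulation, and noise level are illustrative assumptions, and channel coding is deliberately omitted.

```python
import numpy as np

# Minimal power-domain NOMA sketch (illustrative only): the base station superposes
# two users' symbols with different power levels, and the near (strong-channel) user
# removes the far user's signal via successive interference cancellation (SIC)
# before decoding its own symbols.

rng = np.random.default_rng(0)
n = 8
bits_far = rng.integers(0, 2, n)    # far user gets more power
bits_near = rng.integers(0, 2, n)   # near user gets less power

p_far, p_near = 0.8, 0.2            # assumed power split (sums to 1)
x_far = np.sqrt(p_far) * (2 * bits_far - 1)
x_near = np.sqrt(p_near) * (2 * bits_near - 1)

tx = x_far + x_near                                   # superposed downlink signal
rx = tx + 0.05 * rng.standard_normal(n)               # near user's received signal (light noise)

# SIC at the near user: decode the far user's (stronger) symbols first ...
far_hat = (rx > 0).astype(int)
# ... reconstruct and subtract them, then decode the near user's own symbols.
residual = rx - np.sqrt(p_far) * (2 * far_hat - 1)
near_hat = (residual > 0).astype(int)

print("far  bits recovered:", np.array_equal(far_hat, bits_far))
print("near bits recovered:", np.array_equal(near_hat, bits_near))
```

In a real system the power split would be derived from the users’ channel qualities and combined with channel coding, both of which this sketch leaves out.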

Millimeter Wave

As early as 2015, the U.S. Federal Communications Commission (FCC) took the lead in choosing four frequency bands, namely 28 GHz, 37 GHz, 39 GHz, and 64–71 GHz, as the recommended bands for 5G millimeter wave in the U.S. The FCC held a 28 GHz spectrum auction in which 2,965 spectrum licenses were sold for nearly $703 million in total. In China, spectrum is allocated by the National Radio Regulatory Commission, whereas in some other countries it is sold through public auction.

Millimeter wave offers substantial advantages, including a high frequency band, rich spectrum resources, and wide bandwidth. In addition, the high frequency and short wavelength allow antennas to be shorter, making it easier to fit multiple antennas on mobile phones and other small devices. According to the formula speed of light = wavelength × frequency, the wavelength corresponding to a 28 GHz frequency is about 10.7 mm, which falls within the range of millimeter wave. Generally, the antenna length is proportional to the wavelength, preferably one-quarter or one-half of the wavelength. Therefore, the shorter wavelength of millimeter wave means shorter antennas.
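A quick Python check of the arithmetic behind this claim; the frequencies other than 28 GHz are just illustrative comparison points.

```python
# wavelength = c / frequency, and a quarter-wave antenna is roughly wavelength / 4.

C = 3e8  # speed of light in m/s

for f_ghz in (2.6, 3.5, 28.0):
    wavelength_mm = C / (f_ghz * 1e9) * 1000
    print(f"{f_ghz:5.1f} GHz -> wavelength {wavelength_mm:6.1f} mm, "
          f"quarter-wave antenna ~{wavelength_mm / 4:5.1f} mm")
```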

In a massive MIMO system, a base station can use large-scale antenna arrays to combine the millimeter wave with beamforming technology, effectively improving the antenna gain. However, due to the short wavelength of millimeter wave, signals carried by millimeter wave are vulnerable to external noise and attenuate to some degree. As a result, the signals are not likely to pass through buildings or obstacles, and can be absorbed by leaves and rainwater.

Massive MIMO and Beamforming

MIMO simply means using multiple antennas for data transmission and reception. It has been in use since as early as LTE (4G). At present, by using more advanced MIMO technology in combination with carrier aggregation and high-order modulation, the industry is able to boost LTE to gigabit speeds (1 Gbps and above), which is 10 times the original LTE speed.

Rather than trying to push a single channel past the Shannon limit, MIMO transforms signal processing from a single point-to-point channel into multiple parallel channels, making the number of parallel channels the decisive factor for spectral efficiency and thereby improving both the system capacity and the spectral efficiency.

As shown in the following figure, in the LTE and LTE-A systems, a base station has only a small number of antennas, and so does a mobile phone, because antennas for the medium and low frequency bands are still too large for a small phone to accommodate many of them. However, with the introduction of millimeter wave in 5G, antennas can become very small, allowing even a large number of antennas to be easily integrated on one device. A base station in a massive MIMO system can support up to 256 antennas.

To implement massive MIMO, the base station must accurately detect channel information and user equipment (UE) location, which is not difficult for time division duplexing (TDD) systems. However, this is quite tricky for frequency division duplexing (FDD) systems. TDD systems use the same frequency band for uplink and downlink transmission. Therefore, according to the reciprocity relationship between the uplink and downlink channels, the state of a downlink channel can be estimated based on that of the uplink channel. However, the FDD system uses different frequency bands for uplink and downlink transmission, so the preceding method is not applicable. To perform channel estimation, a large amount of channel state information (CSI) feedback has to be introduced. As the number of antennas increases, the overhead increases, and the accuracy and promptness of the feedback information may decrease. Therefore, the industry has always believed that massive MIMO is more difficult to deploy on FDD systems.

In fact, the first mention of intelligent antennas can be traced back to the 3G deployment in China. In time division-synchronous code division multiple access (TD-SCDMA), through digital signal processing technologies and adaptive algorithms, a base station enables an intelligent antenna to dynamically form directional beams for specific users within its coverage. Although TD-SCDMA was not highly developed, it did help major Chinese manufacturers accumulate experience in MIMO antenna and beamforming technologies. Other countries have been vigorously promoting the FDD system, even though the TDD system seems to offer indispensable advantages for massive MIMO.

In a field test conducted by China Mobile in Hangzhou, Huawei’s 5G solution was used end-to-end, from the chip to the core network. On the network side was Huawei’s 2.6 GHz New Radio (NR) system, which supports a large bandwidth of 160 MHz and 64T64R massive MIMO wireless devices. The NR system was connected to a core network centrally deployed in Beijing that supports the 5G standalone (SA) architecture. The UE side was a test UE based on the Huawei Balong 5000 chip. Note that the base station side used 64T64R, that is, 64 transmitter antennas and 64 receiver antennas, totaling 128 antennas.

MIMO technology evolved from single-user MIMO (SU-MIMO) to multi-user MIMO (MU-MIMO). SU-MIMO serves only a single UE. However, the UE is constrained by factors such as its actual number of antennas and its design complexity, which in turn limited the further development of SU-MIMO. By contrast, MU-MIMO combines multiple UEs for spatial multiplexing, and the antennas of multiple UEs are used simultaneously. In this way, a large number of base station antennas and UE antennas form a large-scale virtual MIMO channel system, which increases the overall network capacity. However, with so many antennas, signal crossover will inevitably lead to interference, which requires pre-processing and beamforming.

This kind of spatial multiplexing technology enables a shift from omnidirectional signal coverage to precise directional coverage, eliminating interference between beams. As a result, it can provide more communication links in the same space and greatly improve the service capacity of the base station.
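The following minimal Python sketch shows the core of beamforming with a uniform linear array: phase-aligning per-antenna weights toward one direction so that signals add coherently there and largely cancel elsewhere. The 64-element array, half-wavelength spacing, and the angles are illustrative assumptions.

```python
import numpy as np

# Minimal beamforming sketch: a uniform linear array of N antennas with
# half-wavelength spacing phase-aligns its weights toward a chosen angle,
# so signals add coherently in that direction and partially cancel elsewhere.

N = 64                      # antenna elements (assumed)
d = 0.5                     # element spacing in wavelengths
steer_deg = 20.0            # direction the beam points at (assumed)

def steering_vector(angle_deg):
    """Array response of the uniform linear array toward a given angle."""
    angle = np.deg2rad(angle_deg)
    n = np.arange(N)
    return np.exp(1j * 2 * np.pi * d * n * np.sin(angle))

w = steering_vector(steer_deg) / np.sqrt(N)   # normalized beamforming weights

for probe in (20.0, 25.0, 60.0):
    gain = np.abs(np.vdot(w, steering_vector(probe))) ** 2
    print(f"angle {probe:5.1f} deg -> relative array gain {gain:6.2f}")
```

Massive MIMO extends this idea to two-dimensional arrays and separate weight sets per user, which is what makes the directional coverage described above possible.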

Consider an example scenario: one omnidirectional base station (marked by the red dot in the following figure) sits on the edge of a square surrounded by dense buildings, and three UEs (marked by the red, green, and blue x’s) are distributed in different directions. When massive MIMO and precise beamforming are applied, the situation is transformed as follows. As you can see, the direction of the electromagnetic waves can be precisely controlled. The result may look simple, but a lot of advanced technology backs it up.

Image source: https://www.cnblogs.com/myourdream/p/10409985.html

Cognitive Radio

How did cognitive radio come into being? First and foremost, because of scarce resources. Spectrum resources in the low frequency band are very scarce and had already been allocated to several systems. However, these systems were not using their frequency spectra efficiently. Therefore, cognitive radio technology was considered in order to make full use of these spectra without affecting the main communication systems.

A cognitive radio can be thought of as radio that cognizes the surrounding environment and adjusts its actions accordingly. For example, a cognitive radio can determine an idle frequency band and then switch to that band for data transmission. The term “cognitive radio” was coined by Joseph Mitola III. It refers to intelligent radio that can sense the external environment, learn from history, and make intelligent decisions to adjust its transmission parameters based on the current environmental situation.

Cognitive radio is a combination of software-defined radio (SDR) and artificial intelligence (AI). We can think of the radio as being endowed with some human-like abilities, allowing it to cognize the outside world through observation and then decide whether and how to transmit data. There will be a lot of research and applications related to cognitive radio in 5G.
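A classic building block here is spectrum sensing. The sketch below uses simple energy detection in Python to decide whether a band looks idle before transmitting; the signal levels and the decision threshold are illustrative assumptions, not values from any standard.

```python
import numpy as np

# Minimal energy-detection sketch: a cognitive radio measures the energy in a
# candidate band and compares it with a threshold to decide whether the band
# is idle ("white space") before transmitting.

rng = np.random.default_rng(1)
n_samples = 1024
noise_power = 1.0   # assumed known noise floor

def band_is_idle(samples, threshold_db=2.0):
    """Return True if the measured energy is close to the noise floor."""
    energy_db = 10 * np.log10(np.mean(np.abs(samples) ** 2) / noise_power)
    return energy_db < threshold_db

# Case 1: only noise in the band -> looks idle, safe to transmit.
noise_only = rng.standard_normal(n_samples)
# Case 2: a primary (licensed) user is active -> energy well above the threshold.
occupied = noise_only + 3.0 * np.sin(2 * np.pi * 0.1 * np.arange(n_samples))

print("noise-only band idle?", band_is_idle(noise_only))
print("occupied band idle?  ", band_is_idle(occupied))
```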

Ultra-Reliable and Low Latency Communications (URLLC)

The theoretical latency in 5G is 1 ms, a small fraction of the latency of 4G, more or less reaching a quasi-real-time level. This low latency will naturally enable many applications. In fact, as its name implies, URLLC involves not only low latency but also high reliability. This combination will facilitate the implementation of technologies such as industrial automation and control, telemedicine, and autonomous driving, bringing about revolutionary changes and gradually making the impossible possible. Let’s take a look at how this will happen.

5GNR Frame Structure

What is 5GNR? It is actually an air interface standard for 5G, which was named “5G New Radio” by 3GPP. The air interface standard for 4G was called Long Term Evolution (LTE). Like LTE, 5GNR defines the length of one radio frame as 10 ms, and each radio frame is divided into ten 1 ms subframes. Also, each radio frame may be divided into two 5 ms half-frames, with the first half-frame including subframes #0 to #4, and the second half-frame including subframes #5 to #9. This part of the structure is unchanged.

Unlike LTE, the subcarrier spacing of 5GNR is no longer fixed at 15 kHz, but is variable and can support five configurations: 15 kHz, 30 kHz, 60 kHz, 120 kHz, and 240 kHz. So why can’t it be less than 15 kHz or greater than 240 kHz?

The lower limit of the subcarrier spacing is determined by phase noise and the Doppler effect, while the upper limit is determined by the cyclic prefix (CP). We would certainly prefer the smallest possible subcarrier spacing so that more data can be transmitted within a given bandwidth. However, if the subcarrier spacing is too small, phase noise will cause excessive signal errors, and eliminating that phase noise would place unreasonable demands on local crystal oscillators.

On the one hand, an excessively small subcarrier spacing makes physical layer performance susceptible to Doppler frequency offset. On the other hand, an excessively large subcarrier spacing leads to a shorter duration of the CP in the OFDM symbol. The CP is designed to eliminate delay spread to the maximum extent, in order to overcome the negative effects of multipath interference. The duration of the CP must be greater than the delay spread of a channel, otherwise it cannot overcome the multipath interference. Therefore, the range of 15 kHz to 240 kHz was selected by taking into account considerations such as the corresponding technology and implementation costs.

As shown in the following figure, the larger the subcarrier spacing, the shorter the timeslot. The smallest subcarrier spacing of 15 kHz corresponds to a timeslot of 1 ms, and the largest subcarrier spacing of 240 kHz corresponds to a timeslot of 0.0625 ms. In URLLC scenarios, a large subcarrier spacing can be configured to meet the low latency requirement.
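In 5GNR numerology, the subcarrier spacing is 15 kHz × 2^μ and each 1 ms subframe holds 2^μ slots, so the relationship above can be reproduced with a few lines of Python:

```python
# 5G NR numerology: subcarrier spacing = 15 kHz * 2^mu, and each 1 ms subframe
# contains 2^mu slots, so the slot duration shrinks as the spacing grows.

for mu in range(5):                      # mu = 0..4 -> 15 kHz ... 240 kHz
    scs_khz = 15 * 2 ** mu
    slot_ms = 1.0 / 2 ** mu
    print(f"mu={mu}: subcarrier spacing {scs_khz:3d} kHz -> slot duration {slot_ms:.4f} ms")
```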

The flexible frame design of 5GNR supports both scale-up and scale-down of the transmission time interval (TTI). That is, a longer or shorter TTI can be used as needed. 5GNR also supports multiplexing with different TTIs at the same frequency. For example, a mobile broadband service requiring a high quality of service (QoS) can choose to use a 500 µs TTI, instead of having to use the standard TTI of LTE. A latency-sensitive service can use a shorter TTI, let’s say 140 µs, instead of having to wait for 500 µs until the next subframe arrives. In other words, the two services can start at the same time after the previous transmission is completed, thereby reducing the waiting time.

Multi-carrier Technology Improvement

In an OFDM system, subcarriers are orthogonal to each other in the time domain, and their spectra overlap, so the spectrum utilization is high. The OFDM technology is generally applied to data transmission in wireless systems. In an OFDM system, interference between symbols occurs due to the multipath effect of wireless channels.

To eliminate intersymbol interference (ISI), guard intervals are inserted between symbols. A common method is to insert a zero interval between symbols, that is, to wait for a period of time (without sending any information) before sending the next symbol. In an OFDM system, although this can reduce or even eliminate ISI, it destroys the orthogonality between subcarriers, resulting in inter-carrier interference (ICI). Therefore, this method is not suitable for OFDM. To eliminate both ISI and ICI in an OFDM system, the CP is usually used as the guard interval. The CP is system overhead and does not carry valid data, thereby reducing the spectral efficiency.
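The sketch below shows the mechanics of cyclic-prefix insertion at an OFDM transmitter in Python. The FFT size, CP length, and QPSK mapping are illustrative assumptions rather than an LTE or NR configuration.

```python
import numpy as np

# Minimal sketch of cyclic-prefix insertion in an OFDM transmitter: the last
# cp_len samples of each IFFT output are copied to its front, so a multipath
# channel whose delay spread is shorter than the CP does not smear one symbol
# into the next. The CP carries no new data, so it is pure overhead.

n_fft = 64          # subcarriers per OFDM symbol (assumed)
cp_len = 16         # cyclic-prefix length in samples (assumed)

rng = np.random.default_rng(2)
# QPSK symbols on each subcarrier
data = (rng.choice([-1, 1], n_fft) + 1j * rng.choice([-1, 1], n_fft)) / np.sqrt(2)

time_symbol = np.fft.ifft(data)                                   # frequency -> time domain
with_cp = np.concatenate([time_symbol[-cp_len:], time_symbol])    # prepend the CP

overhead = cp_len / (n_fft + cp_len)
print(f"symbol length with CP: {with_cp.size} samples "
      f"({overhead:.0%} of the airtime is CP overhead)")
```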

The cyclic prefix OFDM (CP-OFDM) technology currently used in the LTE system can address multipath delay, but it is sensitive to frequency offset and time offset between adjacent sub-bands. This is mainly due to the large spectrum leakage of the system, which easily causes inter-band interference. At present, the LTE system uses guard intervals in the frequency domain, but this reduces spectral efficiency and increases latency. In light of this, some new waveform technologies are needed for the 5G system. The current CP-OFDM encounters challenges in machine-type communications (MTC) and short access scenarios. Polar delay services, burst and short frame transmission, and low-cost UE have large frequency deviations, which can adversely affect orthogonality. In multi-point cooperative communication scenarios, it is difficult to transmit and receive signals to and from multiple points.

There are currently several candidate technologies for improvement. The new waveform techniques proposed by various companies at the 3GPP meeting included CP-OFDM with Weighted Overlap and Add (CP-OFDM with WOLA), Filter Bank Multi-Carrier with Offset Quadrature Amplitude Modulation (FBMC-OQAM), Filter Bank OFDM (FB-OFDM), Universal Filtered Multi-Carrier (UFMC), Filtered OFDM (F-OFDM), and Generalized Frequency Division Multiplexing (GFDM). With so many technologies, this may seem complicated and difficult to understand, but it’s fine: you just need to know what problems they solve. If you want more details, you can search for related articles by using the tags.

Network Slicing

Crucial for 5G, network slicing technology has greatly liberated operators, thereby winning their favor. Conventionally, routers implement hard switching, and everything has to be configured online in advance, which is inconvenient to modify. Of course, if there were no need to process data packets on demand, this would not actually be a bad method, as it is fast and stable. However, the increasing demand for differentiated services requires faster and more efficient network management. The emergence of software-defined networking (SDN) solved this problem. SDN is an innovative network architecture and implementation of network virtualization that was initially proposed by the Clean Slate research group at Stanford University in the U.S.

When SDN transformation is used, there is no need to repeatedly configure the router of each node, and devices in the network are automatically connected. You only need to define simple network policies.

SDN uses a centralized controller to manage network devices instead of relying on underlying network devices such as routers, switches, and firewalls, thereby shielding the differences among those devices. Since control is completely open, users can customize any network routing and transmission policy, making the network more flexible and intelligent. The control plane and the data plane are separated, so different forwarding rules can be configured for different packet types or sources, allocating different service levels to data packets and implementing differentiated QoS.
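As a toy illustration of this control/data-plane split, the Python sketch below lets a “controller” install match-action rules that a “switch” merely applies. The field names, rules, and queue labels are made up for the example and do not reflect any real controller’s API.

```python
# Toy sketch of the SDN idea: the controller installs match -> action rules,
# and the data plane (switch) only applies them, so policy lives in one place.

flow_table = []   # rules pushed down by the controller, in priority order

def install_rule(match, action):
    """Controller side: push a forwarding rule to the switch."""
    flow_table.append((match, action))

def forward(packet):
    """Data plane side: apply the first matching rule, or drop by default."""
    for match, action in flow_table:
        if all(packet.get(k) == v for k, v in match.items()):
            return action
    return "drop"

# Centralized policy, e.g. a fast path for latency-sensitive traffic and a
# best-effort path for web traffic (hypothetical rules for illustration).
install_rule({"dst_port": 5060}, "queue:low_latency")
install_rule({"dst_port": 443}, "queue:best_effort")

print(forward({"src": "10.0.0.7", "dst_port": 5060}))   # queue:low_latency
print(forward({"src": "10.0.0.9", "dst_port": 8080}))   # drop
```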

The following figure helps you better understand SDN.

Massive Machine Type Communications (mMTC)

First, let’s take a look at the key performance indicators (KPIs) of mMTC: the connection density is 1,000,000 devices per square kilometer, and the battery life is 10 to 15 years under a maximum coupling loss (MCL) of 164 dB, which means the battery can last 10 to 15 years even when signals are extremely weak. Weaker signals mean higher transmission power and greater power consumption. Coverage enhancement is required to provide a rate of 160 bps at an MCL of 164 dB, with low UE complexity and cost.

The LTE for machine-to-machine (LTE-M) technology is an Internet of Things (IoT) technology based on LTE. It was named LTE enhanced MTC (eMTC) in 3GPP Release 13 and is designed on top of existing LTE carriers to meet the needs of IoT devices. Narrowband IoT (NB-IoT) is a combination of Narrowband Cellular IoT (NB-CIoT) and Narrowband LTE (NB-LTE). It is widely used in China.

What is the difference between eMTC and NB-IoT? As shown in the following table, each has its own advantages. For services with high requirements for voice, mobility, and speed, eMTC is the better choice. For services that focus more on cost and coverage than on those aspects, NB-IoT is preferable.

Each of the technologies has a target capacity of 50,000 connections per cell, and both use the power saving mode (PSM) and extended discontinuous reception (eDRX) mechanisms. In this way, devices are dormant most of the time, which reduces signaling interaction with the base station and indirectly increases the cell capacity; the increase in capacity is made possible by the long dormancy of devices. The eDRX cycle of NB-IoT is longer than that of eMTC, so its response to downlink data is slower.
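A rough, illustrative calculation in Python shows why this dormancy translates into multi-year battery life. Every current, duration, and the battery size below is an assumption for the sketch, not a figure from the NB-IoT or eMTC specifications.

```python
# Illustrative battery-life estimate for a PSM/eDRX device: it spends almost the
# whole day in deep sleep and wakes briefly to transmit, so average current is tiny.

BATTERY_MAH = 5000          # assumed battery capacity of a metering device
SLEEP_MA = 0.005            # assumed deep-sleep (PSM) current
ACTIVE_MA = 100.0           # assumed current while transmitting/receiving

def battery_years(active_s_per_day):
    sleep_s = 86400 - active_s_per_day
    mah_per_day = (ACTIVE_MA * active_s_per_day + SLEEP_MA * sleep_s) / 3600
    return BATTERY_MAH / mah_per_day / 365

print(f"30 s active per day : ~{battery_years(30):.1f} years")
print(f"10 min active per day: ~{battery_years(600):.1f} years")
```

With only tens of seconds of radio activity per day, the estimate lands in the 10-to-15-year range mentioned above; frequent wake-ups collapse it to well under a year.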

Since these two technologies can benefit different IoT scenarios in their own ways, some people say that the two are complementary and applicable to different IoT application scenarios.

For services that feature stationary devices and low data traffic, and that have low requirements for latency but high requirements for battery life, facility costs, and network coverage, such as water, power, and gas metering, street lights, manhole covers, and garbage cans, NB-IoT is the more suitable technology.

For services that have high requirements for data traffic, mobility and latency, such as elevators, smart wearable devices, and logistics tracking, eMTC is preferable technologically.

As shown in the following figure, 3GPP will address the Massive and Critical segments with 5G. The Massive segment refers to large-capacity IoT communications, while the Critical segment refers to high reliability and low latency. The relevant standards are still evolving. At present, China Telecom has taken the lead in NB-IoT network construction. According to statistics on online use, our business areas are generally covered.

5G Networks and Coverage

Spectrum Allocation in China

The results of 5G spectrum allocation in China have been released, which, I think, took the status of the operators into consideration. As marked in green in the following figure, China Telecom and China Unicom each got 100 MHz, and China Mobile got 260 MHz.

1. China Unicom and China Telecom were allocated the international mainstream 5G frequency band around 3.5 GHz, which has the following characteristics:

  • Mature industrial chain, complete research and development, and the greatest global practicability;
  • Fast development pace, and early commercialization;
  • Lower frequency, higher cost-effectiveness, lower base station density, and less capital expenditure.

2. China Mobile was allocated the combined spectrum of 2.6 GHz and 4.9 GHz, which has the following characteristics:

  • The 100 MHz of bandwidth in the 4.9 GHz band can support more users and larger traffic, but requires more base stations, which places pressure on capital expenditure;
  • The industrial chain of the 2.6 GHz spectrum is less mature, so China Mobile needs to actively cultivate and deploy it. However, this spectrum provides wide coverage at a lower cost, which gives a dual-band assurance to 5G commercial use.

3. The total bandwidth allocated to China Mobile is 260 MHz. However, part of the 2515–2675 MHz range overlaps spectrum previously allocated to 4G. Therefore, the newly allocated bandwidth is actually 200 MHz.

China Mobile already has a lot of Time Division LTE (TD-LTE) infrastructure at the 2.6 GHz (2575–2635 MHz) frequency band, which gives it an edge in 5G construction. On this basis, China Mobile can increase 5G coverage by upgrading its current infrastructure. However, it has to make the effort to cultivate the industrial chain because 2.6 GHz is currently not a mainstream 5G spectrum.

3.5 GHz is the mainstream frequency spectrum in China. With a relatively mature industrial chain, it is the focus of competition among operators.

Hotspot Coverage or Continuous Coverage?

Continuous outdoor coverage is feasible in the 2.6 GHz frequency band. However, its uplink coverage is weak because it is limited by user equipment capabilities and power. In terms of uplink coverage, the 2.6 GHz frequency band differs by 4 dB from the 1800 MHz frequency band and by more than 10 dB from the 800 MHz frequency band. The propagation loss of radio signals in free space follows a simple rule: a higher frequency means a greater propagation loss and a shorter propagation distance. The main factors in choosing between continuous coverage and hotspot coverage are investment cost and return on investment. If the propagation loss is great, more base stations are needed, increasing the cost. In this respect, continuous coverage is more feasible in the lower frequency band allocated to China Mobile.
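To see the frequency dependence of this rule, here is the standard free-space path loss formula in Python; the 1 km distance and the set of bands compared are just illustrative.

```python
import math

# Free-space path loss in dB: FSPL = 20*log10(d_km) + 20*log10(f_MHz) + 32.44.
# Higher bands lose more over the same distance, which is what drives
# base-station density and therefore deployment cost.

def fspl_db(distance_km, freq_mhz):
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

for f_mhz in (800, 1800, 2600, 3500):
    print(f"{f_mhz:4d} MHz over 1 km: {fspl_db(1.0, f_mhz):.1f} dB")
```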

According to conservative estimates, the number of 5G base stations (macro base stations) will be 1.2 to 1.5 times the number of 4G base stations. The 5G network operates in higher frequency bands, where the signal penetration of conventional macro base stations is reduced. In this case, small base stations or indoor distributed base stations can be a powerful complement in places such as crowded indoor environments, shopping malls, venues, and underground parking lots.

SA or NSA?

First of all, what are non-standalone (NSA) and standalone (SA) networks?

These two options are not difficult to understand, as shown in the following figure. There are two major paths for upgrading from 4G to 5G. Operators with deep pockets can choose to build a new set of 5G core networks and 5G base stations. Others may consider a transition, using the current 4G core networks to enjoy the new NR features brought by 5G base stations. This method will improve the NR rate, but some features of the 5G core network, such as network slicing, will be unavailable.

Many operators are not financially strong enough to deploy 5G core networks on a large scale, as their costs for 4G deployment have not yet been recovered. With this problem in mind, 3GPP provided a variety of NSA upgrade options to allow everyone to take part in 5G. Current 4G base stations may be unable to support the increased NR rate offered by 5G, so they will need some modifications.

In NSA networks, the 5GNR carrier carries only user data, while system-level service control still relies on 4G networks. In other words, new carriers are added to the current 4G networks for capacity expansion. The NSA architecture still relies on the core networks and control plane of the 4G system, so it cannot give full play to the low latency feature of the 5G system, nor can it provide flexible support for diverse business needs through features such as network slicing and mobile edge computing.

From a global perspective, most operators choose NSA networks in the early stages for the sake of fast deployment. However, such networks can only support the application of 5G to eMBB scenarios, not to URLLC and mMTC scenarios. In addition, the 5G NSA standard was frozen earlier than the SA standard, which is still in progress. Therefore, some 5G UE chips currently support only NSA. If we consider only the bandwidth, it is not a big problem for mobile phones to support only NSA.

Ultra-dense Network (UDN)

5G provides high-density networking capabilities for hotspot areas, such as large venues hosting events. In hotspot scenarios with high data traffic, the radio environment is complex, with constantly changing interference. In this case, an ultra-dense network (UDN) of base stations can improve the system spectral efficiency to a certain extent and quickly schedule radio resources. However, it also brings about many problems.

The high density of wireless access points may cause serious system interference. It can also lead to more frequent inter-cell handovers, which in turn greatly increases signaling consumption and reduces QoS. To implement rapid and flexible deployment of low-power small base stations, small base stations are required to be plug-and-play, including autonomous backhaul, and automatic configuration and management.

The following key technologies can be used to solve these problems:

  1. Multi-connection technology, which is designed to implement simultaneous connection between UE and multiple macro and micro wireless network nodes. In dual-connection mode, a macro base station serves as a primary base station and provides a unified control plane, while a micro base station serves as a secondary base station and only bears data on the user plane. The secondary base station does not connect UEs to the control plane. The radio resource control (RRC) entity corresponding to the UE exists only in the primary base station.
  2. Wireless backhaul technology. In the current network architecture, it is difficult to achieve fast, efficient and low-latency communication between base stations, and base stations have not yet reached the ideal state of plug-and-play. To make node deployment more flexible and ensure lower deployment costs, we may resort to wireless backhaul transmission by using the same spectrum and technology as the access link. In wireless backhaul, wireless resources serve UEs while providing relay services to nodes.
  3. Dynamic adjustment of small cells to maximize spectrum utilization. Occasional events such as exhibitions and football games may cause obvious traffic fluctuation and surging online sharing, which requires a large uplink capacity. For indoor venues, uplink/downlink (UL/DL) subframe ratio needs to be dynamically adjusted based on real-time traffic. For example, uplink-dominated configuration can be used to meet uplink video transmission requirements. In scenarios with high demand for downlink resources, such as movie and music downloads, a downlink resource proportion needs to be increased for transmission, for example, adjusting the D/U ratio from 3:1 to 8:1. In scenarios with high demand for uplink resources such as live broadcasts and video or audio content upload, the D/U ratio can be adjusted from 3:1 to 1:3. In addition, user groups under similar service types usually cluster, or even occupy entire cells. Therefore, in a deployment area, if user service demand for a period of time shows a stable and obvious feature, such as high demand for uplink services, uniform timeslot adjustment is required for the cells in the area.

Requirements for the communication experience in complex and diverse scenarios are increasingly high. To ensure that users obtain a consistent experience even in ultra-dense scenarios such as large events, outdoor gatherings, and concerts, 5G wireless networks need to support a 1,000-fold capacity gain and on the order of 100 billion connections in future hotspot scenarios with dense data traffic. In this respect, UDN can greatly increase the system frequency reuse efficiency and network capacity by increasing the density of base stations. Therefore, UDN will become the key solution for hotspot scenarios with dense data traffic. In the near future, the popularity of ultra high-definition, 3D, and immersive video will greatly increase data rates. With large amounts of personal and business data stored in the cloud, massive real-time data interaction will require a transmission rate as fast as that of optical fibers.

Summary

To summarize the content covered here, we have discussed the key technologies of 5G.

  1. The target peak rate of a single base station is 20 Gbps, and the target spectral efficiency is 3 to 5 times that of 4G. These are indicators for eMBB scenarios. The following technologies are primarily involved in this part: LDPC code and Polar code, used to increase capacity; millimeter wave, used to expand spectrum resources; NOMA, used to achieve multiplexing gain in the power domain; and massive MIMO, used to increase capacity. By virtue of its shorter wavelength, millimeter wave allows shorter antennas to be used, so that a mobile phone can accommodate more antennas and a base station can support a 64T64R array totaling 128 antennas.
  2. Latency is reduced to 1 ms. This is related to URLLC scenarios. The new air interface standard 5GNR defines a more flexible frame structure, which allows for a more flexible subcarrier spacing configuration. The maximum subcarrier spacing of 240 kHz corresponds to a timeslot of 0.0625 ms, making ultra-low latency applications possible. New multi-carrier technologies are used to reduce resource waste, such as the guard interval in the CP-OFDM system, thereby reducing latency and increasing utilization. In addition, network slicing technology can make networks more flexible, better support ultra-low latency applications, and establish an end-to-end high-speed channel. Network slicing is mainly applied to the SDN and network function virtualization (NFV) of core networks.
  3. The connection density reaches 1 million per square kilometer. This is related to mMTC scenarios. At present, the 5G standards are mainly based on eMTC and NB-IoT, each of which has its own advantages and disadvantages. eMTC is the better choice for services with high demands on traffic, mobility, and latency. NB-IoT is more suitable for scenarios that feature stationary devices and low data traffic, and that have low requirements for latency but high requirements for battery life, facility costs, and network coverage. Currently, NB-IoT coverage is dominant in China. The connection density mentioned here is actually an ideal value subject to change, because the increase in connection density depends heavily on the UE dormancy implemented by the PSM and eDRX mechanisms. Greater concurrency, lower network signaling consumption, more burst data packets, and other scenarios must be taken into account in the future. The development of connection density still has a long way to go.

Afterword

The artificial intelligence (AI) industry is booming, especially in the field of image recognition. It is undeniable that the convolutional neural network (CNN) and the deep neural network (DNN) have brought about tremendous changes in this field. However, AI is not limited to DNN, image recognition, or face recognition. AI technology needs breakthroughs in more aspects to build a more intelligent world.

AI’s breakthrough in the field of image has given the “intelligent world” eyes. Images that were previously unrecognizable to computers are gradually becoming structured and recognizable. Image recognition, image tracking, and image segmentation all make the frontend more intelligent. Progress in speech recognition has enabled the “intelligent world” to hear and understand us. The development of various sensing technologies will allow AI to gradually approach human perception of the physical world, such as touch and smell. All these advances will eventually converge into a brain for intelligent decision-making and command transmission and execution. 5G networks are gradually becoming the neural networks connecting various parts of the “intelligent world”. A fascinating future is beckoning us. I believe that Alibaba Cloud ET Brain will become an indispensable part of the “intelligent world” in that future.

The application of 5G in eMBB scenarios will definitely take the lead in development, because it is clearly defined and highly perceived by users. The application of 5G in URLLC and mMTC scenarios may need more integration with other scenarios, especially industrial applications and enterprise-oriented applications that operators actively participate in.



