Computer Network-Physical Layer-Channel Limit Capacity

The limit capacity of a channel refers to the highest symbol transmission rate, or the highest information transmission rate, that the channel can support.

Although a signal is inevitably distorted while being transmitted over a channel, as long as the receiving end can identify the original signal from the distorted waveform, the distortion does not affect the communication quality. For example, Figure (a) shows that although the signal is distorted after passing through the actual channel, the original symbols can still be recognized and recovered at the receiving end. Figure (b) is different: the distortion of the waveform is so severe that the receiving end cannot tell whether a symbol is a 1 or a 0. The higher the symbol transmission rate, the longer the transmission distance, the greater the noise interference, or the worse the quality of the transmission medium, the more serious the waveform distortion at the receiving end.

Conceptually, there are two factors that limit the transmission rate of symbols on the channel.

(1) The frequency range that the channel can pass.

The frequency range that a specific channel can pass is always limited, and many of the high-frequency components in a signal usually cannot pass through it. The transmitted signal shown in the figure above is a typical rectangular pulse signal, which contains rich high-frequency components. If these high-frequency components are attenuated during transmission, the leading and trailing edges of the waveform at the receiving end become less steep, and the time interval occupied by each symbol is no longer clear-cut but smears forward and backward. The received waveform thus loses the clear boundaries between symbols. This phenomenon is called intersymbol interference. Serious intersymbol interference makes a series of symbols that were originally clearly separated become blurred and unrecognizable. As early as 1924, Nyquist derived the famous Nyquist criterion: under the assumed ideal conditions, to avoid intersymbol interference, the limit symbol transmission rate is 2W baud, where W is the bandwidth of the ideal low-pass channel (in Hz).

If V represents the number of discrete levels of a symbol, that is, how many different symbols there are — for example, 16 different symbols require 4 binary bits, since log2 16 = 4 — then, because one symbol carries one bit in binary transmission, the data transmission rate (the number of binary bits transmitted per second, also known as the bit rate, in bit/s) with 16 levels is four times the symbol transmission rate. The limit data rate is therefore

Limit data transmission rate of an ideal low-pass channel = 2W log2 V  (bit/s)
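As a quick sketch of this formula (the 3000 Hz bandwidth and 16 levels below are illustrative values, not from the text):

```python
import math

def nyquist_limit(bandwidth_hz: float, levels: int) -> float:
    """Limit data rate of an ideal low-pass channel (Nyquist):
    C = 2 * W * log2(V), in bit/s."""
    return 2 * bandwidth_hz * math.log2(levels)

# Hypothetical example: a 3000 Hz channel with V = 16 signal levels.
# log2(16) = 4 bits per symbol, so C = 2 * 3000 * 4 = 24000 bit/s.
print(nyquist_limit(3000, 16))  # 24000.0
```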

For the Nyquist criterion, the following conclusions can be drawn:

1) In any channel, the transmission rate of symbols has an upper limit. If the transmission rate exceeds this upper limit, serious intersymbol interference will occur, making it impossible for the receiver to judge (that is, identify) the symbols.

2) The wider the channel's frequency band (that is, the more high-frequency components it can pass), the higher the rate at which symbols can be transmitted without intersymbol interference.

3) The Nyquist criterion limits the symbol transmission rate, but it does not limit the information transmission rate; that is, it does not limit how many binary bits one symbol can correspond to.

Because the symbol transmission rate is limited by the Nyquist criterion, improving the data transmission rate requires each symbol to carry as many bits of information as possible, which calls for multilevel (M-ary) modulation.

(2) Signal-to-noise ratio

Noise exists in all electronic devices and communication channels. Because noise is generated randomly, its instantaneous value is sometimes very large, so noise can cause the receiving end to misjudge a symbol (a 1 misjudged as a 0, or a 0 misjudged as a 1). But the influence of noise is relative: if the signal is strong, the influence of noise is small, so the signal-to-noise ratio is very important. The signal-to-noise ratio is the ratio of the average power of the signal to the average power of the noise, often written S/N and measured in decibels (dB). Namely:

Signal-to-noise ratio (dB) = 10 log10(S/N)    (2-1)

For example, when S/N = 10, the signal-to-noise ratio is 10 dB, while when S/N = 1000, the signal-to-noise ratio is 30 dB.
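The decibel conversion in equation (2-1) can be sketched as follows (the function name is ours, chosen for illustration):

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal-to-noise ratio in decibels: 10 * log10(S/N)."""
    return 10 * math.log10(signal_power / noise_power)

# Reproduces the two examples from the text:
print(snr_db(10, 1))    # 10.0 dB for S/N = 10
print(snr_db(1000, 1))  # 30.0 dB for S/N = 1000
```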

In 1948, Shannon, the founder of information theory, derived the famous Shannon formula. It gives the limit data transmission rate of a channel with limited bandwidth under Gaussian white-noise interference; when transmitting at or below this rate, error-free transmission is possible in principle. The Shannon formula states that the limit information transmission rate C of the channel is

C = W log2(1 + S/N)  (bit/s)    (2-2)

In equation (2-2), W is the bandwidth of the channel (in Hz), S is the average power of the signal transmitted in the channel, and N is the Gaussian noise power inside the channel.
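A minimal sketch of equation (2-2), assuming an illustrative 3000 Hz channel with S/N = 1000 (30 dB); these parameter values are not from the text:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_ratio: float) -> float:
    """Shannon limit: C = W * log2(1 + S/N), in bit/s.
    snr_ratio is the plain ratio S/N, not decibels."""
    return bandwidth_hz * math.log2(1 + snr_ratio)

# Hypothetical example: W = 3000 Hz, S/N = 1000.
c = shannon_capacity(3000, 1000)
print(round(c))  # about 29902 bit/s
```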

For Shannon's theory, the following conclusions can be drawn:

1) The greater the bandwidth of the channel or the signal-to-noise ratio in the channel, the higher the limit information transmission rate.

2) For a given transmission bandwidth and a given signal-to-noise ratio, the upper limit of the information transmission rate is determined.

3) As long as the information transmission rate is lower than the limit transmission rate of the channel, some method can be found to realize error-free transmission.

4) Shannon's theory gives the limit information transmission rate; the transmission rate achievable on an actual channel is much lower than this limit.

The Nyquist criterion considers only the relationship between bandwidth and the limit symbol transmission rate, while Shannon's theory considers both bandwidth and the signal-to-noise ratio. This shows, from another angle, that the number of binary digits one symbol can correspond to is limited.
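This cap on bits per symbol can be made concrete: sending at the Nyquist symbol rate 2W, Shannon's limit W log2(1 + S/N) allows at most log2(1 + S/N)/2 bits per symbol. A sketch (the S/N value below is illustrative):

```python
import math

def max_bits_per_symbol(snr_ratio: float) -> float:
    """At the Nyquist symbol rate 2W, Shannon's limit W*log2(1+S/N)
    caps the information per symbol at log2(1+S/N)/2 bits."""
    return math.log2(1 + snr_ratio) / 2

# Hypothetical example: with S/N = 63, each symbol can carry at most
# 3 bits, i.e. at most V = 2**3 = 8 distinguishable levels are useful.
print(max_bits_per_symbol(63))  # 3.0
```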

From what has been said above, it is not difficult to see: if the signal-to-noise ratio cannot be improved and the symbol transmission rate has already reached its upper limit, what can still be done to raise the information transmission rate? The answer is to use encoding so that each symbol carries more bits of information. A simple example illustrates this. Suppose our baseband signal is:

101011000110111010……

If transmitted directly, each symbol carries 1 bit of information. Now group the bits of the signal into three-bit groups: 101, 011, 000, 110, 111, 010. There are eight different arrangements of three bits, and we can use different modulation methods to represent them, for example 8 different amplitudes, 8 different frequencies, or 8 different phases. Suppose we use phase modulation: phase p0 represents 000, p1 represents 001, p2 represents 010, ..., p7 represents 111. In this way, the original signal of 18 symbols is converted into a signal consisting of six new symbols (every three bits form one new symbol):

101011000110111010…… = p5p3p0p6p7p2……

That is to say, if symbols are sent at the same rate, the amount of information transmitted in the same time is tripled. Since the publication of Shannon's formula, new signal coding and modulation methods have appeared constantly, all trying to approach the transmission rate limit given by the formula as closely as possible. The information transmission rate achievable on an actual channel is much lower than Shannon's limit, because in an actual channel the signal suffers other impairments, such as various kinds of pulse interference and distortion during transmission; these factors are not considered in the derivation of the Shannon formula.
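The grouping described above can be sketched in code: split the bit stream into 3-bit groups and map each group to a phase index p0..p7 (the function name is ours, chosen for illustration):

```python
def group_to_phases(bits: str, bits_per_symbol: int = 3) -> list:
    """Map a bit string to phase labels: each group of bits_per_symbol
    bits becomes 'p<k>', where k is the group's value in binary."""
    assert len(bits) % bits_per_symbol == 0, "length must divide evenly"
    return ["p%d" % int(bits[i:i + bits_per_symbol], 2)
            for i in range(0, len(bits), bits_per_symbol)]

# The 18-bit signal from the text becomes six 8-phase symbols.
print(group_to_phases("101011000110111010"))
# ['p5', 'p3', 'p0', 'p6', 'p7', 'p2']
```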