• Communication channel capacity. Throughput of information transmission systems

    In any communication system, information is transmitted through a channel. The information transfer rate was defined in § 2.9. This rate depends not only on the channel itself but also on the properties of the signal applied to its input, and therefore it cannot by itself characterize the channel as a means of transmitting information. Let us try to find a way to assess the ability of a channel to transmit information. Consider first a discrete channel through which v symbols from an alphabet of volume m are transmitted per unit time. When each symbol is transmitted, on average the following amount of information passes through the channel [see (2.135) and (2.140)]:

    I(B, B') = H(B) - H(B | B') = H(B') - H(B' | B),

    where B and B' are the random symbols at the input and output of the channel. Of the four entropies appearing here, H(B), the intrinsic information of the transmitted symbol, is determined by the source of the discrete signal and does not depend on the properties of the channel. The remaining three entropies depend, in general, on both the signal source and the channel.

    Let us imagine that symbols from different sources, characterized by different probability distributions P(B) (but, of course, with the same values of m and v), can be fed to the channel input. For each such source the amount of information transmitted through the channel takes its own value. The maximum amount of transmitted information, taken over all possible input-signal sources, characterizes the channel itself and is called the channel capacity. Per symbol,

    C_sym = max I(B, B'),

    where the maximization is carried out over all multidimensional probability distributions P(B). One can also define the channel capacity C per unit of time (per second):

    C = v C_sym = max v I(B, B').

    The last equality follows from the additivity of entropy. In what follows, unless specifically stated otherwise, we will understand by channel capacity the capacity per second.

    As an example, let us calculate the capacity of a symmetric channel without memory, for which the transition probabilities are given by formula (3.36). According to (3.52) and (3.53)

    The quantity H(B' | B) in this case is easily calculated, since the conditional transition probability takes only two values: p/(m - 1) if b' ≠ b, and 1 - p if b' = b. The first of these values occurs with probability p and the second with probability 1 - p. Moreover, since a memoryless channel is considered, the results of receiving individual symbols are independent of one another. Therefore

    Consequently, H(B' | B) does not depend on the probability distribution of B and is determined only by the transition probabilities of the channel. This property holds for all channel models with additive noise.

    Substituting (3.56) into (3.55), we obtain

    Since only the term H(B') on the right-hand side depends on the probability distribution P(B), it is this term that must be maximized. Its maximum value, according to (2.123), equals log m and is attained when all received symbols are equiprobable and mutually independent. It is easy to verify that this condition is satisfied if the input symbols are equiprobable and independent, since in that case

    At the same time

    Hence the capacity per second is

    C = v [log m + p log (p/(m - 1)) + (1 - p) log (1 - p)].

    For a binary symmetric channel the capacity in binary units per second is

    C = v [1 + p log2 p + (1 - p) log2 (1 - p)]. (3.59)

    The dependence of C on p according to (3.59) is shown in Fig. 3.9.

    At p = 0.5 the capacity of the binary channel is C = 0, since with such an error probability the sequence of output binary symbols can be obtained without transmitting any signals over the channel at all, simply by choosing them at random (for example, by tossing a coin); in other words, the sequences at the output and at the input of the channel are independent. The case C = 0 is called a channel break. The fact that the capacity at p = 1 in a binary channel is the same as at p = 0 (noise-free channel) is explained by the fact that at p = 1 it suffices to invert all output symbols (i.e., replace 0 with 1 and 1 with 0) in order to correctly restore the transmitted signal.

    Fig. 3.9. Dependence of the capacity of a binary symmetric memoryless channel on the probability of erroneous reception of a symbol
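    The curve in Fig. 3.9 is easy to reproduce numerically from (3.59). A minimal Python sketch (the function name and the choice v = 1 symbol/s are ours, purely for illustration):

```python
import math

def bsc_capacity(p, v=1.0):
    """Capacity of a binary symmetric memoryless channel, bit/s,
    for error probability p and symbol rate v, as in (3.59)."""
    if p in (0.0, 1.0):        # the entropy term vanishes at the endpoints
        return v * 1.0
    return v * (1.0 + p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

for p in (0.0, 0.01, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:4.2f}  ->  C = {bsc_capacity(p):.3f} bit/s")
# p = 0.5 gives C = 0 (channel break); p = 0 and p = 1 both give C = 1.
```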

    The capacity of a continuous channel is calculated similarly. Let, for example, the channel have a limited bandwidth of width F. Then the signals at the input and output of the channel are, according to Kotelnikov's theorem, determined by their samples taken at intervals of 1/(2F), and therefore the information passing through the channel during some time T is equal to the sum of the amounts of information transmitted for each such sample. The channel capacity per sample is

    Here the random variables are cross-sections of the processes at the input and output of the channel, and the maximum is taken over all admissible input signals, i.e., over all distributions of the input.

    The capacity C per second is defined as the sum of the per-sample values C_sample taken over all samples within one second. In this case, of course, the differential entropies in (3.60) must be calculated with allowance for the probabilistic dependences between the samples.

    Let us calculate, for example, the capacity of a continuous memoryless channel with additive white Gaussian noise, having a bandwidth of width F, if the average signal power (variance) does not exceed a given value P_s. Let the power (variance) of the noise in the band F be P_n. The samples of the input signal, the output signal and the noise are related by the equality

    y = x + n. (3.61)

    Since n has a normal distribution with zero mathematical expectation, the conditional probability density w(y | x) for fixed x will also be normal, with mathematical expectation x and variance P_n. Let us find the capacity per sample:

    According to (2.152), the differential entropy of a normal distribution does not depend on the mathematical expectation and equals (1/2) log 2πe σ² for variance σ²; hence h(y | x) = (1/2) log 2πe P_n. Therefore, to find the capacity per sample it remains to find the probability density w(y) that maximizes h(y). From (3.61), taking into account that x and n are independent random variables, we have

    D(y) = P_s + P_n.

    Thus, the variance of y is fixed, being equal to P_s + P_n. According to (2.153), for a fixed variance the maximum differential entropy is provided by a normal distribution. From (3.61) it is clear that for a normal one-dimensional distribution of x the distribution of y will also be normal.
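    Collecting the steps above and using the standard expression for the differential entropy of a normal variable, the per-sample capacity (referred to below as (3.63)) can therefore be written as

    C_sample = max [h(y) - h(y|x)] = (1/2) log2 2πe(P_s + P_n) - (1/2) log2 2πe P_n = (1/2) log2 (1 + P_s/P_n). (3.63)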

    Turning to the capacity C per second, note that the information transmitted over several samples is maximal when the signal samples are independent. This can be achieved if the input signal is chosen so that its spectral density is uniform within the band F. As was shown earlier, samples separated by intervals that are multiples of 1/(2F) are then mutually uncorrelated, and for Gaussian quantities uncorrelatedness implies independence.

    Therefore, the capacity C (per second) can be found by adding the per-sample capacities (3.63) for the 2F independent samples transmitted per second:

    C = 2F · (1/2) log2 (1 + P_s/P_n) = F log2 (1 + P_s/P_n). (3.64)

    It is attained if the input signal is a Gaussian process with uniform spectral density within the frequency band F (quasi-white noise).

    From formula (3.64) it is clear that if the signal power were unlimited, the capacity would be infinite. The capacity is zero if the signal-to-noise ratio in the channel is zero. As this ratio grows, the capacity increases without bound, but slowly, owing to the logarithmic dependence.

    Relation (3.64) is often called Shannon's formula. It is important in information theory because it determines how the capacity of the continuous channel under consideration depends on such technical characteristics as the bandwidth and the signal-to-noise ratio. Shannon's formula indicates the possibility of exchanging bandwidth for signal power and vice versa. However, since C depends on F linearly and on P_s/P_n only logarithmically, it is usually impractical to compensate for a reduction in bandwidth by increasing the signal power; the reverse exchange of signal power for bandwidth is more efficient.
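    A small numerical sketch of this exchange (the bandwidth and signal-to-noise figures below are arbitrary illustrative values, not taken from the text):

```python
import math

def shannon_capacity(bandwidth_hz, snr):
    """Shannon's formula (3.64): C = F * log2(1 + Ps/Pn), bit/s."""
    return bandwidth_hz * math.log2(1.0 + snr)

F, snr = 3100.0, 1000.0                     # a telephone-like band, SNR ~ 30 dB
print(shannon_capacity(F, snr))             # about 30.9 kbit/s

# To keep the same C after halving the bandwidth, the required SNR must grow
# roughly quadratically (log2(1 + snr) has to double) -- which is why trading
# signal power for bandwidth is usually the more economical direction.
snr_needed = (1.0 + snr) ** 2 - 1.0
print(shannon_capacity(F / 2, snr_needed))  # same capacity, but SNR ~ 60 dB
```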

    5.1. Information transfer rate in a discrete communication system

    In a discrete communication system, in the absence of interference, the information at the output of the communication channel (the information transmission channel) completely coincides with the information at its input, so the information transmission rate is numerically equal to the performance of the message source:

    In the presence of interference, part of the source information is lost and the information transmission speed is lower than the source performance. At the same time, information about interference is added to the message at the channel output (Fig. 12).

    Therefore, in the presence of interference, what must be taken into account at the channel output is not all the information delivered by the source, but only the mutual information:

    R = I'(x, y), bit/s. (5.2)

    Based on formula (5.1) we have

    R = H'(x) - H'(x/y) = H'(y) - H'(y/x), (5.3)

    where H'(x) is the source performance;

    H'(x/y) is the unreliability of the channel (information loss) per unit time;

    H'(y) is the entropy of the output message per unit time;

    H'(y/x) = H'(n) is the entropy of the interference (noise) per unit time.

    The communication channel capacity (capacity of the information transmission channel) C is the maximum possible rate of information transmission over the channel:

    C = max R = max [H'(x) - H'(x/y)]. (5.4)

    In seeking the maximum, all possible sources at the channel input and all possible encoding methods are considered.

    Thus, the capacity of a communication channel is equal to the maximum performance of a source, placed at the channel input and fully matched to the characteristics of that channel, minus the information lost in the channel because of interference.

    In a channel without interference C = max H'(x), because H(x/y) = 0. When a uniform code with radix k, consisting of n elements of duration τ_e each, is used in a channel without interference,

    C = (n log2 k)/(n τ_e) = (1/τ_e) log2 k;

    at k = 2

    C = 1/τ_e bit/s. (5.5)

    To use the channel capacity efficiently, the channel must be matched with the information source at its input. Such matching is possible both for communication channels without interference and for channels with interference, on the basis of two theorems proved by C. Shannon.

    Theorem 1 (for a communication channel without interference):

    If the message source has entropy H (bits per symbol) and the communication channel has capacity C (bits per second), then the messages can always be encoded in such a way that they are transmitted over the channel at an average rate arbitrarily close to C, but not exceeding it.

    C. Shannon also proposed a method of such coding, which has been called statistical or optimal coding. The idea of this coding was later developed in the works of Fano and Huffman and is now widely used in practice for message compression.

    5.2. Capacity of a homogeneous symmetric communication channel

    In a homogeneous communication channel the conditional (transition) probabilities p(y1/x1) do not depend on time. The graph of states and transitions of a homogeneous binary communication channel is shown in Fig. 13.

    In this figure x1 and x2 are the signals at the input of the communication channel, and y1 and y2 are the output signals. If signal x1 was transmitted and signal y1 is received, the first signal (index 1) has not been distorted. If the first signal (x1) was transmitted and the second signal (y2) is received, the first signal has been distorted. The transition probabilities are shown in Fig. 13. If the channel is symmetric, the transition probabilities are pairwise equal.

    Let us denote p(y2/x1) = p(y1/x2) = p_e, the probability of distortion of a signal element, and p(y1/x1) = p(y2/x2) = 1 - p_e, the probability of correct reception of a signal element.

    In accordance with formulas (5.1) and (5.3),

    R = v [H(y) - H(y/x)].

    If the signals x1 and x2 have the same duration τ_e, then v = 1/τ_e. The channel capacity is then

    C = (1/τ_e) [max H(y) - H(y/x)]. (5.7)

    In this formula max H(y) = log2 k. For a binary channel (k = 2) max H(y) = 1, and the capacity formula takes the form


    C = (1/τ_e) [1 - H(y/x)]. (5.8)

    It remains to determine the conditional entropy H(y/x). For a binary source we have

    H(y/x) = - p_e log2 p_e - (1 - p_e) log2 (1 - p_e).


    Substituting this value of conditional entropy into (5.8), we finally obtain

    C = (1/τ_e) [1 + p_e log2 p_e + (1 - p_e) log2 (1 - p_e)]. (5.9)

    For a communication channel with k > 2

    C = (1/τ_e) [log2 k + p_e log2 (p_e/(k - 1)) + (1 - p_e) log2 (1 - p_e)] bit/s.

    Fig. 14 shows the dependence of the capacity of a binary channel on the error probability. For a communication channel with k > 2 the capacity is determined by the similar formula given above.

    In conclusion, let us look at one example. Let there be a binary source with a performance of 1000 bit/s (1000 binary elements per second).

    If the probability of element distortion is p_e = 0.01, it follows that out of 1000 signal elements transmitted in one second, on average 990 elements will be received without distortion and only 10 will be distorted. It would seem that the capacity in this case should be 990 bit/s. However, calculation using formula (5.9) gives a considerably smaller value (C = 919 bit/s). What is the matter here? The point is that we would obtain C = 990 bit/s only if we knew exactly which elements of the message had been distorted. Not knowing this (and in practice it is impossible to know) means that the 10 corrupted elements devalue the received message to such an extent that the capacity drops sharply.
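    The figures in this example can be checked directly against formula (5.9); a short verification script (ours, not part of the original text):

```python
import math

v = 1000      # binary elements per second
p_e = 0.01    # element error probability

H_err = -p_e * math.log2(p_e) - (1 - p_e) * math.log2(1 - p_e)
C = v * (1 - H_err)
print(round(C))   # ~919 bit/s, not 990: the receiver does not know WHICH
                  # 10 of the 1000 elements were corrupted, and that
                  # uncertainty costs v * H_err ~ 81 bit/s.
```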

    Another example: if p_e = 0.5, then out of 1000 transmitted elements 500 will arrive undistorted. However, the capacity will not be 500 bit/s, as one might expect; formula (5.9) gives C = 0. Indeed, at p_e = 0.5 the signal in effect no longer passes through the communication channel, and the channel is simply equivalent to a noise generator.

    As p_e tends to 1, the capacity again approaches its maximum value; in this case, however, the signals at the output of the communication system must be inverted.


    In Fig. 1 the following designations are adopted: X, Y, Z, W are signals and messages; f is the interference; PM is the communication line; AI and PI are the source and the receiver of information; P are converters (coding, modulation, decoding, demodulation).

    There are various types of channels, which can be classified according to different criteria:

    1. By type of communication line: wired; cable; fiber-optic; power lines; radio channels, etc.

    2. By the nature of the signals: continuous; discrete; discrete-continuous (the signals at the input of the system are discrete and at the output continuous, or vice versa).

    3. In terms of noise immunity: channels without interference; with interference.

    Communication channels are characterized by:

    1. Channel volume (capacity), defined as the product of the channel usage time T_k, the width of the frequency band passed by the channel F_k, and the dynamic range D_k, which characterizes the channel's ability to transmit different signal levels:

    V_k = T_k F_k D_k. (1)

    The condition for matching the signal to the channel is

    V_c ≤ V_k; T_c ≤ T_k; F_c ≤ F_k; D_c ≤ D_k.

    2. Information transmission rate: the average amount of information transmitted per unit time.

    3. Channel capacity: the maximum possible rate of information transmission over the channel.

    4. Redundancy, which ensures the reliability of the transmitted information (R = 0…1).

    One of the tasks of information theory is to determine the dependence of the speed of information transmission and the capacity of a communication channel on the parameters of the channel and the characteristics of signals and interference.

    A communication channel can be figuratively compared to a road: narrow roads have low capacity but are cheap; wide roads provide good traffic capacity but are expensive. The overall capacity is determined by the bottleneck.

    The data transfer speed largely depends on the transmission medium in communication channels, which use different types of communication lines.

    Wired:

    1. Twisted pair (the twisting partially suppresses electromagnetic pickup from other sources). Transmission speed up to 1 Mbit/s. Used in telephone networks and for data transmission.

    2. Coaxial cable. Transmission speed 10–100 Mbit/s; used in local area networks, cable television, etc.

    3. Fiber-optic cable. Transmission speed on the order of 1 Gbit/s.

    In media 1–3 the attenuation in dB depends linearly on distance, i.e., the power drops exponentially. Therefore regenerators (amplifiers) must be installed at certain intervals along the line.
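    As a rough illustration of why linear attenuation in dB forces periodic regeneration, here is a small sketch; the attenuation figure, transmitter power and receiver sensitivity are assumed values chosen for the example, not data from the text:

```python
# Linear attenuation in dB per km <=> exponential decay of power with distance.
attenuation_db_per_km = 0.2     # typical order of magnitude for optical fibre (assumed)
tx_power_dbm = 3.0              # transmitter output level, dBm (assumed)
rx_sensitivity_dbm = -28.0      # minimum detectable level, dBm (assumed)

max_span_km = (tx_power_dbm - rx_sensitivity_dbm) / attenuation_db_per_km
print(max_span_km)              # ~155 km between regenerators in this example
```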

    Radio lines:

    1. Radio channel. Transmission speed 100–400 kbit/s. Uses radio frequencies up to 1000 MHz. Below 30 MHz, owing to reflection from the ionosphere, electromagnetic waves can propagate beyond the line of sight, but this range is very noisy (it is used, for example, for amateur radio communication). From 30 to 1000 MHz the ionosphere is transparent and line-of-sight propagation is required, so antennas are installed at a height (and regenerators are sometimes installed). Used in radio and television broadcasting.

    2. Microwave lines. Transmission speeds up to 1 Gbit/s. Radio frequencies above 1000 MHz are used. Line of sight and highly directional parabolic antennas are required. The distance between regenerators is 10–200 km. Used for telephone communication, television and data transmission.

    3. Satellite communications. Microwave frequencies are used, and the satellite serves as a regenerator (for many stations). The characteristics are the same as for microwave lines.

    2. Capacity of a discrete communication channel

    A discrete channel is a set of facilities intended for the transmission of discrete signals.

    Communication channel capacity is the highest theoretically achievable information transmission rate, provided that the error does not exceed a given value. Information transmission rate is the average amount of information transmitted per unit time. Let us derive expressions for calculating the information transmission rate and the capacity of a discrete communication channel.

    When transmitting each symbol, on average, an amount of information passes through the communication channel, determined by the formula

    I(Y, X) = I(X, Y) = H(X) - H(X/Y) = H(Y) - H(Y/X), (2)

    where I(Y, X) is the mutual information, i.e., the amount of information contained in Y about X; H(X) is the entropy of the message source; H(X/Y) is the conditional entropy, which determines the loss of information per symbol caused by interference and distortion.

    When a message X_T of duration T, consisting of n elementary symbols, is transmitted, the average amount of transmitted information, taking into account the symmetry of mutual information, equals

    I(Y_T, X_T) = H(X_T) - H(X_T/Y_T) = H(Y_T) - H(Y_T/X_T) = n I(Y, X). (4)

    The information transmission rate depends on the statistical properties of the source, the encoding method and the properties of the channel.

    The capacity of a discrete communication channel is

    C = max_{p(x)} V I(X, Y). (5)

    The maximum possible value, i.e., the maximum of this functional, is sought over the entire set of probability distribution functions p(x).

    The capacity depends on the technical characteristics of the channel (the speed of the equipment, the type of modulation, the level of interference and distortion, etc.). The capacity is measured in bits per second and its multiples (kbit/s, Mbit/s, Gbit/s).

    2.1 Discrete communication channel without interference

    If there is no interference in the communication channel, then the input and output signals of the channel are connected by an unambiguous, functional relationship.

    In this case, the conditional entropy is equal to zero, and the unconditional entropies of the source and receiver are equal, i.e. the average amount of information in a received symbol relative to the transmitted one is


    I (X, Y) = H(X) = H(Y); H(X/Y) = 0.

    If X_T is the number of symbols transmitted in time T, then the information transmission rate for a discrete communication channel without interference is

    R = V H(X), (6)

    where V = 1/τ is the average transmission rate of one symbol and τ is the symbol duration.

    The capacity of a discrete communication channel without interference is

    C = V max H(X). (7)

    Since the maximum entropy corresponds to equiprobable symbols, the capacity for a uniform distribution and statistical independence of the transmitted symbols is

    C = V log2 m. (8)

    Shannon's first theorem for a channel: if the flow of information generated by the source is sufficiently close to the capacity of the communication channel, i.e.

    H'(x) = C - ε, where ε is an arbitrarily small quantity,

    then a coding method can always be found that ensures the transmission of all the source messages at an information transmission rate arbitrarily close to the channel capacity.

    The theorem does not answer the question of how to carry out coding.

    Example 1. The source produces 3 messages with probabilities:

    p1 = 0.1; p2 = 0.2; p3 = 0.7.

    The messages are independent and are transmitted with a uniform binary code (m = 2), the symbol duration being t = 1 ms. Determine the information transmission rate over a communication channel without interference.

    Solution. The source entropy is

    H = -(0.1 log2 0.1 + 0.2 log2 0.2 + 0.7 log2 0.7) ≈ 1.16 bit per message.

    To transmit 3 messages with a uniform code, two digits are required, and the duration of the code combination is 2t.

    The average message rate is

    V = 1/(2t) = 500 1/s.

    The information transmission rate is

    C = V H = 500 × 1.16 = 580 bit/s.
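    A quick check of this example (the script is ours):

```python
import math

probs = [0.1, 0.2, 0.7]
H = -sum(p * math.log2(p) for p in probs)   # entropy per message, ~1.157 bit

t = 1e-3                                    # duration of one binary symbol, s
V = 1 / (2 * t)                             # 2 symbols per message -> 500 messages/s
print(round(H, 2), round(V * H))            # 1.16 bit and ~578 bit/s
                                            # (the text rounds H to 1.16 and obtains 580)
```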

    2.2 Discrete communication channel with interference

    We will consider discrete communication channels without memory.

    A memoryless channel is a channel in which each transmitted signal symbol is affected by interference independently of which signals were transmitted before it; that is, the interference does not create additional correlation between symbols. The name "memoryless" means that with each new transmission the channel does not, as it were, remember the results of the previous transmissions.

    Consider the communication channel shown in Fig. 5-1. A signal x(t) is applied to its transmitting end and arrives at the receiver input in the form y(t), distorted by the noise n(t) [L. 47, 53]. Let us introduce the concept of channel capacity. The capacity of a communication channel is defined as the maximum value of the mutual information of the output signal with respect to the input:

    where I(x, y) is the mutual information given by formula (7-8), and all signals are regarded as equivalent discrete ones (Fig. 7-1), so that


    The quantity V is sometimes called the rate of information transmission over the communication channel; it equals the amount of mutual information transmitted per unit time. For a discrete communication channel it is convenient to take the transmission time of one symbol as the unit of time; in this case the formulas for the transmission rate involve the entropy and the amount of information per symbol. For continuous communication channels two units of measurement are used: either a conventional unit of time (for example, a second) or the time interval between samples; in the latter case the formulas involve differential entropies per sample (or per degree of freedom). Textbooks often do not state explicitly which of the two units is used. In this connection, a different formula is often used for the average information transmission rate:


    where N = 2 f_c t_0. If the samples are independent, then V = I_1(x, y). Obviously, using the quantity V, the communication channel capacity can be determined by the formula


    For the entropy of the noise we can write

    H(n) = 2 f_c t_0 H_1(n),

    where H_1(n) = (1/2) log 2πe σ_n² is the noise entropy per sample for normal noise.

    Similar formulas can be written for the normal signals x and y.

    Per sample, formula (7-10) can be written as

    The meaning of this definition requires clarification. Note that the maximum here is taken over the set of probability distributions of the input signals for a fixed noise distribution, which is assumed to be given. In a particular case this set may consist of a single normal distribution, as is often assumed.

    If the capacity of one communication channel is greater than that of another (C_1 > C_2) under otherwise identical conditions, then physically this means that in the first case the joint probability distribution density of the input and output signals is greater than in the second, since from formula (7-11) it is easy to see that the capacity is determined mainly by this joint probability density. If the mutual information (or entropy) of the output signal relative to the input signal is greater, then the channel has the greater capacity. It is clear that if the noise increases, the capacity decreases.

    If the probabilistic connection between the output and input signals disappears, then

    p(x, y) = p(x) p(y),

    and in formula (7-11) the logarithm, and hence the capacity, becomes equal to zero.

    The other case, when

    p(x, y) = p(x|y) p(y)

    tends to zero, requires detailed consideration, since log p(x, y) tends to -∞. If p(y) → 0, then


    The reasoning can be continued as follows. Since the probability of the appearance of the output signal tends to zero, we may assume that the probability of the appearance of the signal x does not depend on y, i.e.

    p(x|y)=p(x)


    In this case the capacity is zero, which agrees with the physical interpretation: if neither the useful signal x(t) nor the noise n(t) appears at the output of the communication channel, this means that there is a break ("plug") in the channel. In all other cases the capacity is nonzero.

    It is natural to define the capacity of a communication channel so that it does not depend on the input signal. For this purpose the maximization operation was introduced, which, in accordance with the extremal properties of entropy, most often singles out an input signal with a normal distribution law. Let us show that if x(t) and n(t) are independent and y(t) = x(t) + n(t), then

    I(x, y) = H(y) - H(n), (7-12)

    where H(y) and H(n) are the differential entropies of the received signal and of the noise. Condition (7-12) means that the communication channel is linear in the sense that the noise is simply added to the signal as a term. It follows directly from

    I(x, y) = H(x) - H(x|y) = H(y) - H(y|x).

    Since x and n are statistically independent,

    H(y|x) = H(n).

    Substituting this relation into the previous expression, we obtain (7-12). Obviously, if the noise is additive and does not depend on the input signal, then the maximum rate of message transmission over the communication channel (the capacity) is attained at max H(y), since

    Let us consider a Gaussian communication channel under the following assumptions: the channel bandwidth is limited by the frequency f_c; the noise in the channel is normal white noise with average power S_n per unit band; the average useful signal power is P_x; the signal and the noise are statistically independent; the output signal is equal to the sum of the useful signal and the noise.

    Obviously, in accordance with formula (7-4), the capacity of such a channel will be determined as

    H(n) = f_c log 2πe S_n f_c. (7-14)

    Since the signal and noise are statistically independent, they are not correlated with each other, therefore the average power of the total signal

    P_y = P_x + S_n f_c = P_x + P_n.

    In accordance with formula (7-13), it is necessary to find the maximum entropy of the signal y(t) per sample at a given average power. By the extremal properties of entropy (see Chapter 6), the signal y(t) should be normally distributed. White noise in the band f_c is equivalent to a signal in the same band with spectral density S if their average powers are equal, i.e.


    Indeed, for a normal signal the formula for the entropy per sample has been proved earlier.
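    The derivation evidently concludes by combining the per-sample entropies; a sketch of the remaining steps, using the standard formula for the entropy per sample of a normal variable (notation as above), is

    H_1(y) = (1/2) log 2πe P_y, H_1(n) = (1/2) log 2πe P_n,

    C = 2 f_c [max H_1(y) - H_1(n)] = f_c log ((P_x + P_n)/P_n) = f_c log (1 + P_x/(S_n f_c)),

    which is again Shannon's formula for the Gaussian channel.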

    Throughput of information transmission systems

    One of the main characteristics of any information transmission system, in addition to those listed above, is its throughput.

    Capacity is the maximum possible amount of useful information transmitted per unit time:

    c = max(I_max) / T_C, [c] = bit/s.

    Sometimes the information transmission rate is defined as the maximum amount of useful information per elementary signal:

    s = max(I_max) / n, [s] = bit/element.

    The considered characteristics depend only on the communication channel and its characteristics and do not depend on the source.

    Capacity of a discrete communication channel without interference. In a communication channel without interference, information can be transmitted by a non-redundant signal. In this case the number n = m, and the entropy of an elementary signal is H_Cmax = log2 K.

    max(I_C) = n H_Cmax = m H_Cmax.

    The duration of an elementary signal is τ, determined by the width of the signal spectrum F_C.

    The capacity of a communication channel without interference is

    c = max(I_C) / T_C.

    Let us introduce the rate at which the information source generates elementary signals:

    B = 1/τ.

    Then, using this new notion, the formula for the information transmission rate can be transformed into

    c = B H_Cmax = B log2 K.

    The resulting formula determines the maximum possible rate of information transmission in a discrete communication channel without interference. It follows from the assumption that the entropy of the signal is maximal. If H_C < H_Cmax, then c = B H_C and is not the maximum possible for the given communication channel.

    Capacity of a discrete communication channel with interference. In a discrete communication channel with noise, the situation shown in Fig. 6 arises.

    Taking into account the property of additivity, as well as Shannon's formulas for the amount of information discussed above, we can write

    I_C = T_C F_C log(A_K P_C),

    I_POM = T_P F_P log(A_P P_P).

    For the recipient, the source of useful information and the source of interference are equivalent; therefore, on the receiving side it is impossible to separate the interference component from the signal carrying the resulting information:

    I_RES = T_C F_C log(A_K (P_P + P_C)), if T_C = T_P, F_C = F_P.

    The receiver may be narrowband, and the interference may be in other frequency ranges. In this case, it will not affect the signal.

    We determine the resulting signal for the most "unpleasant" case, when the signal and interference parameters are close to each other or coincide. The useful information is then determined by the expression

    I_POL = I_RES - I_POM = T_C F_C log((P_P + P_C)/P_P),

    so that the transmission rate is

    c = I_POL / T_C = F_C log(1 + P_C/P_P).

    This formula was obtained by Shannon. It determines the rate of information transmission over a communication channel when the signal has power P_C and the interference has power P_P. All messages at this rate will be transmitted with absolute reliability. The formula does not answer the question of how such a rate can be achieved, but it gives the maximum possible value of c in a communication channel with interference, i.e., the transmission rate at which the received information remains absolutely reliable. In practice it is more economical to allow a certain number of errors in the message, whereby the transmission rate can be increased.

    Consider the case P_C >> P_P. If we introduce the signal-to-noise ratio ρ = P_C/P_P, then P_C >> P_P means that ρ >> 1. In this case

    c ≈ F_C log(P_C/P_P).

    The resulting formula reflects the maximum rate for a strong signal in the communication channel. If P_C << P_P, then c tends to zero: the signal is received against a background of interference, and in such a channel practically no information can be obtained per unit time. In real situations the interference cannot be filtered out completely, so the receiver obtains the useful information together with a certain set of erroneous symbols. The communication channel for this situation can be represented as shown in Fig. 7, taking the information source as the set of transmitted symbols {X} and the receiver as the set of received symbols {Y}.

    Fig. 7. Graph of transition probabilities of a K-ary communication channel

    There is a certain correspondence between the transmitted symbols {X} and the received symbols {Y}. If there is no interference, the probability of a one-to-one correspondence is equal to one; otherwise it is less than one.

    If q_i is the probability of receiving y_i when x_i is transmitted, and p_ij = p(y_j / x_i) are the error probabilities, then

    q_i = 1 - Σ_{j ≠ i} p_ij.

    The transition probability graph reflects the final result of the influence of interference on the signal. As a rule, it is obtained experimentally.

    The useful information can be estimated as I_POL = n H(X·Y), where n is the number of elementary symbols in the signal and H(X·Y) is the mutual entropy of source X and receiver Y.

    In this case, source X is the source of useful information, and source Y is the receiver. The relationship that determines useful information can be obtained based on the meaning of mutual entropy: the shaded section of the diagram determines the messages transmitted by source X and received by receiver Y; unshaded areas represent signals from source X that did not reach the receiver and extraneous signals received by the receiver that were not transmitted by the source.
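    For a channel specified by such a transition-probability graph, the mutual entropy H(X·Y) can be computed directly from the input distribution and the transition probabilities p(y_j/x_i). A minimal sketch (the matrix values below are invented for illustration):

```python
import math

def mutual_information(p_x, p_y_given_x):
    """I(X;Y) = H(Y) - H(Y|X) in bits, for an input distribution p_x and a
    transition matrix p_y_given_x[i][j] = p(y_j | x_i)."""
    n_out = len(p_y_given_x[0])
    p_y = [sum(p_x[i] * p_y_given_x[i][j] for i in range(len(p_x)))
           for j in range(n_out)]
    H_y = -sum(p * math.log2(p) for p in p_y if p > 0)
    H_y_given_x = -sum(p_x[i] * p * math.log2(p)
                       for i, row in enumerate(p_y_given_x)
                       for p in row if p > 0)
    return H_y - H_y_given_x

# Binary channel with 1% element errors and equiprobable inputs (illustrative):
print(mutual_information([0.5, 0.5], [[0.99, 0.01], [0.01, 0.99]]))  # ~0.919 bit
```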

    The transmission rate is then c = B H(X·Y) = B [H(Y) - H(Y/X)], where B is the rate of generation of elementary symbols at the source output.

    To obtain the maximum, one should increase H(Y) and, as far as possible, decrease H(Y/X). Graphically this situation can be represented by bringing the circles in the diagram closer together (Fig. 2d).

    If the circles do not intersect at all, X and Y exist independently of each other. In the following we will show how the general expression for the maximum transmission rate can be used when analyzing specific communication channels.

    When characterizing a discrete channel, two concepts of speed are used: technical and information.

    The technical transmission rate R_T, also called the keying (manipulation) rate, is the number of symbols (elementary signals) transmitted over the channel per unit time. It depends on the properties of the communication line and on the speed of the channel equipment.

    Taking into account the differences in symbol duration, the technical rate is defined as

    R_T = 1/τ_av,

    where τ_av is the average symbol duration.

    The unit of measurement is the baud: one baud corresponds to the transmission of one symbol per second.

    The information rate, or information transmission rate, is determined by the average amount of information transmitted over the channel per unit time. It depends both on the characteristics of the particular channel (such as the size of the symbol alphabet used, the technical rate of their transmission, and the statistical properties of the interference in the line) and on the probabilities of the symbols arriving at the input and their statistical interdependence.

    For a known keying rate, the information transmission rate over the channel is given by the relation

    R = R_T Ī(X; Y),

    where Ī(X; Y) is the average amount of information carried by one symbol.



    For practice, it is important to find out to what extent and in what way the speed of information transmission over a specific channel can be increased. The maximum capabilities of a channel for transmitting information are characterized by its throughput.

    The channel capacity for given transition probabilities is equal to the maximum of the transmitted information over all input-symbol distributions of the source X:

    C = max_{p(x)} I(X; Y).

    From the mathematical point of view, the search for the capacity of a discrete memoryless channel reduces to the search for the probability distribution of the input symbols of source X that provides the maximum transmitted information, subject to the constraints on the input-symbol probabilities: p(x_i) ≥ 0, Σ_i p(x_i) = 1.

    In the general case the maximum under these constraints can be found by the method of Lagrange multipliers; however, such a solution is prohibitively laborious.

    In the particular case of discrete symmetric memoryless channels, the capacity (the maximum of I(X; Y)) is attained with a uniform distribution of the input symbols of the source X.
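    To see the maximization over p(x) at work, one can simply scan the input distribution numerically for a two-input channel; for a symmetric channel the maximum indeed falls at p = 1/2. A brute-force sketch (this is not the Lagrange method mentioned above, just an illustration with an assumed error probability):

```python
import math

def binary_mutual_information(p, eps):
    """I(X;Y) in bits for a binary symmetric channel with input distribution
    (p, 1-p) and error probability eps."""
    def h(q):  # binary entropy function
        return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)
    p_y1 = p * (1 - eps) + (1 - p) * eps   # probability of receiving the first symbol
    return h(p_y1) - h(eps)                # H(Y) - H(Y|X); H(Y|X) = h(eps) here

eps = 0.1
best = max(((k / 1000, binary_mutual_information(k / 1000, eps)) for k in range(1, 1000)),
           key=lambda t: t[1])
print(best)   # ~(0.5, 0.531), i.e. C = 1 - H(0.1) bit/symbol
```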

    Then, for a binary symmetric memoryless channel (BSC), taking the error probability ε as given and the input symbols equiprobable, p(x_1) = p(x_2) = p(y_1) = p(y_2) = 1/2, we obtain the capacity of such a channel from the well-known expression:

    C = 1 - H(ε) bit/symbol,

    where H(ε) = -ε log2 ε - (1 - ε) log2(1 - ε) is the entropy of a binary symmetric channel for the given error probability ε.

    Boundary cases are of interest:

    1. Transmission of information over a noiseless channel (without interference): ε = 0 and

    C = 1 bit/symbol.

    With the basic technical characteristics of the channel fixed (for example, the frequency band, the average and the peak transmitter power), which determine the value of the technical rate R_T, the capacity of the channel without interference equals C' = R_T bit/s.