How to calculate the amount of information in a message. A probabilistic approach to determining the amount of information: "Shannon's formula. Using Excel spreadsheets to solve problems of finding the amount of information"

    In order to compare different message sources and different communication lines and channels, it is necessary to introduce a quantitative measure that makes it possible to evaluate the information contained in a message and carried by a signal. Such a measure, the amount of information, was introduced by C. Shannon on the basis of the concept of choice, which allowed him to build a fairly general mathematical theory of communication.

    Let us consider the main ideas of this theory as applied to a discrete source that emits a sequence of elementary messages. Let us try to find a convenient measure of the amount of information contained in a message. The central idea of information theory is that this measure is determined not by the specific content of a given message, but by the fact that the source selects the given elementary message from a finite set. This idea is justified by the fact that on its basis a number of far-reaching and at the same time non-trivial results have been obtained that agree well with intuitive notions about the transfer of information. The most important of these results are presented below.

    So, if the source selects one elementary message ("letter") x_i from an alphabet of m letters, then the amount of information produced at its output depends not on the specific content of this element, but on how this choice is made. If the selected message element is predetermined, it is natural to assume that the information contained in it is zero. Therefore we will assume that a letter is chosen with a certain probability p. This probability may, generally speaking, depend on which sequence preceded the given letter. Let us accept that the amount of information J(p) contained in an elementary message is a continuous function of this probability, and let us try to determine the form of this function so that it satisfies some of the simplest intuitive notions about information.

    For this purpose, let us carry out a simple transformation of the message: we will treat each pair of "letters" produced in succession by the source as one enlarged "letter". We call this transformation an enlargement of the alphabet. The set of enlarged "letters" forms an alphabet of volume m^2, since after any element of the alphabet any other element can, generally speaking, be selected. Let p(x_i, x_j) be the probability that the source selects the elements x_i and x_j in succession. Then, treating the pair (x_i, x_j) as a letter of the new alphabet, it can be argued that this pair contains the amount of information J[p(x_i, x_j)].

    It is natural to require that the amount of information contained in a pair of letters satisfy the additivity condition, i.e., equal the sum of the amounts of information contained in each of the letters of the original alphabet. The information contained in the letter x_i is J[p(x_i)], where p(x_i) is the probability of choosing the letter x_i after all the letters that preceded it. To determine the information contained in the letter x_j, one must take into account the probability of choosing the letter x_j after the letter x_i, also allowing for all the letters that preceded x_i. We denote this conditional probability by p(x_j | x_i). Then the amount of information in the letter x_j is expressed by the function J[p(x_j | x_i)].

    On the other hand, by the rule of probability multiplication, the probability of choosing the pair of letters equals

    p(x_i, x_j) = p(x_i) · p(x_j | x_i).

    The requirement that the amount of information be additive under the alphabet-enlargement operation leads to the equality

    J[p(x_i) · p(x_j | x_i)] = J[p(x_i)] + J[p(x_j | x_i)].

    Let p(x_i) = p and p(x_j | x_i) = q. Then for any p and q in the interval (0, 1] the equation

    J(pq) = J(p) + J(q) (1.3)

    must hold.

    We exclude the cases p = 0 and q = 0 from consideration, since, owing to the finite number of letters in the alphabet, these equalities would mean that the source's choice of the pair of letters is an impossible event.

    Equality (1.3) is a functional equation from which the form of the function J can be determined. Let us differentiate both sides of equation (1.3) with respect to p:

    q · J′(pq) = J′(p).

    Let us multiply both sides of the resulting equation by p and introduce the notation ψ(x) = x · J′(x); then

    ψ(pq) = ψ(p). (1.4)

    This equation must hold for any p and all q not exceeding one. The last restriction is not essential, since equation (1.4) is symmetric with respect to p and q and therefore must be satisfied for any pair of positive values of the arguments not exceeding one. But this is possible only if both sides of (1.4) equal some constant k, whence

    p · J′(p) = k, i.e. J′(p) = k/p.

    Integrating the resulting equation, we find

    J(p) = k · ln p + C, (1.5)

    where C is an arbitrary integration constant.

    Formula (1.5) defines the class of functions that express the amount of information when a letter of probability p is chosen and that satisfy the additivity condition. To determine the integration constant, we use the condition stated above, according to which a predetermined message element, i.e., one with probability p = 1, contains no information. Therefore J(1) = 0, from which it immediately follows that C = 0. The constant k fixes the unit of measurement: taking k = -1 with natural logarithms, one unit of information is the information contained in the message that an event has occurred whose probability was 1/e (e is the base of natural logarithms). In general we obtain

    J(p) = - log p = log (1/p), (1.6)

    bearing in mind that the logarithm may be taken to any base, as long as the same base is maintained throughout the problem being solved.

    Thanks to the additivity property of information, expression (1.6) makes it possible to determine the amount of information not only in a single letter of a message, but in any message of any length. One need only take the probability of choosing this message from all possible ones, allowing for the previously selected messages.
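
    As a quick numerical check of this derivation, here is a minimal sketch in Python (the function name info is ours, not the source's) that evaluates J(p) with base-2 logarithms, so the unit is the bit, and verifies the additivity property (1.3):

```python
import math

def info(p: float) -> float:
    """Amount of information J(p) = -log2(p), in bits (formula (1.6), base 2)."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

p = 0.5   # probability of the first letter
q = 0.25  # conditional probability of the second letter given the first

print(info(p * q))        # 3.0 bits for the pair
print(info(p) + info(q))  # 1.0 + 2.0 = 3.0 bits: additivity holds
print(info(1.0))          # 0.0: a predetermined letter carries no information
```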


    Amount of information


    Introduction

    1. Bit

    2. Uncertainty, amount of information and entropy

    3. Shannon's formula

    4. Hartley's formula

    5. Amount of information received during the communication process



    Introduction

    According to A. D. Ursul's definition, "information is reflected diversity." The amount of information is a quantitative measure of diversity. This may be the diversity of the aggregate contents of a memory; the diversity of the signals perceived in the course of a specific message; the diversity of the outcomes of a particular situation; the diversity of the elements of a certain system... It is an assessment of diversity in the broadest sense of the word.

    Any message between a source and a receiver of information has a certain duration in time, but the amount of information perceived by the receiver as a result of the message is ultimately characterized not by the length of the message, but by the diversity of the signal generated in the receiver by this message.

    The memory of an information carrier has a certain physical capacity in which it is capable of accumulating images, and the amount of information accumulated in memory is ultimately characterized by the diversity of the filling of this capacity. For inanimate objects this is the diversity of their history; for living organisms it is the diversity of their experience.

    1. Bit

    Diversity is essential in conveying information. You cannot paint white on white; a single state is not enough. If a memory cell is capable of being in only one (initial) state and cannot change its state under external influence, it cannot perceive or remember information. The information capacity of such a cell is 0.

    Minimal diversity is ensured by the presence of two states. If a memory cell is capable, depending on external influence, of taking one of two states, which are usually designated “0” and “1”, it has minimal information capacity.

    The information capacity of one memory cell, capable of being in two different states, is taken as a unit of measurement for the amount of information - 1 bit.

    1 bit (bit is an abbreviation of the English binary digit) is the unit of measurement of information capacity and of the amount of information, as well as of one more quantity, information entropy, which we will become acquainted with later. The bit is one of the most unconditional units of measurement. While the unit of length could be chosen arbitrarily (cubit, foot, meter), the unit of information could not, in essence, be anything else.

    At the physical level, a bit is a memory cell that at any given moment is in one of two states: "0" or "1".

    If each pixel of an image can be only black or white, such an image is called a bitmap, because each pixel is a memory cell with a capacity of 1 bit. A light bulb that can be either "on" or "off" also symbolizes a bit. A classic example illustrating 1 bit of information is the amount of information obtained from tossing a coin: "heads" or "tails".

    An amount of information equal to 1 bit can be obtained in the answer to a "yes"/"no" question. If there were initially more than two answer options, the amount of information received in a particular answer will be more than 1 bit; if there are fewer than two answer options, i.e. one, then it is not a question but a statement, and no information needs to be obtained, since there is no uncertainty.

    The information capacity of a memory cell capable of receiving information cannot be less than 1 bit, but the amount of information received can be less than 1 bit. This occurs when the answer options “yes” and “no” are not equally likely. The inequality, in turn, is a consequence of the fact that some preliminary (a priori) information on this issue is already available, obtained, for example, on the basis of previous life experience. Thus, in all the arguments in the previous paragraph, one very important caveat should be taken into account: they are valid only for the equally probable case.

    We will denote the amount of information by the symbol I and probability by the symbol P. Recall that the total probability of a complete group of events is equal to 1.

    2. Uncertainty, amount of information and entropy

    The founder of information theory, Claude Shannon, defined information as the removal of uncertainty. More precisely, obtaining information is a necessary condition for removing uncertainty. Uncertainty arises in a situation of choice. The task solved in the course of removing uncertainty is to reduce the number of options under consideration (to reduce diversity) and ultimately to choose, from among the possible options, the one appropriate to the situation. Removing uncertainty makes it possible to make informed decisions and to act. This is the controlling role of information.

    A situation of maximum uncertainty presupposes the presence of several equally probable alternatives (options), i.e. no option is preferable. The more equally probable options there are, the greater the uncertainty, the harder it is to make an unambiguous choice, and the more information is required to make it. For N options, this situation is described by the probability distribution (1/N, 1/N, …, 1/N).

    The minimum uncertainty is 0: this is a situation of complete certainty, meaning that the choice has been made and all the necessary information has been received. The probability distribution for a situation of complete certainty looks like this: (1, 0, …, 0).

    The quantity characterizing the amount of uncertainty is denoted in information theory by the symbol H and is called entropy, more precisely, information entropy.

    Entropy (H) is a measure of uncertainty expressed in bits. Entropy can also be regarded as a measure of the uniformity of the distribution of a random variable.

    Figure 1 shows the behavior of entropy for the case of two alternatives as the ratio of their probabilities (p, 1-p) changes.

    In this case entropy reaches its maximum value when both probabilities are equal to each other, i.e. equal to 1/2; the zero value of entropy corresponds to the cases (p_0 = 0, p_1 = 1) and (p_0 = 1, p_1 = 0).
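
    This behavior is easy to verify numerically. A small sketch (the helper h_binary is illustrative, not from the text) tabulates the entropy of two alternatives:

```python
import math

def h_binary(p: float) -> float:
    """Entropy of two alternatives with probabilities (p, 1 - p), in bits."""
    if p in (0.0, 1.0):
        return 0.0  # complete certainty: zero entropy
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
    print(f"p = {p:4.2f}   H = {h_binary(p):.3f} bits")
# H rises to its maximum of 1 bit at p = 0.5 and falls to 0 at p = 0 and p = 1.
```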

    The amount of information I and entropy H characterize the same situation, but from qualitatively opposite sides. I is the amount of information that is required to remove the uncertainty H. According to Leon Brillouin’s definition, information is negative entropy (negentropy).

    When the uncertainty is completely removed, the amount of information received I is equal to the initially existing uncertainty H.

    When uncertainty is partially removed, the amount of information received and the remaining unresolved uncertainty add up to the original uncertainty: I_t + H_t = H.

    For this reason, the formulas presented below for calculating the entropy H are also formulas for calculating the amount of information I; when complete removal of uncertainty is meant, H in them can be replaced by I.

    3. Shannon's formula

    In the general case, the entropy H and the amount of information I obtained as a result of removing uncertainty depend on the initial number of options under consideration, N, and on the a priori probabilities of realization of each of them, P = (p_0, p_1, …, p_N-1); that is, H = F(N, P). Entropy is calculated in this case using Shannon's formula, proposed by him in 1948 in the article "A Mathematical Theory of Communication".

    In the special case, when all options are equally probable, the dependence remains only on the number of options considered, i.e. H=F(N). In this case, Shannon's formula is significantly simplified and coincides with Hartley's formula, which was first proposed by the American engineer Ralph Hartley in 1928, i.e. 20 years earlier.

    Shannon's formula is as follows:

    H = - Σ p_i log_2 p_i, where the sum runs over i = 0, 1, …, N-1. (1)

    Let us recall what a logarithm is.

    Fig. 3. Finding the logarithm of b to base a means finding the power to which a must be raised to obtain b.

    The base 2 logarithm is called binary:

    log_2(8) = 3 => 2^3 = 8

    log_2(10) = 3.32 => 2^3.32 = 10

    The base 10 logarithm is called decimal:

    log_10(100) = 2 => 10^2 = 100

    Basic properties of the logarithm:

    1. log(1) = 0, because any number to the zero power gives 1;

    2. log(a^b) = b·log(a);

    3. log(a·b) = log(a) + log(b);

    4. log(a/b) = log(a) - log(b);

    5. log(1/b) = 0 - log(b) = -log(b).

    The minus sign in formula (1) does not mean that entropy is negative. It is explained by the fact that p_i ≤ 1 by definition, and the logarithm of a number less than one is negative. By the property of the logarithm, -log_2 p_i = log_2(1/p_i), so the formula can also be written in a second form, H = Σ p_i log_2(1/p_i), without the minus before the summation sign.

    The quantity I_i = log_2(1/p_i) is interpreted as the particular amount of information obtained in the case of realization of the i-th option. Entropy in Shannon's formula is an average characteristic: the mathematical expectation of the distribution of the random variable (I_0, I_1, …, I_N-1).
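
    A minimal sketch of formula (1) in Python (the function name shannon_entropy is ours) computes the entropy as the mean of the particular informations I_i = log_2(1/p_i):

```python
import math

def shannon_entropy(probs):
    """H = sum of p_i * log2(1/p_i): the expectation of the particular informations."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(shannon_entropy([0.25] * 4))         # 2.0 bits: equals log2(4), Hartley's case
print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5 bits: less than log2(3) ~ 1.585
```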

    The amount of information is a numerical characteristic of a signal, reflecting the degree of uncertainty (incompleteness of knowledge) that disappears after receiving a message in the form of a given signal.
    This measure of uncertainty in information theory is called entropy. If, as a result of receiving a message, complete clarity is achieved on some question, it is said that complete, or exhaustive, information has been received and there is no need to obtain additional information. Conversely, if after receiving the message the uncertainty remains the same, then no information has been received (zero information).
    The above considerations show that there is a close connection between the concepts of information, uncertainty and choice. Any uncertainty presupposes the possibility of choice, and any information, by reducing uncertainty, reduces the possibility of choice. With complete information there is no choice. Partial information reduces the number of choices, thereby reducing uncertainty.
    Let us look at an example. A person tosses a coin and watches which side lands up. Both sides of the coin are equal, so it is equally likely that one side or the other will come up. This situation of initial uncertainty is characterized by two possibilities. After the coin falls, complete clarity is achieved and the uncertainty disappears (becomes zero).
    The given example refers to a group of events in relation to which a “yes-no” question can be posed.
    The amount of information that can be obtained in answer to a "yes-no" question is called a bit (English bit, short for binary digit, a binary unit).
    A bit is the minimum unit of information, because it is impossible to obtain information of less than 1 bit. When 1 bit of information is received, the uncertainty is halved. Thus, each coin toss gives us 1 bit of information.
    Consider a system of two light bulbs that can be turned on or off independently of each other. For such a system the following states are possible:
    Lamp A: 0 0 1 1 ;
    Lamp B: 0 1 0 1 .
    To get complete information about the state of the system, you need to ask two yes-no questions, about light bulb A and about light bulb B respectively. In this case the amount of information contained in the system is 2 bits, and the number of possible states of the system is 4. If you take three light bulbs, you need to ask three questions and obtain 3 bits of information; the number of states of such a system is 8, and so on (see the sketch below).
    The connection between the amount of information and the number of states of the system is established by Hartley's formula:
    i = log_2 N,
    where i is the amount of information in bits and N is the number of possible states. The same formula can be written differently:
    N = 2^i.
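
    A short sketch (illustrative; it uses Python's itertools) enumerates the lamp states and confirms Hartley's relation:

```python
from itertools import product
from math import log2

for n_lamps in (1, 2, 3):
    states = list(product("01", repeat=n_lamps))  # all on/off combinations
    i = log2(len(states))                         # Hartley: i = log2(N)
    print(f"{n_lamps} lamp(s): N = {len(states)} states, i = {i:.0f} bits")
# 2 lamps give the four states 00, 01, 10, 11, i.e. N = 4 and i = 2 bits.
```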
    A group of 8 bits of information is called a byte.
    If a bit is the minimum unit of information, then a byte is its basic unit. There are also derived units of information: the kilobyte (KB), megabyte (MB) and gigabyte (GB).
    Thus, there is a close connection between the concepts of “information”, “uncertainty” and “choice”. Any uncertainty presupposes the possibility of choice, and any information, reducing uncertainty, reduces the possibility of choice. Partial information reduces the number of choices, thereby reducing uncertainty.

    Author details

    Chetvergova Yu. N.

    Place of work, position:

    Municipal educational institution "Secondary school No. 1 of Porkhov", teacher

    Pskov region

    Characteristics of the lesson (lesson)

    Education level:

    Secondary (complete) general education

    Target Audience:

    Teacher (teacher)

    Class(es):

    Subject(s):

    Computer Science and ICT

    Objective of the lesson:

    Repetition, consolidation, control of knowledge and skills

    Lesson type:

    Lesson on the comprehensive application of students' knowledge

    Students in the class (auditorium):

    Methodological literature used:

    Lesson developments in computer science. 10th grade. O. L. Sokolova;

    Equipment used:

    Calculator program

    Calculator

    Subject. Amount of information. Hartley and Shannon formulas

    Progress of the lesson

    Repetition of material covered in class. Additions. (10 minutes)

    Training cards. Group work. (20 minutes)

    Problem solving. Pair work. (10 minutes)

    Test. (40 minutes)

    Peer review. Work on mistakes.

    Basic knowledge, skills and competencies

    Knowledge:

    Which events are equally probable and which are not equally probable;

    How to find the probability of an event;

    How to find the amount of information in a message for different events.

    Skills:

    Distinguish between equally probable and non-equally probable events;

    Find the amount of information for different events.

    Competencies:

    Cooperation

    Communication skills

    Creativity and curiosity

    Critical thinking (value judgment)

    Repetition of material covered in class

    Which events are equally probable and which are not equally probable?

    In 1928, the American engineer R. Hartley proposed a scientific approach to evaluating messages. The formula he proposed was as follows:

    I = log_2 K,

    where K is the number of equally probable events and I is the number of bits in the message that one of the K events has occurred. Then K = 2^I.

    Sometimes Hartley's formula is written like this:

    I = log_2 K = log_2 (1/p) = - log_2 p,

    since each of the K events has an equally probable outcome p = 1/K, so that K = 1/p.

    The ball is in one of three urns: A, B or C. Determine how many bits of information the message that it is in urn B contains.

    Solution.

    Such a message contains I = log_2 3 ≈ 1.585 bits of information.

    But not all situations are equally probable; there are many in which the probabilities of the outcomes differ, for example, the toss of an asymmetrical coin or the "sandwich rule".

    "Once, when I was a child, I dropped a sandwich. Watching me guiltily wipe away the butter stain left on the floor, my older brother reassured me:

    Don't worry, the law of the sandwich worked.

    What kind of law is this? - I asked.

    The law that says: "A sandwich always lands butter side down." However, this is a joke," the brother continued. "There is no law. It's just that the sandwich really does behave rather strangely: most of the time the butter ends up at the bottom.

    "Let's drop the sandwich a couple more times and check," I suggested. "You'll have to throw it away anyway."

    We checked. Out of ten times, eight times the sandwich fell butter side down.

    And then I thought: is it possible to know in advance whether the sandwich will fall butter side down or up?

    Our experiments were interrupted by our mother..."
    (Excerpt from the book “The Secret of Great Commanders”, V. Abchuk).

    In 1948, the American engineer and mathematician C. Shannon proposed a formula for calculating the amount of information for events with different probabilities.
    If I is the amount of information,
    K is the number of possible events, and p_i are the probabilities of the individual events,
    then the amount of information for events with different probabilities can be determined by the formula:

    I = - Σ p_i log_2 p_i, where i takes values from 1 to K.

    Hartley's formula can now be regarded as a special case of Shannon's formula:

    I = - Σ (1/K) log_2 (1/K) = log_2 K.

    In the case of equally probable events, the amount of information obtained is maximal.
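
    A numerical check of this reduction, as a sketch under the same notation (nothing here is prescribed by the source):

```python
from math import log2

K = 8
p = 1 / K
shannon = -sum(p * log2(p) for _ in range(K))  # K identical terms of Shannon's sum
hartley = log2(K)
print(shannon, hartley)  # 3.0 3.0: the formulas agree for equally probable events
```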

    How to find the probability of an event?

    If the information contained in a message is new and understandable to a person and adds to his knowledge, i.e. leads to a decrease in the uncertainty of knowledge, then the message contains information.

    1 bit - the amount of information contained in the message, which reduces the uncertainty of knowledge by 2 times.

    Example

    When a coin is tossed, 2 events (outcomes) are possible: the coin lands heads or tails, and both events are equally probable (with a large number of tosses, the numbers of heads and tails are approximately equal). After receiving the message about the result of the coin fall, the uncertainty of knowledge decreased by a factor of 2; consequently, the amount of information received in this case is 1 bit.

    How to find the amount of information in a message for different events?

    Calculation of the amount of information for equally probable events.

    If events are equally probable, then the amount of information can be calculated using the formula:

    N = 2^I,

    where N is the number of possible events,

    I is the amount of information in bits.

    The formula was proposed by the American engineer R. Hartley in 1928.

    Task 1. There are 32 pencils in a box, all of different colors. A red one was pulled out at random. How much information was obtained?

    Solution.

    Since drawing a pencil of any color from the 32 pencils in the box is equally probable, the number of possible events equals 32.

    N = 32, I = ?

    N = 2^I, 32 = 2^5, I = 5 bits.

    Answer: 5 bits

    Calculation of the amount of information for events with different probabilities.

    There are many situations where possible events have different probabilities of occurrence. Let's look at examples of such events.

    1. The box contains 20 pencils, of which 15 are red and 5 are black. You are more likely to pull out a red pencil at random than a black one.

    2. If a sandwich accidentally falls, it is more likely to fall with the butter side down (heavier side) than with the butter side up.

    3. A pond is home to 8,000 crucian carp, 2,000 pike and 40,000 minnows. A fisherman's best chance in this pond is to catch a minnow; crucian carp are in second place and pike in third.

    The amount of information in a message about an event depends on its probability. The less likely an event is, the more information it carries.
    p = K/N, where K is the number of cases realizing one of the outcomes of the event, and N is the total number of possible outcomes;

    I = log_2 (1/p), where I is the amount of information and p is the probability of the event.

    Problem 1. There are 50 balls in a box, of which 40 are white and 10 are black. Determine the amount of information in the message about drawing a white ball and a black ball at random.

    Solution.
    Probability of drawing a white ball:

    p_1 = 40/50 = 0.8

    Probability of drawing a black ball:

    p_2 = 10/50 = 0.2

    Amount of information in the message about drawing a white ball:

    I_1 = log_2 (1/0.8) = log_2 1.25 = log 1.25 / log 2 ≈ 0.32 bits

    Amount of information in the message about drawing a black ball:

    I_2 = log_2 (1/0.2) = log_2 5 = log 5 / log 2 ≈ 2.32 bits

    Answer: 0.32 bits, 2.32 bits
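
    The same arithmetic as a sketch (variable names are ours):

```python
from math import log2

p_white = 40 / 50  # 0.8
p_black = 10 / 50  # 0.2

print(log2(1 / p_white))  # ~0.322 bits: a likely outcome carries little information
print(log2(1 / p_black))  # ~2.322 bits: a rare outcome carries much more
```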

    What is a logarithm?

    The logarithm of a number b to the base a is the exponent to which the number a must be raised to obtain the number b.

    a^(log_a b) = b, a > 0, b > 0, a ≠ 1

    Analysis of problems
    Determine the amount of information obtained upon the realization of one of the events if what is thrown is
    a) an asymmetrical tetrahedral pyramid;
    b) a symmetrical and uniform tetrahedral pyramid.

    Solution.

    a) Let us throw the asymmetrical tetrahedral pyramid.
    The probabilities of the individual events are as follows:
    p_1 = 1/2,
    p_2 = 1/4,
    p_3 = 1/8,
    p_4 = 1/8;
    then the amount of information obtained after the realization of one of these events is calculated by the formula:
    I = -(1/2 log_2 1/2 + 1/4 log_2 1/4 + 1/8 log_2 1/8 + 1/8 log_2 1/8) = 1/2 + 2/4 + 3/8 + 3/8 = 14/8 = 1.75 bits.
    b) Now let us calculate the amount of information obtained when throwing the symmetrical and uniform tetrahedral pyramid:
    I = log_2 4 = 2 bits. (A numerical check of these calculations appears after this problem set.)
    2. The probability of the first event is 0.5, and of the second and third 0.25 each. How much information will we receive after the realization of one of them?
    3. How much information will be obtained when playing roulette with 32 sectors?
    4. How many different numbers can be encoded using 8 bits?
    Solution: I = 8 bits, K = 2^I = 2^8 = 256 different numbers.
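
    Here is the promised check, a sketch evaluating problems (a), (b) and 2 with Shannon's and Hartley's formulas:

```python
from math import log2

# a) asymmetrical pyramid
probs = [1/2, 1/4, 1/8, 1/8]
print(-sum(p * log2(p) for p in probs))              # 1.75 bits

# b) symmetrical pyramid: equally probable faces
print(log2(4))                                       # 2.0 bits

# problem 2: probabilities 0.5, 0.25, 0.25
print(-sum(p * log2(p) for p in (0.5, 0.25, 0.25)))  # 1.5 bits
```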

    Task 2. A lake is inhabited by crucian carp and perch. It is estimated that there are 1500 crucian carp and 500 perch. How much information is contained in the messages that a fisherman caught a crucian carp, caught a perch, or caught a fish at all?

    Solution.
    The events of catching crucian carp or perch are not equally probable, since there are fewer perches in the lake than crucian carp.

    The total number of crucian carp and perch in the lake is 1500 + 500 = 2000.
    Probability of catching a crucian carp:

    p_1 = 1500/2000 = 0.75; of catching a perch: p_2 = 500/2000 = 0.25.

    I_1 = log_2 (1/p_1), I_2 = log_2 (1/p_2), where p_1 and p_2 are the probabilities of catching a crucian carp and a perch, respectively.

    I_1 = log_2 (1/0.75) ≈ 0.42 bits, I_2 = log_2 (1/0.25) = 2 bits: the amounts of information in the messages about catching a crucian carp and a perch, respectively.

    The amount of information in the message about catching a fish (a crucian carp or a perch) is calculated using Shannon's formula:

    I = - p_1 log_2 p_1 - p_2 log_2 p_2

    I = - 0.75·log_2 0.75 - 0.25·log_2 0.25 = - 0.75·(log 0.75 / log 2) - 0.25·(log 0.25 / log 2) ≈ 0.311 + 0.5 = 0.811

    Answer: the message contains 0.811 bits of information
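
    For a quick check, the same computation as a Python sketch (variable names are ours):

```python
from math import log2

p_carp, p_perch = 1500 / 2000, 500 / 2000  # 0.75 and 0.25

# Particular amounts of information for each specific catch:
print(log2(1 / p_carp))   # ~0.415 bits (rounded to 0.42 above)
print(log2(1 / p_perch))  # 2.0 bits

# Average information in the message that some fish was caught (Shannon's formula):
print(-(p_carp * log2(p_carp) + p_perch * log2(p_perch)))  # ~0.811 bits
```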

    Training cards (20 minutes)

    №1

    1. The box contained 32 multi-colored pencils. How much information is conveyed by the message that a red pencil was taken out of the box?

    2. The message that your friend lives on the 9th floor carries 4 bits of information. How many floors are there in the house?

    3. How many kilobytes will make up a message of 384 characters in a 16-character alphabet?

    4. The book, typed using a computer, contains 250 pages; each page has 40 lines, each line has 60 characters. How much information is in the book?

    5. Write the following numbers in the binary number system: 37 and 52.

    №2

    2. There are 8 racks of books in the school library. Each rack has 4 shelves. The librarian told Vasya that the book he needs is on the fifth rack, on the second shelf from the top. How much information did the librarian convey to Vasya?

    4. How much information does the message contain that reduces the uncertainty of knowledge by 2 times?

    5. Write the following numbers in binary number system: 12 and 49.

    1. When guessing an integer in a certain range, 8 bits of information were received. How many numbers does this range contain?

    2. You approached a traffic light when the red light was on. After that, the yellow light came on. How much information did you receive?

    3. The Pulti tribe has a 16-character alphabet. The Multi tribe uses a 32-character alphabet. The tribal leaders exchanged letters. The letter from the Pulti tribe contained 90 characters, and the letter from the Multi tribe contained 70 characters. Compare the amount of information contained in the letters.

    4. How many kilobytes will make up a message of 384 characters in an 8-character alphabet?

    5. Write the following numbers in the binary number system: 33 and 15.

    2. The message takes 2 pages and contains 1/16 KB of information. Each page contains 256 characters. How much information does one letter of the alphabet used convey?

    3. The message, written in letters from the 128-character alphabet, contains 11 characters. How much information does it carry?

    4. The box contains 64 multi-colored pencils. How much information does the message that a green pencil was taken from the box contain?

    5. Write the following numbers in binary number system: 17 and 42.

    1. How much information will the second player receive after the first player’s first move in a tic-tac-toe game on a 4x4 board?

    2. There are 8 balls in a lottery drum. How much information does the message about the first number drawn (for example, number 2) contain?

    3. How many bits of information are in the message "Misha took one of 16 places at the Informatics Olympiad"?

    4. A raster graphic file contains a black-and-white image with 16 gradations of gray, 10x10 pixels in size. What is the information volume of this file?

    5. Write the following numbers in binary number system: 28 and 51.

    1. The Multi tribe alphabet consists of 8 letters. How much information does a 13-character message contain?

    2. A raster graphic file contains a black and white image (without grayscale) with a size of 100x100 pixels. What is the information volume of this file?

    3. When guessing an integer in a certain range, 5 bits of information were received. How many numbers does this range contain?

    4. A telegram was received: “Meet car 6.” It is known that the train has 16 cars. How much information was received?

    5. Write the following numbers in binary number system: 23 and 38.

    1. A symmetrical tetrahedral pyramid is thrown. How much information do we receive in the visual message about its fall on one of the faces?

    2. What is the information volume of the text containing the word CODING in 8-bit encoding?

    3. A color raster graphic image (with a palette of 256 colors) has a size of 10x10 pixels. How much memory will this image take?

    4. The message that your friend lives on the 8th floor carries 4 bits of information. How many floors are there in the house?

    5. Write the following numbers in binary number system: 19 and 46.

    1. One card is selected from a deck of 32 cards. How much information do we receive in the visual message about choosing a particular card?

    2. How much information is required for the binary coding of each character in a 256-character set?

    3. The text takes up 0.5 KB of computer memory. How many characters does this text contain?

    4. The alphabet of the Pulti tribe consists of 128 letters. How much information does one letter of this alphabet carry?

    5. Write the following numbers in the binary number system: 11 and 35.

    1. “Is your friend at home?” they asked a student at school. “No,” he replied. How much information does the answer contain?

    2. The message takes up 3 pages of 25 lines. Each line contains 60 characters. How many characters are in the alphabet used if the entire message contains 1125 bytes?

    3. The box contains 16 multi-colored balls. How much information does the message that a yellow ball was taken from the box contain?

    4. When guessing an integer in a certain range, 5 bits of information were received. How many numbers does this range contain?

    5. Write the following numbers in binary number system: 13 and 41.

    1. What is the number of bits of information in the message “Vanya took one of 8 places at the Informatics Olympiad”?

    2. The book, typed using a computer, contains 150 pages; each page has 40 lines, each line has 60 characters. How much information is in the book? Define in KB.

    3. When guessing an integer in the range from 1 to N, 8 bits of information were received. What is N equal to?

    4. A message written in letters from the 32-character alphabet contains 30 characters. How much information does it carry?

    5. Write the following numbers in binary number system: 16 and 39.

    1. The Multi tribe alphabet consists of 16 letters. How much information does one letter of this alphabet carry?

    2. The message that your friend lives on the 8th floor carries 5 bits of information. How many floors are there in the house?

    3. Find the maximum number of books (each volume 200 pages; each page has 60 lines, 80 characters per line) that can be placed entirely on a laser disc with a capacity of 600 MB.

    4. How much information is needed to guess one of 64 numbers?

    5. Write the following numbers in binary number system: 14 and 53.

    1. A telegram was received: “Meet car 4.” It is known that the train has 8 cars. How much information was received?

    2. The size of the message, containing 2048 characters, was 1/512 of a MB. What is the size of the alphabet (how many characters are there in the alphabet?) in which the message is written?

    3. “Are you getting off at the next stop?” - they asked the man on the bus. “Yes,” he replied. How much information does the answer contain?

    4. A message written in letters from the 16-character alphabet contains 25 characters. How much information does it carry?

    5. Write the following numbers in the binary number system: 26 and 47.

    1. How many kilobytes is a message containing 12288 bits?

    2. How much information does a message contain that reduces knowledge uncertainty by 4 times?

    3. How many characters does a message written using a 16-character alphabet contain, if its volume is 1/16 of a MB?

    4. A group of schoolchildren came to the pool, which had 8 swimming lanes. The coach announced that the group would swim in lane number 4. How much information did the students receive from this message?

    5. Write the following numbers in the binary number system: 18 and 25.

    1. You approached a traffic light when the light was yellow. After that the light turned green. How much information did you receive?

    2. A 256-character alphabet was used to write the text. Each page contains 30 lines of 60 characters per line. How much information does 6 pages of text contain?

    3. There are 64 balls in the lottery drum. How much information does the message about the first number drawn contain (for example, number 32 was drawn)?

    4. When guessing an integer in a certain range, 7 bits of information were received. How many numbers does this range contain?

    5. Write the following numbers in the binary number system: 27 and 56.

    1. The message that Petya lives in the first entrance carries 2 bits of information. How many entrances are there in the house?

    2. A message written in letters from the 128-character alphabet contains 40 characters. How much information does it carry?

    3. An information message with a volume of 1.5 KB contains 3072 characters. How many characters does the alphabet with which this message was written contain?

    4. How many kilobytes will make up a message of 284 characters in a 16-character alphabet?

    5. Write the following numbers in the binary number system: 10 and 29.

    1. How much information will the second player receive after the first player’s first move in a game of tic-tac-toe on a 4x4 board?

    2. How many bytes of information are contained in 1 MB?

    3. What was the number of possible events if, after implementing one of them, we received an amount of information equal to 7 bits?

    4. A 64-character alphabet was used to record the message. Each page contains 30 lines. The entire message contains 8775 bytes of information and takes up 6 pages. How many characters are there in a line?

    5. Write the following numbers in binary number system: 22 and 59.

    1. A message written in letters from the 128-character alphabet contains 40 characters. How much information does it carry?

    2. How much information will the second player receive in the game “Guess the Number” with the correct strategy if the first player guessed a number in the range from 1 to 64?

    3. A 256-character alphabet was used to write the text. Each page contains 30 lines of 70 characters per line. How much information does 3 pages of text contain?

    4. The text takes up 0.25 KB of computer memory. How many characters does this text contain?

    5. Write the following numbers in binary number system: 32 and 51.

    1. How many bits of information are contained in 1 KB?

    2. The first tribe has a 16-character alphabet. The second tribe uses a 32-character alphabet. The tribal leaders exchanged letters. The first tribe's letter contained 90 characters, and the second tribe's letter contained 80 characters. Compare the amount of information contained in the letters.

    3. How much information will be obtained when playing roulette with 32 sectors?

    4. Information is transmitted at a speed of 2.5 KB/s. How much information will be transmitted in 20 minutes?

    5. Write the following numbers in binary number system: 21 and 48.

    Solving optional problems (20 minutes)

    №1

    The message is written using an alphabet containing 8 characters. How much information does one letter of this alphabet carry? Solution: I = log_2 8 = 3 bits.

    Answer: 3 bits.

    №2

    The information volume of one character of a certain message is 6 bits. How many characters are in the alphabet with which this message was composed? Solution: N = 2^I = 2^6 = 64 characters.

    Answer: 64 characters.

    №3

    The information volume of one character of a certain message is 5 bits. What are the limits (maximum and minimum value) of the power of the alphabet with which this message is composed?

    Solution: N = 2^I = 2^5 = 32 is the maximum value of the power of the alphabet. If there were even one more character, 6 bits per character would be needed for encoding.

    The minimum value is 17 characters, because for 16 or fewer characters 4 bits per character would suffice.

    Answer: from 17 to 32 characters.
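
    These bounds generalize; a sketch (the helper name is ours):

```python
def alphabet_power_bounds(bits_per_char: int):
    """Alphabet sizes N needing exactly bits_per_char bits: 2^(I-1) < N <= 2^I."""
    max_n = 2 ** bits_per_char            # 32 for I = 5
    min_n = 2 ** (bits_per_char - 1) + 1  # 17: any smaller alphabet fits in I - 1 bits
    return min_n, max_n

print(alphabet_power_bounds(5))  # (17, 32)
print(alphabet_power_bounds(6))  # (33, 64)
```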

    №4

    A message written in letters from the 128-character alphabet contains 30 characters. How much information does it carry?

    Given: N = 128, K = 30.

    Find: I_t.

    Solution:

    1) I_t = K·I, where I is not yet known;

    2) I = log_2 N = log_2 128 = 7 bits, the volume of one character;

    3) I_t = 30·7 = 210 bits, the volume of the entire message.

    Answer: 210 bits, the volume of the entire message.

    №5

    A message composed using the 32-character alphabet contains 80 characters. Another message is written using the 64-character alphabet and contains 70 characters. Compare the amounts of information contained in the messages.

    Given: N_1 = 32, K_1 = 80, N_2 = 64, K_2 = 70.

    Find: compare I_t1 and I_t2.

    Solution:

    1) I_1 = log_2 N_1 = log_2 32 = 5 bits, the volume of one character of the first message;

    2) I_2 = log_2 N_2 = log_2 64 = 6 bits, the volume of one character of the second message;

    3) I_t1 = 80·5 = 400 bits, I_t2 = 70·6 = 420 bits: the second message carries more information.

    1. Information. Information objects of various types. Basic information processes: storage, transmission and processing of information. The role of information in people's lives.
    2. Perception, memorization and transformation of signals by living organisms.
    3. The concept of the amount of information: different approaches. Units for measuring the amount of information.
    4. General lesson on the topic; independent work.

    Lesson.

    Goals:
    • educational – to give the concept of the amount of information, introduce the probabilistic and alphabetic approaches to determining the amount of information, introduce the units of measurement of information, and develop practical skills in determining the amount of information;
    • developmental – to continue the formation of a scientific worldview and expand the vocabulary on the topic "Information";
    • upbringing – to develop interest in the subject and cultivate perseverance in overcoming difficulties in academic work.

    1. Organizational stage (welcome, identifying those absent from class)

    2. Checking homework, activating knowledge

    on the topic "Information" covered in the previous 2 lessons. In order to develop speech and consolidate the fundamental concepts of this topic, homework is checked in the form of a frontal oral survey on the following questions:

    1. What do you understand by information? Give examples. Suggested answers: students usually easily give examples of information that they themselves receive in the surrounding world: news, the school bell, new knowledge in lessons, information obtained from reading popular science literature, experience and emotions gained from reading fiction, emotional experiences obtained from listening to music, aesthetic canons, information about the costume and life of the 18th century and the emotions obtained from viewing paintings by artists of the 18th century. It is advisable that students give examples of information in technical and biological systems as well (the shape of a key's bit contains information about the lock, a certain air temperature in a room contains information for the fire-extinguishing system, a biological cell contains information about the biological object of which it is a part…).
    2. We know that two other important entities of the world, matter and energy, existed before living organisms on Earth. Did information and information processes exist before the advent of man? The expected answer is yes, there was. For example, the information contained in a plant cell about the type of plant, the conditions of germination, reproduction, etc. allows the plant to grow and reproduce without human intervention; information accumulated by generations of predatory animals forms conditioned and unconditioned reflexes of behavior of the next generations of predators.
    3. Matter is what everything is made of, energy is what sets everything in motion. Is it true that information rules the world? Justify your answer. Answer: Information truly rules the world. A signal from the Earth forces the satellite to change its trajectory; if we see a puddle on the way, then information about its appearance, that it is wet and dirty, forces us to decide to go around the puddle. A characteristic human gesture (an arm extended forward with a vertical palm) makes us stop; information on the key bit and the shape of the lock slot allows us to make a decision on choosing a key from a bunch; reflexes formed by generations of a certain species of birds control migration processes. By reading fiction, we absorb the life experiences of the characters, which influences the adoption of certain decisions in our own lives; By listening to certain music, we form a corresponding taste, which influences our behavior, environment, etc.
    4. Name the types of information according to the form of presentation, give examples. Answer: numerical (price of a product, numbers in the calendar), text (a book written in any language, textbook text), graphic (painting, photograph, STOP sign), sound (music, speech), video (animation + sound), command (restart the computer - press Ctrl+Alt+Delete/Enter).
    5. What actions can be performed with the information? Answer: It can be processed, transmitted, stored and encoded (represented).
    6. Name the ways a person perceives information. Answer: a person perceives information using the 5 senses: vision (in the form of visual images), hearing (sounds: speech, music, noise…), smell (odors, via nasal receptors), taste (tongue receptors distinguish sour, bitter, salty, sweet), touch (temperature of objects, type of surface…).
    7. Give examples of sign systems. Answer: natural language, formal language (decimal number system, music notes, road signs, Morse code), genetic alphabet, binary sign system.
    8. Why does a computer use a binary sign system to encode information? Answer: The binary sign system is used in the computer, since existing technical devices can reliably store and recognize only two different states (signs).

    3. Probabilistic approach to measuring the amount of information (see multimedia presentation).

    Today we will talk about measuring information, i.e. determining its amount. (Students write down the topic of the lesson in their notebooks: "Amount of information".) Which book do you think contains more information (show a thin one and a thick one)? As a rule, students choose the thick one, since it contains more words, text and letters (some ask what type of information the book contains, graphic or textual; it should be clarified that the book contains only textual information). Which message conveys more information to you: "tomorrow we will study according to the usual schedule" or "tomorrow instead of literature there will be chemistry"? Students intuitively answer the second, because, despite almost the same number of words, the second message contains more important, new or relevant information for them, while the first message contains no new information at all. Notice that you have just looked at information both in terms of the number of characters it contains and in terms of its semantic importance to you. There are 2 approaches to determining the amount of information: semantic and technical (alphabetic). The semantic approach is used to measure information as used by a person, the technical (alphabetic) one as used by a computer.

    For a person, obtaining new information leads to an increase in knowledge, or to a decrease in uncertainty. For example, the message that tomorrow is Wednesday does not reduce uncertainty, so it contains no information. Suppose we have a coin that we toss onto a flat surface. We know before the toss that one of two things can happen: the coin will end up in one of two positions, "heads" or "tails". After the toss there is complete certainty (we visually receive the information that, for example, "heads" has come up). The information message that "heads" has come up reduces our uncertainty by a factor of 2, since one of two possible information messages has been received.

    In the surrounding reality, there are quite often situations when more than 2 equally probable events can occur. So, when throwing a six-sided die, there are 6 equally probable events. The event of one of the sides of the cube falling out reduces the uncertainty by 6 times. The larger the initial number of events, the greater the uncertainty of our knowledge, the more information we will receive when receiving an information message.

    The amount of information can be considered as a measure of the reduction in knowledge uncertainty when receiving information messages.(Students write down what is in italics in their notebooks.)

    There is a formula relating the number of possible information messages N and the amount of information I carried by the received message:

    N = 2^I (N is the number of possible information messages, I is the amount of information carried by the received message).

    To quantify any quantity, it is necessary to determine the unit of measurement. For example, to measure length, a certain standard is chosen - meter, mass - kilogram.

    4. Units of measurement of information

    The unit of measurement of the amount of information is taken to be the amount of information contained in a message that reduces the uncertainty of knowledge by a factor of 2. This unit is called the bit.

    Let us return to the above example of receiving the information message that "heads" came up when tossing a coin. There the uncertainty decreased by a factor of 2, so this message carries 1 bit. The message that a particular side of a six-sided die has come up reduces the uncertainty by a factor of 6, so this message carries log_2 6 ≈ 2.58 bits.

    The minimum unit of measurement of the amount of information is the bit; the next largest unit is the byte, where

    1 byte = 8 bits

    The international SI system uses the decimal prefixes "kilo" (10^3), "mega" (10^6), "giga" (10^9), … In a computer, information is encoded using the binary sign system, so the multiple units for measuring the amount of information use a coefficient of 2^n.

    1 kilobyte (KB) = 2^10 bytes = 1024 bytes
    1 megabyte (MB) = 2^10 KB = 1024 KB
    1 gigabyte (GB) = 2^10 MB = 1024 MB
    1 terabyte (TB) = 2^10 GB = 1024 GB
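
    Since each step up is a factor of 2^10 = 1024, conversions reduce to repeated multiplication or division; a sketch (constant names are ours):

```python
BYTE = 8             # bits
KB = 2 ** 10 * BYTE  # 8192 bits in a kilobyte
MB = 2 ** 10 * KB
GB = 2 ** 10 * MB

print(12288 / KB)    # 1.5: a 12288-bit message is 1.5 KB
print(MB // BYTE)    # 1048576: bytes in a megabyte
```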

    A terabyte is a very large unit of information measurement, so it is used extremely rarely. All the information that humanity has accumulated is estimated at tens of terabytes.

    5. Determining the amount of information

    Task 1. Determine the number of exam tickets if the visual message about the number of one drawn ticket carries 5 bits of information. The number of tickets equals the number of possible information messages: N = 2^I = 2^5 = 32 tickets.

    Task 2. How much information does the message about the grade for a test carry? A test can be graded 2, 3, 4 or 5, so there are 4 possible messages (N = 4). The formula becomes the equation 4 = 2^I = 2^2, whence I = 2 bits.

    Tasks for independent work (the formula should always be in front of the students' eyes, and a table of powers of 2 can also be posted) (3 min.):

    1. How much information do we receive in the visual message about the fall of a symmetrical octahedral pyramid onto one of its faces? Answer: 3 bits, because the number of possible events (messages) is N = 8; 8 = 2^I = 2^3, I = 3.
    2. Balls with numbers are drawn from an opaque bag, and it is known that the information message about the ball's number carries 5 bits of information. Determine the number of balls in the bag. Answer: there are 32 balls in the bag, because N = 2^I = 2^5 = 32.
    3. When playing tic-tac-toe on a 4 x 4 field, how much information will the second player receive after the first player's first move? Answer: the number of events before the first move is N = 16; 16 = 2^I = 2^4, I = 4. The second player receives 4 bits of information.

    6. Alphabetical approach to determining the amount of information

    The essence of the technical, or alphabetic, approach is that information is measured by the number of signs of a certain alphabet used to represent it. For example, if 5 characters of the Roman alphabet are used to represent the number XVIII, then that is the size of the message. The same number, i.e. the same information, can be written in the decimal system (18). As we can see, we get 2 signs, i.e. a different value of the amount of information. For the same information to yield the same measured amount, one must agree on the use of a certain alphabet. Since the binary alphabet is used in technical systems, it is also used for measuring the amount of information. The number of characters in the alphabet is N = 2; from N = 2^I, where I is the amount of information carried by one character, we get 2 = 2^1, so I = 1 bit. Interestingly, the unit of measurement of the amount of information, the "bit", got its name from the English phrase "BInary digiT", "binary digit".

    The greater the number of characters in the alphabet, the greater the amount of information carried by 1 character of the alphabet.

    Determine for yourselves the amount of information carried by 1 letter of the Russian alphabet.

    Answer: a letter of the Russian alphabet carries 5 bits of information (under the alphabetic approach to measuring information, counting a 32-letter alphabet).

    How much information is contained in one character of an 8-bit binary code (the symbol A is 11000000)? Answer: 8 bits, or 1 byte.

    Practical work (handout: an instruction card for the practical work) on determining the amount of information using a calculator:

    1. Determine the information volume of the following message in bytes (the message is printed on a card, cards on each desk):

    The amount of information a sign carries depends on the probability of its occurrence. In Russian written speech the frequencies of letters in text differ: on average, per 1000 characters of meaningful text there are 200 letters "a" and a hundred times fewer letters "f" (only 2). Thus, from the point of view of information theory, the information capacities of the characters of the Russian alphabet differ (the letter "a" has the smallest and the letter "f" the largest).

    Determine the number of characters (number of characters per line × number of lines): 460 characters = 460 bytes

    Enter and save this text on the desktop using Notepad. Determine the information volume of this file using the computer (right-click the object → Properties). Answer: 460 bytes.

    You can record this text as a sound file 1.wav and compare it with the text (Start → Programs → Accessories → Entertainment → Sound Recorder…). Determine its information volume using the computer: 5.28 MB (5,537,254 bytes). Explain to the students that this difference is caused by the difference between the representations of sound and text information. The features of these representations will be discussed later.

    2. Determine how many textbooks will fit on a disc with an information volume of 700 MB. Answer: 1) determine the number of characters in the textbook (characters per line × lines per page × pages): 60 · 30 · 203 = 365,400 characters = 365,400 bytes = 365400/1024/1024 MB ≈ 0.35 MB; 2) the number of textbooks is K = 700/0.35 ≈ 2000 textbooks.
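
    The disc calculation scripted as a sketch (numbers follow the practical work above):

```python
chars_per_book = 60 * 30 * 203       # characters/line * lines/page * pages = 365,400
bytes_per_book = chars_per_book      # one character = 1 byte in an 8-bit encoding
mb_per_book = bytes_per_book / 1024 / 1024
print(round(mb_per_book, 2))         # ~0.35 MB per textbook
print(int(700 / mb_per_book))        # ~2008; the lesson rounds 0.35 MB -> 2000 books
```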

    7. Summing up the lesson in the form of a frontal survey:

    1. What approaches exist to determining the amount of information? Answer: there are 2 approaches to measuring the amount of information: semantic and technical (alphabetic).
    2. What is the difference between one approach and the other? Answer: under the semantic approach, the amount of information is the measure of the reduction in the uncertainty of knowledge upon receiving an information message; under the alphabetic approach, it is the number of characters in the message × the amount of information carried by 1 character of the alphabet.
    3. Name the units of information from smallest to largest. Answer: bit, byte, KB, MB, GB, TB.
    4. By how much do a byte and a KB differ, a KB and a MB, a MB and a GB? Answer: by a factor of 1024 (2^10).
    5. How many bits are there in 1 byte? Answer: 8.
    6. What is a bit under the semantic and under the alphabetic approach to determining the amount of information? Answer: under the semantic approach, a bit is a halving of the uncertainty of knowledge upon receiving an information message; under the alphabetic approach, a bit is the information capacity of one character in binary coding.

    8. Homework

    1. Paragraphs 1.3.1 and 1.1.3 (N. Ugrinovich, "Informatics. Basic Course. Grade 8"); 2 questions on page 29 (1. Give examples of information messages that lead to a reduction in the uncertainty of knowledge. 2. Give examples of information messages that carry 1 bit of information).
    2. Tasks: 1. How much information does the message about the grade for a test contain? 2. Calculate how much information in bits is contained in 1 KB and in 1 MB. 3. Calculate how many books (take any fiction book at home) would fit on a 1.44 MB floppy disk.