
    Federal Agency for Education

    State Educational Institution of Higher Professional Education "Kemerovo State University"

    Department of Marketing

    Internet Information Resources

    Completed by a student of group E-063

    Lopatkina O. K.

    Checked by Ph.D., Associate Professor

    Shandakov Yu. D.

    Kemerovo 2010


    Contents

    Introduction

    The concept of an information resource and its diversity

    Internet information resources in Russia

    Conclusion

    List of used literature


    Introduction

    The Internet began as a defense project funded by the Defense Advanced Research Projects Agency of the US Department of Defense. The goal of the project was to develop a computer network that would keep the country's system of government functioning in the conditions of a nuclear war. The first documents describing the technical requirements for the system appeared in 1964; in 1969, the first four computers were linked into a working network. This network was called the ARPANET. In 1971 the network already consisted of 14 computers, and in 1972 of 37. The seventies were a time of network growth and of debugging internetworking technology within the ARPANET. In 1982, the Transmission Control Protocol (TCP) and the Internet Protocol (IP) were published. From that moment on, the combination “TCP/IP” entered the vocabulary of network specialists and became firmly attached to the entire family of documents and standards governing the operation of the ARPANET and, later, the Internet.

    The Internet itself appeared as a result of a large computing program of the US National Science Foundation. To support scientific research, the Foundation established several computing centers throughout the country and equipped them with supercomputers. So that specialists engaged in fundamental research could use the computing power of these centers, they were all united into a single computer network. Initially it was planned to use the ARPANET for this purpose, but for various reasons its administration did not allow American universities to connect to the defense network. As a result, the Foundation created its own network, NSFNET, taking as its basis the internetworking tools developed within the ARPANET project. At this time the first six Internet domains appeared: gov, mil, edu, com, org and net. Behind each of these abbreviations stands its own network: gov for government organizations, mil for military organizations, edu for universities, com for commercial organizations, org for non-governmental and non-profit organizations, and net for the organizations responsible for the functioning of the Network itself.

    By capitalizing the word “Network,” we draw a distinction between the individual networks that make up the Internet and the Internet itself as a collection of different networks. At present the Internet, in addition to NSFNET, includes several hundred different networks. What all of them have in common is that, to exchange information with one another, they use a single mechanism, the internetworking technology of the TCP/IP protocol family. The key elements of this technology are a unified system of addresses for all computers on the network, a unified form of exchanging information messages between networks (the IP protocol), and data exchange protocols for the software installed on networked computers, which uses IP to exchange information.

    Having arisen as a means of scientific communication within the NSF program, the Internet became the primary medium for the pre-publication of scientific results. Almost all laboratories in the world with Internet access began to post their preprints in electronic form in Internet archives, and only after that released printed copies of these works.


    The concept of an information resource and its diversity

    An information resource on the Internet is an array of data of a particular kind, compiled and sorted according to a certain criterion and given a strict structure. It is thanks to its information resources that the Internet has gained deserved popularity in recent years, and anyone who has connected a computer to the World Wide Web has access to these resources.

    So, the Internet is a network of networks that connects millions of computers around the world. Here you can find everything from government regulations and library index cards to contemporary music and biographies of Hollywood stars; from the texts of modern bestsellers to masterpieces of world classical literature; from a gallery of children's creativity to complex databases and serious system programs.

    It should be noted, first of all, that World Wide Web pages were created by almost everyone who had been involved in publishing materials before the advent of the Web. At present, all electronic publications on the Web can be divided into several categories: firstly, advertising pages; secondly, electronic periodicals; and thirdly, serious electronic monographs and reference books. Let us consider each of these types of publications in detail.

    Advertising materials are published most widely on the Internet; Web pages carrying various kinds of advertising are encountered most often, and companies typically host this advertising on their own servers. Almost all companies connected in one way or another with the computer business have their own pages. If you have the desire and time to get acquainted with press releases from Microsoft, IBM, DEC, Apple, Symantec and others, there is no need to read computer magazines. The companies themselves publish all sorts of information about themselves, from the history of their founding and their founding fathers to the characteristics of their latest products and price lists. Here you can also find reports given by company representatives at various conferences and exhibitions, as well as their interviews with specialized publications. All these materials are richly illustrated with graphics and photographs. Often you can even listen to greetings or addresses from company executives to Web users, provided, of course, that the capabilities of your computer and its software allow it. The quality of these advertising materials is often not inferior to their printed counterparts.

    Catalogs occupy a special place among Web materials. When they concern commercial information or commercial products, they too can be classified as advertising; but besides the commercial use of the Web, which by some estimates accounts for 30-40% of all Web pages, there is also non-commercial service to network users, carried out either at public expense or through the enthusiasm of individuals. Web catalogs contain a great deal: lists of famous film actors, tables of contents of CDs, hit-parade charts and much more, all illustrated, for example, with photographs of film stars or recordings of fragments of musical compositions. There are reference books on ancient art, reference books on the history of aviation and the navy, and catalogs of science fiction.

    Illustrated magazines were the first printed periodicals to appear on the World Wide Web. There are several reasons for this. The first is the age and gender of Internet users: according to numerous surveys conducted among people using the network in 1995, more than half (69%) of respondents were aged 25 to 45, the largest group of users being men of 30-40, while women made up no more than 20% of all Internet users. The second is the professional composition of users and their education: up to 70% are engaged in business or computer technology, and more than half have a higher education. The stereotype that network technologies are for the young is not entirely accurate; students and young people under 30 make up no more than one fifth of all network users. Entertainment magazines are divided in roughly the same proportion, the bulk being made up of publications such as Fortune, Time, Playboy, Penthouse, UFO Library and the like.

    It should be noted that, at present, electronic copies of printed publications are more of a digest of the printed edition. Companies producing printed materials are interested in attracting new readers on the one hand, and must not lose the traditional clients of their printed publications on the other. To solve this problem a combined approach is used. Firstly, a network user can subscribe to a publication without leaving his computer, transferring the required amount to the account of the publisher or an intermediary company or debiting it from his credit card; card operations can likewise be performed from the computer. Secondly, the user can subscribe to the electronic version of the publication. In this case, after transferring the money, the user is registered in the publication's information service, i.e. assigned a name and password with which he can log into the protected sections of the publication's database. With this option, and with the spread of data protection tools for network exchange, the number of commercial Internet information resources closed to public use has begun to grow.

    Electronic newspapers are another type of periodical that has begun to master the new Internet technology, the World Wide Web. Electronic newspapers appeared somewhat later than magazines, with a delay of about six months; the main obstacle for the newspaper business is the difficulty of preparing publications for the Web. The first publications of this kind were Computer World, PC Magazine Daily and other computer publications. The presentation of materials in these newspapers differs from similar materials in the printed original. Firstly, the page size of an electronic publication is limited by the size of the display screen. Secondly, when presenting materials one should take advantage of the capabilities of modern display software, such as opening several windows at once, in which different pages of text can be placed, or a picture explaining the text. This makes it possible to “condense” the information on the display screen, something that simply cannot be done in a regular print publication. Authors of electronic publications also take into account the possibility of presenting material in audio form, which is extremely useful when publishing interviews. In addition, reviews often publish not extensive text but a set of graphic materials: histograms and pie charts accompanied by a textual commentary. It is quite obvious that this gives rise to a completely new type of publication, very different in form and design from the traditional printed original.

    The most fundamental electronic publications on the Web are analogues of large printed works such as the Bible or various kinds of encyclopedias implemented in World Wide Web technology. It should be noted right away that this is not simple copying of texts: materials of this kind are implemented from the start with the limitations and new possibilities of computer technology in mind. An example of hypertext organization is Britannica Online, a project that brought one of the oldest encyclopedias in the world, the Encyclopaedia Britannica, onto the Internet. The electronic version of Britannica is a rather complex conglomerate of information technologies. Its core is the presentation of encyclopedia articles as hypertext pages in HTML, the hypertext markup language used to publish materials on the World Wide Web. Each article is linked to the next and the previous article in the order of the printed version of the encyclopedia. In addition, each article is connected by hypertext links, embedded in its text, to other articles and to a database of personalities, countries and graphic illustrations. For navigating the encyclopedia there are also a table of contents, an index and a mechanism for searching articles by keywords. The table of contents and the index are lists of hypertext links; the links in the table of contents point to encyclopedia articles, while the links in the index point to lists of articles, which in turn are also lists of hypertext links.

    The encyclopedia's search engine is of particular interest. It is based on the technology of the distributed Internet information retrieval system WAIS (Wide Area Information Servers). The result of a WAIS query is a list of hypertext links to encyclopedia articles that the system generates on the fly. In appearance this list is no different from the index list, but from the system's point of view they are completely different objects: the index list actually exists as a file in the World Wide Web database, whereas the lists generated by WAIS do not really exist, being temporary entities produced by user requests. Registered users can save the lists obtained in this way for later use when working with the system.
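
    For illustration only, here is a minimal Python sketch of the idea described above: an index, or a search result, is itself a page consisting of a list of hypertext links. The article titles and file names below are invented and are not taken from Britannica Online.

        # A minimal sketch: an index or a search result rendered as a page
        # made of hypertext links. Titles and file names are invented.

        articles = [
            ("Printing", "articles/printing.html"),
            ("Papermaking", "articles/papermaking.html"),
            ("Typography", "articles/typography.html"),
        ]

        def render_link_list(items):
            """Build an HTML page whose body is a list of hypertext links."""
            rows = "\n".join(
                '  <li><a href="{0}">{1}</a></li>'.format(url, title)
                for title, url in items
            )
            return "<html><body>\n<ul>\n" + rows + "\n</ul>\n</body></html>"

        print(render_link_list(articles))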

    This project is not a charity: access to the encyclopedia's resources costs money. For a user working with the database for “home” purposes the fee is set at $150 per year, for university students $120 per year, and for businessmen $300 per year. Subscribers can be not only individuals but also organizations; in that case the fee depends on the size of the organization and the nature of its statutory activities, and the amount of the contribution is determined individually in each specific case. The terms, however, are not so rigid. To get acquainted with the encyclopedia, it is not at all necessary to pay the fees listed above; there are two ways to decide whether you need the Encyclopaedia Britannica in your work. The first is a demonstration of its capabilities, in which the user is led along a predefined route and shown how the system works. The second is much more interesting: a seven-day free trial of the encyclopedia, for which the reader must register and receive a name and password, after which he can work in the system.

    In addition to Britannica, there are other encyclopedias on the Internet, for example, the Science Fiction Encyclopedia or the Wine Encyclopedia. Moreover, access to them is completely free.

    At one time, almost every organization had an information division in its structure; in scientific organizations these were called departments of scientific and technical information. The main task of these units was to search for information relevant to the organization's profile, maintain a thematic library and keep specialists promptly informed about new periodical materials. The idea that you can find everything on the Internet yourself is mistaken: the network is too large for any user to review all available information archives, even with the necessary search services. One form of activity on the Internet has therefore become the organization of commercial information services, the most popular of which are Yahoo and Galaxy.

    Information services are hierarchical lists of information resources, much like library subject catalogs. As a rule, in addition to the subject catalog there is a catalog of Internet information resources organized by country, as well as a special page for running keyword queries; users can use whichever of these they prefer. For a first acquaintance with the system they usually use the thematic catalog and, moving through it, look for the Web pages they need. Here, however, one should bear in mind that the creators of the system and the user may have significantly different ideas of the subject area and of the terms that characterize it. Adaptation takes a certain amount of time, which at first glance seems wasted, but the same thing happens when working with conventional information retrieval systems or when encountering a new journal for the first time. The creators of information catalogs strive for perfection, but so far without great success. There is no single thematic scheme on the Internet like the Universal Decimal Classification adopted in librarianship, so subject catalogs can differ significantly from one another. The World Wide Web Consortium directory, for example, is very long: its first page takes up several display screens, which is not very convenient. The Yahoo and Galaxy directories are divided into many subdirectories, which makes each page more compact and easier to browse; this too has its disadvantages, since deep nesting of pages can cause the user to lose orientation in the catalog tree.

    National resources also have different ordering systems. In some cases, the administrative division of the country is taken as a basis, and in others, resources are ordered by belonging to the sphere of human activity, for example, university resources, government resources, commercial information resources, etc. There are directories that combine both approaches, and then the database of national information resources contains two or more tables of contents.

    Searching for pages using keywords is an integral part of any information catalog on the Web. Typically, special fields are used for entering keywords, which can be combined into search queries using the connectives “or” and “and”. The word “and” means that the keywords must occur in the document simultaneously, while the word “or” requires that at least one of the specified words appear in the document, as sketched below. The search page can be combined with the first page of the directory, as is done in Yahoo, or it can be a supplement to the directory that is loaded separately, as in Galaxy. When creating a query you can use a simple form, where you merely enter a list of words, or move on to a more complex one, where you can even specify the parts of documents in which the search should or should not be carried out. Search results are displayed as a list of the names of the found documents, which is itself a Web page. The difference between this page and others is that it does not exist in any database on the network: it is created “on the fly” by the search program, though if the system allows it, it can be saved as one of the user's pages.
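
    The following small Python sketch, given purely for illustration and not taken from any particular catalog, shows how the “and” and “or” connectives combine keywords when filtering a toy collection of documents; the titles and texts are invented.

        # How "and" and "or" combine keywords when filtering documents.

        def matches(text, keywords, mode="and"):
            """mode="and": every keyword must occur; mode="or": at least one."""
            lowered = text.lower()
            hits = [kw.lower() in lowered for kw in keywords]
            return all(hits) if mode == "and" else any(hits)

        # A toy document collection; titles and texts are invented.
        documents = {
            "Doc A": "an annual report on electronic publishing and hypertext",
            "Doc B": "notes on hypertext markup and the World Wide Web",
            "Doc C": "a price list for laser printers",
        }

        query = ["hypertext", "publishing"]
        print([t for t, body in documents.items() if matches(body, query, "and")])
        print([t for t, body in documents.items() if matches(body, query, "or")])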

    Concluding the discussion of catalogs, attention should be paid to the catalog of electronic publications. The authors of this catalog combed the entire Internet and included in their list only links related to the publication of materials on the Internet. The catalog is called EDOC, for Electronic Documents. As the name suggests, it covers not only newspapers, magazines and reference books but also all kinds of electronic texts and even non-texts (catalogs of paintings, museums, exhibitions and so on).

    The last type of electronic document worth mentioning comprises publications prepared with new computer technologies in mind and specifically for publication on the Internet; it is simply impossible to publish these materials as ordinary books. All such products are usually called virtual literature, which can be divided into specialized literature, entertainment literature and research in the field of virtual publications. Specialized literature consists of specialized journals, reference books and textbooks. Entertainment literature includes illustrated magazines and fiction (poetry, science fiction, fairy tales for adults and children, and so on), while research covers publications designed to demonstrate new ways of organizing and presenting information to Internet users.

    Specialized magazines are mainly computer publications, such as “Access”, the magazine of the US National Center for Supercomputing Applications. In this publication each article is presented as hypertext; the body of the article contains embedded graphics and tables, and even the results of dynamic modeling in the form of movie clips. As a rule, graphics are prepared separately from the text and displayed at the user's request. At one time very interesting material was presented there on modeling the collision of Comet Shoemaker-Levy 9 with the planet Jupiter: the report on this experiment was accompanied by short films showing the collision from various angles. Also interesting was a report from the US National Aeronautics and Space Administration (NASA), which dynamically displayed the surface of the Earth during the flight of one of the American spacecraft in 1995.

    Reference books were perhaps the first publications on the Internet to be implemented in World Wide Web technology. In 1994, the best specialized Web publication was judged to be a reference book on oncology designed for three different types of users: patients, students and specialists. It contained information both about typical oncological diseases and about special cases that a doctor may encounter when treating various cancers.

    Textbooks are the third type of hypermedia publication available on the Internet. No other kind of virtual literature has a deeper justification or a more detailed elaboration of its implementations than the textbook distributed over the network. In the United States, as part of the work on creating the information superhighway, a virtual public university project was launched. To justify the possibility of such an educational institution, a historical analysis of the various forms of higher education was carried out, from antiquity to the present day. Given the level of development of modern computer technology, it was concluded that a virtual public university is quite feasible and would provide its students with an adequate level of professional training. In addition to the functions of a standard college, such a university could easily be entrusted with retraining industrial personnel.

    Naturally, a university cannot do without textbooks, and in a virtual university the teaching aids should also be virtual. For medical faculties there is no longer any need to go to the dissecting room and cut up cadavers: all this can be done in virtual reality or, at the very least, behind a display screen. Of course there is an opinion, and a quite justified one, that real practice can never be replaced by models, and there are many examples of this; but in this case it seems worth talking about a balanced combination of software and practical exercises. Returning to the virtual university's teaching aids, we can say that historians can now, at their desks, view archival materials and videos of historical events and listen to the voices of historical figures, lawyers can take part in mock trials, and physicists can conduct experiments on the fission of radioactive materials without endangering others. In a word, this is a fantasy, but a realistic one, elements of which have already been realized.

    The last object worth touching upon when describing the Internet as a virtual publishing house is systems such as MUDs (Multi-User Dungeons), whose name can be rendered as “multiplayer dungeons” and which grew out of the game Dungeons and Dragons. If we consider comics to be publications, then “dungeons” are also electronic publications: any fairy tale, detective story or chronicle can be organized as a distributed software environment that imitates the original source. Network technology, however, introduces its own unique features into such an implementation: the possibility of several users working simultaneously, and a real time scale. A MUD is essentially a virtual second life in a fictional environment. It can be described in terms of roles, in which case each character acts according to the logic of that character, and behind each one is a real person who may be many thousands of kilometers away from the other participants in the action. Until recently MUDs, like the entire Internet, were the domain of amateurs, but high-quality implementation of graphics and video requires professionals and large financial investments. As the number of network users increases, the industry of virtual environments and their commercial use will develop. Psychologists already foresee new problems that this technology will cause: new types of mental disorders, manias and so on. In their opinion, the main problem is that society is technically ready for the introduction of virtual environments, but legally and morally it is not.

    The Internet has already become a serious factor in the life of developed countries. In 1995 the US President announced the creation of the information highway as a national program, and in 1994 the European Community considered a program for the informatization of Europe. In Russia, successes in the use of computer networks are more modest, but nevertheless about three large projects for connecting education and science to the Internet are being developed in the country. Apparently the Internet will become one of the main means of international communication, and the development of publishing in this direction is entirely justified and timely.

    Internet information resources in Russia

    Today the Internet can be characterized as:

    a mass medium,

    an information and training structure,

    a vast world reference book,

    an array of entertainment,

    a bulletin board.

    Internet resources can be classified according to many parameters. Above is a classification by type of information presentation, and now let’s look at examples of sites dedicated to various thematic areas.

    Government Pages

    There are official pages of government bodies on the Internet. The “Official Russia” website http://www.gov.ru publishes information from the Government, the Security Council and the Federal Assembly, as well as various legislative acts, and hosts the official page of the President of Russia.

    Industry information

    Almost every field of activity is represented on the Internet. Truly, the lion's share of all educational resources is located on the websites of universities and other educational institutions of the country: general background information for applicants, descriptions of the activities of various faculties and departments, abstracts and student academic works, and reports on current scientific developments. In addition, universities in large cities often post materials about the life of their region on their websites.

    More and more universities are organizing distance learning. Students listen to lectures via the Internet, and receive assignments and send in the results of their work through the same computer channels. They take tests and exams not with teachers but with a computer, which evaluates their knowledge objectively. There is as yet no consensus on the quality of this teaching method, and many problems remain, including technical ones; optimists, however, believe that mass distance learning is a matter of the near future.

    Industrial enterprises provide information about their capacities, goods and services on the Internet. Indicative in this case is the page of the world-famous confectionery factory “Red October”, which is located at http://www.redoct.msk.ru. It must be said that not all plants and factories have their own servers on the Internet. More often their activities are described on city pages.

    Medical institutions publish information about their work, opportunities, and successes. Pharmacies and drug manufacturing companies advertise new drugs and provide the opportunity to order their products. There is also a Russian medical server http://www.rusmedserv.com/, the creators of which have collected a large number of links to medical resources on the Russian Internet.

    One cannot ignore the electronic pages of various cultural centers, museums, theaters and philharmonic societies. Not all such organizations have the opportunity to post their materials on the Internet; nevertheless, the cultural life and rich cultural heritage of Russia are reflected on the pages of the Hermitage http://www.hermitage.ru/ and the Bolshoi Theater http://www.bolshoi.ru/, of the Moscow State Academic Philharmonic http://www.philharmonia.ru/ and of the pages dedicated to the 125th anniversary of F.I. Shaliapin http://www.sitek.ru/~shaliapin/.

    There are a large number of Russian library resources. Such giants as the following have rich, colorful pages on the Internet:

    State Public Scientific Technical Library http://www.gpntb.ru/,

    Russian National Library http://www.nlr.ru/,

    Russian State Library http://www.rsl.ru/,

    the State Public Historical Library http://www.shpl.ru/.

    Moreover, the State Public Scientific Technical Library and the Russian State Library have provided Internet access to their electronic catalogs.

    These catalogs are open 24 hours a day to all visitors to the libraries' electronic space and can be used free of charge.

    Commercial information

    Commercial enterprises and organizations have launched advertising activities on the Internet. Finding business partners, presenting their products, and recruiting qualified personnel: these are the goals behind placing tens of thousands of electronic pages in the global computer space.

    In this section, special mention should be made of book publishing companies that offer their services to all interested parties, including libraries. One example is the website of the publishing house “Radio and Communication” http://www.trade.com.ru/books/: on its very first page the publisher not only lists its new titles but also invites visitors to place an order for the selected books right away.

    The site “Books of Russia” http://books.ru/ is enormous in scope: a huge number of topics, a complete price list, a real electronic store.

    Press on the Internet

    Most central newspapers and magazines have their own electronic versions, but not all of them are provided free of charge. More often the publisher first requires payment for an electronic subscription and then sends the subscriber a password for access to its website. Consider the electronic pages of the newspaper “Izvestia” http://www.izvestia.ru. Here you can read TV listings, sports news and weather information for free. Clicking on the “Izvestia” link gives the user the opportunity to look at the announcement and the contents of the issue, but an attempt to read an article or obtain the entire issue will not succeed: the computer will ask for a name and password. On the other hand, there are completely free periodicals, for example the magazines “Friendship of Peoples”, “New World”, “Star”, “Youth” and others, whose addresses are listed on the “Magazine Hall” website, as well as independent online newspapers that have no paper counterpart: GAZETA.RU and LENTA.RU.

    Each publishing house can independently decide to publish its publication in electronic form. To the existing examples of full-text periodicals we will add the addresses of such magazines as “PC World” http://www.computerworld.ru, “If” http://sf.convex.ru/esli/, and “World of Health” http://www.agama.com/MEDICINE/hw/.

    Entertainment information

    If just a few years ago a home computer was beyond the reach of many, the number of personal computers in private use has now increased significantly. Moreover, with the help of telephone lines and modems (devices that convert telephone signals into computer codes), users can connect to the global information network directly from home. “There is no accounting for taste,” says a well-known proverb, but on the Internet you can find like-minded people, however rare the topic under discussion. Those interested in modern music will certainly find articles about famous and rising “stars” in electronic resources. Popular films, TV series and performances are all reflected in one way or another on Internet sites. There are countless pages dedicated to actors and singers: admirers of their talent create entire computer photo galleries about their idols. The same happens with writers and poets who are popular among young people: correspondence clubs for lovers of a particular author's work are organized, and enthusiasts collect information and post biographies, photographs, articles and interviews about the writers who interest them on websites.

    The number of sites dedicated to modern science fiction writers, creators of detective and lyrical novels is especially large. Amateurs and enthusiasts are successfully “building” huge libraries, where you can find the complete works of all famous science fiction writers in the world in Russian and English, detective novels for every taste, works of classical prose and poetry.

    Personal pages

    There is the concept of “home” or “personal” user pages. Extensive experience of working with the Internet shows that few private resources are properly prepared, designed and maintained, so I will only note their existence without characterizing this area of the network.

    Availability of information on the Internet

    So far we have been talking about publicly accessible Internet pages. Of course, it is wonderful to be able to study such volumes of information for free; however, there is a considerable amount of data that is closed to the ordinary user. One example of password-only information has already been discussed in connection with the press: the authors of the Izvestia website offered part of their data for free viewing as advertising and the main part only to subscribers of the electronic newspaper. Other organizations do the same: so that a potential client can become familiar with the content of a page, he is given the opportunity to view an advertising version of a newspaper or magazine, a demonstration search in the case of an electronic catalog, and so on. The product itself, or access to the databases, is then sold like any other market item.


    Conclusion

    To summarize, it is worth emphasizing that Internet technologies for using information resources are advancing by leaps and bounds, which greatly facilitates the search for and collection of information on a required topic. At the same time there are shortcomings that will be corrected over time. They include the cluttering of the network with useless information, which often interferes with the search for particular materials; the lack of a unified system that organizes information and access to it is also a significant obstacle.


    List of used literature

    1. Bagrin Yu. Internet as a new marketing channel // Marketing and advertising. - 1999. - No. 11.

    2. Bokarev T.A. Ways to promote a company on the Internet // Marketing and marketing research in Russia. - 1999. - No. 4.

    3. Information resources of Russia - M.: STC Informregister, 2001.

    4. Popov I. I., Maksimov N. V., Khramtsov P. B. Introduction to network information resources and technologies: Textbook - M.: Russian State University for the Humanities, 2001.

    5. Shvartsman M.E. On the issue of cataloging Internet resources // World of bibliography. - 1998. - No. 5.

    Internet information resources

    P. Khramtsov, RRC "Kurchatov Institute"

    Abstract

    This report describes the main characteristics of information technologies on the Internet and gives a short list of the information resources available through these technologies, together with a classification of the technologies by type of communications, access speed, volume of traffic and share of the total traffic on the Internet. The main types of information resources published on the Internet are presented.


    Internet in Russia

    The Internet has not bypassed Russia either. There are roughly a dozen networks in the country that provide the ability to connect to the Internet. The standard services these networks provide are access to Internet resources via e-mail, connection in remote terminal mode to a computer attached to the Network, and full IP connection, which means obtaining your own Internet address and the ability to work using the TCP/IP protocols. Access to Network services via e-mail consists of receiving and transmitting, as a rule, small text messages. Although messages are delivered quite quickly (from a few seconds to several hours anywhere in the world), in practice users typically check their mailbox once a day. This service is good for personal correspondence and for distributing various kinds of information by subscription; typically, mail is used for advertising or distributing newsletters. Accessing the Network in remote terminal mode offers many more options: the user works on computers remote from him, and therefore with their information resources, just as if he were sitting at the console of that computer. Any information posted on the Internet is available to the user of a remote terminal, but he cannot view graphic images. Full IP connection provides the maximum service for the Internet user: in this mode all the power of modern computer technology is available. Only a full IP connection allows you to view good-quality graphics, listen to audio recordings and display video. Most modern Internet publications are oriented toward full IP connectivity.
    The most well-known commercial organizations providing Internet access services are JSC RELCOM, DEMOS, RO Sprint and Sovam-Teleport; among non-commercial networks one should mention Radio-MSU, RUNNET and FreeNet. In addition to networks based on Internet technology, there are networks that use other data exchange technologies, the most popular of which is X.25. The RO Sprint and Sovam-Teleport networks are in fact X.25 networks, but to communicate with the Internet they use a special mechanism for exchanging data with Internet networks called encapsulation. As a large abstract archive, the Russian Internet hosts the databases of the All-Russian Institute of Scientific and Technical Information (VINITI), which can be reached at the World Wide Web address http://www.viniti.msk.su/. Sovam-Teleport has recently started a large project for publishing information on the Internet called “Russia On-Line” (http://win.online.msk.ru/). This project involves the creation of a commercial information service similar to systems such as Yahoo (http://www.yahoo.com/) or Lycos (http://www.lycos.com/), which implement the concept of a virtual library in which materials are arranged by topic; the list of topics is not standardized and reflects the preferences of their creators. There are currently about 4,000 Internet nodes in Russia. The country ranks 34th in the number of gateways connected to the Network, and in the growth rate of the number of connections (142% per year) it is on a par with countries such as New Zealand (157%) and Belgium (147%). In total, by mid-1995 there were about 50 million users on the Internet.

    Internet information technologies

    Such a large number of users makes the Internet very attractive as a means of disseminating information. Currently the Internet is widely used as a means of commercial advertising, a huge abstract directory, a world library, a world reference service, a means of individual and group information exchange, a means of holding conferences, a world archive of audio and video information, and so on.
    The first text document specifically intended for distribution over the Internet, then the ARPANET, was a document called “Host Software”, written by Steve Crocker, one of the participants in the ARPANET project. It was prepared and distributed online in 1969. Crocker's work marked the beginning of the library of Internet standards, the famous Request for Comments (RFC) series. As the name, literally “materials for comments”, suggests, these documents were intended for discussion by the network community, and the result of such a discussion was to become a standard for software or a norm of behavior on the network. Each participant in the discussion could add comments to the document and then send it on to the other participants.
    The second important milestone in the dissemination of information on the Internet was the birth of the Usenet newsgroup system. The emergence of the first news exchange system was not, in fact, directly related to the Network. In 1979, two graduate students at Duke University, Tom Truscott and Jim Ellis, created electronic bulletin board software based on UUCP, the message exchange protocol between computers running the Unix operating system, and used it to connect two computers. In 1981 the system was reimplemented in the C programming language at Berkeley by Mark Horton and Matt Glickman, and a freely distributed version of the program was released in 1982. In 1984 and 1986 the system was rewritten taking into account the experience of disseminating information on the ARPANET, and in 1986 a special form of information exchange was developed for the news system, the Network News Transfer Protocol (NNTP), which was recorded as Internet standard number 977 (RFC 977). From that moment on, the Usenet newsgroup system became one of the standard information resources of the Network.
    Usenet is a huge electronic bulletin board divided into sections according to the interests of its users. Each newsgroup has its own name, and the naming system has a hierarchical structure. For example, a newsgroup describing Internet information technologies is called comp.infosystems, which means the “computers” group and the “information systems” subgroup. In turn, each subgroup can be divided into new subgroups; thus the group comp.infosystems.www has more than 10 subgroups (comp.infosystems.www.misc, comp.infosystems.www.users, comp.infosystems.www.providers, etc.).
    At its birth, Usenet was conceived as a means of exchanging opinions between work colleagues; later it came to be used for communication between groups of Internet users united by a common interest, whether users of a particular software product, say the Oracle DBMS, or fans of Tolkien's fiction. Very soon, however, advertising agencies also realized the benefits of Usenet, and currently more than 35% of all Usenet messages are advertisements specially prepared for distribution on the Internet. An example of this kind is the commercial newsgroup Relis of the RELCOM joint-stock company. For a long time news was distributed in the form of simple text files, but recently, after the advent of special standards for marking up text information (the Hypertext Markup Language and the Standard Generalized Markup Language), news-reading software has begun to let users view not only text but also graphic images and short films, and to listen to audio announcements.
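
    As a hedged illustration of how a program reads such a group over NNTP, here is a short sketch using Python's nntplib module (part of the standard library up to Python 3.12); the server name is a placeholder, not a real news host.

        # Read a Usenet group over NNTP (RFC 977 and its successors).
        # "news.example.com" is a placeholder server name.

        from nntplib import NNTP

        with NNTP("news.example.com") as server:
            resp, count, first, last, name = server.group("comp.infosystems.www.misc")
            print(name, "has", count, "articles, numbers", first, "to", last)

            # Fetch overview data for the last few articles and print their subjects.
            resp, overviews = server.over((last - 4, last))
            for number, fields in overviews:
                print(number, fields.get("subject", ""))
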
    Another important means of exchanging information on the Internet is e-mail, which appeared at around the same time as Usenet; for a long time Usenet even used e-mail channels to transmit news. The main task of e-mail is to ensure reliable and fast delivery of correspondence from one addressee to another. The pioneer of the introduction of e-mail systems in the USSR was the RELCOM network. As with regular mail, it is also possible to distribute periodicals by e-mail; this method of publishing on the Internet is called mailing lists. Mailing lists (Listserv) first appeared on the BITNET computer network and gained worldwide recognition. As a rule, each list is maintained by a special administrator or administration group. An example of such a list is the software catalog of Cognitive, a well-known manufacturer of image recognition and text digitization systems.
    If you have a telephone line at your disposal that works well enough, i.e. can keep subscribers connected for a quarter of an hour, then the advantages of e-mail over regular mail are obvious. The first advantage is the speed of delivery: if subscribers are connected to the global Internet directly through an IP connection, they will receive a message in a few seconds; if a subscriber has a dial-up connection, delivery speed is limited by how often he connects, which he determines himself. In any case the user will certainly receive the message within a day. Considering that even within Moscow ordinary mail does not travel faster than a day, and from the Moscow region to Moscow it takes almost a week, e-mail looks like a supersonic airliner next to a turtle, to say nothing of the speed of delivery of regular mail from the near and far abroad, which sometimes simply never arrives.
    For a long time e-mail was thought suitable only for sending text messages, so it was not considered a serious competitor to regular mail for the distribution of subscription publications. This opinion rested on the fact that transmitting large files over a dial-up line is quite a tedious task: as a rule, our telephone network does not allow a file of one megabyte to be transferred over the line in a single session. However, the situation is gradually changing, and it is becoming possible to transmit graphics, video and other non-text information by mail. A special format, MIME, has even been developed for messages of this kind; in European countries and the United States it has become normal practice to send e-mail containing embedded graphics or audio information. In addition, the international standardization organization ISO has developed another standard for transmitting messages with non-text information, X.400. An important feature of these standards is the transmission of specially marked-up text, which makes it possible to control the text style and fonts when messages are displayed on screen or printed.
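
    As a minimal sketch of the idea, the following Python fragment builds a MIME message that carries both text and an embedded image using the standard email package; the addresses and the image file name are placeholders invented for illustration.

        # Compose a MIME message carrying text plus an image attachment.

        from email.message import EmailMessage

        msg = EmailMessage()
        msg["Subject"] = "Newsletter with an illustration"
        msg["From"] = "editor@example.org"
        msg["To"] = "subscriber@example.org"
        msg.set_content("The plain-text body of the newsletter.")

        # MIME lets the same message carry non-text parts alongside the text.
        with open("figure.png", "rb") as fp:
            msg.add_attachment(fp.read(), maintype="image", subtype="png",
                               filename="figure.png")

        print(msg.as_string()[:400])  # the beginning of the encoded message
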
    Another big advantage of e-mail is its low cost compared with regular mail, which becomes especially clear when sending and receiving international correspondence. The difference in tariffs reaches one or two orders of magnitude, which is understandable: there is no need to burn kerosene and fly an airplane or sail an ocean liner to the other side of the world. The same effect is seen in comparison with such means of communication as the telephone or fax. In order not to lose the graphic quality of fax communication, many mail networks use special programs, fax servers: instead of making an international telephone call to transmit a fax, the user sends the message by e-mail to a fax server located near the place where the fax is to be received. It is quite obvious that such a transmission costs much less than direct delivery.
    For a long time before the advent of computer networks, information was stored electronically in local information systems, as a rule either library catalogs or abstract journals. To gain access to these systems it was necessary to install software for working with the databases and then periodically purchase additional magnetic tapes of documents to update the information arrays. The most famous resources of this kind in Russia were the abstract databases of VINITI, which stored copies of abstract scientific and technical journals; in the world the best-known systems of this kind were STN, Dialog, the Hardfield Citation Index and others. With the advent of network technologies it was hard, on the one hand, to abandon such a rich heritage, but extremely difficult, on the other, to convert this entire volume of material into the formats accepted for network information exchange. A solution was found in accessing these information systems in remote terminal mode. The remote terminal software itself was developed for other purposes (the ARPANET was, after all, a defense project), but it turned out to be extremely convenient for remote access to old information systems. Through a remote terminal on the Internet, the catalogs of many US libraries, dictionaries, newspapers, magazines and other information are available. The most complete collection of addresses at which you can log in in remote terminal mode is the Hytelnet database, versions of which exist for computers of all types and most common operating systems.
    After e-mail, let us turn our attention to the undisputed leader among electronic publishing technologies on the Internet: Internet file archives, also called FTP archives after their information exchange mechanism, the File Transfer Protocol. FTP archives were originally created for the exchange and storage of network standards (RFC documents) and software, but over time they turned into huge multidisciplinary data warehouses. Of all the wealth of information contained in FTP archives, the most interesting from the point of view of electronic publishing are the special projects aimed at creating electronic copies of famous books. The best known of these are Project Gutenberg and Project Runeberg: the first is intended to create an archive of electronic copies of books in English, the second an archive of books in the Scandinavian languages. There are currently 351 complete texts in the Gutenberg archive, among them Charles Dickens, The Chimes; the Holy Bible; Henry Longfellow, The Song of Hiawatha; Arthur Conan Doyle, The Adventures of Sherlock Holmes; Mark Twain, The Adventures of Tom Sawyer; and Lewis Carroll, Alice's Adventures in Wonderland. Curiously, an archive of Slavic books called “Nestor” is being created in the USA, but there the work is only at the very beginning.
    Creating electronic copies of books involves a number of problems, the most serious of which at present is copyright. The projects mentioned above aim to create public archives, i.e. archives that are free and accessible to all network users, which means that the texts of the books must fall into the category of freely distributable information or, as it is called in the West, the public domain. Project Gutenberg resolved this problem according to US law, under which texts published before 1917 currently fall into this category. In the Runeberg project, unexpected problems arose in 1995 when the Swedish parliament took an interest in it, and the project may now be on the verge of closure, since publication of materials is permitted only until January 1, 1996. The project's management has prepared an appeal to the country's parliament asking it to consider allowing the work to continue; in the meantime, the project is recruiting volunteers to speed up the entry of information.
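
    For illustration, here is a hedged sketch of how a text might be retrieved from such an archive with Python's standard ftplib; the host and file names are placeholders and do not point at the real Project Gutenberg archive.

        # Fetch a text file from an FTP archive (placeholder host and paths).

        from ftplib import FTP

        with FTP("ftp.example.org") as ftp:
            ftp.login()                      # anonymous login
            ftp.cwd("/pub/etext")
            with open("alice.txt", "wb") as out:
                ftp.retrbinary("RETR alice.txt", out.write)

        print("saved alice.txt")
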
    In the domestic sector of the Internet things have not yet reached such large-scale efforts, although there is progress in this direction. The most interesting project of this kind is LIBWEB, funded by the Russian Foundation for Basic Research. Its goal is to create a distributed electronic catalog of the country's leading scientific libraries, access to which will be provided via the Russian part of the Internet. Although the main technology supporting this project is the World Wide Web, access via FTP will also be possible.
    In addition to such fundamental works devoted to creating the electronic heritage of mankind, there are less ambitious databases of publications from individual organizations. Such systems include the legal libraries of the US Library of Congress, the preprint libraries of universities and research centers, and the catalogs of publishing houses and university libraries. As a rule, the literature in these repositories is stored as PostScript or TeX files. PostScript is the de facto standard for document printing; the format was developed by Adobe for printing on laser printers and is now widely used in desktop publishing systems. For any type of printing device you can purchase the cartridge needed for printing PostScript files, and a widely used program on the Internet for converting PostScript files into printer formats is Ghostscript. Another standard promoted by Adobe is the Portable Document Format (PDF), designed specifically for exchanging information on electronic media. To view documents in this format, you can obtain the company's freely distributed Acrobat Reader program, implementations of which exist for all common computer architectures, from ordinary personal computers to high-performance workstations.
    The TeX format is widely used in the scientific community, and many scientific journals accept authors' manuscripts in it. TeX files, and files in its derivative LaTeX, are ordinary text files with document markup commands embedded in them. TeX was created by Donald Knuth and adopted by the American Mathematical Society to standardize mathematical publications; the format has very powerful means of describing mathematical formulas and tables and also allows embedded graphics. A typical example of an FTP archive of this sort is the preprint archive of the European Organization for Nuclear Research (CERN), apparently one of the first archives of its kind.
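
    Purely as an illustration of what such a file looks like, here is a small, generic TeX/LaTeX fragment (not taken from any particular archive): an ordinary text file with typesetting commands embedded in it, including a displayed mathematical formula.

        % A generic illustration of TeX/LaTeX markup with a displayed formula.
        \documentclass{article}
        \begin{document}
        The Fourier transform of $f$ is defined by
        \[
          \hat{f}(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt .
        \]
        \end{document}
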
    From FTP archives, which are ordinary hierarchical information repositories, we will move on to a more modern technology for publishing information on the Web - the distributed Internet information and reference system Gopher. Until 1995, Gopher was the most dynamic information technology on the Web: the growth rate of the number of Gopher servers, i.e. programs that manage Gopher databases, outpaced the growth of servers for all other Internet technologies. Gopher was developed as a campus information system. Here, apparently, it should be clarified that US university campuses differ from our complexes of higher educational institutions in that the academic and residential buildings of the university, as a rule, form a single complex located at some distance from large cities; this complex is called a university campus. It is for the information support of such a campus that Gopher was invented. The word "gopher" itself names a burrowing rodent, but in this case a different subtext is meant: the state of Minnesota is known as the Gopher State, and its inhabitants are also called "gophers". The Gopher information system was developed in 1989 at the University of Minnesota and, not without a certain amount of humor, is named after its state. The core of the system is the idea of presenting all information in the form of a hierarchical tree. The Gopher developers believed that this form was very clear to users, because they deal with hierarchical library catalogs and hierarchical file system structures every day. In addition, the Gopher tree perfectly matched the hierarchy of university departments and the university's organizational structure. In 1990-1992, Gopher became widespread in other universities in the USA and Europe. Since 1992, Internet specialists and Gopher enthusiasts began to speak of the Gopher space (GopherSpace) formed on the Internet by Gopher servers. In 1993, the universal Gopher search engine Veronica appeared; it allows you to scan Gopher space as a simple text database using keyword-based queries. By this time, commercial organizations had begun to turn their attention to Gopher. In particular, the publishing house O'Reilly set up its own Gopher server to publish its catalogue, its magazine and a number of other materials on the network. The beauty of Gopher was that text files stored in FTP archives could be included as elements of a Gopher directory without any changes. In addition to texts, Gopher allows you to provide access to graphic information, audio recordings and even search queries, which are also elements of the Gopher database tree; however, to view such documents you must have special viewing programs that are not part of Gopher clients. By the time Gopher appeared on the Internet, new projects for storing literary works had begun to appear, developing the concept of full-text archives. One of them is the World Children's Library project. Currently, one of the Gopher servers stores books by 16 children's authors, including Edgar Allan Poe, Lewis Carroll, Frank Baum, Robert Louis Stevenson, Mark Twain and Daniel Defoe. However, the appearance of Gopher did not add visual appeal to e-books: these were still regular text files, and no special mechanisms for describing information or controlling its display were developed within Gopher.
To present text pages with graphics embedded in them, it was necessary to make graphic copies of these materials. Such copies take up too much space, which is why the archives listed above contain only text information. True, publishers began to think about a unified information storage format that would allow the same copy of a document to be used both for regular publication and for electronic publication. By this time, a special standard document markup language had already been developed - the Standard Generalized Markup Language (SGML), which has recently become increasingly popular as a standard for storing and presenting information. This language is used by such electronic publishing pioneers as O'Reilly Inc. and Chicago Business School Publishing. True, this is happening in the context of the development of another Internet information technology - the World Wide Web.

    Internet information resources

    It should be noted, first of all, that World Wide Web pages were created by almost everyone who had been involved in publishing materials on the Internet before the advent of the Web. The Gutenberg and Runeberg projects, the O'Reilly publishing house and a host of other organizations mentioned above all took part in the process of creating the World Wide Web. Currently, all electronic publications on the Web can be divided into several categories: firstly, advertising pages; secondly, electronic periodicals; and thirdly, serious electronic monographs and reference books. Let us dwell in detail on each of these types of publications.
    Advertising materials are published most widely on the Internet; Web pages with various types of advertising can be found most often. Typically, advertising is hosted on the companies' own servers. Almost all companies connected in one way or another with the computer business have their own pages. If you have the desire and time, you can get acquainted with press releases from Microsoft, IBM, DEC, Apple, Symantec and others without reading computer magazines. The companies themselves publish all sorts of information about themselves, ranging from the history of their creation and their founding fathers to the characteristics of their latest products and price lists. Here you can also find reports given by company representatives at various conferences and exhibitions, as well as their interviews with specialized publications. All these materials are richly illustrated with graphics and photographs. Often you can even listen to greetings or addresses from company executives to Web users - if the capabilities of your computer and its software allow it, of course. The quality of these advertising materials is often not inferior to their printed counterparts.
    Catalogs occupy a special place among Web materials. If we are talking about commercial information or commercial products, they can also be classified as advertising; but in addition to the commercial use of the Web, which, according to some estimates, accounts for 30-40% of all Web pages, there is also non-commercial service to network users, carried out either at public expense or thanks to the enthusiasm of individuals, of whom there are surprisingly many in this world of profit. Web catalogs contain a great deal: lists of famous film actors, tables of contents of CDs, hit-parade charts and much, much more. All of this is illustrated, for example, with photographs of film stars or recordings of fragments of musical compositions. There are reference books on ancient art, reference books on the history of aviation and the navy, and catalogs of science fiction. In a word, there is something to read, watch and listen to on the Internet.
    Illustrated magazines were the first of the printed periodicals to appear on the World Wide Web. There are apparently several reasons for this activity. Firstly, there is the age and gender of Internet users. According to numerous surveys conducted among people using the network in 1995, more than half (69%) of those surveyed were aged 25 to 45. The largest group of users are 30-40-year-old men; women make up no more than 20% of all Internet users. Secondly, there is the professional composition of users and their education: up to 70% are engaged in business or computer technology, and more than half have a higher education. The stereotype that network technologies are for the young is not entirely accurate - students and young people under 30 make up no more than one fifth of all network users. Entertainment magazines are distributed in roughly the same proportion. The bulk are made up of publications such as Fortune, Time, Playboy, Penthouse, UFO Library, etc. Recently, professionals from Playboy began to apply the principles of publishing hypertext materials on the network in the full sense of the word. An example of such material is an "Amusement Park" in which the user has the opportunity to "walk around", look at and talk with the inhabitants of this world, much as in a walk-and-shoot computer game. It should also be noted that at present electronic copies of printed publications are rather a digest of the printed publication. Companies producing printed materials are, on the one hand, interested in attracting new readers and, on the other hand, must not lose the traditional clients of their printed publications. To solve this problem, a combined approach is used. Firstly, a network user can subscribe to a publication without leaving his computer, transferring the required amount of money to the account of the publisher or an intermediary company, or debiting the corresponding amount from a credit card; operations with cards can also be performed from the user's computer. Secondly, one can subscribe to the electronic version of the publication. In this case, after transferring the money, the user is registered in the publication's information service, i.e. assigned a specific name and password, with which he can enter protected sections of the publication's database. In connection with this latter opportunity and the expansion of means of protecting data exchanged over the network, the number of commercial Internet information resources closed to public use has begun to increase.
    Electronic newspapers are another type of periodical that has begun to master the new Internet technology - the World Wide Web. Electronic newspapers appeared somewhat later than magazines, with a delay of about six months; the main obstacle for the newspaper business is the difficulty of preparing publications for the Web. The first publications of this kind were Computer World, PC Magazine Daily and other computer publications. The presentation of materials in these newspapers differs from similar materials in the printed original. Firstly, the page size of an electronic publication is limited by the size of the computer display screen. Secondly, when laying out materials one should take into account the capabilities of modern programs for controlling the display of information, such as the simultaneous opening of several windows in which different pages of text, or a picture explaining the text, can be placed. This allows the information on the display screen to be "condensed" - something that simply cannot be done in a regular print publication. Authors of electronic publications also take into account the possibility of presenting material as audio, which is extremely useful when publishing interviews. In addition, reviews often publish not extensive text but a set of graphic materials - histograms and pie charts accompanied by a text commentary. It is quite obvious that this gives rise to a completely new type of publication, very different in form and design from the traditional printed original.
    The most fundamental electronic publications on the Web are analogues of large printed publications, such as the Bible or various types of encyclopedias, implemented in World Wide Web technology. It should be noted right away that in this case we are not talking about simple copying of texts. Materials of this kind are immediately implemented taking into account the limitations and new possibilities of computer technology. An example of the hypertext organization of the Books of the New Testament is given in the previous section, so we will not return to it, but will consider one of the largest projects of this kind, “Britannica Online”. This project is associated with the organization on the Internet of one of the oldest encyclopedias in the world - the Encyclopedia Britannica. The electronic version of Britannica is a rather complex conglomerate of information technology. The main thing in this information database is the presentation of encyclopedia articles in the form of hypertext pages in the format of hypertext markup language - HTML, which is used to publish materials on the World Wide Web. Each article has connections with the next article in order in the text version of the encyclopedia and the previous article. In addition, each article is linked to other articles by hypertext links, which are used in the text of the article, and a database of personalities, countries and graphic illustrations. In addition, to navigate the encyclopedia, there is a table of contents, an index, and a mechanism for searching articles by keywords. The table of contents and index are lists of hypertext links. Moreover, if in the table of contents the links point to encyclopedia articles, then in the index the links point to lists of articles, which in turn are also lists of hypertext links. The encyclopedia search engine is of particular interest. It is based on the technology of the distributed Internet information retrieval system - WAIS (Wide Area Information System). We will talk about this system itself a little later. Here we draw the reader's attention to the fact that the result of WAIS is a list of hypertext links to encyclopedia articles that the system generates on the fly. This list is no different in appearance from the index list, for example, but from the point of view of the system, these are completely different objects. The index list actually exists as a World Wide Web database file, and the lists that WAIS generates do not really exist, but are temporary entities generated by user requests. Registered users can save the lists obtained in this way for later use when working with the system.
    Here we come close to the concepts of “registered user” and “access mode” to the Britannica Online database. This project is not a charity and you have to pay money for access to the encyclopedia resources. For a user using the database for his “home” purposes, the fee is set at $150 per year, for university students - $120 per year, for businessmen - $300 per year. Subscribers can be not only individuals, but also organizations; here the fee is set depending on the scale of the organization and the type of its statutory activity. In each specific case, the amount of the contribution is determined individually. However, not everything is so tough. In order to get acquainted with the encyclopedia, it is not at all necessary to pay the fees listed above. There are two ways to decide whether you need the Encyclopedia Britannica in your work or not. The first way is to demonstrate its capabilities. Here the user is asked to walk along a pre-planned route and understand how the system works. The second way is much more interesting - a seven-day free use of the encyclopedia. In this case, the reader must register, receive a name and password, and after that he can work in the system.
    In addition to Britannica, there are other encyclopedias on the Internet, for example, the Science Fiction Encyclopedia or the Wine Encyclopedia. At the same time, access to them is completely free, and you don’t need to fork out for it. Generally speaking, before subscribing to anything, you should carefully scan the Network for the presence of materials that interest you, dear reader, and only if there is nothing suitable in open public archives, you should subscribe to a commercial information service.
    At this point we come to the problem of searching for information on the Internet. At one time, any organization had information divisions in its structure. In scientific organizations they were called departments of scientific and technical information. The main task of these units was to search for information on the organization’s profile, maintain a thematic library and promptly inform specialists about periodical materials. The idea that you can find everything on the Internet yourself is erroneous. The network is too large for each user to be able to view all available information archives, even if they have the necessary search services. Therefore, one of the forms of activity on the Internet has become the organization of commercial information services. The most popular commercial services of this type are Yahoo and Galaxy.
    Information services are hierarchical lists of information resources, very similar to library subject catalogs. As a rule, in addition to the subject catalog there is a catalog of Internet information resources organized by country, and a special page for launching queries using keywords. Users can use each of these alternatives to their liking. For a first acquaintance with the system they usually use the thematic catalog and, moving through it, look for the necessary World Wide Web pages. Here, however, one should take into account that the creators of the system and the user may have significantly different ideas about the subject area and the terms that characterize it. Adaptation requires a certain amount of time, which at first glance is wasted; however, the same thing happens when working with conventional information retrieval systems or when encountering new journals for the first time. The creators of information catalogs strive for perfection, but so far without great success. There is no single thematic division on the Internet like the Universal Decimal Classification adopted in librarianship, so subject catalogs can differ significantly from one another. The World Wide Web Consortium directory, for example, is very long: its first page takes up several display screens, which is actually not very convenient. The Yahoo and Galaxy directories are divided into many subdirectories, which makes each page more compact and easier to browse. However, this also has its drawbacks: a large number of pages nested within each other can lead to the user losing orientation in the catalog tree.
    National resources also have different ordering systems. In some cases the administrative division of the country is taken as a basis; in others, resources are ordered by the sphere of human activity to which they belong, for example university resources, government resources, commercial information resources, etc. There are also catalogs in which both approaches are combined, and then the database of national information resources contains two or more tables of contents.
    Searching for pages using keywords is an integral part of any information catalog on the Web. Typically, special fields are used for entering keywords, which can be combined into search queries using the connectives "or" and "and". The word "and" indicates that the keywords must occur in the searched document simultaneously, while the word "or" requires that at least one of the specified words appear in the document. The search page can be combined with the first page of the directory, as is done in Yahoo, or it can be an addition to the directory and loaded separately, as is done in Galaxy. When creating a request, you can use a simple form, where you simply enter a list of words, or move on to a more complex one, where you can even specify in which parts of documents the search should or should not be carried out. Search results are displayed in the form of a list of names of found documents, which is itself a Web page. The difference between this page and others is that such a page does not exist in any database on the network: it is created "on the fly" by the search program, but, if the system allows it, it can be saved as one of the user's pages.
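    The effect of these connectives can be shown with a small Python sketch (not the code of any real catalog; the document texts are made up): with "and" a page must contain every keyword, with "or" at least one of them.

        # Hypothetical pages and a two-word query.
        documents = {
            "page1": "catalog of internet information resources",
            "page2": "world wide web search services",
            "page3": "library catalog and search by keywords",
        }

        def matches(text, keywords, mode="and"):
            words = text.lower().split()
            if mode == "and":                                  # every keyword must occur
                return all(k in words for k in keywords)
            return any(k in words for k in keywords)           # "or": at least one keyword

        query = ["catalog", "search"]
        print([name for name, text in documents.items() if matches(text, query, "and")])  # ['page3']
        print([name for name, text in documents.items() if matches(text, query, "or")])   # all three
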
    Concluding the conversation about catalogues, you should pay attention to the catalog of electronic publications, which was actively used by the author to prepare this text. The authors of this catalog looked through the entire Internet and included in their list only those links that are related to the publication of materials on the Internet. The name of this directory is EDOC - Electronic Documents. As the name suggests, this catalog contains not only newspapers, magazines, reference books, but also any types of electronic texts and even non-texts (catalogs of paintings, museums, exhibitions, etc.).
    The last type of electronic documents that I would like to talk about are publications prepared taking into account new computer technologies and specifically for publication on the Internet. It is simply impossible to publish these materials in the form of ordinary books. All these products are usually called virtual literature. Virtual literature can be divided into specialized literature, entertainment literature, and research in the field of virtual publications. Special literature consists of specialized journals, specialized reference books and textbooks. Entertainment literature is illustrated magazines, fiction (poetry, science fiction, fairy tales for adults and children, etc.), and research is publications designed to demonstrate new ways of organizing information and presenting it to Internet users.
    Specialized journals are primarily computer publications, such as "Access", the journal of the US National Center for Supercomputing Applications (NCSA). In this publication each article is presented in the form of hypertext; the body of the article contains embedded tables and graphics and even the results of dynamic modeling in the form of movie clips. As a rule, graphics are prepared separately from the text and displayed at the special request of the user. At one time, very interesting material was presented on modeling the collision of comet Shoemaker-Levy with the planet Jupiter: the report on this experiment was accompanied by short films showing the collision process from various angles. Also interesting was the material from the US National Aeronautics and Space Administration (NASA), which dynamically displayed the surface of the Earth during the flight of one of the American spacecraft in 1995.
    Reference books are, perhaps, the first publications on the Web that were implemented in World Wide Web technology. In 1994, the best special edition of the World Wide Web was recognized to be a reference book on oncology designed for three different types of users: patients, students and specialists. The reference book contained information both about typical oncological diseases and about special cases that a doctor may encounter when treating various cancers.
    Textbooks are the third type of hypermedia publications available on the Internet. None of the types of virtual literature has a deeper justification and more detailed elaboration of implementations than a textbook distributed over the network. In the United States, as part of the work to create an information superhighway, a virtual public university project was launched. To justify the possibility of such an educational institution, a historical analysis of various forms of higher education from antiquity to the present day was carried out. Considering the level of development of modern computer technology, it was recognized that a virtual public university is quite feasible and will provide its students with a sufficient level of professional training. In addition to the functions of a standard college, such a university could easily be entrusted with the tasks of retraining industrial personnel. Naturally, a university cannot do without textbooks; but in a virtual university the teaching aids must also be virtual. For medical faculties there is now no need to go to the anatomy theater and dissect corpses: all this can be done in a virtual reality environment or, at the very least, behind a display screen. Of course, there is an opinion, and quite a well-founded one, that real practice can never be replaced by models, and there are many examples of this; but in this case it seems worth talking about a balanced combination of software and practical exercises. Returning to the virtual university's teaching aids, we can say that historians can now sit at their desks, view archival materials and videos of historical events, and listen to the voices of historical figures; lawyers can participate in imaginary trials; and physicists can conduct experiments on the fission of radioactive materials without endangering those around them. In a word, this is fantasy, but a realistic fantasy whose elements have already been realized.
    When experts began to seriously study the problems of hypermedia and multimedia technologies, which we dubbed hypertext in the first part, the opinion was often expressed that the computer was becoming Gutenberg's new printing press, one that would change our understanding of the perception of information and the forms of its organization. Now the line that separates most citizens of developed countries from this new printing press has practically disappeared. The power of modern personal computers is sufficient to run complex programs for presenting combined information, and modern communication systems are able to provide every telephone subscriber with access to vast network information resources. In anticipation of this breakthrough, many enthusiasts are beginning to create hypertext fiction; writers of poetry and fiction especially stand out in this endeavor. Parallel presentation of material from several persons and presentation of different views on the same event are the norm of modern hypertext literature. Already now one can easily name candidates for hypertext implementation from among ordinary works of art: plays, dialogues, detective stories. Science fiction stands out from this list in that it allows visual effects that are very difficult to describe to be realized. However, there is an opinion that where hypertext appears, literature disappears. If we consider literature as a way of expressing thoughts and the art of describing reality, then this opinion is partly correct.
    The last object that I would like to touch upon when describing the Internet as a virtual publishing house is systems of the MUD (Multi-User Dungeons) type, which can be translated as "multiplayer dungeons" and which grew out of the game "Dungeons and Dragons". If we consider comics to be publications, then "dungeons" are also electronic publications. The fact is that any fairy tale, detective story or chronicle can be organized in the form of a distributed software environment that imitates the original source. Network technology, however, introduces its own unique features into such an implementation - the possibility of simultaneous work by several users and a real time scale. A MUD is essentially a virtual second life in a fictional environment: the plot can be described in terms of roles, and in this case each character begins to act in accordance with the laws of its personality, while being a real person who may be located many thousands of kilometers from the other participants in the action. Until recently, MUDs, like the entire Internet, were the domain of amateurs, but high-quality implementation of graphics and video requires professionals and large financial investments. As the number of network users increases, the industry of virtual environments and their commercial use will apparently develop. Psychologists already foresee new problems that this technology will cause - new types of mental disorders, manias, etc. In their opinion, the main problem is that society is technically ready for the implementation of virtual environments, but is not legally and morally ready.
    But the Internet has already become a serious factor in the life of developed countries. In 1995, the US President announced the creation of an information highway as a national program; in 1994 the European Community considered a program for the informatization of Europe; in Russia, successes in the use of computer networks are more modest, but nevertheless about three large projects for connecting the spheres of education and science to the Internet are being developed in the country. Apparently, the Internet will become one of the main means of international communication, and the development of publishing in this direction is completely justified and timely.

    After studying the material in this chapter, the student should:

    know

    • the concept of hypertext, which underlies the functioning of all major Internet services;
    • features of the main types and genres of information resources on the World Wide Web;
    • the most popular forms and means of supporting modern Internet communication, including the latest Internet services Web 2.0;

    be able to

    • correctly use new types of network information interaction that have become part of our daily life recently;
    • neutralize the possible negative consequences of working in the network information space, including to prevent manifestations of network addiction, which is dangerous for psychological health and interferes with the normal socialization of a person;

    own

    • an idea of the patterns of emergence and logic of development of computer networks;
    • an idea of the process of formation of genre features of the information content of the Internet, conditioned by this logic;
    • principles of effective use of different Internet genres;
    • an understanding of the sources of dangers that arise when working on the Internet;
    • the main principles of network information culture;
    • basic rules of network information security.

    Internet services

    Basic Internet services

    Connecting to the Internet allows the user to work with information resources of various types, of which the most common are the types of information exchange called "Internet services" (i.e., information actions provided by special software that most effectively serve specialized user information requests). The following services are currently available on the Internet:

    • information retrieval based on the domain name system (the DNS service), which ensures the availability of information (see the sketch after this list);
    • e-mail, which serves for the exchange of messages between Internet users;
    • chat (IRC, Internet Relay Chat), designed to support real-time text communication;
    • teleconferences (BBS, Bulletin Board System - an electronic bulletin board), video conferences, newsgroups and webinars, providing the possibility of collective messaging in real time or in delayed-interaction mode;
    • file transfer (FTP, File Transfer Protocol), designed for interaction with file archives in which files of various types are stored and exchanged;
    • remote computer control (Telnet, TERminaL NETwork), designed for communicating with and controlling computers on the Network in text mode;
    • the World Wide Web (WWW, W3), a hypertext hypermedia system designed to integrate various network resources into a single information space;
    • Web 2.0 services, providing each Internet user with the opportunity to work collectively with information resources of any modality;
    • streaming media.
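
    As a rough illustration of the DNS service mentioned in the first item, the following Python sketch asks the system resolver to translate a host name into a numeric IP address; the host name here is only an example.

        import socket

        host = "www.example.com"                 # an example host name
        address = socket.gethostbyname(host)     # DNS translates the name into an IP address
        print(host, "->", address)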

    These services are called standard, since they are provided by computer software based on generally accepted standard technological solutions. In addition to the standardized services, a variety of non-standard software tools constantly appear on the Internet and are offered to the user; they are created by various commercial programming companies and by individual non-profit associations of programmers (most often working in universities). Among such programs, one can note the great success of programs for interpersonal communication, which include various instant messengers such as ICQ, Internet audio and video telephony systems such as Skype, Internet radio and Internet television, and electronic distance learning systems (LMS, Learning Management System) such as Blackboard, WebCT and Moodle. When using non-standardized programs, problems may arise due to their technical incompatibility with other Internet services, which may occur, in particular, when working in different Internet browsers.

    The most popular Internet services

    The scope of the Internet's capabilities is extremely broad - it covers almost all types of information interaction between people, and between a person and any technical device. However, not all the capabilities of the Internet have yet found their application in science and technology, or in people's everyday lives. Both the scientific-technical and the mass development of the Internet continue, and they are proceeding so fast that futurologists do not undertake to predict what other services will develop on the basis of global computer networks. Today, the open encyclopedia Wikipedia lists the following Internet services as the most popular and most in demand among the population:

    • World Wide Web (hypertext information systems):
    • web forums;
    • blogs, Twitter;
    • wiki projects (and, in particular, the Wikipedia encyclopedia itself);
    • online stores;
    • Internet auctions;
    • social media;
    • email and mailing lists;
    • newsgroups (mostly Usenet);
    • file sharing networks;
    • electronic payment systems;
    • Internet radio;
    • Internet television;
    • Internet telephony;
    • messengers;
    • FTP servers;
    • chats (both IRC and computer systems implemented as web chats);
    • search engines;
    • Internet advertising;
    • remote access terminals;
    • remote control of a computer and (or) any technical device;
    • multiplayer games;
    • Web 2.0 services;
    • Internet trading;
    • distance learning on the Internet.

    4.7. Internet information resources.

    A person who has access to the Internet finds himself in a world of practically unlimited information resources. Please note that some resources may be paid. Below is a very brief overview of the main resources on the network.

    1) Email.

    Electronic mail, or e-mail, is one of the ways people communicate. It combines the advantages of ordinary mail, teletype, telegraph and fax; moreover, sending a message by e-mail is cheaper than using any of these means of communication.

    Consider an example e-mail address of the form sas@<domain>.

    In the example under consideration, sas is the subscriber’s identifier, usually composed of the initial letters of his last name, first name, and patronymic. What is to the right of the @ sign is called a domain and uniquely describes the location of the subscriber. @ is a required character in an e-mail address.
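
    As a small illustration of this structure, the following Python sketch splits an address into the subscriber identifier and the domain; the domain used here is made up.

        address = "sas@university.example"        # a made-up example address
        identifier, domain = address.split("@")   # the "@" separates the two parts
        print("subscriber:", identifier)          # sas
        print("location (domain):", domain)       # university.example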

    2) World Wide Web.

    Probably the most interesting, convenient and effective resource, and one that is currently extremely popular, is the hypertext network information system World Wide Web. The World Wide Web, which for brevity is called the Web or WWW, is a hypertext (more precisely, hypermedia) information system containing interconnected documents that can be created in various software environments and located on any computer on the Internet.

    Hypertext can be considered as a text containing links that are associated with the definition, explanation, additions of individual words, phrases, images included in the text in question. The most important property of hypertext is automatic access to information associated with the link specified by the user. Searching for this information and displaying it on the screen is carried out using special programs for working with hypertexts.

    3) Resource address.

    From the point of view of the operating system, each web page is a file located on one of the disk drives of the computer playing the role of a web server. Therefore, in order to access any web page, you need to point in one way or another to the file storing this page.

    http://sunsite.unc.edu/boutell/faq/www_faq.html

    http - protocol

    sunsite.unc.edu – computer domain address

    boutell/faq/www_faq.html - file
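
    Purely as an illustration, a small Python sketch shows the same decomposition performed with the standard urllib.parse module:

        from urllib.parse import urlparse

        parts = urlparse("http://sunsite.unc.edu/boutell/faq/www_faq.html")
        print(parts.scheme)   # protocol: http
        print(parts.netloc)   # computer domain address: sunsite.unc.edu
        print(parts.path)     # path to the file: /boutell/faq/www_faq.html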

    4) Electronic bulletin boards.

    Electronic bulletin boards (the abbreviation BBS, for Bulletin Board System, is often used in the literature) contain announcements that users post for everyone who reads them. Electronic boards are analogous to ordinary bulletin boards placed in publicly accessible, frequently visited places; one can also draw an analogy with announcements printed in newspapers and magazines.

    5) Teleconferences.

    On the basis of e-mail programs, electronic bulletin boards and other special packages, business meetings and scientific conferences are held, in which several people who are at their workplaces in different cities or countries can participate.

    6) Transferring files.

    Messages sent over the network can consist only of ASCII codes. However, by attaching any file to a message, it too can be sent over the network, but only in offline mode. There is another way to transfer arbitrary files between computers on the Internet. This method is based on the FTP protocol (File Transfer Protocol), which involves transferring files in the so-called operational, or online, mode. This means that while a file is being transferred, the sending and receiving computers must be in direct contact with each other (like people talking to each other on the phone).
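
    The following Python sketch, using the standard ftplib module, shows such an online copy of a file from an FTP archive; the server name and file name are made up for the example.

        from ftplib import FTP

        with FTP("ftp.example.org") as ftp:                    # a made-up FTP archive
            ftp.login()                                        # anonymous access
            with open("readme.txt", "wb") as out:
                ftp.retrbinary("RETR readme.txt", out.write)   # copy the remote file locally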

    7) Remote access.

    The FTP protocol is quite powerful, but at the same time a limited means of accessing the resources of "other people's" computers on the network: it provides only copying, that is, sending copies of files from one computer on the network to another. Full access to the resources of computers on the Internet is provided by the telnet protocol (TERminaL NETwork, a remote access protocol). Using this protocol, a user can connect to a computer located on the opposite side of the globe and work with it as with his own personal computer.

    8) Search for servers.

    As noted above, in order to use the ftp or telnet protocols you need to know the domain address of the corresponding server. If such an address is unknown, access to the necessary resource can be significantly more difficult. To make it easier to find the right servers on the Internet, a menu-based system for accessing Internet servers was developed. This system was named GOPHER. The term comes either from the word gopher, the burrowing rodent (Minnesota, the birthplace of this system, is known as the state of "golden gophers"), or from the slang term "go fer" - a person who runs errands.

    9) Databases on the Internet.

    A large number of databases are connected to the Internet, containing a huge amount of information on a wide variety of subjects: from information on specific sciences - biology, mathematics, physics - to collections of anecdotes and fables. As a rule, they are part of the widely used information system WAIS (Wide Area Information System). A computer that has special software and provides users with access to the databases of this system is called a WAIS server.

    WAIS brings together WAIS servers around the world, providing access to more than 1000 public and commercial databases. To access WAIS you need to know the address of a specific WAIS server.

    Introduction
    Networks and protocols
    Information service
    The global network Internet
    Hypertext
    Internet service
    Email - a letter without an envelope
    Teleconferences
    File Transfer Protocol (FTP) - file transfer
    Internet Relay Chat (real-time communication on the Internet)
    The Gopher hypertext system
    Other online information resources
    World Wide Web, or WWW
    The Hyper-G hypertext system
    The WAIS search engine
    "Navigation" on the Internet
    Web modification
    Links and documents
    What is a URL
    "Navigation" on the Web
    Conclusion
    The Internet today and tomorrow
    References

    Introduction

    Our time is a time of cars rushing forward at incredible speed, of disposable diapers and, of course, of limitless expanses of constantly updated information. How do we manage to form an opinion about even that small amount of information that comes to us through newspapers, magazines, books, television and radio broadcasts? How can we generalize and at the same time assimilate all the innovations that constantly appear in the educational and scientific spheres, and how can we avoid getting lost in a sea of names and references to this or that literature? It is impossible to know everything, but knowing where to get information on any issue that interests you - that is almost everything! After all, what matters is not whether you know who Julius Caesar was, but to what extent you can guarantee that this information is available to you, that you can find it and provide it upon request. What will help you in solving this difficult task? I can offer one of the easiest and, at the same time, most competent ways - to use the Internet.
    After all, using the Internet means using hundreds of thousands of printed publications all over the planet, being aware of the latest events and innovations in any field, having complete and reliable information on any issue and the opportunity to consult with knowledgeable people.

    So what is the Internet, what is its use, and why is it so popular? This is what I have to find out in the process of writing this course work. But first, let's talk about the background - the emergence of the Internet as an information network, or a network of networks, as it is commonly called. That is, let's first talk about computer networks.
    What is a computer network? A computer network is an association of autonomous personal computers for sharing computing resources (processor, memory and peripherals - for example, an expensive laser printer). A computer network within a relatively small area is usually called local, while networks covering large areas, and sometimes the entire globe, are called global.
    A local network is usually organized and operated within one company (organization) and connects computers at workplaces for faster and better information exchange. Every organization that operates more than a dozen PCs tries to integrate them into a local network in order to reduce paperwork and increase the efficiency of its departments. Naturally, almost every such network must also have access to external customers.
    A global network is most often an independent (technically and legally) structure, and other companies connect to it to work for a certain fee.
    One of the first computer networks, Arpanet, was created almost 30 years ago by order of the research division of the US Department of Defense.
    At first, this network united several powerful computers in organizations of the military-industrial complex, educational and research centers.
    Arpanet had a fairly large number of terminals, whose users could run their programs on remote computers.
    The operational nature of such activities and the ability to access high-performance processors predetermined the success of computer networks, and they began to spring up like mushrooms. Currently, the number of large (global) computer networks has exceeded 50,000, and the number of small (local) computer networks is in the hundreds of thousands.

    Networks and protocols

    At first, when different networks were based on different computing platforms, i.e. used incompatible hardware and software, and when data transmission standards had not yet been adopted, internetwork communication was difficult. A major shift in overcoming these barriers occurred in 1974, when the TCP/IP protocol (Transmission Control Protocol / Internet Protocol) was developed. However, almost 10 more years passed before this protocol was adopted as mandatory on all host computers of the Arpanet network. By that time, several hundred other networks had joined Arpanet, and the set of computing services began to be gradually replaced by information services. The TCP/IP protocol is not the only standard for data transmission in networks; therefore, intermediary servers called gateways are organized between two networks that use different protocols for internal communication. The situation is reminiscent of changing the wheels of railway carriages on the border of two countries whose railway tracks have different gauges.

    Any protocol for exchanging information between two subscribers is a set of agreements that stipulate various technical characteristics and software tools involved in receiving/transmitting data. These characteristics include data transfer speed, error control and elimination tools, header formats and subscriber addressing methods, methods of compressing and decompressing information, and much more.
    Some of these agreements are provided by the hardware capabilities of modems, others - by service programs.
    Two PCs equipped with modems and the appropriate software can communicate with each other, much like subscribers talking on the phone. The organization of such communication may be preceded by an ordinary telephone conversation between the PC owners, stipulating the communication time, information transfer speed and other protocol parameters. Once a connection has been established between the computers over a telephone channel, files can be transferred just as over a "direct wire". Subscribers of BBS (Bulletin Board System) electronic bulletin boards resort to such communication services. Significant disadvantages of such communication between computers include unexpected busyness of the telephone line, the low bandwidth of the communication channels, high fees for long-distance contacts, the need for manual control of the communication session by the owner of the receiving PC (the client), and the difficulty of assessing in advance the usefulness of information extracted from the BBS server.

    In the early 70s, the US Department of Defense began developing a communications system that was supposed to connect the computers of all the country's missile defense centers. High demands were placed on the reliability of the system: failure of any of its components (that is, computers and communication lines) should not affect the quality and speed of communication between the other participants in the information exchange. This is how the ARPAnet appeared (ARPA stands for Advanced Research Projects Agency, the advanced research bureau of the US Department of Defense, and "net" means network).
    Data transmission on the network was organized on the basis of the Internet Protocol (IP). The IP protocol is a set of rules and a description of how the network operates. This set includes rules for establishing and maintaining communication on the network, rules for handling and processing IP packets, and descriptions of the network packets of the IP family (their structure, etc.).
    The network was conceived and designed so that users were not required to have any information about its specific structure. In order to send a message over the network, the computer must place the data in a kind of "envelope" called, for example, IP, indicate on this "envelope" a specific network address, and transmit the resulting packets to the network.

    For almost a decade, the development of network technologies went unnoticed by the general public: network services were used mainly by programmers and the military. But then, relying on this experience, many organizations began to create computer communication systems between their departments and enterprises located far apart. For example, in the late 80s the US National Science Foundation (NSF) organized 5 computer centers based on supercomputers, and the hundreds of scientific laboratories and universities that needed access to these computers and already had their own local networks had to unite into a single network using the experience and technology of the ARPAnet.

    Information service

    The development of the network services sector has led to the mass emergence of servers focused on providing information on a specific topic. For example, information centers have appeared in large foreign libraries, which have transferred most of their collections into electronic form and continue to promptly add new acquisitions to these archives. This is especially valuable for our readers, because many domestic libraries eke out a miserable existence. But librarianship is just a drop in the information ocean. The list of services provided by Internet servers runs to several thousand items. You can get acquainted with international sources in the book by H. Khan, "Yellow Pages of the Internet". The book by A. Sigalov with the same name, published by the same publishing house, contains about 2000 addresses of information sources in our country.

    Global Internet

    Several tens of thousands of interconnected computer networks, uniting several million users, form the World Wide Web (abbreviated WWW). The organizational development of the Web was facilitated by the emergence of universal network navigators - browsers (from the English browse), such as Netscape Navigator or Microsoft Internet Explorer. A browser is a program that runs on your computer and allows you to work with the Internet. Browsers provide access to any point on the network via a 32-bit IP address, which for convenience is divided into byte components - for example, 192.34.101.23. Since it is inconvenient for a person to use numerical addresses, in navigators the address of an information source is specified by a symbolic URL (Uniform Resource Locator), from which the Internet itself finds the IP address. There are quite a few different URL formats: http://www.sportsnetwork.com ftp://ftp.unt.edu/library gopher://ulkyvm.loisville.edu
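
    A small Python sketch, using the standard socket and struct modules, shows that such an address is simply 32 bits (four bytes) written in dotted decimal form:

        import socket, struct

        packed = socket.inet_aton("192.34.101.23")          # the address as 4 raw bytes
        print(list(packed))                                 # [192, 34, 101, 23]
        number = struct.unpack("!I", packed)[0]             # the same address as one 32-bit integer
        print(number)
        print(socket.inet_ntoa(struct.pack("!I", number)))  # back to the dotted form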

    The beginning of the URL corresponds to the data transfer protocol used. In particular, the abbreviation http comes from HyperText Transfer Protocol - the protocol for transmitting hypertext documents.

    For a detailed acquaintance with Internet services, the specifics of connecting to the network and the technology for finding the necessary information, we can recommend the book by S. Karpenko and I. Shishigina, "Internet in Questions and Answers".

    Hypertext

    Internet users receive information in the form of hypertext, which is the main way of presenting data. The term hypertext, recently used in combination with the adjective multimedia, means a document containing text, audio and visual fragments. A feature of such a document is the presence of highlighted keywords and all kinds of buttons and icons, clicking on which reproduces the corresponding fragments, which may not be part of the document itself but may be located in the memory of another computer. The highlighted fields, whose activation causes the display of further frames, are represented in the hypertext document either by links to the corresponding fragment within the file system of the given computer or by URLs for retrieving the missing components from the network. Hypertext ideas are present in one form or another in various help systems, in particular in the Windows Help system of all versions. To describe hypertext documents on the Internet, a special language is used - HTML, the HyperText Markup Language. Thus, we can say that hypertext is multi-page information of various types, linked across pages by numerous links.
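
    As a small illustration, the following Python sketch feeds a made-up fragment of HTML to the standard html.parser module and prints the addresses hidden behind its hypertext links:

        from html.parser import HTMLParser

        page = '<p>See the <a href="http://www.example.com/next.html">next page</a>.</p>'

        class LinkCollector(HTMLParser):
            def handle_starttag(self, tag, attrs):
                if tag == "a":                        # <a href="..."> marks a hypertext link
                    print(dict(attrs).get("href"))

        LinkCollector().feed(page)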

    Despite the fact that modern navigators "understand" the most common protocols, new visual means are constantly emerging on the networks. Navigators are beginning to swell, yet they cannot keep up with this wave of innovations. Therefore, in 1994 the idea arose of creating an algorithmic language of the interpreted type, in which it would be possible to write a "player" for a document of any format. Thus in May 1995 Sun Microsystems introduced the Java language, interpreters for which were implemented on most computing platforms on the Internet. Internet applications called applets are written in this language and can be executed on any computer equipped with a Java interpreter.

    Currently, the Internet is not a single network - it is in fact a community of networks (which is why the Internet is called the "network of networks"), which now includes more than 2 million computers around the world. And if you are connected to a network that is part of the Internet, then you have access to the resources of any of them.

    Internet service

    The Internet - not to mention the fact that its appearance marked a new era in the development of communication - allows a person to expand his knowledge in any, even the most unimaginable, field of activity or research. And since the development of the Internet, on the one hand, was facilitated by commercial organizations, firms using the Network to exchange business information and publish advertisements, and on the other hand, by students who put a lot of entertainment materials on public display, everyone will find a lot of useful things here - from businessmen to those who like to relax in style.

    The range of services on the Internet has now become simply vast, so let's start with the most famous and accessible.

    Email - a letter without an envelope

    One of the types of information services that computer networks provide is electronic mail (E-mail). In this case, both subscribers - the sender and the recipient - deal with intermediaries (providers) who perform the functions of local post offices. The intermediary's server is constantly turned on and on its hard drive, officially registered subscribers are allocated “mailboxes” for temporary storage of incoming and outgoing correspondence.

    Servers. To ensure the functioning of a local network, a special computer - a server, or several such computers - is often allocated. Server disks contain shared programs, databases, etc. The remaining computers on the local network are usually called workstations. Workstations that only need to process data on the server (for example, entering information into a shared database of orders and sales) often have no hard drives installed, for reasons of cost (or security). In networks consisting of more than 20-25 computers, the presence of a server is mandatory - otherwise, as a rule, network performance will be unsatisfactory. A server is also necessary when working intensively with any database.

    Sometimes servers are assigned a certain specialization (storing data, programs, providing modem and fax communications, printing, etc.). Servers, as a rule, are not used as user workstations. Servers that process valuable data are often located in an isolated room, to which only specially authorized people have access (like a bank vault).

    Comment. Many servers cost significantly more (10-20 or more times) than conventional computers. Not surprising - after all, they are not only very powerful computers with a large amount of RAM and disk memory, but they also provide exceptional reliability, high I/O performance, duplication of devices and stored data, means of monitoring the state of the server, means of ensuring uninterrupted operation during failure of some devices, etc.

    Software. The operating systems Windows for Workgroups, Windows 95 and Windows NT Workstation have built-in capabilities for organizing local networks without a dedicated server (such networks are often called peer-to-peer, since all computers in them have equal rights), so no additional software is required when using these OSs. In local networks with a dedicated server, special operating systems are used on the server - Novell NetWare, Windows NT Server, etc. - providing reliable and efficient processing of many requests from user workstations. The workstations of such a local network can run any operating system, for example DOS or Windows, and driver software must be running on them to provide access to the local network.

    To ensure efficient operation of users on a local network, auxiliary software is used; it is sometimes supplied with the network OS, and sometimes must be purchased separately.

    A local server, as a rule, has fairly high-quality communication channels with one of the network nodes that is part of the Internet. The intermediary charges its subscribers a small monthly fee and issues an additional invoice proportional to the volume of information transferred (different tariffs apply within the country and abroad).

    Simultaneously with registration at the local "post office", the subscriber is given a unique (from the point of view of the local server) email address and specially configured software that automatically establishes contact with the intermediary, retrieves correspondence addressed to the subscriber within a few minutes, and sends previously prepared messages. You can connect with the provider at any time convenient for you, but letters that go unclaimed for a long time are returned by the server to the sender.

    Interacting with a mail service program is very similar to working in a text editor. The email header includes three required sections. Firstly, this is the sender’s address, which is located after the “From:” symbols. This line is automatically included in the letter by the mail program. Secondly, after the keyword “To:” you need to type the recipient’s email address. Some networks use their own formats to represent the address. The most common format for an email address on the Internet involves specifying five components:

    Recipient_Code@PC_Code.Organization_Code.City_Code.Country_Code

    However, this format also allows for deviations. For example, instead of the usual two-character country code, you may see an educational institution code (.edu). Also, in place of the country code you can find codes of affiliation with commercial organizations (.com), American government agencies (.gov), American military organizations (.mil), and other organizations.

    The third required section of the header, located after the keyword "Subject:", gives the subject of the letter. Sometimes it is used as an addition to the recipient's email address: having discovered, for example, the line "Subject: to Sergey" in a letter, the recipient understands that the message placed in his mailbox is meant for a friend named Sergey who does not have his own email address.
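    The same three header sections can be assembled programmatically. The sketch below is a minimal illustration (not the provider software described above; all addresses are invented examples) using Python's standard email library:

        # Minimal sketch: the three required header sections of an e-mail.
        # The addresses are invented placeholders.
        from email.message import EmailMessage

        msg = EmailMessage()
        msg["From"] = "olga@example.ru"        # sender's address; a real mail program fills this in automatically
        msg["To"] = "sergey@example.com"       # recipient's address typed by the sender
        msg["Subject"] = "to Sergey"           # subject line, sometimes used to route mail inside a shared mailbox
        msg.set_content("Hello, Sergey!")

        print(msg)   # prints the headers, a blank line, then the body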

    The letter sent to the provider is immediately processed, and a special router program determines the optimal path for transmitting the message to the next computer on the network. Such internal details as breaking the transmitted information into portions - packets, attaching service information to packets, data compression, monitoring the transmitted portion and resending it if a fatal error is detected, etc. are hidden from the user.
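    These hidden details can be made a little more concrete with a toy sketch. The chunk size and checksum below are purely illustrative assumptions and do not reproduce the real TCP/IP packet format:

        # Conceptual illustration only: split a message into numbered "packets"
        # with a simple checksum - the kind of bookkeeping TCP/IP does invisibly.
        import zlib

        def to_packets(data: bytes, size: int = 64):
            packets = []
            for seq, start in enumerate(range(0, len(data), size)):
                chunk = data[start:start + size]
                packets.append({"seq": seq, "crc": zlib.crc32(chunk), "payload": chunk})
            return packets

        def reassemble(packets):
            # A receiver would request a resend on a checksum mismatch;
            # here we simply raise an error instead.
            ordered = sorted(packets, key=lambda p: p["seq"])
            for p in ordered:
                if zlib.crc32(p["payload"]) != p["crc"]:
                    raise ValueError(f"packet {p['seq']} damaged, resend required")
            return b"".join(p["payload"] for p in ordered)

        message = "Hello, Sereja!".encode("utf-8")
        assert reassemble(to_packets(message)) == message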

    Three protocols are used to transmit emails on the Internet. The oldest of them, SMTP (Simple Mail Transfer Protocol), was developed back in 1982. A couple of years later the Post Office Protocol (POP) appeared. With the advent of email messages containing sounds and images, the MIME protocol (Multipurpose Internet Mail Extensions) arose.
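    As a rough sketch of how the first two protocols are used in practice (the host names, account name and password below are placeholders, not real services; MIME encoding is applied automatically by the email library when non-text content is attached):

        # Sketch: hand a letter to the provider's server over SMTP,
        # then collect waiting mail from the "mailbox" over POP3.
        import smtplib
        import poplib
        from email.message import EmailMessage

        msg = EmailMessage()
        msg["From"] = "olga@example.ru"
        msg["To"] = "sergey@example.com"
        msg["Subject"] = "test"
        msg.set_content("Hello!")

        # Sending: SMTP
        with smtplib.SMTP("smtp.example.ru") as smtp:   # placeholder server
            smtp.send_message(msg)

        # Receiving: POP3 - download message 1 from the mailbox
        pop = poplib.POP3("pop.example.ru")             # placeholder server
        pop.user("olga")
        pop.pass_("secret")
        if len(pop.list()[1]) > 0:
            for line in pop.retr(1)[1]:
                print(line.decode("utf-8", errors="replace"))
        pop.quit()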

    Some networks use software that supports the local language version. Then you can type the text of the letter using, for example, the Russian alphabet. But in most cases you have to limit yourself to the first half of the ASCII set, and then letters addressed to Russian subscribers abroad may look like: “Hello, Sereja! Ja poluchil tuoe poslanie ot 25.03.97...”

    You should pay attention to the efficiency and low cost of e-mail compared to traditional means of communication - telegrams, long-distance telephone calls, regular letters. E-mail reaches a subscriber located anywhere in the world in a matter of minutes. The speed of information transfer between servers reaches 3600 characters per second - the equivalent of roughly two pages of text every second. Unlike regular post offices, e-mail does not lose its letters.

    Teleconferences

    Another type of information network service, called "teleconferences", is reminiscent of a subscription to an electronic newspaper in which information on a certain topic appears - news, notes, answers to questions, responses to previous publications, and so on. The authors of this very diverse and fast-moving information are the network users themselves, united by common interests. Many providers give their subscribers a list of conferences in which they can participate for a reasonable fee. You will then regularly receive emails with article titles on the relevant topics. Titles are accompanied by identification numbers, the length of the article and, sometimes, a brief abstract of 1-2 lines. For an additional fee proportional to the size of the article, you can order the desired publication. You just need to do this promptly, because the server stores the contents of each release for only about 10 days.

    Teleconferences in design and way of working are very similar to e-mail, with the only difference being that your letter can be read by a huge number of people, and you in turn can read what complete strangers are writing. Conferences are divided by topic; a conference title consists of several words separated by periods, each subsequent word narrowing the topic. Here are the standard designations for some Usenet newsgroup hierarchies: comp - conferences where everything related to computers and programming is discussed; news - exchange of news and issues of development of the teleconference system; rec - recreation, hobbies, interests; sci - everything related to science; soc - issues of public life; talk - a group for those who like to argue or just talk about any topic.
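    Reading such a group programmatically can be sketched with Python's nntplib module (part of the standard library in older Python versions; it was removed in Python 3.13). The news server name below is a placeholder for whatever server your provider operates:

        # Sketch: list the headers of the latest articles in a Usenet group over NNTP.
        import nntplib

        with nntplib.NNTP("news.example.ru") as news:   # placeholder news server
            resp, count, first, last, name = news.group("comp.lang.python")
            print(f"{name}: {count} articles ({first}-{last})")
            # Fetch overview data for the last ten articles, like the digest
            # of titles a subscriber receives by e-mail.
            resp, overviews = news.over((max(first, last - 9), last))
            for art_num, over in overviews:
                print(art_num, over.get("subject", ""))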

    When starting to work with any group, you should first of all read its rules, which are regularly posted in the group by a person who has voluntarily assumed the responsibilities of group coordinator (moderator). There are actually two types of newsgroups - moderated and regular. Messages appearing in moderated groups are reviewed by the moderator before being sent out across the network. This is, of course, a kind of censorship, but in a community as huge as Usenet it is impossible to maintain order without such strictness.

    Today, every computer fully connected to the Internet has access to Usenet news, but Usenet news spreads across other networks, being used as widely as e-mail. The way and convenience of working with news depends greatly on how you receive it. On the Internet, your client program directly retrieves news from the Usenet server, and there is no delay between viewing the list of messages contained in a group and reading those messages. If you use news via e-mail, then you first receive a list of articles, and only then receive by e-mail articles from the list that you ordered separately.

    File Transfer Protocol (FTP) – file transfer

    The FTP service is the Internet service that puts the maximum load on communication channels. The abbreviation stands for File Transfer Protocol, but when FTP is considered as an Internet service, it means access to files on remote computers and in file archives. FTP is a standard program that runs over the TCP protocol and is usually supplied with the operating system. Its original purpose is to transfer files between computers on TCP/IP networks: on one computer the server program runs, while on the other the user runs a client program that connects to the server and sends or receives files. It is assumed here that the user is registered on both computers and connects to the server with the name and password valid on that computer. The FTP protocol is, of course, optimized for file transfer.

    This feature was the reason that FTP programs became part of a separate Internet service. The fact is that the FTP server can be configured in such a way that you can connect to it not only under your own name, but also under the code name anonymous. Then not the entire file system of the computer becomes available to you, but a certain set of files on the server that make up the contents of the anonymous FTP server - a public file archive.
    So, if someone wants to provide files with information, programs, etc. for public use, then he just needs to organize an FTP server on his computer connected to the Internet.

    If, for example, you want to present a demo version of your software product to the world, using an FTP server is a good solution to this problem. If, on the other hand, you want to find, say, the latest version of your favorite free software, then you need to look for it on FTP servers.
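    A minimal sketch of anonymous access to such a public archive, using Python's standard ftplib (the server name and file path are invented examples):

        # Sketch: anonymous FTP session - log in, list a public directory, download a file.
        from ftplib import FTP

        with FTP("ftp.example.org") as ftp:      # placeholder archive server
            ftp.login()                          # no name/password: logs in as "anonymous"
            ftp.cwd("/pub/demo")                 # change to the public directory
            print(ftp.nlst())                    # list the files offered by the archive
            with open("readme.txt", "wb") as f:
                ftp.retrbinary("RETR readme.txt", f.write)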

    Despite its popularity, FTP also has many disadvantages.
    FTP client programs may not always be convenient or easy to use.
    It is not always possible to understand what kind of file is in front of you. There is no simple, universal search tool for FTP servers - although a special Archie service exists for this purpose, it is an independent program, not universal and not always applicable. Descriptions of files on a server are given in the format of the server's operating system, and a listing of UNIX files can confuse a DOS user. The problem is that such listings provide unnecessary information, and knowing too much is harmful.

    FTP servers are not centralized, and this brings its own problems. FTP is a direct-access service that requires a full connection to the Internet, but access via email is also possible: most FTP servers can send their files by email, and there are also servers that can email you files from any FTP server. However, this is not always convenient, because such servers are heavily loaded and your request may wait a long time for its turn. In addition, when sending large files, the server divides them into parts of limited size, spread across as many as a hundred separate letters - and if even one part out of a hundred is lost or damaged in transmission, the remaining 99 become useless.

    Internet Relay Chat (IRC)

    (real-time communication on the Internet)

    The name can be translated as "parallel conversations" on the Internet or "switched chatter." Imagine tens of thousands of people who have gathered on the Internet to talk - with friends and with strangers, to discuss particular topics or just to chat - and all of it happening in real time. To participate in a conversation, you just need to connect to the selected channel. Each channel has a name that more or less reflects the topic of conversation (for example, on the warez channel there is an exchange of pirated programs), and sometimes not.

    Hypertext system GOPHER

    One of the fairly well-known and widespread (though now outdated) Internet services is GOPHER. Although it is now practically not developing - or, in any case, developing much more slowly than other services of similar purpose - quite a large amount of information is still available through GOPHER, primarily for historical reasons: there was a period when GOPHER was the best means of public access to information. Modern tools for working with information on the Internet (for example, WWW browsers) also provide access to GOPHER servers, so special GOPHER client programs are rarely used today. As for using a GOPHER server to publish new information, the outdated GOPHER service is hardly advisable for that purpose.

    GOPHER is a distributed structured information export system. When working with GOPHER, you are in a system of nested menus from which files of various types are available - as a rule, simple texts, but it can be graphics, sound and any other types of files. Thus, files with information are exported to the public, but not in the form of a file system, as in FTP, but in the form of an annotated tree structure.
    GOPHER is a direct access service and requires that both the server and the client are fully connected to the Internet.
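    The underlying protocol is simple enough to sketch directly: the client connects to port 70, sends a selector string, and reads back a menu of tab-separated, typed items. The host name below is a placeholder for any still-working Gopher server:

        # Sketch of the Gopher request/response exchange over a raw socket.
        import socket

        def gopher_menu(host: str, selector: str = "", port: int = 70):
            with socket.create_connection((host, port), timeout=10) as sock:
                sock.sendall(selector.encode("ascii") + b"\r\n")
                data = b""
                while chunk := sock.recv(4096):
                    data += chunk
            # Each menu line: type character + display string, selector, host, port
            for line in data.decode("utf-8", errors="replace").splitlines():
                if line == ".":            # lone dot marks the end of the menu
                    break
                parts = line.split("\t")
                if len(parts) >= 4:
                    print(parts[0][:1], parts[0][1:], "->", parts[1])

        # gopher_menu("gopher.example.org")   # placeholder host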

    The main advantage of Gopher is that you don't have to remember the address or name of the resource and the sequence of commands required to access it: as you move through the program menu, you actually navigate to different computers connected to the Network. However, today Gopher appears to be on its last legs, because a new service has appeared that is much more convenient.

    Other online information resources

    While the Internet is undoubtedly the most powerful means of accessing online information, there are other sources, many of which predate the explosive growth in Internet use. These include computer bulletin boards and commercial information services.

    World Wide Web or WWW

    Today this is the most advanced and interesting resource - a hypertext navigation system on the Internet. WWW differs from ordinary hypertext mainly in that it allows you to establish links not only to a neighboring file, but also to a file located on a computer in another hemisphere of the Earth. No effort is required from you - the computer will establish the connection on its own.

    In WWW, as in Gopher, you can access resources by directly specifying their address. In doing so you encounter the abbreviation URL - Uniform Resource Locator, a universal way of designating an Internet resource. A URL consists of two parts: the first indicates the type of connection to be established with the source you need, the second names the required server. The connection types correspond to standard Internet services. Here are the main ones: http - HyperText Transfer Protocol, the basis of the WWW - the type of connection required when accessing any WWW server; ftp - used when accessing FTP servers; gopher - for interacting with Gopher; telnet - for terminal access to a remote machine; news - opens access to newsgroups.
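    As a small illustration of how the first part of a URL names the connection type, the sketch below maps the scheme of a few example addresses to the services just listed (the addresses themselves are arbitrary examples):

        # Sketch: the URL scheme identifies which Internet service to contact.
        from urllib.parse import urlsplit

        SERVICES = {
            "http": "WWW server",
            "ftp": "FTP archive",
            "gopher": "Gopher server",
            "telnet": "terminal access to a remote machine",
            "news": "Usenet newsgroups",
        }

        for url in ("http://info.cern.ch/", "ftp://ftp.example.org/pub/", "gopher://gopher.example.org/"):
            parts = urlsplit(url)
            print(parts.scheme, "->", SERVICES.get(parts.scheme, "unknown"), "at", parts.netloc)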

    Information on the WWW may include text, pictures, tables, sound, animation and much more. Thanks to its broad capabilities, attractiveness and ease of use, the World Wide Web has gained immense popularity all over the world.

    Information navigation on the InterNet: an introduction to the new generation of instrumental systems - the "navigators" of the WWW

    If you have ever had occasion to be amazed at the possibilities of the InterNet, then you already know about the phenomenal amount of information it holds - everything from databases, text files, documents and image, audio and video files to ready-made programs. With such a gigantic amount of data on the network, developed instrumental systems are needed to make information retrieval more efficient. Until 1991, only simple first- and second-generation tool systems and navigators were developed on the InterNet. But then a new stage of coordinated development began: it was in that year, at CERN in Geneva, that the systems now called the World Wide Web (WWW/3W, or simply the "Web"; in Russian, the "Cobweb") were developed.

    This system is an attempt to integrate various tool systems and data using a common data format based on the concept of hypertext. The result of these developments was extremely successful; in fact, they have now shaped the face of the InterNet. The Web is based on linking words and phrases in a document to related information in the same or another document. Since those other documents may reside on different servers, these links form a kind of "web" of mutual connections that permeates the Internet.

    But what can you do with this technology on your local network?
    Web technology can also be used if your network is not included in InterNet. All components of this technology are available for many platforms as FreeWare.

    A number of leading software firms are rapidly releasing the first commercial tool systems for this new Web networking technology. They are aimed at tighter integration of Web technology into the workstation environment and will be more convenient in terms of system maintenance and administration. To understand the essence of Web technology and the software used, it is worth recalling the basic navigation tools available to InterNet users. The simplest are FTP and Telnet. FTP is a program that uses the TCP/IP File Transfer Protocol to transfer files between computers. Telnet is a program for accessing a remote computer in local terminal emulation mode.

    Although these programs work flawlessly, they are "blind": they only handle data whose location is already known to you, and they are intended for only basic operations. In fact, they were the first generation of tools on the InterNet. The next generation of tools - the "navigators" - focused on the problem of finding the required information resources.

    Hyper-G hypertext system

    Search engine WAIS

    WAIS is another Internet service that is hardly used today, or at least practically no longer developed. WAIS stands for Wide Area Information Server; in practice it is a set of programs designed to index large volumes of unstructured, usually plain-text, information, to search through such documents and to retrieve them. There are programs for indexing and for local search using the resulting indexes, as well as server and client programs that communicate with each other using a special protocol, Z39.50. The task of searching through large volumes of unstructured information is highly non-trivial, and today there is no generally accepted solution to it. WAIS is in many cases an acceptable option for a search engine, and since it has a freely distributed software implementation, it gained considerable popularity as one of the Internet services. Today it is hardly used on its own, but in many cases it serves as an auxiliary tool - for example, for indexing documents stored on a WWW server. In some cases it is also used as a dictionary tool, or for searching Usenet news archives. If you are faced with the task of indexing large volumes of unstructured information, WAIS may be an adequate solution. However, keep in mind that the freely distributed implementation of the system is far from perfect, that the system is quite difficult to understand and learn, and, worst of all, that it is practically no longer developed. Several organizations were successively involved in supporting and developing the free version, but none of them brought the product to a state acceptable for real work.
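    WAIS itself is not shown here, but the core idea - building an index over unstructured text so that documents can then be retrieved by keyword - can be sketched in a few lines (a conceptual illustration only, not the actual WAIS/Z39.50 software):

        # Conceptual sketch of keyword indexing and retrieval over plain-text documents:
        # build an inverted index, then look up which documents mention a word.
        from collections import defaultdict

        documents = {
            "doc1.txt": "WAIS indexes large volumes of unstructured text",
            "doc2.txt": "Usenet news archives can also be indexed",
            "doc3.txt": "the index maps every word to the documents containing it",
        }

        index = defaultdict(set)
        for name, text in documents.items():
            for word in text.lower().split():
                index[word].add(name)

        def search(word: str):
            return sorted(index.get(word.lower(), set()))

        print(search("index"))    # ['doc3.txt']
        print(search("indexed"))  # ['doc2.txt'] - no stemming in this toy version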

    "Navigation" on the Internet

    With the vast information wealth that has sprung up in every "corner" of the InterNet, like mushrooms in a forest clearing, the main requirement for new tools became efficiency in finding the required network resources. Because of the wide variety of resources available on the InterNet, information retrieval is a complex problem, especially given the sheer scale of the network (for example, there are hundreds of complex databases, and tens of thousands of freely accessible archive servers). In fact, new sources appear on the Internet every hour, which makes exhaustive manual searching an almost hopeless task. Currently, the following "navigation" aids are available:

    the Archie system - a DBMS for searching for files in publicly accessible archives;

    the Wide Area Information Server (WAIS), which can be used to search a large number of databases and document archives.

    There are also Gopher-type systems with an interface of text menu screens whose entries link to information sources distributed across the network, thereby forming a "web" of connections - the so-called Gopher space. The Veronica system is designed to search for objects in this Gopher space. These navigational aids are still widely used today. But while they provide efficient navigation across the InterNet, they all deal with simple flat file formats and manipulate a limited set of data types.

    It became clear that more advanced systems with developed service functions and the ability to process more diverse data formats were needed. And one of the sources of various data was the Web network itself, which served as the basis for the next generation of InterNet tools - Web navigators.

    Web modification

    Web hypertext links act as pointers to other parts of the same document, to completely different documents, or to other services available over the network. Although this may sound abstract in theory, in practice it is very effective and convenient. If you have ever used the Help subsystem in Windows, you already have practical experience with hypertext: any word, phrase or icon in a Help document can be defined as an independent object, and when you select that object you jump to the corresponding part of the document. Web navigators extend this idea by providing links between documents located on different network nodes and access to a variety of services such as FTP and Gopher space. For example, a description of networks might contain a reference to Ethernet. If the word "Ethernet" carries a hypertext link, selecting it takes you to a description of that network technology. This description, in turn, may contain a link to a document about suppliers of Ethernet equipment, and selecting one of them brings up a document describing all the equipment that supplier offers. Moreover, that description may include a list of all drivers for the supplied Ethernet adapters.

    Oh! - and you discover that the latest version of the driver has appeared for precisely the card you have been using for a long time. And that version can be obtained simply by pressing the mouse button to activate the corresponding hypertext link.

    Documents on the Web may include, in addition to text, instructions about the fonts and formats used, links to graphics and photographs, links to other data, documents and services. All these documents are formed according to the rules of Hypertext Markup Language (HTML).

    HTML is based on an industry standard - the Standard Generalized Markup Language (SGML) - for creating machine-independent documents across the variety of computer platforms in use. However, HTML further extends it with the definition of a hypertext link. An HTML document consists of the text that is to be displayed and tags that define how that text is to be presented, how other types of data - for example, video and audio data - are to be retrieved and formatted, and where each hypertext link leads. HTML tags are fields in a document placed between the "<" and ">" characters; they contain a directive and associated parameter data - the directive's attributes. For example, in the field <H1>Hello HTML</H1>, the opening tag marks the beginning of a heading, the "H" directive specifies that the text following the tag is to be placed in that heading, and the attribute "1" indicates that the heading is at the first level. The closing tag </H1> marks the end of the field.
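    How such tags, directives and attributes look to a program can be sketched with Python's built-in HTML parser fed a tiny document (the link address below is an arbitrary example):

        # Sketch: walk a tiny HTML fragment and print its tags, attributes and text.
        from html.parser import HTMLParser

        class TagPrinter(HTMLParser):
            def handle_starttag(self, tag, attrs):
                print("start tag:", tag, "attributes:", attrs)
            def handle_endtag(self, tag):
                print("end tag:  ", tag)
            def handle_data(self, data):
                if data.strip():
                    print("text:     ", data.strip())

        TagPrinter().feed('<H1>Hello HTML</H1><A HREF="http://info.cern.ch/">a hypertext link (anchor)</A>')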

    There are special tags to describe hypertext links (so-called anchors), headings, images and other objects.
    Creating HTML documents is not for the faint of heart. There are currently no true HTML editors that work in "What You See Is What You Get" (WYSIWYG) mode, but there are already several options for producing HTML documents (see appendix N=1). There are HTML tools (though not yet WYSIWYG) such as a HyperCard-based editor for the Macintosh, or WinWord macros for Microsoft Word. Almost-WYSIWYG editors have also appeared: one for the NeXT Computer, Inc. platform, and one for Windows - HTML Assistant (currently a shareware alpha version).

    You can also use translators that convert standard Rich Text documents into HTML format (such a version exists for the Macintosh). So far, all these programs still require "manual" finishing of the resulting documents; on the other hand, simple documents can also be created entirely "by hand".

    What is a URL

    The basic concept in WWW navigators is the Uniform Resource Locator (URL). URLs are used to identify the location of resources referenced in documents. For example, the URL for the main index (home page) of the document set at the National Center for Supercomputing Applications (NCSA) is written as follows: http://www.ncsa.uiuc.edu/Genaral/NCSAHome.html
    The "http:" component defines the access method - via the HyperText Transfer Protocol (HTTP). This protocol was characterized by its creator, Tim Berners-Lee, as "... simple and fast enough for distributed and interoperable hypermedia information systems. It is a general object-oriented protocol that can be used for many similar problems, such as name servers and distributed object-oriented systems."

    The next part of the URL - "//www.ncsa.uiuc.edu" - indicates which node hosts the data. Finally, the "/Genaral/" component determines the directory in which the document file "NCSAHome.html" is located.
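    The same decomposition can be performed with Python's standard URL-handling module; the sketch below splits the NCSA address quoted above into the parts just described:

        # Sketch: split a URL into the access method, the node hosting the data,
        # and the path to the document file.
        from urllib.parse import urlsplit

        url = "http://www.ncsa.uiuc.edu/Genaral/NCSAHome.html"
        parts = urlsplit(url)
        print("access method:", parts.scheme)    # http
        print("node:         ", parts.netloc)    # www.ncsa.uiuc.edu
        print("document path:", parts.path)      # /Genaral/NCSAHome.html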
    To learn how to navigate the WWW, try starting with the home pages on the NCSA and CERN servers. If you are on a local network such as NetWare, you can navigate through all the information resources of that network through Novell's home page. News about the InterNet can be found on NCSA Mosaic's "What's New" page (see appendix N=1) and in the Global Network Navigator - an excellent resource developed by the staff of the publisher O'Reilly.

    "Navigation" on the Web

    Several programs have been developed for Internet navigation - for example, WWW and Lynx for text mode - but only the Mosaic system has won the appreciation of almost all InterNet users. It was developed at NCSA in 1993 and combines, within a single graphical interface, both several traditional instrumental subsystems of the InterNet and the capabilities of the new navigators. An alternative system, Cello, for the Windows environment was created at the Cornell University Law School. Mosaic and Cello provide an effective software environment in which any beginner can successfully navigate the information "treasures" of the global InterNet. Using these "shells", you do not have to waste time learning all the complex first- and second-generation tools. Thanks to their convenient, efficient interface and implementation on many industrial platforms, Mosaic and Cello have come to be widely used for information services within large companies.

    Conclusion

    Summing up the work done, I emphasize that Internet technologies for using information resources are moving forward by leaps and bounds, and this greatly facilitates searching for and collecting information on a required topic. At the same time, there are shortcomings that will, one hopes, be corrected over time. They include a certain clogging of the network with useless information, which most often interferes with finding the information one needs; the lack of a unified program systematizing information and access to it is also a significant obstacle. With the above in mind, let us try to look into the future of the Internet - a future that is already close but at the same time grows out of yesterday's network.

    Internet today and tomorrow

    1. Less than a year ago, the front page of DEC's AltaVista search server (http://www.altavista.digital.com - a very powerful and fast search engine) proudly announced: "We monitor changes on more than 70,000 servers around the world." As of September 1996, the figure was 275,600 and, of course, it keeps growing. According to some forecasts, if the number of Internet hosts continues to grow this quickly, then by the end of the first decade of the 21st century the existing address space will simply not be enough for everyone; the new Internet protocol standard is designed to solve this problem (and not only this one). A complete list of Web search and navigation systems alone would include links to more than 120 independent servers.

    2. Until recently, the telephone was the most popular and widely used means of communication. Now anyone connected to the Internet can talk with an interlocutor over this network; all that is needed is a sound card, a microphone and speakers. Moreover, the conversation costs much less than a call over a regular telephone. For now, this is only possible between network subscribers, but hardware and software for a gateway between the Internet and the ordinary telephone network are being developed very actively (interesting materials on this topic are available on the server of VocalTec, the maker of the iPhone (Internet Phone) program, the de facto standard for live communication on the Internet; document address - http://www.vokaltec.com/gateway.htm). So far, the quality of the conversation depends strongly on the speed of the Internet connection - the minimum data transfer speed is 14,400 bps - and work is under way to reduce the required speed to 2,400 bps.

    3. Video conferencing systems exist and work. The best-known freely distributed product of this kind is the CU-SeeMe program. Server address - http://137.142.42.95/CuSeeME.html. The minimum connection speed is 14,400 bps. This "toy" is becoming a fairly effective way to hold meetings within a large enterprise.

    4. Almost every more or less large company creates its own corporate network for its own needs. Literally over the past year, another offspring of the Internet has become extremely popular: the Intranet, a corporate network built on Internet technology. Vendors are stepping on each other's heels, trying to occupy yet another niche of this market; in particular, the emphasis is on HTML layout systems and Web-site building tools aimed at a completely untrained user.

    5. In the field of audio technology, another quite interesting "toy" is RealAudio (http://www.realaudio.com). This system allows live sound to be transmitted over the network and can be used for broadcasting.

    6. Of particular interest are attempts to create "electronic money" (for example, http://www.cybercash.com) and, more generally, systems of reliable and secure payments (one of the pioneers in this area is http://www.firstvirtual.com). At the moment, this problem can be considered solved: already now, over the Internet with a credit card, you can book a plane ticket, reserve a hotel room, or simply buy a CD-ROM with a game you like.

    7. Another hit of this season is interactive technology. The two undisputed leaders in this direction are the platform-independent Java environment (http://java.sun.com) from Sun Microsystems and the continuation of OLE technology, ActiveX (http://www.activex.com), the brainchild of Microsoft. At first glance both technologies are aimed at the same thing, but in fact they complement each other perfectly. And although for now it is perhaps possible to evaluate only the first steps in their development and application, the results are already impressive.

    All of this is already available, and much more that is new and interesting has been left outside the scope of this overview.
