Types of DVI. Differences between DVI-I, DVI-D and DVI-A. Difference between VGA, DVI and HDMI video ports

    The DVI (Digital Visual Interface) connector is used to transmit video signals in digital form. It was created when video media went digital (DVDs) and video had to be transferred from a computer to a monitor. The analog transmission methods of the time could not achieve high picture quality, because a high-resolution analog signal could not be carried over any real distance without degradation.

    Distortion can always occur in a communication channel, and it is especially noticeable at high frequencies - and HD quality implies exactly such high-frequency components in the signal spectrum. To avoid this distortion, the industry moved to digital signals, abandoning analog processing and transmission of video on the way from the source to the display device. In the late 1990s several companies joined forces to create a digital interface for video data, eliminating DAC (digital-to-analog) and ADC (analog-to-digital) converters from the path. The result of their work was the DVI video transmission format.

    Appearance of the DVI connector:


    View of the DVI connector from inside:


    Basic parameters of the DVI interface

    This type of connection transmits the main components of the RGB signal (red, green, blue). Each component travels over its own twisted pair inside the DVI cable, and one more twisted pair carries the clock signal, so a single-link DVI cable contains four twisted pairs. Twisted pairs enable differential data transmission: interference induced on the cable appears equally on both conductors and cancels out when the receiver takes their difference - though these are technical details you do not strictly need to know. Each color component is allocated 8 bits, so 24 bits of information are transmitted per pixel. The maximum data rate reaches 4.95 Gbit/s, enough to carry a signal of about 2.6 megapixels at a frame rate of 60 Hz. An HDTV signal with a resolution of 1920x1080 is slightly more than 2 megapixels, so a 1920x1080 signal at 60 Hz fits through a DVI connector. The main restriction is cable length: it is generally accepted that a high-resolution signal survives a cable of up to 5 meters, beyond which visible distortion may appear. At lower resolutions a longer DVI cable is acceptable, and intermediate amplifiers can be used if an even longer run is needed.
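    As a quick sanity check of these figures, here is a minimal back-of-the-envelope sketch, assuming the standard single-link parameters (a 165 MHz TMDS clock, three data pairs, 10 transmitted bits per 8-bit byte):

```python
# A back-of-the-envelope check of the single-link DVI figures quoted above.
# Assumes the standard single-link parameters: a 165 MHz TMDS clock, three
# data pairs (one per color), and 10 transmitted bits per 8-bit byte.

TMDS_CLOCK_HZ = 165e6    # single-link pixel clock ceiling
DATA_CHANNELS = 3        # one twisted pair per RGB component
BITS_PER_SYMBOL = 10     # TMDS expands every 8-bit byte to 10 bits

raw_rate = TMDS_CLOCK_HZ * DATA_CHANNELS * BITS_PER_SYMBOL
print(f"Raw link rate: {raw_rate / 1e9:.2f} Gbit/s")        # ~4.95 Gbit/s

# How many pixels fit into one 60 Hz frame (blanking intervals included):
pixels_per_frame = TMDS_CLOCK_HZ / 60
print(f"Pixels per 60 Hz frame: {pixels_per_frame / 1e6:.2f} Mpx; "
      f"a 1920x1080 frame has {1920 * 1080 / 1e6:.2f} Mpx visible")
```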

    For greater compatibility, the DVI connector was made to support an analog signal. This is how three types of DVI connectors appeared:

    1) DVI-D transmits only a digital signal;
    2) DVI-A transmits only an analog signal;
    3) DVI-I transmits both digital and analog signals.

    The connector body is the same for all three types; the difference lies in which contacts are actually present and wired, so the variants are largely compatible with one another.

    There are also two data transfer modes: single link and dual link. Their main difference is the supported frequencies: in single-link mode the maximum clock is 165 MHz, while in dual-link mode the limit is set only by the physical characteristics of the cable. This means DVI dual-link cables can carry higher resolutions and work over longer distances. So if a single-link cable produces interference on an LCD TV in the form of colored dots, it is worth trying a dual-link cable instead. Structurally, a dual-link DVI cable is distinguished by a second set of twisted pairs for the color components.

    Features of the DVI connector

    To achieve such speeds, a special coding method called TMDS (transition-minimized differential signaling) is used. In any DVI connection, a TMDS transmitter encodes the signal on the sending side, and the RGB signal is restored on the receiving side.
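    For the curious, here is a hedged sketch of the first TMDS stage only: the transition-minimizing 8-to-9-bit step. Real TMDS adds a second, DC-balancing stage that appends a tenth bit and tracks the link's running disparity; that part is omitted here for brevity.

```python
def tmds_stage1(byte: int) -> list[int]:
    """Transition-minimizing stage of TMDS: 8 data bits -> 9 code bits.

    Simplified sketch: real TMDS adds a second, DC-balancing stage that
    appends a tenth bit and tracks the running disparity of the link.
    """
    bits = [(byte >> i) & 1 for i in range(8)]   # LSB first, as in the spec
    ones = sum(bits)
    use_xnor = ones > 4 or (ones == 4 and bits[0] == 0)

    q = [bits[0]]
    for i in range(1, 8):
        mixed = q[i - 1] ^ bits[i]
        q.append(1 - mixed if use_xnor else mixed)   # XNOR or XOR chaining
    q.append(0 if use_xnor else 1)   # 9th bit tells the decoder which rule
    return q

def transitions(bits):
    return sum(a != b for a, b in zip(bits, bits[1:]))

raw = [(0b10101010 >> i) & 1 for i in range(8)]
print(transitions(raw), "->", transitions(tmds_stage1(0b10101010)))  # 7 -> 4
```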

    The DVI interface can additionally use a DDC channel (Display Data Channel), which supplies the source with the display's EDID information. EDID contains details about the display device: brand, model number, serial number, release date, supported resolutions and screen size. Based on this information, the source produces a signal with the required resolution and aspect ratio. If the display does not provide this information, the source may refuse to enable the TMDS channel.
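    To make this concrete, here is a minimal sketch of pulling those very fields out of a raw EDID block. The field offsets follow the VESA EDID 1.x base-block layout; the /sys path in the usage example is a Linux convention, and the exact connector name (card0-HDMI-A-1) is a hypothetical example that varies per machine.

```python
def parse_edid(edid: bytes) -> dict:
    """Pull the identification fields mentioned above out of a raw EDID."""
    assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID block"
    assert sum(edid[:128]) % 256 == 0, "base-block checksum failed"

    # Manufacturer ID: three 5-bit letters ('A' = 1) packed into bytes 8-9.
    mid = (edid[8] << 8) | edid[9]
    vendor = "".join(chr(((mid >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))

    return {
        "vendor": vendor,
        "product_code": int.from_bytes(edid[10:12], "little"),
        "serial": int.from_bytes(edid[12:16], "little"),
        "week": edid[16],
        "year": 1990 + edid[17],             # byte 17 is an offset from 1990
        "size_cm": (edid[21], edid[22]),     # horizontal x vertical
    }

# Example usage on Linux, where the kernel exposes each connector's EDID
# (the connector name below is hypothetical and differs per machine):
with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
    print(parse_edid(f.read()))
```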

    Just like HDMI, the DVI interface supports the HDCP content protection system. It is called "intelligent" protection because of the way it is implemented and its ability to apply different levels of protection in different situations, so it does not block legitimate data exchange (for example, permitted copying). It works on the principle of exchanging keys with every device connected via DVI.

    Only the image is transmitted through the DVI connector; sound has to be carried separately. Some video cards can pass audio over a DVI cable, but this requires special adapters and explicit support in the video card itself, and at that point it is no longer a pure DVI interface. With a normal connection, audio must be transmitted over an additional channel.

    Monitors are connected to other devices using a variety of interfaces, of which there are currently plenty. Depending on the technology, connections come in two types: analog and digital. The latter are represented by two main interfaces, DVI-D and HDMI. Which is better, and what exactly is the difference between these technologies? Which connector should you choose? Below we look in more detail at whether and why HDMI is better than DVI.

    Requirements of modern technology

    To determine which connector is better - DVI or HDMI - it is worth explaining why only these two are considered, since other interfaces exist, such as DisplayPort and VGA. Firstly, DisplayPort is used primarily to connect monitors to computers, including multiple screens at once, but it is far less common on TVs and home video equipment. VGA, meanwhile, is considered an outdated solution, and many leading companies have abandoned it in their products.

    Analog transmission in general is fading into the background because it readily exposes image imperfections. The signal is first converted to analog and then back again, and these unnecessary conversions produce visible noise: ghosting and doubled copies of objects, buttons or text become a common problem. These shortcomings were especially noticeable on the first LCD monitors, which supported only a VGA connection.

    As for connectors, most manufacturers now place ports for both digital cables on the rear panel of their equipment. As a rule, though, this only complicates the choice of which cable is better: HDMI or DVI.

    Common features of interfaces

    Both HDMI and DVI transmit video using the same technology, called TMDS. The information is encoded so as to minimize the number of transitions in the bit stream, which lets the link run at high clock frequencies and, as a result, deliver a higher-quality image.

    In addition, a single cable is used with either port (even though HDMI is a single-channel solution and DVI a multi-channel one, as discussed below), and special adapters make the two interchangeable.

    Distinctive Features

    Many personal computer users make a choice in favor of HDMI. Why is this so: what makes HDMI better, and what can't DVI boast? There are two main differences:

    1. An HDMI cable transmits not only video but also audio, which ensures high quality of both picture and sound. Most DVI configurations lack this feature, although there are exceptions to the rule.
    2. HDMI is a single-channel cord, yet its data transfer speed reaches several gigabits per second. The competing interface, by contrast, offers several channels, one of which carries an analog signal. For devices that still run on such a signal, a DVI cable is a godsend: the standard keeps up with the times without ignoring owners of equipment that cannot boast innovative "filling".

    To decide whether HDMI is better than DVI, understand that image quality largely depends on the device being connected. The picture on the screen is determined by the signal the source produces; the cable's job is the stability and fidelity of its delivery.

    But you should not plug one cord in place of another without the proper adapter; otherwise you may simply lose the sound. Although both interfaces are built on the same technology, the differences make themselves felt.

    High-Definition Multimedia Interface

    What makes HDMI stand out is that the High-Definition Multimedia Interface is used by a great many equipment manufacturers. It is so widespread that HDMI can connect not only TVs and monitors to computers, but also laptops, tablets, smartphones, game consoles and players.

    Although the cable carries everything over a single channel, it has enough bandwidth to tie a whole system of multimedia devices together, which is exactly what some setups require.

    New versions of the cable are backward-compatible and easily replace previous models. HDMI has generous bandwidth, which matters to gamers and to any user who values high speeds and improved sound. At the same time, HDMI is purely digital: older analog equipment cannot be connected with it.

    Digital Visual Interface

    The DVI interface has three varieties supporting different modes: digital, analog, and combined digital/analog. This cable can transmit a high-resolution picture over a distance of no more than five meters. Signal transmission works in two modes: single link and dual link. The latter operates at higher frequencies, so if the picture is of poor quality in single-link mode, dual link will usually correct the situation.

    Competitive advantage

    HDMI is in any case the more modern interface. Its developers follow trends and keep up with the times, and most new equipment, now and in the near future, will use this type of cable.

    HDMI or DVI for a computer - which should you choose? Either interface will work; the final choice depends on the purpose. If high-quality sound matters, use HDMI; if not, DVI is also fine for the connection. DVI is particularly good because its developers, while advancing the technology, have not forgotten those with older devices. Plenty of users still own computers and televisions that are no longer "in trend", and in that case DVI is the better pick.

    VGA, DVI and HDMI are video interfaces for transmitting a video signal from a source to an image output device. They differ in the method of signal transmission and processing, as well as in the connector.

    VGA was developed in 1987 and was intended for transmitting an analog signal to cathode-ray-tube monitors. Ten years later, LCD monitors began to take over the market. With VGA, the digital signal was converted to analog, transmitted, and displayed directly on a CRT monitor. With LCD monitors the chain got longer: the signal had to be converted from digital to analog, transmitted to the monitor, and converted back to digital. It became obvious that the analog leg could be cut out of the chain, and in 1999 the DVI video interface appeared.

    HDMI was developed in the early 2000s. It differed from DVI in a more compact connector and in the ability to carry digital audio (since 2008, DVI implementations have also learned to carry sound). These advantages took their toll, and today HDMI is the dominant interface. Its popularity led to the introduction of miniHDMI and microHDMI, which differ only in connector size.

    How is the image via DVI and HDMI better than VGA?

    The main argument for digital interfaces is that an analog signal is exposed to external electromagnetic fields during transmission, which distorts it. There is some truth to this, but in a home environment there is rarely interference serious enough to cause noticeable distortion even over long runs. It is also said that DVI and HDMI deliver the signal as accurately as possible thanks to error correction that VGA lacks. True, but this is an advantage only with a high-quality cable of short length (up to 5 meters).

    Another argument for digital video interfaces is the absence of unnecessary conversions between digital and analog. It would seem HDMI and DVI should beat VGA here, but in practice it sometimes turns out the other way around, since conversions cannot be avoided entirely: digital signals are encoded and must be decoded and processed before being displayed, and the modules responsible for this in display devices do not always have ideal algorithms. They have improved over time, though, and today they are at a good level even in cheap monitors and televisions.

    Cable quality is another stumbling block. An analog signal is less demanding of it, while a digital signal needs a good conductor, especially at lengths over five meters. When bits are lost, error correction does not always cope, and the output image can end up far worse than over a VGA connection.

    Let's sum it up

    Despite the fact that I have downplayed the merits of DVI/HDMI, in some cases the image they deliver really is better. But you will only notice it with a high-quality cable, a reliable connection at the connectors, and a good output device: a monitor or high-definition TV. If a monitor produces a good picture via VGA, do not expect the image to sparkle with new colors over a digital interface. In my practice I saw a significant improvement only once, with AOC monitors, which worked disgracefully over VGA: the image was fuzzy and blurry. That was purely the manufacturer's fault.

    Today you can send a video image to a monitor or TV in many different ways - there are more connection ports every year, and it is easy to get confused by the number of interfaces and the differences between them.

    Let's look at the most popular formats and determine cases when one or another video port standard is best suited.

    VGA

    The oldest standard for pairing a PC and a monitor that still survives today. Developed back in 1987 by IBM, the interface transmits color information as an analog signal. Unlike more modern standards, VGA carries no sound: only the picture.

    The VGA connector is usually blue, with two screws on the sides. It has 15 pins and initially supported only a resolution of 640 by 480 pixels with a palette of 16 colors. The standard later evolved into so-called Super VGA, supporting higher screen resolutions and up to 16 million colors. Since the improved standard kept the old port and looked the same, it is still simply called VGA in the old fashion.

    This format is most often found on older hardware, but many computers still have the port. As they say: just in case.

    DVI

    More than ten years after the release of the VGA standard, the DVI format - a digital video interface - saw the light of day. Released in 1999, the interface can transmit uncompressed video in one of three variants: DVI-I (Integrated), a combined digital and analog format; DVI-D (Digital), digital only; and DVI-A (Analog), analog only.

    DVI-I and DVI-D ports can operate in single-link or dual-link mode. In the second case the bandwidth is doubled, which allows high-definition screen resolutions of up to 2048 by 1536 pixels; an appropriate video card is required, however. The ports themselves differ in the number of contacts: single-link mode uses four twisted pairs of wires (maximum resolution 1920 by 1200 pixels at 60 Hz), while dual-link mode uses a correspondingly larger number of contacts and wires (resolutions up to 2560 by 1600 at 60 Hz).
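    A rough pixel-clock estimate makes these limits concrete. The sketch below assumes a blanking overhead of about 12%, which roughly matches CVT reduced-blanking timings; exact figures depend on the timing standard used:

```python
# Rough pixel-clock estimate showing why 1920x1200@60 fits single-link DVI
# while 2560x1600@60 needs dual link. The 12% blanking overhead is an
# assumption that roughly matches CVT reduced-blanking timings.

SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ

def needed_clock_mhz(w, h, hz, blanking=1.12):
    return w * h * hz * blanking / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clk = needed_clock_mhz(w, h, 60)
    link = ("single link" if clk <= SINGLE_LINK_MHZ else
            "dual link" if clk <= DUAL_LINK_MHZ else "beyond DVI")
    print(f"{w}x{h}@60 -> ~{clk:.0f} MHz: {link}")
```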

    It is important to remember that analog DVI-A cannot drive DVI-D monitors, while a video card with a DVI-I output can be connected to a DVI-D monitor using a cable with two DVI-D male connectors. Like VGA, this standard carries only the video image, without sound. However, since 2008 video card manufacturers have made audio transmission possible; for this you need a DVI-D-to-HDMI cable.

    You can also find the mini-DVI format on the market, invented by Apple, which likes to make everything smaller. The mini standard works only in single-link mode, which means it does not support resolutions above 1920 by 1200 pixels.

    HDMI

    The High-Definition Multimedia Interface (HDMI) transmits digital video and audio signals, complete with copy protection. HDMI is smaller than its predecessors, operates at higher speeds and, most importantly, carries sound, which made it possible to retire the older SCART and RCA ("tulip") standards for connecting video devices to TVs.

    The HDMI 1.0 specification appeared at the end of 2002 with a maximum bandwidth of 4.9 Gbit/s, support for 8-channel audio and video up to 165 Mpix/s (that is, Full HD at 60 Hz). The standard has evolved constantly since then, and in 2013 the HDMI 2.0 specification was released with bandwidth up to 18 Gbit/s, support for 4K resolution (3840 by 2160 pixels at 60 Hz) and 32-channel audio.
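    These numbers are easy to verify with a couple of lines of arithmetic. The sketch below assumes the usual TMDS layout (three data channels, 10 bits on the wire per 8-bit byte) and the commonly used CTA-861 4K60 timing of 4400 x 2250 total pixels per frame:

```python
# Sanity check of the HDMI figures above. Assumes the usual TMDS layout
# (three data channels, 10 bits on the wire per 8-bit byte) and the
# standard CTA-861 4K60 timing of 4400 x 2250 total pixels per frame.

def tmds_raw_gbps(clock_mhz):
    return clock_mhz * 3 * 10 / 1000

print(f"HDMI 1.0 (165 MHz TMDS clock): {tmds_raw_gbps(165):.2f} Gbit/s")  # ~4.95
print(f"HDMI 2.0 (600 MHz TMDS clock): {tmds_raw_gbps(600):.2f} Gbit/s")  # 18.00

needed_mhz = 4400 * 2250 * 60 / 1e6   # pixel clock required for 4K60
print(f"3840x2160@60 needs ~{needed_mhz:.0f} MHz, "
      f"{'within' if needed_mhz <= 600 else 'beyond'} the HDMI 2.0 limit")
```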

    Today, the HDMI standard is used not only by computers, but also by digital TVs, DVD and Blu-ray players, game consoles and many other devices. If desired, you can use adapters from HDMI to DVI and vice versa.

    The number of pins on HDMI ports starts at 19, and the connectors come in several form factors, the most common being HDMI (Type A), mini-HDMI (Type C) and micro-HDMI (Type D). There are also HDMI ports for signal reception (HDMI-In) and transmission (HDMI-Out). Outwardly they are practically indistinguishable, but if, say, your all-in-one PC has both, then to send a picture to a second monitor you can use only one of them, namely HDMI-Out.

    DisplayPort

    In 2006, another video standard for digital monitors was adopted. DisplayPort, like HDMI, transmits audio as well as video, and is used to connect a computer to a display or home theater. DisplayPort offers a higher data transfer rate, support for resolutions up to 8K (7680 by 4320 pixels at 60 Hz) in version 1.4, released in March 2016, and output to multiple monitors (from two to four, depending on the resolution).
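    The 8K figure deserves a caveat that a short calculation makes obvious: the raw payload of a DP 1.4 link is smaller than an uncompressed 8K60 stream, which is why version 1.4 pairs its 8K support with Display Stream Compression (DSC). A sketch, assuming the published HBR3 line rates:

```python
# Why 8K over DisplayPort 1.4 leans on compression: the uncompressed
# payload of an 8K60 stream exceeds what four HBR3 lanes can carry.
# Line-rate figures are the published DP 1.4 numbers; blanking is ignored,
# so the real demand is even higher.

LANES, HBR3_GBPS, CODING = 4, 8.1, 8 / 10    # 8b/10b coding efficiency
payload_capacity = LANES * HBR3_GBPS * CODING        # ~25.9 Gbit/s

bits_8k60 = 7680 * 4320 * 60 * 24 / 1e9              # ~47.8 Gbit/s at 24 bpp
print(f"DP 1.4 payload capacity: {payload_capacity:.1f} Gbit/s")
print(f"Uncompressed 8K60 needs: {bits_8k60:.1f} Gbit/s")
print("-> 8K60 over DP 1.4 relies on Display Stream Compression (DSC)")
```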

    DisplayPort was designed specifically for outputting images from computers to monitors, while HDMI was aimed more at connecting various devices to a TV. The two can nevertheless be used together via a Dual-Mode DisplayPort adapter.

    There are also Mini DisplayPort variants, used primarily in laptops; the smaller format is particularly loved by Apple.

    Thunderbolt

    Finally, a standard from Intel (developed in collaboration with Apple) for connecting peripheral devices to a computer. Apple was the first to release a device with this interface, the MacBook Pro laptop, in 2011.

    The maximum data transfer speed is 20 Gbit/s for version 2 (with optical fiber as an option), while version 3 of the interface operates at up to 40 Gbit/s. Thunderbolt combines not only the DisplayPort interface but also PCI Express, which means almost anything can be connected to it. In particular, up to six devices can be daisy-chained to one port, reducing the need for a huge number of different ports on a device.

    The Thunderbolt connector itself shares the mini-DisplayPort form factor, and its third version is compatible with USB 3.1, that is, it uses a USB Type-C connector.

    Universal USB

    If you are suddenly worried that you will soon have to replace all your home appliances because of changing standards, do not rush. Manufacturers are striving to simplify the story with numerous interfaces and to support older devices through adapters. In particular, HDMI devices only need an appropriate adapter to connect to a modern USB Type-C port.

    Just as each mobile phone manufacturer once had its own charging connector and most now use a micro-USB port, video standards are also striving for unification. The unifying form factor should be the latest-generation USB port, through which monitors, ordinary headphones and headsets alike will be connected.

    For 10 years now, computers and laptops have been equipped not with one but with two or three types of connectors at once. The ports differ in both size and appearance. Which type of monitor connection should you prefer? This article also discusses the practical usefulness of connecting two or even three monitors at the same time.

    Common but old types of connectors

    VGA (Video Graphics Array): an outdated classic

    The blue trapezoidal connector dominated the computer field for 25-30 years. It worked great with old CRT displays thanks to their analog nature. But then flat LCD screens - digital devices - appeared, resolutions began to climb, and the good old VGA started to lose ground.

    Today it is built into video cards less and less often, but many devices (household players, projectors, TVs) still ship with support for the hopelessly outdated VGA. For several more years the "old man" will probably remain a not especially desirable but widespread de facto standard: if you have any doubt about which cable will connect the monitor in the next office, take VGA.

    DVI-I (Digital Visual Interface): another long-lived video interface

    Strictly speaking, there are several of them: DVI-A, -D and -I, plus their varieties. But when people talk about the common DVI standard, they usually mean the combined analog/digital DVI-I Dual Link - that is the specification built into most PCs.

    In its time, DVI replaced VGA, which was rapidly becoming obsolete in the mid-2000s. The ability to transmit both analog and digital signals, support for large (for that era) resolutions and high frequencies, and the absence of inexpensive competitors mean DVI still serves as a standard today. But its active life is unlikely to continue for more than another 3-4 years.

    Resolutions above the minimally comfortable Full HD are increasingly common even in inexpensive computer systems, and with the growth in megapixels the once-serious capabilities of DVI are running out. Without going into technical detail: the peak capabilities of DVI will not display an image above 2560 x 1600 at an acceptable refresh rate (60 Hz).

    Modern video interfaces

    HDMI (High Definition Multimedia Interface) – the king of multimedia

    The abbreviation HDMI, once awkward for Russian ears to pronounce, is entering our lives more and more deeply. Why has HDMI become so popular? It's simple:

    • arbitrarily long wires (okay, to be honest - up to 25-30 meters);
    • transmission of sound (even multi-channel!) along with video - goodbye to the need to buy separate speakers for TV;
    • convenient small connectors;
    • support everywhere - players, “zombie boxes”, projectors, video recorders, game consoles - it’s hard to immediately think of equipment that doesn’t have an HDMI connector;
    • ultra-high resolutions;
    • 3D picture. And yes, it works together with ultra-high resolutions (HDMI versions 1.4b and 2.0).

    HDMI's prospects are the brightest: development continues, and in 2013 the version 2.0 specifications were adopted. The standard remains compatible with old connectors and wires while supporting ever more impressive resolutions and other "tasty" features.

    DisplayPort (DP): A Connector That's Just Becoming Ubiquitous

    And DisplayPort is stunningly beautiful in appearance...

    For many years, computers were only rarely equipped with this direct competitor to HDMI - even though DisplayPort had everything going for it: support for very high resolutions along with stereo (3D) signals, audio transmission, and impressive cable lengths. For manufacturers it is even more profitable than licensed HDMI: there is no need to pay the standard's developers the 15-25 cents per device that HDMI's owners are entitled to.

    The DP connector simply had bad luck in its early years. Today, however, computers are increasingly equipped with a pair of DisplayPorts of the modern version 1.4 standard. And on its basis another popular standard with enormous prospects was born: the "little brother" of DisplayPort...

    Mini DP (Mini DisplayPort)

    Together with HDMI and the thoroughly outdated VGA, the Mini DisplayPort connector is built into almost every computer and laptop. It has all the advantages of its "big brother" plus miniature size: an ideal solution for ever-thinner laptops and ultrabooks, and even smartphones and tablets.

    Transmitting an audio signal so you don't have to buy separate speakers for the monitor? Please: how many channels do you need? Stereoscopy, even in 4K? Yes, even if the interface has to flex all its electronic muscles. Compatibility? The market offers a wide variety of adapters for almost any other connector. The future? The Mini DP standard is alive and well.

    Thunderbolt: exotic monitor connection options

    There are other options, too. For several years now, Apple, together with Intel's developers, has been promoting the fast, universal, but insanely expensive Thunderbolt interface.

    Why do monitors need Thunderbolt at all? The question has gone years without a clear answer.

    In practice, monitors that support it are not common, and there are serious doubts that Thunderbolt is justified for video signal transmission. Perhaps it is just the fashion for everything "Apple"...

    Unfortunately, beyond the scope of this article remains a most interesting possibility: connecting screens to a computer (and even powering them!) over the USB 3.0 interface (or, even more interesting, 3.1). The technology has many prospects and real advantages. However, that is a topic for a separate review - and for the near future!

    How to connect a new monitor to an old computer?

    An "old computer" usually means a PC with a single port - VGA or DVI. If a new monitor (or TV) flatly refuses to be friends with such a port, you should buy a relatively inexpensive adapter: VGA to HDMI, Mini DP to DVI, and so on - there are many options.

    When using adapters some inconveniences are possible (for example, VGA cannot carry sound or especially high resolutions), but such a scheme will work properly and reliably.

    Wireless video signal (WiDi)!

    Such interfaces exist, and more than one. Intel Wireless Display (aka WiDi, pronounced "wi-dai", however strange that may sound to a Russian-speaking reader): an adapter costing about $30 connects to the USB connector of a TV or monitor (if the manufacturer supports the technology).

    The signal travels over Wi-Fi, and the video image appears on the screen. That is the theory; in practice, distance and walls between receiver and transmitter are significant obstacles. The technology is interesting and has prospects, but for now nothing more.

    Another wireless video interface is AirPlay from Apple. The essence and practical application are the same as Intel's WiDi: a little expensive, not very reliable, far from practical.

    A more interesting but still rarely used solution is the Wireless Home Digital Interface (WHDI). It is not exactly Wi-Fi, although the wireless technology is very similar. Its key feature is a proprietary method of protection against interference, delay and distortion.

    Connecting multiple monitors at the same time

    Even a novice user can cope with attaching a main or an additional screen: connecting a monitor to a PC or laptop is no harder than plugging in a flash drive, and it can only be done the correct way - a plug simply will not fit into a port that is not intended for it.

    An excellent feature of modern video cards and operating systems is the ability to connect several monitors to one signal source (PC or laptop). The practical benefits are enormous, and they come in two different flavors.

    1. Image clone mode

    The main computer screen operates as usual, while the image is fully duplicated on a large-diagonal TV and/or a projector; you only need to run a video cable to each of them. Sound travels along with the picture if you use modern connectors (HDMI, Mini DP).

    2. Multi-screen mode

    Monitor resolutions keep growing, but there will always be tasks for which you would like an even wider screen: calculations in a large Excel spreadsheet, working with a couple of browsers at once, design work, video editing. Even typing is more comfortable with an additional display next to the main one. The bezels between the screens interfere no more than the frames of eyeglasses: after a few minutes you simply stop noticing them. Gamers also like to use several monitors at once, since immersion in the gameplay is far more exciting this way. Incidentally, some AMD video cards support up to 6 monitors simultaneously (Eyefinity technology made a lot of noise in the IT community five years ago).

    Picture: this is how you can call up the settings for connecting a second or third monitor: click on “Graphics Settings” from Intel or Nvidia.

    How do you connect a second monitor to a computer? Plug in the cable, and most likely the second screen will pick up the image instantly. If it does not, or if additional settings or another mode are required, it takes a minute's work in the video card's graphics driver. To reach that program, right-click the Intel, Nvidia or AMD video driver icon - whichever video adapter is installed in the PC - and select "Settings". The video adapter icon is always present in the Control Panel and, in almost all cases, in the Windows tray around the clock.
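    If you want to check programmatically what the system has detected, here is a small Windows-only sketch using the standard Win32 EnumDisplayMonitors call via Python's ctypes; on Linux or macOS other APIs (or tools such as xrandr) would apply:

```python
# Windows-only sketch: list the monitors the system currently sees, which
# is handy for checking whether a second screen was detected at all.
# Uses the standard Win32 EnumDisplayMonitors API via ctypes.

import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

MonitorEnumProc = ctypes.WINFUNCTYPE(
    ctypes.c_int,                      # return: nonzero to keep enumerating
    ctypes.c_void_p,                   # HMONITOR
    ctypes.c_void_p,                   # HDC
    ctypes.POINTER(wintypes.RECT),     # monitor rectangle on the virtual desktop
    ctypes.c_void_p,                   # LPARAM (unused here)
)

monitors = []

def _collect(hmonitor, hdc, rect, lparam):
    r = rect.contents
    monitors.append((r.right - r.left, r.bottom - r.top, r.left, r.top))
    return 1                           # continue enumeration

user32.EnumDisplayMonitors(None, None, MonitorEnumProc(_collect), None)

for i, (w, h, x, y) in enumerate(monitors, start=1):
    print(f"Monitor {i}: {w}x{h} at virtual-desktop position ({x}, {y})")
```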