Difference between DVI-I and DVI-D. Image difference when connected via VGA, DVI and HDMI

The standard provides simultaneous transmission of visual and audio information over a single cable; it is designed for television and cinema, but PC users can also use it to output video data using an HDMI connector.


HDMI is the latest attempt to standardize a universal connection for digital audio and video applications. It immediately received strong support from the giants of the electronics industry (the group of companies developing the standard includes Sony, Toshiba, Hitachi, Panasonic, Thomson, Philips and Silicon Image), and most modern high-resolution output devices have at least one such connector. HDMI carries copy-protected audio and video in digital form over a single cable; the first version of the standard provided a bandwidth of about 5 Gbit/s, and HDMI 1.3 raised this limit to 10.2 Gbit/s.

HDMI 1.3 is the latest revision of the specification. It increases the interface bandwidth by raising the clock frequency to 340 MHz, which allows high-resolution displays with greater color depth (formats up to 48-bit) to be connected. The new revision also adds support for the new Dolby standards for transmitting compressed audio without loss of quality. Among the other innovations, specification 1.3 describes a new connector that is smaller than the original.
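The bandwidth figures above follow directly from the TMDS link parameters: clock frequency times 10 bits per clock times three data channels. A quick sanity check (the helper function name is ours, purely illustrative):

```python
# Raw TMDS link capacity: clock (MHz) x 10 bits per clock x 3 data channels.
# (Illustrative helper, not part of any HDMI API.)
def tmds_raw_gbit_s(clock_mhz, channels=3, bits_per_clock=10):
    return clock_mhz * 1e6 * bits_per_clock * channels / 1e9

print(tmds_raw_gbit_s(165))   # early HDMI link: about 4.95 Gbit/s
print(tmds_raw_gbit_s(340))   # HDMI 1.3: about 10.2 Gbit/s
```

This is why the jump from a 165 MHz to a 340 MHz clock roughly doubles the quoted bandwidth.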

In principle, an HDMI connector on a video card is entirely optional: a simple DVI-to-HDMI adapter can replace it, which is why such adapters are included with most modern video cards. Moreover, the HDMI connector is in demand primarily on mid-range and low-end cards, which are installed in small, quiet barebone systems used as media centers. Thanks to their built-in audio, the Radeon HD 2400 and HD 2600 graphics cards have a clear advantage for builders of such multimedia centers.

Based on materials from the iXBT.com website.

If your monitor connects to your computer via DisplayPort or DVI, it still meets modern standards. But what exactly is the difference between DisplayPort and DVI? We will now explain.

Signal type

Both technologies allow digital signals to be transmitted from the computer to the screen. This results in a significant improvement in image quality compared to the old VGA technology.

DVI comes in several flavors with different labels: DVI-I can carry both analog and digital signals, while DVI-D works only with digital signals. DisplayPort, by contrast, exchanges only digital information.

Screen resolution

A significant difference between DVI and DisplayPort is screen resolution, which is critical to image quality.

DVI offers two options here. With so-called single-link transmission, a maximum resolution of 1600x1200 pixels is achieved. Dual-link transmission is also possible, raising the limit to 2560x1600 pixels; this requires a special cable with a larger number of contacts.

With DisplayPort you can achieve much higher resolutions. The DP 1.3 standard, available since 2014, supports a resolution of 5120x2880 pixels.

Connectors: external difference

The systems use different connectors, which are easy to tell apart visually. DVI connectors are significantly larger than DisplayPort connectors: single-link plugs have 18+5 contacts and dual-link plugs have 24+5, with the extra five contacts carrying the analog signal. The DVI plug is screwed tightly to the monitor to ensure uninterrupted signal transmission.

DisplayPort connectors are much smaller, similar to USB connectors, and require far less space than DVI plugs. They plug in the standard way, without screws; most designs have a mechanical latch to keep the cable from falling out of the socket.

Playing video and audio

With DVI you can transmit only images; separate cables must be used for audio. DisplayPort, however, carries both image and sound.

Another way to connect a computer to an output device is the HDMI standard. In essence, it is an extension of DVI that expands the technology's capabilities: an HDMI channel can carry high-definition digital video together with audio.

Compatibility

DisplayPort is electrically compatible with DVI. If, for example, your PC has a DisplayPort output and your monitor a DVI input, you can connect the two devices with an adapter: the video card detects this and adjusts its signals accordingly.

Connecting cable length

Among other things, the permissible cable length differs. For DVI it is at most five meters, while a DisplayPort cable can be 7 to 10 meters long.

Using Multiple Monitors

An advantage of DisplayPort is the ability to connect multiple devices. If you want to attach more than one monitor to your computer, a single DisplayPort output is enough for the first monitor, with the others chained from it. With DVI this is not possible without separate splitters.

Besides requiring digital data to display images, LCD monitors differ from classic CRT displays in several other ways. For example, a CRT can display almost any resolution the monitor supports, since the tube does not have a fixed number of pixels.

LCD monitors, by contrast, always have a fixed ("native") resolution at which they deliver optimal picture quality. This limitation has nothing to do with DVI; its cause lies in the architecture of the LCD panel itself.

An LCD monitor uses an array of tiny pixels, each made up of three subpixels, one for each primary color (red, green, blue). An LCD screen with a native resolution of 1600x1200 (UXGA) therefore consists of 1.92 million pixels!

Of course, LCD monitors are capable of displaying other resolutions, but in such cases the image has to be scaled, or interpolated. If, for example, an LCD monitor has a native resolution of 1280x1024, then a lower 800x600 signal will be stretched to 1280x1024; the quality of the interpolation depends on the monitor model. The alternative is to display the smaller 800x600 image unscaled, in which case you have to put up with a black border around it.
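The stretching described above can be sketched with the simplest interpolation method, nearest-neighbour scaling, where each target pixel copies the closest source pixel. This is a toy model (real monitors use smarter filters such as bilinear interpolation), and the function name is ours:

```python
def upscale_nearest(img, new_w, new_h):
    """Nearest-neighbour scaling: each output pixel copies the closest source pixel."""
    old_h, old_w = len(img), len(img[0])
    return [[img[y * old_h // new_h][x * old_w // new_w]
             for x in range(new_w)]
            for y in range(new_h)]

src = [[0, 1],
       [2, 3]]                        # a tiny 2x2 "image"
print(upscale_nearest(src, 4, 4))     # each source pixel becomes a 2x2 block
```

The blocky result of this method is exactly the effect visible in the interpolated screenshot below.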

Both frames show the image from the LCD monitor screen. On the left is an image in “native resolution” 1280x1024 (Eizo L885). On the right is an interpolated image at 800x600 resolution. As a result of increasing the pixels, the picture appears blocky. Such problems do not exist on CRT monitors.

To display a 1600x1200 (UXGA) image with 1.92 million pixels at a 60 Hz refresh rate, the interface needs high bandwidth: the arithmetic gives a pixel frequency of 115 MHz. But other factors, such as the blanking intervals, raise the required bandwidth even further.
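The 115 MHz figure is simply width times height times refresh rate; blanking then inflates it. A minimal sketch (the 25% blanking share is the article's own estimate; actual GTF timings for UXGA at 60 Hz come to roughly 162 MHz):

```python
def active_pixel_clock_mhz(width, height, refresh_hz):
    # Pixel data alone, ignoring blanking intervals
    return width * height * refresh_hz / 1e6

active = active_pixel_clock_mhz(1600, 1200, 60)   # 115.2 MHz of pure pixel data
# If roughly 25% of the transmitted stream is blanking, the link must run at:
total = active / (1 - 0.25)                        # about 153.6 MHz by this estimate
print(active, total)
```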

About 25% of all transmitted information falls within the blanking time. In a CRT monitor it is needed to move the electron beam to the next line; LCD monitors, by contrast, require virtually no blanking time.

For each frame, not only image information is transmitted: the borders and the blanking interval are also accounted for. CRT monitors need blanking time to switch off the electron beam when it finishes drawing a line and to move it to the start of the next line. The same happens at the end of the frame: in the lower right corner the beam switches off and returns to the upper left corner of the screen.

Since LCD monitors do not use an electron gun, this blanking time is useless for them. But it had to be taken into account in the DVI 1.0 standard, since DVI can connect not only digital LCDs but also digital CRT monitors (where the DAC is built into the monitor).

Blanking time turns out to be a very important factor when connecting an LCD display via DVI, since each resolution demands a certain bandwidth from the transmitter (video card). The higher the required resolution, the higher the pixel frequency of the TMDS transmitter must be. The DVI standard specifies a maximum pixel frequency of 165 MHz per link. With the 10x frequency multiplication described above, each data channel peaks at 1.65 Gbit/s, which is enough for a resolution of 1600x1200 at 60 Hz. If a higher resolution is required, the display should be connected via Dual Link DVI: two TMDS transmitters then work together, doubling the throughput. This option is described in more detail in the next section.
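The per-channel and per-link numbers follow from the 165 MHz limit; a quick calculation (variable names are ours):

```python
PIXEL_CLOCK = 165e6                      # Hz, single-link DVI maximum

per_channel_bits = PIXEL_CLOCK * 10      # 10 bits per pixel clock after TMDS encoding
payload_bits     = PIXEL_CLOCK * 24      # 24 bits of actual pixel data per clock
dual_link_bits   = payload_bits * 2      # Dual Link doubles the pixel data rate

print(per_channel_bits / 1e9)   # 1.65 Gbit/s per TMDS data channel
print(payload_bits / 1e9)       # 3.96 Gbit/s of pixel data on one link
print(dual_link_bits / 1e9)     # 7.92 Gbit/s of pixel data with Dual Link
```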

However, a simpler and cheaper approach is to reduce the blanking data. This frees up bandwidth for pixel data, so even a 165 MHz DVI transmitter can handle higher resolutions. Another option is to lower the screen's refresh rate.

The top of the table shows the resolutions supported by a single 165 MHz DVI transmitter. Reducing the blanking data (middle) or the refresh rate in Hz (bottom) allows higher resolutions to be reached.
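The table's logic can be sketched as a simple feasibility check. The blanking overheads below (25% for CRT-style GTF timings, 10% for reduced blanking) are assumed round numbers, not exact CVT figures, but they reproduce the well-known outcomes:

```python
SINGLE_LINK_LIMIT_MHZ = 165

def required_clock_mhz(width, height, refresh_hz, reduced_blanking=False):
    # Assumed blanking overheads: ~25% for GTF timings, ~10% when reduced
    overhead = 1.10 if reduced_blanking else 1.25
    return width * height * refresh_hz * overhead / 1e6

# 1920x1200@60 fits a single link only with reduced blanking:
print(required_clock_mhz(1920, 1200, 60))                        # ~172.8 MHz: too high
print(required_clock_mhz(1920, 1200, 60, reduced_blanking=True)) # ~152.1 MHz: fits
# 2560x1600@60 exceeds a single link either way, so Dual Link is needed:
print(required_clock_mhz(2560, 1600, 60, reduced_blanking=True)) # ~270.3 MHz
```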


This illustration shows the pixel clock required for each resolution. The top row shows an LCD monitor operating with reduced blanking data. The second row (60 Hz CRT GTF blanking) shows the bandwidth an LCD monitor needs if the blanking data cannot be reduced.

The 165 MHz pixel-frequency limit of the TMDS transmitter also caps the maximum possible resolution of an LCD display. Even with reduced blanking data we eventually hit a ceiling, and lowering the refresh rate may give poor results in some applications.

To solve this problem, the DVI specification provides an additional operating mode called Dual Link. In this case two TMDS transmitters are combined, transmitting data to one monitor through one connector. The available pixel frequency doubles to 330 MHz, which is enough to output almost any existing resolution. Important note: a video card with two DVI outputs is not automatically a Dual Link card; a Dual Link card routes two TMDS transmitters through a single DVI port!

The illustration shows dual-link DVI operation when two TMDS transmitters are used.

However, a video card with good DVI support and reduced blanking data is quite sufficient to drive one of the new 20" and 23" Apple Cinema displays at their native resolutions of 1680x1050 and 1920x1200, respectively. For the 30" display with its 2560x1600 resolution, however, there is no escaping the Dual Link interface.

Due to the high "native" resolution of the 30" Apple Cinema display, it requires a Dual Link DVI connection!

Although dual DVI connectors have already become standard on high-end 3D workstation cards, not all consumer-grade graphics cards can boast them. Two DVI connectors also make an interesting alternative possible.

In this example, two single-link ports are used to connect a nine-megapixel (3840x2400) display: the picture is simply split into two halves. Both the monitor and the video card must support this mode.

Currently you can find six different DVI connectors: DVI-D for fully digital connections, in single-link and dual-link versions; DVI-I for combined analog and digital connections, also in two versions; DVI-A for analog connections; and the new VESA DMS-59 connector. Most graphics card manufacturers equip their products with a dual-link DVI-I connector even when the card has only one port. With an adapter, a DVI-I port can be converted into an analog VGA output.

Overview of various DVI connectors.


DVI connector layout.

The dual-link DMS-59 connector is not covered by the DVI 1.0 specification. It was introduced by a VESA working group in 2003 and allows two DVI outputs to be provided on small-form-factor cards. It is also intended to simplify the connector layout on cards that support four displays.

Finally, we come to the core of our article: the quality of the TMDS transmitters on different graphics cards. Although the DVI 1.0 specification stipulates a maximum pixel frequency of 165 MHz, not all video cards produce an acceptable signal at that rate. Many reach 1600x1200 only at reduced pixel frequencies and with reduced blanking times. If you try to connect a 1920x1080 HDTV device to such a card (even with reduced blanking), you are in for an unpleasant surprise.

All GPUs currently shipped by ATi and nVidia already have an on-chip TMDS transmitter for DVI. Card makers using ATi GPUs most often rely on the integrated transmitter for the standard 1x VGA + 1x DVI combination. By comparison, many nVidia-based cards use an external TMDS module (for example, from Silicon Image) even though the chip itself has a TMDS transmitter. To provide two DVI outputs, the card manufacturer always installs a second TMDS chip, regardless of which GPU the card is based on.

The following illustrations show common designs.

Typical configuration: one VGA and one DVI output. The TMDS transmitter can be either integrated into the graphics chip or placed on a separate chip.

Possible DVI configurations: 1x VGA and 1x Single Link DVI (A), 2x Single Link DVI (B), 1x Single Link and 1x Dual Link DVI (C), 2x Dual Link DVI (D). Note: two DVI outputs on a card do not mean they are dual-link! Illustrations E and F show the new high-density VESA DMS-59 port configuration, providing four or two single-link DVI outputs.

As further testing in our article will show, the quality of DVI output on ATi and nVidia cards varies greatly. Even if the discrete TMDS chip on a card is known for its quality, that does not mean every card with that chip will deliver a high-quality DVI signal; even the chip's placement on the board strongly affects the result.

DVI compatibility

To test the DVI quality of modern graphics cards based on ATi and nVidia processors, we sent six sample cards to the Silicon Image test labs for DVI compliance testing.

Interestingly, obtaining a DVI license does not require passing compliance tests at all. As a result, products reach the market that claim DVI support but do not meet the specification. One reason is the complex, and therefore expensive, testing procedure.

In response to this problem, Silicon Image founded the DVI Compliance Test Center (CTC) in December 2003. Manufacturers of DVI-enabled devices may submit their products for DVI compliance testing, and that is exactly what we did with our six graphics cards.

The tests are divided into three categories: transmitter (usually a video card), cable, and receiver (monitor). To evaluate DVI compliance, so-called eye diagrams are created to represent the DVI signal. If the signal stays within certain limits, the test is passed; otherwise, the device is not DVI-compliant.

The illustration shows the eye diagram of a TMDS transmitter at 162 MHz (UXGA) transmitting billions of bits of data.

The eye-diagram test is the most important test for evaluating signal quality. The diagram reveals phase jitter, amplitude distortion and ringing, and gives a clear picture of DVI signal quality.

DVI compatibility tests include the following checks.

  1. Transmitter: an eye diagram with specified boundaries.
  2. Cable: eye diagrams are created before and after transmission and compared. Again, the permissible deviations are strictly defined, but larger departures from the ideal signal are allowed here.
  3. Receiver: an eye diagram is created once more, with even larger deviations permitted.

The biggest problem with serial high-speed transmission is signal phase jitter; without it, the signal on the diagram is always clearly defined. Most jitter is generated by the graphics chip's clock signal, resulting in low-frequency jitter in the 100 kHz to 10 MHz range. In an eye diagram, jitter shows up as variations in the clock, in the data, in the data relative to the clock, in amplitude, and as rise times that are too fast or too slow. In addition, DVI measurements differ at different frequencies, which must be taken into account when reading the eye diagram. Thanks to the eye diagram, the quality of a DVI signal can be assessed at a glance.

For the measurements, one million overlapping samples are analyzed with an oscilloscope. This is sufficient to evaluate the overall performance of a DVI connection, since the signal does not change significantly over long periods. The data is visualized with special software that Silicon Image created in collaboration with Tektronix. A signal that complies with the DVI specification must not enter the keep-out regions (blue areas) drawn automatically by the software; if it does, the test is failed and the device is not DVI-compliant. The program shows the result immediately.

The video card did not pass the DVI compatibility test.

The software immediately shows whether the card passed the test or not.

Different boundaries (eyes) are used for the cable, transmitter and receiver. The signal must not enter these regions.

To understand how DVI compatibility is determined and what needs to be considered, we need to dive into more detail.

Since DVI transmission is entirely digital, the question arises where phase jitter comes from. Two causes are possible. The first is the data itself, that is, the 24 parallel bits produced by the graphics chip. However, the data is re-latched in the TMDS chip when necessary, which keeps it free of jitter. The remaining cause is therefore the clock signal.

At first glance the data signal appears interference-free, which is guaranteed by the latch register built into the TMDS transmitter. The main problem remains the clock signal, which corrupts the data stream through the 10x PLL multiplication.

Since the frequency is multiplied by a factor of 10 by the PLL, the impact of even small amounts of distortion is magnified. As a result, the data reaches the receiver no longer in its original state.

Above is an ideal clock signal; below, a signal in which one of the edges arrives too early. Because of the PLL, this directly affects the data signal: in general, every disturbance in the clock signal leads to errors in data transmission.

When the receiver samples the corrupted data signal using the "ideal" hypothetical PLL clock, it receives erroneous data (yellow bar).

How it actually works: if the receiver uses the transmitter's corrupted clock signal, it can still read the corrupted data correctly (red bar). This is why the clock signal is transmitted over the DVI cable as well: the receiver needs the same (corrupted) clock.

The DVI standard thus incorporates jitter tolerance: if both ends use the same corrupted clock signal, the information can be read from the corrupted data signal without error. DVI-compliant devices can therefore operate even in environments with low-frequency jitter, because the clock error is effectively cancelled out.

As explained above, DVI works optimally when the transmitter and receiver use the same clock signal and have the same architecture. This is not always the case, which is why DVI can cause problems despite the sophisticated anti-jitter measures.

The illustration shows the optimal scenario for DVI transmission. Multiplying the clock in the PLL introduces a delay, so the data stream is no longer aligned with the clock. But everything is corrected because the receiver's PLL introduces the same delay, and the data is received correctly.

The DVI 1.0 standard clearly defines the PLL latency; this architecture is called non-coherent. If a PLL does not meet the latency specification, problems can arise. There is heated debate in the industry about whether such a non-coherent architecture should be used at all, and a number of companies favor a complete revision of the standard.

In this example the PLL clock is used instead of the graphics chip's clock, so the data and clock signals are aligned. However, because of the delay in the receiver's PLL, the data is not sampled correctly and the jitter cancellation no longer works!

It should now be clear why long cables can be problematic even without external interference: a long cable can delay the clock signal relative to the data (remember that data and clock signals occupy different frequency ranges), and this additional delay can degrade signal reception.

DVI is used to transmit the video signal in digital form. The interface was developed in the period when DVD discs first appeared, when there was a need to transfer high-quality video from a PC to a monitor.

The analog transmission methods known at the time could not deliver such high-quality images to the monitor, since a high-resolution analog signal cannot be carried over distance without degradation.

Distortion can appear in the channel at any time, especially at higher frequencies, and HD signals are precisely high-frequency signals. To avoid this kind of interference and distortion, manufacturers of modern equipment set out to abandon analog transmission and switch to a digital signal path for processing and delivering video to the monitor.

In the 1990s, manufacturers joined forces, and the result was DVI technology.

The DVI connector is one of the most popular ways to attach monitors and projectors. However, the presence of a DVI interface on a device does not guarantee that the user can exploit all of the port's capabilities. In this article we look at DVI-I and DVI-D and the differences and similarities between these ports.

DVI Connector Features

These ports are responsible for delivering the image to the monitor. There are several modifications of the connector, carrying digital and/or analog signals. The port most often comes in two variants: DVI-I and DVI-D.

Is there a difference between them? Which is better, DVI-D or DVI-I? More on this below.

DVI-I interface

This interface is the most widely used on video cards. The "I" stands for "integrated": the port combines two data channels, analog and digital, which function independently. DVI-I comes in two modifications:

  • Single Link. This variant includes independent digital and analog channels; which one is active depends on the connector on the video adapter and how the connection is made.

This type of interface is not used by professionals, because a single link cannot drive 30″ high-resolution LCD monitors.

  • Dual Link. This is an upgraded port containing two digital links and one analog channel, all operating independently of one another.

Note also that most video cards have at least two DVI-I connectors.

DVI-D interface

This port looks different from DVI-I and likewise comes in two variants. The first, Single Link, contains only one channel, which is not enough to drive 3D monitors.

Dual Link is the second type. It has no analog channels, but offers much greater transmission capability: the two links make it possible to send a three-dimensional image to the monitor, since together they support 120 Hz operation and can carry high resolutions.

The main differences between DVI-I and DVI-D

Most modern video cards ship with a DVI interface instead of the classic but outdated VGA; of course, HDMI should not be forgotten either. As noted earlier, DVI comes in two types, so what is the difference between DVI-I and DVI-D?

The differences boil down to the following: DVI-I can transmit both analog and digital signals, while DVI-D transmits only digital ones. DVI-D is therefore unsuitable for connecting an analog monitor.

DVI is the digital video connector that replaced VGA. DVI-I carries both digital and analog signals; the analog signal is needed for compatibility with older CRT monitors. Over time this option became unnecessary, video cards moved to purely digital signals, and DVI-D took over these tasks.

Note that you cannot insert a DVI-I plug (or a DVI-I cable) into a DVI-D socket: the connectors are keyed differently. A DVI-D plug, however, fits a DVI-I socket without any problems. This combination delivers a purely digital signal; no analog signal is read, because the DVI-D connector lacks the pins that DVI-I uses for analog transmission.

What do they have in common?

Having examined the differences between DVI-I and DVI-D, we can now consider what they have in common.

DVI-I is universal and can transmit both types of signals, digital and analog. With additional adapters for connecting to other devices, DVI-I can handle different formats efficiently. When used for a digital signal, it behaves practically identically to DVI-D.

Monitors can be connected to other devices through a variety of interfaces, of which there are currently plenty. Depending on the technology, connections come in two types, analog and digital; the latter are represented by two main interfaces, DVI-D and HDMI. Which is better, and what exactly is the difference between these technologies? Which connector should you choose? Below we look in more detail at why HDMI may be preferable to DVI.

Requirements of modern technology

To determine which connector is better, DVI or HDMI, it is worth explaining why only these two are considered, since other interfaces exist, such as DisplayPort and VGA. Firstly, DisplayPort is used primarily to connect a monitor to a computer and to link multiple screens, but it is not the usual choice for HD consumer equipment. VGA, meanwhile, is considered an outdated solution, and many leading companies have abandoned it in their products.

Analog transmission is also fading into the background, because it clearly reveals image imperfections: the signal is first converted to analog and then back again, and these unnecessary conversions produce noise on the screen, with ghosting and doubled copies of objects, buttons or text becoming a common problem. These shortcomings were especially noticeable on the first modern LCD monitors, which supported only a VGA connection.

As for connectors, most manufacturers now place ports for both digital cables on the rear panels of their equipment. As a rule, this only complicates the decision of which cable is better, HDMI or DVI.

Common features of interfaces

Both HDMI and DVI transmit video using the same technology, TMDS (Transition-Minimized Differential Signaling). The information is encoded so as to minimize the number of signal transitions in the bit stream, which allows a high clock frequency and, as a result, a higher-quality image.
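The "transition-minimized" idea can be illustrated with a simplified version of the first TMDS encoding stage, which chains the input bits through XOR or XNOR and picks whichever variant produces fewer transitions (DC balancing and the full selection rule of the real standard are omitted here):

```python
def transitions(bits):
    # Count 0->1 / 1->0 changes in the serial stream
    return sum(a != b for a, b in zip(bits, bits[1:]))

def stage1(bits, use_xor):
    # Simplified TMDS stage 1: q[0] = d[0]; q[i] = q[i-1] XOR d[i] (or XNOR)
    out = [bits[0]]
    for b in bits[1:]:
        x = out[-1] ^ b
        out.append(x if use_xor else 1 - x)
    return out

data = [1, 0, 1, 0, 1, 0, 1, 0]       # worst case: alternating bits, 7 transitions
encoded = stage1(data, use_xor=False)  # XNOR chaining
print(transitions(data), transitions(encoded))   # 7 vs 4: fewer edges on the wire
```

Fewer transitions mean less high-frequency content and cleaner eyes at a given clock rate, which is exactly why both interfaces can run at such high frequencies.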

In addition, a single cable is used for either port (even though HDMI is a single-link solution while DVI can be multi-link, as discussed below), and special adapters make it possible to connect one to the other.

Distinctive features

Many personal computer users choose HDMI. Why is this so, and what can't DVI boast? There are two main differences:

  1. An HDMI cable can transmit not only video but also audio, ensuring high quality of both sound and picture. Most DVI implementations lack this feature, although there are exceptions to the rule.
  2. HDMI is a single-link cord, yet its data rate reaches several gigabits per second (10.2 Gbit/s in HDMI 1.3). DVI, by contrast, offers several channels, one of which can carry an analog signal. For devices that rely on such a signal, a DVI cable is a godsend: the standard keeps up with the times while still serving owners of equipment without innovative "filling".

To decide whether HDMI is better than DVI, keep in mind that image quality largely depends on the device being connected: the picture on the screen is determined by the signal source, while the cable is responsible for the stability and quality of its transmission.

Do not try to plug one cord in place of the other; special adapters exist for that, and otherwise you may simply lose the sound. Although both interfaces use the same underlying technology, the differences make themselves felt.

High-Definition Multimedia Interface

What makes HDMI stand out is that the High-Definition Multimedia Interface is used by a great many equipment manufacturers. It is so widespread that HDMI can connect not only TVs and monitors to computers, but also laptops, tablets, smartphones, game consoles and media players.

Although the cable consists of only one link, its bandwidth is wide enough to build a whole system out of various multimedia devices, which is especially valuable in some setups.

New versions of the cable are backward compatible and easily replace previous models. HDMI offers high bandwidth, which matters to gamers and to users who value high speeds and improved sound. At the same time, HDMI is purely digital: older analog equipment cannot be connected with it.

Digital Visual Interface

The DVI interface has three varieties supporting different modes: digital, analog and combined analog/digital. The cable can carry a high-resolution picture over a distance of no more than five meters. Transmission works in two modes, single link and dual link; the latter operates at higher frequencies, so if the picture is poor in single-link mode, dual link will correct the situation.

Competitive advantage

HDMI is in any case the more advanced interface. Its developers follow trends and keep up with the times, and most modern equipment, now and in the near future, will use this type of cable.

So which is better to choose for a computer, HDMI or DVI? Either interface can be used; the final choice depends on your purpose. If high-quality sound is important, use HDMI; if not, DVI is also suitable. DVI is especially good because its developers, while advancing the technology, have not forgotten those who use older devices. Quite a few users still have computers and televisions that are no longer "in trend," and in that case DVI is the better choice.