VGA-compatible video controller: what it is and how to fix problems with graphics adapters

Very often, users of computers and laptops face a rather unpleasant situation: after reinstalling the operating system, the discrete graphics accelerator installed in the motherboard disappears from the list of equipment shown in Device Manager. Instead of their own video card, the user sees some kind of video controller (VGA-compatible) marked with a yellow triangle and an exclamation mark, which indicates that no driver is installed for it. It also happens that the device is not marked as lacking a driver, yet serious problems begin when launching games, because they do not detect the required graphics adapter in the system. Why this happens and what can be done in such a situation is discussed below.

What is a video controller (VGA-compatible) in Device Manager?

Let's start with the fact that such a device, displayed in the list of equipment in Device Manager, is only indirectly related to the non-working graphics adapter. The system simply detects the video card not as the hardware actually on board but as a kind of virtual adapter. You can sometimes guess that this is the physical card from the fact that it is often listed as a PCI video controller (VGA-compatible), and a PCI slot on the motherboard is exactly what the graphics adapter is installed in. But, again, the operating system sees it exclusively as a virtual controller. Why?

Why is the wrong driver installed?

The problem of an incorrect driver being installed is most often due to the fact that Windows cannot find the necessary control software for the graphics adapter in its own database (for those who do not know: during both the initial installation and any repeated installation, the system uses exclusively its own driver databases).

Another very common situation: when you reinstall the system without formatting the system partition, the newly installed OS may inherit errors from the old one in which the graphics adapter drivers were not completely removed. Because of this, conflicts arise, and Windows itself installs the driver it considers most suitable, which is completely unsuitable for the operation of the video card. You may also come across cases when the name of the video card is displayed, yet the system still reports that a driver is installed for a card that is not present in the system, namely a VGA-compatible video controller (NVIDIA, for example). For GeForce series devices the reason is precisely that outdated drivers were not completely removed.

How to reinstall the driver for a VGA-compatible video controller: the simplest method

Despite such conflicts, the situation can be corrected quite simply.

First of all, in Device Manager select the VGA-compatible video controller in the list, open the right-click (RMB) menu, choose the driver update item, and then let the system search for updated drivers. If this does not help, rolling back the driver may well solve the problem (only if the corresponding button in the adapter's properties section is active).

If this does not work, simply remove the device from the system and see how correctly the graphics adapter is detected afterwards (in some cases this happens instantly; sometimes a reboot is required).
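Before and after removing the device, it can be useful to check what Windows currently reports as the display adapter. The snippet below is a minimal sketch that queries the standard Win32_VideoController WMI class through the wmic tool (deprecated but still present on most Windows installations); the exact output depends on your system.

```python
# Minimal sketch: list the display adapters Windows currently recognizes (Windows only).
# Uses the built-in "wmic" tool and the standard Win32_VideoController WMI class.
import subprocess

result = subprocess.run(
    ["wmic", "path", "Win32_VideoController",
     "get", "Name,DriverVersion,PNPDeviceID", "/format:list"],
    capture_output=True, text=True
)

for line in result.stdout.splitlines():
    if line.strip():                 # skip the blank separator lines wmic prints
        print(line.strip())          # e.g. Name=Microsoft Basic Display Adapter
```

If the card still shows up here only as a generic VGA-compatible controller, the driver problem has not yet been resolved.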

Using automated programs and driver databases

In the case of discrete graphics chips, you can also use the driver disc supplied with the card when you purchased it. Another good way to solve the problem is to visit the manufacturer's official website, where you can find the latest software for your video card model.

For NVIDIA and ATI adapters, the manufacturers often provide additional programs that also allow installation or updating (for example, NVIDIA GeForce Experience). If using them gives nothing, try automated tools such as DriverPack Solution or Driver Booster. The first utility contains its own database, which is much larger than the one Windows uses, and both applications can access the manufacturers' official resources over the Internet to download and install updates. You can also use some informational utilities.

For example, in the popular Everest program, when viewing information about image output devices, you can also download drivers for the video card.

What to do if the driver cannot be found?

If none of the above helped and the list of graphics devices still contains only a VGA-compatible video controller, open Device Manager, call up the device's properties via the RMB menu, go to the Details tab, select the hardware IDs view in the drop-down list, copy or write down the longest string containing the DEV and VEN identifiers, then use it to search for a driver on the Internet, download the necessary software and install it yourself.
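For reference, a PCI hardware ID usually looks like PCI\VEN_xxxx&DEV_xxxx&SUBSYS_..., and it is the VEN (vendor) and DEV (device) codes that driver search sites ask for. The snippet below is a minimal sketch that extracts them from such a string; the ID used here is a made-up example, not taken from a real machine.

```python
# Minimal sketch: pull the VEN and DEV codes out of a PCI hardware ID string.
import re

hardware_id = r"PCI\VEN_10DE&DEV_0DF8&SUBSYS_05781028&REV_A1"  # example value, not from a real system

match = re.search(r"VEN_([0-9A-F]{4})&DEV_([0-9A-F]{4})", hardware_id, re.IGNORECASE)
if match:
    vendor, device = match.groups()
    print("Vendor ID:", vendor)   # 10DE is NVIDIA's PCI vendor ID
    print("Device ID:", device)
```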

Note: if you receive errors when installing the driver you found, you will have to remove all the old drivers manually. To do this, run an analysis in the Driver Sweeper program, delete all the elements it finds, open the registry editor (regedit), search for keys by the name of the video card manufacturer, delete everything that is found, restart the computer and try to install the drivers again. Sometimes the problem may be related to PhysX components, so it is possible you will have to remove them as well.

A video adapter (also known as a graphics card or video card) is a device that converts an image stored in the computer's memory into a video signal for the monitor.

Characteristics

Main characteristics of video adapters:

Memory bus width, measured in bits - the number of bits of data transferred in parallel per memory access. An important parameter for card performance.

The amount of video memory, measured in megabytes - the RAM built into the board itself; the value shows how much data the graphics card can store.

Core and memory frequencies, measured in megahertz; the higher they are, the faster the video card processes information (the memory frequency combined with the bus width determines the memory bandwidth - see the sketch after this list).

Process technology - the fabrication process, measured in nanometers (nm); modern cards are produced on 110 nm or 90 nm process standards. The smaller this parameter, the more elements can be placed on a die.

Texture and pixel fill rate, measured in millions of pixels per second, shows the amount of information output per unit of time.

Card outputs - previously the video adapter had only a VGA connector; now boards are additionally equipped with a DVI-I output, or simply with two DVI-I outputs for connecting two LCD monitors, as well as a composite video output and video input (designated ViVo).
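To see how the bus width and memory frequency combine into bandwidth, here is a minimal sketch of the usual back-of-the-envelope estimate; the figures used (a 256-bit bus and 500 MHz DDR memory) are illustrative examples, not the specification of any particular card.

```python
# Rough peak memory bandwidth: width (bits) / 8 * effective memory clock (Hz).
bus_width_bits = 256           # memory bus width, example value
memory_clock_hz = 500e6        # base memory clock, example value
ddr_multiplier = 2             # DDR memory transfers data on both clock edges

bandwidth_bytes_per_s = bus_width_bits / 8 * memory_clock_hz * ddr_multiplier
print(f"{bandwidth_bytes_per_s / 1e9:.1f} GB/s")   # -> 32.0 GB/s
```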

Design

A modern graphics card consists of the following parts:

graphics processing unit (GPU) - performs the calculations for the output image, relieving the central processor of this responsibility, and processes three-dimensional graphics commands. It is the basis of the graphics card; the performance and capabilities of the entire device depend on it. Modern GPUs are not much inferior in complexity to the computer's central processor, and often surpass it in the number of transistors. The architecture of a modern GPU usually includes several processing units, namely a 2D graphics unit and a 3D graphics unit, the latter usually divided in turn into a geometry core (plus a vertex cache) and a rasterization unit (plus a texture cache), etc.

video controller - responsible for forming the image in video memory; it issues commands to the RAMDAC to generate scan signals for the monitor and processes requests from the central processor. In addition, there is usually an external data bus controller (for example PCI or AGP), an internal data bus controller and a video memory controller. The width of the internal bus and of the video memory bus is usually greater than that of the external bus (64, 128 or 256 bits versus 16 or 32); many video controllers also have the RAMDAC built in. Modern graphics adapters (ATI, nVidia) usually have at least two video controllers that operate independently of each other and each simultaneously drive one or more displays.

video memory - acts as a frame buffer in which the image generated and constantly modified by the graphics processor is stored in digital form and displayed on the monitor (or several monitors). Video memory also stores intermediate image elements that are not visible on the screen, and other data. Video memory comes in several types, differing in access speed and operating frequency. Modern video cards are equipped with DDR, DDR2 or GDDR3 memory. Keep in mind that, in addition to the video memory located on the video card, modern graphics processors usually also use part of the computer's system memory, direct access to which is provided to the video adapter driver via the AGP or PCIE bus.

digital-to-analog converter (DAC, RAMDAC) - converts the image generated by the video controller into color intensity levels supplied to the analog monitor. The possible color range of the image is determined only by the RAMDAC parameters. Most often the RAMDAC has four main blocks: three digital-to-analog converters, one for each color channel (red, green, blue - RGB), and SRAM for storing gamma correction data. Most DACs have a resolution of 8 bits per channel, which gives 256 brightness levels for each primary color and a total of 16.7 million colors (and, thanks to gamma correction, the original 16.7 million colors can be mapped into a much larger color space). Some RAMDACs have a 10-bit depth per channel (1024 brightness levels), which allows more than 1 billion colors to be displayed, but this feature is practically unused. To support a second monitor, a second DAC is often installed. It is worth noting that monitors and video projectors connected to the digital DVI output of a video card use their own digital-to-analog converters to convert the digital data stream and therefore do not depend on the characteristics of the video card's DAC.
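The color counts quoted above follow directly from the per-channel bit depth; here is a minimal sketch of the arithmetic:

```python
# Number of displayable colors for a given per-channel DAC resolution.
for bits_per_channel in (8, 10):
    levels = 2 ** bits_per_channel      # brightness levels per primary color
    colors = levels ** 3                # three channels: R, G, B
    print(f"{bits_per_channel}-bit/channel: {levels} levels, {colors:,} colors")
# 8-bit/channel: 256 levels, 16,777,216 colors
# 10-bit/channel: 1024 levels, 1,073,741,824 colors
```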

video ROM (Video ROM) - a read-only memory device in which the video BIOS, screen fonts, service tables, etc. are recorded. The ROM is not used directly by the video controller - only the central processor accesses it. The video BIOS stored in ROM ensures the initialization and operation of the video card before the main operating system is loaded, and also contains system data that can be read and interpreted by the video driver during operation (depending on how responsibilities are divided between the driver and the BIOS). Many modern cards carry electrically reprogrammable ROMs (EEPROM, Flash ROM), allowing the video BIOS to be rewritten by the user with a special program.

cooling system - designed to keep the temperature of the video processor and video memory within acceptable limits. Correct and fully functional operation of a modern graphics adapter is ensured by the video driver - special software supplied by the video chip manufacturer and loaded while the operating system starts. The video driver acts as an interface between the system (with the applications running on it) and the video adapter. Like the video BIOS, the video driver organizes and programmatically controls the operation of all parts of the video adapter through special control registers accessed via the corresponding bus.

Video memory

In addition to the data bus, the second bottleneck of any video adapter is the bandwidth of its own memory. Initially, the problem arose not so much because of the speed of video data processing (today the issue is more often the video controller's "information hunger", when it processes data faster than it can be read from or written to video memory), but because of the need for access to it from the video chip, the central processor and the RAMDAC. The fact is that at high resolutions and large color depths, in order to display a screen page on the monitor, all this data must be read from video memory and converted into an analog signal that goes to the monitor. To put it more simply, the image you see on the monitor screen is stored not in the monitor but in the memory of the video adapter, and it must be read from memory and output to the screen as many times per second as the monitor displays frames per second.

Take one screen page at a resolution of 1024x768 pixels and a color depth of 24 bits (True Color): that is 2.25 MB. At a frame rate of 75 Hz, this page must be read from the video adapter's memory 75 times per second (the pixels read are passed to the RAMDAC, which converts the digital color data into an analog signal sent to the monitor), and no pixel may be delayed or skipped, so the nominal video memory bandwidth required for this resolution is approximately 170 MB/s - and this does not even count the video controller's own reads and writes. For a resolution of 1600x1200 at 32 bits and the same 75 Hz, the nominal required bandwidth is already 550 MB/s; for comparison, the Pentium II processor had a peak memory speed of 528 MB/s. The problem could be solved in two ways: either use special types of memory that allow two devices to read from it at the same time, or use very fast memory. The memory types are discussed below.
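The figures above are easy to reproduce; here is a minimal sketch of the calculation:

```python
# Nominal video-memory bandwidth needed just to refresh the screen
# (ignoring the video controller's own rendering traffic).
def refresh_bandwidth_mb_s(width, height, bits_per_pixel, refresh_hz):
    bytes_per_frame = width * height * bits_per_pixel / 8
    return bytes_per_frame * refresh_hz / 2**20      # MB/s

print(round(refresh_bandwidth_mb_s(1024, 768, 24, 75)))    # ~169 MB/s
print(round(refresh_bandwidth_mb_s(1600, 1200, 32, 75)))   # ~549 MB/s
```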

FPM DRAM (Fast Page Mode Dynamic RAM - dynamic RAM with fast page access) - the basic type of video memory, identical to that used in motherboards. It uses asynchronous access, in which control signals are not strictly tied to the system clock frequency. It was actively used until approximately 1996.

VRAM (Video RAM) - the so-called dual-port DRAM. This type of memory provides access for two devices at once: it is possible to write data to a memory cell and at the same time read data from a neighboring cell. This makes it possible to combine displaying the image on the screen with its processing in video memory, which reduces access delays and increases operating speed. That is, the RAMDAC can freely output the screen buffer to the monitor over and over again without interfering in any way with the video chip's manipulation of the data. However, it is still the same DRAM, and its speed is not very high.

WRAM (Window RAM) - a variant of VRAM with roughly 25% higher bandwidth and support for some frequently used functions, such as font drawing, moving image blocks, etc. It was used almost exclusively on Matrox and Number Nine accelerators, since it requires special access and data processing methods, and the existence of only one manufacturer of this memory type (Samsung) greatly limited its use. Video adapters built on this type of memory do not tend to lose performance at high resolutions and screen refresh rates, whereas with single-port memory in such cases the RAMDAC takes up more and more of the video memory bus time and the performance of the video adapter can drop significantly.

EDO DRAM (Extended Data Out DRAM - dynamic RAM with extended data output time) - a type of memory with pipelining elements that speeds up the exchange of data blocks with video memory by approximately 25%.

SDRAM (Synchronous Dynamic RAM - synchronous dynamic RAM) replaced EDO DRAM and other asynchronous single-port memory types. After the first read from memory or the first write to memory, subsequent reads or writes occur with zero delay. This achieves the highest possible speed of reading and writing data.

DDR SDRAM (Double Data Rate) - a variant of SDRAM with data transfer on both signal edges, which doubles the effective operating speed. Further development continues in the form of further packing of transfers into one bus cycle (DDR2, QDDR, etc.).

SGRAM (Synchronous Graphics RAM) - a variant of DRAM with synchronous access. In principle, SGRAM operation is completely similar to SDRAM, but it additionally supports some specific functions, such as block and masked writes. Unlike VRAM and WRAM, SGRAM is single-ported, but it can open two memory pages as one, emulating the dual-port nature of other types of video memory.

MDRAM (Multibank DRAM - multi-bank RAM) - a variant of DRAM developed by MoSys, organized as many independent banks of 32 KB each, operating in pipelined mode.

RDRAM (RAMBus DRAM) - memory that uses a special data transmission channel (the Rambus Channel), a data bus one byte wide. Very large data flows can be transmitted through this channel; the highest data transfer rate for one channel is currently 1600 MB/s (a frequency of 800 MHz, with data transmitted on both clock edges). Several memory chips can be connected to one such channel. The controller of this memory works with one Rambus channel; four such controllers can be placed on one logic chip, so in theory up to four channels can be supported, providing a maximum throughput of 6.4 GB/s. The disadvantage of this memory is that information must be read in large blocks, otherwise its performance drops sharply.

Accelerators

Let's start, as usual, with history. The first video cards were not even 3D accelerators - they were not accelerators at all. They served only as a DAC (digital-to-analog converter): they converted data calculated by the central processor (represented as digital code) into an analog signal suitable for display on a monitor. But the complexity of images grew, and this could not continue. The trend towards more complex images led to the appearance of the 2D accelerator - a video card with its own, albeit very simple, processor that took over some functions and offloaded the central processor. But when the need arose to build 3D images, the situation became more complicated.

To build, say, a simple fragment of a wall, the processor had to perform the following operations: first select the edges of the object, then apply textures, add lighting... and when there are hundreds of such objects, their shape is complex, they move, overlap, cast shadows and so on, the task becomes incredibly difficult. To help the processor solve this problem, 3D graphics accelerators were created, whose operation is discussed in this article.

So, each stage of image construction is very resource-intensive and requires many calculations. It seems quite logical to move them off the CPU onto a specialized video card processor, especially considering that graphics data is streaming in nature and the computational load is much heavier than the logical one. Each new round of accelerator development represents a certain generation, so first let's introduce a classification of generations (generations can be defined in different ways - here is just one option):

1. The first generation to become more or less widespread were accelerators using the Direct3D 5 and Glide APIs. A representative of the former was the NVIDIA Riva128, and of the latter the 3Dfx Voodoo. Cards of this generation took on only the last part of scene construction - texturing and shading. All previous steps were performed by the CPU.

2. The second generation used the Direct3D 6 API, and at this time the rapid revival of OpenGL, the API developed by SGI, began. Representatives of the cards of that time were the NVIDIA RivaTNT and ATI Rage. This was essentially an evolutionary development of the previous generation of cards.

3. The third generation - Direct3D 7. It was then that cards equipped with a TCL (Transformation-Clipping-Lighting) block appeared, which removed a significant part of the load from the CPU. This block was responsible for transformation, lighting and clipping. Now the video card built the scene on its own, from start to finish. Representatives of this generation were the NVIDIA GeForce 256 and ATI Radeon.

4. The fourth generation was another revolution. Besides the other new capabilities of the Direct3D 8 (and 8.1) API, these cards brought with them a most important feature - hardware shaders. We will describe the reason for their appearance a little later. This generation is represented by the NVIDIA GeForce 3 and 4 and the ATI Radeon 8500, 9000, 9100 and 9200.

5. The fifth generation is mainly the development of shader technology (version 2.0) and an attempt to make AA and AF part of the mandatory feature set. This generation, which supports Direct3D API versions up to 9.0b inclusive, is represented by the ATI RADEON 9500, 9600, 9700, 9800 and X800, as well as the NVIDIA GeForce FX 5200, 5500, 5600, 5700, 5800, 5900 and 5950.

6. The sixth generation is the DirectX 9.0c generation. So far it includes only one NVIDIA series - the GeForce 6, with the GeForce 6800 Ultra/6800GT/6800 boards based on the NV40 chip. These cards support version 3.0 shaders and offer some other features. Now, having settled on the general structure of the pipeline and the generations of video cards, let's take a closer look at vertex and fragment processors and determine the differences between the versions of the corresponding shaders.

The reason for the appearance of shaders was the lack of flexibility in the fixed TCL block. It quickly became clear that waiting for manufacturers to add the next portion of functions to the TCL block of video cards was not the best way out. This approach suited no one. Developers did not like the idea that, to introduce a new effect into a game, they would have to wait a year for the release of a new accelerator. Manufacturers were not in a better position either: they would have had to constantly grow both the chips themselves and the drivers for them. This was the reason for the emergence of shaders - programs that can configure the accelerator as required by the next scene. A shader is a program that is loaded into the accelerator and configures its units to process the corresponding elements. There is no longer a limitation to a predetermined set of effects and processing methods; it is now possible to compose arbitrary programs from standard instructions (limited only by the specification of the shader version used) that produce the required effects.

Shaders are divided by function into vertex and fragment (pixel) shaders: the former work with vertices and triangles, replacing the functionality of the TCL block (which has now practically disappeared - if necessary, it is emulated by a special vertex shader). Fragment shaders are used to write programs that process fragments of 2x2 pixels - quads. They are needed to implement certain texture effects. Shaders are also characterized by a version number: each subsequent version adds new capabilities to the previous ones. The latest specification for fragment and vertex shaders today is version 3.0, supported via the DirectX 9.0c API, and both accelerator manufacturers and developers of new games will focus on it. Users who want to buy a modern gaming video card should also pay attention to its hardware support of this version.

The main difference between shaders 3.0 and previous versions (except 2.0a) is DFC - Dynamic Flow Control. On the one hand, it is an excellent feature that can significantly increase the speed of scene construction; on the other hand, it means extra transistors and, as side effects, extra heat and lower maximum frequencies. Let's describe this feature in more detail. Imagine a situation where, for some vertex (or fragment), not the whole shader needs to be executed, but only 12% of it. With DFC, we execute only the required 12%, based on the object's parameters. Without DFC, we are forced to execute the entire shader. It is easy to see that with DFC we gain almost a factor of ten, while paying with reduced performance on vertices for which 100% of the shader must be executed. This is why debates about whether it is good or not are still raging on the Internet. I will not make comparisons - everyone makes their own choice here - but I will note that I am personally a supporter of the third shader model.

The first shaders consisted of just a few commands and were easy to write in low-level assembly language. Although the complexity of debugging assembly code initially scared many developers away from shaders, as shader effects grew to tens and hundreds of commands, the need arose for a more convenient, high-level shader language. Two appeared at once: NVIDIA Cg (C for graphics) and Microsoft HLSL (High Level Shading Language), the latter being part of the DirectX 9 standard. The advantages and disadvantages of these languages and other nuances will interest only programmers, so we will not dwell on them in more detail. We will only note that Cg is not widely used due to the emergence of the newer GLSL, an analogue of HLSL for the OpenGL API.
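To make this division of labour more concrete, here is a minimal Python sketch of the kind of per-vertex work (transformation plus simple diffuse lighting) that the TCL block, and later vertex shaders, perform in hardware; the matrix and light values are placeholders, and this is only an illustration, not a real shader.

```python
# Minimal sketch of per-vertex "transform and light" work
# (the job of a TCL block or a simple vertex shader).
import numpy as np

model_view_projection = np.eye(4)             # placeholder combined transform matrix
light_direction = np.array([0.0, 0.0, 1.0])   # placeholder directional light

def process_vertex(position, normal):
    # Transform the vertex into clip space (homogeneous coordinates).
    clip_position = model_view_projection @ np.append(position, 1.0)
    # Simple diffuse (Lambert) lighting term.
    diffuse = max(0.0, float(np.dot(normal, light_direction)))
    return clip_position, diffuse

pos, light = process_vertex(np.array([1.0, 2.0, 3.0]), np.array([0.0, 0.0, 1.0]))
print(pos, light)
```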

Video card

Video card (also known as graphics card, graphics adapter, video adapter; English: video card) - a device that converts an image stored in the computer's memory into a video signal for the monitor.

Usually the video card is an expansion card and is inserted into an expansion slot, either universal (ISA, VLB, PCI, PCI-Express) or specialized (AGP), but it can also be built-in (integrated).

Modern video cards are not limited to simple image output; they have a built-in graphics microprocessor that can perform additional processing, relieving the computer's central processor. For example, all modern NVIDIA and AMD (ATi) video cards support OpenGL applications in hardware.

Video cards conform to the following standards:

On PS/2 computers, most of the video adapter circuitry is located on the system board. Such a video adapter contains all the electronic circuits needed to support the VGA specification on a single full-size board with an 8-bit interface.

The VGA BIOS is a program designed to control the VGA circuitry. Through the BIOS, programs can initiate certain VGA procedures and functions without accessing the adapter directly.

All VGA hardware can display up to 256 shades on the screen from a palette of 262,144 (256K) colors. An analog monitor is used for this.

If problems arise when booting, the system starts in safe mode, where the default is the VGA adapter in 640x480 mode with 16 colors.

SuperVGA (Super Video Graphics Array) provides higher resolution than the VGA standard. It supports operating modes with resolutions of 800x600, 1024x768, 1280x1024 pixels (or more), with 2^4, 2^8, 2^16 or 2^32 colors displayed simultaneously.

SVGA adapters of various models from different manufacturers can be accessed through a single software interface, VESA.

The existing VESA standard for SVGA boards provides for the use of almost all common image formats and color coding options, up to a resolution of 1280x1024 pixels with 16,777,216 shades (24-bit color coding).



A modern video card consists of the following parts:

BIOS (Basic Input/Output System). The video adapter BIOS contains the basic commands that provide the interface between the video adapter hardware and software. A BIOS that can be modified using software is called a flash BIOS.

Graphics processing unit (GPU) - performs the calculations for the output image, relieving the central processor of this responsibility, and processes 3D graphics commands. It is the basis of the graphics card; the performance and capabilities of the entire device depend on it. Modern graphics processors are not much inferior in complexity to the computer's central processor, and often surpass it both in the number of transistors and in computing power, thanks to a large number of universal computing units. The architecture of previous-generation GPUs usually includes several processing units, namely a 2D graphics unit and a 3D graphics unit, the latter in turn usually divided into a geometry core (plus a vertex cache) and a rasterization unit (plus a texture cache), etc.

Video controller - responsible for forming the image in video memory; it issues commands to the RAMDAC to generate scan signals for the monitor and processes requests from the central processor. In addition, there is usually an external data bus controller (for example, PCI or AGP), an internal data bus controller and a video memory controller. The width of the internal bus and of the video memory bus is usually greater than that of the external bus (64, 128 or 256 bits versus 16 or 32); many video controllers also have a built-in RAMDAC. Modern graphics adapters (ATI, nVidia) usually have at least two video controllers that operate independently of each other and each simultaneously drive one or more displays.

Video memory - acts as a frame buffer in which the image generated and constantly modified by the graphics processor is stored and displayed on the monitor screen (or several monitors). Video memory also stores intermediate image elements that are not visible on the screen, and other data. Video memory comes in several types, differing in access speed and operating frequency. Modern video cards are equipped with DDR, DDR2, GDDR3, GDDR4 or GDDR5 memory. Keep in mind that, in addition to the video memory located on the video card, modern graphics processors usually also use part of the computer's general system memory, direct access to which is organized by the video adapter driver via the AGP or PCIE bus.

Digital-to-analog converter (DAC, RAMDAC - Random Access Memory Digital-to-Analog Converter) - converts the image generated by the video controller into color intensity levels supplied to an analog monitor.

Video ROM - a read-only memory device in which the video BIOS, screen fonts, service tables, etc. are written. The ROM is not used directly by the video controller - only the central processor accesses it. The video BIOS stored in ROM ensures the initialization and operation of the video card before the main operating system is loaded, and also contains system data that can be read and interpreted by the video driver during operation (depending on how responsibilities are divided between the driver and the BIOS). Many modern cards use electrically reprogrammable ROMs (EEPROM, Flash ROM) that allow the video BIOS to be rewritten by the user with a special program.

Cooling system - designed to keep the temperature of the video processor and video memory within acceptable limits.

A video adapter is an electronic board that processes video data (text and graphics) and controls the operation of the display. It contains video memory, input/output registers and a BIOS module. It sends beam-brightness control signals and image scanning signals to the display.

The most common video adapter today is the SVGA adapter (Super Video Graphics Array), which can display 1280x1024 pixels at 256 colors and 1024x768 pixels at 16 million colors.

With the growing number of applications using complex graphics and video, a variety of computer video-signal processing devices are coming into widespread use alongside traditional video adapters:

Fig. 2.12. Graphics accelerator

Graphics accelerators - specialized graphics coprocessors that increase the efficiency of the video system. Their use frees the central processor from a large volume of operations with video data, since the accelerators independently calculate which pixels to display on the screen and what their colors are.

Frame grabbers, which allow a video signal from a VCR, camera, laser disc player, etc. to be displayed on the computer screen, a desired frame to be captured into memory and subsequently saved as a file.

TV tuners - video cards that turn the computer into a TV set. A TV tuner lets you select any desired television channel and display it on the screen in a scalable window, so you can follow a broadcast without interrupting your work.

2.13. Keyboard

A computer keyboard is a device for entering information into a computer and issuing control signals. It contains the standard set of typewriter keys and some additional keys - control and function keys, cursor keys and a small numeric keypad.

All characters typed on the keyboard are immediately displayed on the monitor at the cursor position (the cursor is a glowing symbol on the monitor screen that indicates the position where the next character entered from the keyboard will appear).

The most common keyboard today has the QWERTY key layout (pronounced "kwerty"), named after the keys in the upper left row of the alphanumeric part of the keyboard:

Fig. 2.13. Computer keyboard

This keyboard has 12 function keys located along the top edge. Pressing a function key sends the computer not just one character but a whole set of characters. Function keys can be programmed by the user. For example, in many programs the F1 key is used to get help (hints), and the F10 key is used to exit the program.

Control keys have the following purpose:

The small numeric keypad is used in two modes - entering numbers and controlling the cursor. These modes are switched with the Num Lock key.

The keyboard contains a built-in microcontroller (local control device), which performs the following functions:

    sequentially polls the keys, reading the input signal and generating a binary key scan code;

    controls the keyboard indicator lights;

    performs internal fault diagnostics;

    communicates with the central processor through the keyboard I/O port.

The keyboard has a built-in buffer - a small intermediate memory where entered characters are placed. If the buffer overflows, pressing a key is accompanied by a beep, which means the character was not entered (it was rejected). Keyboard operation is supported by special programs "hardwired" into the BIOS, as well as by the keyboard driver, which provides the ability to enter Russian letters, control the keyboard speed, etc.
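As an illustration of the buffering idea described above (not of the actual BIOS implementation), here is a minimal sketch of a fixed-size keyboard buffer that rejects characters, with a "beep", once it is full; the buffer size is an arbitrary example value.

```python
# Minimal sketch of a fixed-size keyboard buffer: characters queue up until the buffer
# is full, after which further keystrokes are rejected (the "beep" case described above).
class KeyboardBuffer:
    def __init__(self, size=16):       # illustrative size, not the real BIOS value
        self.size = size
        self.chars = []

    def key_pressed(self, char):
        if len(self.chars) >= self.size:
            print("beep: buffer full, character rejected:", char)
            return False
        self.chars.append(char)
        return True

    def read_char(self):
        return self.chars.pop(0) if self.chars else None

kb = KeyboardBuffer(size=4)
for ch in "hello":                     # the fifth character does not fit
    kb.key_pressed(ch)
print("".join(iter(kb.read_char, None)))   # -> "hell"
```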

The signals supplied to the monitor come from a video adapter built into the system or installed in the computer.

There are three ways to connect computer systems to a CRT or LCD monitor:

Separate video cards. This method, which requires an AGP or PCI expansion slot, provides the highest level of performance and maximum flexibility in choosing the amount of memory and the required capabilities (Figure 17);

A graphics chipset built into the motherboard. The lowest-cost graphics configuration, with fairly low performance, especially for 3D games or graphics applications. Resolution and color rendering capabilities are lower than with separate video adapters, and the amount of memory is almost impossible to change;

Figure 15 – External view of a video adapter

The following components are required for the video adapter to work:

BIOS (Basic Input/Output System - basic input/output system);

The video adapter BIOS, like the system BIOS, is stored in a ROM chip; it contains the basic commands that provide the interface between the video adapter hardware and the software. The program that accesses the video adapter's BIOS functions may be a standalone application, the operating system or the system BIOS. Accessing BIOS functions makes it possible to display information on the monitor during POST and to begin booting the system before any other driver software is loaded from disk. The BIOS of a standalone video adapter does not depend on the motherboard BIOS. When a video adapter built into the system logic set is used, the motherboard and the video adapter share a common BIOS.

Graphics processor - a video accelerator chip with a limited set of functions. This architecture, used in many video adapters on the modern computer market, assumes that the electronic circuits of the video adapter solve algorithmically simple but time-consuming tasks. In particular, the video adapter's circuits construct graphic primitives - straight lines, circles, etc. - while the computer's central processor is left to compose the image, decompose it into components and send instructions to the video adapter, for example: draw a rectangle of a certain size and color.
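As an illustration of that division of labour (purely conceptual, not any real adapter's command set), here is a minimal sketch in which the "CPU side" issues only a single rectangle command and the "adapter side" does the pixel-by-pixel filling:

```python
# Conceptual sketch: the CPU issues a high-level primitive, the adapter rasterizes it pixel by pixel.
WIDTH, HEIGHT = 16, 8
framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]   # toy frame buffer, one value per pixel

def draw_rectangle(x, y, w, h, color):
    """What the 'adapter' does in response to a single 'draw rectangle' command."""
    for row in range(y, y + h):
        for col in range(x, x + w):
            framebuffer[row][col] = color

draw_rectangle(2, 1, 5, 3, color=7)                   # the 'CPU side': one short instruction
print(sum(cell == 7 for row in framebuffer for cell in row))   # -> 15 pixels filled
```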

Modern graphics systems also use a three-dimensional (3D) graphics processor, found in almost all video adapters optimized for computer games as well as in most of the most common video cards. The 3D graphics processor, a unit for processing three-dimensional graphics, is located in the accelerator chipset and is used to generate polygon images, create lighting effects and draw halftones.

Video memory. When forming an image, the video adapter accesses memory. The memory capacity on the video adapter (video memory) can vary from 4 to 512 MB and higher. Additional memory does not increase the speed of the video adapter, but it allows you to increase the image resolution and/or the number of reproduced colors. Video adapters built into the system logic use part of the main RAM, the amount of which is strictly limited in the BIOS settings.

The amount of memory required to provide a mode with a given resolution and number of colors is calculated as follows. Each pixel in the image requires a certain amount of memory for its encoding, and the total number of pixels is determined by the given resolution. For example, at a resolution of 1,024x768, the screen displays 786,432 pixels.

If this resolution supported only two colors, then only one bit of memory would be needed for each pixel: a bit value of 0 would define a black point and a value of 1 a white point. By allocating 24 bits of memory to each pixel, more than 16.7 million colors can be displayed, since the number of possible combinations of a 24-bit binary number is 16,777,216 (i.e. 2^24). Multiplying the number of pixels at a given screen resolution by the number of bits required per pixel gives the amount of memory required to generate and store an image in this format. Below is an example of such a calculation:

1,024 × 768 = 786,432 pixels; 786,432 pixels × 24 bits/pixel = 18,874,368 bits = 2,359,296 bytes = 2.25 MB
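The same calculation as a minimal sketch, so it can be repeated for other modes (the mode list is illustrative):

```python
# Frame-buffer size for a given resolution and color depth.
def framebuffer_mb(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / 8 / 2**20    # bytes -> MB

for mode in [(1024, 768, 24), (1600, 1200, 32), (640, 480, 4)]:
    print(mode, f"{framebuffer_mb(*mode):.2f} MB")
# (1024, 768, 24) 2.25 MB
# (1600, 1200, 32) 7.32 MB
# (640, 480, 4) 0.15 MB
```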

The video adapter's digital-to-analog converter (commonly called the RAMDAC) converts computer-generated digital images into analog signals that the monitor can display. The speed of the digital-to-analog converter is measured in MHz; the faster the conversion, the higher the vertical refresh rate that can be achieved. In modern high-performance video adapters, it can reach 300 MHz and higher.

As the speed of the digital-to-analog converter increases, the vertical refresh rate increases, which makes it possible to achieve higher screen resolutions at optimal refresh rates (72-85 Hz or more). As a rule, video adapters with speeds of 300 MHz and higher support resolutions up to 1,920x1,200 at refresh rates of more than 75 Hz. Of course, do not forget to make sure that the required resolution is supported by both the monitor and the video adapter you are using.
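To get a feel for why roughly 300 MHz of RAMDAC speed is associated with 1,920x1,200 at 75 Hz and above, here is a minimal estimate of the required pixel clock; the ~30% blanking overhead used here is a typical rule of thumb, not an exact standard value.

```python
# Rough pixel-clock (RAMDAC speed) estimate for a given display mode.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.30):
    # Visible pixels per second, increased by the blanking intervals the monitor needs.
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

print(f"{pixel_clock_mhz(1920, 1200, 75):.0f} MHz")   # ~225 MHz
print(f"{pixel_clock_mhz(1920, 1200, 85):.0f} MHz")   # ~255 MHz
```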

Connector. Video adapters are usually installed in the AGP connector on the motherboard; PCI graphics adapters are less common - these are mostly older video adapter models.

The video adapter communicates with the monitor via a special VGA or DVI interface (Figure 18).

Figure 16 – DVI and VGA connectors

VGA is an analog signal transmission interface: control signals for the three primary colors are transmitted, and each signal has 64 brightness levels. As a result, the number of possible combinations (colors) reaches 262,144 (64^3). For creating a realistic image with computer graphics, color is often more important than high resolution, since the human eye perceives a picture with more color shades as more believable.

DVI is a digital signal transmission mode: the signal is converted to analog not when it leaves the video adapter but in the monitor itself. This is the advantage of DVI over VGA. A digital signal has only two discrete values, 1 and 0, so every time a one is transmitted digitally you receive exactly a one, regardless of voltage fluctuations or any interference during transmission. In an analog system, transmitting a one may yield not a one but 0.935 or 1.062. Therefore it is not guaranteed that you will see on the screen exactly what the video card generates.

The main characteristics of a video adapter are: memory frequency, processor frequency, and the type of slot and connector for connecting to the monitor.