FPS limit for G-Sync. NVIDIA G-Sync technology. Monitors that support G-Sync: description and reviews.

    Testing methodology

    The ASUS ROG SWIFT PG278Q monitor was tested using our new methodology. We decided to retire the slow and sometimes inaccurate Spyder4 Elite in favor of the faster and more accurate X-Rite i1Display Pro colorimeter, which will now be used together with the latest version of the Argyll CMS software package to measure the display's basic parameters. All operations are carried out in Windows 8. During testing, the screen refresh rate is 60 Hz.

    In accordance with the new methodology, we will measure the following monitor parameters:

    • White brightness at backlight power from 0 to 100% in 10% increments;
    • Black brightness at backlight power from 0 to 100% in 10% increments;
    • Display contrast at backlight power from 0 to 100% in 10% increments;
    • Color gamut;
    • Color temperature;
    • Gamma curves of the three primary RGB colors;
    • Gray gamma curve;
    • Delta E (according to CIEDE2000 standard).

    To calibrate and analyze Delta E, we use a graphical front-end for Argyll CMS - DispcalGUI, in the latest version available at the time of writing. All of the measurements listed above are carried out before and after calibration. During testing we measure the monitor's main presets: default, sRGB (if available) and Adobe RGB (if available). Calibration is carried out in the default preset, except for special cases discussed separately. For wide-gamut monitors we select the hardware sRGB emulation mode if it is available: in that case colors are converted using the monitor's internal LUTs (which can be up to 14 bits per channel) and output to a 10-bit matrix, whereas an attempt to narrow the gamut to the sRGB boundaries using the OS color correction tools would reduce color encoding accuracy. Before testing begins, the monitor warms up for an hour and all of its settings are reset to factory defaults.
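
    For reference, here is a minimal Python sketch (not the lab's actual tooling, and with placeholder numbers) of how the figures reported below are derived from raw colorimeter readings: static contrast as the ratio of white to black luminance at each backlight step, plus the average and maximum Delta E over a set of measured patches.

        # Hypothetical luminance readings (cd/m2) per backlight level, for illustration only
        white_cd_m2 = {0: 55.0, 50: 210.0, 100: 404.0}
        black_cd_m2 = {0: 0.06, 50: 0.21, 100: 0.40}

        def static_contrast(white, black):
            """Contrast ratio (white/black) for every backlight level present in both dicts."""
            return {level: white[level] / black[level] for level in white if level in black}

        # Hypothetical CIEDE2000 values measured on individual test patches
        delta_e_patches = [0.4, 1.1, 2.3, 7.07, 0.9]

        for level, ratio in static_contrast(white_cd_m2, black_cd_m2).items():
            print(f"backlight {level:3d}%: contrast {ratio:.0f}:1")
        print(f"Delta E avg {sum(delta_e_patches) / len(delta_e_patches):.2f}, max {max(delta_e_patches):.2f}")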

    We will also continue our old practice of publishing calibration profiles for the monitors we tested at the end of the article. At the same time, the 3DNews test laboratory warns that such a profile will not be able to 100% correct the shortcomings of your specific monitor. The fact is that all monitors (even within the same model) will certainly differ from each other in their small color rendering errors. It is physically impossible to make two identical matrices - they are too complex. Therefore, any serious monitor calibration requires a colorimeter or spectrophotometer. But a “universal” profile created for a specific instance can generally improve the situation for other devices of the same model, especially in the case of cheap displays with pronounced color rendering defects.

    Viewing angles, backlight uniformity

    The first thing that interested us about the ASUS PG278Q was its viewing angles: the monitor uses a TN matrix, and viewing angles have always been that technology's biggest weakness. Fortunately, things turned out not so bad. IPS matrices of course offer wider viewing angles, but we rarely had to adjust the ASUS PG278Q to eliminate distortions in contrast and color reproduction.

    What the ASUS PG278Q developers could not avoid is backlight bleed. The monitor shows slight bleed in all four corners and along the top edge. With a game running on the display the bleed is hard to notice, but start a movie in a dark room (with the usual black bars at the top and bottom) and the defect immediately becomes apparent.

    Testing without calibration

    The maximum brightness of the ASUS PG278Q measured 404 cd/m² - even more than the manufacturer promises. Such a high value is justified by 3D support: with active shutter glasses, the perceived brightness of the monitor can drop by roughly half. The black field luminance at maximum backlight was 0.40 cd/m², which is also quite good. As a result, static contrast stays around 1000:1 across the entire backlight range - an excellent result, typical of high-quality IPS matrices. MVA-level contrast, however, remains out of reach.

    Our test subject's color gamut is as wide as it needs to be: the sRGB color space is covered by 107.1%, and the white point sits near the D65 reference.

    As far as games are concerned, the ASUS PG278Q's color reproduction is in perfectly good order, but professional photo editing could run into problems: colors look slightly oversaturated because the gamut exceeds sRGB. However, the display under review is designed specifically for games, so this drawback should not be given much weight.

    The color temperature of the ASUS PG278Q held at around 6,000 K during measurements, which is 500 K below the norm. This means that light shades may take on a slight warm tint.

    Only the red gamma curve stayed reasonably close to the reference, while the blue and green curves sagged, though they at least tracked each other. The grayscale gamma, meanwhile, fares almost well: in dark tones it barely deviates from the reference curve, and in light tones it deviates only slightly.

    The average Delta E color accuracy score was 2.08 units, and the maximum was 7.07 units. The results, of course, are not the best, but, firstly, the ASUS PG278Q is still intended for games, and not for photo processing, and secondly, for a TN matrix, the results we obtained are quite satisfactory.

    Testing after calibration

    Usually white brightness drops after calibration, and quite noticeably - by 10% or more even for fairly high-quality panels. In the case of the ASUS PG278Q it dropped by only about 3%, to 391 cd/m². Black field luminance was unaffected by calibration, so static contrast fell slightly, to 970:1.

    Calibration had virtually no effect on the color gamut, but the white point returned to its proper place, although it moved only slightly.

    After calibration, the color temperature rose slightly but did not reach the reference value. The gap between the measured and reference values is now approximately 100-200 K instead of 500 K, which is quite tolerable.

    The position of the three main gamma curves, unfortunately, remained almost unchanged after calibration, while the gray gamma began to look a little better.

    Calibration had the greatest effect on color accuracy: the average Delta E dropped to 0.36, and the maximum to 1.26. These are excellent results for any matrix, and for TN+Film they are simply fantastic.

    G-Sync testing: methodology

    NVIDIA's G-Sync guide suggests settings for testing in several games at which the frame rate fluctuates between 40 and 60 FPS. It is precisely under such conditions, at a 60 Hz refresh rate, that the most "freezes" occur with V-Sync enabled. To start, we will compare three usage scenarios - with V-Sync, without it, and with G-Sync - all at 60 Hz.

    But remember that raising the refresh rate from 60 to 120/144 Hz by itself makes tearing less noticeable without vertical synchronization, and with V-Sync it shortens a stutter from 16.7 to 8.3/6.9 ms, respectively. Is there any real benefit to G-Sync over V-Sync at 144 Hz? We will check this too.

    It is worth noting that, if you believe the description, with G-Sync the very notion of a refresh rate loses its meaning. So it is not entirely correct to say that we compared, for example, V-Sync and G-Sync at 60 Hz: V-Sync really did run at 60 Hz, while with G-Sync the screen refreshes on demand rather than at a fixed interval. Even so, with G-Sync enabled we can still select a screen refresh rate in the driver control panel, and FRAPS shows that in games with G-Sync active the same frame rate ceiling applies as if V-Sync were working. In effect, this setting defines the minimum frame lifetime - and thus the shortest refresh interval. Roughly speaking, it sets the frequency range in which the monitor operates: from 30 Hz up to 60-144 Hz.
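
    To make the described behavior concrete, here is a small Python sketch (our own illustration of the description above, not NVIDIA's implementation): with G-Sync the panel refreshes when a frame is ready, but the refresh interval is clamped between the ceiling set in the driver (60-144 Hz) and the panel's 30 Hz floor.

        def gsync_display_interval_ms(render_time_ms, max_refresh_hz, min_refresh_hz=30.0):
            """How long a frame stays on screen, clamped to the monitor's operating range."""
            shortest = 1000.0 / max_refresh_hz   # minimum frame lifetime, e.g. ~6.9 ms at 144 Hz
            longest = 1000.0 / min_refresh_hz    # ~33.3 ms: below ~30 FPS the panel refreshes anyway
            return min(max(render_time_ms, shortest), longest)

        for render_ms in (5.0, 12.0, 25.0, 40.0):
            print(render_ms, "ms to render ->", round(gsync_display_interval_ms(render_ms, 144.0), 1), "ms on screen")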

    To enable G-Sync, open the NVIDIA Control Panel, find the corresponding item in the menu on the left and check the only checkbox there. The technology is supported by drivers for Windows 7 and 8.

    Next, you need to make sure that G-Sync is also enabled in the 3D Settings section - it can be found in the Vertical Sync submenu.

    That's all: the G-Sync function is enabled for all games running in full screen mode - this function cannot yet work in a window. For testing, we used a bench with a GeForce GTX TITAN Black graphics card.

    Tests were carried out in Assassin's Creed: Black Flag and Counter-Strike: Global Offensive. We tested the new technology in two ways: we simply played, and then we hunted for tearing using a script that smoothly panned the game camera, that is, "moved the mouse" horizontally. The first method let us evaluate how G-Sync feels "in battle", while the second made the difference between V-Sync on/off and G-Sync easier to see.

    G-Sync in Assassin's Creed: Black Flag, 60 Hz

    Without V-Sync and G-Sync at 60 Hz, tearing was clearly visible with almost any camera movement.

    A tear is noticeable in the upper right part of the frame, near the ship's mast

    When V-Sync was turned on, image tearing disappeared, but freezes appeared, which did not benefit the gameplay.

    The doubled mast of the ship in the photo is one of the telltale signs of a freeze

    After enabling G-Sync, tearing and freezes disappeared completely and the game became noticeably smoother. Of course, the periodic drops in frame rate to 35-40 FPS were still perceptible, but thanks to the synchronization of display and video card they did not cause slowdowns as noticeable as with vertical synchronization.

    However, as they say, it is better to see once than to hear a hundred times, so we made a short video showing the new Assassin's Creed with V-Sync on and off, as well as with G-Sync. Of course, video cannot fully convey the "live" impression, if only because it is shot at 30 frames per second. In addition, the camera "sees" the world differently from the human eye, so artifacts that are invisible in person, such as ghosting, may show up in the video. Nevertheless, we tried to make the video as illustrative as possible: at least the presence or absence of tearing is quite noticeable in it.

    Now let's launch Assassin's Creed: Black Flag at minimal settings and see what changes. The frame rate in this mode did not exceed 60 FPS - the selected screen refresh rate. Without vertical sync there was noticeable tearing on the screen, but as soon as V-Sync was turned on, the tearing disappeared and the picture began to look almost the same as with G-Sync.

    With graphics settings at maximum, the frame rate fluctuated around 25-35 FPS. Naturally, the tearing immediately returned without V-Sync, and the freezes returned with it. Even enabling G-Sync could not fix this: at such a low frame rate the GPU itself becomes the source of the slowdowns.

    G-Sync in Assassin's Creed: Black Flag, 144 Hz

    With V-Sync and G-Sync disabled, some tearing could still be found on the screen, but thanks to the 144 Hz refresh rate there was far less of it than before. With V-Sync turned on, the tearing disappeared, but freezes began to occur more often - almost as at a 60 Hz refresh rate.

    Enabling G-Sync, as before, was able to correct the situation, but the greatest improvement in the picture was noticeable only at high frame rates - from 60 FPS and above. But without lowering the settings or adding a second video card of the GeForce GTX Titan Black level, it was not possible to achieve such a high frame rate.

    G-Sync in Counter-Strike: Global Offensive, 60 and 144 Hz

    In online games, gameplay and image quality depend not only on the video card and monitor but also on ping: the higher it is, the longer the game takes to respond. During our tests, ping stayed between 25 and 50 ms, and the frame rate hovered around 200 FPS.

    Image settings used in Counter-Strike: Global Offensive

    Without G-Sync and V-Sync there was tearing in CS, just as in Assassin's Creed. With V-Sync enabled at 60 Hz, it became harder to play: the frame rate dropped to 60 FPS, and the character began to move unevenly because of the large number of freezes.

    With G-Sync enabled, the frame rate remained at 60 frames per second, but there were far fewer freezes. They did not disappear completely, but they stopped spoiling the impression of the game.

    Now let's raise the screen refresh rate and see what changes. With G-Sync and V-Sync disabled at 144 Hz there was much less tearing than at 60 Hz, though it did not disappear completely. With V-Sync turned on, all the tearing disappeared and the freezes became almost unnoticeable: playing in this mode is very comfortable, and movement speed does not suffer. Enabling G-Sync turned the picture into pure eye candy: the gameplay became so smooth that even a 25 ms ping began to have a noticeable effect on it.

    Testing ULMB Mode

    Ultra Low Motion Blur is enabled from the monitor menu, but you must first disable G-Sync and set the screen refresh rate to 85, 100 or 120 Hz. Lower or higher frequencies are not supported.

    The practical benefit of this trick is obvious: text is less blurred when scrolling web pages, and in RTS and other strategy games moving units look more detailed.

    ASUS ROG SWIFT PG278Q in 3D

    The ASUS ROG SWIFT PG278Q is the world's first monitor capable of reproducing stereoscopic images at a resolution of 2560x1440, thanks to its DisplayPort 1.2 interface - in itself no small achievement. Unfortunately, the monitor has no built-in IR transmitter, so we took the transmitter from an NVIDIA 3D Vision kit and the glasses from a 3D Vision 2 kit. This combination worked without problems, and we were able to test stereoscopic 3D properly.

    We found no ghosting or other artifacts that crop up in stereoscopic 3D output. Of course, some objects in games occasionally sat at the wrong depth, but that cannot be blamed on the monitor. On the ASUS PG278Q you can both watch stereo movies and play games in stereo 3D - as long as the video card can keep up.

    Conclusions

    Without wanting to downplay NVIDIA's achievements, it should be noted that in general G-Sync is an innovation that comes down to getting rid of a long-standing and harmful atavism - regular updating of LCD panels that do not need it in the first place. It turned out that to do this, it was enough to make small changes to the DisplayPort protocol, which, with a snap of the fingers, were included in the 1.2a specification and, according to AMD’s promises, will very soon find application in display controllers from many manufacturers.

    For now, however, only a proprietary version of this solution is available in the form of G-Sync, which we had the pleasure of testing in the ASUS ROG SWIFT PG278Q monitor. The irony is that this is exactly the kind of monitor for which the benefits of G-Sync are not very noticeable. The 144Hz screen refresh itself reduces the notorious screen tearing to such an extent that many will be willing to turn a blind eye to this problem. And with vertical sync, we have less pronounced stuttering and input lag compared to 60Hz screens. G-Sync in such a situation can only bring the smoothness of the game to the ideal.

    Still, synchronizing the screen refresh with frame rendering on the GPU is a more elegant and economical solution than constantly refreshing at ultra-high frequencies. And let's not forget that G-Sync is not limited to 120/144 Hz matrices. First of all, 4K monitors come to mind, which are still limited to 60 Hz both by matrix specifications and by video input bandwidth. Then there are IPS monitors, which likewise cannot switch to 120/144 Hz because of the limitations of the technology itself.

    With a 60Hz refresh rate, the effect of G-Sync cannot be overstated. If frame rates consistently exceed 60 FPS, then simple V-sync will eliminate tearing just as well, but only G-Sync can keep frames flowing smoothly when the frame rate drops below the refresh rate. In addition, thanks to G-Sync, the performance range of 30-60 FPS becomes much more playable, which either reduces GPU performance requirements or allows you to set more aggressive quality settings. And again the thought returns to 4K monitors, which require extremely powerful hardware to play on with good graphics.

    It's also commendable that NVIDIA has adopted the ULMB technology that we previously saw on the EIZO Foris FG2421. It's a pity that it can't work simultaneously with G-Sync yet.

    The ASUS ROG SWIFT PG278Q monitor itself is good primarily for its combination of 2560x1440 resolution and a 144 Hz refresh rate. There have been no devices with such parameters on the market before, and it is high time gaming monitors with such low response times and stereoscopic 3D support outgrew the Full HD format. You should not be too picky about the PG278Q using a TN matrix: it is a genuinely good specimen with very high brightness, contrast and excellent color rendition that, after calibration, would be the envy of IPS displays; only the limited viewing angles give the technology away. The design, befitting such a high-quality product, also deserves praise. The ASUS ROG SWIFT PG278Q receives the well-deserved "Editor's Choice" award - it turned out that good.

    The only thing that keeps us from recommending this gaming monitor without hesitation is the price, which is around 30 thousand rubles. Moreover, at the time of writing the ASUS ROG SWIFT PG278Q is still not on sale in Russia, so there is nowhere to see it - or G-Sync - with your own eyes. We hope that ASUS and NVIDIA will address this in the future, for example by showing G-Sync at gaming exhibitions. And the price will probably come down one day...

    You can download the color profile we obtained for this monitor after calibration from our file server.

    The editors thank Grafitek for providing the X-Rite i1Display Pro colorimeter.

    Do you have a G-SYNC capable monitor and an NVIDIA graphics card? Let's look at what G-SYNC is, how to enable it and configure it correctly in order to fully use the potential and capabilities of this technology. Keep in mind that just turning it on isn't everything.

    Every gamer knows what vertical synchronization (V-Sync) is. This function synchronizes frame output in a way that eliminates screen tearing. If you disable vertical synchronization on a regular monitor, input lag (latency) decreases and you will notice that the game responds better to your commands, but the frames are no longer synchronized with the display and screen tearing appears.

    V-Sync eliminates screen tearing, but at the same time causes an increase in the delay of the image output relative to the controls, so that the game becomes less comfortable. Every time you move the mouse, it appears that the movement effect occurs with a slight delay. And here the G-SYNC function comes to the rescue, which eliminates both of these shortcomings.

    What is G-SYNC?

    A rather expensive but effective solution for NVIDIA GeForce video cards is G-SYNC technology, which eliminates screen tearing without adding input lag. To implement it, however, you need a monitor with a G-SYNC module. The module adjusts the screen refresh rate to the number of frames per second, so there is no additional delay and screen tearing is eliminated.

    Many users, having bought such a monitor, simply enable NVIDIA G-SYNC support in the NVIDIA Control Panel and assume that is all they need to do. In theory yes, G-SYNC will work, but if you want to get the most out of the technology, you also need to configure classic V-Sync appropriately and limit the in-game FPS to a value below the monitor's maximum refresh rate. Why? You will learn from the recommendations that follow.

    Enabling G-SYNC in the NVIDIA Control Panel

    Let's start with the simplest, basic step: enabling the G-SYNC module. This can be done from the NVIDIA Control Panel - right-click on your desktop and select NVIDIA Control Panel.

    Then go to Display - Set up G-SYNC. Here you can enable the technology using the "Enable G-SYNC" checkbox. Check it.

    You can then specify whether the technology should work only in full-screen mode, or also in games running in a window or a borderless full-screen window.

    If you select the "Enable G-SYNC for full screen mode" option, the function will work only in games that run in exclusive full-screen mode (this can be changed in each game's settings). Games running in a window or a borderless full-screen window will not use the technology.

    If you want windowed games to use G-SYNC as well, enable the "Enable G-SYNC for windowed and full screen mode" option. With this option selected, the function hooks the currently active window and applies itself to it, letting it use the variable refresh rate. You may need to restart your computer for this option to take effect.

    How do you check that the technology is enabled? Open the Display menu at the top of the window and tick the "G-SYNC Indicator" item. An on-screen indicator will then inform you that G-SYNC is enabled when you launch a game.

    Then go to the Manage 3D Settings tab in the side menu. In the "Global settings" section, find the "Preferred refresh rate" field.

    Set it to "Highest available". Some games impose their own refresh rate, which can prevent G-SYNC from being used to its full extent. With this parameter set, the game's own setting is ignored and the monitor's maximum refresh rate is always available, which on G-SYNC devices is most often 144 Hz.

    In general, this is the basic setup you need to do to enable G-SYNC. But, if you want to fully use the potential of your equipment, you should read the following instructions.

    What should I do with V-SYNC if I have G-SYNC? Leave it on or turn it off?

    This is the most common dilemma of G-SYNC monitor owners. It is generally accepted that this technology completely replaces the classic V-SYNC, which can be completely disabled in the NVIDIA Control Panel or simply ignored.

    First you need to understand the difference between them. The task of both functions is theoretically the same - to overcome the effect of screen tearing. But the method of action is significantly different.

    V-SYNC synchronizes frame output to the monitor's constant refresh rate. In effect it acts as an intermediary, holding each rendered frame and releasing it to the display in step with that fixed rate, thereby preventing image tearing. The downside is input lag: V-SYNC must first "capture and queue" the image, and only then show it on the screen.

    G-SYNC works the other way around: it adjusts not the image but the monitor's refresh rate to match the number of frames displayed on the screen. Everything is done in hardware by the G-SYNC module built into the monitor, so there is no additional display delay, as there is with vertical synchronization. This is its main advantage.

    The catch is that G-SYNC only works well while the FPS stays within the supported refresh rate range, which spans from 30 Hz up to the monitor's maximum (60 Hz or 144 Hz). In other words, the technology works to its full potential as long as the FPS does not fall below 30 and does not exceed 60 or 144 frames per second, depending on the maximum supported refresh rate. The infographic below, created by BlurBusters, illustrates this well.

    What happens if the frame rate leaves this range? G-SYNC cannot adjust the screen refresh to it, so outside the range it stops working. You get exactly the same problems as on a regular monitor without G-SYNC, and classic vertical sync behavior takes over: if it is turned off, screen tearing appears; if it is turned on, you will not see tearing, but input lag (delay) appears.

    Therefore, it is in your best interest to stay within the G-SYNC refresh range, which is a minimum of 30Hz and a maximum of whatever the monitor maxes out (144Hz is most common, but there are 60Hz displays as well). How to do this? Using appropriate vertical synchronization parameters, as well as by limiting the maximum number of FPS.

    So what conclusion follows from this? If the frame rate drops below 30 FPS, vertical sync should still be left enabled. Such cases are rare, but if it does happen, V-SYNC ensures there will be no tearing. If the upper limit is the issue, things are simple: cap the maximum number of frames per second so it never reaches the limit at which V-SYNC would kick in - this keeps G-SYNC running continuously.

    Therefore, if you have a 144 Hz monitor, enable an FPS cap at 142 so as not to get too close to the upper limit; for a 60 Hz monitor, set the limit to 58. Even if the computer could render more frames per second, it will not. V-SYNC then never kicks in, and only G-SYNC remains active.
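
    The rule of thumb above can be expressed as a trivial helper (the two-frame margin is this guide's recommendation, not an NVIDIA requirement; the hypothetical function below simply restates the article's own numbers):

        def suggested_fps_cap(max_refresh_hz, margin=2):
            """FPS limit that keeps G-SYNC active and classic V-SYNC from kicking in."""
            return max_refresh_hz - margin

        print(suggested_fps_cap(144))  # 142
        print(suggested_fps_cap(60))   # 58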

    Enabling Vsync in NVIDIA Settings

    Open the NVIDIA Control Panel and go to the “Manage 3D Settings” tab. In the Global Setting section, find the Vertical Sync option and set the option to “On”.

    Thanks to this, vertical synchronization will always be ready to kick in if the FPS drops below 30 - a situation that G-SYNC on its own cannot handle.

    Limit FPS to less than maximum screen refresh rate

    The best way to limit the frame rate is the RTSS (RivaTuner Statistics Server) program. The ideal solution would of course be a limiter built into the game, but not every game has one.

    Download and run the program, then in the list of games on the left select the Global entry - here you can set a common limit for all applications. On the right, find the "Framerate limit" field and set the limit: 142 FPS for 144 Hz monitors and, correspondingly, 58 FPS for 60 Hz devices.

    With the limit in place, classic vertical synchronization never has a chance to kick in and add latency, and playing becomes much more comfortable.

    What is vertical sync in games? This function ensures the correct display of games on standard LCD monitors with a 60 Hz refresh rate. When it is enabled, the frame rate is capped at 60 FPS and no tearing appears on the screen. Disabling it raises the frame rate, but introduces screen tearing.

    Why do you need vertical synchronization in games?

    Vertical sync is a somewhat controversial topic in gaming. On the one hand, it seems very necessary for a visually comfortable gaming experience, provided that you have a standard LCD monitor.

    Thanks to it, no artifacts appear on the screen during play: the picture is stable and free of tears. The downside is that the frame rate is capped at 60 FPS, so more demanding players may notice input lag, a slight delay when moving the mouse in the game (comparable to artificial smoothing of mouse movement).

    Disabling V-Sync also has its pros and cons. First of all, the frame rate is no longer limited, which removes the input lag just mentioned. This matters in games like Counter-Strike, where reaction and accuracy are important: movement and aiming feel crisp and dynamic, and every mouse movement is tracked precisely. In some cases the frame rate will also be a little higher, since V-Sync, depending on the video card, can slightly reduce performance (the difference is about 3-5 FPS). Unfortunately, the downside is screen tearing: when turning or changing direction in the game, the image visibly tears into two or three horizontal parts.

    Enable or disable V-Sync?

    Is vertical sync necessary? It all depends on our individual preferences and what we want to get. In multiplayer FPS games, it is recommended to disable V-sync to increase aim accuracy. The effect of screen tearing, as a rule, is not so noticeable, and when we get used to it, we won’t even notice it.

    In story-driven games, on the other hand, you can safely enable V-Sync. Pinpoint accuracy matters less here; the environment and visual comfort play the leading role, so it is worth opting for the best picture quality.

    Vertical sync can usually be turned on or off in the game's graphics settings. But if there is no such option, it can be forced on or off in the video card settings - either for all applications or only for selected ones.

    Vertical synchronization on NVIDIA video cards

    On GeForce video cards, the function is located in the Nvidia Control Panel. Right-click on the Windows 10 desktop and then select Nvidia Control Panel.

    In the sidebar, select the Manage 3D Settings tab under 3D Settings. The available settings will be displayed on the right.

    The settings are divided into two tabs: global and per-program. On the first tab you set parameters for all games, for example whether vertical sync is enabled or disabled. On the second tab you can set the same parameters individually for each game.

    Select the global or program tab, then look for the "Vertical sync" option in the list. Next to it is a drop-down field where you can force vertical sync off or on.

    V-Sync on AMD graphics

    With AMD video cards the procedure looks almost the same as with Nvidia. Right-click on your desktop and open the Catalyst Control Center.

    Then open the "Games" tab on the left and select "3D Application Settings". On the right you will see a list of options that can be forced from the AMD Radeon driver settings. On the "System Parameters" tab, the settings apply to all applications.

    If you need to set parameters individually for a particular game, click the "Add" button and point to its EXE file. It will be added to the list as a new tab, and by switching to it you can set parameters for that game only.

    When you have selected the tab with the added application, or the system (general) parameters, find the "Wait for vertical update" option in the list. A selection field will appear where this option can be forced on or off.

    V-Sync on integrated Intel HD Graphics

    If you use an integrated Intel HD Graphics chip, a control panel is available as well. It can be opened by right-clicking on the desktop or with the key combination Ctrl + Alt + F12.

    On the Intel panel, go to the Settings Mode tab - Control Panel - 3D Graphics, and then to User Settings.

    Here we find the Vertical Sync field. It can be forced on by setting it to Enabled, or left at Application Settings. Unfortunately, the Intel HD Graphics options offer no way to force it off - V-Sync can only be forced on. Since it cannot be disabled at the driver level, this can only be done in the settings of the game itself.

    instcomputer.ru

    Windows 10, low FPS, and a floating mouse

    Good day everyone. I recently updated my system to Win10 (the previous one was Win7), and after the update I ran into several problems. The first is relatively low FPS compared to before: after installing Win10 my FPS seems to be locked. To explain - I have an average system, and on the "seven" I had a good 200-300 FPS. On the "ten" my FPS does not rise above 60, either in the menu or in the game itself. I have searched almost the entire Internet and found no solution. The second problem is a slight floating of the mouse, barely perceptible, but it greatly interferes with accurate aiming. P.S. Before installing the "ten" this problem did not exist. My system: GPU: GeForce GTX 660 Ti; CPU: Intel Core i3-3220 3.3 GHz; RAM: 8 GB; hard drive (on which CS:GO is installed): 2 TB; monitor: ASUS VK278, 60 Hz; mouse: Razer DeathAdder 2013; mousepad: Razer Goliathus Speed; keyboard: Razer BlackWidow Ultimate 2013.

    Please share your thoughts on this topic. I will be very happy)


    steamcommunity.com

    Windows 10 update lets you disable Vsync and unlock max fps

    11.05.2016 02:22

    Game projects optimized for the Universal Windows Platform (UWP) with DirectX 12 support can be launched without activating the V-Sync option. The update also adds support for NVIDIA G-SYNC and AMD FreeSync technologies.

    The update will help avoid twitching and delays in the image on the screen, as well as improve the visual quality of the image.

    Microsoft said that Gears of War: Ultimate Edition and Forza Motorsport 6: Apex will receive patches with this option in the very near future.

    Automatic updates will gradually come to all computers with the “tenth” version of Windows.

    There are things that are not just difficult to write about, but very difficult. Which you just need to see once, rather than hear about them a hundred times or read on the Internet. For example, it is impossible to describe some natural wonders, such as the majestic Grand Canyon or the snow-capped Altai Mountains. You can look at beautiful pictures with their images a hundred times and admire videos, but all this cannot replace live impressions.

    The topic of smooth frame output on the monitor using Nvidia G-Sync technology is one of those as well: text descriptions do not make the changes seem all that significant, but after the very first minutes of playing a 3D game on a system with an Nvidia Geforce video card connected to a G-Sync monitor, it becomes clear how big the qualitative leap is. And although more than a year has passed since the technology was announced, it has not lost its relevance, it still has no competitors among solutions that have reached the market, and the corresponding monitors continue to be produced.

    Nvidia has been working on improving the visual experience of Geforce GPU users in modern games by making rendering smoother for quite some time. You can recall the adaptive synchronization technology Adaptive V-Sync, which is a hybrid that combines modes with vertical sync enabled and disabled (V-Sync On and V-Sync Off, respectively). In the case when the GPU provides rendering at a frame rate lower than the monitor refresh rate, synchronization is disabled, and for FPS exceeding the refresh rate, it is enabled.

    Adaptive V-Sync didn't solve all the smoothness issues, but it was still an important step in the right direction. But why was it necessary to create special synchronization modes, and even software and hardware solutions, at all? What's wrong with technologies that have been around for decades? Today we'll look at how Nvidia G-Sync technology helps eliminate all the familiar display artifacts: image tearing, unsmooth frame delivery and increased lag.

    Looking ahead, we can say that G-Sync allows you to get smooth frame delivery with the highest possible performance and comfort, and this is very noticeable when playing on such a monitor - noticeable even to the average home user, while for avid gamers it can mean better reaction times and, with them, better in-game results.

    Today, most PC gamers use monitors with a refresh rate of 60 Hz - typical LCD screens, the most popular now. Accordingly, both when synchronization is turned on (V-Sync On) and when it is turned off, there are always some shortcomings associated with the basic problems of ancient technologies, which we will talk about later: high delays and FPS jerks when V-Sync is turned on and unpleasant tearing images when turned off.

    And since delays and uneven frame rates disrupt and annoy more, players rarely turn synchronization on at all. Even the 120 and 144 Hz monitor models that have appeared on the market cannot eliminate the problems completely: they simply make them less noticeable by updating the screen content twice as often, but the same artifacts remain - lags and the absence of truly comfortable smoothness.

    And since monitors with G-Sync, paired with an appropriate Nvidia Geforce graphics card, can provide not only a high refresh rate but also eliminate all of these shortcomings, buying such a solution can be considered even more important than upgrading to a more powerful GPU. But first let's figure out why anything different from the long-established solutions was needed at all - what is the problem?

    Problems with existing video output methods

    Technologies for displaying images on a screen with a fixed refresh rate have appeared since the times when cathode ray tube (CRT) monitors were used. Most readers should remember them - pot-bellied, just like ancient televisions. These technologies were originally developed to display television images at a fixed frame rate, but in the case of devices for displaying 3D images dynamically calculated on a PC, this solution raises big problems that have not yet been solved.

    Even the most modern LCD monitors have a fixed refresh rate of the image on the screen, although technologically nothing prevents you from changing the image on them at any time, with any frequency (within reasonable limits, of course). But PC gamers since the days of CRT monitors have been forced to put up with a decidedly less-than-ideal solution to the problem of synchronizing the frame rate of 3D rendering with the monitor's refresh rate. Until now there have been very few options for image output - two, and both of them have drawbacks.

    The root of all the problems is that with a fixed refresh rate of the image on the monitor, the video card renders each frame at a different time - this is due to the constantly changing complexity of the scene and the load on the GPU. And the rendering time of each frame is not constant, it changes every frame. It’s no wonder that when trying to display a number of frames on the monitor, synchronization problems arise, because some of them require much more time to render than others. As a result, we get different preparation times for each frame: sometimes 10 ms, sometimes 25 ms, for example. And monitors that existed before the advent of G-Sync could only display frames after a certain period of time - not earlier, not later.

    The matter is further complicated by the variety of software and hardware configurations of gaming PCs, combined with very different loads depending on the game, quality settings, video driver settings and so on. As a result, it is impossible to configure every gaming system so that frame preparation takes a constant, or at least similar, time in all 3D applications and conditions - the way it is possible on game consoles with their single hardware configuration.

    Naturally, unlike consoles with their predictable frame rendering times, PC players are still seriously limited in their ability to achieve a smooth gaming experience without noticeable drops and lags. In an ideal (read - impossible in reality) case, updating the image on the monitor should be carried out strictly after the next frame is calculated and prepared by the graphics processor:

    As you can see, in this hypothetical example the GPU always manages to draw a frame before it has to be sent to the monitor: frame time is always slightly shorter than the interval between display updates, and the GPU rests a little in between. In reality, things are quite different - frame rendering time varies widely. If the GPU does not manage to render a frame in the allotted time, the frame must either be shown later, skipping one display update (vertical synchronization enabled - V-Sync On), or be output in parts with synchronization disabled, in which case the monitor simultaneously shows pieces of several adjacent frames.

    Most users turn off V-Sync to get lower latency and smoother frames on the screen, but this solution introduces visible artifacts in the form of image tearing. And with synchronization enabled, there will be no image tearing, since the frames are displayed exclusively in their entirety, but the delay between the player’s action and the image update on the screen increases, and the frame output rate is very uneven, since the GPU never draws frames in strict accordance with the image update time on the monitor.

    This problem has existed for many years and clearly gets in the way of enjoying the results of 3D rendering, yet for a long time no one bothered to solve it. The solution, in theory, is quite simple: display information on the screen exactly when the GPU finishes working on the next frame. But first let's take a closer look at how the existing image output methods work, and at the solution Nvidia offers in its G-Sync technology.

    Disadvantages of output when synchronization is disabled

    As we have already mentioned, the vast majority of players prefer to keep synchronization turned off (V-Sync Off) in order to get the frames drawn by the GPU to be displayed on the monitor as quickly as possible and with minimal delay between the player’s actions (keystrokes, mouse commands) and their display. For serious players this is necessary for victories, and for ordinary players in this case the sensations will be more pleasant. This is how working with V-Sync disabled looks schematically:

    There are no problems or delays with frame output. But although disabling vertical synchronization solves the lag problem as far as possible, providing minimal latency, artifacts appear in the image - tearing, when the picture on the screen consists of pieces of several adjacent frames drawn by the GPU. The lack of smoothness is also noticeable, because frames arrive at the screen from the GPU unevenly and the tears appear in different places.

    This tearing occurs because the displayed image is assembled from two or more frames rendered on the GPU within a single monitor refresh cycle - several frames when the frame rate exceeds the monitor's refresh rate, and two when it roughly matches it. Look at the diagram above: if the contents of the frame buffer are updated midway between the moments when information is sent to the monitor, the final image will be distorted - part of it belongs to the previous frame and the rest to the one currently being drawn.
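
    A back-of-the-envelope estimate (our own illustration, not FCAT data) of how many adjacent GPU frames get spliced into one displayed image with synchronization off is simply the ratio of the render rate to the refresh rate:

        import math

        def frame_slices_per_refresh(fps, refresh_hz):
            """Rough number of different GPU frames visible in a single refresh with V-Sync off."""
            return max(1, math.ceil(fps / refresh_hz))

        print(frame_slices_per_refresh(200, 60))  # ~4 slices when FPS far exceeds the refresh rate
        print(frame_slices_per_refresh(65, 60))   # 2 slices when FPS roughly matches it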

    With synchronization disabled, frames are transmitted to the monitor with absolutely no regard to the frequency and time of its update, and therefore never coincide with the monitor’s refresh rate. In other words, with V-Sync disabled, monitors without G-Sync support will always experience such image tearing.

    The point is not only that it is unpleasant for the player to see stripes twitching all over the screen, but also that simultaneously showing parts of different frames can mislead the brain, which is especially noticeable with dynamic objects in the frame: the player sees parts of objects shifted relative to one another. You have to put up with this only because disabling V-Sync provides minimal output delay at the moment, though far from ideal dynamic image quality, as you can see in the following examples (click to open the frames in full resolution):

    In the examples above, captured with the FCAT hardware and software suite, you can see that the real image on the screen can be made up of pieces of several adjacent frames - sometimes unevenly, with a narrow strip taken from one frame while its neighbors occupy the remaining (noticeably larger) part of the screen.

    Problems with image tearing are even more visible in motion (if your system and/or browser cannot play MP4/H.264 video at 1920x1080 and 60 FPS, you will have to download the files and view them locally in a media player with the appropriate capabilities):

    As you can see, even in dynamics, unpleasant artifacts in the form of picture breaks are easily noticeable. Let's see how this looks schematically - in a diagram that shows the output method when synchronization is disabled. In this case, frames arrive on the monitor immediately after the GPU finishes rendering them, and the image is displayed on the display even if the output of information from the current frame has not yet been completely completed - the remaining part of the buffer falls on the next screen update. That is why each frame of our example displayed on the monitor consists of two frames drawn on the GPU - with an image break in the place marked in red.

    In this example, the first frame (Draw 1) is drawn by the GPU into the screen buffer faster than the 16.7 ms refresh interval, before the image is transferred to the monitor (Scan 0/1). The GPU then immediately starts working on the next frame (Draw 2), which tears the picture on the monitor, which still contains half of the previous frame.

    As a result, in many cases a clearly visible stripe appears on the image - the boundary between the partial display of adjacent frames. In the future, this process is repeated, since the GPU works on each frame for a different amount of time, and without synchronizing the process, the frames from the GPU and those displayed on the monitor never match.

    Pros and cons of Vsync

    When traditional vertical synchronization is enabled (V-Sync On), the information on the monitor is updated only once the GPU has completely finished its work on a frame, which eliminates tearing because frames are always displayed in their entirety. But since the monitor updates its content only at fixed intervals (depending on the characteristics of the output device), this binding brings other problems.

    Most modern LCD monitors update information at a rate of 60 Hz, that is, 60 times per second - approximately every 16 milliseconds. And with synchronization enabled, the image output time is strictly tied to the monitor’s refresh rate. But as we know, the GPU rendering rate is always variable, and the time it takes to render each frame varies depending on the constantly changing complexity of the 3D scene and quality settings.

    It cannot always be equal to 16.7 ms, but will be either less than this value or more. When synchronization is enabled, the GPU's work on frames again finishes either earlier or later than the screen refresh time. If the frame was rendered faster than this moment, then there are no special problems - the visual information is simply waiting for the monitor to update to display the entire frame on the screen, and the GPU is idle. But if the frame does not have time to render in the allotted time, then it has to wait for the next image update cycle on the monitor, which causes an increase in the delay between the player’s actions and their visual display on the screen. In this case, the image of the previous “old” frame is again displayed on the screen.

    Although all this happens quite quickly, the increase in latency is visually easily noticeable, and not only by professional players. And since the frame rendering time is always variable, turning on the binding to the monitor refresh rate causes jerks when displaying a dynamic image, because the frames are displayed either quickly (equal to the monitor refresh rate), or twice, three or four times slower. Let's look at a schematic example of such work:

    The illustration shows how frames are displayed on the monitor when vertical synchronization is turned on (V-Sync On). The first frame (Draw 1) is rendered by the GPU faster than 16.7 ms, so the GPU does not go to work on drawing the next frame, and does not tear the image, as is the case with V-Sync Off, but waits for the first frame to be completely output to the monitor. And only after that it starts drawing the next frame (Draw 2).

    But work on the second frame (Draw 2) takes longer than 16.7 ms, so once that time expires, the monitor keeps showing the visual information from the previous frame for another 16.7 ms. And even after the GPU finishes the next frame, it is not displayed immediately, because the monitor has a fixed refresh rate. In total, the second frame has to wait 33.3 ms to be output, and all of this time is added to the delay between the player's action and the moment the frame appears on the monitor.
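
    The timing described above can be reproduced with a simplified model (an assumption on our part: double buffering, no render-ahead queue): with V-Sync On at 60 Hz a frame can only appear on one of the 16.7 ms refresh ticks, so a frame that takes longer than 16.7 ms to draw is shown only 33.3 ms after the GPU started it.

        REFRESH_MS = 1000.0 / 60.0  # ~16.7 ms between refreshes at 60 Hz

        def vsync_on_display_times(render_times_ms):
            """Absolute times (ms) at which each frame reaches the screen with V-Sync On."""
            shown_at = []
            t = 0.0                                   # the GPU starts the first frame at t = 0
            for render in render_times_ms:
                ready = t + render                    # the frame is finished here...
                tick = int(ready // REFRESH_MS) + 1   # ...but must wait for the next refresh tick
                display = tick * REFRESH_MS
                shown_at.append(round(display, 1))
                t = display                           # as in the diagram, the GPU waits for the swap
            return shown_at

        print(vsync_on_display_times([10.0, 20.0, 15.0]))  # [16.7, 50.0, 66.7]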

    Added to the problem of lag is a loss of smoothness in the video sequence, noticeable as jerkiness in the 3D animation. The problem is shown very clearly in a short video:

    But even the most powerful graphics processors in demanding modern games cannot always provide a sufficiently high frame rate that exceeds the typical monitor refresh rate of 60 Hz. And, accordingly, they will not provide the opportunity to play comfortably with synchronization turned on and without problems such as picture tearing. Especially when it comes to games such as the online game Battlefield 4, the very demanding Far Cry 4 and Assassin’s Creed Unity in high resolutions and maximum game settings.

    That is, the modern player has little choice - either get a lack of smoothness and increased delays, or be content with imperfect picture quality with broken pieces of frames. Of course, in reality everything doesn’t look so bad, because somehow we played all this time, right? But in times when they are trying to achieve the ideal in both quality and comfort, you want more. Moreover, LCD displays have the fundamental technological ability to output frames when the graphics processor indicates it. The only thing left to do is to connect the GPU and monitor, and such a solution already exists - Nvidia G-Sync technology.

    G-Sync technology - Nvidia's solution to problems

    So, most modern games with synchronization turned off cause picture tearing, and with synchronization turned on - unsmooth frame changes and increased delays. Even with high refresh rates, traditional monitors do not eliminate these problems. It's likely that Nvidia's employees have been so fed up with the choice between two less-than-ideal options for displaying frames in 3D applications for many years that they decided to get rid of the problems by giving players a fundamentally new approach to updating information on the display.

    The difference between G-Sync technology and existing display methods is that the timing and frame rate of the Nvidia variant is determined by the Geforce GPU, and it is dynamically variable rather than fixed as was previously the case. In other words, in this case, the GPU takes full control of the frame output - as soon as it finishes working on the next frame, it is displayed on the monitor, without delays or image tearing.

    Using such a connection between the GPU and specially adapted monitor hardware gives players a better output method - simply ideal, in terms of quality, eliminating all the problems we mentioned above. G-Sync ensures perfectly smooth frame changes on the monitor, without any delays, jerks or artifacts caused by the display of visual information on the screen.

    Naturally, G-Sync doesn't work magically, and to make the technology work on the monitor side requires the addition of special hardware logic in the form of a small board supplied by Nvidia.

    The company is working with monitor manufacturers to include G-Sync cards in their gaming display models. For some models there is even an option for an upgrade by the user himself, but this option is more expensive and does not make sense, because it is easier to immediately buy a G-Sync monitor. For a PC, it is enough to have any of the modern Nvidia Geforce video cards in its configuration, and an installed G-Sync-optimized video driver - any of the latest versions will do.

    When Nvidia G-Sync technology is enabled, after finishing processing the next frame of a 3D scene, the Geforce graphics processor sends a special signal to the G-Sync controller board built into the monitor, and it tells the monitor when to update the image on the screen. This allows you to achieve simply perfect smoothness and responsiveness when playing on a PC - you can verify this by watching a short video (necessarily at 60 frames per second!):

    Let's see what the configuration looks like with G-Sync technology enabled, according to our diagram:

    As you can see, everything is very simple. Enabling G-Sync ties the monitor's refresh to the completion of each frame rendered on the GPU. The GPU fully controls the process: as soon as it finishes rendering a frame, the image is immediately shown on the G-Sync-compatible monitor, so the display refresh rate is no longer fixed but variable - exactly like the GPU's frame rate. This eliminates image tearing (the picture always contains information from a single frame), minimizes stuttering (the monitor never waits longer than it takes the GPU to physically finish the frame) and reduces output lag compared to the method with V-Sync enabled.
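
    For contrast with the V-Sync model sketched earlier, here is the same simplified timeline with G-Sync (again our own approximation, not Nvidia's implementation): the panel refreshes as soon as a frame is ready, limited only by the minimum frame lifetime implied by the chosen maximum refresh rate (144 Hz here).

        MIN_INTERVAL_MS = 1000.0 / 144.0  # ~6.9 ms minimum between refreshes at 144 Hz

        def gsync_display_times(render_times_ms):
            """Absolute times (ms) at which each frame reaches a G-Sync screen."""
            shown_at = []
            t = 0.0
            last_display = 0.0
            for render in render_times_ms:
                ready = t + render
                display = max(ready, last_display + MIN_INTERVAL_MS)  # no fixed tick to wait for
                shown_at.append(round(display, 1))
                last_display = display
                t = display   # the GPU starts the next frame once this one is scanned out
            return shown_at

        print(gsync_display_times([10.0, 20.0, 15.0]))  # [10.0, 30.0, 45.0] vs [16.7, 50.0, 66.7] with V-Sync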

    It must be said that players have clearly been missing a solution like this: the new method of synchronizing the GPU and monitor in Nvidia G-Sync really has a strong effect on the comfort of playing on a PC - that almost perfect smoothness appears which simply was not there before, even in our era of super-powerful video cards. Since the announcement of G-Sync, the old methods have instantly come to look like an anachronism, and upgrading to a G-Sync monitor capable of a variable refresh rate of up to 144 Hz seems a very attractive option for finally getting rid of the problems, lags and artifacts.

    Does G-Sync have any disadvantages? Of course, like any technology. For example, G-Sync has the unpleasant limitation of providing smooth frame output on the screen only from 30 FPS upward. And the refresh rate selected for the monitor in G-Sync mode sets the upper limit on how fast the screen content is refreshed. That is, with the refresh rate set to 60 Hz, maximum smoothness is provided at 30-60 FPS, and at 144 Hz at 30-144 FPS, but not below the lower limit. With a variable frame rate (for example, 20 to 40 FPS) the result will no longer be ideal, although noticeably better than with traditional V-Sync.

    But the main disadvantage of G-Sync is that it is Nvidia's own technology, which competitors do not have access to. Therefore, at the beginning of the outgoing year, AMD announced a similar FreeSync technology - also consisting in dynamically changing the frame rate of the monitor in accordance with the preparation of frames from the GPU. An important difference is that AMD’s development is open and does not require additional hardware solutions in the form of specialized monitors, since FreeSync has transformed into Adaptive-Sync, which has become an optional part of the DisplayPort 1.2a standard from the well-known organization VESA (Video Electronics Standards Association). It turns out that AMD will skillfully use the theme developed by its competitor to its advantage, since without the advent and popularization of G-Sync, they would not have had any FreeSync, as we think.

    Interestingly, Adaptive-Sync technology is also part of the VESA embedded DisplayPort (eDP) standard, and is already used in many display components that use eDP for signal transmission. Another difference from G-Sync is that VESA members can use Adaptive-Sync without having to pay anything. However, it is very likely that Nvidia will also support Adaptive-Sync in the future as part of the DisplayPort 1.2a standard, because such support will not require much effort from them. But the company will not give up G-Sync either, as it considers its own solutions a priority.

    The first monitors with Adaptive-Sync support should appear in the first quarter of 2015; they will need not only DisplayPort 1.2a ports but also explicit Adaptive-Sync support (not every monitor with DisplayPort 1.2a will be able to boast of this). For example, in March 2015 Samsung plans to launch the UD590 (23.6 and 28 inches) and UE850 (23.6, 27 and 31.5 inches) monitor lines with UltraHD resolution and Adaptive-Sync support. AMD claims that monitors supporting this technology will be up to $100 cheaper than comparable G-Sync devices, but a direct comparison is difficult, since all the monitors differ and arrive at different times. Besides, relatively inexpensive G-Sync models are already on the market.

    Visual difference and subjective impressions

    We described the theory above; now it is time to show everything in practice and describe our impressions. We tested Nvidia G-Sync in several 3D applications using an Inno3D iChill GeForce GTX 780 HerculeZ X3 Ultra graphics card and an Asus PG278Q monitor with G-Sync support. Several G-Sync monitor models from different manufacturers are on the market - Asus, Acer, BenQ, AOC and others - and for the Asus VG248QE you can even buy a kit to upgrade it to G-Sync yourself.

    The lowest-end video card that supports G-Sync is the GeForce GTX 650 Ti, with the essential requirement of a DisplayPort connector on board. Other system requirements include Microsoft Windows 7 or newer, a good DisplayPort 1.2 cable, and preferably a quality mouse with high sensitivity and polling rate. G-Sync works with all full-screen 3D applications that use the OpenGL or Direct3D APIs under Windows 7 and 8.1.

    Any recent driver is suitable - G-Sync has been supported by all of the company's drivers for more than a year. If all the required components are in place, you only need to enable G-Sync in the drivers, if this has not already been done, and the technology will work in all full-screen applications - and only in them, which follows from the very principle of the technology.

    To enable G-Sync technology for full-screen applications and get the best experience, you need to enable the 144 Hz refresh rate in the Nvidia Control Panel or operating system desktop settings. Then, you need to make sure that the use of the technology is allowed on the corresponding “G-Sync Setup” page...

    You also need to select the appropriate option for the "Vertical sync" parameter on the "Manage 3D settings" page of the global 3D settings. There you can also disable G-Sync for testing purposes or if any problems arise (looking ahead: we did not encounter any during our testing).

    G-Sync works at all resolutions supported by the monitor, up to UltraHD, but in our case we used the native resolution of 2560x1440 pixels at 144 Hz. For comparisons with the current state of affairs we used a 60 Hz mode with G-Sync disabled, emulating the behavior of the typical non-G-Sync monitors most gamers own - the majority of which are Full HD panels limited to 60 Hz.

    It is worth mentioning that although with G-Sync enabled the screen is refreshed at the ideal moment - whenever the GPU "wants" it - the optimal mode is still rendering at roughly 40-60 FPS. This is the most suitable frame rate range for modern games: not so low that it hits the 30 FPS lower limit, but not requiring reduced settings either. Incidentally, this is the range Nvidia's GeForce Experience program targets when it proposes settings for popular games in the software of the same name included with the drivers.

    In addition to games, we also tried a specialized test application from Nvidia - the Pendulum Demo. It shows a 3D pendulum scene convenient for judging smoothness and quality, lets you simulate different frame rates and select the display mode: V-Sync Off/On or G-Sync. With this test software it is very easy to show the difference between synchronization modes - for example, between V-Sync On and G-Sync:

    The Pendulum Demo application allows you to test different synchronization methods in different conditions, it simulates an exact frame rate of 60 FPS to compare V-Sync and G-Sync in ideal conditions for the outdated synchronization method - in this mode there should simply be no difference between the methods. But the 40–50 FPS mode puts V-Sync On in an awkward position, where delays and unsmooth frame changes are visible to the naked eye, since the frame rendering time exceeds the refresh period at 60 Hz. When G-Sync is turned on, everything becomes perfect.

    As for comparing V-Sync Off with G-Sync, the Nvidia application again makes the difference easy to see: at frame rates between 40 and 60 FPS, image tearing is clearly visible, although there is less lag than with V-Sync On. The motion even looks less smooth than in G-Sync mode, though in theory it should not - perhaps this is how the brain perceives the "torn" frames.

    Well, with G-Sync enabled, any of the modes of the test application (constant frame rate or variable - it doesn’t matter) always ensures the smoothest video possible. And in games, all the problems of the traditional approach to updating information on a monitor with a fixed refresh rate are sometimes even more noticeable - in this case, you can clearly evaluate the difference between all three modes using the example of the game StarCraft II (viewing a previously saved recording):

    If your system and browser can play MP4/H.264 video at 60 FPS, you will clearly see that with synchronization disabled there is obvious picture tearing, while with V-Sync enabled there are jerks and uneven motion. All of this disappears when Nvidia G-Sync is turned on: no image artifacts, no increase in latency, no "ragged" frame rate.

    Of course, G-Sync is not a magic wand, and this technology will not get rid of delays and slowdowns that are not caused by the process of outputting frames to a monitor with a fixed refresh rate. If the game itself has problems with the smoothness of frame output and large jerks in FPS caused by loading textures, data processing on the CPU, suboptimal work with video memory, lack of code optimization, etc., then they will remain in place. Moreover, they will become even more noticeable, since the output of the remaining frames will be perfectly smooth. However, in practice, problems do not occur too often on powerful systems, and G-Sync really improves the perception of dynamic video.

    Since Nvidia's new technology affects the entire output pipeline, it could in theory cause artifacts or uneven frame rates of its own, especially if a game artificially caps FPS at some value. Such cases, if they exist at all, are apparently so rare that we did not notice a single one. What we did note was a clear improvement in gaming comfort: playing on a monitor with G-Sync enabled feels as if the PC suddenly became powerful enough to sustain a constant frame rate of at least 60 FPS without any drops.

    The feeling you get when playing on a G-Sync monitor is very difficult to describe in words. The difference is especially noticeable at 40-60 FPS - a frame rate that is very common in demanding modern games. The difference compared to conventional monitors is simply amazing, and we will try not only to tell it in words and show it in video examples, but also to show frame rate graphs obtained under different display modes.

    In genres such as real-time strategy and similar games - StarCraft II, League of Legends, Dota 2 and so on - the advantages of G-Sync are clearly visible, as the video above shows. These games demand fast reactions and do not tolerate delays or uneven frame rates, and smooth scrolling plays an important role in comfort - something that is badly hampered by picture tearing with V-Sync Off and by lag and stutter with V-Sync On. So G-Sync is ideal for games of this type.

    First-person shooters such as Crysis 3 and Far Cry 4 are even more widespread; they are also very demanding on computing resources, and at high quality settings players often get only about 30-60 FPS - ideal territory for G-Sync, which genuinely improves comfort in such conditions. Traditional vertical synchronization in these games very often forces output down to just 30 FPS, adding lag and stutter.

    The same goes for third-person games like the Batman, Assassin's Creed and Tomb Raider series. These games also use the latest graphics technology and require fairly powerful GPUs to reach high frame rates. At maximum settings with V-Sync disabled they typically run at around 30-90 FPS, which causes unpleasant image tearing. Enabling V-Sync only helps in a few less demanding scenes, while the frame rate jumps in steps between 30 and 60 FPS, causing slowdowns and jerks. Turning on G-Sync solves all of these problems, and it is clearly noticeable in practice.

    Practice test results

    In this section we will look at the impact of G-Sync and V-Sync on frame rates - the performance graphs give a clear idea of how the different technologies behave. We tried several games, but not all of them are convenient for showing the difference between V-Sync and G-Sync: some built-in benchmarks do not allow V-Sync to be forced, other games offer no convenient way to replay the exact same sequence (most modern games, unfortunately), and still others run on our test system either too fast or within too narrow a frame rate range.

    So we settled on Just Cause 2 with maximum settings, as well as a couple of benchmarks: Unigine Heaven and Unigine Valley - also at maximum quality settings. The frame rates in these applications vary quite widely, which is convenient for our purpose of showing what happens to frame output under different conditions.

    Unfortunately, we do not currently have the FCAT hardware/software system at our disposal, so we cannot show graphs of real delivered FPS alongside recorded video in the different modes. Instead, we measured per-second average and instantaneous frame rates with a well-known utility, at 60 and 120 Hz monitor refresh rates using the V-Sync On, V-Sync Off and Adaptive V-Sync methods, and with G-Sync at 144 Hz, to show the clear difference between the new technology and today's 60 Hz monitors with traditional vertical sync.
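
    As an illustration of how such measurements reduce to the graphs below, here is a small Python sketch (our own, not the utility's code) that takes a hypothetical list of per-frame render times and produces both instantaneous FPS and per-second frame counts.

```python
# Hypothetical per-frame render times in milliseconds (e.g. exported from a frametime log)
frame_times_ms = [16.7, 16.8, 33.2, 33.5, 16.6, 50.1, 16.7, 16.9, 33.4, 16.7]

# Instantaneous FPS: the reciprocal of each individual frame time
instant_fps = [1000.0 / ft for ft in frame_times_ms]

# Per-second average: count how many frames fall into each one-second bucket of the run
elapsed = 0.0
buckets = {}
for ft in frame_times_ms:
    second = int(elapsed)
    buckets[second] = buckets.get(second, 0) + 1
    elapsed += ft / 1000.0

print("instantaneous FPS:", [round(f, 1) for f in instant_fps])
print("frames rendered in each whole second:", buckets)
```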

    G-Sync vs V-Sync On

    We will begin by comparing V-Sync On mode with G-Sync technology - the most revealing comparison, since both methods are free of image tearing. First, the Heaven test application at maximum quality settings at 2560x1440 pixels (clicking on a thumbnail opens the graph at full resolution):

    As the graph shows, the frame rate with G-Sync and with synchronization disabled is almost identical, apart from the region above 60 FPS. The FPS with vertical synchronization enabled is noticeably different, because there the frame rate can only equal 60 FPS divided by an integer (1, 2, 3, 4, 5, 6, ...), since the monitor sometimes has to show the same frame for several refresh periods in a row (two, three, four and so on). The possible "steps" of the frame rate with V-Sync On at 60 Hz are therefore 60, 30, 20, 15, 12, 10, ... FPS.
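
    The arithmetic behind these steps is simple enough to sketch. The following illustrative Python fragment (our own simplification) assumes the effective displayed frame rate under V-Sync is the refresh rate divided by the number of whole refresh periods needed to render a frame:

```python
import math

def v_sync_fps(render_time_ms: float, refresh_hz: int) -> float:
    """Effective displayed FPS when each frame must wait for whole refresh periods."""
    period_ms = 1000.0 / refresh_hz
    periods_needed = max(1, math.ceil(render_time_ms / period_ms))
    return refresh_hz / periods_needed

# A GPU rendering at 45 FPS (22.2 ms per frame) is displayed at only 30 FPS on a 60 Hz panel
print(v_sync_fps(22.2, 60))                       # 30.0
# The possible "steps" for 60 Hz and 120 Hz panels
print([round(60 / n, 1) for n in range(1, 7)])    # [60.0, 30.0, 20.0, 15.0, 12.0, 10.0]
print([round(120 / n, 1) for n in range(1, 7)])   # [120.0, 60.0, 40.0, 30.0, 24.0, 20.0]
```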

    This gradation is clearly visible in the red line of the graph: during this test run the frame rate was often 20 or 30 FPS and much more rarely 60 FPS, whereas with G-Sync and with V-Sync Off (No Sync) it usually sat in a wider 35-50 FPS range. Such output rates are impossible with V-Sync enabled, so in those cases the monitor always shows 30 FPS, limiting performance and adding lag to the total output time.

    It should be noted that the graph above does not show the instantaneous frame rate, but average values ​​within a second, and in reality FPS can “jump” much more - almost every frame, which causes unpleasant instability and lags. In order to see this clearly, we present a couple of graphs with instant FPS - more precisely, with graphs of the rendering time of each frame in milliseconds. First example (the lines are slightly shifted relative to each other, only approximate behavior in each mode is shown):

    As you can see, in this example the frame time with G-Sync changes more or less smoothly, while with V-Sync On it changes in steps (there are occasional isolated spikes in render time in both cases - that is normal). With V-Sync enabled, frame render-and-output times can only take values such as 16.7 ms, 33.3 ms or 50 ms, as seen on the graph; in FPS terms this corresponds to 60, 30 and 20 frames per second. Beyond that there is no particular difference between the two lines, and both show peaks. Let's look at another representative period of time:

    Here the frame render time fluctuates noticeably, and with it the FPS when vertical synchronization is enabled. With V-Sync On the frame time jumps abruptly from 16.7 ms (60 FPS) to 33.3 ms (30 FPS) and back, and in practice this is exactly what produces the uncomfortable unevenness and clearly visible jerks in the video sequence. Frame pacing with G-Sync is much smoother, and playing in this mode is noticeably more comfortable.

    Let's look at the FPS graph in the second test application - Unigine Valley:

    In this benchmark we see roughly the same thing as in Heaven. Frame rates in the G-Sync and V-Sync Off modes are almost identical (except for a peak above 60 FPS), while enabling V-Sync produces a clearly stepped FPS line, most often sitting at 30 FPS, sometimes dropping to 20 FPS and rising to 60 FPS - the typical behavior of this method, causing lag, jerks and uneven video.

    In this subsection, we just have to look at a segment from the built-in test of the game Just Cause 2:

    This game perfectly shows the inadequacy of the outdated V-Sync On synchronization method! With the frame rate varying from 40 to 60-70 FPS, the G-Sync and V-Sync Off lines almost coincide, but the frame rate with V-Sync On reaches 60 FPS only in short periods. That is, with the real capabilities of the GPU for playing at 40-55 FPS, the player will be content with only 30 FPS.

    Moreover, in the section of the graph where the red line hovers between 30 and 40 FPS, the image in reality looks clearly uneven: the instantaneous rate flips between 60 and 30 FPS almost every frame, which obviously does not add smoothness or comfort. But perhaps vertical sync copes better at a 120 Hz refresh rate?

    G-Sync vs V-Sync 60/120 Hz

    Let's compare V-Sync On at 60 Hz and at 120 Hz against V-Sync Off (as established earlier, that line is almost identical to G-Sync). At a 120 Hz refresh rate, new values are added to the FPS "steps" we already know: 120, 40, 24, 17 FPS and so on, which can make the graph less stepped. Let's look at the frame rate in the Heaven benchmark:

    The 120 Hz refresh rate noticeably helps V-Sync On achieve better performance and a smoother frame rate: where the 60 Hz graph shows 20 FPS, the 120 Hz mode provides an intermediate value of at least 24 FPS, and 40 FPS instead of 30 FPS is also clearly visible on the graph. But the number of steps does not decrease - if anything there are more of them, so at 120 Hz the frame rate changes by smaller amounts but more often, which also hurts overall smoothness.

    There are fewer changes in the Valley benchmark, as the average frame rate is closest to the 30 FPS level available for both 60 and 120 Hz refresh rates. Sync Off provides smoother frames but with visual artifacts, and V-Sync On modes again show jagged lines. In this subsection we just have to look at the game Just Cause 2.

    And again we clearly see how flawed vertical synchronization is: it simply does not provide smooth frame delivery. Even switching to a 120 Hz refresh rate gives V-Sync On just a few extra FPS "steps"; the back-and-forth jumps from one step to another remain, and they are very unpleasant to watch in animated 3D scenes - take our word for it, or re-watch the sample videos above.

    Impact of output method on average frame rate

    What happens to the average frame rate when all these synchronization modes are enabled, and how does enabling V-Sync and G-Sync affect the average performance? You can roughly estimate the speed loss even from the FPS graphs shown above, but we will also present the average frame rate values ​​we obtained during testing. The first one will be Unigine Heaven again:

    Performance in the Adaptive V-Sync and V-Sync Off modes is almost the same - after all, the speed rarely rises above 60 FPS. It is only logical that enabling V-Sync reduces the average frame rate, since this mode locks FPS to the stepped values. At 60 Hz the drop in average frame rate was more than a quarter, and switching to 120 Hz recovered only half of that loss.

    The most interesting question for us is how much the average frame rate drops in G-Sync mode. For some reason the speed is capped above 60 FPS even though the monitor was set to 144 Hz, so with G-Sync enabled the average was slightly lower than with synchronization disabled. Overall, though, the losses are practically nil, and they certainly cannot be compared with the speed deficit of V-Sync On. Let's move on to the second benchmark - Valley.

    In this case, the drop in average rendering speed in modes with V-Sync enabled decreased, since the frame rate throughout the test was close to 30 FPS - one of the frequency “steps” for V-Sync in both modes: 60 and 120 Hz. Well, for obvious reasons, the losses in the second case were slightly lower.

    When G-Sync was turned on, the average frame rate was again lower than that noted in the disabled synchronization mode, all for the same reason - turning on G-Sync “killed” FPS values ​​above 60. But the difference is small, and Nvidia’s new mode provides noticeably faster speeds than with Vsync enabled. Let's look at the last chart - the average frame rate in the game Just Cause 2:

    In the case of this game, the V-Sync On mode suffered significantly more than in the test applications on the Unigine engine. The average frame rate in this mode at 60 Hz is more than one and a half times lower than when synchronization is disabled altogether! Enabling a 120 Hz refresh rate greatly improves the situation, but still G-Sync allows you to achieve noticeably better performance even in average FPS numbers, not to mention the comfort of the game, which can no longer be assessed by numbers alone - you have to see it with your own eyes.

    So, in this section we established that G-Sync delivers frame rates close to those with synchronization disabled, and that enabling it has almost no impact on performance. V-Sync, by contrast, forces the frame rate to change in steps, with frequent jumps from one step to another, which produces uneven motion in an animated sequence of frames and is detrimental to comfort in 3D games.

    In other words, both our subjective impressions and the test results suggest that Nvidia's G-Sync genuinely changes the visual comfort of 3D games for the better. The new method is free of the graphical artifacts seen with V-Sync disabled - a picture torn between several adjacent frames - and also avoids the uneven frame delivery and increased output latency of V-Sync On.

    Conclusion

    Given how difficult it is to measure the smoothness of video output objectively, we would first like to offer a subjective assessment. We were thoroughly impressed by the gaming experience on an Nvidia GeForce card with the G-Sync-enabled Asus monitor. Even a single "live" demonstration of G-Sync makes a strong impression with the smoothness of frame delivery, and after using the technology for a while it becomes genuinely dreary to go back to a monitor with the old methods of putting an image on the screen.

    G-Sync may well be the biggest change in how visual information reaches the screen in a long time: we finally see something genuinely new in the link between displays and GPUs, something that directly and very noticeably affects the comfort of viewing 3D graphics. Before Nvidia announced G-Sync, we were tied for many years to outdated image output standards rooted in the requirements of the TV and film industries.

    Of course, we would have liked such capabilities even earlier, but now is not a bad moment for them: in many demanding 3D games at maximum settings, today's top video cards deliver exactly the frame rates at which the benefits of G-Sync are greatest. Before Nvidia's technology arrived, the realism achieved in games was simply undermined by far-from-ideal methods of updating the image on the monitor, which caused tearing, extra latency and jerky frame rates. G-Sync removes these problems by matching the on-screen refresh rate to the GPU's rendering speed (albeit with some limitations) - the GPU itself now manages the process.

    We have yet to meet anyone who tried G-Sync in action and remained unimpressed. Reviews from the first lucky few who tested the technology at an Nvidia event last autumn were uniformly enthusiastic, and journalists from the specialist press and game developers (John Carmack, Tim Sweeney and Johan Andersson) endorsed the new output method as well. We now join them: after several days with a G-Sync monitor, we have no desire to go back to devices with long-outdated synchronization methods. If only there were a wider choice of G-Sync monitors, and if only they were not equipped exclusively with TN matrices...

    As for the disadvantages of Nvidia's technology, the main practical one is that it only works at frame rates of 30 FPS and above, which can be considered an annoying drawback: it would be better if the image were displayed immediately after being prepared on the GPU even at 20-25 FPS. The bigger issue, however, is that G-Sync is a proprietary solution not used by the other GPU makers, AMD and Intel. Nvidia can be understood: it spent resources developing and implementing the technology and negotiating support with monitor manufacturers precisely in order to make money on it. Once again the company acted as an engine of technical progress, whatever is said about its greed for profit - and here is a big "secret": profit is the main goal of any commercial company, and Nvidia is no exception.

    And yet the future most likely belongs to more universal open standards similar in essence to G-Sync, such as Adaptive-Sync, an optional feature of DisplayPort 1.2a. Monitors with that support will take a while to appear and spread - probably until the middle of next year - whereas G-Sync monitors from various companies (Asus, Acer, BenQ, AOC and others) have already been on sale for several months, although not exactly cheaply. Nothing prevents Nvidia from supporting Adaptive-Sync in the future, although it has not officially commented on the subject. Let's hope that GeForce owners not only have a working solution today in the form of G-Sync, but will eventually be able to use dynamic refresh rates within a universally accepted standard.

    Among other disadvantages of Nvidia G-Sync technology for users, we note that its support on the monitor side costs the manufacturer a certain amount, which also results in an increase in the retail price relative to standard monitors. However, among G-Sync monitors there are models of different prices, including some that are not too expensive. The main thing is that they are already on sale, and every player can get maximum comfort when playing right now, and so far only when using Nvidia Geforce video cards - the company vouches for this technology.

    G-Sync Technology Overview | A Brief History of Fixed Refresh Rate

    Once upon a time, monitors were bulky and contained cathode ray tubes and electron guns. The guns fire electrons at the screen, making the colored phosphor dots we call pixels glow. They draw each "scan" line from left to right, working from the top of the screen to the bottom. Varying the speed of the electron gun from one full refresh to the next was never very practical, and before 3D games appeared there was no particular need for it. So CRTs and the related analog video standards were designed around a fixed refresh rate.

    LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI and DisplayPort) replaced analog connectors (VGA). But the associations responsible for standardizing video signals (led by VESA) have not moved away from fixed refresh rates. Film and television still rely on an input signal at a constant frame rate. Once again, switching to a variable refresh rate doesn't seem all that necessary.

    Adjustable frame rates and fixed refresh rates are not the same

    Before the advent of modern 3D graphics, fixed refresh rates were not an issue for displays. But it came up when we first encountered powerful GPUs: the rate at which the GPU rendered individual frames (what we call the frame rate, usually expressed in FPS or frames per second) is not constant. It changes over time. In heavy graphic scenes, the card can provide 30 FPS, and when looking at an empty sky - 60 FPS.


    Disabling synchronization causes image tearing

    It turns out that the GPU's variable frame rate and the LCD panel's fixed refresh rate do not play well together. In this configuration we run into a graphical artifact called "tearing": it appears when two or more partial frames end up on screen within the same monitor refresh cycle. They are usually offset from each other, which produces a very unpleasant effect in motion.
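
    To illustrate the mechanics, here is a tiny Python sketch (a simplification of ours, not a measurement tool) that estimates where on the screen a tear line lands: the buffer swap happens at some moment during scan-out, and the fraction of the refresh period already elapsed maps to a scanline.

```python
REFRESH_HZ = 60
PERIOD_MS = 1000.0 / REFRESH_HZ   # one scan-out of the panel takes roughly 16.7 ms
VERTICAL_RES = 1080               # lines drawn per refresh, from top to bottom

def tear_line(swap_time_ms: float) -> int:
    """Approximate scanline at which a buffer swap becomes visible as a tear."""
    phase = (swap_time_ms % PERIOD_MS) / PERIOD_MS   # how far into the scan we are
    return int(phase * VERTICAL_RES)

# A frame that finishes 7.3 ms into the scan produces a tear a bit under halfway down the screen
print(tear_line(7.3))
```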

    The image above shows two well-known artifacts that are common but difficult to capture. Because these are display artifacts, you won't see this in regular game screenshots, but our images show what you actually see while playing. To shoot them, you need a camera with a high-speed shooting mode. Or if you have a card that supports video capture, you can record an uncompressed video stream from the DVI port and clearly see the transition from one frame to the next; This is the method we use for FCAT tests. However, it is best to observe the described effect with your own eyes.

    The tearing effect is visible in both images. The top one was done using a camera, the bottom one was done through the video capture function. The bottom picture is “cut” horizontally and looks displaced. In the top two images, the left photo was taken on a Sharp screen with a 60 Hz refresh rate, the right one was taken on an Asus display with a 120 Hz refresh rate. The tearing on a 120Hz display is less pronounced because the refresh rate is twice as high. However, the effect is visible and appears in the same way as in the left image. This type of artifact is a clear sign that the images were taken with vertical sync (V-sync) turned off.


    Battlefield 4 on GeForce GTX 770 with V-sync disabled

    The second effect, visible in the BioShock: Infinite images, is called ghosting. It is especially noticeable in the lower left of the photo and is caused by slow pixel response: individual pixels cannot change color quickly enough, leaving this kind of trail. A single frame cannot convey how ghosting affects actual gameplay. A panel with an 8 ms gray-to-gray response time, such as the Sharp, will produce a blurry image with any movement on screen, which is why such displays are generally not recommended for first-person shooters.

    V-sync: trading one problem for another

    Vertical sync, or V-sync, is a very old solution to the tearing problem. When it is active, the graphics card tries to match the screen's refresh rate, eliminating tearing completely. The problem is that if your card cannot keep the frame rate above 60 FPS (on a 60 Hz display), the effective frame rate jumps between integer fractions of the refresh rate (60, 30, 20, 15 FPS and so on), which in turn produces noticeable stutter.


    When the frame rate drops below the refresh rate with V-sync active, you will experience stuttering

    Moreover, because V-sync makes the graphics card wait, and sometimes relies on an additional back buffer that is not yet visible, it can introduce extra input lag into the render chain. V-sync is thus both a blessing and a curse, solving some problems at the cost of others. An informal survey of our staff found that gamers tend to leave V-sync off, enabling it only when tearing becomes unbearable.

    Get Creative: Nvidia Unveils G-Sync

    With the launch of the GeForce GTX 680, Nvidia introduced a driver mode called Adaptive V-sync that attempts to mitigate these problems by enabling V-sync when the frame rate is above the monitor's refresh rate and quickly disabling it when performance drops below it. Although the technology did its job well, it was a workaround: it did not eliminate tearing whenever the frame rate fell below the monitor's refresh rate.
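
    The decision rule described in that paragraph is simple enough to express in a few lines. The following is an illustrative sketch of the idea, not Nvidia's driver code:

```python
def adaptive_v_sync(frame_rate: float, refresh_hz: int) -> bool:
    """Return True if V-sync should be enabled under an Adaptive V-sync style policy."""
    # Above the refresh rate: enable V-sync, since tearing would be the only effect of leaving it off.
    # Below the refresh rate: disable V-sync to avoid dropping to the next FPS "step".
    return frame_rate >= refresh_hz

for fps in (45, 60, 75):
    state = "on" if adaptive_v_sync(fps, 60) else "off"
    print(f"{fps} FPS on a 60 Hz panel -> V-sync {state}")
```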

    The G-Sync implementation is much more interesting. Broadly speaking, Nvidia is showing that instead of forcing the graphics card to adapt to a fixed display frequency, we can make new monitors run at a variable one.


    GPU frame rate determines the monitor's refresh rate, removing artifacts associated with enabling and disabling V-sync

    The packet-based data transfer of the DisplayPort connector opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal and replacing the monitor's scaler with a module that understands variable blanking, the LCD panel can run at a variable refresh rate tied to the frame rate the video card is producing (within the monitor's supported refresh range). In practice, Nvidia made creative use of these DisplayPort features and killed two birds with one stone.
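
    As a rough mental model of what that paragraph describes - and only that, since Nvidia has not published the module's logic - the refresh timing can be sketched like this: the panel is refreshed whenever a new frame arrives, but never more often than the panel's maximum rate allows and never less often than its minimum rate requires.

```python
MAX_REFRESH_HZ = 144
MIN_REFRESH_HZ = 30
MIN_INTERVAL = 1.0 / MAX_REFRESH_HZ   # cannot refresh more often than this
MAX_INTERVAL = 1.0 / MIN_REFRESH_HZ   # must refresh at least this often

def next_refresh(time_since_last_refresh: float, new_frame_ready: bool) -> str:
    """Simplified policy for a variable-refresh panel driven by frame arrival."""
    if new_frame_ready and time_since_last_refresh >= MIN_INTERVAL:
        return "refresh now with the new frame"
    if time_since_last_refresh >= MAX_INTERVAL:
        return "re-scan the previous frame to stay above the panel's minimum rate"
    return "keep holding the current image"

print(next_refresh(0.004, True))    # frame ready but too soon after the last scan: hold
print(next_refresh(0.010, True))    # frame ready and enough time has passed: refresh now
print(next_refresh(0.040, False))   # no new frame for too long: re-scan the old one
```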

    Even before the tests begin, we would like to commend the team for its creative approach to solving a real problem affecting PC gaming. This is innovation at its finest. But what does G-Sync deliver in practice? Let's find out.

    Nvidia sent us an engineering sample of the Asus VG248QE monitor with the scaler replaced by a G-Sync module. We already know this display well: it was the subject of "Asus VG248QE Review: 24-Inch 144Hz Gaming Monitor for $400", where it earned Tom's Hardware's Smart Buy award. Now it is time to find out how Nvidia's new technology affects the most popular games.

    G-Sync Technology Overview | 3D LightBoost, built-in memory, standards and 4K

    As we reviewed Nvidia's press materials, we asked ourselves many questions, both about the technology's place in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues received some answers.

    G-Sync and 3D LightBoost

    The first thing we noticed was that Nvidia sent an Asus VG248QE monitor modified to support G-Sync. This monitor also supports Nvidia's 3D LightBoost technology, which was originally designed to boost the brightness of 3D displays but has long been used unofficially in 2D mode, pulsing the panel backlight to reduce ghosting (motion blur). Naturally, we wondered whether this technique is used in G-Sync.

    Nvidia gave a negative answer. While using both technologies at the same time would be ideal, today strobing the backlight at a variable refresh rate leads to flicker and brightness issues. Solving them is incredibly difficult because you need to adjust the brightness and track the pulses. As a result, the choice now is between the two technologies, although the company is trying to find a way to use them simultaneously in the future.

    Built-in G-Sync module memory

    As we already know, G-Sync eliminates the extra input lag associated with V-sync, since there is no longer any need to wait for the panel scan to complete. However, we noticed that the G-Sync module has built-in memory. Can the module buffer frames on its own? If so, how long does a frame take to travel through the new path?

    According to Nvidia, frames are not buffered in the module's memory. As data arrives, it is displayed on the screen, and the memory performs other functions. The G-Sync processing time is noticeably less than one millisecond - in fact, we face nearly the same delay with V-sync turned off, and it stems from the game, video driver, mouse and so on.

    Will G-Sync be standardized?

    This question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync. We wanted to ask the developer directly whether Nvidia plans to bring the technology to an industry standard. In theory, the company could propose G-Sync as an extension of the DisplayPort standard providing variable refresh rates - after all, Nvidia is a member of the VESA association.

    However, there are no new specifications planned for DisplayPort, HDMI or DVI. G-Sync already supports DisplayPort 1.2, that is, the standard does not need to be changed.

    As noted, Nvidia is working on making G-Sync compatible with the technology currently called 3D LightBoost (which will soon get a different name). The company is also looking for ways to reduce the cost of G-Sync modules and make them more accessible.

    G-Sync at Ultra HD resolutions

    Nvidia promises G-Sync monitors with resolutions up to 3840x2160 pixels, but the Asus model we are looking at today tops out at 1920x1080. Current Ultra HD monitors use the STMicro Athena controller, which contains two scalers driving a tiled display, so we wondered whether a G-Sync module will support an MST configuration.

    In truth, 4K displays with variable refresh rates will have to wait. There is no single scaler supporting 4K resolution yet; the first should appear in the first quarter of 2014, with monitors using it arriving in the second quarter. Since the G-Sync module replaces the scaler, compatible panels will only begin to appear after that point. Fortunately, the module itself natively supports Ultra HD.

    What happens below 30 Hz?

    G-Sync can vary the screen refresh rate down to 30 Hz. The reason is that at very low refresh rates the image on an LCD begins to degrade, producing visual artifacts. If the source delivers fewer than 30 FPS, the module refreshes the panel automatically to avoid such problems. This means a single image may be scanned out more than once, but the effective lower threshold is 30 Hz, which ensures the best possible image quality.
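
    As a rough illustration of that behavior (our own back-of-the-envelope arithmetic, not a specification): if the game delivers only 20 FPS, each frame lasts 50 ms, while a 30 Hz floor means the panel must be scanned at least every 33.3 ms, so every frame ends up being scanned at least twice.

```python
MIN_PANEL_HZ = 30
MAX_HOLD_MS = 1000.0 / MIN_PANEL_HZ   # the panel must be refreshed at least every ~33.3 ms

def scans_per_frame(source_fps: float) -> int:
    """Minimum number of times a single frame is scanned out below the G-Sync floor."""
    frame_ms = 1000.0 / source_fps
    # One scan when the frame arrives, plus one for every full hold interval it outlives
    return 1 + int(frame_ms // MAX_HOLD_MS)

for fps in (20, 25, 35):
    print(fps, "FPS source ->", scans_per_frame(fps), "scan(s) per frame")
```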

    G-Sync Technology Overview | 60Hz Panels, SLI, Surround and Availability

    Is the technology limited to only high refresh rate panels?

    You will notice that the first G-Sync monitor has a very high refresh rate (well above what the technology requires) and a 1920x1080 resolution. The Asus display also has its own limitations, such as a 6-bit TN panel. We were curious whether G-Sync is planned only for high-refresh-rate displays, or whether we will also see it on more common 60 Hz monitors. On top of that, we would like access to 2560x1440 resolution as soon as possible.

    Nvidia reiterated that the best G-Sync experience is obtained when the video card keeps the frame rate within 30-60 FPS. So the technology can genuinely benefit ordinary 60 Hz monitors fitted with a G-Sync module.

    Why use a 144 Hz monitor then? Many monitor manufacturers seem set on offering a low-motion-blur feature (3D LightBoost) that requires a high refresh rate. But those who forgo that feature (and why not, since it is not yet compatible with G-Sync) could build a G-Sync panel for far less money.

    As for resolutions, things are shaping up like this: QHD screens with refresh rates above 120 Hz may enter production in early 2014.

    Are there problems with SLI and G-Sync?

    What do you need to see G-Sync in Surround mode?

    Now, of course, there is no need to combine two graphics adapters to provide 1080p image output on the screen. Even a mid-range Kepler-based graphics card will be able to provide the level of performance required to comfortably game at this resolution. But there is also no way to run two cards in SLI on three G-Sync-monitors in Surround mode.

    This limitation comes from the display outputs on current Nvidia cards, which typically carry two DVI, one HDMI and one DisplayPort connector. G-Sync requires DisplayPort 1.2, and adapters will not work (nor will an MST hub). The only option is to connect the three Surround monitors to three cards, one card per monitor. We naturally assume Nvidia's partners will start releasing "G-Sync Edition" cards with more DisplayPort connectors.

    G-Sync and triple buffering

    Comfortable gaming with vertical sync used to require triple buffering. Is it needed with G-Sync? The answer is no: not only does G-Sync not require triple buffering, since the pipeline never stalls, but triple buffering actually hurts G-Sync, adding an extra frame of latency with no performance gain. Unfortunately, games often set triple buffering on their own, and it cannot always be overridden manually.

    What about games that typically don't respond well to V-sync being disabled?

    Games like Skyrim, which is part of our test suite, are designed to run with V-sync on a 60 Hz panel (even though this sometimes complicates our lives because of input lag); testing them requires editing certain .ini files. So how does G-Sync behave with games built on the Gamebryo and Creation engines, which are sensitive to vertical sync settings? Are they capped at 60 FPS?

    Second, you need a monitor with an Nvidia G-Sync module, which replaces the screen's scaler - so adding G-Sync to a tiled Ultra HD display, for example, is impossible. In today's review we are using a prototype at 1920x1080 with a refresh rate of up to 144 Hz, but even with it you can get an idea of what impact G-Sync will have if manufacturers start installing it in cheaper 60 Hz panels.

    Third, a DisplayPort 1.2 cable is required; DVI and HDMI are not supported. In the short term this means the only way to run G-Sync on three monitors in Surround mode is to connect them to a triple-SLI setup, since each card has only one DisplayPort connector, and DVI-to-DisplayPort adapters do not work here. The same goes for MST hubs.

    And finally, don’t forget about driver support. The latest package version 331.93 beta is already compatible with G-Sync, and we assume that future versions with WHQL certificate will be equipped with it.

    Test bench

    Test bench configuration
    CPU: Intel Core i7-3970X (Sandy Bridge-E), 3.5 GHz base, overclocked to 4.3 GHz, LGA 2011, 15 MB shared L3 cache, Hyper-Threading on, power-saving features on
    Motherboard: MSI X79A-GD45 Plus (LGA 2011), X79 Express chipset, BIOS 17.5
    RAM: G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28, 1.65 V
    Storage: Samsung 840 Pro SSD, 256 GB, SATA 6Gb/s
    Video cards: Nvidia GeForce GTX 780 Ti 3 GB; Nvidia GeForce GTX 760 2 GB
    Power supply: Corsair AX860i, 860 W

    System software and drivers
    OS: Windows 8 Professional 64-bit
    DirectX: DirectX 11
    Video driver: Nvidia GeForce 331.93 Beta

    Now we need to figure out in what cases G-Sync has the greatest impact. There's a good chance you're already using a 60Hz monitor. 120Hz and 144Hz models are more popular among gamers, but Nvidia rightly assumes that the majority of enthusiasts in the market will still stick to 60Hz.

    With Vsync active on a 60Hz monitor, the most noticeable artifacts appear when the card can't handle 60fps, resulting in annoying jumps between 30 and 60 FPS. There are noticeable slowdowns here. With Vsync turned off, tearing will be most noticeable in scenes where you need to pan the camera a lot or have a lot of movement. Some players find this so distracting that they simply turn on V-sync and suffer the stuttering and input lag.

    With 120 Hz and 144 Hz refresh rates and higher frame rates, the display refreshes more often, which shortens the time a single frame persists across multiple screen scans when performance is insufficient. The problems of both active and inactive V-sync remain, however. That is why we will test the Asus monitor in 60 Hz and 144 Hz modes, with G-Sync turned on and off.

    G-Sync Technology Overview | Testing G-Sync with V-Sync enabled

    It's time to start testing G-Sync. All that remains is to install a video capture card, an array of several SSDs and move on to tests, right?

    No, that's wrong.

    Today we are measuring quality, not performance. In our case, benchmarks can show only one thing: the frame rate at a given moment. They say absolutely nothing about the quality of the experience with G-Sync turned on or off. So you will have to rely on our carefully checked and, we hope, eloquent description, which we will try to bring as close to reality as possible.

    Why not just record a video and give it to the readers to judge? The fact is that the camera records video at a fixed speed of 60 Hz. Your monitor also plays video at a constant 60Hz refresh rate. Because G-Sync introduces variable refresh rate, you won’t see the technology in action.

    Considering the number of games available, the number of possible test combinations is countless. V-sync on, V-sync off, G-Sync on, G-Sync off, 60 Hz, 120 Hz, 144 Hz, ... The list goes on for a long time. But we'll start with a 60Hz refresh rate and active Vsync.

    It is probably easiest to start with Nvidia's own demo utility, which shows a pendulum swinging from side to side. The utility can simulate frame rates of 60, 50 or 40 FPS, or let the rate fluctuate between 40 and 60 FPS, and you can then enable or disable V-sync and G-Sync. Although the test is synthetic, it demonstrates the capabilities of the technology well. You can watch the scene at 50 FPS with vertical synchronization enabled and think: "Everything is quite good, and the visible slowdowns can be tolerated." But after G-Sync is activated you immediately want to say: "What was I thinking? The difference is as obvious as day and night. How could I live with this before?"

    But let's not forget that this is a technical demonstration. I would like evidence based on real games. To do this, you need to run a game with high system requirements, such as Arma III.

    In Arma III we can install a GeForce GTX 770 in the test machine and choose ultra settings. With vertical synchronization disabled, the frame rate fluctuates between 40 and 50 FPS, but with V-sync enabled it drops to 30 FPS. Performance is not high enough to produce constant swings between 30 and 60 FPS; instead, the graphics card's effective frame rate simply decreases.

    Since there was no visible stutter to begin with, the difference when activating G-Sync was not significant, other than the actual frame rate jumping 10-20 FPS higher. Input lag should also be reduced, since the same frame is no longer held across multiple monitor scans. We find Arma generally less twitchy than many other games, so the lag is not noticeable anyway.

    In Metro: Last Light, on the other hand, G-Sync's influence is more pronounced. With a GeForce GTX 770 the game can be run at 1920x1080 with very high detail settings, including 16x AF, normal tessellation and motion blur. From there, SSAA can be stepped from 1x to 2x to 3x to gradually lower the frame rate.

    Additionally, the game's environment includes a hallway that makes it easy to strafe back and forth. Having launched the level with active vertical synchronization at 60 Hz, we went into the city. Fraps showed that with 3x SSAA anti-aliasing the frame rate was 30 FPS, and with anti-aliasing disabled it was 60 FPS. In the first case, slowdowns and delays are noticeable. With SSAA disabled, you will get an absolutely smooth picture at 60 FPS. However, activating 2x SSAA leads to fluctuations from 60 to 30 FPS, making each duplicate frame an inconvenience. This is one game where we would definitely turn off Vsync and just ignore tearing. Many people have already developed a habit anyway.

    G-Sync, however, eliminates all of these negative effects. You no longer have to watch the Fraps counter, waiting for dips below 60 FPS to lower yet another graphics setting; on the contrary, you can raise some of them, because even if the rate slips to 40-50 FPS there is no obvious stutter. And what happens if you disable vertical sync? You will find out later.

    G-Sync Technology Overview | Testing G-Sync with V-Sync disabled

    The conclusions in this material are based on a survey of the authors and friends of Tom's Hardware conducted via Skype (in other words, the sample of respondents is small), but almost all of them understand what vertical synchronization is and what disadvantages users have to put up with because of it. According to them, they only resort to V-sync when tearing, caused by a large mismatch between the frame rate and the monitor's refresh rate, becomes unbearable.

    As you can imagine, the visual impact of having V-sync turned off is hard to mistake, although it depends heavily on the specific game and its detail settings.

    Take Crysis 3 as an example: at the highest graphics settings it can easily bring the graphics subsystem to its knees, and since it is a fast-paced first-person shooter the tearing can be very noticeable. In the example above, the FCAT output was captured between two frames; as you can see, the tree is cut clean in two.

    When we force V-sync off in Skyrim, on the other hand, the tearing is not nearly as bad. Note that in this case the frame rate is very high, so several frames appear on screen within each scan, and the amount of motion per frame is relatively small. Playing Skyrim in this configuration has its problems and may not be optimal, but it shows that even with V-sync off the feel of a game can change.

    For our third example, we've chosen a shot of Lara Croft's shoulder from Tomb Raider, which shows some pretty clear tearing (also look at the hair and the strap of the tank top). Tomb Raider is the only game in our sample that allows you to choose between double and triple buffering when Vsync is activated.

    The last graph shows that Metro: Last Light with G-Sync at 144 Hz generally delivers the same performance as with V-sync disabled - though the absence of tearing cannot be seen on a graph. If the technology is used with a 60 Hz screen, the frame rate is capped at 60 FPS, but there is no stutter or added delay.

    Those of you (and us) who have spent countless hours on graphics tests, watching the same benchmark over and over, learn to judge by eye how good a particular result is - that is, after all, how we gauge the absolute performance of video cards. Changes in the picture with G-Sync active are immediately obvious, because the motion has the same smoothness as with V-sync on, but without the tearing characteristic of V-sync off. It is a pity we cannot show the difference in video right now.

    G-Sync Technology Overview | Game Compatibility: Almost Great

    Checking other games

    We tested a few more games: Crysis 3, Tomb Raider, Skyrim, BioShock: Infinite and Battlefield 4 all visited the test bench. All of them except Skyrim benefited from G-Sync. The strength of the effect varied from game to game, but once you see it, you immediately admit how many shortcomings you had been ignoring before.

    Artifacts can still appear. For example, edge crawl associated with anti-aliasing becomes more noticeable once motion is smooth, so you will probably want to set anti-aliasing as high as possible to remove the ugly jagged edges that were not so conspicuous before.

    Skyrim: Special Case

    The Creation graphics engine that powers Skyrim enables V-sync by default. To test the game at frame rates above 60 FPS, you need to add the line iPresentInterval=0 to one of the game’s .ini files.
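
    For reference, the tweak usually looks like the snippet below. Treat it as an illustration rather than a guaranteed recipe: the exact file can vary between installations, with the [Display] section of Skyrim.ini being the commonly cited location.

```ini
; Skyrim.ini (commonly found under Documents\My Games\Skyrim)
[Display]
iPresentInterval=0   ; 0 disables the engine's built-in V-sync, 1 (the default) keeps it on
```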

    So Skyrim can be tested in three ways: in its original state, letting the Nvidia driver "use the application settings"; with G-Sync enabled in the driver and the Skyrim settings left untouched; and with G-Sync enabled and V-sync disabled in the game's .ini file.

    The first configuration, with the review monitor set to 60 Hz, showed a stable 60 FPS at ultra settings with a GeForce GTX 770, so we got a smooth and pleasant picture. However, user input still suffers from latency, and side-to-side strafing revealed noticeable motion blur. Still, this is how most people play on the PC. You can, of course, buy a 144 Hz screen, which really does eliminate the blur, but since the GeForce GTX 770 delivers roughly 90-100 frames per second, noticeable stutter appears as the engine flips between 144 and 72 FPS.

    At 60 Hz, G-Sync actually hurts the picture; this is probably due to the active vertical synchronization, even though the technology is supposed to work with V-sync disabled. Sideways strafing (especially close to walls) now produces pronounced stutter. This is a potential problem for 60 Hz G-Sync panels, at least in games like Skyrim. Fortunately, on the Asus VG248QE you can switch to 144 Hz mode, and despite the active V-sync, G-Sync then works at that frame rate without any complaints.

    Completely disabling vertical synchronization in Skyrim makes mouse control noticeably "sharper", but it introduces image tearing (not to mention other artifacts, such as flickering water). With G-Sync enabled, stuttering remains at 60 Hz, but at 144 Hz the situation improves significantly. Although our video card reviews test this game with V-sync disabled, we would not recommend playing it that way.

    For Skyrim, perhaps the best solution would be to disable G-Sync and play at 60 Hz, which will give you a constant 60 frames per second on your chosen graphics settings.

    G-Sync Technology Overview | Is G-Sync what you've been waiting for?

    Even before we received a test sample of the Asus monitor with G-Sync, we were encouraged by the fact that Nvidia is working on a very real problem affecting games, one for which no solution had yet been proposed. Until now, you could enable or disable vertical sync to taste, but either decision came with compromises that hurt the gaming experience; whichever way you lean, you are merely choosing the lesser of two evils.

    G-Sync solves the problem by giving the monitor the ability to scan the screen at a variable frequency. Innovation like this is the only way to keep advancing our industry while maintaining the technical advantage of personal computers over gaming consoles and platforms. Nvidia will no doubt face criticism for not developing a standard that competitors could adopt; however, the company does use standard DisplayPort 1.2 for its solution, and as a result, just two months after the technology was announced, G-Sync was in our hands.

    The question is, will Nvidia deliver on everything it promised with G-Sync?

    Three talented developers extolling the virtues of a technology you have never seen in action can inspire anyone. But if your first experience with G-Sync is Nvidia's pendulum demo, you will inevitably wonder whether such a huge difference is really possible, or whether the test represents a special scenario too good to be true.

    Naturally, when testing the technology in real games, the effect is not as clear-cut. On the one hand there were exclamations of "Wow!" and "Crazy!", on the other, "I think I see the difference." The effect of enabling G-Sync is most noticeable when going from a 60 Hz display to a 144 Hz one. But we also tested at 60 Hz with G-Sync to see what you might get from (hopefully) cheaper displays in the future. In some cases, simply moving from 60 to 144 Hz will blow you away on its own, especially if your graphics card can sustain high frame rates.

    Today we know that Asus plans to bring G-Sync support to the Asus VG248QE, which the company says will retail for $400 next year. The monitor has a native resolution of 1920x1080 and a 144 Hz refresh rate; the version without G-Sync has already earned our Smart Buy award for outstanding performance. For us personally, the 6-bit TN panel is a drawback - we would love to see 2560x1440 on an IPS matrix, and would even settle for a 60 Hz refresh rate if it kept the price down.

    Although we expect a raft of announcements at CES, we have heard no official word from Nvidia about other G-Sync displays or their prices. Nor are we sure of the company's plans for the upgrade module, which should let you fit a G-Sync module into an already purchased Asus VG248QE in about 20 minutes.

    For now we can say it is worth the wait. In some games the impact of the new technology is unmistakable, in others less pronounced, but either way G-Sync settles the age-old question of whether or not to enable vertical synchronization.

    There is one more interesting thought: now that we have tested G-Sync, how much longer can AMD avoid commenting? The company teased our readers in its interview (in English), noting that it would soon decide on this capability. What if it has something planned? The end of 2013 and the start of 2014 hold plenty of interesting news to discuss: the Mantle version of Battlefield 4, Nvidia's upcoming Maxwell architecture, G-Sync, AMD's xDMA engine with CrossFire support and rumors of new dual-GPU video cards. For now we are short of video cards with more than 3 GB (Nvidia) and 4 GB (AMD) of GDDR5 memory that cost less than $1000...