Curious: why is triple-buffering not a standard feature?
I now have a love/hate relationship with vsync (as it's currently implemented/supported). It used to be just hate.

I've always turned off vsync for as long as I can remember, due solely to input lag. For me, when I load up a new game, if I see an option for vsync, there is no debate: I just turn it off on sight and have always done so. In most games that don't offer the option but enable it anyway, I just do the usual search for "*gamename* input lag fix". I've done this for ages; it's just the usual routine, along with checking for mods or graphical optimizations. I've never filed a complaint or posted a thread about it, and I wonder how game developers or the folks at Nvidia and AMD would know that people want it fixed if we all just turn it off (assuming we all do).

Recently, discussion about the Oculus Rift and its "need" for vsync (or not, as it turns out) led me to try vsync in Skyrim, in 3D, on my new-to-me Samsung ES7500. The ES7500's pixel response time is very fast, scoring just behind the fast Asus VG278H 3D Vision 2 monitor, giving it great 3D and motion. Vsync in Skyrim was fantastic, and a revelation. Panning around was like panning a large window onto a real world. It was awe-inspiring just to walk along slowly compared to without vsync, especially with a larger field of view, because with a larger field of view the visuals passing you at the edges move faster. Everything about it was a big improvement, except for the input lag. After using vsync for a couple of days in Skyrim, going back to playing without vsync, but with low input lag, was a relief. My mind had sort of forgotten what I was missing, and it was literally a relief to know I didn't have to play with input lag anymore.

Is it complex to implement and maintain support for triple-buffering, or what? If not, I'd like to humbly request that it be implemented as a new standard feature. Is triple-buffering a total solution to input lag, or is that a myth?

Great explanation here on vsync and how triple buffering theoretically eliminates the need for the graphics card to pause while waiting for screen refresh.

http://hardforum.com/showthread.php?t=928593

46" Samsung ES7500 3DTV (checkerboard, high FOV as desktop monitor, highly recommend!) - Metro 2033 3D PNG screens - Metro LL filter realism mod - Flugan's Deus Ex:HR Depth changers - Nvidia tech support online form - Nvidia support: 1-800-797-6530

#1
Posted 04/18/2013 03:00 AM   
Triple buffering doesn't necessarily help with input lag, at least not directly. Rather, it helps maintain a higher frame rate when vsync is enabled.

The NVIDIA driver already supports triple buffering in OpenGL as a standard feature. With Direct3D, on the other hand, buffering must usually be controlled by the application.
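
For what it's worth, here is roughly what that looks like on the application side. This is only a minimal D3D9 sketch (field names are from d3d9.h; device creation and error handling are omitted), since in Direct3D only the game itself can ask for extra back buffers:

    #include <d3d9.h>

    // Fill these in before calling IDirect3D9::CreateDevice. BackBufferCount = 2
    // requests two back buffers, which together with the front buffer gives
    // triple buffering; as noted above, this usually has to come from the game itself.
    void RequestTripleBuffering(D3DPRESENT_PARAMETERS& pp)
    {
        pp.Windowed             = TRUE;
        pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
        pp.BackBufferFormat     = D3DFMT_UNKNOWN;
        pp.BackBufferCount      = 2;                        // 2 back buffers + front buffer
        pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // vsync on
    }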

"This is your code. These are also your bugs. Really. Yes, the API runtime and the
driver have bugs, but this is not one of them. Now go fix it already." -fgiesen

#2
Posted 04/18/2013 03:25 AM   
To deal with input lag, you'll need to adjust the maximum pre-rendered frames setting (which works similarly to triple buffering, but with an adjustable number of back buffers).
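
If anyone is curious, the rough API-side equivalent of that setting for a D3D10/11 game using DXGI is a sketch like this (illustrative only; the driver profile setting can still override what the game asks for):

    #include <d3d11.h>
    #include <dxgi.h>

    // Ask DXGI to let the CPU queue at most one frame ahead of the GPU,
    // which is what "Maximum pre-rendered frames = 1" is aiming at.
    void LimitQueuedFrames(ID3D11Device* device)
    {
        IDXGIDevice1* dxgiDevice = nullptr;
        if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                             reinterpret_cast<void**>(&dxgiDevice))))
        {
            dxgiDevice->SetMaximumFrameLatency(1);
            dxgiDevice->Release();
        }
    }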

#3
Posted 04/18/2013 08:02 AM   
The pre-rendered frames setting actually barely impacts input latency these days.

Back when I was running Halo on a 5900 and an Athlon XP, sure, it helped a lot.

These days, not at all.



In Memory of Chris "ChrisRay" Arthington, 1982-2010

Specs:
CPU:Intel Xeon x5690 @ 4.2Ghz, Mainboard:Asus Rampage III Extreme, Memory:48GB Corsair Vengeance LP 1600
Video:EVGA Geforce GTX 1080 Founders Edition, NVidia Geforce GTX 1060 Founders Edition
Monitor:ROG PG279Q, BenQ BL2211, Sound:Creative XFI Titanium Fatal1ty Pro
SDD:Crucial MX300 275, Crucial MX300 525, Crucial MX200 250
HDD:500GB Spinpoint F3, 1TB WD Black, 2TB WD Red, 1TB WD Black
Case:NZXT Phantom 820, PSU:Seasonic X-850, OS:Windows 7 SP1
Cooler: ThermalRight Silver Arrow IB-E Extreme

WIP:
CPU:Intel Xeon x5660, Mainboard:Asus Rampage II Gene, Memory:16GB Corsair Vengeance 1600 LP
Video:EVGA Geforce GTX 680+ 4GB, Palit Geforce GTX 550ti
Monitor:Pending, Sound:Pending
SDD:Pending
HDD:Pending
Case:NZXT Guardian 921RB, PSU:Corsair 620HX, OS:Windows 7 SP1
Cooler: ThermalRight True Spirit 120M

#4
Posted 04/18/2013 10:21 AM   
At least there's one benefit to being CPU-limited. I can still tell a pretty big difference between settings of 0 and 3 in some games, but typically only when GPU-limited and/or with vsync enabled.

"This is your code. These are also your bugs. Really. Yes, the API runtime and the
driver have bugs, but this is not one of them. Now go fix it already." -fgiesen

#5
Posted 04/18/2013 11:19 AM   
So what is the cause of the input lag when enabling vsync, if it's not vsync itself?

Isn't the back buffer a super simple element of the GPU, just a place to store pixel data? It seems like a small thing to hold off on implementing versus the reward. Again, I was downright shocked at how smooth everything became with it enabled, both while moving around with my avatar and in movement in general, like the menu in Prince of Persia. It seems like a checkbox you'd want marked toward having Nvidia GPUs sell themselves, unless input lag is unavoidable for some reason.

46" Samsung ES7500 3DTV (checkerboard, high FOV as desktop monitor, highly recommend!) - Metro 2033 3D PNG screens - Metro LL filter realism mod - Flugan's Deus Ex:HR Depth changers - Nvidia tech support online form - Nvidia support: 1-800-797-6530

#6
Posted 04/23/2013 02:14 AM   
Vsync does cause input lag, in some applications more than others, depending on how the engine handles things. Some games actually have delayed input response for whatever reason, so just capping the framerate can cause input latency. Adding another 1-2 frames (caused by vsync) just makes the 60fps cap worse.
Input lag on vsync is unavoidable because the only way to prevent tearing is to not display partially rendered frames.
I'd love a tear control that didn't cap framerates at all, though, allowing the GPU to render freely, but displaying only the last completed frame.
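
Just to illustrate that idea, here is my own sketch of the swap logic such a tear control would need (nothing NVIDIA actually ships, just the concept): three buffers, the GPU renders uncapped, and each refresh flips to the newest finished frame while older undisplayed frames simply get dropped.

    #include <mutex>
    #include <utility>

    struct LatestFrameFlip {
        int display = 0;       // buffer currently being scanned out
        int render  = 1;       // buffer the GPU is drawing into
        int ready   = 2;       // newest completed frame since the last flip
        bool hasReady = false;
        std::mutex m;

        // Render loop calls this whenever a frame finishes; it never waits for vsync.
        void frameFinished() {
            std::lock_guard<std::mutex> lock(m);
            std::swap(render, ready);  // finished frame becomes "ready"; the old ready
            hasReady = true;           // frame is reused as the next render target (dropped)
        }

        // Display side calls this once per refresh; this is the only vsync-locked step.
        void vblank() {
            std::lock_guard<std::mutex> lock(m);
            if (hasReady) {
                std::swap(display, ready);  // show the newest complete frame
                hasReady = false;
            }
        }
    };

Worst case, input lag would then be about one refresh interval plus the render time of the newest frame, no matter how far ahead the GPU runs.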

Z97-GD65 Gaming | i5 4690k | 2x8gb Crucial DDR3L | TP650new
GB 1060Mini | ViewSonic VX2268WM | Samsung 713n
Realtek acl-1150 | ATH A700X

#7
Posted 04/23/2013 02:38 AM   
The reason enabling vsync causes input lag is that the effect of the current input is stored in the back buffer, which is not visible until the vertical scan is signaled. By the time the back buffer is swapped/flipped with the front buffer (the visible buffer), it shows an older input event rather than the current one, which is why it feels like a delayed input effect (due to input queuing in the back buffers).
In this case, the more back buffers you have, the more delay you get (a longer queue).

With vsync disabled, on the other hand, the game renders to the back buffer and then swaps/flips it immediately, regardless of whether the monitor's vertical scan is ready, so the current frame shows the current input event (no delayed input effect), but with possible tearing.
In this case, having more than one back buffer isn't useful (the remaining back buffers won't be used) unless your GPU can render more than one frame within the vsync interval (i.e. a 1/60 second interval at a 60Hz refresh rate).
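
For anyone who wants to see where that queuing shows up in code, here is a bare-bones DXGI sketch (illustrative only; D3D9 has the equivalent D3DPRESENT_INTERVAL_ONE / D3DPRESENT_INTERVAL_IMMEDIATE flags):

    #include <dxgi.h>

    // SyncInterval = 1 waits for the next vertical blank before flipping (vsync on),
    // so a just-rendered frame can sit in the queue for up to a full refresh
    // (16.7 ms at 60Hz), plus one more refresh per additional queued back buffer.
    // SyncInterval = 0 flips immediately (vsync off): newest input, possible tearing.
    void PresentFrame(IDXGISwapChain* swapChain, bool vsyncOn)
    {
        swapChain->Present(vsyncOn ? 1 : 0, 0);
    }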

#8
Posted 04/23/2013 06:00 AM   
I'm using vsync everywhere (I just can't stand the tearing). The mouse lag seems to be game-dependent. For example, in Counter-Strike (all of them, from 1.6 to GO) enabling vsync introduces no mouse lag at all. In others, like Skyrim, the mouse lag is insane.

For games with vsync mouse lag I usually am able to get rid of it by doing either one of the below, or both:

Set "Maximum prerendered frames" to 1 in the NVidia 3D settings profile for the game.

And/or:

In NVidia Inspector in the profile for the game, set the frame limiter to the current refresh rate of the monitor (60 in my case.) For even lower mouse lag, you can set it to a value that's lower by 2 (that would be 58 in my case) but this will result in some frame dropping (if you're looking closely enough.)
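
To make that concrete, here is roughly what such a limiter does internally. This is my own sketch of the idea, not NVidia Inspector's actual implementation: pace the render loop to the target rate so frames stop piling up in the vsync queue.

    #include <chrono>
    #include <thread>

    // Call once per frame. targetFps would be 60.0 to match the refresh rate,
    // or 58.0 for slightly lower latency at the cost of occasional dropped frames.
    void LimitFrameRate(double targetFps)
    {
        using clock = std::chrono::steady_clock;
        static auto nextFrame = clock::now();

        nextFrame += std::chrono::duration_cast<clock::duration>(
            std::chrono::duration<double>(1.0 / targetFps));
        std::this_thread::sleep_until(nextFrame);
    }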

Unfortunately, I don't know how SLI affects those settings. I'm only using a single card.

Additionally, for cases where the game itself doesn't do triple buffering *and* can't maintain a constant 60FPS (or whatever your monitor's refresh rate is), I use D3DOverrider to enable TB. This gets rid of mouse lag when the game falls under 60FPS.

Using a 500Hz polling rate for your mouse also helps a little. By default, the USB bus operates the mouse at a polling rate of 125 updates per second (in other words, 125Hz). This means when you move the mouse, the system picks up the change after 8 milliseconds. So you can have up to 8ms of mouse lag regardless of the other optimizations described above. I bought a 1000Hz mouse (that's 1ms), but set it to 500Hz (2ms), since 1000Hz is overkill and can result in somewhat higher CPU usage. 500Hz is the sweet spot.

Your monitor's frame latency also plays a role, of course. But you've said that you have a fast monitor, so that shouldn't be an issue.

In general, any of the above on its own doesn't have a big effect on input lag. But when combined all together, quite a few milliseconds of input lag are cut out, and it does make a big difference in the end. For example, you get rid of 16ms of lag by lowering the pre-rendered frames setting, another 16ms by enforcing TB, and another 6ms by using a 500Hz mouse polling rate. In total that amounts to 38ms, which is quite a big difference.

GPU: EVGA 980Ti FTW
CPU: Core i5 2500K
PSU: Corsair HX650
MB: MSI P67A-C43 (B3)
RAM: 16GB DDR3 1600
Sound: Asus Xonar D1 PCI
Monitor: ViewSonic XG2703-GS
OS: Gentoo Linux AMD64 / Windows 10 x64

#9
Posted 04/23/2013 08:13 AM   
Just because you don't notice doesn't mean it's not there.
It's impossible for vsync not to introduce latency. It's better in some games (source, fear2), and worse in others (ut2k4 wtf), but it is always there.
I can't deal with it even on a 120hz display running 120fps in most games.

Z97-GD65 Gaming | i5 4690k | 2x8gb Crucial DDR3L | TP650new
GB 1060Mini | ViewSonic VX2268WM | Samsung 713n
Realtek acl-1150 | ATH A700X

#10
Posted 04/23/2013 08:28 AM   
MisressMouse said: Just because you don't notice doesn't mean it's not there.
It's impossible for vsync not to introduce latency. It's better in some games (source, fear2), and worse in others (ut2k4 wtf), but it is always there.
I can't deal with it even on a 120hz display running 120fps in most games.

Yes, of course that is true. You are going to have at least one frame of latency due to vsync. It's just a matter of whether I want the upper half of my screen having that input latency while the lower half of it does not, with the tearing line in the middle that separates the two frames. It's just that, personally, I can't live with that effect :-)

GPU: EVGA 980Ti FTW
CPU: Core i5 2500K
PSU: Corsair HX650
MB: MSI P67A-C43 (B3)
RAM: 16GB DDR3 1600
Sound: Asus Xonar D1 PCI
Monitor: ViewSonic XG2703-GS
OS: Gentoo Linux AMD64 / Windows 10 x64

#11
Posted 04/23/2013 08:40 AM   
Libertine said: So what is the cause of the input lag when enabling vsync, if it's not vsync itself?


For the most part input lag is caused by the CPU pre-rendering additional frames ahead of the GPU. Input lag introduced by vsync itself is often minor in comparison, but it ultimately depends on frame rate. Lower FPS means more time between the swapping of back and front buffers.
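
As a rough back-of-the-envelope model of that point (my own simplification, not an official formula): each queued pre-rendered frame and the vsync wait each cost on the order of one frame time.

    // Very rough worst-case estimate of the delay between input and display.
    double ApproxInputLagMs(double fps, int preRenderedFrames, bool vsyncOn)
    {
        double frameTimeMs = 1000.0 / fps;             // 16.7 ms at 60 fps, 33.3 ms at 30 fps
        double lag = preRenderedFrames * frameTimeMs;  // CPU running ahead of the GPU
        if (vsyncOn)
            lag += frameTimeMs;                        // waiting for the next refresh
        return lag;  // e.g. 3 pre-rendered frames + vsync at 60 fps is about 67 ms; 1 + vsync is about 33 ms
    }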

"This is your code. These are also your bugs. Really. Yes, the API runtime and the
driver have bugs, but this is not one of them. Now go fix it already." -fgiesen

#12
Posted 04/23/2013 08:54 AM   
IMO, it's high time for someone to invent a new method of communicating with the display device. Digital panels are not CRTs, but for compatibility reasons they are still driven as if they had electron beams tracing all over them. Now that compatibility isn't such an issue anymore, it would be better to just send frames to the display and let it sort them out on its own. Monitors would then be able to put an end to tearing *and* get rid of input lag.

But, frankly, I don't see that happening any time soon. Sadly, frame latency is not viewed as a big enough issue to justify the costs of changing the current technology.

GPU: EVGA 980Ti FTW
CPU: Core i5 2500K
PSU: Corsair HX650
MB: MSI P67A-C43 (B3)
RAM: 16GB DDR3 1600
Sound: Asus Xonar D1 PCI
Monitor: ViewSonic XG2703-GS
OS: Gentoo Linux AMD64 / Windows 10 x64

#13
Posted 04/23/2013 09:08 AM   
If the display device (i.e. the monitor) could read the entire front buffer and update every pixel on the screen before continuing to the next front buffer, the tearing problem would be solved without V-sync (the video card could continue to swap buffers at its own pace, which also helps to reduce input lag). However, we'd still have the question of how much extra latency this would cause the display device to introduce.

Another problem with this approach is that the display device may be unable to grab the entire contents of the front buffer before the video card alters it (at higher FPS than the refresh rate, for example). So therein lies the notion of syncing once again... at least partially.

If the video card sends off an entire frame to get buffered by the display device, we'll probably end up with even more input lag than we already have today.

"This is your code. These are also your bugs. Really. Yes, the API runtime and the
driver have bugs, but this is not one of them. Now go fix it already." -fgiesen

#14
Posted 04/23/2013 09:27 AM   
Current monitors already buffer frames, so the capability is already there. The best option would be to feed the monitor with frames, and then it's up to the monitor what to do with them. For example, the monitor would display the latest available frame after finishing the current one, choosing not to buffer any previous ones that weren't displayed yet. In that scenario, latency would be determined only by current refresh rate. I imagine gaming-oriented monitors would do things differently than others.

Of course, people with deeper knowledge about displays would come up with way better algorithms than me :-) The point is, the current way of treating digital panels the same as CRT displays doesn't seem like the correct way of doing things.

GPU: EVGA 980Ti FTW
CPU: Core i5 2500K
PSU: Corsair HX650
MB: MSI P67A-C43 (B3)
RAM: 16GB DDR3 1600
Sound: Asus Xonar D1 PCI
Monitor: ViewSonic XG2703-GS
OS: Gentoo Linux AMD64 / Windows 10 x64

#15
Posted 04/23/2013 11:45 AM   