Stutter Stutter or Smooth as Butter

Brands Hatch especially gives me screen stutter, even in practice. I don't think it's just me, and it may be down to the Unreal Engine. I've tried all sorts of different things, so if you're smooth as butter, please post your settings and how you're eliminating it. Thank you.

Acer Predator X34p
i7-6700K
Nvidia 1080 Ti
16 GB RAM a lam.
And not forgetting my brand new Cooler Master 212 Evo
 
borderless window
Ah, cos the vsync stuff doesn't work for that I guess?
Gonna try to film this: 59.7 fps @ 60 Hz vsync vs. 59.99 fps @ 60 Hz vsync, so we can finally see whether there's a full refresh-cycle stutter every so-many seconds or a stutter once every second but of a different length.
Cool! The arithmetic predicts a nice short stutter period (3.3ish seconds) for 59.7 fps, which seems doable :D
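Here's that arithmetic as a tiny sketch, assuming an ideal 60.000 Hz panel and a perfectly steady limiter (real monitors are usually a hair off 60, which shifts the numbers a bit):

```python
# Stutter-period arithmetic: with v-sync on and the cap below the refresh rate,
# the game delivers (hz - cap) fewer frames per second than the monitor shows,
# so one refresh has to be repeated roughly every 1 / (hz - cap) seconds.
# Assumes an ideal 60.000 Hz panel and a perfectly steady limiter.

def stutter_period_s(refresh_hz: float, fps_cap: float) -> float:
    """Approximate seconds between repeated (stuttered) refreshes."""
    return 1.0 / (refresh_hz - fps_cap)

for cap in (59.99, 59.9, 59.7):
    print(f"{cap} fps @ 60 Hz -> one stutter roughly every {stutter_period_s(60.0, cap):.1f} s")
# 59.99 -> every ~100 s, 59.9 -> every ~10 s, 59.7 -> every ~3.3 s
```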
Hah, I literally put it down on paper and then typed it in here
Ah, sorry I meant this:
[attached screenshot]

You're right, except we can't have v-sync this way.
Doh! I figured I was perhaps missing something :roflmao: Oh dear... Was getting mixed up about which was the final buffer... Ahem.
 
[v-sync]

0000ms: display f0 / fb = f1 / f2 begin on bb
0060ms: f2 complete
0100ms: display f1 / fb = f2 / f3 begin on bb
Hmmm, I'm back to being confused about the single vs. double buffering again.
When I stare at the quoted bit above, I can't figure out why at 100 ms, the system couldn't (without any tearing) just display f2 - it has been completed for 40 ms by that time. Help?
 
Ah, cos the vsync stuff doesn't work for that I guess?
The frame limiting to lower than refresh rate with vsync on kinda works, but there's still higher input lag than in true fullscreen. Sadly I'm pretty sensitive to it...

The only real option to fix that would be to run vsync off with double the refresh rate, but depending on game, either my CPU or my GPU can't consistently achieve that.
 
The frame limiting to lower than refresh rate with vsync on kinda works, but there's still higher input lag than in true fullscreen. Sadly I'm pretty sensitive to it...

The only real option to fix that would be to run vsync off with double the refresh rate, but depending on game, either my CPU or my GPU can't consistently achieve that.
You mean doubled fps compared to refresh rate as in 120 fps @ 60 Hz?
Or 120 fps @ 120 Hz?
 
120 fps at 60 Hz, yes. Specifically, running vsync off with scanline sync 2x. That would ensure low input lag, smooth image and no tearing. It's perfect when you can achieve that. Sadly, I can't, not consistently enough.
 
Ah, sorry I meant this:
Ooh, that is MSI Afterburner. Next to the "Show in On-Screen Display" checkbox there's a dropdown set to "text" by default; you can choose "graph" there. I've also changed the "Hardware polling period" (at the top) to 100 ms, although I'm not sure if it makes any difference for this particular plot.

Hmmm, I'm back to being confused about the single vs. double buffering again.
When I stare at the quoted bit above, I can't figure out why at 100 ms, the system couldn't (without any tearing) just display f2 - it has been completed for 40 ms by that time. Help?
I was wondering the same exact thing; this is what I meant in my previous post about whether the front buffer is swapped right after the back buffer is filled. Again, these are my conjectures, as I don't know it for a fact: the back buffer (which is necessary for v-sync) basically has to be copied to / swapped with the front buffer. I think this is an operation with non-negligible cost (i.e. duration), so it can't be sure the copy will have finished by the time the monitor starts refreshing, hence risking tearing. However, I would imagine it has constant complexity, hence a predictable execution time, and I don't understand why it isn't done in a predictive manner like you say. It might well be some technical difficulty. Yet all evidence seems to suggest that the latest frame rendered in the back buffer is indeed not copied immediately to the front buffer; otherwise v-sync would have much lower input lag than reported. And this post I just discovered by RealNC also seems to suggest that, if I'm reading it correctly (para. 2):
Once all possible frame buffers and all pre-render queues have been filled, only then will the game be prevented from queuing more frames to be rendered or displayed.
(if it were to swap immediately at 60ms in our example, then the back buffer would be emptied, and it would start rendering again @60ms, therefore it would never "be prevented from queuing more frames")
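To make the conjecture concrete, here's a toy reconstruction of the quoted 100 ms timeline. It just encodes the assumption that a finished frame waits in the back buffer and only gets promoted at a vblank, after the current front-buffer frame has been scanned out - so it models our guess, not necessarily what the driver actually does:

```python
# Toy model of the double-buffered v-sync timeline quoted above.
# Assumption (our conjecture, not verified driver behaviour): a finished frame
# waits in the back buffer and is only promoted to the front buffer at a vblank,
# after the frame currently in the front buffer has been scanned out.
# Numbers mirror the example: 100 ms refresh interval, 60 ms render time.

def simulate(refresh_ms: int = 100, render_ms: int = 60, vblanks: int = 3) -> None:
    front = 1                    # f1 is already queued in the front buffer at t = 0
    back = None                  # finished frame waiting in the back buffer
    rendering = (2, render_ms)   # f2 starts on the back buffer at t = 0
    next_frame = 3

    print("0000ms: display f0 / fb = f1 / f2 begin on bb")
    for k in range(1, vblanks + 1):
        t = k * refresh_ms
        if rendering and rendering[1] <= t:   # render finished before this vblank
            back, rendering = rendering[0], None
        line = f"{t:04d}ms: display f{front}"
        if back is not None:                  # promote the back buffer, free it up
            front, back = back, None
            line += f" / fb = f{front} / f{next_frame} begin on bb"
            rendering = (next_frame, t + render_ms)
            next_frame += 1
        print(line)

simulate()
# 0100ms: display f1 / fb = f2 / f3 begin on bb
# 0200ms: display f2 / ...   <- f2 finished at 60 ms but only reaches the screen at 200 ms
```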

Very interesting read btw, and it also suggests that the fraction should be in the range 0.007 - 0.015; if it's lower, it might not work at all.

So, now I just noticed that the Blur Busters guide RealNC's post points to also mentions "predictive frame capping", hah, as a feature that should be added to RTSS. So we're not completely out of line after all... :D
 
@Neilski To further answer this question with (finally) some more credible resources that came up after searching the "vblank" signal RealNC refers to:

Vertical Blanking Interval
The pause between sending video data is sometimes used in real time computer graphics to modify the frame buffer, or to provide a time reference for when switching the source buffer for video output can happen without causing a visible tear. [...]

And my reading of this is that the frame in the back buffer sits there waiting for the next refresh cycle to begin (the 40ms wait time we were wondering about):
Vertical synchronization eliminates this by timing frame buffer fills to coincide with the vertical blanking interval, thus ensuring that only whole frames are seen on-screen.

Although they appear to be referring to older :) technologies, I think the same principles should apply.
 
Okay guys I can't film it, I can't record it...
But I did play around with the pendulum demo, different vsync options in Nvidia inspector etc.
Limited the fps to various values via RTSS.

The stutter is always one monitor refresh long, and it becomes more frequent the further below the Hz value I set the limit.

So it's like we all thought:
1 frame stutter whenever the buffer is empty.
The more often the buffer is empty, the lower the input lag will be, because the displayed frame won't be as old.

Now, I think we all agreed that in theory this would mean inconsistent input lag, as it (the input lag) would go higher and lower depending on how much time the frame has spent in the buffer, right?

But this doesn't seem to be the case. The input lag is very consistent in my mouse-wiggling tests.
It does, however, become shorter the further away you go from the Hz value, and it becomes almost the same as without a limiter if you get too close to it.
So with a perfect 60 Hz monitor (like my old monitor was), the input lag at 60.00 fps or 59.99 fps would be almost the same.
However, at 59.97 fps it would be a lot lower!

I'm gonna try to see if I can at least film this... Not sure though, sadly.
 
Alright I shot 3 videos of me moving my mouse in cs:go while also recording the screen.
I recorded at 118.734 fps and slowed the clips down to 40.0% in Sony Vegas.
That's 47.5 fps playback speed.
Since I'm a noob at video editing, my jump-steps are 16.67 ms (59.940 fps NTSC).

Anyway, I jump-stepped forward until my hand was moving, and then kept jumping forward until the game started moving.
FPS limiter at 59.99 fps: 6-7 steps = 6 x 16.67 ms = 100 ms in Vegas. 6 frames in reality translate to 6 x 8.4 ms (118 fps) = 50 ms (yeah, that's logical... 118 fps is kinda 2x 60 fps :roflmao: f*ck me lol).
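If anyone wants to redo the math, here's the conversion as a quick sketch (the variable names and the frame counts are just pulled from the steps above):

```python
# Each frame of the 118.734 fps capture lasts 1 / 118.734 s ~ 8.4 ms of real
# time, no matter how much the clip is slowed down in Vegas afterwards.

CAPTURE_FPS = 118.734

def frames_to_ms(counted_frames: int) -> float:
    """Real time covered by the counted capture frames, in milliseconds."""
    return counted_frames * 1000.0 / CAPTURE_FPS

for frames in (3, 6, 8):
    print(f"{frames} capture frames ~ {frames_to_ms(frames):.0f} ms of real time")
# 3 -> ~25 ms, 6 -> ~51 ms, 8 -> ~67 ms
```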

Anyway so:
59.99 fps = 6 frames, 50 ms input lag
59.90 fps = 3 frames, 25 ms input lag
59.70 fps = also 3 frames, but it felt snappier.

I guess I forgot to try without the limiter on... Let's do it again...
Without limiter it's 8 frames and with limiter to 59.70 fps it's 3 frames.

This isn't really that accurate, but it shows that there's a certain "border" which flips everything from 6-8 frames of input lag down to about 3 frames.

Of course I did this all with pre-rendered frames at 1 (ultra low latency mode, blah blah).

And now that I think about it.. screw it, I'm gonna do it with 3 pre-rendered-frames (default).

EDIT: yep, it's about 10-14 frames now. Can't really see it...
59.70 seems to override the pre-rendered frames to some extent at least; I counted 6 frames now.

Btw, it was all planned a bit differently... About 1.5 years ago I recorded frametimes with vsync on in Assetto Corsa and it looked like this (53.3 fps vs 60 fps in the limiter):
[frametime graph screenshot]


I could also record it with Fraps and got a tick-tock graph like in Afterburner, jumping between 30 and 60 fps.
Sadly I couldn't find a way to get this again... Fraps, Afterburner and a lot of different games all clearly stuttered inconsistently, showing one frame a second time every few frames.
But today all the measurement tools simply said "this is 45 fps". Yeah... my monitor is showing its Hz counter, it's 60 Hz and I've got no tearing... who are you trying to kid?!
 
That's interesting! So if I interpreted your results correctly, the lower the limiter the less input lag but the longer the stutter. And 1 stutter/sec no matter what. Right?

So, today I also said enough with the theories, after I realised my phone camera can record @960fps for 0.4 sec, which is sufficient for the measurement I wanted to do, i.e. measure latency from key-press to display update.

And here is my method: I used only the RTSS limiter and ACC. I mapped Num Lock to downshift and aligned the Num Lock LED of the keyboard next to the gear indicator in the HUD. I recorded slow-motion video of me pressing Num Lock (thankfully that thing can detect motion to trigger the 960 fps recording for 0.4 s), about 8-10 measurements with each setting. I took care to space out the 59.999 and 59.99 measurements, just in case the stutter has a long period, as I thought it might.

The output video of the camera is 30 fps. So I measured the time elapsed (using Avidemux) from the first frame where the Num Lock LED switches on up to the frame with the slightest indication of a gear change (it seemed to fade in while the previous gear faded out). That time I divided by 32 (960/30 = 32).
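In case it helps, that conversion as a quick sketch (the 2.9 s example measurement is just for illustration):

```python
# The camera captures at 960 fps but exports a 30 fps video, so everything in
# the export is slowed down by 960 / 30 = 32x. Real latency = measured time / 32.

CAPTURE_FPS = 960.0
EXPORT_FPS = 30.0
SLOWDOWN = CAPTURE_FPS / EXPORT_FPS   # 32x

def real_latency_ms(measured_ms_in_export: float) -> float:
    """Convert a duration measured in the exported 30 fps video to real time."""
    return measured_ms_in_export / SLOWDOWN

# e.g. 2.9 s measured between LED-on and the first hint of the gear change
print(f"{real_latency_ms(2900.0):.1f} ms")   # ~90.6 ms
```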

[chart: Num Lock latency results]


  • v-sync (with or without 60fps limiter): as expected, worst case scenario.
  • vsync + 59.999: almost no improvement at all (as RealNC suggested), plus quite a significant deviation (10 ms)
  • vsync + 59, 59.9, 59.99: seem to have a similar effect (on input lag at least). The deviation at 59 is 9 ms, almost double that of the other two
  • the rest we knew already; I just did them to validate my approach (to the extent possible).

Out of these I would choose 59.99, but maybe also 59.9. I must say that I still don't notice stutter with these settings in ACC; might just be me. Oh, and for all the v-sync setups my GPU was at about 60% load (forgot to check the 60 fps cap on its own), and with no cap it was upwards of 97%, of course.

In any case I'm a bit disappointed. I didn't expect it to be this much with ultra low latency mode. There's clearly some other delay in my system, or some kind of buffering going on, triple/quadruple, who knows, because even 90 ms is more than 5 x 16.666 ms; would that mean there are 4-5 buffers along the chain?

I'll probably stick to 59.99 and save some 25 ms of input lag, still better than before.
 
So if I interpreted your results correctly, the lower the limiter the less input lag but the longer the stutter. And 1 stutter/sec no matter what. Right?
No, my first post today said this:
Limited the fps to various values via RTSS.

The stutter is always one monitor refresh long, and it becomes more frequent the further below the Hz value I set the limit.
Oh, I see that I forgot to post my other results, oops. Sadly the stuttering wasn't visible in my videos...
But with the limiter very close to my refresh rate I experienced a micro stutter (1 refresh cycle long, 1 frame displayed a second time) about every 10 seconds.
When I went to 59.7 fps the stutter became more frequent. About every 3 seconds.
I simply did this with a stopwatch and the "lap" function, pressing it every time I saw a stutter.
Of course I got a bit hypnotized by the Nvidia pendulum after watching it for 20 minutes with no blinking allowed :roflmao:

Anyway: the stutter is NOT once a second!!!
It's 1 frame every time the buffer is empty!
However, my brain hurts, so no idea which buffer, where, and what is full or empty, but our theory about the stutter changing how often it happens seems correct! :)
In any case I'm a bit disappointed. I didn't expect it to be this much with ultra low latency mode. There's clearly some other delay in my system or some kind of buffering going on, triple/quadriple who knows, cause even 90ms is more than 5 x 16.6666; would that mean there's 4-5 buffers along the chain?
Well, we get very close to the no-sync input lag, so that's great!
About the no-sync input lag:

Keyboard signal output delay -> USB input -> sending the signal across the motherboard -> processing the incoming signal -> sending it through the drivers to the game -> processing the signal in the game -> changing gear -> updating the game -> sending the new image to the CPU -> processing and sending to the GPU -> processing and buffering -> to the monitor -> monitor input buffer -> image processing -> sending to the interface -> refreshing the crystals.

Monitors alone have between 8 and 40 ms of input lag. Add USB etc. and you easily end up at your measured input lags.
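Just to illustrate how quickly it adds up, a rough budget with made-up but plausible numbers - only the 8-40 ms monitor range comes from above, everything else is a guess:

```python
# Rough, illustrative end-to-end latency budget. Only the monitor figure comes
# from the 8-40 ms range mentioned above; every other number is a guess, just
# to show how ~90 ms can accumulate without needing 4-5 extra frame buffers.

budget_ms = {
    "keyboard scan + USB polling": 5,
    "OS / driver / game input processing": 5,
    "game simulation + render (one ~16.7 ms frame)": 17,
    "v-sync wait + scanout (up to two refreshes)": 33,
    "monitor processing (middle of 8-40 ms)": 25,
}

for part, ms in budget_ms.items():
    print(f"{part:<48} {ms:>3} ms")
print(f"{'total':<48} {sum(budget_ms.values()):>3} ms")   # ~85 ms with these guesses
```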

Also, ACC is known for quite high input lag, even with 165 Hz gaming monitors and no sync!
 
Nice work guys!
I haven't had the mental bandwidth today to think about this but will do so at some point.
However, it's really clear from the data that both of you have collected that something deeply funky is going on (in the drivers??) as the cap drops away from the sync rate - the massive difference in lag between 59.999 and 59.99 fps for @gyrtenudre, and similarly the massive difference between 59.99 and 59.90 for @RasmusP. The fact that the two of you have different "inflection" points for the lag change is probably telling us something too - different drivers maybe? Are you both on Nvidia cards?
 
Anyway: the stutter is NOT once a second!!!
It's 1 frame every time the buffer is empty!
However, my brain hurts, so no idea which buffer, where, and what is full or empty, but our theory about the stutter changing how often it happens seems correct!
Good to know, this seems more intuitive!

Monitors alone have between 8 and 40 ms of input lag. Add USB etc. and you easily end up at your measured input lags.
Hmm, you're right, it's a lot of components. Plus, I also want to add that I don't trust this cheapo TV of mine to truly have the 6 ms response time the spec states.

...aaaand! I finally managed to get scanline sync to work, so here's the updated comparison chart:
[updated chart: Num Lock latency results with scanline sync]


If I manage to hide the tearing consistently, I'm clearly going with scanline.

Edit: Forgot to answer this:
Are you both on Nvidia cards?
Yes! Driver version v445.75, should be latest as of this writing.
 
Have been doing a little more reading, and I've now forgotten whether or not this is one of the pages you linked @gyrtenudre, but it has some very interesting info on it:

Essentially, all of this stuff is soooo much more complicated than I had ever imagined, with several nested layers of fun going on. But at least I'm now starting to have a clue about why capping is making a difference... :)
 
Did I reply to this thread already? Not sure...
I set the monitor to 60 Hz, use vsync and an fps cap at 60 fps. If I change any of these, I get stuttering.
End of (my) story (at least).
 
Did I reply to this thread already? Not sure...
I set the monitor to 60 Hz, use vsync and an fps cap at 60 fps. If I change any of these, I get stuttering.
End of (my) story (at least).
Did you do the "denominator" trick to get a more accurate fps limit?
RTSS at 59.97 (entered as 5997 in RTSS) shouldn't really stutter, but should reduce the input lag significantly.
It's important to check your monitor's exact Hz via the vsync tester website, then set the limit 0.03 below that in RTSS.

Do you use pre-rendered frames at 1 or 2, or the default of 3? It's now called ultra low latency or something similar in the Nvidia control panel...
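For reference, a tiny worked example of that calculation. It assumes the RTSS limit denominator is set to 100 (which is what entering 5997 for 59.97 fps implies), and the measured refresh rate is just an example - use whatever the vsync tester reports for your monitor:

```python
# "Denominator" trick: with the RTSS limit denominator set to 100, the value
# you enter is 100x the fps cap you actually want. The measured Hz below is
# only an example reading; substitute your own from the vsync tester site.

measured_hz = 60.0002
target_fps = measured_hz - 0.03          # cap ~0.03 fps below the real refresh rate
rtss_value = round(target_fps * 100)     # value to type into RTSS

print(f"target cap: {target_fps:.4f} fps -> enter {rtss_value} in RTSS")
# target cap: 59.9702 fps -> enter 5997 in RTSS
```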
 