Microstutter is a phenomenon where the time between frames doesn't remain even. It actually doesn't have anything DIRECTLY to do with a camera, but I'll explain what RB meant after I define Microstutter a little better:
At 60FPS, every frame should be 16.666~ms apart. (1000ms / 60 frames ≈ 16.67ms per frame)
So, your frames should look like:
16ms, 16ms, 16ms, 16ms, 16ms, 16ms..
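That arithmetic is easy to sanity-check yourself. A tiny sketch (the helper name is just mine, not anything from MAME):

```python
# Per-frame time budget in milliseconds for a given frame rate.
# Hypothetical helper, just to illustrate the 1000ms / FPS math above.
def frame_budget_ms(fps):
    return 1000.0 / fps

print(frame_budget_ms(60))  # ~16.67ms per frame at 60FPS
print(frame_budget_ms(50))  # 20ms per frame at 50FPS
```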
With microstutter, MAME and other games may still report a healthy rate like 50FPS, but the time between frames varies like this:
8ms, 8ms, 8ms, 8ms, 24ms, 8ms, 8ms..
As you can see, they're not evenly spaced. The average may look fine, but the game won't look fluid at all.
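This is why an FPS counter can't catch microstutter: two traces can deliver the same number of frames in the same wall-clock time, so they average to the same FPS, yet one is badly uneven. A small sketch with made-up interval numbers (the function names and the 40ms spikes are illustrative, not measured data):

```python
# Two hypothetical frame-interval traces, in milliseconds. Both total
# 128ms over 8 frames, so an FPS counter reports the same average rate.
smooth  = [16, 16, 16, 16, 16, 16, 16, 16]
stutter = [8, 8, 8, 40, 8, 8, 8, 40]  # uneven spacing = microstutter

def avg_fps(intervals_ms):
    # Frames delivered per second of wall-clock time.
    return 1000.0 * len(intervals_ms) / sum(intervals_ms)

def max_deviation_ms(intervals_ms):
    # Worst-case departure from the mean frame time; the FPS counter
    # hides this, but your eyes (and a high-speed camera) don't.
    mean = sum(intervals_ms) / len(intervals_ms)
    return max(abs(t - mean) for t in intervals_ms)

print(avg_fps(smooth), avg_fps(stutter))            # identical: 62.5 62.5
print(max_deviation_ms(smooth))                     # 0.0 -> fluid
print(max_deviation_ms(stutter))                    # 24.0 -> stutters
```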
It's a known issue with ATi's drivers. While it hits SLI setups particularly hard, I don't believe it's limited to SLI; it seems able to hit just about any card ATi has put out in the last year or two.
Now, RB was telling you that it's particularly noticeable with a camera; that's because a high-speed camera picks up the differences in frame timing extremely well. It has nothing to do with screen capture, just the nature of recording off an uneven source.
Also, I hate to break this news to you, but most of your findings will only be applicable to you in the end. There are a lot of factors that affect input lag, but the one thing you cannot actually control is video driver issues. I know of at least one issue with nVidia drivers, for instance, that will trigger insane amounts of input lag on a completely inconsistent basis, and with some versions of nVidia's drivers it may even cause multimonitor MAME to refuse to take input at all if VSync is turned on.
These uncontrollable factors make it impossible to come up with a single lag test that's relevant for every combination of OS, video card, and settings.
---
Try checking the MAME manual at http://docs.mamedev.org