Where's my Refresh Rate?
dizzy:
In my experience with DirectX programming, all issues are driver issues. Refresh rate, in particular, is always the fault of the video driver.
KittKitt:
Quote from: J. M. Pescado on 2006 September 08, 01:23:21
I've always firmly believed that to be nonsense, and people only see "flicker" because people like you keep telling them they should be seeing flicker. It's rubbish, since your computer can only put out maybe 20-30 FPS most of the time anyway, so your image would only change about 20-30 times a second. TV only puts out about 20-some frames a second, and no one complains about that.
Don't be crying at me because your old eyes are sub-standard. :P
I don't actually see the flicker unless the refresh rate is ridiculously low, or when a monitor screen is filmed by external camera (in which case you're seeing the refresh flicker because frame rate of the recording is slower than the actual refresh rate, so you're only seeing part of it), but I still get a lot more eye strain and eventual headaches unless I keep my refresh at or above 70.
And again, in your above logic for why you can only achieve a 20-30 times a second refresh rate is inherently flawed. Refresh is independant from (and usually faster than) frame rate. Even if your computer at a given time is only capable of say 30 FPS (which will depend greatly on what you're doing and at what resolution and what your system specs are obviously), your refresh rate will still be whatever it was set to. The frame rate updates the frame buffer, which though not an entirely accurate analogy, could be thought of as a pattern. That 'pattern' is what the computer will use to illuminate pixels on the screen until its replaced by the next 'pattern' (frame).
So for simplicity, we'll assume a frame rate of 30 and a refresh rate of 60. That means that in one second, you're technically going to be shown 60 frames, but 30 of them will simply be duplicates of the previous one, because the frame buffer hasn't updated the pattern yet.
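The arithmetic above can be sketched in a few lines of Python. The numbers are the illustrative ones from the example (30 FPS render, 60 Hz refresh), not measurements from any real display, and the model assumes perfectly regular frame times:

```python
# Sketch: which rendered frame each screen refresh actually shows when the
# game renders at 30 FPS but the monitor refreshes at 60 Hz.

FRAME_RATE = 30    # new frames written to the frame buffer per second
REFRESH_RATE = 60  # times per second the screen redraws from that buffer

def frame_shown_at_refresh(refresh_index):
    """Index of the buffered frame visible on a given refresh."""
    time = refresh_index / REFRESH_RATE  # when this refresh happens
    return int(time * FRAME_RATE)        # last frame rendered by then

shown = [frame_shown_at_refresh(i) for i in range(8)]
print(shown)  # [0, 0, 1, 1, 2, 2, 3, 3] -- each frame is scanned out twice
```

Every frame appears on two consecutive refreshes, which is exactly the "30 duplicates out of 60" in the example.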
It doesn't do a damn thing in terms of smoothing animation, but it does reduce flicker, even if said flicker is happening faster than you can actually perceive with the naked eye. Just because you don't physically see something doesn't mean it isn't there, nor does it mean you won't eventually get a headache. ;)
Incidentally, in NTSC television, you're getting a frame rate of not quite 30 interlaced FPS, but its refresh rate is 60 hertz. PAL, by contrast, was originally standardized at 25 frames per second with a refresh rate of 50 hertz, and because this could still sometimes cause notable flicker issues, most modern PAL sets now operate at 100 hertz, which not only eliminates the full-screen flicker some people could see, but also nearly eliminates the old 'horizontal line flicker' issue that almost everyone could visibly see in older PAL sets.
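The same refresh-to-frame ratio can be worked out for the TV standards just mentioned. These are the nominal figures from the post (NTSC is really about 29.97 fps; it's rounded to 30 here for simplicity):

```python
# Refresh-to-frame ratio for each TV standard: how many times a single frame
# is flashed at the eye before the next frame replaces it.
standards = {
    "NTSC":           (30, 60),   # fps, refresh Hz (nominal; really ~29.97)
    "PAL (original)": (25, 50),
    "PAL (100 Hz)":   (25, 100),
}

ratios = {}
for name, (fps, refresh_hz) in standards.items():
    ratios[name] = refresh_hz // fps
    print(f"{name}: each frame is refreshed {ratios[name]} times")
```

The 100 Hz PAL sets don't show any new picture information; they just flash each of the same 25 frames four times instead of twice, which is why the flicker goes away while the motion looks no smoother.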
Anyway.. flicker is real. It is not LIES AND PROPAGANDA! :D
-Kitt
Gwill:
Quote from: KittKitt on 2006 September 07, 21:45:35
Note not to confuse this with 'frame rate', which has to do with moving pixels (which of course, don't *really* move, but rather one is turned off and one nearby is turned on in its place).
I spend all day at work trying to teach old Projectionists who have spent their entire life clipping 35mm film to use their new digital projectors. I keep trying to turn everything into movie terminology. :P
KittKitt:
Quote from: Gwill on 2006 September 08, 22:57:19
I spend all day at work trying to teach old Projectionists who have spent their entire life clipping 35mm film to use their new digital projectors. I keep trying to turn everything into movie terminology. :P
I don't envy you. The old dog new tricks bit is always a pain, but that's one industry where things have really changed drastically in a relatively short time. =/
But.. it does give me a great analogy for explanation purposes, though technically it's not an analogy, since it's doing the same thing, just through a slightly easier-to-envision mechanism.
In the case of a film projector, your frame rate is the speed at which the frames of film are passed in front of the lamp and lens. In the silent film days, this rate was often variable, ranging from about 18 fps on up depending on the sequence and generally at the whim of the projectionist. Since the advent of sound film, a standard of 24 fps has been adopted because it's the slowest speed that allows the incorporation of sound at any sort of reasonable quality. Some special format films run at 48 fps, but that's not really important here.
What is important is that while the human eye will see an illusion of motion at as few as 16 fps (below that level it can usually tell it's seeing a series of still images instead), you would (and will) still see horrible visible flicker at these speeds if the light source were to run (flash) at the same rate.
Because the light in a projector is flashed much faster than this (a rotating shutter blocks and unblocks the lamp multiple times per frame), the effect is that even though what you're looking at in any given second is 24 frames of picture and 24 black spaces, the light being shone through the film fires much faster, which gives the eye the illusion that it's seeing more frames than it really is. I'm not entirely positive what sort of bulbs are being used in projectors nowadays, but I do know that most light bulbs in your home run on 60 hertz alternating current, which normally gives the illusion that they're always on, even though in actuality they're flickering just like anything that runs on alternating current does.
What this means is that the 60 hertz light bulb is then the 'refresh rate', because the image is being refreshed to our eye that many times, even though the film cell hasn't yet changed.
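The projector case can be put in numbers too. Film projectors conventionally use a two- or three-bladed rotating shutter, so each of the 24 frames is flashed two or three times; the sketch below just multiplies those out:

```python
# Effective flash ("refresh") rate of a 24 fps film projector: the rotating
# shutter interrupts the lamp 2 or 3 times per frame, so the screen is lit
# 48 or 72 times per second even though only 24 distinct frames go by.
FILM_FPS = 24

def flash_rate_hz(shutter_blades):
    """Flashes per second for a shutter with the given number of blades."""
    return FILM_FPS * shutter_blades

print(flash_rate_hz(2))  # two-blade shutter: 48 flashes per second
print(flash_rate_hz(3))  # three-blade shutter: 72 flashes per second
```

Same trick as the 100 Hz PAL sets: the frame rate stays at 24, but the flash rate is pushed above the flicker threshold.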
Incidentally, the reason 60 hertz was chosen initially for light bulbs and the NTSC television standard is that it's exactly the same frequency as the alternating current used in the USA and other areas that are predominantly NTSC rather than PAL. Matching things up is a lot simpler the less conversion you have to do to the power before it does the job it's meant to do.
That's why, if you run lights off a small portable generator, you can see them flicker. It's not that the generator doesn't produce enough power; it just can't produce it at a high enough frequency to maintain the constantly-lit illusion we're accustomed to.
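A rough check on the generator point, as a sketch: a lamp on alternating current actually pulses at twice the supply frequency, since both half-cycles deliver power to it. The fusion threshold used below is illustrative only; the real value varies with the person and the brightness:

```python
# Does a lamp on a given AC supply look steady or visibly flickery?
# The light pulses at twice the supply frequency (power flows on both
# half-cycles); the threshold here is a rough illustrative figure.
FUSION_THRESHOLD_HZ = 60

def light_pulse_hz(supply_hz):
    """Pulsing frequency of a lamp's output on the given AC supply."""
    return 2 * supply_hz

for supply_hz in (60, 50, 25):  # US mains, European mains, a slow generator
    pulse = light_pulse_hz(supply_hz)
    steady = pulse >= FUSION_THRESHOLD_HZ
    print(f"{supply_hz} Hz supply -> {pulse} Hz pulsing:",
          "looks steady" if steady else "visible flicker")
```

On either mains frequency the pulsing is comfortably above the threshold; only a supply running well below mains frequency drops the pulsing into the visibly flickery range.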
-Kitt
Jysudo:
Quote from: dizzy on 2006 September 08, 16:17:52
In my experience with DirectX programming, all issues are driver issues. Refresh rate, in particular, is always the fault of the video driver.
OK, I just realised that it's the refresh rate in the sub-account that is broken. The Sims game in my main XP account (windowed mode) is working perfectly fine.
I have been trying for a few days to solve this problem to no avail.