Re: perceiving fast motion (was: Re: gettimeofday() and clock)
Gregor Mückl wrote:
>> That's nonsense. I'm a professional graphics programmer and I can
>> absolutely assure you that the difference between 30Hz and 60Hz is
>> like day and night. In the flight simulation business, we couldn't
>> sell a simulator that ran graphics at 30Hz. We haven't sold a 30Hz
>> system for 10 years.
>>
>> At 30Hz, each image is drawn twice onto the faceplate of the CRT.
>> Your eye/brain interpolates the position of the images it sees and
>> the only way to rationalize an object that moves rapidly for 1/60th
>> second and then stops for 1/60th second is to presume that there are
>> really *TWO* objects moving along right behind one another. This
>> double-imaging artifact is really unpleasant in any kind of fast-moving
>> application.
>>
>
> Well. One point is wrong here: The human eye cannot recognise that
> motion is rendered jerky because the same frame is drawn during two
> refreshes. It is too slow for that.
Nope - you are 100% wrong. I've been studying human perception for
the last 20 years that I've been a graphics programmer - and what I
say is *easy* to demonstrate.
The US government has spent hundreds of millions of dollars extra
to have flight simulators that update graphics at 60Hz rather than
the (much cheaper) 30Hz. The double-imaging effect at 30Hz makes
the job of a fighter pilot impossible to do.
> What happens is something different: At the specified frame rate you
> generate a sequence of frames that get drawn only by one screen refresh.
> The human eye cannot distinguish between these frames, but perceives a
> series of them laid over each other. Imagine that you would alpha-blend
> your frames one over the other.
Nope - you are guessing.
It does nothing of the sort.
The reason we are able to perceive motion in a series of still images is
not the old "persistence of vision" thing - that was something the Victorians
presumed was happening (they didn't have computer graphics so it was hard
to show otherwise).
Imagine your classic caveman with a rock who wishes to kill a cute
bunny rabbit for his supper. When the rabbit runs along the ground,
he has no problem throwing the rock and hitting the rabbit.
However, suppose there are some trees between him and the rabbit. As the
rabbit runs, it disappears behind a tree, then reappears, disappears and
so on. Humans have evolved a trait that allows us to imagine the
position of the rabbit even at times when we can't actually see it.
We *INTERPOLATE* the position of the rabbit and perceive it as smooth
motion so we can still hit it with that rock.
It is this feature of the eye/brain that makes intermittent graphics
signals work. When there is no light coming from the CRT (because we
are between frames), we simply interpolate the position of moving objects
just as if a tree had come between us momentarily.
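That "filling in" is just interpolation between sightings. Here is a
toy sketch of it in Python (the numbers and function name are made up
purely for illustration):

```python
# Toy model: estimate a moving object's position while it is hidden,
# by linear interpolation between the last and next times it was seen.
def interpolate_hidden(pos_before, pos_after, t_before, t_after, t):
    """Estimate position at time t, with t between the two sightings."""
    f = (t - t_before) / (t_after - t_before)
    return pos_before + (pos_after - pos_before) * f

# Rabbit seen at x=0m (t=0s) and at x=6m (t=1s); where was it at t=0.5s?
print(interpolate_hidden(0.0, 6.0, 0.0, 1.0, 0.5))   # 3.0
```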
However, if the moving object is displayed twice in the same position,
(as happens when you display an object at 30Hz on a 60Hz CRT - or at
36Hz on a 72Hz CRT - or even at 60Hz on a 120Hz CRT), your eye/brain
cannot understand the motion of the object. How could something moving
so quickly possibly stop dead, then accelerate so quickly again? This
makes no sense. However, if we imagine that there are TWO objects - both
moving rapidly - spaced at exactly the right distance - each disappearing
for 1/30th of a second and then reappearing - then the image makes perfect
sense and we can again interpolate the two objects' positions.
This really happens. On 30Hz update/60Hz video systems, you see double
images of everything that's moving fast. On 20Hz update/60Hz video, nearly
everyone sees triple-imaged objects. At 15Hz update/60Hz video, a *few*
people see quadruple images - but most of us have brains that say - "Well,
as unlikely as it may seem, this object really is stopping and starting",
and at 10Hz update, we *all* see that...which means we see jittery motion
instead of multiple objects.
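The arithmetic behind the double image is easy to sketch. This is a toy
model in Python (made-up speed units, nothing from a real renderer):
when the update rate is half the refresh rate, every position is shown
on two consecutive refreshes.

```python
# Position shown on each screen refresh for an object moving at constant
# speed, when new frames are only rendered at a lower update rate.
def displayed_positions(update_hz, refresh_hz, speed, n_refreshes):
    repeat = refresh_hz // update_hz      # refreshes per rendered frame
    positions = []
    for i in range(n_refreshes):
        frame = i // repeat               # which rendered frame is shown
        positions.append(speed * frame / update_hz)  # position at render time
    return positions

# 60Hz update on a 60Hz CRT: every refresh shows a fresh position.
print(displayed_positions(60, 60, 60.0, 6))   # [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
# 30Hz update on a 60Hz CRT: each position shown twice -> double image.
print(displayed_positions(30, 60, 60.0, 6))   # [0.0, 0.0, 2.0, 2.0, 4.0, 4.0]
```

The eye/brain reads that second sequence as two objects, one trailing
the other, exactly as described above.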
>> Not so. You can perceive motion up to around 70 to 120Hz - depending
>> on the individual, the room lighting conditions and whether the image
>> is centered on your eye or viewed with peripheral vision. That's why
>> higher quality graphics card/CRT combos are run at 72 or 76 Hertz.
>
> Impossible. Impulses from the receptors in human eyes usually last for
> about 1/25th second. So it cannot resolve shorter time periods than
> that.
Not true - they don't all fire at once.
Imagine a crowd of 100 people watching some event. Each person opens their
eyes for 1 second - then closes them again for 9 seconds. If they all
did this at the exact same time then I could jump up in front of them
and wave a flag for 1 second and only have a one-in-ten chance of being
seen. However, if they each open their eyes at different *times* then
the odds that I'll go unnoticed drop to almost zero.
So, our neurons fire slowly - but there are one heck of a lot of them
and they don't all fire at once. Humans can see flashes of light
containing as few as 50 photons that last for less than one millionth
of a second.
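You can put a rough number on that crowd argument. Treat each receptor
as "looking" 10% of the time at an independent random phase and the
flash as instantaneous (a deliberately crude model, just to show the
scale of the effect):

```python
# Probability that a brief flash is missed by ALL of n observers, each
# of whom is "looking" for duty_cycle of the time at a random phase.
def prob_flash_missed(n_observers, duty_cycle=0.1):
    return (1.0 - duty_cycle) ** n_observers

print(prob_flash_missed(1))     # one slow receptor: 0.9 (usually missed)
print(prob_flash_missed(100))   # 100 staggered receptors: ~2.7e-5
```

So even with individually slow receptors, the ensemble almost never
misses the flash.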
But it's all much more complex than that. Our eyes jitter around to
increase resolution, our brains reconstruct parts of the image that
are temporarily hidden using memory images recorded earlier and it
inserts them into the image that we consciously perceive. Our eyes
actually switch off when we move them quickly and (again) the brain
inserts 'archival footage' to fill in the gaps. We have different
spatial resolution for colour and monochrome information - and
different 'motion detection' rates for colour and monochrome. We
re-order events in time to hide the effects of our own reaction
times from our 'high level' selves. We 'guess' at colour information
when it's too dim to detect it properly. The missing video when we
blink is seamlessly edited out of our visual perception.
But our 'high level' conscious selves are kept carefully unaware of
the low level nastiness that goes on in our perceptual systems.
It takes subtle experiments to find these phenomena - and it's often
hard to believe the findings because we are so sure we 'know ourselves'.
Our brains are *INCREDIBLY* sneaky about doing the best with limited
sensor and motor capability and the 'programming' model is so well
structured that the conscious levels of our minds are totally unaware
of it. Evolution has optimised the heck out of a really crappy pair
of cameras on a shaky, flickery mount and provided an interface to
our conscious minds that is seamless and has no noticeable flaws...until
you stuff 30Hz video into it - something we are *NOT* evolved to deal
with.
Unless you've studied all this stuff, you really shouldn't contradict
someone who has.
> So a sequence of frames with a rate higher than 30fps gets blurred
> by the human eye.
That's not true. If it were then we could all run our CRTs at 25Hz video
rates. Television signals could consume about half the current bandwidth
and DVDs would hold twice as much video as they currently do.
I was born and raised in England - where 50Hz television is the standard.
It's noticeably more flickery than US television at 60Hz and Americans
often cannot stand to watch it because their brains have adapted to
live with 60Hz. This would not be the case if the receptors in our
eyes worked that slowly.
Both TV standards actually only send half of the scan lines every field
(this is called 'interlace') so the complete picture only updates at
half speed. Occasionally, in computer graphic imagery, you'll see
single-pixel horizontal lines flickering violently. TV studios go
to enormous lengths to avoid this - but sometimes it's inevitable.
If your eyes worked the way you say, you would be unable to see that
effect.
Those of us who programmed graphics for the old Amigas and Ataris
using televisions instead of monitors will attest to the fact that
30Hz interlace flicker is a VERY real phenomenon.
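The interlace flicker is simple to reason about: even scan lines go out
in one field, odd lines in the next, so a feature exactly one scan line
tall only gets refreshed every *other* field. A little sketch (toy
model, 60Hz field rate assumed as in NTSC):

```python
# Effective refresh rate of a feature occupying the given scan lines on
# an interlaced display: even lines live in one field, odd in the other.
def effective_refresh_hz(line_numbers, field_rate_hz=60):
    spans_both_fields = len({y % 2 for y in line_numbers}) == 2
    return field_rate_hz if spans_both_fields else field_rate_hz / 2

print(effective_refresh_hz([100, 101]))  # two-line feature: 60
print(effective_refresh_hz([100]))       # single-pixel line: 30.0 -> flicker
```

A 30Hz refresh is well inside the range where flicker is visible, which
is exactly what you see on those single-pixel lines.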
So, go read a book on the subject - or just take my word for it.
I've been researching this stuff for 20 years and I'm paid a *TON*
of money for knowing what I know. You may take it that I'm quite
expert in this subject.
>> Doing generalized motion blur in interactive computer graphics is an
>> unsolved problem.
>
> Really? That's an article I found while I did a little research for this
> posting. It describes a method which could in fact be very promising:
>
> http://www.acm.org/crossroads/xrds3-4/ellen.html
Completely impractical - unless you are doing ray-tracing - which they are.
Rendering every pixel at a different moment in time means that the hardware
can't simply scan-convert a polygon. Scan-converting polygons is the only
known technique that gives us reasonable frame rates. Even the very best
so-called 'realtime' raytracers are not even 1/1000th as fast as a cheap
PC graphics card.
> Another, simpler idea I had before I stumbled over this article was to
> approximate the movement the object makes by a series of linear segments
> each one being as long as the time the frame is displayed. Then you blur
> it along this line segment while rendering the corresponding frame by
> drawing it several times (using alpha-blending, of course) in different
> positions along this line. You could become really sophisticated and
> make the number of drawing steps depend on the object's speed and
> distance from the viewer. One possible drawback I see with this method
> is that it might irritate the viewer when the object being rendered has
> noticeable direction changes with each frame.
That's hard to do when you have texture though.
This (and other) techniques for faking motion blur help - but they
aren't really general techniques and they don't fit in well with
things like lighting and texture.
However, I've used them in the past - and you see them a lot in
non-realtime situations.
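For what it's worth, the multi-draw idea you describe boils down to
picking sample points along the per-frame motion segment and giving each
copy a fractional weight. A rough sketch (names and weighting scheme are
mine, not from any real API; a renderer would alpha-blend each sample):

```python
# Sample positions for the "draw the object several times along its
# motion vector" blur trick. Each sample carries an equal weight.
def blur_samples(pos_prev, pos_now, n_samples):
    (x0, y0), (x1, y1) = pos_prev, pos_now
    weight = 1.0 / n_samples
    samples = []
    for i in range(n_samples):
        t = i / (n_samples - 1) if n_samples > 1 else 1.0
        samples.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t, weight))
    return samples

for x, y, w in blur_samples((0.0, 0.0), (8.0, 4.0), 5):
    print(f"draw at ({x:.1f}, {y:.1f}) with weight {w:.2f}")
```

The texture problem is that each of those five draws has to re-filter
the texture, which is where the cost and the artifacts come from.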
>> I can absolutely assure you that 60Hz is the MINIMUM frame rate I'd
>> consider for most applications. 76Hz would be preferable.
>
> High framerates are hacks. Only better rendering algorithms are the
> final solutions IMO. How these algorithms would work is pretty clear.
> But they need a vast amount of computational power. And I'm actually in
> a mood to hack a little demo to show off what I think. Maybe tomorrow
> evening.
That is the problem. If you have more computational power, it's better
to use it to simply render frames faster than it is to complexify the
algorithms. To wipe out the problem of double imaging at 30Hz, you'll
need more computational power. However, it's a lot cheaper and easier
to simply double the speed of the machine and run at 60Hz than it is
to put all the complexity *and* additional speed needed to render at
30Hz with motion blur.
There is an additional problem with 30Hz rendering - and that is latency.
When you have 'twitch time' human interaction going on, the 33ms to render
the frame plus the 16ms to display it on the CRT adds up to 50ms - which
is longer than most people can tolerate for aiming and shooting moving objects.
Running at 60Hz reduces that to a more comfortable 33ms.
Quake-heads will tell you this is important - which is why some of them
run the game at 120Hz frame rates - even though their CRT is running
at 60Hz and fully half of the pixels rendered never make it onto the
screen!
The reduction in latency is 8ms - which makes a difference that they can
easily feel.
Motion blur (at 30Hz) not only doesn't help the latency problem - it
actually makes it worse because the front edge of the moving object has
50ms of latency - but the back (blurred) edge has 80ms!
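The latency arithmetic above is just one render interval plus one
scan-out interval. As a quick check (Python, rounding to whole
milliseconds):

```python
# Rough input-to-photon latency: time to render one frame plus time for
# the CRT to scan it out.
def latency_ms(render_hz, display_hz):
    return 1000.0 / render_hz + 1000.0 / display_hz

print(round(latency_ms(30, 60)))    # 50 -> 33ms render + 16ms scan-out
print(round(latency_ms(60, 60)))    # 33
print(round(latency_ms(120, 60)))   # 25 -> the Quake player's ~8ms win
```

Note the 120Hz case: the display still runs at 60Hz, but halving the
render interval still buys back about 8ms.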
On flight simulators we try to fix that by extrapolating the positions
of objects ahead in time to where they *will* be by the time the CRT
gets through lighting up the phosphor. This only works for us because
aircraft - whilst fast - don't change speed or direction all that quickly
and we can predict with reasonable accuracy where they'll be in 33ms
from now by knowing the aerodynamics, etc, etc. That's not so easy
for twitch-action computer games where things are lighter and move
around with improbable accelerations!
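The extrapolation itself is nothing exotic: a constant-acceleration
prediction over the pipeline delay. A minimal sketch (the numbers below
are illustrative, not from any actual simulator):

```python
# Dead-reckoning extrapolation: draw the object where it *will* be when
# the phosphor lights up, assuming roughly constant acceleration.
def extrapolate(pos, vel, acc, delay_s):
    """Predict position delay_s seconds ahead: p + v*t + a*t^2/2."""
    return pos + vel * delay_s + 0.5 * acc * delay_s ** 2

# Aircraft at 100 m/s with mild acceleration, 33ms pipeline delay:
print(extrapolate(0.0, 100.0, 2.0, 0.033))   # ~3.3 m ahead
```

With aircraft the acceleration term stays tiny over 33ms, which is why
the trick works there and falls apart for twitch games.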
----------------------------- Steve Baker -------------------------------
Mail : <sjbaker1@airmail.net> WorkMail: <sjbaker@link.com>
URLs : http://www.sjbaker.org
http://plib.sf.net http://tuxaqfh.sf.net http://tuxkart.sf.net
http://prettypoly.sf.net http://freeglut.sf.net
http://toobular.sf.net http://lodestone.sf.net