P3D Depth perception cues
I found my old list of depth perception cues - it seems to me we might
have discussed a few other possible cues since this was last posted.
(I just added the entry on chromatic aberration cues, and touched up
a few others.)
JR
---------------------------------------------
Primary sources of human depth perception:
- binocular stereo: determining the distance (or relative distance) of an
object based on the convergence of the eyes, and the disparity (if any)
between the two views. Note that you don't have to converge on a particular
object to get an idea of its distance (change in disparity with change
in convergence is another source of information). (When done properly,
the Pulfrich effect produces true binocular stereo, though it is
produced by motion.) (A worked example of depth from disparity appears
after this list.)
- keystoning (nearby objects; uncertain - see the 970529 T3D discussion)
(Premise: the human visual system produces keystoning when viewing
nearby objects, and the amount of keystoning is partly a function of
the distance to the objects. The open question is whether the brain
actually uses this information.)
- visual cues from the image:
- perspective
- known shape of an object
- known size of an object (a worked example appears after this list)
- hiding of an object by other objects in front of it (occlusion)
- multiple-reflection lighting (including reflection from non-specular
reflectors)
- degree of focus of objects in a scene (not sufficient when used alone
to determine depth, but still a useful cue)
- distance-related cues such as haze, brightness in a flash photo,
texture gradient (variations in perceptible detail with distance), etc.
- focal distance of the eyes [accommodation] (which viewers of
traditional stereo photos must learn to decouple from convergence)
- chromatic aberration of the eyes - normally suppressed from conscious
perception, but used by the visual system to signal whether the focal
distance (accommodation) of the eyes is greater than or less than the
distance to the object being viewed
- depth determined from motion of the point of view (head or camera)
examples:
- lateral motion of head or camera
- rotation of the object being viewed
- oscillation of the camera (various patented techniques)
- motion of the point of view directly toward or away from the
scene being viewed (rate of change in size and angle of view of
objects in the scene; a time-to-contact sketch appears after this list)
- audio effects, and combinations of visual and audio effects
(A simple example is determining the distance of a flash of lightning -
see the sketch after this list - but the human auditory system is
capable of perceiving much shorter distances: there's circuitry in there
capable of handling time intervals of hundreds of microseconds, or
perhaps finer. Humans also use echolocation, usually not consciously -
I don't ordinarily notice it, but I find that I can usually walk down a
hallway with my eyes closed without bumping into anything, and I can't
think of any other explanation.)
- kinesthetic senses: your brain knows where in space the articulated parts
of your body are.
- weird stuff like sensing of thermal infrared (The temperature sensitivity
of human skin is actually about the same as that of the sensors of pit
vipers, though handicapped by being attached to warm flesh. Humans can't
image in thermal infrared like pit vipers, but they can detect the
proximity of hot or cold objects.) Also detection of electric field of
a charged object. Both absolute value and gradient may be distance cues
for "touch"-based depth perception.
- "technology-assisted" depth perception - anything using external devices,
for example a blind person's cane, radar, or a GPS receiver.
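
Worked example for the binocular stereo entry: a minimal sketch of depth
from disparity, assuming an idealized pinhole model of the eyes. The
function name and the 17 mm / 65 mm figures are illustrative assumptions,
not measurements.

    # Depth from binocular disparity in an idealized pinhole model:
    # Z = f * B / d, with focal length f, baseline (eye separation) B,
    # and disparity d, all in the same units.
    def depth_from_disparity(focal_length_mm, baseline_mm, disparity_mm):
        if disparity_mm == 0:
            return float("inf")  # zero disparity: object at infinity
        return focal_length_mm * baseline_mm / disparity_mm

    # Assumed values: ~17 mm eye focal length, ~65 mm interocular
    # separation, 0.5 mm of disparity -> object roughly 2.2 m away.
    print(depth_from_disparity(17.0, 65.0, 0.5))  # 2210.0 (mm)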
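
Worked example for the known-size cue: in the same pinhole model, an
object of known real size and measured image size gives its distance
directly. The 1.7 m person and 17 mm focal length are assumed figures.

    # Distance from the known size of an object (pinhole model):
    # distance = real_size * focal_length / image_size.
    def distance_from_known_size(real_size_m, focal_length_mm, image_size_mm):
        return real_size_m * focal_length_mm / image_size_mm

    # A 1.7 m person forming a 1 mm image at 17 mm focal length is
    # about 29 m away.
    print(distance_from_known_size(1.7, 17.0, 1.0))  # 28.9 (m)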
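
Time-to-contact sketch for motion directly toward the scene: the rate of
image expansion alone gives time-to-contact (the "looming" relation),
with no need for absolute distance. The sample values are assumptions.

    # Time-to-contact from the rate of angular expansion:
    # tau = theta / (d theta / dt), for a straight-line approach at
    # constant speed.
    def time_to_contact_s(angular_size_rad, expansion_rate_rad_per_s):
        return angular_size_rad / expansion_rate_rad_per_s

    # An object subtending 0.1 rad and growing at 0.02 rad/s will be
    # reached in about 5 s at the current closing speed.
    print(time_to_contact_s(0.1, 0.02))  # 5.0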
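
Sketch for the audio entry: the lightning rule of thumb, plus the
interaural timing scale mentioned above. The 0.2 m ear-to-ear path is an
assumed round figure.

    # Sound travel time as a distance cue.
    SPEED_OF_SOUND = 343.0  # m/s in air at about 20 C

    def lightning_distance_m(flash_to_thunder_delay_s):
        return SPEED_OF_SOUND * flash_to_thunder_delay_s

    print(lightning_distance_m(3.0))  # ~1029 m: roughly 3 s per km

    # An ear-to-ear path of ~0.2 m caps the interaural time difference
    # near 0.0006 s - the "hundreds of microseconds" scale cited above.
    print(0.2 / SPEED_OF_SOUND)  # ~5.8e-4 s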
Note that many of these methods of depth perception are quite peripheral
to stereo photography, but applications such as advanced "virtual reality"
will require cleverly combining multiple depth perception techniques to
create the most effective sense of realism.
John Roberts
------------------------------