Visual factors in space perception


On casual consideration, it might be concluded that the perception of space is based exclusively on vision. After closer study, however, this so-called visual space is found to be supplemented perceptually by cues based on auditory (sense of hearing), kinesthetic (sense of bodily movement), olfactory (sense of smell), and gustatory (sense of taste) experience. Spatial cues, such as vestibular stimuli (sense of balance) and other modes for sensing body orientation, also contribute to perception. No single cue is perceived independently of another; in fact, experimental evidence shows these sensations combine to produce unified perceptual experiences.

Despite all this sensory input, most individuals receive the bulk of the information about their environment through the sense of sight, while balance or equilibrium (vestibular sense) apparently ranks next in importance. (For example, in a state of total darkness, an individual’s orientation in space depends mainly on sensory data derived from vestibular stimuli.) Visual stimuli most likely dominate human perception of space because vision is a distance sense; it can supply information from extremely distant points in the environment, reaching out to the stars themselves. Hearing is also considered a distance sense, as is smell, though the space they encompass is considerably more restricted than that of vision. All the other senses, such as touch and taste, are usually considered to be proximal senses, because they typically convey information about elements that come in direct contact with the individual.

The eye works on principles broadly similar to those of a camera. Although the comparison is rough, the retina (the back surface of the inside of the eye) can be thought of as the film, and the lens within the eye as analogous to the camera’s single lens (see eye). Just as in a portrait photographer’s camera, the picture (image) projected from the environment onto the retina is upside down. The perceiver, however, does not experience space as turned upside down. Instead, a person’s perceptual mechanisms cause the world to be viewed as right side up. The exact nature of these mechanisms remains poorly understood, but the process of perception seems to involve at least two inversions: one (optical) inversion of the image on the retina and another (perceptual) inversion that is associated with nerve impulses in the visual tissues of the brain. Research suggests that the individual can adapt to a new set of visual stimulus cues that deviate considerably from those previously learned. Experiments have been carried out with people who have been given spectacles that reverse the right-left or up-down dimensions of images. At first the subjects become disoriented, but, after wearing the distorting glasses for a considerable period of time, they learn to cope with space correctly by reorienting to the environment until objects are perceived as right side up again. The process reverses when the spectacles are removed. At first the basic visual dimensions appear reversed to the subject, but within a short time another adaptation occurs, and the subject reorients to the earlier, well-learned, normal visual cues and perceives the environment as normal once more.
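The optical inversion mentioned above can be illustrated with a minimal pinhole-style sketch of image formation. The ~17 mm focal length used below is an assumed round figure for a simplified (reduced) eye, not a measured anatomical constant, and the function is only an approximation for illustration.

```python
import math


def retinal_image(object_height_m, object_distance_m, eye_focal_length_m=0.017):
    """Rough pinhole/thin-lens sketch of image formation in the eye.

    The ~17 mm focal length is an illustrative round figure for a
    simplified eye. The returned value is the image height on the retina
    in metres; the negative sign marks the optical inversion (the image
    is projected upside down).
    """
    magnification = -eye_focal_length_m / object_distance_m  # small-angle approximation
    return magnification * object_height_m


# A 1.8 m tall person standing 10 m away projects an image roughly 3 mm
# tall, inverted, onto the retina.
print(retinal_image(1.8, 10.0))  # about -0.003 (metres)
```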

Perception of depth and distance

The perception of depth and distance depends on information transmitted through various sense organs. Sensory cues indicate the distance at which objects in the environment are located from the perceiving individual and from each other. Such sense modalities as seeing and hearing transmit depth and distance cues and are largely independent of one another. Each modality by itself can produce consistent perception of the distances of objects. Ordinarily, however, the individual relies on the collaboration of all senses (so-called intermodal perception).

Gross tactile-kinesthetic cues

When perceiving the distances of objects located in nearby space, one depends on the tactile (touch) sense. Tactile experience is usually considered in tandem with kinesthetic experience (sensations of muscle movements and of movements of the sense-organ surfaces). These tactile-kinesthetic sensations enable the individual to differentiate his or her own body from the surrounding environment. This means that the body can function as a perceptual frame of reference—that is, as a standard against which the distances of objects are gauged. Because the perception of one’s own body may vary from time to time, however, its role as a perceptual standard is not always consistent. It has been found that the way in which the environment is perceived can also affect the perception of one’s body.

Cues from the eye muscles

When one looks at an object at a distance, the effort arouses activity in two eye-muscle systems called the ciliary muscles and the rectus muscles. The ciliary effect is called accommodation (focusing the lens for near or far vision), and the rectus effect is called convergence (moving the entire eyeball). Each of these muscle systems contracts as a perceived object approaches. The effect of accommodation in this case is to make the lens more convex, while the rectus muscles rotate the eyes to converge on the object as it comes nearer. One’s experience of these muscle contractions provides clues about the distance of objects.

Beyond the cues of accommodation and convergence, the size of the retinal image indicates how far one is from an object. For an object of a given physical size, the larger its image on the retina, the closer the object is judged to be. Proponents of the so-called nativistic theory hold that such sensory cues activate inherited tendencies that allow attributes such as size to be perceived without the need for learning. Modern efforts to study these cues have been directed especially toward physiological changes in the body that may be related to depth perception; whether the perception of depth is wholly inborn, and thus independent of learning, remains controversial.
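The retinal-size cue follows from simple geometry: the visual angle an object subtends, and hence the size of its retinal image, shrinks as its distance increases. The following is a minimal sketch of that relation; the 0.25 m object size is an arbitrary illustrative value, not a figure from the discussion above.

```python
import math


def visual_angle_deg(object_size_m, distance_m):
    """Visual angle subtended by an object of known physical size.

    Illustrates the retinal-size cue: for a fixed physical size, the
    subtended angle (and hence the retinal image) shrinks as the
    viewing distance grows.
    """
    return math.degrees(2 * math.atan(object_size_m / (2 * distance_m)))


# The same 0.25 m object viewed from increasing distances:
for d in (1, 2, 5, 10):
    print(d, round(visual_angle_deg(0.25, d), 2))
# 1 m -> ~14.25 deg, 2 m -> ~7.15 deg, 5 m -> ~2.86 deg, 10 m -> ~1.43 deg
```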

Accommodation and convergence provide reliable cues when the perceived object is at a distance of less than about 30 feet (9 metres) and when it is perceived binocularly (with both eyes at once).
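Why convergence loses its usefulness beyond roughly this range can be sketched from its geometry. Assuming a typical interpupillary separation of about 63 mm (an assumed figure, not given in the article), the angle through which the eyes must converge changes sharply at near range but only negligibly beyond a few metres.

```python
import math

INTERPUPILLARY_DISTANCE_M = 0.063  # assumed typical separation of the two eyes (~63 mm)


def convergence_angle_deg(distance_m):
    """Angle through which the two eyes converge to fixate a point straight ahead.

    Sketch of why convergence is informative mainly at short range: the
    angle falls off rapidly within a few metres and is nearly flat
    beyond about 9 m (30 feet).
    """
    return math.degrees(2 * math.atan(INTERPUPILLARY_DISTANCE_M / (2 * distance_m)))


for d in (0.3, 1.0, 3.0, 9.0, 30.0):
    print(f"{d:5.1f} m -> {convergence_angle_deg(d):.2f} deg")
# 0.3 m -> ~12.0 deg, 1 m -> ~3.6 deg, 3 m -> ~1.2 deg, 9 m -> ~0.4 deg, 30 m -> ~0.1 deg
```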