Chapter 5. Sensing and Perceiving
Misperception by Those Trained to Accurately Perceive a Threat
On September 6, 2007, the Asia-Pacific Economic Cooperation (APEC) leaders’ summit was being held in
downtown Sydney, Australia. World leaders were attending the summit. Many roads in the area were closed
for security reasons, and police presence was high.
As a prank, eight members of the Australian television satire The Chaser’s War on Everything assembled
a false motorcade made up of two black four-wheel-drive vehicles, a black sedan, two motorcycles,
bodyguards, and chauffeurs (see the video below). Group member Chas Licciardello was in one of the cars
disguised as Osama bin Laden. The motorcade drove through Sydney’s central business district and entered
the security zone of the meeting. The motorcade was waved on by police, through two checkpoints, until the
Chaser group decided it had taken the gag far enough and stopped outside the InterContinental Hotel where
former U.S. president George W. Bush was staying. Licciardello stepped out onto the street and complained,
in character as bin Laden, about not being invited to the APEC Summit. Only at this time did the police
belatedly check the identity of the group members, finally arresting them.
Watch the Chaser APEC Motorcade Stunt [YouTube]
Afterward, the group testified that it had made little effort to disguise its attempt as anything more than
a prank. The group’s only realistic attempt to fool police was its Canadian-flag-marked vehicles. Other
than that, the group used obviously fake credentials, and its security passes were printed with “JOKE,”
“Insecurity,” and “It’s pretty obvious this isn’t a real pass,” all clearly visible to any police officer who
might have been troubled to look closely as the motorcade passed. The required APEC 2007 official vehicle
stickers had the name of the group’s show printed on them, and this text: “This dude likes trees and poetry
and certain types of carnivorous plants excite him.” In addition, a few of the “bodyguards” were carrying
camcorders, and one of the motorcyclists was dressed in jeans, both details that should have alerted police
that something was amiss.
The Chaser pranksters later explained the primary reason for the stunt. They wanted to make a statement
about the fact that bin Laden, a world leader, had not been invited to an APEC Summit where issues of terror
were being discussed. The secondary motive was to test the event’s security. The show’s lawyers approved
the stunt, under the assumption that the motorcade would be stopped at the APEC meeting.
The ability to detect and interpret the events that are occurring around us allows us to respond to these stimuli
appropriately (Gibson & Pick, 2000). In most cases the system is successful, but as you can see from the above
example, it is not perfect. In this chapter we will discuss the strengths and limitations of these capacities, focusing on
both sensation—awareness resulting from the stimulation of a sense organ — and perception—the organization
and interpretation of sensations. Sensation and perception work seamlessly together to allow us to experience the
world through our eyes, ears, nose, tongue, and skin, but also to combine what we are currently learning from the
environment with what we already know about it to make judgments and to choose appropriate behaviours.
The study of sensation and perception is exceedingly important for our everyday lives because the knowledge
generated by psychologists is used in so many ways to help so many people. Psychologists work closely with
mechanical and electrical engineers, with experts in defence and military contractors, and with clinical, health, and
sports psychologists to help them apply this knowledge to their everyday practices. The research is used to help us
understand and better prepare people to cope with such diverse events as driving cars, flying planes, creating robots,
and managing pain (Fajen & Warren, 2003).
Figure 5.1 Sports psychologists, video game designers, and mechanical engineers use knowledge
about sensation and perception to create and improve everyday objects and behaviours.
We will begin the chapter with a focus on the six senses of seeing, hearing, smelling, touching, tasting, and
monitoring the body’s positions (proprioception). We will see that sensation is sometimes relatively direct, in
the sense that the wide variety of stimuli around us inform and guide our behaviours quickly and accurately,
but nevertheless is always the result of at least some interpretation. We do not directly experience stimuli, but
rather we experience those stimuli as they are created by our senses. Each sense accomplishes the basic process of
transduction—the conversion of stimuli detected by receptor cells to electrical impulses that are then transported
to the brain — in different, but related, ways.
After we have reviewed the basic processes of sensation, we will turn to the topic of perception, focusing on how the
brain’s processing of sensory experience can not only help us make quick and accurate judgments, but also mislead
us into making perceptual and judgmental errors, such as those that allowed the Chaser group to breach security at
the APEC meeting.
References
Fajen, B. R., & Warren, W. H. (2003). Behavioral dynamics of steering, obstacle avoidance, and route
selection. Journal of Experimental Psychology: Human Perception and Performance, 29(2), 343–362.
Gibson, E. J., & Pick, A. D. (2000). An ecological approach to perceptual learning and development. New York,
NY: Oxford University Press.
Image Attributions
Figure 5.1: Caroline ouellette by Genevieve2 (http://en.wikipedia.org/wiki/
File:Caroline_Ouellette_8_janvier_2011.jpg) used under CC BY SA 3.0 license (http://creativecommons.org/
licenses/by-sa/3.0/deed.en); Arcade by Belinda Hankins Miller (http://it.wikipedia.org/wiki/
File:Arcade-20071020-a.jpg) used under CC BY 2.0 license (http://creativecommons.org/licenses/by/2.0/deed.it);
Niagara Bridge, Canada by Tony Hisgett (http://commons.wikimedia.org/wiki/File:Niagara_Bridge,_Canada.jpg)
used under CC BY 2.0 license (http://creativecommons.org/licenses/by/2.0/deed.en).
5.1 We Experience Our World through Sensation
Learning Objectives
1. Review and summarize the capacities and limitations of human sensation.
2. Explain the difference between sensation and perception and describe how psychologists
measure sensory and difference thresholds.
Sensory Thresholds: What Can We Experience?
Humans possess powerful sensory capacities that allow us to sense the kaleidoscope of sights, sounds, smells, and
tastes that surround us. Our eyes detect light energy and our ears pick up sound waves. Our skin senses touch,
pressure, hot, and cold. Our tongues react to the molecules of the foods we eat, and our noses detect scents in the
air. The human perceptual system is wired for accuracy, and people are exceedingly good at making use of the wide
variety of information available to them (Stoffregen & Bardy, 2001).
In many ways our senses are quite remarkable. The human eye can detect the equivalent of a single candle flame
burning 30 miles away and can distinguish among more than 300,000 different colours. The human ear can detect
sounds as low as 20 hertz (vibrations per second) and as high as 20,000 hertz, and it can hear the tick of a clock
about 20 feet away in a quiet room. We can taste a teaspoon of sugar dissolved in two gallons of water, and we are
able to smell one drop of perfume diffused in a three-room apartment. We can feel the wing of a bee falling on our cheek from one centimetre above (Galanter, 1962).
Test your hearing
To get an idea of the range of sounds that the human ear can sense, test your hearing here: http://test-myhearing.com
Although there is much that we do sense, there is even more that we do not. Dogs (Figure 5.2), bats, whales, and
some rodents all have much better hearing than we do, and many animals have a far richer sense of smell. Birds
are able to see the ultraviolet light that we cannot (see Figure 5.3, “Ultraviolet Light and Bird Vision”) and can
also sense the pull of the earth’s magnetic field. Cats have an extremely sensitive and sophisticated sense of touch,
and they are able to navigate in complete darkness using their whiskers. The fact that different organisms have
different sensations is part of their evolutionary adaptation. Each species is adapted to sensing the things that are
most important to them, while being blissfully unaware of the things that don’t matter.
Figure 5.2 Smell. The dog’s highly sensitive sense of smell is
useful for searches of missing persons, explosives, foods, and
drugs.
Figure 5.3 Ultraviolet Light and Bird Vision. Birds can see ultraviolet
light; humans cannot. What looks like a black bird to us is in colour for a
bird.
Measuring Sensation
Psychophysics is the branch of psychology that studies the effects of physical stimuli on sensory perceptions and
mental states. The field of psychophysics was founded by the German psychologist Gustav Fechner (1801-1887),
who was the first to study the relationship between the strength of a stimulus and a person’s ability to detect the
stimulus.
The measurement techniques developed by Fechner and his colleagues are designed in part to help determine
the limits of human sensation. One important criterion is the ability to detect very faint stimuli. The absolute
threshold of a sensation is defined as the intensity of a stimulus that allows an organism to just barely detect it.
In a typical psychophysics experiment, an individual is presented with a series of trials in which a signal is
sometimes presented and sometimes not, or in which two stimuli are presented that are either the same or different.
Imagine, for instance, that you were asked to take a hearing test. On each of the trials your task is to indicate either
“yes” if you heard a sound or “no” if you did not. The signals are purposefully made to be very faint, making
accurate judgments difficult.
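To make the logic of such an experiment concrete, here is a brief illustrative sketch in Python (the intensities and response counts are invented for illustration, not data from any real study). It estimates the absolute threshold as the lowest intensity detected on at least half of the trials, the 50% convention described later in this section.

```python
# Illustrative sketch: estimating an absolute threshold from a yes/no detection task.
# The intensities (arbitrary units) and "yes" counts below are invented, not real data.

trials_per_intensity = 20
yes_counts = {1: 2, 2: 6, 3: 11, 4: 17, 5: 20}  # "yes" responses at each tone intensity

detection_rates = {i: n / trials_per_intensity for i, n in yes_counts.items()}

# By convention, the absolute threshold is the intensity detected on 50% of trials,
# so we take the lowest tested intensity whose detection rate reaches 0.5.
threshold = min(i for i, p in detection_rates.items() if p >= 0.5)
print(f"Estimated absolute threshold: intensity level {threshold}")
```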
The problem for you is that the very faint signals create uncertainty. Because our ears are constantly sending
background information to the brain, you will sometimes think that you heard a sound when none was there, and
you will sometimes fail to detect a sound that is there. Your task is to determine whether the neural activity that you
are experiencing is due to the background noise alone or is the result of a signal within the noise.
The responses that you give on the hearing test can be analyzed using signal detection analysis. Signal detection
analysis is a technique used to determine the ability of the perceiver to separate true signals from background noise
(Macmillan & Creelman, 2005; Wickens, 2002). As you can see in Figure 5.4, “Outcomes of a Signal Detection
Analysis,” each judgment trial creates four possible outcomes: A hit occurs when you, as the listener, correctly say
“yes” when there was a sound. A false alarm occurs when you respond “yes” to no signal. In the other two cases
you respond “no” — either a miss (saying “no” when there was a signal) or a correct rejection (saying “no” when
there was in fact no signal).
Figure 5.4 Outcomes of a Signal Detection Analysis. Our ability to accurately detect stimuli
is measured using a signal detection analysis. Two of the possible decisions (hits and correct
rejections) are accurate; the other two (misses and false alarms) are errors.
The analysis of the data from a psychophysics experiment creates two measures. One measure, known as sensitivity,
refers to the true ability of the individual to detect the presence or absence of signals. People who have better
hearing will have higher sensitivity than will those with poorer hearing. The other measure, response bias, refers to
a behavioural tendency to respond “yes” to the trials, which is independent of sensitivity.
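For readers who want to see how these two measures are separated, the sketch below applies the standard equal-variance signal detection formulas (d′ for sensitivity and the criterion c for response bias). The trial counts are invented for illustration.

```python
from statistics import NormalDist

def signal_detection(hits, misses, false_alarms, correct_rejections):
    """Return sensitivity (d') and response bias (criterion c) for one listener."""
    z = NormalDist().inv_cdf                                      # inverse standard normal CDF
    hit_rate = hits / (hits + misses)                             # P("yes" | signal present)
    fa_rate = false_alarms / (false_alarms + correct_rejections)  # P("yes" | noise only)
    d_prime = z(hit_rate) - z(fa_rate)           # higher = better separation of signal from noise
    criterion = -(z(hit_rate) + z(fa_rate)) / 2  # negative = liberal ("yes"-prone), positive = cautious
    return d_prime, criterion

# Example: a listener who says "yes" readily, producing many hits but also many false alarms.
d, c = signal_detection(hits=45, misses=5, false_alarms=30, correct_rejections=20)
print(f"d' = {d:.2f}, criterion = {c:.2f}")  # d' = 1.03, criterion = -0.77
```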
Imagine, for instance, that rather than taking a hearing test, you are a soldier on guard duty, and your job is to detect
the very faint sound of the breaking of a branch that indicates that an enemy is nearby. You can see that in this
case making a false alarm by alerting the other soldiers to the sound might not be as costly as a miss (a failure to
report the sound), which could be deadly. Therefore, you might well adopt a very lenient response bias in which
whenever you are at all unsure, you send a warning signal. In this case your responses may not be very accurate
(your sensitivity may be low because you are making a lot of false alarms) and yet the extreme response bias can
save lives.
Another application of signal detection occurs when medical technicians study body images for the presence of
cancerous tumours. Again, a miss (in which the technician incorrectly determines that there is no tumour) can be
very costly, but false alarms (referring patients who do not have tumours to further testing) also have costs. The
ultimate decisions that the technicians make are based on the quality of the signal (clarity of the image), their
experience and training (the ability to recognize certain shapes and textures of tumours), and their best guesses about
the relative costs of misses versus false alarms.
Although we have focused to this point on the absolute threshold, a second important criterion concerns the ability
to assess differences between stimuli. The difference threshold (or just noticeable difference [JND]), refers to
the change in a stimulus that can just barely be detected by the organism. The German physiologist Ernst Weber
(1795-1878) made an important discovery about the JND — namely, that the ability to detect differences depends
not so much on the size of the difference but on the size of the difference in relation to the absolute size of the
stimulus. Weber’s law maintains that the just noticeable difference of a stimulus is a constant proportion of the
original intensity of the stimulus. As an example, if you have a cup of coffee that has only a very little bit of sugar
in it (say one teaspoon), adding another teaspoon of sugar will make a big difference in taste. But if you added that
same teaspoon to a cup of coffee that already had five teaspoons of sugar in it, then you probably wouldn’t taste
the difference as much (in fact, according to Weber’s law, you would have to add five more teaspoons to make the
same difference in taste).
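The coffee example can be written as a one-line calculation. In the sketch below, the Weber fraction is set to 1 purely so that the numbers match the example above; real Weber fractions for taste are much smaller and must be measured empirically.

```python
def just_noticeable_difference(intensity, weber_fraction):
    """Weber's law: the JND is a constant proportion of the original stimulus intensity."""
    return weber_fraction * intensity

# A Weber fraction of 1.0 is chosen only so the numbers match the coffee example above.
k = 1.0

for teaspoons in (1, 5):
    jnd = just_noticeable_difference(teaspoons, k)
    print(f"{teaspoons} teaspoon(s) already in the cup: about {jnd:.0f} more must be added to taste a difference.")
```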
One interesting application of Weber’s law is in our everyday shopping behaviour. Our tendency to perceive cost
differences between products is dependent not only on the amount of money we will spend or save, but also on the
amount of money saved relative to the price of the purchase. For example, if you were about to buy a soda or candy
bar in a convenience store, and the price of the items ranged from $1 to $3, you would likely think that the $3 item
cost “a lot more” than the $1 item. But now imagine that you were comparing between two music systems, one that
cost $397 and one that cost $399. Probably you would think that the cost of the two systems was “about the same,”
even though buying the cheaper one would still save you $2.
Research Focus: Influence without Awareness
If you study Figure 5.5, “Absolute Threshold,” you will see that the absolute threshold is the point where
we become aware of a faint stimulus. After that point, we say that the stimulus is conscious because we
can accurately report on its existence (or its nonexistence) more than 50% of the time. But can subliminal
stimuli (events that occur below the absolute threshold and of which we are not conscious) have an influence
on our behaviour?
A variety of research programs have found that subliminal stimuli can influence our judgments and
behaviour, at least in the short term (Dijksterhuis, 2010). But whether the presentation of subliminal stimuli
can influence the products that we buy has been a more controversial topic in psychology.

Figure 5.5 Absolute Threshold. As the intensity of a stimulus increases, we are more likely to perceive it. Stimuli below the absolute threshold can still have at least some influence on us, even though we cannot consciously detect them.

In one relevant experiment, Karremans, Stroebe, and Claus (2006) had Dutch college students view a series of computer
trials in which a string of letters such as BBBBBBBBB or BBBbBBBBB were presented on the screen. To
be sure they paid attention to the display, the students were asked to note whether the strings contained a
small b. However, immediately before each of the letter strings, the researchers presented either the name
of a drink that is popular in Holland (Lipton Ice) or a control string containing the same letters as Lipton
Ice (NpeicTol). These words were presented so quickly (for only about one-fiftieth of a second) that the
participants could not see them.
Then the students were asked to indicate their intention to drink Lipton Ice by answering questions such as
“If you would sit on a terrace now, how likely is it that you would order Lipton Ice,” and also to indicate
how thirsty they were at the time. The researchers found that the students who had been exposed to the
“Lipton Ice” words (and particularly those who indicated that they were already thirsty) were significantly
more likely to say that they would drink Lipton Ice than were those who had been exposed to the control
words.
If they were effective, procedures such as this (we can call the technique “subliminal advertising” because it
advertises a product outside awareness) would have some major advantages for advertisers, because it would
allow them to promote their products without directly interrupting the consumers’ activity and without the
consumers’ knowing they are being persuaded. People cannot counterargue with, or attempt to avoid being
influenced by, messages received outside awareness. Due to fears that people may be influenced without
their knowing, subliminal advertising has been banned in many countries, including Australia, Canada, Great
Britain, the United States, and Russia.
Although subliminal stimuli have been shown to influence behaviour in some laboratory research, the effectiveness of subliminal advertising itself is doubtful. Charles Trappey (1996) conducted a meta-analysis in which he combined 23 leading research
studies that had tested the influence of subliminal advertising on consumer choice. The results showed that
subliminal advertising had a negligible effect on consumer choice. Saegert (1987, p. 107) concluded that
“marketing should quit giving subliminal advertising the benefit of the doubt,” arguing that the influences of
subliminal stimuli are usually so weak that they are normally overshadowed by the person’s own decision
making about the behaviour.
Taken together then, the evidence for the effectiveness of subliminal advertising is weak, and its effects may
be limited to only some people and in only some conditions. You probably don’t have to worry too much
about being subliminally persuaded in your everyday life, even if subliminal ads are allowed in your country.
But even if subliminal advertising is not all that effective itself, there are plenty of other indirect advertising
techniques that are used and that do work. For instance, many ads for automobiles and alcoholic beverages
are subtly sexualized, which encourages the consumer to indirectly (even if not subliminally) associate these
products with sexuality. And there is the ever more frequent “product placement” technique, where images
of brands (cars, sodas, electronics, and so forth) are placed on websites and in popular television shows
and movies. Harris, Bargh, & Brownell (2009) found that being exposed to food advertising on television
significantly increased child and adult snacking behaviours, again suggesting that the effects of perceived
images, even if presented above the absolute threshold, may nevertheless be very subtle.
Another example of processing that occurs outside our awareness is seen when certain areas of the visual cortex
are damaged, causing blindsight, a condition in which people are unable to consciously report on visual stimuli
but nevertheless are able to accurately answer questions about what they are seeing. When people with blindsight
are asked directly what stimuli look like, or to determine whether these stimuli are present at all, they cannot do
so at better than chance levels. They report that they cannot see anything. However, when they are asked more
indirect questions, they are able to give correct answers. For example, people with blindsight are able to correctly
determine an object’s location and direction of movement, as well as identify simple geometrical forms and patterns
(Weiskrantz, 1997). It seems that although conscious reports of the visual experiences are not possible, there is still
a parallel and implicit process at work, enabling people to perceive certain aspects of the stimuli.
Key Takeaways
• Sensation is the process of receiving information from the environment through our sensory
organs. Perception is the process of interpreting and organizing the incoming information so that
we can understand it and react accordingly.
• Transduction is the conversion of stimuli detected by receptor cells to electrical impulses that are
transported to the brain.
• Although our experiences of the world are rich and complex, humans — like all species — have
their own adapted sensory strengths and sensory limitations.
• Sensation and perception work together in a fluid, continuous process.
• Our judgments in detection tasks are influenced by both the absolute threshold of the signal as
well as our current motivations and experiences. Signal detection analysis is used to differentiate
sensitivity from response biases.
• The difference threshold, or just noticeable difference, is the ability to detect the smallest change
in a stimulus about 50% of the time. According to Weber’s law, the just noticeable difference
increases in proportion to the total intensity of the stimulus.
• Research has found that stimuli can influence behaviour even when they are presented below the
absolute threshold (i.e., subliminally). The effectiveness of subliminal advertising, however, has
not been shown to be of large magnitude.
Exercises and Critical Thinking
1. Leaf through a magazine or watch several advertisements on television and pay attention to the
persuasive techniques being used. What impact are these ads having on your senses? Based on
what you know about psychophysics, sensation, and perception, what are some of the reasons why
subliminal advertising might be banned in some countries?
2. If we pick up two letters, one that weighs one ounce and one that weighs two ounces, we can
notice the difference. But if we pick up two packages, one that weighs three pounds one ounce,
and one that weighs three pounds two ounces, we can’t tell the difference. Why?
3. Take a moment and lie down quietly in your bedroom. Notice the variety and levels of what
you can see, hear, and feel. Does this experience help you understand the idea of the absolute
threshold?
References
Dijksterhuis, A. (2010). Automaticity and the unconscious. In S. T. Fiske, D. T. Gilbert, & G. Lindzey
(Eds.), Handbook of social psychology (5th ed., Vol. 1, pp. 228–267). Hoboken, NJ: John Wiley & Sons.
Galanter, E. (1962). Contemporary Psychophysics. In R. Brown, E. Galanter, E. H. Hess, & G. Mandler (Eds.), New
directions in psychology. New York, NY: Holt, Rinehart and Winston.
Harris, J. L., Bargh, J. A., & Brownell, K. D. (2009). Priming effects of television food advertising on eating
behavior. Health Psychology, 28(4), 404–413.
Karremans, J. C., Stroebe, W., & Claus, J. (2006). Beyond Vicary’s fantasies: The impact of subliminal priming and
brand choice. Journal of Experimental Social Psychology, 42(6), 792–798.
Macmillan, N. A., & Creelman, C. D. (2005). Detection theory: A user’s guide (2nd ed). Mahwah, NJ: Lawrence
Erlbaum Associates.
Saegert, J. (1987). Why marketing should quit giving subliminal advertising the benefit of the doubt. Psychology
and Marketing, 4(2), 107–120.
Stoffregen, T. A., & Bardy, B. G. (2001). On specification and the senses. Behavioral and Brain Sciences, 24(2),
195–261.
Trappey, C. (1996). A meta-analysis of consumer choice and subliminal advertising. Psychology and Marketing,
13, 517–530.
Weiskrantz, L. (1997). Consciousness lost and found: A neuropsychological exploration. New York, NY: Oxford
University Press.
Wickens, T. D. (2002). Elementary signal detection theory. New York, NY: Oxford University Press.
Image Attributions
Figure 5.2: Police officer with sniffer dog by Harald Dettenborn, http://commons.wikimedia.org/wiki/
File:Msc2010_dett_0036.jpg used under CC BY 3.0 license(http://creativecommons.org/licenses/by/3.0/de/
deed.en).
Figure 5.3: Adapted from Fatal Light Awareness Program. (2008), http://www.flap.org/research.htm.
5.2 Seeing
Learning Objectives
1. Identify the key structures of the eye and the role they play in vision.
2. Summarize how the eye and the visual cortex work together to sense and perceive the visual
stimuli in the environment, including processing colours, shape, depth, and motion.
Whereas other animals rely primarily on hearing, smell, or touch to understand the world around them, human
beings rely in large part on vision. A large part of our cerebral cortex is devoted to seeing, and we have substantial
visual skills. Seeing begins when light falls on the eyes, initiating the process of transduction. Once this visual
information reaches the visual cortex, it is processed by a variety of neurons that detect colours, shapes, and motion,
and that create meaningful perceptions out of the incoming stimuli.
The air around us is filled with a sea of electromagnetic energy: pulses of energy waves that can carry information
from place to place. As you can see in Figure 5.6, “The Electromagnetic Spectrum,” electromagnetic waves vary
in their wavelength — the distance between one wave peak and the next wave peak — with the shortest gamma
waves being only a fraction of a millimeter in length and the longest radio waves being hundreds of kilometers long.
Humans are blind to almost all of this energy — our eyes detect only the range from about 400 to 700 billionths of
a meter, the part of the electromagnetic spectrum known as the visible spectrum.
The Sensing Eye and the Perceiving Visual Cortex
As you can see in Figure 5.7, “Anatomy of the Human Eye,” light enters the eye through the cornea, a clear
covering that protects the eye and begins to focus the incoming light. The light then passes through the pupil, a
small opening in the centre of the eye. The pupil is surrounded by the iris, the coloured part of the eye that controls
the size of the pupil by constricting or dilating in response to light intensity. When we enter a dark movie theatre on
a sunny day, for instance, muscles in the iris open the pupil and allow more light to enter. Complete adaptation to
the dark may take up to 20 minutes.
Behind the pupil is the lens, a structure that focuses the incoming light on the retina, the layer of tissue at the back
of the eye that contains photoreceptor cells. As our eyes move from near objects to distant objects, a process known
as visual accommodation occurs. Visual accommodation is the process of changing the curvature of the lens to
keep the light entering the eye focused on the retina. Rays from the top of the image strike the bottom of the retina
and vice versa, and rays from the left side of the image strike the right part of the retina and vice versa, causing the
image on the retina to be upside down and backward. Furthermore, the image projected on the retina is flat, and yet
our final perception of the image will be three dimensional.
Figure 5.7 Anatomy of the Human Eye. Light enters the eye through the transparent cornea, passing through the pupil at the centre of the iris. The lens adjusts to focus the light on the retina, where it appears upside down and backward. Receptor cells on the retina send information via the optic nerve to the visual cortex.

Accommodation is not always perfect (Figure 5.8). If the focus falls in front of the retina, we say that the person is nearsighted, and when the focus falls behind the retina, we say that the person is farsighted. Eyeglasses and contact lenses correct this problem by adding another lens in front of the eye, and laser eye surgery corrects the problem by reshaping the cornea.
Figure 5.8 Normal, Nearsighted, and Farsighted Eyes. For people with normal vision (left), the lens properly focuses incoming light on the retina.
For people who are nearsighted (centre), images from far objects focus too far in front of the retina, whereas for people who are farsighted (right),
images from near objects focus too far behind the retina. Eyeglasses solve the problem by adding a secondary, corrective lens.
The retina contains layers of neurons specialized to respond to light (see Figure 5.9, “The Retina with Its Specialized
Cells”). As light falls on the retina, it first activates receptor cells known as rods and cones. The activation of these
cells then spreads to the bipolar cells and then to the ganglion cells, which gather together and converge, like the
strands of a rope, forming the optic nerve. The optic nerve is a collection of millions of ganglion neurons that sends
vast amounts of visual information, via the thalamus, to the brain. Because the retina and the optic nerve are active
processors and analyzers of visual information, it is appropriate to think of these structures as an extension of the
brain itself.
Figure 5.9 The Retina with Its Specialized Cells. When light falls on the retina, it creates a
photochemical reaction in the rods and cones at the back of the retina. The reactions then continue
to the bipolar cells, the ganglion cells, and eventually to the optic nerve.
Rods are visual neurons that specialize in detecting black, white, and gray colours. There are about 120 million rods
in each eye. The rods do not provide a lot of detail about the images we see, but because they are highly sensitive to
shorter-waved (darker) and weak light, they help us see in dim light — for instance, at night. Because the rods are
located primarily around the edges of the retina, they are particularly active in peripheral vision (when you need to
see something at night, try looking away from what you want to see). Cones are visual neurons that are specialized
in detecting fine detail and colours. The five million or so cones in each eye enable us to see in colour, but they
operate best in bright light. The cones are located primarily in and around the fovea, which is the central point of
the retina.
To demonstrate the difference between rods and cones in attention to detail, choose a word in this text and focus
on it. Do you notice that the words a few inches to the side seem more blurred? This is because the word you
are focusing on strikes the detail-oriented cones, while the words surrounding it strike the less-detail-oriented rods,
which are located on the periphery.
Margaret Livingstone (2000) (Figure 5.10) found an interesting effect that demonstrates the different processing
capacities of the eye’s rods and cones — namely, that the Mona Lisa’s smile, which is widely referred to as
“elusive,” is perceived differently depending on how one looks at the painting. Because Leonardo da Vinci painted
the smile in low-detail brush strokes, these details are better perceived by our peripheral vision (the rods) than by
the cones. Livingstone found that people rated the Mona Lisa as more cheerful when they were instructed to focus
on her eyes than they did when they were asked to look directly at her mouth. As Livingstone put it, “She smiles
until you look at her mouth, and then it fades, like a dim star that disappears when you look directly at it.”
Figure 5.10 Mona Lisa’s Smile.

As you can see in Figure 5.11, “Pathway of Visual Images through the Thalamus and into the Visual Cortex,” the sensory information received by the retina is relayed through the thalamus to corresponding areas in the visual
cortex, which is located in the occipital lobe at the back of the brain. Although the principle of contralateral control
might lead you to expect that the left eye would send information to the right brain hemisphere and vice versa, nature
is smarter than that. In fact, the left and right eyes each send information to both the left and the right hemisphere,
and the visual cortex processes each of the cues separately and in parallel. This is an adaptational advantage to an
organism that loses sight in one eye, because even if only one eye is functional, both hemispheres will still receive
input from it.
The visual cortex is made up of specialized neurons that turn the sensations they receive from the optic nerve into
meaningful images. Because there are no photoreceptor cells at the place where the optic nerve leaves the retina, a
hole or blind spot in our vision is created (see Figure 5.12, “Blind Spot Demonstration”). When both of our eyes
are open, we don’t experience a problem because our eyes are constantly moving, and one eye makes up for what
the other eye misses. But the visual system is also designed to deal with this problem if only one eye is open — the
visual cortex simply fills in the small hole in our vision with similar patterns from the surrounding areas, and we
never notice the difference. The ability of the visual system to cope with the blind spot is another example of how
sensation and perception work together to create meaningful experience.
Perception is created in part through the simultaneous action of thousands of feature detector neurons —
specialized neurons, located in the visual cortex, that respond to the strength, angles, shapes, edges, and movements
of a visual stimulus (Kelsey, 1997; Livingstone & Hubel, 1988). The feature detectors work in parallel, each
performing a specialized function. When faced with a red square, for instance, the parallel line feature detectors,
the horizontal line feature detectors, and the red colour feature detectors all become activated. This activation is
then passed on to other parts of the visual cortex, where other neurons compare the information supplied by the
feature detectors with images stored in memory. Suddenly, in a flash of recognition, the many neurons fire together, creating the single image of the red square that we experience (Rodriguez et al., 1999). See Figure 5.13 for an explanation about the Necker cube.

Figure 5.11 Pathway of Visual Images through the Thalamus and into the Visual Cortex. The left and right eyes each send information to both the left and the right brain hemisphere.

Figure 5.12 Blind Spot Demonstration. You can get an idea of the extent of your blind spot (the place where the optic nerve leaves the retina) by trying this: close your left eye and stare with your right eye at the cross in the diagram. You should be able to see the elephant image to the right (don’t look at it, just notice that it is there). If you can’t see the elephant, move closer or farther away until you can. Now slowly move so that you are closer to the image while you keep looking at the cross. At one distance (probably a foot or so), the elephant will completely disappear from view because its image has fallen on the blind spot.
Some feature detectors are tuned to selectively respond to particularly important objects, such as faces, smiles,
and other parts of the body (Downing, Jiang, Shuman, & Kanwisher, 2001; Haxby et al., 2001). When researchers
disrupted face recognition areas of the cortex using the magnetic pulses of transcranial magnetic stimulation (TMS),
people were temporarily unable to recognize faces, and yet they were still able to recognize houses (McKone,
Kanwisher, & Duchaine, 2007; Pitcher, Walsh, Yovel, & Duchaine, 2007).
Figure 5.13 The Necker Cube. The Necker cube is an example
of how the visual system creates perceptions out of sensations.
We do not see a series of lines but, rather, a cube. Which
cube we see varies depending on the momentary outcome of
perceptual processes in the visual cortex.
Perceiving Colour
It has been estimated that the human visual system can detect and discriminate among seven million colour
variations (Geldard, 1972), but these variations are all created by the combinations of the three primary colours: red,
green, and blue. The shade of a colour, known as hue, is conveyed by the wavelength of the light that enters the eye
(we see shorter wavelengths as more blue and longer wavelengths as more red), and we detect brightness from the
intensity or height of the wave (bigger or more intense waves are perceived as brighter), as shown in Figure 5.14.
In his important research on colour vision, Hermann von Helmholtz (1821-1894) theorized that colour is perceived
because the cones in the retina come in three types. One type of cone reacts primarily to blue light (short
wavelengths), another reacts primarily to green light (medium wavelengths), and a third reacts primarily to red light
(long wavelengths). The visual cortex then detects and compares the strength of the signals from each of the three
types of cones, creating the experience of colour. According to the Young-Helmholtz trichromatic colour theory, what colour we see depends on the mix of the signals from the three types of cones. If the brain is receiving primarily
red and blue signals, for instance, it will perceive purple; if it is receiving primarily red and green signals it will
perceive yellow; and if it is receiving messages from all three types of cones it will perceive white.
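The comparison the theory describes can be illustrated with a deliberately simplified toy model that applies only the colour mappings mentioned in this paragraph. It is an illustration of the idea, not a model of actual colour vision, since real cone responses overlap broadly across wavelengths.

```python
def perceived_colour(red, green, blue, threshold=0.5):
    """Toy version of trichromatic coding: the outcome depends on which cone types
    are sending strong signals (inputs are relative signal strengths from 0 to 1)."""
    strong = {name for name, signal in
              (("red", red), ("green", green), ("blue", blue)) if signal >= threshold}
    if strong == {"red", "green", "blue"}:
        return "white"
    if strong == {"red", "blue"}:
        return "purple"
    if strong == {"red", "green"}:
        return "yellow"
    return " and ".join(sorted(strong)) or "black"

print(perceived_colour(0.9, 0.1, 0.8))  # strong red + blue signals  -> purple
print(perceived_colour(0.9, 0.9, 0.1))  # strong red + green signals -> yellow
print(perceived_colour(0.9, 0.9, 0.9))  # all three cone types       -> white
```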
The different functions of the three types of cones are apparent in people who experience colour blindness — the
inability to detect green and/or red colours. About one in 50 people, mostly men, lack functioning in the red- or
green-sensitive cones, leaving them only able to experience either one or two colours (Figure 5.15).
The trichromatic colour theory cannot explain all of human vision, however. For one, although the colour purple
does appear to us as a mix of red and blue, yellow does not appear to be a mix of red and green. And people with
colour blindness, who cannot see either green or red, nevertheless can still see yellow. An alternative approach to
the Young-Helmholtz theory, known as the opponent-process colour theory, proposes that we analyze sensory
information not in terms of three colours but rather in three sets of “opponent colours”: red-green, yellow-blue,
and white-black. Evidence for the opponent-process theory comes from the fact that some neurons in the retina and
in the visual cortex are excited by one colour (e.g., red) but inhibited by another colour (e.g., green).
Figure 5.14 Low- and High-Frequency Sine Waves and Low- and High-Intensity Sine
Waves and Their Corresponding Colours. Light waves with shorter frequencies are
perceived as more blue than red; light waves with higher intensity are seen as brighter.
Figure 5.15 Colour Blindness. People with normal colour vision can see the number 42 in the first
image and the number 12 in the second (they are vague but apparent). However, people who are
colour blind cannot see the numbers at all.
One example of opponent processing occurs in the experience of an afterimage. If you stare at the shape on the top
left side of Figure 5.16, “Afterimages,” for about 30 seconds (the longer you look, the better the effect), and then
move your eyes to the blank area to the right of it, you will see the afterimage. Now try this by staring at the image
of the Italian flag below and then shifting your eyes to the blank area beside it. When we stare at the green stripe,
our green receptors habituate and begin to process less strongly, whereas the red receptors remain at full strength.
When we switch our gaze, we see primarily the red part of the opponent process. Similar processes create blue after
yellow and white after black.
Figure 5.16 Afterimages.
The trichromatic and the opponent-process mechanisms work together to produce colour vision. When light rays enter
the eye, the red, blue, and green cones on the retina respond in different degrees and send different strength signals
of red, blue, and green through the optic nerve. The colour signals are then processed both by the ganglion cells and
by the neurons in the visual cortex (Gegenfurtner & Kiper, 2003).
Perceiving Form
One of the important processes required in vision is the perception of form. German psychologists in the 1930s
and 1940s, including Max Wertheimer (1880-1943), Kurt Koffka (1886-1941), and Wolfgang Köhler (1887-1967),
argued that we create forms out of their component sensations based on the idea of the gestalt, a meaningfully
organized whole. The idea of the gestalt is that the “whole is more than the sum of its parts.” Some examples of how
gestalt principles lead us to see more than what is actually there are summarized in Table 5.1, “Summary of Gestalt
Principles of Form Perception.”
Table 5.1 Summary of Gestalt Principles of Form Perception.

Figure and ground: We structure input so that we always see a figure (image) against a ground (background). At right, you may see a vase or you may see two faces, but in either case, you will organize the image as a figure against a ground.

Similarity: Stimuli that are similar to each other tend to be grouped together. You are more likely to see three similar columns among the XYX characters at right than you are to see four rows.

Proximity: We tend to group nearby figures together. Do you see four or eight images at right? Principles of proximity suggest that you might see only four.

Continuity: We tend to perceive stimuli in smooth, continuous ways rather than in more discontinuous ways. At right, most people see a line of dots that moves from the lower left to the upper right, rather than a line that moves from the left and then suddenly turns down. The principle of continuity leads us to see most lines as following the smoothest possible path.

Closure: We tend to fill in gaps in an incomplete image to create a complete, whole object. Closure leads us to see a single spherical object at right rather than a set of unrelated cones.
Perceiving Depth
Depth perception is the ability to perceive three-dimensional space and to accurately judge distance. Without
depth perception, we would be unable to drive a car, thread a needle, or simply navigate our way around the
supermarket (Howard & Rogers, 2001). Research has found that depth perception is in part based on innate
capacities and in part learned through experience (Witherington, 2005).
Psychologists Eleanor Gibson and Richard Walk (1960) tested the ability to perceive depth in six- to 14-month-old
infants by placing them on a visual cliff, a mechanism that gives the perception of a dangerous drop-off, in which
infants can be safely tested for their perception of depth (Figure 5.17 “Visual Cliff”). The infants were placed on one
side of the “cliff,” while their mothers called to them from the other side. Gibson and Walk found that most infants
either crawled away from the cliff or remained on the board and cried because they wanted to go to their mothers,
but the infants perceived a chasm that they instinctively could not cross. Further research has found that even very
young children who cannot yet crawl are fearful of heights (Campos, Langer, & Krowitz, 1970). On the other hand,
studies have also found that infants improve their hand-eye coordination as they learn to better grasp objects and as
they gain more experience in crawling, indicating that depth perception is also learned (Adolph, 2000).
Depth perception is the result of our use of depth cues, messages from our bodies and the external environment
that supply us with information about space and distance. Binocular depth cues are depth cues that are created by
retinal image disparity — that is, the space between our eyes — and which thus require the coordination of both
eyes. One outcome of retinal disparity is that the images projected on each eye are slightly different from each other.
The visual cortex automatically merges the two images into one, enabling us to perceive depth. Three-dimensional
movies make use of retinal disparity by using 3-D glasses that the viewer wears to create a different image on each
eye. The perceptual system quickly, easily, and unconsciously turns the disparity into 3-D.
Figure 5.17 Visual Cliff. Babies appear to have the innate ability to perceive depth, as seen by this
baby’s reluctance to cross the “visual cliff.”
An important binocular depth cue is convergence, the inward turning of our eyes that is required to focus on objects
that are less than about 50 feet away from us. The visual cortex uses the size of the convergence angle between the
eyes to judge the object’s distance. You will be able to feel your eyes converging if you slowly bring a finger closer
to your nose while continuing to focus on it. When you close one eye, you no longer feel the tension—convergence
is a binocular depth cue that requires both eyes to work.
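The geometry behind this cue can be sketched with a little trigonometry. The numbers below are illustrative assumptions rather than values from the text (an eye separation of about 6.3 cm is a typical adult figure); they show why the convergence angle changes rapidly for near objects but barely at all beyond several metres, which is why convergence is informative mainly at close range.

```python
import math

def convergence_angle_degrees(distance_m, eye_separation_m=0.063):
    """Total inward rotation (in degrees) needed for both eyes to fixate an object
    straight ahead at the given distance. The 6.3 cm eye separation is a typical
    adult value, assumed here purely for illustration."""
    return math.degrees(2 * math.atan(eye_separation_m / (2 * distance_m)))

for d in (0.25, 1.0, 5.0, 15.0):
    print(f"Object at {d:5.2f} m: convergence angle ~ {convergence_angle_degrees(d):.2f} degrees")
```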
The visual system also uses accommodation to help determine depth. As the lens changes its curvature to focus on
distant or close objects, information relayed from the muscles attached to the lens helps us determine an object’s
distance. Accommodation is only effective at short viewing distances, however, so while it comes in handy when
threading a needle or tying shoelaces, it is far less effective when driving or playing sports.
Although the best cues to depth occur when both eyes work together, we are able to see depth even with one eye
closed. Monocular depth cues are depth cues that help us perceive depth using only one eye (Sekuler & Blake,
2006). Some of the most important are summarized in Table 5.2, “Monocular Depth Cues That Help Us Judge
Depth at a Distance.”
Table 5.2 Monocular Depth Cues That Help Us Judge Depth at a Distance.

Position: We tend to see objects higher up in our field of vision as farther away. The fence posts at right appear farther away not only because they become smaller but also because they appear higher up in the picture.

Relative size: Assuming that the objects in a scene are the same size, smaller objects are perceived as farther away. At right, the cars in the distance appear smaller than those nearer to us.

Linear perspective: Parallel lines appear to converge at a distance. We know that the tracks at right are parallel. When they appear closer together, we determine they are farther away.

Light and shadow: The eye receives more reflected light from objects that are closer to us. Normally, light comes from above, so darker images are in shadow. We see the images at right as extending and indented according to their shadowing. If we invert the picture, the images will reverse.

Interposition: When one object overlaps another object, we view it as closer. At right, because the blue star covers the pink bar, it is seen as closer than the yellow moon.

Aerial perspective: Objects that appear hazy, or that are covered with smog or dust, appear farther away. The artist who painted the picture on the right used aerial perspective to make the clouds more hazy and thus appear farther away.
Perceiving Motion
Many animals, including human beings, have very sophisticated perceptual skills that allow them to coordinate their
own motion with the motion of moving objects in order to create a collision with that object. Bats and birds use this
mechanism to catch up with prey, dogs use it to catch a Frisbee, and humans use it to catch a moving football. The
brain detects motion partly from the changing size of an image on the retina (objects that look bigger are usually
closer to us) and in part from the relative brightness of objects.
We also experience motion when objects near each other change their appearance. The beta effect refers to the
perception of motion that occurs when different images are presented next to each other in succession (see “Beta
Effect and Phi Phenomenon”). The visual cortex fills in the missing part of the motion and we see the object moving.
The beta effect is used in movies to create the experience of motion. A related effect is the phi phenomenon, in
which we perceive a sensation of motion caused by the appearance and disappearance of objects that are near
each other. The phi phenomenon looks like a moving zone or cloud of background colour surrounding the flashing
objects. The beta effect and the phi phenomenon are other examples of the importance of the gestalt—our tendency
to “see more than the sum of the parts.”
Beta Effect and Phi Phenomenon
In the beta effect, our eyes detect motion from a series of still images, each with the object in a different
place. This is the fundamental mechanism of motion pictures (movies). In the phi phenomenon, the
perception of motion is based on the momentary hiding of an image.
Phi phenomenon: http://upload.wikimedia.org/wikipedia/commons/6/6e/Lilac-Chaser.gif
Beta effect: http://upload.wikimedia.org/wikipedia/commons/0/09/Phi_phenomenom_no_watermark.gif
Key Takeaways
• Vision is the process of detecting the electromagnetic energy that surrounds us. Only a small
fraction of the electromagnetic spectrum is visible to humans.
• The visual receptor cells on the retina detect shape, colour, motion, and depth.
• Light enters the eye through the transparent cornea and passes through the pupil at the centre of
the iris. The lens adjusts to focus the light on the retina, where it appears upside down and
backward. Receptor cells on the retina are excited or inhibited by the light and send information to
the visual cortex through the optic nerve.
• The retina has two types of photoreceptor cells: rods, which detect brightness and respond to
black and white, and cones, which respond to red, green, and blue. Colour blindness occurs when
people lack function in the red- or green-sensitive cones.
• Feature detector neurons in the visual cortex help us recognize objects, and some neurons respond
selectively to faces and other body parts.
• The Young-Helmholtz trichromatic colour theory proposes that colour perception is the result of
the signals sent by the three types of cones, whereas the opponent-process colour theory proposes
that we perceive colour as three sets of opponent colours: red-green, yellow-blue, and whiteblack.
• The ability to perceive depth occurs as the result of binocular and monocular depth cues.
• Motion is perceived as a function of the size and brightness of objects. The beta effect and the phi
phenomenon are examples of perceived motion.
Exercises and Critical Thinking
1. Consider some ways that the processes of visual perception help you engage in an everyday
activity, such as driving a car or riding a bicycle.
2. Imagine for a moment what your life would be like if you couldn’t see. Do you think you
would be able to compensate for your loss of sight by using other senses?
References
Adolph, K. E. (2000). Specificity of learning: Why infants fall over a veritable cliff. Psychological Science, 11(4),
290–295.
Campos, J. J., Langer, A., & Krowitz, A. (1970). Cardiac responses on the visual cliff in prelocomotor human
infants. Science, 170(3954), 196–197.
Downing, P. E., Jiang, Y., Shuman, M., & Kanwisher, N. (2001). A cortical area selective for visual processing of
the human body. Science, 293(5539), 2470–2473.
Gegenfurtner, K. R., & Kiper, D. C. (2003). Color vision. Annual Review of Neuroscience, 26, 181–206.
Geldard, F. A. (1972). The human senses (2nd ed.). New York, NY: John Wiley & Sons.
Gibson, E. J., & Walk, R. D. (1960). The “visual cliff.” Scientific American, 202(4), 64–71.
Haxby, J. V., Gobbini, M. I., Furey, M. L., Ishai, A., Schouten, J. L., & Pietrini, P. (2001). Distributed and
overlapping representations of faces and objects in ventral temporal cortex. Science, 293(5539), 2425–2430.
Howard, I. P., & Rogers, B. J. (2001). Seeing in depth: Basic mechanisms (Vol. 1). Toronto, ON: Porteous.
Kelsey, C.A. (1997). Detection of visual information. In W. R. Hendee & P. N. T. Wells (Eds.), The perception of
visual information (2nd ed.). New York, NY: Springer Verlag.
Livingstone, M., & Hubel, D. (1988). Segregation of form, color, movement, and depth: Anatomy, physiology, and perception. Science, 240, 740–749.
Livingstone M. S. (2000). Is it warm? Is it real? Or just low spatial frequency? Science, 290, 1299.
McKone, E., Kanwisher, N., & Duchaine, B. C. (2007). Can generic expertise explain special processing for
faces? Trends in Cognitive Sciences, 11, 8–15.
Pitcher, D., Walsh, V., Yovel, G., & Duchaine, B. (2007). TMS evidence for the involvement of the right occipital
face area in early face processing. Current Biology, 17, 1568–1573.
Rodriguez, E., George, N., Lachaux, J.-P., Martinerie, J., Renault, B., & Varela, F. J. (1999). Perception’s shadow:
Long-distance synchronization of human brain activity. Nature, 397(6718), 430–433.
Sekuler, R., & Blake, R. (2006). Perception (5th ed.). New York, NY: McGraw-Hill.
Witherington, D. C. (2005). The development of prospective grasping control between 5 and 7 months: A
longitudinal study. Infancy, 7(2), 143–161.
Image Attributions
Figure 5.10: Mona Lisa detail face (http://commons.wikimedia.org/wiki/File:Mona_Lisa_detail_face.jpg) is in the
public domain.
Figure 5.15: Ishihara Plate No.11 (http://commons.wikimedia.org/wiki/File:Ishihara_11.PNG) and Ishihara Plate No.23 (http://commons.wikimedia.org/wiki/File:Ishihara_23.PNG) are in the public domain.
Figure 5.16: Nachbild by Freddy2001 (http://commons.wikimedia.org/wiki/File:Nachbild-1.svg) and Italian Flag Inverted by Pcessna (http://commons.wikimedia.org/wiki/File:ItalianFlagInverted.gif) are in the public domain.
Figure 5.17: Perception-Conception (http://perception-connection.wikispaces.com/3)+Key+Findings) used with
CC-BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0/).
5.3 Hearing
Learning Objectives
1. Draw a picture of the ear, label its key structures and functions, and describe the role they play
in hearing.
2. Describe the process of transduction in hearing.
Like vision and all the other senses, hearing begins with transduction. Sound waves that are collected by our ears
are converted into neural impulses, which are sent to the brain where they are integrated with past experience and
interpreted as the sounds we experience. The human ear is sensitive to a wide range of sounds, from the faint tick
of a clock in a nearby room to the roar of a rock band at a nightclub, and we have the ability to detect very small
variations in sound. But the ear is particularly sensitive to sounds in the same frequency as the human voice. A
mother can pick out her child’s voice from a host of others, and when we pick up the phone we quickly recognize
a familiar voice. In a fraction of a second, our auditory system receives the sound waves, transmits them to the
auditory cortex, compares them to stored knowledge of other voices, and identifies the caller.
The Ear
Just as the eye detects light waves, the ear detects sound waves. Vibrating objects (such as the human vocal cords
or guitar strings) cause air molecules to bump into each other and produce sound waves, which travel from their
source as peaks and valleys, much like the ripples that expand outward when a stone is tossed into a pond. Unlike
light waves, which can travel in a vacuum, sound waves are carried within media such as air, water, or metal, and it
is the changes in pressure associated with these media that the ear detects.
As with light waves, we detect both the wavelength and the amplitude of sound waves. The wavelength of the sound
wave determines its frequency, the number of waves that arrive per second, which in turn determines our
perception of pitch, the perceived frequency of a sound. Longer sound waves have lower frequency and produce a
lower pitch, whereas shorter waves have higher frequency and a higher pitch.
The amplitude, or height of the sound wave, determines how much energy it contains and is perceived as
loudness (the degree of sound volume). Larger waves are perceived as louder. Loudness is measured using the unit
of relative loudness known as the decibel. Zero decibels represents the absolute threshold for human hearing, below
which we cannot hear a sound. Each increase of 10 decibels represents a tenfold increase in the loudness of the sound
(see Figure 5.18, “Sounds in Everyday Life”). The sound of a typical conversation (about 60 decibels) is 1,000
times louder than the sound of a faint whisper (30 decibels), whereas the sound of a jackhammer (130 decibels) is
10 billion times louder than the whisper.
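To see where these figures come from, the tenfold-per-10-decibel rule described above can be written as a simple ratio; the worked arithmetic below merely restates the chapter's own examples.

```latex
% Loudness ratio implied by a difference of (L_2 - L_1) decibels,
% using the tenfold-per-10-dB rule stated above:
\text{ratio} = 10^{(L_2 - L_1)/10}
% Conversation (60 dB) versus whisper (30 dB):
10^{(60 - 30)/10} = 10^{3} = 1{,}000
% Jackhammer (130 dB) versus whisper (30 dB):
10^{(130 - 30)/10} = 10^{10} = 10\ \text{billion}
```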
Figure 5.18 Sounds in Everyday Life. The human ear can comfortably hear sounds up to 80 decibels. Prolonged exposure to sounds above 80 decibels can cause hearing loss. [Long Description]

Audition begins in the pinna, the external and visible part of the ear, which is shaped like a funnel to draw in
sound waves and guide them into the auditory canal. At the end of the canal, the sound waves strike the tightly
stretched, highly sensitive membrane known as the tympanic membrane (or eardrum), which vibrates with the
waves. The resulting vibrations are relayed through the middle ear by three tiny bones, known as the ossicles —
the hammer (or malleus), anvil (or incus), and stirrup (or stapes) — to the cochlea, a snail-shaped, liquid-filled tube
in the inner ear that contains the cilia. The vibrations cause the oval window, the membrane covering the opening
of the cochlea, to vibrate, disturbing the fluid inside the cochlea (Figure 5.19).
The movements of the fluid in the cochlea bend the hair cells of the inner ear, in much the same way that a gust
of wind bends over wheat stalks in a field. The movements of the hair cells trigger nerve impulses in the attached
neurons, which are sent to the auditory nerve and then to the auditory cortex in the brain. The cochlea contains about
16,000 hair cells, each of which holds a bundle of fibres known as cilia on its tip. The cilia are so sensitive that they
can detect a movement that pushes them the width of a single atom. To put things in perspective, cilia swaying the
width of an atom is equivalent to the tip of the Eiffel Tower swaying half an inch (Corey et al., 2004).
Figure 5.19 The Human Ear. Sound waves enter the outer ear and are transmitted through the
auditory canal to the eardrum. The resulting vibrations are moved by the three small ossicles into
the cochlea, where they are detected by hair cells and sent to the auditory nerve.
Although loudness is directly determined by the number of hair cells that are vibrating, two different mechanisms
are used to detect pitch. The frequency theory of hearing proposes that whatever the pitch of a sound wave, nerve
impulses of a corresponding frequency will be sent to the auditory nerve. For example, a tone measuring 600 hertz
will be transduced into 600 nerve impulses a second. This theory has a problem with high-pitched sounds, however,
because the neurons cannot fire fast enough. To reach the necessary speed, the neurons work together in a sort of
volley system in which different neurons fire in sequence, allowing us to detect sounds up to about 4,000 hertz.
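As a rough illustration of the volley idea (this sketch is ours, not the textbook's, and the ceiling of 1,000 impulses per second per neuron is an assumed figure), a few neurons taking turns can jointly match a frequency that no single neuron could signal on its own.

```python
# Rough sketch of the volley principle: neurons fire in rotation so that
# their combined impulses keep pace with a tone's frequency. The ceiling
# of 1,000 impulses per second per neuron is an assumed illustrative value.

MAX_RATE_PER_NEURON = 1000  # impulses per second (assumed ceiling)

def neurons_needed(tone_hz: int) -> int:
    """Smallest number of neurons that, firing in turn, can together
    produce one impulse per cycle of the tone."""
    return -(-tone_hz // MAX_RATE_PER_NEURON)  # ceiling division

def volley_schedule(tone_hz: int, n_impulses: int = 8) -> list[int]:
    """Assign the first few impulses of a tone to neurons in rotation."""
    n = neurons_needed(tone_hz)
    return [impulse % n for impulse in range(n_impulses)]

print(neurons_needed(600))    # 1: a single neuron can follow a 600-Hz tone
print(neurons_needed(4000))   # 4: four neurons must take turns at 4,000 Hz
print(volley_schedule(4000))  # impulses rotate: [0, 1, 2, 3, 0, 1, 2, 3]
```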
Not only is frequency important, but location is critical as well. The cochlea relays information about the specific
area, or place, in the cochlea that is most activated by the incoming sound. The place theory of hearing proposes
that different areas of the cochlea respond to different frequencies. Higher tones excite areas closest to the opening
of the cochlea (near the oval window). Lower tones excite areas near the narrow tip of the cochlea, at the opposite
end. Pitch is therefore determined in part by the area of the cochlea firing the most frequently.
Just as having two eyes in slightly different positions allows us to perceive depth, so the fact that the ears are placed
on either side of the head enables us to benefit from stereophonic, or three-dimensional, hearing. If a sound occurs
on your left side, the left ear will receive the sound slightly sooner than the right ear, and the sound it receives will
be more intense, allowing you to quickly determine the location of the sound. Although the distance between our
two ears is only about six inches, and sound waves travel at 750 miles an hour, the time and intensity differences
are easily detected (Middlebrooks & Green, 1991). When a sound is equidistant from both ears, such as when it is
directly in front, behind, beneath, or overhead, we have more difficulty pinpointing its location. It is for this reason
that dogs (and people, too) tend to cock their heads when trying to pinpoint a sound, so that the ears receive slightly
different signals.
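A back-of-the-envelope calculation using the approximate figures above (ears about six inches apart, sound travelling at roughly 750 miles per hour) shows just how small the timing difference is that the auditory system resolves; the snippet below is illustrative arithmetic only, not a description of the neural mechanism.

```python
# Back-of-the-envelope interaural time difference, using the chapter's
# approximate figures: ears about six inches apart, sound at ~750 mph.

EAR_SEPARATION_M = 6 * 0.0254             # six inches in metres (~0.15 m)
SPEED_OF_SOUND_MS = 750 * 1609.34 / 3600  # 750 mph in metres per second (~335 m/s)

# Maximum extra travel time to the far ear for a sound arriving from one side.
max_delay_s = EAR_SEPARATION_M / SPEED_OF_SOUND_MS
print(f"Maximum interaural delay: {max_delay_s * 1000:.2f} ms")  # roughly 0.45 ms
```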
Hearing Loss
In 2006, 1,266,120 (5.0%) Canadians aged 15 and older reported having a hearing limitation. Over eight in 10
(83.2%) hearing limitations were mild in nature, while the remaining 16.8% were classified as severe (Statistics
Canada, 2006). Conductive hearing loss is caused by physical damage to the ear (such as to the eardrums or ossicles)
that reduces the ability of the ear to transfer vibrations from the outer ear to the inner ear. Sensorineural hearing
loss, which is caused by damage to the cilia or to the auditory nerve, is less common overall but frequently occurs
with age (Tennesen, 2007). The cilia are extremely fragile, and by the time we are 65 years old, we will have lost
40% of them, particularly those that respond to high-pitched sounds (Chisolm, Willott, & Lister, 2003).
Prolonged exposure to loud sounds will eventually create sensorineural hearing loss as the cilia are damaged by the
noise. People who constantly operate noisy machinery without using appropriate ear protection are at high risk of
hearing loss, as are people who listen to loud music on their headphones or who engage in noisy hobbies, such as
hunting or motorcycling. Sounds that are 85 decibels or more can cause damage to your hearing, particularly if you
are exposed to them repeatedly. Sounds of more than 130 decibels are dangerous even if you are exposed to them
infrequently. People who experience tinnitus (a ringing or a buzzing sensation) after being exposed to loud sounds
have very likely experienced some damage to their cilia. Taking precautions when being exposed to loud sounds is
important, as cilia do not grow back.
While conductive hearing loss can often be improved through hearing aids that amplify the sound, such aids are of
little help for sensorineural hearing loss. But if the auditory nerve is still intact, a cochlear implant may be used. A
cochlear implant is a device made up of a series of electrodes that are placed inside the cochlea. The device serves
to bypass the hair cells by stimulating the auditory nerve cells directly. The latest implants utilize place theory,
enabling different spots on the implant to respond to different pitches. The cochlear implant can help children
who would otherwise be deaf to hear. If the device is implanted early enough, these children can frequently learn to
speak, often as well as children born without hearing loss do (Dettman, Pinder, Briggs, Dowell, & Leigh, 2007;
Dorman & Wilson, 2004).
Key Takeaways
• Sound waves vibrating through media such as air, water, or metal are the stimulus energy that is
sensed by the ear.
• The hearing system is designed to assess frequency (pitch) and amplitude (loudness).
• Sound waves enter the outer ear (the pinna) and are sent to the eardrum via the auditory canal.
The resulting vibrations are relayed by the three ossicles, causing the oval window covering the
cochlea to vibrate. The vibrations are detected by the cilia (hair cells) and sent via the auditory
nerve to the auditory cortex.
• There are two theories as to how we perceive pitch: The frequency theory of hearing suggests that
as a sound wave’s pitch changes, nerve impulses of a corresponding frequency enter the auditory
nerve. The place theory of hearing suggests that we hear different pitches because different areas
of the cochlea respond to higher and lower pitches.
• Conductive hearing loss is caused by physical damage to the ear or eardrum and may be improved
by hearing aids or cochlear implants. Sensorineural hearing loss, caused by damage to the hair
cells or auditory nerves in the inner ear, may be produced by prolonged exposure to sounds of
more than 85 decibels.
Exercise and Critical Thinking
1. Given what you have learned about hearing in this chapter, are you engaging in any activities
that might cause long-term hearing loss? If so, how might you change your behaviour to reduce
the likelihood of suffering damage?
References
Chisolm, T. H., Willott, J. F., & Lister, J. J. (2003). The aging auditory system: Anatomic and physiologic changes
and implications for rehabilitation. International Journal of Audiology, 42(Suppl. 2), 2S3–2S10.
Corey, D. P., García-Añoveros, J., Holt, J. R., Kwan, K. Y., Lin, S.-Y., Vollrath, M. A., Amalfitano, A.,…Zhang,
D.-S. (2004). TRPA1 is a candidate for the mechano-sensitive transduction channel of vertebrate hair cells. Nature,
432, 723–730. Retrieved from http://www.nature.com/nature/journal/v432/n7018/full/nature03066.html
Dettman, S. J., Pinder, D., Briggs, R. J. S., Dowell, R. C., & Leigh, J. R. (2007). Communication development in
children who receive the cochlear implant younger than 12 months: Risk versus benefits. Ear and Hearing, 28(2,
Suppl.), 11S–18S.
Dorman, M. F., & Wilson, B. S. (2004). The design and function of cochlear implants. American Scientist, 92,
436–445.
Middlebrooks, J. C., & Green, D. M. (1991). Sound localization by human listeners. Annual Review of Psychology,
42, 135–159.
Statistics Canada. (2006). Participation and activity limitation survey, 2006. Retrieved June 2014 from
http://www.statcan.gc.ca/pub/89-628-x/2009012/fs-fi/fs-fi-eng.htm
Tennesen, M. (2007, March 10). Gone today, hear tomorrow. New Scientist, 2594, 42–45.
Long Description
Figure 5.18 long description: Levels of Noise
Decibels (dB) | Description | Examples
140 | Painful and dangerous, use hearing protection or avoid. | Fireworks, gunshots, custom car stereos (at full volume)
130 | Painful and dangerous, use hearing protection or avoid. | Jackhammers, ambulances
120 | Uncomfortable, dangerous over 30 seconds | Jet planes (during takeoff)
110 | Very loud, dangerous over 30 seconds | Concerts, car horns, sporting events
100 | Very loud, dangerous over 30 seconds | Snowmobiles, MP3 players (at full volume)
90 | Very loud, dangerous over 30 seconds | Lawnmowers, power tools, blenders, hair dryers
85 | Over 85 dB for extended periods can cause permanent hearing loss. |
80 | Loud | Alarm clocks
70 | Loud | Traffic, vacuum cleaners
60 | Moderate | Normal conversation, dishwashers
50 | Moderate | Moderate rainfall
40 | Soft | Quiet library
20 | Faint | Leaves rustling
[Return to Figure 5.18]
5.4 Tasting, Smelling, and Touching
Learning Objectives
1. Summarize how the senses of taste and olfaction transduce stimuli into perceptions.
2. Describe the process of transduction in the senses of touch and proprioception.
3. Outline the gate control theory of pain. Explain why pain matters and how it may be controlled.
Although vision and hearing are by far the most important senses, human sensation is rounded out by four others,
each of which provides an essential avenue to a better understanding of and response to the world around us. These
other senses are touch, taste, and smell, and our sense of body position and movement (proprioception).
Tasting
Taste is important not only because it allows us to enjoy the food we eat but, even more crucially, because it leads us
toward foods that provide energy (sugar, for instance) and away from foods that could be harmful. Many children
are picky eaters for a reason — they are biologically predisposed to be very careful about what they eat. Together
with the sense of smell, taste helps us maintain appetite, assess potential dangers (such as the odour of a gas leak or
a burning house), and avoid eating poisonous or spoiled food.
Our ability to taste begins at the taste receptors on the tongue. The tongue detects six different taste sensations,
known as sweet, salty, sour, bitter, piquancy (spicy), and umami (savoury). Umami is a meaty taste
associated with meats, cheeses, soy, seaweed, and mushrooms, and is particularly found in monosodium glutamate
(MSG), a popular flavour enhancer (Ikeda, 1909/2002; Sugimoto & Ninomiya, 2005).
Our tongues are covered with taste buds, which are designed to sense chemicals in the mouth. Most taste buds are
located in the top outer edges of the tongue, but there are also receptors at the back of the tongue as well as on the
walls of the mouth and at the back of the throat. As we chew food, it dissolves and enters the taste buds, triggering
nerve impulses that are transmitted to the brain (Northcutt, 2004). Human tongues are covered with 2,000 to 10,000
taste buds, and each bud contains between 50 and 100 taste receptor cells. Taste buds are activated very quickly; a
salty or sweet taste that touches a taste bud for even one-tenth of a second will trigger a neural impulse (Kelling &
Halpern, 1983). On average, taste buds live for about five days, after which new taste buds are created to replace
them. As we get older, however, the rate of creation decreases, making us less sensitive to taste. This change helps
explain why some foods that seem so unpleasant in childhood are more enjoyable in adulthood.
The area of the sensory cortex that responds to taste is in a very similar location to the area that responds to smell,
a fact that helps explain why the sense of smell also contributes to our experience of the things we eat. You may
remember having had difficulty tasting food when you had a bad cold, and if you block your nose and taste slices of
raw potato, apple, and parsnip, you will not be able to taste the differences between them. Our experience of texture
in a food (the way we feel it on our tongues) also influences how we taste it.
Smelling
As we breathe in air through our nostrils, we inhale airborne chemical molecules, which are detected by the 10
million to 20 million receptor cells embedded in the olfactory membrane of the upper nasal passage. The olfactory
receptor cells are topped with tentacle-like protrusions that contain receptor proteins. When an odour receptor
is stimulated, the membrane sends neural messages up the olfactory nerve to the brain (see Figure 5.20, “Smell
Receptors”).
Figure 5.20 Smell Receptors. There are more than 1,000 types of odour receptor cells in the
olfactory membrane.
We have approximately 1,000 types of odour receptor cells (Bensafi et al., 2004), and it is estimated that we can
detect 10,000 different odours (Malnic, Hirono, Sato, & Buck, 1999). The receptors come in many different shapes
and respond selectively to different smells. Like a lock and key, different chemical molecules fit into different
receptor cells, and odours are detected according to their influence on a combination of receptor cells. Just as the
10 digits from 0 to 9 can combine in many different ways to produce an endless array of phone numbers, odour
molecules bind to different combinations of receptors, and these combinations are decoded in the olfactory cortex.
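As a toy sketch of this combinatorial coding idea (the receptor names and activation patterns below are invented for illustration and are not from the chapter), even a handful of receptor types can distinguish far more odours than there are receptor types.

```python
# Toy sketch of combinatorial odour coding: an odour is identified by the
# combination of receptor types it activates. The receptor names and
# activation patterns below are invented purely for illustration.

receptor_types = ["R1", "R2", "R3", "R4", "R5"]  # hypothetical receptor types

# Number of distinct non-empty activation patterns available with 5 types.
print(2 ** len(receptor_types) - 1)  # 31 combinations from only 5 types

# A few made-up odours and the receptor combinations they might activate.
odour_codes = {
    frozenset({"R1", "R3"}): "odour A",
    frozenset({"R1", "R3", "R5"}): "odour B",
    frozenset({"R2", "R4"}): "odour C",
}

def identify(activated: set[str]) -> str:
    """Decode an activation pattern into an odour label, if it is known."""
    return odour_codes.get(frozenset(activated), "unfamiliar odour")

print(identify({"R1", "R3"}))  # odour A
print(identify({"R2", "R5"}))  # unfamiliar odour
```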
As you can see in Figure 5.21, “Age Differences in Smell,” the sense of smell peaks in early adulthood and then
begins a slow decline. By ages 60 to 70, the sense of smell has become sharply diminished. In addition, women tend
to have a more acute sense of smell than men.
Figure 5.21 Age Differences in Smell. The ability to identify common odourants declines
markedly between 20 and 70 years of age.
Touching
The sense of touch is essential to human development. Infants thrive when they are cuddled and attended to, but
not if they are deprived of human contact (Baysinger, Plubell, & Harlow, 1973; Feldman, 2007; Haradon, Bascom,
Dragomir, & Scripcaru, 1994). Touch communicates warmth, caring, and support, and is an essential part of the
enjoyment we gain from our social interactions with close others (Field et al., 1997; Keltner, 2009).
The skin, the largest organ in the body, is the sensory organ for touch. The skin contains a variety of nerve endings,
combinations of which respond to particular types of pressures and temperatures. When you touch different parts of
the body, you will find that some areas are more ticklish, whereas other areas respond more to pain, cold, or heat.
The thousands of nerve endings in the skin respond to four basic sensations — pressure, hot, cold, and pain — but
only the sensation of pressure has its own specialized receptors. Other sensations are created by combinations of
these four. For instance:
• The experience of a tickle is caused by the stimulation of neighbouring pressure receptors.
• The experience of heat is caused by the stimulation of hot and cold receptors.
• The experience of itching is caused by repeated stimulation of pain receptors.
• The experience of wetness is caused by repeated stimulation of cold and pressure receptors.
The skin is important not only in providing information about touch and temperature, but also in proprioception
— the ability to sense the position and movement of our body parts. Proprioception is accomplished by specialized
neurons located in the skin, joints, bones, ears, and tendons, which send messages about the compression and the
contraction of muscles throughout the body. Without this feedback from our bones and muscles, we would be unable
to play sports, walk, or even stand upright.
The ability to keep track of where the body is moving is also provided by the vestibular system, a set of liquid-filled
areas in the inner ear that monitors the head’s position and movement, maintaining the body’s balance. As
you can see in Figure 5.22, “The Vestibular System,” the vestibular system includes the semicircular canals and
the vestibular sacs. These sacs connect the canals with the cochlea. The semicircular canals sense the rotational
movements of the body, and the vestibular sacs sense linear accelerations. The vestibular system sends signals to
the neural structures that control eye movement and to the muscles that keep the body upright.
Figure 5.22 The Vestibular System. The vestibular system includes the semicircular canals (brown) that transduce the rotational movements of the body, and the vestibular sacs (blue) that sense linear accelerations.

Experiencing Pain

We do not enjoy it, but the experience of pain is how the body informs us that we are in danger. The burn when we
touch a hot radiator and the sharp stab when we step on a nail lead us to change our behaviour, preventing further
damage to our bodies. People who cannot experience pain are in serious danger of damage from wounds that others
with pain would quickly notice and attend to.

The gate control theory of pain proposes that pain is determined by the operation of two types of nerve fibres in
the spinal cord. One set of smaller nerve fibres carries pain from the body to the brain, whereas a second set of
larger fibres is designed to stop or start (as a gate would) the flow of pain (Melzack & Wall, 1996). It is for this
reason that massaging an area where you feel pain may help alleviate it — the massage activates the large nerve
fibres that block the pain signals of the small nerve fibres (Wall, 2000).

Experiencing pain is a lot more complicated than simply responding to neural messages, however. It is also a matter
of perception. We feel pain less when we are busy focusing on a challenging activity (Bantick et al., 2002), which
can help explain why sports players may feel their injuries only after the game. We also feel less pain when we are
distracted by humour (Zweyer, Velker, & Ruch, 2004). And pain is soothed by the brain’s release of endorphins,
natural hormonal pain killers. The release of endorphins can explain the euphoria experienced in the running of a
marathon (Sternberg, Bailin, Grant, & Gracely, 1998).
Key Takeaways
• The abilities to taste, smell, and touch are important because they help us avoid harm from
environmental toxins.
• The many taste buds on our tongues and inside our mouths allow us to detect six basic taste
sensations: sweet, salty, sour, bitter, piquancy, and umami.
• In olfaction, transduction occurs as airborne chemicals that are inhaled through the nostrils are
detected by receptors in the olfactory membrane. Different chemical molecules fit into different
receptor cells, creating different smells.
• The ability to smell diminishes with age and, on average, women have a better sense of smell than
men.
• We have a range of different nerve endings embedded in the skin, combinations of which respond
to the four basic sensations of pressure, hot, cold, and pain. But only the sensation of pressure has
its own specialized receptors.
• Proprioception is our ability to sense the positions and movements of our body parts. Postural and
movement information is detected by special neurons located in the skin, joints, bones, ears, and
tendons, which pick up messages from the compression and the contraction of muscles throughout
the body.
• The vestibular system, composed of structures in the inner ear, monitors the head’s position and
movement, maintaining the body’s balance.
• Gate control theory explains how large and small neurons work together to transmit and regulate
the flow of pain to the brain.
Exercises and Critical Thinking
1. Think of the foods that you like to eat the most. Which of the six taste sensations do these foods
have, and why do you think that you like these particular flavours?
2. Why do you think that women might have a better developed sense of smell than do men?
3. Why is experiencing pain a benefit for human beings?
References
Bantick, S. J., Wise, R. G., Ploghaus, A., Clare, S., Smith, S. M., & Tracey, I. (2002). Imaging how attention
modulates pain in humans using functional MRI. Brain: A Journal of Neurology, 125(2), 310–319.
Baysinger, C. M., Plubell, P. E., & Harlow, H. F. (1973). A variable-temperature surrogate mother for studying
attachment in infant monkeys. Behavior Research Methods & Instrumentation, 5(3), 269–272.
Bensafi, M., Zelano, C., Johnson, B., Mainland, J., Kahn, R., & Sobel, N. (2004). Olfaction: From sniff to percept.
In M. S. Gazzaniga (Ed.), The cognitive neurosciences (3rd ed.). Cambridge, MA: MIT Press.
Feldman, R. (2007). Maternal-infant contact and child development: Insights from the kangaroo intervention. In L.
L’Abate (Ed.), Low-cost approaches to promote physical and mental health: Theory, research, and practice (pp.
323–351). New York, NY: Springer Science + Business Media.
Field, T., Lasko, D., Mundy, P., Henteleff, T., Kabat, S., Talpins, S., & Dowling, M. (1997). Brief report: Autistic
children’s attentiveness and responsivity improve after touch therapy. Journal of Autism and Developmental
Disorders, 27(3), 333–338.
Haradon, G., Bascom, B., Dragomir, C., & Scripcaru, V. (1994). Sensory functions of institutionalized Romanian
infants: A pilot study. Occupational Therapy International, 1(4), 250–260.
Ikeda, K. (1909/2002). [New seasonings]. Chemical Senses, 27(9), 847–849. Translated and shortened to 75% by Y.
Ogiwara & Y. Ninomiya from the Journal of the Chemical Society of Tokyo, 30, 820–836. (Original work published
1909).
Kelling, S. T., & Halpern, B. P. (1983). Taste flashes: Reaction times, intensity, and quality. Science, 219, 412–414.
Keltner, D. (2009). Born to be good: The science of a meaningful life. New York, NY: Norton.
Malnic, B., Hirono, J., Sato, T., & Buck, L. B. (1999). Combinatorial receptor codes for odors. Cell, 96, 713–723.
Melzack, R., & Wall, P. (1996). The challenge of pain. London, England: Penguin.
Murphy, C. (1986). Taste and smell in the elderly. In H. L. Meiselman & R. S. Rivlin (Eds.), Clinical measurement
of taste and smell (Vol. 1, pp. 343–371). New York, NY: Macmillan.
Northcutt, R. G. (2004). Taste buds: Development and evolution. Brain, Behavior and Evolution, 64(3), 198–206.
Sternberg, W. F., Bailin, D., Grant, M., & Gracely, R. H. (1998). Competition alters the perception of noxious
stimuli in male and female athletes. Pain, 76(1–2), 231–238.
Sugimoto, K., & Ninomiya, Y. (2005). Introductory remarks on umami research: Candidate receptors and signal
transduction mechanisms on umami. Chemical Senses, 30(Suppl. 1), i21–i22.
Wall, P. (2000). Pain: The science of suffering. New York, NY: Columbia University Press.
Zweyer, K., Velker, B., & Ruch, W. (2004). Do cheerfulness, exhilaration, and humor production moderate pain
tolerance? A FACS study. Humor: International Journal of Humor Research, 17(1-2), 85–119.
Image Attributions
Figure 5.21: Adapted from Murphy (1986).
5.5 Accuracy and Inaccuracy in Perception
Learning Objectives
1. Describe how sensation and perception work together through sensory interaction, selective
attention, sensory adaptation, and perceptual constancy.
2. Give examples of how our expectations may influence our perception, resulting in illusions and
potentially inaccurate judgments.
The eyes, ears, nose, tongue, and skin sense the world around us, and in some cases perform preliminary information
processing on the incoming data. But by and large, we do not experience sensation — we experience the outcome
of perception, the total package that the brain puts together from the pieces it receives through our senses and that
the brain creates for us to experience. When we look out the window at a view of the countryside, or when we look
at the face of a good friend, we don’t just see a jumble of colours and shapes — we see, instead, an image of a
countryside or an image of a friend (Goodale & Milner, 2006).
How the Perceptual System Interprets the Environment
This meaning making involves the automatic operation of a variety of essential perceptual processes. One of these is
sensory interaction—the working together of different senses to create experience. Sensory interaction is involved
when taste, smell, and texture combine to create the flavour we experience in food. It is also involved when we
enjoy a movie because of the way the images and the music work together.
Although you might think that we understand speech only through our sense of hearing, it turns out that the visual
aspect of speech is also important. One example of sensory interaction is shown in the McGurk effect — an
error in perception that occurs when we misperceive sounds because the audio and visual parts of the speech are
mismatched. You can witness the effect yourself by viewing “The McGurk Effect.”
Watch The McGurk Effect [YouTube]: http://www.youtube.com/
watch?v=jtsfidRq2tw
The McGurk effect is an error in sound perception that occurs when there is a
mismatch between the senses of hearing and seeing. You can experience it here.
Other examples of sensory interaction include the experience of nausea that can occur
when the sensory information being received from the eyes and the body does not
match information from the vestibular system (Flanagan, May, & Dobie, 2004) and
synesthesia — an experience in which one sensation (e.g., hearing a sound) creates
experiences in another (e.g., vision). Most people do not experience synesthesia, but
those who do link their perceptions in unusual ways, for instance, by experiencing colour when they taste a
particular food or by hearing sounds when they see certain objects (Ramachandran, Hubbard, Robertson, & Sagiv,
2005).
Another important perceptual process is selective attention — the ability to focus on some sensory inputs while
tuning out others. View “Video Clip: Selective Attention,” and count the number of times the people in white
playing with the ball pass it to each other. You may find that, like many other people who view it for the first time,
you miss something important because you selectively attend to only one aspect of the video (Simons & Chabris,
1999). Perhaps knowledge of the process of selective attention can help you see why the security guards completely
missed the fact that the Chaser group’s motorcade was a fake—they focused on some aspects of the situation, such
as the colour of the cars and the fact that they were there at all, and completely ignored others (the details of the
security information).
Watch Selective Attention [YouTube]: http://www.youtube.com/
watch?v=vJG698U2Mvo
Watch this video and carefully count how many times the people in white pass the ball
to each other.
Selective attention also allows us to focus on a single talker at a party while ignoring
other conversations that are occurring around us (Broadbent, 1958; Cherry,
1953). Without this automatic selective attention, we’d be unable to focus on the single
conversation we want to hear. But selective attention is not complete; we also, at the
same time, monitor what’s happening in the channels we are not focusing on. Perhaps
you have had the experience of being at a party and talking to someone in one part of the room, when suddenly you
hear your name being mentioned by someone in another part of the room. This cocktail party phenomenon shows
us that although selective attention is limiting what we process, we are nevertheless simultaneously doing a lot of
unconscious monitoring of the world around us—you didn’t know you were attending to the background sounds of
the party, but evidently you were.
Another fundamental process of perception is sensory adaptation — a decreased sensitivity to a stimulus after
prolonged and constant exposure. When you step into a swimming pool, the water initially feels cold, but after a
while you stop noticing it. After prolonged exposure to the same stimulus, our sensitivity toward it diminishes and
we no longer perceive it. The ability to adapt to the things that don’t change around us is essential to our survival,
as it leaves our sensory receptors free to detect the important and informative changes in our environment and to
respond accordingly. We ignore the sounds that our car makes every day, which leaves us free to pay attention to
the sounds that are different from normal, and thus likely to need our attention. Our sensory receptors are alert to
novelty and are fatigued after constant exposure to the same stimulus.
If sensory adaptation occurs with all senses, why doesn’t an image fade away after we stare at it for a period of
time? The answer is that, although we are not aware of it, our eyes are constantly flitting from one angle to the next,
making thousands of tiny movements (called saccades) every minute. This constant eye movement guarantees that
the image we are viewing always falls on fresh receptor cells. What would happen if we could stop the movement of
our eyes? Psychologists have devised a way of testing the sensory adaptation of the eye by attaching an instrument
that ensures a constant image is maintained on the eye’s inner surface. Participants are fitted with a contact lens
that has a miniature slide projector attached to it. Because the projector follows the exact movements of the eye, the
same image is always projected, stimulating the same spot on the retina. Within a few seconds, interesting things
begin to happen. The image will begin to vanish, then reappear, only to disappear again, either in pieces or as a
whole. Even the eye experiences sensory adaptation (Yarbus, 1967).
One of the major problems in perception is to ensure that we always perceive the same object in the same way, even
when the sensations it creates on our receptors change dramatically. The ability to perceive a stimulus as constant
despite changes in sensation is known as perceptual constancy. Consider our image of a door as it swings. When
it is closed, we see it as rectangular, but when it is open, we see only its edge and it appears as a line. But we
never perceive the door as changing shape as it swings — perceptual mechanisms take care of the problem for us
by allowing us to see a constant shape.
The visual system also corrects for colour constancy. Imagine that you are wearing blue jeans and a bright white
T-shirt. When you are outdoors, both colours will be at their brightest, but you will still perceive the white T-shirt
as bright and the blue jeans as darker. When you go indoors, the light shining on the clothes will be significantly
dimmer, but you will still perceive the T-shirt as bright. This is because we put colours in context and see that,
compared with its surroundings, the white T-shirt reflects the most light (McCann, 1992). In the same way, a
green leaf on a cloudy day may reflect the same wavelength of light as a brown tree branch does on a sunny day.
Nevertheless, we still perceive the leaf as green and the branch as brown.
Illusions
Although our perception is very accurate, it is not perfect. Illusions occur when the perceptual processes that
normally help us correctly perceive the world around us are fooled by a particular situation so that we see
something that does not exist or that is incorrect. Figure 5.23, “Optical Illusions as a Result of Brightness Constancy
(Left) and Colour Constancy (Right),” presents two situations in which our normally accurate perceptions of visual
constancy have been fooled.
Figure 5.23 Optical Illusions as a Result of Brightness Constancy (Left) and Colour Constancy
(Right). Look carefully at the snakelike pattern on the left. Are the green strips really brighter than
the background? Cover the white curves and you’ll see they are not. Square A in the right-hand
image looks very different from square B, even though they are exactly the same.
Another well-known illusion is the Mueller-Lyer illusion (see Figure 5.24, “The Mueller-Lyer Illusion”). The line
segment in the bottom arrow looks longer to us than the one on the top, even though they are both actually the same
length. It is likely that the illusion is, in part, the result of the failure of monocular depth cues — the bottom line
looks like an edge that is normally farther away from us, whereas the top one looks like an edge that is normally
closer.
Figure 5.24 The Mueller-Lyer Illusion. The Mueller-Lyer illusion makes the line segment at the top of the left picture appear shorter than the one at the bottom. The illusion is caused, in part, by the monocular distance cue of depth — the bottom line looks like an edge that is normally farther away from us, whereas the top one looks like an edge that is normally closer.

The moon illusion refers to the fact that the moon is perceived to be about 50% larger when it is near the horizon
than when it is seen overhead, despite the fact that in both cases the moon is the same size and casts the same size
retinal image. The monocular depth cues of position and aerial perspective (see Figure 5.25, “The Moon Illusion”)
create the illusion that things that are lower and more hazy are farther away. The skyline of the horizon (trees,
clouds, outlines of buildings) also gives a cue that the moon is far away, compared to when it is at its zenith. If we
look at a horizon moon through a tube of rolled-up paper, taking away the surrounding horizon cues, the moon will
immediately appear smaller.
Figure 5.25 The Moon Illusion. The moon always looks larger on the horizon than when it is high
above. But if we take away the surrounding distance cues of the horizon, the illusion disappears.
The Ponzo illusion operates on the same principle. As you can see in Figure 5.26, “The Ponzo Illusion,” the top
yellow bar seems longer than the bottom one, but if you measure them you’ll see that they are exactly the same
length. The monocular depth cue of linear perspective leads us to believe that, given two similar objects, the distant
one can only cast the same size retinal image as the closer object if it is larger. The topmost bar therefore appears
longer.
Figure 5.26 The Ponzo Illusion. The Ponzo illusion is caused by a failure of the monocular depth cue of linear perspective. Both bars are the same size, even though the top one looks larger.

Illusions demonstrate that our perception of the world around us may be influenced by our prior knowledge. But the
fact that some illusions exist in some cases does not mean that the perceptual system is generally inaccurate — in
fact, humans normally become so closely in touch with their environment that the physical body and the particular
environment that we sense and perceive become embodied — that is, built into and linked with our cognition, such
that the world around us becomes part of our brain (Calvo & Gomila, 2008). The close relationship between people
and their environments means that, although illusions can be created in the lab and under some unique situations,
they may be less common with active observers in the real world (Runeson, 1988).
The Important Role of Expectations in Perception
Our emotions, mindset, expectations, and the contexts in which our sensations occur all have a profound influence
on perception. People who are warned that they are about to taste something bad rate what they do taste more
negatively than people who are told that the taste won’t be so bad (Nitschke et al., 2006), and people perceive a
child and adult pair as looking more alike when they are told that they are parent and child (Bressan & Dal Martello,
2002). Similarly, participants who see images of the same baby rate it as stronger and bigger when they are told
it is a boy as opposed to when they are told it is a girl (Stern & Karraker, 1989), and research participants who
learn that a child is from a lower-class background perceive the child’s scores on an intelligence test as lower than
people who see the same test taken by a child they are told is from an upper-class background (Darley & Gross,
1983). Plassmann, O’Doherty, Shiv, and Rangel (2008) found that wines were rated more positively and caused
greater brain activity in brain areas associated with pleasure when they were said to cost more than when they were
said to cost less. And even experts can be fooled: professional referees tended to assign more penalty cards to soccer
teams for videotaped fouls when they were told that the team had a history of aggressive behaviour than when they
had no such expectation (Jones, Paull, & Erskine, 2002).
Our perceptions are also influenced by our desires and motivations. When we are hungry, food-related words
tend to grab our attention more than non-food-related words (Mogg, Bradley, Hyare, & Lee, 1998), we perceive
objects that we can reach as bigger than those that we cannot reach (Witt & Proffitt, 2005), and people who favour
a political candidate’s policies view the candidate’s skin colour more positively than do those who oppose the
candidate’s policies (Caruso, Mead, & Balcetis, 2009). Even our culture influences perception. Chua, Boland, and
Nisbett (2005) showed American and Asian graduate students different images, such as an airplane, an animal,
or a train, against complex backgrounds. They found that (consistent with their overall individualistic orientation)
the American students tended to focus more on the foreground image, while Asian students (consistent with their
interdependent orientation) paid more attention to the image’s context. Furthermore, Asian-American students
focused more or less on the context depending on whether their Asian or their American identity had been activated.
Psychology in Everyday Life: How Understanding Sensation and Perception Can Save Lives
Human factors is the field of psychology that uses psychological knowledge, including the principles of
sensation and perception, to improve the development of technology. Human factors has worked on a variety
of projects, ranging from nuclear reactor control centres and airplane cockpits to cell phones and websites
(Proctor & Van Zandt, 2008). For instance, modern televisions and computer monitors were developed on
the basis of the trichromatic colour theory, using three colour elements placed close enough together that the
colours are blended by the eye. Knowledge of the visual system also helped engineers create new kinds of
displays, such as those used on notebook computers and music players, and better understand how using cell
phones while driving may contribute to automobile accidents (Lee & Strayer, 2004).
Human factors also has made substantial contributions to airline safety. About two-thirds of accidents
on commercial airplane flights are caused by human error (Nickerson, 1998). During takeoff, travel, and
landing, the pilot simultaneously communicates with ground control, maneuvers the plane, scans the horizon
for other aircraft, and operates controls. The need for a usable interface that works easily and naturally with
the pilot’s visual perception is essential.
Psychologist Conrad Kraft (1978) hypothesized that as planes land, with no other distance cues visible, pilots
may be subjected to a type of moon illusion, in which the city lights beyond the runway appear much larger
on the retina than they really are, deceiving the pilot into landing too early. Kraft’s findings caused airlines
to institute new flight safety measures, where copilots must call out the altitude progressively during the
descent, which has probably decreased the number of landing accidents.
Figure 5.27 presents images of an airplane instrument panel before and after it was redesigned by human
factors psychologists. On the left is the initial design, in which the controls were crowded and cluttered,
in no logical sequence, each control performing one task. The controls were more or less the same in
colour, and the gauges were not easy to read. The redesigned digital cockpit (right on Figure 5.27) shows
a marked improvement in usability. More of the controls are colour-coded and multifunctional so that there
is less clutter on the dashboard. Screens make use of LCD and 3-D graphics. Text sizes are changeable
— increasing readability — and many of the functions have become automated, freeing up the pilots’
concentration for more important activities.
Figure 5.27 Airplane Cockpits. Initial design of the airplane cockpit (left); the digital design of the
airplane cockpit (right), which has taken human factors into account.
One important aspect of the redesign was based on the principles of sensory adaptation. Displays that are
easy to see in darker conditions quickly become unreadable when the sun shines directly on them. It takes the
pilot a relatively long time to adapt to the suddenly much brighter display. Furthermore, perceptual contrast
is important. The display cannot be so bright at night that the pilot is unable to see targets in the sky or on
the land. Human factors psychologists used these principles to determine the appropriate stimulus intensity
needed on these displays so that pilots would be able to read them accurately and quickly under a wide
range of conditions. The psychologists accomplished this by developing an automatic control mechanism
that senses the ambient light visible through the front cockpit windows and detects the light falling on the
display surface, and then automatically adjusts the intensity of the display for the pilot (Silverstein, Krantz,
Gomer, Yeh, & Monty, 1990; Silverstein & Merrifield, 1985).
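The chapter describes this control mechanism only in general terms; purely as a hedged sketch (the full-sunlight reference level, limits, and gain below are invented, and this is not the actual avionics control law), an ambient-light-driven brightness adjustment might look something like this.

```python
# Hedged sketch of an ambient-light-driven display brightness control.
# The full-sunlight reference level, limits, and gain are invented for
# illustration; the chapter does not give the real system's parameters.

def adjust_display_brightness(ambient_lux: float,
                              min_brightness: float = 0.05,
                              max_brightness: float = 1.0,
                              gain: float = 0.9) -> float:
    """Map the ambient light sensed at the cockpit windows and display
    surface to a display intensity that stays readable in direct sun yet
    does not wash out the pilot's night vision."""
    # Scale brightness with ambient light, relative to an assumed
    # full-sunlight level of about 100,000 lux.
    target = gain * (ambient_lux / 100_000.0)
    # Clamp so the display is never invisible and never blinding.
    return max(min_brightness, min(max_brightness, target))

print(adjust_display_brightness(100_000))  # direct sunlight -> 0.9
print(adjust_display_brightness(10))       # night flying    -> 0.05 (floor)
```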
Key Takeaways
• Sensory interaction occurs when different senses work together, for instance, when taste, smell,
and touch together produce the flavour of food.
• Selective attention allows us to focus on some sensory experiences while tuning out others.
• Sensory adaptation occurs when we become less sensitive to some aspects of our environment,
freeing us to focus on more important changes.
• Perceptual constancy allows us to perceive an object as the same, despite changes in sensation.
• Cognitive illusions are examples of how our expectations can influence our perceptions.
• Our emotions, motivations, desires, and even our culture can influence our perceptions.
Exercises and Critical Thinking
1. Consider the role of the security personnel at the APEC meeting who let the Chaser group’s car
enter the security area. List some perceptual processes that might have been at play.
2. Consider some cases where your expectations about what you were going to experience influenced
your perceptions of what you actually experienced.
References
Bressan, P., & Dal Martello, M. F. (2002). Talis pater, talis filius: Perceived resemblance and the belief in genetic
relatedness. Psychological Science, 13, 213–218.
Broadbent, D. E. (1958). Perception and communication. New York, NY: Pergamon.
Calvo, P., & Gomila, T. (Eds.). (2008). Handbook of cognitive science: An embodied approach. San Diego, CA:
Elsevier.
Caruso, E. M., Mead, N. L., & Balcetis, E. (2009). Political partisanship influences perception of biracial
candidates’ skin tone. PNAS Proceedings of the National Academy of Sciences of the United States of America,
106(48), 20168–20173.
Cherry, E. C. (1953). Some experiments on the recognition of speech, with one and with two ears. Journal of the
Acoustical Society of America, 25, 975–979.
Chua, H. F., Boland, J. E., & Nisbett, R. E. (2005). Cultural variation in eye movements during scene
perception. Proceedings of the National Academy of Sciences, 102, 12629–12633.
Darley, J. M., & Gross, P. H. (1983). A hypothesis-confirming bias in labeling effects. Journal of Personality and
Social Psychology, 44, 20–33.
Flanagan, M. B., May, J. G., & Dobie, T. G. (2004). The role of vection, eye movements, and postural instability in
the etiology of motion sickness. Journal of Vestibular Research: Equilibrium and Orientation, 14(4), 335–346.
Goodale, M., & Milner, D. (2006). One brain — Two visual systems. Psychologist, 19(11), 660–663.
Jones, M. V., Paull, G. C., & Erskine, J. (2002). The impact of a team’s aggressive reputation on the decisions of
association football referees. Journal of Sports Sciences, 20, 991–1000.
Kraft, C. (1978). A psychophysical approach to air safety: Simulator studies of visual illusions in night approaches.
In H. L. Pick, H. W. Leibowitz, J. E. Singer, A. Steinschneider, & H. W. Steenson (Eds.), Psychology: From
research to practice. New York, NY: Plenum Press.
Lee, J., & Strayer, D. (2004). Preface to the special section on driver distraction. Human Factors, 46(4), 583.
McCann, J. J. (1992). Rules for color constancy. Ophthalmic and Physiologic Optics, 12(2), 175–177.
Mogg, K., Bradley, B. P., Hyare, H., & Lee, S. (1998). Selective attention to food related stimuli in
hunger. Behavior Research & Therapy, 36(2), 227–237.
Nickerson, R. S. (1998). Applied experimental psychology. Applied Psychology: An International Review, 47,
155–173.
Nitschke, J. B., Dixon, G. E., Sarinopoulos, I., Short, S. J., Cohen, J. D., Smith, E. E.,…Davidson, R. J. (2006).
Altering expectancy dampens neural response to aversive taste in primary taste cortex. Nature Neuroscience, 9,
435–442.
Plassmann, H., O’Doherty, J., Shiv, B., & Rangel, A. (2008). Marketing actions can moderate neural representations
of experienced pleasantness. Proceedings of the National Academy of Sciences, 105(3), 1050–1054.
Proctor, R. W., & Van Zandt, T. (2008). Human factors in simple and complex systems (2nd ed.). Boca Raton, FL:
CRC Press.
Ramachandran, V. S., Hubbard, E. M., Robertson, L. C., & Sagiv, N. (2005). The emergence of the human mind:
Some clues from synesthesia. In Synesthesia: Perspectives From Cognitive Neuroscience (pp. 147–190). New
York, NY: Oxford University Press.
Runeson, S. (1988). The distorted room illusion, equivalent configurations, and the specificity of static optic
arrays. Journal of Experimental Psychology: Human Perception and Performance, 14(2), 295–304.
Silverstein, L. D., Krantz, J. H., Gomer, F. E., Yeh, Y., & Monty, R. W. (1990). The effects of spatial sampling and
luminance quantization on the image quality of color matrix displays. Journal of the Optical Society of America,
Part A, 7, 1955–1968.
Silverstein, L. D., & Merrifield, R. M. (1985). The development and evaluation of color systems for airborne
applications: Phase I Fundamental visual, perceptual, and display systems considerations (Tech. Report DOT/
FAA/PM085019). Washington, DC: Federal Aviation Administration.
Simons, D. J., & Chabris, C. F. (1999). Gorillas in our midst: Sustained inattentional blindness for dynamic events.
Perception, 28(9), 1059–1074.
Stern, M., & Karraker, K. H. (1989). Sex stereotyping of infants: A review of gender labeling studies. Sex Roles,
20(9–10), 501–522.
Witt, J. K., & Proffitt, D. R. (2005). See the ball, hit the ball: Apparent ball size is correlated with batting average.
Psychological Science, 16(12), 937–938.
Yarbus, A. L. (1967). Eye movements and vision. New York, NY: Plenum Press.
Image Attributions
Figure 5.23: Grey Square Optical Illusion by Edward H. Adelson, http://commons.wikimedia.org/wiki/
File:Grey_square_optical_illusion.PNG is in the public domain.
Figure 5.25: “Full Moon Through The Clouds” by Jake Khuon (http://www.flickr.com/photos/wintrhawk/
443408898/) is licensed under CC BY-NC 2.0 license(http://creativecommons.org/licenses/by-nc/2.0/deed.en_CA).
“Last Sail Under a Full Moon” by Kipp Baker (http://www.flickr.com/photos/mrpixure/3356957620/in/
photostream) is licensed under CC BY-NC-ND 2.0 license (http://creativecommons.org/licenses/by-nc-nd/2.0/
deed.en_CA).
Figure 5.27: “DC-9 Cockpit” by Dmitry Denisenkov (http://en.wikipedia.org/wiki/File:DC-9_Cockpit.jpg) is
licensed under the Creative Commons Attribution-Share Alike 3.0 Unported license (http://creativecommons.org/licenses/
by-sa/3.0/deed.en). “Airbus A380 cockpit” by Naddsy (http://en.wikipedia.org/wiki/File:Airbus_A380_cockpit.jpg)
is used under the Creative Commons Attribution 2.0 Generic license (http://creativecommons.org/licenses/by/2.0/deed.en).
5.6 Chapter Summary
Sensation and perception work seamlessly together to allow us to detect both the presence of, and changes in, the
stimuli around us.
The study of sensation and perception is exceedingly important for our everyday lives because the knowledge
generated by psychologists is used in so many ways to help so many people.
Each sense accomplishes the basic process of transduction — the conversion of stimuli detected by receptor cells
into electrical impulses that are then transported to the brain — in different, but related, ways.
Psychophysics is the branch of psychology that studies the effects of physical stimuli on sensory perceptions.
Psychophysicists study the absolute threshold of sensation as well as the difference threshold, or just noticeable
difference (JND). Weber’s law maintains that the JND of a stimulus is a constant proportion of the original intensity
of the stimulus.
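Weber’s law can be stated compactly as a proportion; the constant in the example below is an assumed illustrative value, not one given in the chapter.

```latex
% Weber's law: the just noticeable difference (JND) is a constant
% proportion k of the original stimulus intensity I.
\frac{\Delta I}{I} = k
% Illustrative example (k = 0.02 is an assumed value, not from the text):
% at I = 100 g the JND is 2 g; at I = 200 g the JND is 4 g.
```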
Most of our cerebral cortex is devoted to seeing, and we have substantial visual skills. The eye is a specialized
system that includes the cornea, pupil, iris, lens, and retina. Neurons, including rods and cones, react to light landing
on the retina and send signals to the visual cortex via the optic nerve.
Images are perceived, in part, through the action of feature detector neurons.
The shade of a colour, known as hue, is conveyed by the wavelength of the light that enters the eye. The
Young-Helmholtz trichromatic colour theory and the opponent-process colour theory are theories of how the brain
perceives colour.
Depth is perceived using both binocular and monocular depth cues. Monocular depth cues are based on gestalt
principles. The beta effect and the phi phenomenon are important in detecting motion.
The ear detects both the amplitude (loudness) and frequency (pitch) of sound waves.
Important structures of the ear include the pinna, eardrum, ossicles, cochlea, and oval window.
The frequency theory of hearing proposes that as the pitch of a sound wave increases, nerve impulses of a
corresponding frequency are sent to the auditory nerve. The place theory of hearing proposes that different areas of
the cochlea respond to different frequencies.
Sounds that are 85 decibels or more can cause damage to your hearing, particularly if you are exposed to them
repeatedly. Sounds that exceed 130 decibels are dangerous, even if you are exposed to them infrequently.
The tongue detects six different taste sensations, known as sweet, salty, sour, bitter, piquancy (spicy),
and umami (savoury).
We have approximately 1,000 types of odour receptor cells and it is estimated that we can detect 10,000 different
odours.
Thousands of nerve endings in the skin respond to four basic sensations—pressure, hot, cold, and pain—but only
the sensation of pressure has its own specialized receptors. The ability to keep track of where the body is moving is
provided by the vestibular system.
Perception involves the processes of sensory interaction, selective attention, sensory adaptation, and perceptual
constancy.
Although our perception is very accurate, it is not perfect. Our expectations and emotions colour our perceptions
and may result in illusions.