
Describe the sensory pathways from receptors to primary sensory cortex for both the visual and auditory systems. Be specific. Discuss the similarities and differences that exist between the two systems.
Visual and auditory information is processed through specialized
neurons that respond to one sensory modality or the other. However, these
separate pathways target primary sensory cortices in which some neurons
also respond to stimulation of other modalities. In order to reveal the
similarities and differences that exist between these two systems, we will
describe the sensory pathways from receptors to primary sensory cortex for
both the visual and auditory systems.
The visual pathway begins as light passes through the eyeball and is absorbed by photoreceptor cells in the retina. Within the retina, these
receptors synapse with bipolar and horizontal cells, which establish the basis
for brightness and color contrasts. In turn, the bipolar cells synapse with
retinal ganglion cells and amacrine cells, which enhance contrast effects that
support form vision and establish the basis for movement detection. The
ganglion cells then send this information through the optic nerve to the optic chiasm, where fibers from the nasal half of each retina cross to the opposite side, and onward along the optic tract to the lateral geniculate nucleus of the thalamus. Finally, the lateral geniculate nucleus relays color, shape, and movement information about the visual field to different neurons within V1 for further processing, and from V1 the information is sent on to different areas of the extrastriate visual cortex.
Similarly, the auditory pathway starts as sounds are first collected by the pinna and conducted through the middle ear to the cochlea, where hair cells translate the vibrations into neural signals. The hair cells synapse onto neurons of the spiral ganglion, which lies along the central axis of the cochlea, and the axons of these neurons form the auditory nerve, which carries the signal to the cochlear nucleus, where sound detection happens. From the cochlear nucleus the information passes to the superior olive and then to the inferior colliculus. It is then relayed to the medial geniculate nucleus, and finally on to the primary auditory cortex.
There are several similarities between these two sensory pathways. For
example, some cochlear neurons use lateral inhibition to sharpen the tuning
to one frequency by suppressing nearby frequencies, a mechanism
reminiscent of that used by retinal ganglion cells to respond to spots of light
instead of broad fields of light (a rough numerical sketch of this kind of sharpening follows this paragraph). Another similarity is that sound wave amplitude in the cochlea is conveyed in much the same way as light wave amplitude: the larger the amplitude, the higher the firing rate of the neurons that communicate with the brain. Additionally, as is the case for the lateral geniculate nucleus of the visual system, the medial geniculate nucleus in the auditory pathway is a target of many neurons that project from the cortex. These efferent connections, some of which convey information back to lower stages, provide further anatomical evidence that sensory systems are two-way streets, in which feedback from the brain is tightly integrated with sensory information flowing up to the brain. Lastly, the tonotopic organization and topographic mapping in these sensory systems show that processing proceeds from simpler to more complex stimuli as we move farther along the auditory and visual pathways. We also find greater evidence of cross-modal processing at these later stages, particularly in the parabelt areas, where acoustic and visual information are combined.
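To make the lateral inhibition comparison concrete, here is a minimal Python sketch. The nine channel values and the 0.3 neighbor weight are assumptions chosen purely for illustration, not data from any experiment; the point is only that subtracting a fraction of each channel's neighbors narrows a broad response profile, whether the channels stand for nearby frequencies in the cochlea or neighboring spots on the retina.

import numpy as np

# Minimal sketch of lateral inhibition (illustrative numbers only).
# A broad response spread across nine neighboring channels, which could
# stand for nearby cochlear frequencies or neighboring retinal locations,
# is sharpened by subtracting a fraction of each channel's neighbors.
broad_response = np.array([0.1, 0.3, 0.6, 0.9, 1.0, 0.9, 0.6, 0.3, 0.1])

inhibition = 0.3                                  # assumed neighbor weight
kernel = np.array([-inhibition, 1.0, -inhibition])
sharpened = np.convolve(broad_response, kernel, mode="same")
sharpened = np.clip(sharpened, 0.0, None)         # firing rates cannot be negative

print("broad tuning:     ", np.round(broad_response, 2))
print("after inhibition: ", np.round(sharpened, 2))
# The peak-to-flank ratio grows from about 10:1 to about 46:1, so the
# response is more sharply tuned even though each channel still receives
# input from a wide range of neighbors.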
One of the most important contrasts between the two systems arises when determining location. The visual information that comes from the fovea allows us to determine the location of a visual object. If you see an object in front of you, you would know whether it was to the left or the right of your fovea because its image would appear on the right or left side of your retina. However, a sound coming from an object enters the ears in exactly the same place regardless of the position of the object. This dilemma is similar to the one the eyes face when trying to determine how far away an object is: depth perception requires processing and integrating a set of cues that provide indirect evidence about the object's distance. The auditory system uses a similar approach to determine the location in space from which a sound is coming. Just as having two eyes is essential for determining visual depth, having two ears is necessary for determining auditory location. Most of the time, the sound coming from an object will reach one ear before the other. The brain then analyzes these differences in arrival time and in intensity between the two ears as auditory localization cues, as the short calculation below illustrates.
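As a rough illustration of the timing cue, the short sketch below estimates the interaural time difference with a deliberately simplified straight-path model. The 343 m/s speed of sound is standard, but the 0.20 m ear separation and the sample angles are assumed values, chosen only to show the orders of magnitude involved.

import math

# Rough interaural time difference (ITD) under a simplified straight-path
# model: the extra distance to the far ear is ear_separation * sin(azimuth).
# The ear separation and the sample angles are assumed, illustrative values.
SPEED_OF_SOUND = 343.0   # meters per second in air at about 20 degrees Celsius
EAR_SEPARATION = 0.20    # meters, an assumed typical distance between the ears

def interaural_time_difference(azimuth_deg: float) -> float:
    """Approximate arrival-time difference (in seconds) between the two ears
    for a distant sound source at the given azimuth (0 = straight ahead)."""
    extra_path = EAR_SEPARATION * math.sin(math.radians(azimuth_deg))
    return extra_path / SPEED_OF_SOUND

for angle in (0, 2, 15, 45, 90):
    microseconds = interaural_time_difference(angle) * 1e6
    print(f"azimuth {angle:3d} deg  ->  ITD about {microseconds:6.1f} microseconds")
# A source only a couple of degrees off center already produces a difference
# of roughly 20 microseconds, which is why the timing precision of the hair
# cells discussed below matters so much.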
Additionally, there are several differences between the visual and auditory systems. While the retina has almost 100 million photoreceptors, the cochlea has only about 14,000 hair cells. Yet the stereocilia of hair cells blow away the competition when it comes to speed and sensitivity: these hair cells must respond so fast that we can detect time differences as small as 10 millionths of a second in order to know the direction from which a sound arrives. In addition, the opening of ion pores through the direct mechanical pull of the tip links that connect neighboring stereocilia is a strikingly direct form of mechanoelectrical transduction. Unlike the case in vision, depolarization in hearing does not await a cascade of biochemical processes such as those involved in photoactivation.
In conclusion, comparing the overall structure of the auditory and
visual systems shows that a relatively large proportion of the processing in
the auditory system is done before A1. By contrast, the majority of the most
important visual processing occurs in cortical areas V1 and beyond. This
major difference may be a result of how the two senses evolved. The visual and auditory systems share many similar structures and mechanisms, but they differ in the way they process information. Each system is unique and unbelievably complex. Thus, we can see that the differences and similarities between the two senses make each system more efficient and better suited to the tasks it must accomplish.
