Our world is brought to us via our senses, and our senses are biological in nature.
There have been many studies on the limits of our visual field. There are different types of colour blindness: people who are colour blind or colour vision deficient are unable to see some or all colours. For example, in monochromacy (total colour blindness), a person cannot distinguish colours owing to a defect in, or absence of, cones and/or rods. Such a person may look at a coloured object yet be unable to determine its colours. An interesting consequence is that the colours of traffic lights may not be distinguishable to someone who is colour vision deficient. In such cases one can rely on the placement of the lights: the top light is the red stop light, while the bottom light signals green. A person who is colour blind can thus use the light placement to compensate for not being able to differentiate the two colours.
The eye has a visual field. The visual field contains a blind spot, as there are no light-detecting photoreceptor cells on the optic disc of the retina; the optic nerve passes through the retina at this point, and thus the visual field is blank at this location. Blind spot exercises can be performed, and we quickly realise that we cannot see at this point in the visual field. The brain fills in the hole for us by using the other eye's visual field. Thus, the missing piece in our vision is constructed by ourselves for ourselves. This has serious implications. When driving a car, the side mirrors show the cars behind us. If a car falls on the blind spot in the visual field, we may see no car, as the missing information is filled in with the help of the other eye. If the other eye does not see the car either, we will not see it at all. The term blind spot with respect to driving thus describes not only the lack of coverage of the side mirrors but also the blind spot that occurs in our eye structure.
The visual information that is sensed in the visual field is sent to the brain for processing. The neural pathway is shown in the following two figures:
Figures 1a and 1b: Pathway from the visual receptors in the retina to the brain (Banich, 2004:24).
The optic nerve travels through various structures until it reaches the visual cortex. About 20% of the neural stimulus that the brain interprets as visual information originates from the structures through which the optic nerve travels. The thalamus acts as a type of relay station for almost all sensory information, including visual information. A relay station can be described as a region in the brain where neurons from one area of the brain synapse onto neurons that have their synapses in another area of the brain (Banich, 2004). The precise functioning of the thalamus is still not known; what is known is that some processing takes place in this area. The hypothalamus is situated next to the thalamus and is the part of the brain that has been linked with homeostasis. It assists the body in controlling behaviour to help the body regain its equilibrium. For example, if it is cold, the behavioural action of finding a jersey could be initiated to return the body to a comfortable state, as the hypothalamus is an activator of the autonomic nervous system. Other metabolic processes, such as thirst, hunger and fatigue, are also controlled via the hypothalamus, and much of the hypothalamus's control is unconscious to the individual. The hypothalamus is also a neurohormone producer.
The visual information undergoes some processing before being interpreted by the visual cortex of the brain. The visual cortex thus uses information that was registered on the retina as well as further information generated by various structures of the brain. Our awareness of the world is constructed from parts to give us the picture that we use for our decision making.
We do not have the ability to observe our environment perfectly, as we are limited by our biological structures and functions. Maturana and Varela (1987) describe a ground-breaking experiment whereby the eye of a newt (an amphibian of the Salamandridae family) was surgically rotated by 180 degrees. The newt thus had one eye in its normal position while the other eye was rotated 180 degrees. When the rotated eye was covered, the newt was able to catch its prey by projecting its tongue correctly in the direction of the food (a fly). When the normal eye was covered and the rotated eye exposed, the newt was unable to obtain its food, as it kept extending its tongue 180 degrees away from the direction of the food. The newt was never able to get its food; even when left for some time, it still could not.
Maturana and Varela (1987) concluded with the following statement:
This experiment reveals in a very dramatic way that, for the animal, there is no such thing as up and down, front and back, in reference to an outside world, as it appears to the observer doing the study. There is only internal correlation between the place where the retina receives a given perturbation and the muscular contractions that move the tongue, the mouth, the neck… The operation of the nervous system is an expression of its connectivity or structure of connections and that behavior arises because of the nervous system’s internal relations of activity (pp. 125-126).
Cognitive psychology experiments have shown time and time again the mistakes in human perception, and there is an abundance of research that illustrates this. There are also numerous medical conditions, such as quadrantanopia (quadrantanopsia) and scotoma among others, that involve deficits in visual perception and/or processing.
The human ear can hear from 0 dB to over 130 dB, with the latter being the pain threshold. Hearing is a subjective experience.
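The decibel scale mentioned above is logarithmic: sound pressure level (SPL) is defined relative to a reference pressure of 20 µPa, the nominal threshold of hearing. A minimal sketch of the standard SPL formula (the 63.2 Pa figure for the pain threshold is back-calculated here for illustration):

```python
import math

P_REF = 20e-6  # reference pressure in pascals (nominal threshold of hearing)

def spl_db(pressure_pa: float) -> float:
    """Sound pressure level in dB relative to 20 uPa: 20*log10(p/p_ref)."""
    return 20.0 * math.log10(pressure_pa / P_REF)

print(round(spl_db(20e-6)))  # threshold of hearing -> 0 dB
print(round(spl_db(63.2)))   # roughly the pain threshold -> 130 dB
```

This shows why the ear's range is so remarkable: the pressure at the pain threshold is over three million times that at the threshold of hearing.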
The subjective perception of sound level is not linearly related to the power radiated from a sound source. One reason is that the human ear has differing sensitivities across the frequency range. Figure 2 shows the frequency response of the human ear.
The above figure shows that the human ear does not have a uniform sensitivity to sound pressure. A further complicating factor is that our ears do not perceive equal SPL (sound pressure level) increments equally across frequencies. In particular, the low frequency range has an uneven distribution of perceived loudness when compared with the same SPL at higher frequencies. For example, a young person's ears should generally be able to hear a 1 kHz sound at an SPL of 25 dB, but would only be able to hear a low frequency sound of, say, 60 Hz if the SPL were increased by a further 30 dB. In figure 2, each contour exhibits a non-linear shape along the frequency axis, and each contour traces a slightly different shape.
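The frequency dependence can be illustrated with a toy threshold table. The 60 Hz and 1 kHz figures below come from the example in the text; the in-between values are assumptions for illustration only and are not ISO 226 data:

```python
# Toy hearing-threshold table: SPL (dB) at which a tone is just audible
# to a typical young listener. Only the 60 Hz and 1 kHz entries are taken
# from the text; the others are illustrative assumptions.
hearing_threshold_db = {60: 55, 250: 35, 1000: 25, 4000: 20}

def extra_spl_needed(freq_hz: int, ref_hz: int = 1000) -> int:
    """How many more dB a tone at freq_hz needs than the 1 kHz reference
    in order to be just audible."""
    return hearing_threshold_db[freq_hz] - hearing_threshold_db[ref_hz]

print(extra_spl_needed(60))  # -> 30: a 60 Hz tone needs ~30 dB more than 1 kHz
```

A full implementation would interpolate the measured equal-loudness contours rather than use a small lookup table.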
Another complicating factor relates to the duration for which the signal is applied. There is a difference in perceived loudness between steady-state and impulse sounds. Generally, the shorter the sound impulse (less than 70 ms), the lower its perceived loudness (Brüel & Kjær, 1984:8).
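The duration effect can be sketched as a simple correction model. The ~70 ms break point is from the text; the logarithmic roll-off rate below it is an assumption made here purely for illustration:

```python
import math

def perceived_level_db(level_db: float, duration_ms: float) -> float:
    """Toy model of impulse loudness: sounds longer than ~70 ms are
    perceived at their measured level, while shorter impulses are
    perceived as progressively quieter (assumed logarithmic roll-off)."""
    if duration_ms >= 70:
        return level_db  # steady-state: perceived as measured
    return level_db + 10 * math.log10(duration_ms / 70)

print(perceived_level_db(80, 200))  # steady sound: perceived at 80 dB
print(perceived_level_db(80, 7))    # 7 ms impulse: perceived ~10 dB lower
```

Sound level meters account for this with standardised time weightings rather than a simple formula like the one above.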
For some time, it was commonly thought that sound is mainly associated with the ear, that sound can be explained in terms of physics, and that the process of hearing can be narrowed down to mechanics. Research has repeatedly challenged any objectification of the listening experience. One fascinating study challenged the idea of a fixed upper frequency limit of 20 kHz to human hearing. Tsutomu and colleagues (1991) set up a listening experiment in which they played back a recording that contained active frequencies up to 60 kHz. They set up a speaker system that included an independently powered tweeter to excite frequencies above 26 kHz. The tweeter could be switched on or off during the test, and an EEG (electroencephalogram) was incorporated as part of the listeners' response data. The finding was that the subjective evaluation of the music was altered by whether the high-frequency tweeter was on or off, and changes to the EEG were also noticed (Tsutomu, Emi, Norie, Yoshitaka, & Hiroshi, 1991).
As the ear is only a part of the hearing chain, the study of audio processing requires studying the hearing system as a whole, including its integration with the brain. For example, the inner ear does a large amount of signal processing in converting sound waves into neural stimulus. Audio compression methods such as MP3 (derived from MPEG-1 Audio Layer 3; MPEG, Moving Picture Experts Group) take advantage of this signal processing and remove the wave data that has been found to be imperceptible to the listener in order to reduce file storage space (Ahlzen & Song, 2003:510). Thus, some parts of the sound are removed when audio is converted to MP3 format, with most listeners unable to tell. By studying the psycho-acoustical characteristics of listening, advancements in audio technology have been made successfully.
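The core idea behind such perceptual coding can be sketched very simply: spectral components that fall below an audibility threshold are discarded, shrinking the data while leaving the audible content largely intact. The threshold and spectrum values below are invented for illustration; real MP3 encoders use a far more elaborate psychoacoustic model (frequency-dependent thresholds, simultaneous and temporal masking, bit allocation):

```python
# Hypothetical sketch of the perceptual-coding idea: drop spectral
# components judged inaudible. A single fixed threshold is used here
# for simplicity; a real model varies the threshold with frequency.

def discard_inaudible(components, threshold_db):
    """Keep only the (freq_hz, level_db) components at or above threshold_db."""
    return [(f, lvl) for f, lvl in components if lvl >= threshold_db]

spectrum = [(100, 60.0), (1000, 15.0), (4000, 42.0), (12000, 8.0)]
audible = discard_inaudible(spectrum, threshold_db=20.0)
print(audible)  # the 15 dB and 8 dB components are dropped
```

Only the surviving components need to be stored, which is one source of MP3's reduction in file size.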
Generally speaking, as we age, our hearing range and sensitivity reduce. It is not uncommon for an elderly person to have lost 25% of their SPL sensitivity as well as part of their frequency range. Hearing damage to one ear also affects the perception of where sound originates.
One can go through all the sense modalities one by one, finding their limits, but this is not necessary. The point of departure is that the outside world is brought through to us in parts. These parts have characteristics which add [or subtract] in the same way a filter works: some information is lost or manipulated along the way. The meaning and understanding that our brains provide to the sensory information is also a subjective value, which further colours our world.
There are at least 17 equal loudness contour graphs to choose from. The first popularised contour mapping was presented by Fletcher and Munson in 1933, but it was later found not to be completely accurate. In 1956, Robinson and Dadson presented their contour map, which has been used extensively; the ISO 226:1987 standard was based on this loudness contour map. Further research has shown that this too was not entirely correct. While the work of both Fletcher and Munson and of Robinson and Dadson has accurate parts, a new standard, ISO 226:2003 [or TC43], has been set out which seeks to portray a more accurate loudness contour map based on 12 studies starting in 1983 (Suzuki & Takeshima, 2004; ISO, 2009).
Ahlzen, L., & Song, C. (2003). The Sound Blaster Live! Book: A complete guide to the world's most popular sound card. San Francisco, CA: No Starch Press.
Banich, M. T. (2004). Cognitive neuroscience and neuropsychology (2nd ed.). USA: Houghton Mifflin Company.
Brüel & Kjær. (1984). Measuring sound (Rev. ed.). BR0047-13. Nærum, Denmark: Brüel & Kjær.
ISO. (2009). ISO 226:2003, Acoustics — Normal equal-loudness-level contours. Retrieved 15 June 2009 from http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue
Maturana, H. R., & Varela, F. J. (1987). The tree of knowledge: The biological roots of human understanding (Rev. ed.). Boston, MA: Shambhala Publications.
Robinson, D. W., & Dadson, R. S. (1956). A re-determination of the equal-loudness relations for pure tones. British Journal of Applied Physics, 7(5), 166-181.
Suzuki, Y., & Takeshima, H. (2004). Equal-loudness-level contours for pure tones. Journal of the Acoustical Society of America, 116(2), 918-933.
Tsutomu, O., Emi, N., Norie, K., Yoshitaka, F., & Hiroshi, I. (1991). High-frequency sound above the audible range affects brain electric activity and sound perception. Audio Engineering Society, AES Convention 91 (October), Paper 3207.
P. Baron (June 2009)