Perception (from the Latin perceptio) is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information, or the environment.
All perception involves signals that go through the nervous system, which in turn result from physical or chemical stimulation of the sensory system. For example, vision involves light striking the retina of the eye, smell is mediated by odor molecules, and hearing involves pressure waves.
Perception is not only the passive receipt of these signals; it is also shaped by the recipient's learning, memory, expectation, and attention.
Perception can be split into two processes: (1) processing the sensory input, which transforms low-level information into higher-level information (e.g., extracting shapes for object recognition), and (2) processing connected with a person's concepts and expectations (or knowledge), and with restorative and selective mechanisms (such as attention) that influence perception.
Perception depends on complex functions of the nervous system, but subjectively seems mostly effortless because this processing happens outside conscious awareness.
Since the rise of experimental psychology in the 19th Century, psychology’s understanding of perception has progressed by combining a variety of techniques. Psychophysics quantitatively describes the relationships between the physical qualities of the sensory input and perception. Sensory neuroscience studies the neural mechanisms underlying perception. Perceptual systems can also be studied computationally, in terms of the information they process. Perceptual issues in philosophy include the extent to which sensory qualities such as sound, smell or color exist in objective reality rather than in the mind of the perceiver.
Although the senses were traditionally viewed as passive receptors, the study of illusions and ambiguous images has demonstrated that the brain’s perceptual systems actively and pre-consciously attempt to make sense of their input. There is still active debate about the extent to which perception is an active process of hypothesis testing, analogous to science, or whether realistic sensory information is rich enough to make this process unnecessary.
The perceptual systems of the brain enable individuals to see the world around them as stable, even though the sensory information is typically incomplete and rapidly varying. Human and animal brains are structured in a modular way, with different areas processing different kinds of sensory information. Some of these modules take the form of sensory maps, mapping some aspect of the world across part of the brain’s surface. These different modules are interconnected and influence each other. For instance, taste is strongly influenced by smell.
Process and terminology
The process of perception begins with an object in the real world, termed the distal stimulus or distal object. By means of light, sound or another physical process, the object stimulates the body’s sensory organs. These sensory organs transform the input energy into neural activity—a process called transduction. This raw pattern of neural activity is called the proximal stimulus. These neural signals are transmitted to the brain and processed. The resulting mental re-creation of the distal stimulus is the percept.
An example would be a shoe. The shoe itself is the distal stimulus. When light from the shoe enters a person’s eye and stimulates the retina, that stimulation is the proximal stimulus. The image of the shoe reconstructed by the brain of the person is the percept. Another example would be a telephone ringing. The ringing of the telephone is the distal stimulus. The sound stimulating a person’s auditory receptors is the proximal stimulus, and the brain’s interpretation of this as the ringing of a telephone is the percept. The different kinds of sensation such as warmth, sound, and taste are called sensory modalities.
Psychologist Jerome Bruner has developed a model of perception. According to him, people go through the following process to form opinions:
(1) When we encounter an unfamiliar target, we are open to different informational cues and want to learn more about the target.
(2) In the second step, we try to collect more information about the target. Gradually, we encounter some familiar cues which help us categorize the target.
(3) At this stage, we become less open and more selective. We search for further cues that confirm the categorization of the target, and we actively ignore and even distort cues that violate our initial perceptions. Our perception becomes more selective and we finally paint a consistent picture of the target.
According to Alan Saks and Gary Johns, there are three components to perception.
The Perceiver, the person who becomes aware of something and comes to a final understanding. Three factors can influence his or her perceptions: experience, motivational state, and emotional state. In different motivational or emotional states, the perceiver will react to or perceive something in different ways. Also, in different situations he or she might employ a "perceptual defence", tending to "see what they want to see".
The Target. This is the person who is being perceived or judged. “Ambiguity or lack of information about a target leads to a greater need for interpretation and addition.”
The Situation also greatly influences perceptions because different situations may call for additional information about the target.
Stimuli are not necessarily translated into a percept, and rarely does a single stimulus translate into a percept. An ambiguous stimulus may be translated into multiple percepts, experienced randomly, one at a time, in what is called multistable perception. And the same stimuli, or absence of them, may result in different percepts depending on the subject's culture and previous experiences. Ambiguous figures demonstrate that a single stimulus can result in more than one percept; for example, the Rubin vase, which can be interpreted either as a vase or as two faces. The percept can bind sensations from multiple senses into a whole. A picture of a talking person on a television screen, for example, is bound to the sound of speech from speakers to form a percept of a talking person. "Percept" is also a term used by Leibniz, Bergson, Deleuze, and Guattari to define perception independent from perceivers.
In the case of visual perception, some people can actually see the percept shift in their mind's eye. Others, who are not picture thinkers, may not necessarily perceive the "shape-shifting" as their world changes. This "esemplastic" nature has been demonstrated by experiment: an ambiguous image admits multiple interpretations at the perceptual level.
This confusing ambiguity of perception is exploited in human technologies such as camouflage, and also in biological mimicry, for example by European peacock butterflies, whose wings bear eyespots that birds respond to as though they were the eyes of a dangerous predator.
There is also evidence that the brain in some ways operates on a slight “delay”, to allow nerve impulses from distant parts of the body to be integrated into simultaneous signals.
Perception is one of the oldest fields in psychology. The oldest quantitative laws in psychology are Weber's law, which states that the smallest noticeable difference in stimulus intensity is proportional to the intensity of the reference, and Fechner's law, which quantifies the relationship between the intensity of the physical stimulus and its perceptual counterpart (for example, testing how much darker a computer screen can get before the viewer actually notices). The study of perception gave rise to the Gestalt school of psychology, with its emphasis on a holistic approach.
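The two laws just described can be written out concretely. Below is a minimal sketch in Python; the constant k is purely illustrative, not an empirical Weber fraction, and the function names are assumptions made for the example:

```python
import math

def weber_jnd(intensity, k=0.05):
    """Weber's law: the just-noticeable difference (JND) is
    proportional to the reference intensity, delta_I = k * I.
    (k here is illustrative, not a measured value.)"""
    return k * intensity

def fechner_sensation(intensity, i0=1.0, k=1.0):
    """Fechner's law: perceived magnitude grows with the logarithm
    of physical intensity, S = k * ln(I / I0), where I0 is the
    detection threshold."""
    return k * math.log(intensity / i0)

# A more intense reference needs a larger change before it is noticed:
print(weber_jnd(10))    # 0.5
print(weber_jnd(100))   # 5.0

# Under Fechner's law, equal intensity *ratios* map to equal
# sensation *differences* (both differences below are ln(10)):
print(fechner_sensation(10) - fechner_sensation(1))
print(fechner_sensation(100) - fechner_sensation(10))
```

Note how the logarithmic form of Fechner's law follows from integrating Weber's law: if each JND corresponds to a fixed ratio step in intensity, accumulated sensation grows as the log of intensity.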
Perceptual constancy is the ability of perceptual systems to recognize the same object from widely varying sensory inputs. For example, individual people can be recognized from views, such as frontal and profile, which form very different shapes on the retina. A coin looked at face-on makes a circular image on the retina, but when held at an angle it makes an elliptical image. In normal perception these are recognized as a single three-dimensional object. Without this correction process, an animal approaching from a distance would appear to gain in size. One kind of perceptual constancy is color constancy: for example, a white piece of paper can be recognized as such under different colors and intensities of light. Another example is roughness constancy: when a hand is drawn quickly across a surface, the touch nerves are stimulated more intensely. The brain compensates for this, so the speed of contact does not affect the perceived roughness. Other constancies include melody, odor, brightness and words. These constancies are not always total, but the variation in the percept is much less than the variation in the physical stimulus. The perceptual systems of the brain achieve perceptual constancy in a variety of ways, each specialized for the kind of information being processed, with phonemic restoration as a notable example from hearing.
The principles of grouping (or Gestalt laws of grouping) are a set of principles in psychology, first proposed by Gestalt psychologists to explain how humans naturally perceive objects as organized patterns and objects. Gestalt psychologists argued that these principles exist because the mind has an innate disposition to perceive patterns in the stimulus based on certain rules. These principles are organized into six categories: proximity, similarity, closure, good continuation, common fate and good form.
The principle of proximity states that, all else being equal, perception tends to group stimuli that are close together as part of the same object, and stimuli that are far apart as two separate objects. The principle of similarity states that, all else being equal, perception lends itself to seeing stimuli that physically resemble each other as part of the same object, and stimuli that are different as part of a different object. This allows for people to distinguish between adjacent and overlapping objects based on their visual texture and resemblance. The principle of closure refers to the mind’s tendency to see complete figures or forms even if a picture is incomplete, partially hidden by other objects, or if part of the information needed to make a complete picture in our minds is missing. For example, if part of a shape’s border is missing people still tend to see the shape as completely enclosed by the border and ignore the gaps. The principle of good continuation makes sense of stimuli that overlap: when there is an intersection between two or more objects, people tend to perceive each as a single uninterrupted object. The principle of common fate groups stimuli together on the basis of their movement. When visual elements are seen moving in the same direction at the same rate, perception associates the movement as part of the same stimulus. This allows people to make out moving objects even when other details, such as color or outline, are obscured. The principle of good form refers to the tendency to group together forms of similar shape, pattern, color, etc. Later research has identified additional grouping principles.
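The proximity principle lends itself to a toy computational sketch: elements separated by less than some threshold fall into the same perceptual group, and a larger gap starts a new group. The function name and threshold below are illustrative assumptions, not part of Gestalt theory itself:

```python
def group_by_proximity(positions, gap=2.0):
    """Group sorted 1-D positions: elements within `gap` of the
    previous element join its group; a larger jump opens a new group."""
    positions = sorted(positions)
    groups = [[positions[0]]]
    for p in positions[1:]:
        if p - groups[-1][-1] <= gap:
            groups[-1].append(p)   # close enough: same perceptual group
        else:
            groups.append([p])     # large gap: start a new group
    return groups

# Dots at 0, 1, 2 and 10, 11 are seen as two clusters, not five dots:
print(group_by_proximity([0, 1, 2, 10, 11]))  # [[0, 1, 2], [10, 11]]
```

This is of course a caricature: human grouping depends on relative spacing, context, and competing cues such as similarity and common fate, not a fixed threshold.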
A common finding across many different kinds of perception is that the perceived qualities of an object can be affected by the qualities of context. If one object is extreme on some dimension, then neighboring objects are perceived as further away from that extreme. “Simultaneous contrast effect” is the term used when stimuli are presented at the same time, whereas “successive contrast” applies when stimuli are presented one after another.
The contrast effect was noted by the 17th-century philosopher John Locke, who observed that lukewarm water can feel hot or cold depending on whether the hand touching it was previously in hot or cold water. In the early 20th century, Wilhelm Wundt identified contrast as a fundamental principle of perception, and since then the effect has been confirmed in many different areas. These effects shape not only visual qualities like color and brightness, but other kinds of perception, including how heavy an object feels. One experiment found that thinking of the name "Hitler" led subjects to rate a person as more hostile. Whether a piece of music is perceived as good or bad can depend on whether the music heard before it was pleasant or unpleasant. For the effect to work, the objects being compared need to be similar to each other: a television reporter can seem smaller when interviewing a tall basketball player, but not when standing next to a tall building. In the brain, brightness contrast exerts effects on both neuronal firing rates and neuronal synchrony.
Effect of experience
With experience, organisms can learn to make finer perceptual distinctions, and learn new kinds of categorization. Wine-tasting, the reading of X-ray images and music appreciation are applications of this process in the human sphere. Research has focused on the relation of this to other kinds of learning, and whether it takes place in peripheral sensory systems or in the brain's processing of sense information. Empirical research shows that specific practices (such as yoga, mindfulness, Tai Chi, meditation, Daoshi and other mind-body disciplines) can modify human perceptual modality. Specifically, these practices shift perceptual skill away from the external (exteroceptive) field towards a greater ability to focus on internal signals (proprioception). Also, when asked to provide verticality judgments, highly self-transcendent yoga practitioners were significantly less influenced by a misleading visual context. Increasing self-transcendence may enable yoga practitioners to optimize verticality judgment tasks by relying more on internal (vestibular and proprioceptive) signals coming from their own body, rather than on exteroceptive, visual cues.
Effect of motivation and expectation
A perceptual set, also called perceptual expectancy or just set is a predisposition to perceive things in a certain way. It is an example of how perception can be shaped by “top-down” processes such as drives and expectations. Perceptual sets occur in all the different senses. They can be long term, such as a special sensitivity to hearing one’s own name in a crowded room, or short term, as in the ease with which hungry people notice the smell of food. A simple demonstration of the effect involved very brief presentations of non-words such as “sael”. Subjects who were told to expect words about animals read it as “seal”, but others who were expecting boat-related words read it as “sail”.
Sets can be created by motivation and so can result in people interpreting ambiguous figures so that they see what they want to see. For instance, how someone perceives what unfolds during a sports game can be biased if they strongly support one of the teams. In one experiment, students were allocated to pleasant or unpleasant tasks by a computer. They were told that either a number or a letter would flash on the screen to say whether they were going to taste an orange juice drink or an unpleasant-tasting health drink. In fact, an ambiguous figure was flashed on screen, which could either be read as the letter B or the number 13. When the letters were associated with the pleasant task, subjects were more likely to perceive a letter B, and when letters were associated with the unpleasant task they tended to perceive a number 13.
Perceptual set has been demonstrated in many social contexts. People who are primed to think of someone as "warm" are more likely to perceive a variety of positive characteristics in them than if the word "warm" is replaced by "cold". When someone has a reputation for being funny, an audience is more likely to find them amusing. Individuals' perceptual sets reflect their own personality traits. For example, people with an aggressive personality are quicker to correctly identify aggressive words or situations.
One classic psychological experiment showed slower reaction times and less accurate answers when a deck of playing cards reversed the color of the suit symbol for some cards (e.g. red spades and black hearts).
Philosopher Andy Clark explains that perception, although it occurs quickly, is not simply a bottom-up process (where minute details are put together to form larger wholes). Instead, our brains use what he calls ‘predictive coding’. It starts with very broad constraints and expectations for the state of the world, and as expectations are met, it makes more detailed predictions (errors lead to new predictions, or learning processes). Clark says this research has various implications; not only can there be no completely “unbiased, unfiltered” perception, but this means that there is a great deal of feedback between perception and expectation (perceptual experiences often shape our beliefs, but those perceptions were based on existing beliefs). Indeed, predictive coding provides an account where this type of feedback assists in stabilizing our inference-making process about the physical world, such as with perceptual constancy examples.
Perception as direct perception
Cognitive theories of perception assume there is a poverty of stimulus. This (with reference to perception) is the claim that sensations are, by themselves, unable to provide a unique description of the world. Sensations require ‘enriching’, which is the role of the mental model. A different type of theory is the perceptual ecology approach of James J. Gibson. Gibson rejected the assumption of a poverty of stimulus by rejecting the notion that perception is based upon sensations – instead, he investigated what information is actually presented to the perceptual systems. His theory “assumes the existence of stable, unbounded, and permanent stimulus-information in the ambient optic array. And it supposes that the visual system can explore and detect this information. The theory is information-based, not sensation-based.” He and the psychologists who work within this paradigm detailed how the world could be specified to a mobile, exploring organism via the lawful projection of information about the world into energy arrays. “Specification” would be a 1:1 mapping of some aspect of the world into a perceptual array; given such a mapping, no enrichment is required and perception is direct perception.
An ecological understanding of perception derived from Gibson’s early work is that of “perception-in-action”, the notion that perception is a requisite property of animate action; that without perception, action would be unguided, and without action, perception would serve no purpose. Animate actions require both perception and motion, and perception and movement can be described as “two sides of the same coin, the coin is action”. Gibson works from the assumption that singular entities, which he calls “invariants”, already exist in the real world and that all that the perception process does is to home in upon them. A view known as constructivism (held by such philosophers as Ernst von Glasersfeld) regards the continual adjustment of perception and action to the external input as precisely what constitutes the “entity”, which is therefore far from being invariant.
Glasersfeld considers an “invariant” as a target to be homed in upon, and a pragmatic necessity to allow an initial measure of understanding to be established prior to the updating that a statement aims to achieve. The invariant does not and need not represent an actuality, and Glasersfeld describes it as extremely unlikely that what is desired or feared by an organism will never suffer change as time goes on. This social constructionist theory thus allows for a needful evolutionary adjustment.
A mathematical theory of perception-in-action has been devised and investigated in many forms of controlled movement, and has been described in many different species of organism using the General Tau Theory. According to this theory, tau information, or time-to-goal information is the fundamental ‘percept’ in perception.
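The central quantity of General Tau Theory can be illustrated concretely: the tau of a gap is the gap's current size divided by its rate of closure, which approximates the time remaining until the gap closes. The sketch below is a minimal illustration under that definition; the function and parameter names are assumptions made for the example:

```python
def tau(gap, gap_rate):
    """Time-to-closure of a gap: tau = x / (-x_dot), where x is the
    current gap size and x_dot < 0 is the rate at which it is closing."""
    if gap_rate >= 0:
        raise ValueError("gap is not closing")
    return gap / -gap_rate

# An observer 12 m from a wall, closing at 3 m/s, has 4 s to contact:
print(tau(12.0, -3.0))  # 4.0
```

The theoretical point is that tau is available directly from the sensory flow (e.g., from the relative rate of optical expansion) without the organism needing to estimate distance and speed separately.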
Evolutionary psychology (EP) and perception
Many philosophers, such as Jerry Fodor, write that the purpose of perception is knowledge, but evolutionary psychologists hold that its primary purpose is to guide action. For example, they say, depth perception seems to have evolved not to help us know the distances to other objects but rather to help us move around in space. Evolutionary psychologists say that animals from fiddler crabs to humans use eyesight for collision avoidance, suggesting that vision is basically for directing action, not providing knowledge.
Building and maintaining sense organs is metabolically expensive, so these organs evolve only when they improve an organism’s fitness. More than half the brain is devoted to processing sensory information, and the brain itself consumes roughly one-fourth of one’s metabolic resources, so the senses must provide exceptional benefits to fitness. Perception accurately mirrors the world; animals get useful, accurate information through their senses.
Scientists who study perception and sensation have long understood the human senses as adaptations. Depth perception consists of processing over half a dozen visual cues, each of which is based on a regularity of the physical world. Vision evolved to respond to the narrow range of electromagnetic energy that is plentiful and that does not pass through objects. Sound waves provide useful information about the sources of and distances to objects, with larger animals making and hearing lower-frequency sounds and smaller animals making and hearing higher-frequency sounds. Taste and smell respond to chemicals in the environment that were significant for fitness in the environment of evolutionary adaptedness. The sense of touch is actually many senses, including pressure, heat, cold, tickle, and pain. Pain, while unpleasant, is adaptive. An important adaptation for senses is range shifting, by which the organism becomes temporarily more or less sensitive to sensation. For example, one’s eyes automatically adjust to dim or bright ambient light. Sensory abilities of different organisms often coevolve, as is the case with the hearing of echolocating bats and that of the moths that have evolved to respond to the sounds that the bats make.
Evolutionary psychologists claim that perception demonstrates the principle of modularity, with specialized mechanisms handling particular perception tasks. For example, people with damage to a particular part of the brain suffer from the specific defect of not being able to recognize faces (prosopagnosia). EP suggests that this indicates a so-called face-reading module.
Theories of perception
Empirical theories of perception
Anne Treisman’s feature integration theory
Interactive activation and competition
Irving Biederman’s recognition by components theory
A sensory system is a part of the nervous system responsible for processing sensory information. A sensory system consists of sensory receptors, neural pathways, and parts of the brain involved in sensory perception. Commonly recognized sensory systems are those for vision, hearing, somatic sensation (touch), taste and olfaction (smell). It has been suggested that the immune system is an overlooked sensory modality. In short, senses are transducers from the physical world to the realm of the mind.
The receptive field is the specific part of the world to which a receptor organ and receptor cells respond. For instance, the part of the world an eye can see is its receptive field; the light that each rod or cone can see is its receptive field. To date, receptive fields have been identified for the visual, auditory, and somatosensory systems. Research attention is currently focused not only on external perception processes, but also on interoception, considered as the process of receiving, accessing and appraising internal bodily signals. Maintaining desired physiological states is critical for an organism's well-being and survival. Interoception is an iterative process, requiring the interplay between perception of body states and awareness of these states to generate proper self-regulation. Afferent sensory signals continuously interact with higher-order cognitive representations of goals, history, and environment, shaping emotional experience and motivating regulatory behavior.
In many ways, vision is the primary human sense. Light is taken in through each eye and focused in a way which sorts it on the retina according to direction of origin. A dense surface of photosensitive cells, including rods, cones, and intrinsically photosensitive retinal ganglion cells, captures information about the intensity, color, and position of incoming light. Some processing of texture and movement occurs within the neurons of the retina before the information is sent to the brain. In total, about 15 different types of information are then forwarded to the brain proper via the optic nerve.
Hearing (or audition) is the ability to perceive sound by detecting vibrations. Frequencies capable of being heard by humans are called audio or sonic; the range is typically considered to be between 20 Hz and 20,000 Hz. Frequencies higher than audio are referred to as ultrasonic, while frequencies below audio are referred to as infrasonic. The auditory system includes the outer ears, which collect and filter sound waves; the middle ear, which transforms the sound pressure (impedance matching); and the inner ear, which produces neural signals in response to the sound. These signals travel via the ascending auditory pathway to the primary auditory cortex within the temporal lobe of the human brain, where the auditory information arrives in the cerebral cortex and is further processed.
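The frequency bands named above can be captured in a few lines. A simple sketch using the conventional 20 Hz to 20,000 Hz bounds (the function name is an assumption for the example):

```python
def classify_frequency(hz):
    """Classify a frequency against the conventional human audible
    range of roughly 20 Hz to 20,000 Hz."""
    if hz < 20:
        return "infrasonic"
    if hz > 20_000:
        return "ultrasonic"
    return "sonic"

print(classify_frequency(440))     # "sonic" (concert pitch A4)
print(classify_frequency(5))       # "infrasonic"
print(classify_frequency(40_000))  # "ultrasonic"
```

The bounds are nominal: the upper limit in particular varies between individuals and declines with age.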
Sound does not usually come from a single source: in real situations, sounds from multiple sources and directions are superimposed as they arrive at the ears. Hearing involves the computationally complex task of separating out the sources of interest, often estimating their distance and direction as well as identifying them.
Haptic perception is the process of recognizing objects through touch. It involves a combination of somatosensory perception of patterns on the skin surface (e.g., edges, curvature, and texture) and proprioception of hand position and conformation. People can rapidly and accurately identify three-dimensional objects by touch. This involves exploratory procedures, such as moving the fingers over the outer surface of the object or holding the entire object in the hand. Haptic perception relies on the forces experienced during touch.
Gibson defined the haptic system as “The sensibility of the individual to the world adjacent to his body by use of his body”. Gibson and others emphasized the close link between haptic perception and body movement: haptic perception is active exploration. The concept of haptic perception is related to the concept of extended physiological proprioception according to which, when using a tool such as a stick, perceptual experience is transparently transferred to the end of the tool.
Taste (or, the more formal term, gustation) is the ability to perceive the flavor of substances including, but not limited to, food. Humans receive tastes through sensory organs called taste buds, or gustatory calyculi, concentrated on the upper surface of the tongue. The human tongue has 100 to 150 taste receptor cells on each of its roughly ten thousand taste buds. There are five primary tastes: sweetness, bitterness, sourness, saltiness, and umami. Other tastes can be mimicked by combining these basic tastes. The recognition and awareness of umami is a relatively recent development in Western cuisine. The basic tastes contribute only partially to the sensation and flavor of food in the mouth; other factors include smell, detected by the olfactory epithelium of the nose; texture, detected through a variety of mechanoreceptors, muscle nerves, etc.; and temperature, detected by thermoreceptors. All basic tastes are classified as either appetitive or aversive, depending on whether the things they sense are beneficial or harmful.
Social perception is the part of perception that allows people to understand the individuals and groups of their social world, and thus an element of social cognition.
Speech perception is the process by which spoken languages are heard, interpreted and understood. Research in speech perception seeks to understand how human listeners recognize speech sounds and use this information to understand spoken language. The sound of a word can vary widely according to words around it and the tempo of the speech, as well as the physical characteristics, accent and mood of the speaker. Listeners manage to perceive words across this wide range of different conditions. Another variation is that reverberation can make a large difference in sound between a word spoken from the far side of a room and the same word spoken up close. Experiments have shown that people automatically compensate for this effect when hearing speech.
The process of perceiving speech begins at the level of the sound within the auditory signal and the process of audition. The initial auditory signal is compared with visual information — primarily lip movement — to extract acoustic cues and phonetic information. It is possible other sensory modalities are integrated at this stage as well. This speech information can then be used for higher-level language processes, such as word recognition.
Speech perception is not necessarily uni-directional. That is, higher-level language processes connected with morphology, syntax, or semantics may interact with basic speech perception processes to aid in recognition of speech sounds. It may be neither necessary nor even possible for a listener to recognize phonemes before recognizing higher units, such as words. In one experiment, Richard M. Warren replaced one phoneme of a word with a cough-like sound. His subjects restored the missing speech sound perceptually without any difficulty; what is more, they were not able to identify accurately which phoneme had been disturbed.
Facial perception refers to cognitive processes specialized for handling human faces, including perceiving the identity of an individual, and facial expressions such as emotional cues.
The somatosensory cortex encodes incoming sensory information from receptors all over the body. Affective touch is a type of sensory information that elicits an emotional reaction and is usually social in nature, such as a physical human touch. This type of information is actually coded differently than other sensory information. Intensity of affective touch is still encoded in the primary somatosensory cortex, but the feeling of pleasantness associated with affective touch activates the anterior cingulate cortex more than the primary somatosensory cortex. Functional magnetic resonance imaging (fMRI) data shows that increased blood-oxygen-level-dependent (BOLD) signal in the anterior cingulate cortex as well as the prefrontal cortex is highly correlated with pleasantness scores of an affective touch. Inhibitory transcranial magnetic stimulation (TMS) of the primary somatosensory cortex inhibits the perception of affective touch intensity, but not affective touch pleasantness. Therefore, the S1 is not directly involved in processing socially affective touch pleasantness, but still plays a role in discriminating touch location and intensity.
Other senses enable perception of body balance, acceleration, gravity, position of body parts, temperature, pain, time, and perception of internal senses such as suffocation, gag reflex, intestinal distension, fullness of rectum and urinary bladder, and sensations felt in the throat and lungs.
Source: Wikipedia