I. Classical Conditioning
Classical Conditioning can be defined as a type of learning in which a stimulus acquires the capacity to evoke a reflexive response that was originally evoked by a different stimulus.
A. Ivan Pavlov - Russian physiologist interested in behavior (digestion).
1) Pavlov was studying salivation in dogs - he measured the amount of saliva produced by the dogs' salivary glands when he presented them with meat powder through a food dispenser.
2) Terminology (if you are still confused by these definitions, please look in the non-Psychology jargon glossary on the AlleyDog.com homepage):
Don't worry, we will get to some examples that make this all much clearer.
3) Basic Principles:
B. Classical Conditioning in Everyday Life
One of the great things about conditioning is that we can see it all around us. Here are some examples of classical conditioning that you may see:
1. Conditioned Fear & Anxiety - many phobias that people experience are the result of conditioning.
2. Advertising - modern advertising strategies evolved from John Watson's use of conditioning. The approach is to link an attractive US with the product being sold (the CS) so that the consumer will feel positively toward the product, just as they do toward the US.
3. A Clockwork Orange - No additional information necessary! If you haven't seen this movie or read the book, do it. You will find it very interesting, and it is a wonderful example of conditioning in action.

II. Operant Conditioning
Operant conditioning can be defined as a type of learning in which voluntary (controllable; non-reflexive) behavior is strengthened if it is reinforced and weakened if it is punished (or not reinforced). Note: this type of learning is also referred to as Instrumental Conditioning/Learning.
A. The most prominent figure in the development and study of Operant Conditioning was B. F. Skinner.
1. History:
a) As an undergraduate he was an English major; he then decided to study Psychology in graduate school.
b) Early in his career he believed much of behavior could be studied in a single, controlled environment (he created the Skinner box, addressed later). Instead of observing behavior in the natural world, he attempted to study behavior in a closed, controlled unit. This prevented any factors not under study from interfering with the study - as a result, Skinner could truly study behavior and the specific factors that influence behavior.
c) During the "cognitive revolution" that swept Psychology (discussed later), Skinner stuck to the position that behavior was not guided by inner forces or cognition. This made him a "radical behaviorist".
d) As his theories of Operant Conditioning developed, Skinner became passionate about social issues such as free will - how such ideas developed, why they developed, how they were propagated, etc.
2. Skinner's views of Operant Conditioning
a) Operant Conditioning is different from Classical Conditioning in that the behaviors studied in Classical Conditioning are reflexive (for example, salivating), while the behaviors studied and governed by the principles of Operant Conditioning are non-reflexive (for example, gambling). So, compared to Classical Conditioning, Operant Conditioning attempts to predict non-reflexive, more complex behaviors and the conditions in which they will occur. In addition, Operant Conditioning deals with behaviors that are performed so that the organism can obtain reinforcement.
b) There are many factors involved in determining whether an organism will engage in a behavior - just because there is food doesn't mean an organism will eat (time of day, last meal, etc.). So, unlike classical conditioning... (go to "c", below)
c) In Operant Conditioning, the organism has a lot of control. Just because a stimulus is presented does not necessarily mean that an organism is going to react in any specific way. Instead, reinforcement is dependent on the organism's behavior: in order for an organism to receive some type of reinforcement, the organism must behave in a specific manner. For example, you can't win at a slot machine unless several things happen - most importantly, you pull the lever. Pulling the lever is a voluntary, non-reflexive behavior that must be exhibited before reinforcement (hopefully a jackpot) can be delivered.
d) In classical conditioning, the controlling stimulus comes before the behavior; in Operant Conditioning, the controlling stimulus comes after the behavior. In Pavlov's meat powder example, the sound occurred (controlling stimulus), the dog salivated, and then the meat powder was delivered. With Operant Conditioning, the sound would occur and the dog would then have to perform some behavior in order to get the meat powder as reinforcement (like making a dog sit to receive a bone).
e) Skinner Box - a chamber in which Skinner placed animals such as rats and pigeons for study. The chamber contains a lever or key that can be pressed in order to receive reinforcement such as food or water.
* The Skinner Box made possible the Free Operant Procedure - responses can be made and recorded continuously, without the experimenter having to stop the experiment to record the responses made by the animal.
f) Shaping - an operant conditioning method for creating an entirely new behavior by using rewards to guide an organism toward the desired behavior through Successive Approximations. The organism is rewarded for each small advancement in the right direction. Once one appropriate behavior is made and rewarded, the organism is not reinforced again until it makes a further advancement, then another and another, until it is rewarded only when the entire behavior is performed.
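The shaping procedure above - reward each successive approximation, then raise the bar - can be sketched as a toy simulation. This is an illustrative model with made-up numbers, not anything from Skinner's work: a "learner" emits behaviors of varying magnitude, and each reinforced attempt raises the criterion for the next reward until the full target behavior appears.

```python
import random

def shape(target=10.0, step=1.0, trials=2000, seed=0):
    """Toy model of shaping by successive approximations.

    The learner emits a behavior of some magnitude; whenever it meets the
    current criterion, it is reinforced and the criterion is raised one
    step, until the complete target behavior is reached.
    """
    rng = random.Random(seed)
    criterion = step          # start by rewarding any small advance
    tendency = 0.0            # reinforced behaviors are repeated/extended
    for trial in range(1, trials + 1):
        behavior = tendency + rng.uniform(0.0, 2.0 * step)
        if behavior >= criterion:      # meets the current approximation
            tendency = criterion       # reinforcement strengthens it
            criterion += step          # demand the next approximation
            if tendency >= target:
                return trial           # full behavior acquired
    return None                        # never fully shaped

print(shape())  # number of trials it took to shape the full behavior
```

Because reinforcement is withheld until the behavior exceeds the ratcheting criterion, the simulated learner reaches the full 10-unit behavior in a few dozen trials, even though it would essentially never emit it spontaneously.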
3. Principles of Reinforcement
a) Skinner identified two types of reinforcing events: those in which a reward is given (positive reinforcement) and those in which something aversive is removed (negative reinforcement). In either case, the point of reinforcement is to increase the frequency or probability of a response occurring again.
b) Skinner also identified two types of reinforcers: primary reinforcers, which are innately reinforcing (such as food and water), and secondary (conditioned) reinforcers, which acquire reinforcing power through association with primary reinforcers (such as money or praise).
4. Schedules of Reinforcement
There are two types of reinforcement schedules - continuous and partial/intermittent (the partial schedule has four subtypes).
a) Fixed Ratio (FR) - reinforcement is given after every Nth response, where N is the size of the ratio (i.e., a certain number of responses have to occur before reinforcement is given).
b) Variable Ratio (VR) - the variable ratio schedule is the same as the FR schedule except that the ratio varies rather than staying fixed. Reinforcement is given after every Nth response, where N is an average.
c) Fixed Interval (FI) - a designated amount of time must pass, and then a certain response must be made in order to get reinforcement.
d) Variable Interval (VI) - same as FI but now the time interval varies.
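The schedules above differ only in what gates reinforcement - a response count (ratio schedules) or elapsed time (interval schedules), each either fixed or varying around an average. A minimal sketch (the class names are ours, and drawing the variable requirement uniformly around N is one arbitrary way to make it average N):

```python
import random

class FixedRatio:
    """FR-N: reinforce every Nth response."""
    def __init__(self, n):
        self.n, self.count = n, 0
    def respond(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True       # reinforcement delivered
        return False

class VariableRatio:
    """VR-N: reinforce after a varying number of responses averaging N."""
    def __init__(self, n, seed=0):
        self.rng, self.n = random.Random(seed), n
        self.required, self.count = self._draw(), 0
    def _draw(self):
        # uniform between 1 and 2N-1, so the mean requirement is N
        return self.rng.randint(1, 2 * self.n - 1)
    def respond(self):
        self.count += 1
        if self.count >= self.required:
            self.count, self.required = 0, self._draw()
            return True
        return False

class FixedInterval:
    """FI-T: the first response after T time units is reinforced.
    (A Variable Interval schedule would simply redraw T each time.)"""
    def __init__(self, t):
        self.t, self.last = t, 0.0
    def respond(self, now):
        if now - self.last >= self.t:
            self.last = now
            return True
        return False

fr = FixedRatio(5)
hits = [fr.respond() for _ in range(15)]
print(hits.count(True))  # 3 reinforcements from 15 responses on FR-5
```

Note that on the ratio schedules extra responses always bring reinforcement closer, while on the interval schedules responding faster earns nothing until the clock runs out - which is why ratio schedules typically produce higher response rates.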
5. Punishment - Whereas reinforcement increases the probability of a response occurring again, the premise of punishment is to decrease the frequency or probability of a response occurring again.
a) Skinner did not believe that punishment was as powerful a form of control as reinforcement, even though it is so commonly used. It is not truly the opposite of reinforcement, as he originally thought, and its effects are normally short-lived.
b) There are two types of punishment: positive punishment, in which an aversive stimulus is presented following the behavior, and negative punishment, in which a pleasant stimulus is removed following the behavior.
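Taken together with reinforcement, every operant consequence is defined by two features: whether a stimulus is added or removed, and whether the behavior then increases or decreases. The standard 2x2 taxonomy can be written as a small lookup (the function name and example pairs are ours):

```python
def classify(stimulus_change, behavior_effect):
    """Name an operant consequence from its two defining features.

    stimulus_change: "added" or "removed"      -> positive vs. negative
    behavior_effect: "increases" or "decreases" -> reinforcement vs. punishment
    """
    kind = "reinforcement" if behavior_effect == "increases" else "punishment"
    sign = "positive" if stimulus_change == "added" else "negative"
    return f"{sign} {kind}"

print(classify("added", "increases"))    # positive reinforcement (a treat for sitting)
print(classify("removed", "increases"))  # negative reinforcement (seatbelt alarm stops)
print(classify("added", "decreases"))    # positive punishment (a scolding)
print(classify("removed", "decreases"))  # negative punishment (losing screen time)
```

The table makes the common confusion explicit: "positive" and "negative" refer only to adding or removing a stimulus, never to whether the consequence is pleasant.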
6. Applications of Operant Conditioning
a) In the Classroom
Skinner thought that our education system was ineffective. He suggested that one teacher in a classroom could not teach many students adequately when each child learns at a different rate. He proposed using teaching machines (what we now call computers) that would allow each student to move at their own pace. The teaching machine would provide self-paced learning with immediate feedback, immediate reinforcement, identification of problem areas, etc., that a teacher could not possibly provide.
b) In the Workplace
I already gave the example of piece work in factories. Another example is a study by Pedalino & Gamboa (1974). To help reduce the frequency of employee tardiness, the researchers implemented a game-like system for all employees who arrived on time: each day an employee arrived on time, they were allowed to draw a card, so over the course of a 5-day workweek a punctual employee would hold a full poker hand. At the end of the week, the best hand won $20. This simple method reduced employee tardiness significantly and demonstrated the effectiveness of operant conditioning on humans. There are also many clinical uses, including Ivar Lovaas' method of teaching autistic children how to speak (see your book).

What is it called when we learn not to respond to a stimulus?
Habituation occurs when we learn not to respond to a stimulus that is presented repeatedly without change. As the stimulus occurs over and over, we learn not to focus our attention on it.
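The decline in responding that habituation describes can be modeled crudely as a response that shrinks by a fixed factor on each unchanged presentation (the decay factor here is arbitrary, purely for illustration):

```python
def habituate(presentations, decay=0.6, initial=1.0):
    """Response strength to a repeated, unchanging stimulus: each
    presentation multiplies the remaining response by a decay factor."""
    strength = initial
    history = []
    for _ in range(presentations):
        history.append(round(strength, 3))
        strength *= decay
    return history

print(habituate(5))  # [1.0, 0.6, 0.36, 0.216, 0.13]
```

The steady geometric decline captures the key point: the stimulus has not changed at all, but the organism's attention to it fades with every repetition.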
What is the process by which conditioned stimuli lose their ability to evoke conditioned responses?
Extinction is the process by which conditioned stimuli lose the ability to elicit conditioned responses because the conditioned stimuli are no longer associated with the unconditioned stimuli. Extinction helps organisms adapt to environmental changes.
What is the stimulus that naturally evokes an unlearned response?
The unconditioned stimulus is the stimulus that naturally evokes an unlearned (unconditioned) response. In our example, the smell of food is the unconditioned stimulus, and the feeling of hunger it evokes is the unconditioned response.
What is the process of learning to respond to certain stimuli?
Conditioning is the process of learning these associations. There are two types of conditioning: classical conditioning and operant conditioning. Classical conditioning is a learning process in which a neutral stimulus becomes associated with a meaningful stimulus and acquires the capacity to elicit a similar response.
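The acquisition and extinction processes defined in these answers can be captured in a toy associative-strength update, in the spirit of standard learning-theory models (a simplified, Rescorla-Wagner-style rule; the learning rate is arbitrary): strength climbs toward a maximum while the CS is paired with the US, and decays toward zero when the CS is presented alone.

```python
def trial(strength, us_present, rate=0.3, max_strength=1.0):
    """One conditioning trial: nudge CS-US associative strength toward
    its target (the maximum if the US follows the CS, zero if it does not)."""
    target = max_strength if us_present else 0.0
    return strength + rate * (target - strength)

s = 0.0
for _ in range(10):              # acquisition: CS paired with US
    s = trial(s, us_present=True)
print(round(s, 2))               # 0.97 -> strong conditioned response
for _ in range(10):              # extinction: CS presented alone
    s = trial(s, us_present=False)
print(round(s, 2))               # 0.03 -> response extinguished
```

The same update rule produces both curves: learning is fast at first and levels off, and extinction mirrors it in reverse once the CS stops predicting the US.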