
The Basics of Operant Conditioning

Author: Sophia
what's covered
This tutorial will cover another behaviorist approach to learning: operant conditioning. You will learn about:

  1. Operant Conditioning
  2. Operant Conditioning Process
  3. Complex Behavior


1. Operant Conditioning

Behaviorism studies psychology in terms of a person's interactions with the environment and the behaviors produced in response to those environments and situations. Operant conditioning is learning that occurs through the association of consequences with a person's behaviors.

Operant conditioning is different from classical conditioning because classical conditioning examines what happens before an action occurs. It looks at the stimulus that leads to the response, and the learner is essentially a passive agent in the process, taking in stimuli from the environment and producing behaviors.

In contrast, operant conditioning examines what happens after an action, as a result of that action and the situation in which it occurred. Operant conditioning focuses on how consequences make an action more or less likely to occur again later on:

  • If something pleasant happens after a behavior occurs, then the connection becomes stronger between that behavior and the situation around it. An individual is more likely to do that behavior again in the same kind of situation, because it led to something that is pleasant.
  • Conversely, if something unpleasant happens as a result of the action and the situation, then the opposite occurs: the connection becomes weaker, and a person is less likely to perform that action again.

This is what Edward Thorndike referred to as the Law of Effect.

Thorndike was one of the first behaviorist thinkers to study operant conditioning, but most agree that the most famous and well-known proponent of operant conditioning is B.F. Skinner.

Skinner was an American psychologist who worked primarily from the 1950s to the 1970s. He was what is known as a "radical behaviorist," meaning he thought that all of psychology and mental states are entirely the result of behaviors, which are determined by the environment around us. He believed that free will is an illusion, and that we're actually just constantly responding to the things around us.

Now, you might agree or disagree with what he thought, but the results of his studies and his work on conditioning have been very informative in psychology.

terms to know
Operant Conditioning
Learning that occurs through the association of consequences to behaviors
Law of Effect
Edward Thorndike’s concept that if the consequences of our actions are pleasant, we are likely to repeat the action (strengthening the association), and if the consequences are unpleasant, we are not likely to repeat the action (weakening the association)
people to know
B.F. Skinner
American psychologist who worked primarily with animals from the 1950s to the 1970s


2. Operant Conditioning Process

Skinner elaborated on Thorndike's Law of Effect, developing several important concepts, including the ideas of the operant reinforcer and the punisher.

An operant reinforcer is anything that follows a response and makes it more likely for that response to be repeated. Recall that Thorndike said that if something pleasant happens after a behavior, a person is more likely to repeat the behavior; those pleasant things are what we would call reinforcers.

On the other hand, a punisher is anything that makes it less likely for an action to be repeated, which corresponds to what Thorndike described as an unpleasant experience following a behavior.

Skinner demonstrated these ideas about the operant conditioning process through animal experiments using his invention, known as the Skinner box, which is also referred to as the operant conditioning chamber.

The box was a simple, bare box in a laboratory with a lever or a button inside of it, as well as a small place to dispense food. Sometimes, Skinner boxes would be modified to include lights, speakers, or other similar devices, but the simplest form is just the box, a lever or button, and a place to dispense food.

The animal was put inside and discovered that whenever it pushed the lever or the button, it would get food. The rate of lever pressing was recorded, showing how often the press occurred over a period of time.

Skinner found that, over time, the rate at which the lever was being pushed would go up. The animal would push the lever more often as a result of getting the reinforcer of the food.

big idea
The Skinner box experiments showed that the consequence of the behavior makes it more likely for that behavior to occur again.

terms to know
Operant Reinforcer
Anything that follows a response and makes it more likely for that response to be repeated
Skinner Box (chamber)
A simple box with a lever/button and a place to dispense food, in which an animal is placed and the rate of lever pressing is recorded; a.k.a. an operant conditioning chamber


3. Complex Behavior

More complex behaviors can also be learned through operant conditioning, but not necessarily all at once. This is what Skinner referred to as shaping.

Shaping is the process of gradually building responses into a desired behavior. It involves transforming behavior over time toward the response that is being targeted as the end result. This can be achieved, for example, by using the Skinner box and changing the way that the food is dispensed:

EXAMPLE

Suppose that a pigeon at first gets food only when it turns its head to the left and then pecks; every time it turns its head to the left and pecks, it gets food. After that behavior is learned, you start to change the rule so that the bird gets food whenever it turns its whole body to the left. Eventually, through these gradual changes, you can shape the behavior so that the pigeon turns all the way around in a circle before it gets food.

You'll find that the behavior is repeated because the reinforcer causes it to continue to occur and to become more likely over time.

Behaviors can also be linked together in response to a single reinforcer. This is called response chaining, in which one reinforcer leads to multiple actions performed in succession. In this sense, the concept helps to explain how people work toward long-term goals.

EXAMPLE

Someone might think, "If I come to class each day, then finish this project, and then present it, I'll get a good grade in my class." Multiple behaviors lead to one reinforcer at the very end.

Now, if the reinforcer is taken away, the behavior will eventually disappear altogether. This is what's called operant extinction. However, behaviors don't disappear as soon as the reinforcer is taken away; sometimes, the behavior will actually increase right before it reaches extinction.

EXAMPLE

This is why certain negative behaviors in children will increase for a short period of time when they're being ignored. If a child starts yelling for their parents' attention but doesn't get a response, instead of immediately stopping their yelling (because they're not receiving that reinforcer of attention), they might start yelling even louder. Eventually, though, the behavior will disappear in the absence of that reinforcer.

terms to know
Shaping
Gradual learning of responses into a desired behavior
Response Chaining
Behaviors are linked together in response to a single reinforcement
Operant Extinction
The relationship between the consequences of a behavior and the behavior itself disappears, so a person no longer performs the behavior in that situation


summary
Operant conditioning comes from the field of behaviorism. It is related to, but different from, classical conditioning because it looks at the way behaviors are reinforced after they are performed. The operant conditioning process was shown in Skinner's studies with the Skinner box. Through the process of operant conditioning, many complex behaviors can be better understood.

Good luck!

Source: This work is adapted from Sophia author Erick Taggart.

Terms to Know
Law of Effect

Edward Thorndike’s concept that if the consequences of our actions are pleasant, we are likely to repeat the action (strengthening the association), and if the consequences are unpleasant, we are not likely to repeat the action (weakening the association).

Operant Conditioning

Learning that occurs through the association of consequences to behaviors.

Operant Extinction

The relationship between the consequences of a behavior and the behavior itself disappears, so a person no longer performs the behavior in that situation.

Operant Reinforcer

Anything that follows a response and makes it more likely for that response to be repeated.

Response Chaining

Behaviors are linked together in response to a single reinforcement.

Shaping

Gradual learning of responses into a desired behavior.

Skinner Box (chamber)

A simple box with a lever/button and a place to dispense food, in which an animal is placed and the rate of lever pressing is recorded; a.k.a. an operant conditioning chamber.

People to Know
B. F. Skinner

American psychologist who worked primarily with animals from the 1950s to the 1970s.