Operant Conditioning Basics

Author: Erick Taggart

Identify the basic concepts involved in operant conditioning.

Video Transcription

Hello, class. In today's lesson, we're going to be looking at another behaviorist approach to learning: operant conditioning. Now, remember, behaviorism studies psychology in terms of a person's interactions with the environment, and the behaviors that person produces in response to those environments and situations. And operant conditioning, simply put, is learning that occurs through the association of consequences with a person's behaviors.

Now, this is different from classical conditioning, because classical conditioning examines what happens before an action occurs. It looks at the stimulus that leads to the response. Also, under classical conditioning the learner is a sort of passive agent; it just takes in stimuli from the environment and produces these different behaviors.

On the other hand, operant conditioning examines what happens after an action, as a result of that action and the surrounding situation. So it looks at how situations and actions become more or less likely to occur later on. Simply put, under operant conditioning, if something pleasant happens after a behavior occurs, the connection between that behavior and the surrounding situation becomes stronger, and you're more likely to perform that behavior again in the same kind of situation, because of the pleasant thing that happened.

If something unpleasant happens as a result of the action and the situation, then the opposite occurs: the connection becomes weaker, and you're less likely to perform that action again. This is what Edward Thorndike referred to as the Law of Effect. Thorndike was one of the first behaviorist thinkers to study this kind of learning.

But probably the most famous and well-known of the operant conditioning researchers is B.F. Skinner, an American psychologist who worked primarily from the 1950s to the 1970s. He's what we would refer to as a radical behaviorist. In other words, he thought that all of psychology and all mental states are completely the result of behaviors, which are determined by the environment around us. He held the idea that free will is an illusion; we're actually just responding constantly to the things around us.

Now, you might agree or disagree with what he thought. But the results of his studies and his findings about conditioning have been very informative in psychology. So let's take a look at that.

Now, Skinner elaborated on Thorndike's Law of Effect, giving specific terms to the ideas he was developing. Two of the most important terms are reinforcer and punisher. A reinforcer is anything that follows a response and makes it more likely for that response to be repeated. Remember, Thorndike said if something pleasant happens, you're more likely to repeat the behavior; those pleasant things are what we would call reinforcers.

On the other hand, a punisher is anything that makes it less likely for an action to be repeated, which Thorndike described as an unpleasant experience following a behavior. Skinner demonstrated these ideas about reinforcers and punishers through animal experiments using his invention, which he called the operant conditioning chamber and which others came to call the Skinner box.

So this is how it worked. The box was essentially a simple, bare box in a laboratory with a lever or a button inside of it, as well as a small place to dispense food. Sometimes these boxes were modified to include lights or speakers or things like that. But this is the simplest form: a box, a lever or button, and a place to dispense food.

The animal was put inside, and it found that whenever it pushed the lever or the button, it would get food. The rate of lever pressing was recorded, so there was a machine tracking how often it occurred over a period of time.

And they found that, over time, the rate of lever pressing would go up. The animal would press more often as a result of getting that reinforcer of food. So this is the idea that the consequence of a behavior makes it more likely for that behavior to occur again.
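To make this reinforcement loop concrete, here is a toy simulation (not from the lesson itself; the learning-rate update is a simplifying assumption, not Skinner's actual model). Each time the simulated animal presses the lever, the reinforcer nudges its probability of pressing upward, so the press rate climbs across blocks of trials:

```python
import random

def simulate_skinner_box(trials=1000, learning_rate=0.05, seed=42):
    """Toy Skinner box: each reinforced lever press increases the
    probability of pressing again. Returns press counts per 100-trial block."""
    random.seed(seed)
    press_probability = 0.1        # the animal starts pressing only rarely
    press_counts = []
    block_presses = 0
    for t in range(1, trials + 1):
        if random.random() < press_probability:
            block_presses += 1
            # the reinforcer (food) strengthens the behavior
            press_probability = min(1.0, press_probability + learning_rate)
        if t % 100 == 0:           # record the response rate for each block
            press_counts.append(block_presses)
            block_presses = 0
    return press_counts

rates = simulate_skinner_box()
print(rates)  # press counts rise across blocks as reinforcement accumulates
```

Running this shows the recorded rate rising from a low count in the first block toward the maximum in later blocks, mirroring the increase in lever pressing described above.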

Now, more complex behaviors can also be learned through operant conditioning, though not necessarily all at once. This is what Skinner referred to as shaping, which is the gradual learning of responses toward a desired behavior. So you're transforming the behavior over time toward the behavior you're looking for at the very end.

So for example, take that idea of the Skinner box again and change the way the food is dispensed. Let's say a pigeon at first only gets food when it turns its head to the left; so every time it turns its head to the left and then pecks, it gets food. After that, you change it so the pigeon only gets food when it turns its whole body to the left. Eventually, through these gradual changes, you can shape the behavior so that the pigeon turns all the way around in a circle before it gets food. And you'll find that this behavior is repeated because of the reinforcer that's causing the behavior to continue and become more likely to occur.

Now, behaviors can also be linked together in response to a single reinforcer. This is called a response chain, or response chaining, where one reinforcer can lead to multiple actions in succession. This helps explain how people pursue long-term goals; for example, the thought that if I come in each day, finish this project, and then present it, I'll get a good grade in my class. So you see those multiple behaviors lead to one reinforcer at the very end.

Now, if the reinforcer is taken away, the behavior will eventually disappear altogether. This is what's called operant extinction. However, the behavior doesn't disappear as soon as the reinforcer is removed; sometimes the behavior will actually increase right before it reaches extinction.

So this explains why certain negative behaviors in children will increase for a short period of time when they're being ignored. If a child starts yelling for their parents' attention, instead of immediately stopping because they're not receiving that reinforcer of attention, they might start yelling even louder. But eventually, the behavior will disappear in the absence of that reinforcer.

Terms to Know
Law of Effect

Edward Thorndike’s concept that if the consequences of our actions are pleasant, we are likely to repeat the action (strengthening the association), and if the consequences are unpleasant, we are not likely to repeat the action (weakening the association).

Operant Conditioning

Learning that occurs through the association of consequences to behaviors.

Operant Extinction

The relationship between the consequences of a behavior and the behavior itself disappears, and a person no longer performs the behavior in that situation.

Operant Reinforcer

Anything that follows a response and makes it more likely for that response to be repeated.

Response Chaining

Behaviors are linked together in response to a single reinforcement.


Shaping

Gradual learning of responses into a desired behavior.

Skinner Box (chamber)

A simple box with a lever or button and a place to dispense food, in which an animal is placed and the rate of lever pressing is recorded; also known as an operant conditioning chamber.

People to Know
B. F. Skinner

American psychologist who worked primarily with animals from the 1950s to the 1970s.