Difference between Operant and Instrumental conditioning

  • Thread starter: RabbitWho
In summary, my fellow first years are confused about the difference between instrumental and operant conditioning.
  • #1
RabbitWho
Before my fellow first years jump on me, no they're not the same thing :(

I know most textbooks teach them as being the same thing, but most textbooks also confuse the Yerkes-Dodson law and the Hebb's version of it and don't even mention Hebb and for some reason my course thinks these little details are incredibly important and I have to learn them.

So I'm hoping some of you have studied this way as well since I can't find anything about the differences online to double check my understanding. Can you?
Here is how I understand it:

Instrumental conditioning (Thorndike):

The important thing in instrumental conditioning is the situation (stimulus) and the response.
Centers on the law of effect.

Operant conditioning (Skinner)

The important thing in operant conditioning is the response and the reinforcement.
Centers on the law of reinforcement.

This seems absolutely mad to me. Of course there is reinforcement in Thorndike's experiments; otherwise the cat would just curl up and go to sleep inside the box.
Of course there is a situation in Skinner's experiment; otherwise what would we study? All three things seem equally important in both.
But I guess with instrumental conditioning you're asking "how fast can the cat get out of the box?" or "can the cat learn, and how long does it take?",
while with operant conditioning you're asking "what cool things can I get the pigeon/my pets to do?"

It's also possible that the book is just really badly written, and when they say

"Blah blah blah Thorndike blah blah blah - this is known as instrumental conditioning"
and then they go on to say
"Blah blah blah Skinner blah blah blah - this is known as operant conditioning"

That they actually mean "all of this is known as instrumental or operant conditioning".

Thanks!

Edit: Oh dear, sorry for the angry tone of this message. It's not the poor book's fault.
 
  • #2
RabbitWho said:
That they actually mean "all of this is known as instrumental or operant conditioning" .

This.

Operant conditioning is instrumental learning, reformulated. Instrumental learning came first; Skinner came along and refined it.
 
  • #3
Thanks, it seems a bit clearer now after a good night's sleep. :)
 
  • #4
Hey, is there a way to find out why this was moved to Biology when it's Psychology?

Should I post psychology questions on the Biology board in the future?
 
  • #5


I can provide some clarification on the differences between instrumental and operant conditioning.

Firstly, it is important to note that both instrumental and operant conditioning are forms of learning in which behavior is strengthened or weakened based on its consequences. However, there are some key differences between the two.

Instrumental conditioning, associated with Thorndike's law of effect, focuses on the relationship between a specific behavior and its consequences. In this type of conditioning, the organism learns to associate a particular response with a specific outcome in a given situation. For example, a cat in Thorndike's puzzle box learns to operate a latch to escape and reach food.

On the other hand, operant conditioning, associated with Skinner's law of reinforcement, focuses on the relationship between a behavior and its consequences in the context of a reinforcement schedule. In this type of conditioning, the organism learns to associate a behavior with a consequence, either positive or negative, and this affects the likelihood of the behavior being repeated in the future. For example, a pigeon learns to peck a button to receive a food pellet.

One key difference between the two is the role of the situation or stimulus. In instrumental conditioning, the focus is on the situation or stimulus that elicits the behavior, whereas in operant conditioning, the focus is on the behavior itself and its consequences.

Another difference is the emphasis on the schedule of reinforcement. In instrumental conditioning, the reinforcer typically follows each successful response immediately, whereas in operant conditioning the reinforcement can be delivered on a schedule, for example after a certain number of responses or after a time interval.

In summary, both instrumental and operant conditioning involve learning through consequences, but the emphasis is different in each. Instrumental conditioning focuses on the relationship between a specific behavior and its consequences, while operant conditioning focuses on the relationship between a behavior and its consequences within a reinforcement schedule. I hope this helps clarify the differences between the two.
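To make the schedule distinction concrete, here is a toy simulation (my own illustrative sketch, not anything from Thorndike or Skinner) of a law-of-effect learner. The response probability `p`, the learning rate `lr`, and the update rule are all assumptions chosen for simplicity: each reinforced response strengthens the behavior a little. With `schedule_ratio=1`, every response is reinforced, as in a continuous-reinforcement instrumental setup; with `schedule_ratio=5`, only every fifth response is, a fixed-ratio operant schedule.

```python
import random

def simulate(trials, schedule_ratio, lr=0.1, seed=0):
    """Toy law-of-effect learner.

    Each trial the organism emits the response with probability p.
    A reinforcer is delivered every schedule_ratio-th response, and each
    reinforcer strengthens the behavior by nudging p toward 1.
    Returns the final response probability and the total response count.
    """
    rng = random.Random(seed)
    p = 0.1              # initial probability of emitting the response
    responses = 0
    for _ in range(trials):
        if rng.random() < p:                      # response emitted
            responses += 1
            if responses % schedule_ratio == 0:   # reinforcer delivered
                p = min(1.0, p + lr * (1 - p))    # law of effect: strengthen
    return p, responses

# Continuous reinforcement (instrumental-style) vs a fixed-ratio-5
# operant schedule, over the same 200 trials and random draws.
p_crf, n_crf = simulate(200, schedule_ratio=1)
p_fr5, n_fr5 = simulate(200, schedule_ratio=5)
print(f"continuous: p={p_crf:.2f} after {n_crf} responses")
print(f"FR-5:       p={p_fr5:.2f} after {n_fr5} responses")
```

Because both runs share the same random draws, the continuously reinforced learner's response probability can never fall below the FR-5 learner's; this is only meant to show how the schedule, rather than the presence of reinforcement, is what the operant framing adds.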
 

FAQ: Difference between Operant and Instrumental conditioning

What is the difference between operant and instrumental conditioning?

Operant conditioning and instrumental conditioning are both forms of learning in which behavior is modified through consequences. However, there are a few key differences between the two.

What is the role of reinforcement in operant conditioning?

In operant conditioning, reinforcement is used to increase the likelihood of a behavior being repeated. This can be achieved through positive reinforcement, where a desired behavior is followed by a reward, or negative reinforcement, where a behavior is strengthened by the removal of an undesirable stimulus.

How does instrumental conditioning differ from operant conditioning?

Instrumental conditioning is similar to operant conditioning in that it also involves modifying behavior through consequences. However, instrumental conditioning emphasizes the stimulus situation and the response it evokes, while operant conditioning emphasizes the response itself and its reinforcement.

What is an example of operant conditioning?

An example of operant conditioning is a child receiving a sticker for completing their homework. The positive reinforcement of receiving a sticker increases the likelihood of the child completing their homework in the future.

What is an example of instrumental conditioning?

An example of instrumental conditioning is a rat learning to press a lever to receive a food pellet. The rat's behavior of pressing the lever is strengthened by the consequence of receiving food, and thus, the behavior is more likely to be repeated in the future.
