The History of Operant Conditioning


By Sarah Munoz


[Photos: Edward Thorndike; B.F. Skinner]


Around the turn of the twentieth century, Edward Thorndike developed an experimental method for studying the problem-solving ability of cats and dogs. He built ‘puzzle boxes’ and observed how the animals reacted when placed inside them. His goal was to show that the achievements of cats and dogs could be replicated under controlled, standardized circumstances, and he soon found that this method also allowed him to measure an animal's intelligence. Thorndike observed that a reward somehow strengthens the association between a stimulus and an appropriate response, demonstrating the nature of learning when consequences are present. From this research, Thorndike formulated the Law of Effect, which states: in any given situation, the probability of a behavior occurring is a function of the consequences of that behavior in that situation.

In 1938, Burrhus Frederic (B.F.) Skinner took Thorndike's ideas further and invented the Skinner box, which enclosed the animal in a single chamber and did away with the maze: the rat learned stimulus-response associations with particular features of its environment, perhaps in sequence, rather than forming an internalized map of a maze. Through repeated trials, Skinner developed the basic concept of operant conditioning, claiming that this type of learning was not the result of simple stimulus-response pairing. Skinner believed the key association in operant conditioning was between the operant response and its consequence, the reward or punishment; the stimulus served as a signal for when that association would be acted upon.