The Law of Effect, a principle developed by Edward Thorndike, suggested that:
“Responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation” (Gray, 2011, pp. 108–109).
Edward Thorndike (1898) is famous in psychology for his work on learning theory, which led to the development of operant conditioning within behaviorism.
Whereas classical conditioning depends on developing associations between events, operant conditioning involves learning from the consequences of our behavior.
Skinner wasn’t the first psychologist to study learning by consequences. Indeed, Skinner’s theory of operant conditioning is built on the ideas of Edward Thorndike.
Thorndike studied learning in animals (usually cats). He devised a classic experiment in which he used a puzzle box (see Fig. 1) to empirically test the laws of learning.
Fig. 1: Simplified graph of the results of the puzzle box experiment.
He placed a cat in the puzzle box and encouraged it to escape to reach a scrap of fish placed outside, timing how long it took the cat to get out. The cats experimented with different ways to escape the puzzle box and reach the fish.
Eventually the cat would stumble upon the lever that opened the cage. Once it had escaped, it was put back in the box, and the time it took to escape was noted again. Over successive trials the cats learned that pressing the lever had favorable consequences, and they adopted this behavior, becoming increasingly quick at pressing the lever.
Edward Thorndike put forward a “Law of Effect,” which stated that any behavior that is followed by pleasant consequences is likely to be repeated, and any behavior followed by unpleasant consequences is likely to be stopped.
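The shape of the learning curve in Fig. 1 (escape time falling across trials) can be illustrated with a minimal simulation. The sketch below is not Thorndike’s procedure; it simply assumes that each time pressing the lever is followed by escape and fish (the satisfying consequence), the tendency to press the lever is strengthened, so the escape takes fewer random attempts on later trials. The parameters (initial probability, learning rate, number of trials) are illustrative assumptions, not values from Thorndike’s data.

```python
import random

# Illustrative simulation of a law-of-effect-style learning curve.
# Assumption: the cat tries one behavior per time step, and only the
# lever press opens the box; a rewarded press makes that response
# more likely on subsequent trials.

random.seed(0)

lever_prob = 0.05     # initial chance the cat happens to press the lever
learning_rate = 0.15  # how much a rewarded press strengthens the response
n_trials = 20

for trial in range(1, n_trials + 1):
    attempts = 0
    # The cat keeps trying behaviors until it happens to press the lever.
    while True:
        attempts += 1
        if random.random() < lever_prob:
            break  # lever pressed -> escape -> fish (satisfying consequence)
    # Law of effect: the successful response becomes more likely
    # in this situation on future trials.
    lever_prob += learning_rate * (1.0 - lever_prob)
    print(f"Trial {trial:2d}: escaped after {attempts:3d} attempts")
```

Running the sketch prints a noisy but generally decreasing number of attempts per trial, mirroring the downward-sloping escape-time curve Thorndike reported.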
Critical Evaluation
Thorndike (1905) introduced the concept of reinforcement and was the first to apply psychological principles to the area of learning.
His research led to many theories and laws of learning, such as operant conditioning. Skinner (1938), like Thorndike, put animals in boxes and observed them to see what they were able to learn.
B.F. Skinner built upon Thorndike’s principles to develop his theory of operant conditioning. Skinner’s work involved the systematic study of how the consequences of a behavior influence its frequency in the future. He introduced the concepts of reinforcement (both positive and negative) and punishment to describe how consequences can modify behavior.
The learning theories of Thorndike and Pavlov were later synthesized by Hull (1935). Thorndike’s research drove comparative psychology for fifty years and influenced countless psychologists over that period, an influence that continues today.
References
Gray, P. (2011). Psychology (6th ed.). New York: Worth Publishers.
Hull, C. L. (1935). The conflicting psychologies of learning—a way out. Psychological Review, 42(6), 491.
Skinner, B. F. (1938). The behavior of organisms: An experimental analysis. New York: Appleton-Century.
Thorndike, E. L. (1898). Animal intelligence: An experimental study of the associative processes in animals. Psychological Monographs: General and Applied, 2(4), i-109.
Thorndike, E. L. (1905). The elements of psychology. New York: A. G. Seiler.