
What Is the Law of Effect in Psychology?


Developed by psychologist Edward Thorndike, the Law of Effect was a precursor to B.F. Skinner's operant conditioning. It states that responses that lead to satisfying outcomes in a given situation are more likely to be repeated in that situation, while responses that lead to unpleasant outcomes are less likely to be repeated in that situation.

Key Takeaways: The Law of Effect

  • The Law of Effect was proposed by psychologist Edward Thorndike in the early twentieth century.
  • The Law of Effect says that behaviors that lead to satisfaction in a specific situation are likely to be repeated when the situation recurs, and behaviors that lead to discomfort in a specific situation are less likely to be repeated when the situation recurs.
  • Thorndike had a major influence on behaviorism, the psychological approach championed by B. F. Skinner, who built his ideas about operant conditioning on the Law of Effect.

Origins of the Law of Effect

While today B.F. Skinner and operant conditioning are known for demonstrating that we learn based on the consequences of our actions, this idea was built on Edward Thorndike’s early contributions to the psychology of learning. The Law of Effect—also referred to as Thorndike’s law of effect—came out of Thorndike’s experiments with animals, typically cats.

Thorndike would place a cat in a puzzle box that had a small lever on one side. The cat could only get out by pressing the lever. Thorndike would then place a piece of meat outside the box to encourage the cat to escape, and time how long it would take the cat to get out of the box. On its first try, the cat would press the lever by accident. However, because the cat was rewarded with both its freedom and food following each lever press, every time the experiment was repeated, the cat would press the lever more quickly.

Thorndike’s observations in these experiments led him to posit the Law of Effect, which was published in his book Animal Intelligence in 1911. The law had two parts.

Regarding actions that received positive consequences, the Law of Effect stated: “Of several responses made to the same situation, those which are accompanied or closely followed by satisfaction to the animal will, other things being equal, be more firmly connected with the situation, so that, when it recurs, they will be more likely to recur.”

Of actions that received negative consequences, the Law of Effect stated: “Those [responses] which are accompanied or closely followed by discomfort to the animal will, other things being equal, have their connections with that situation weakened, so that, when it recurs, they will be less likely to occur.”

Thorndike concluded his theory by observing, “The greater the satisfaction or discomfort, the greater the strengthening or weakening of the bond [between the response and the situation].”

Thorndike modified the law of effect in 1932, after determining that its two parts were not equally valid. He found that responses accompanied by positive outcomes or rewards always made the association between the situation and the response stronger; however, responses accompanied by negative outcomes or punishments weakened the association between the situation and the response only slightly.
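
To make the idea concrete, the modified law can be pictured as a simple update to an association strength, as in the minimal sketch below. This is only an illustrative modern analogy, not Thorndike's own formalism; the learning-rate values and function names are arbitrary choices made for the example.

```python
# Illustrative sketch of the (modified) Law of Effect: the situation-response
# "bond" is a single number that is strengthened strongly by satisfying
# outcomes and weakened only slightly by discomforting ones.
# The rates below are arbitrary values chosen for illustration.

REWARD_RATE = 0.5   # strong strengthening after a satisfying outcome
PUNISH_RATE = 0.1   # weak weakening after a discomforting outcome


def update_association(strength: float, outcome: str) -> float:
    """Return the new situation-response association strength (0.0 to 1.0)."""
    if outcome == "satisfying":
        # The bond grows toward a maximum of 1.0.
        return strength + REWARD_RATE * (1.0 - strength)
    if outcome == "discomforting":
        # The bond shrinks, but only a little, per the 1932 modification.
        return strength * (1.0 - PUNISH_RATE)
    return strength


# The cat in the puzzle box: each escape (lever press followed by freedom and
# food) strengthens the bond, so the lever press becomes more likely and faster.
strength = 0.05
for trial in range(5):
    strength = update_association(strength, "satisfying")
    print(f"trial {trial + 1}: association strength = {strength:.2f}")
```

Running the loop shows the association climbing quickly over repeated rewarded trials, which parallels how Thorndike's cats escaped faster with each repetition of the experiment.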

Examples of the Law of Effect in Action

Thorndike’s theory outlined one way people learn, and we can see it in action in many situations. For example, say you’re a student and you rarely speak up in class even when you know the answer to the teacher’s questions. But one day, the teacher asks a question that no one else answers, so you tentatively raise your hand and give the correct answer. The teacher commends you for your response and it makes you feel good. So, next time you’re in class and you know the answer to a question the teacher asks, you raise your hand again with the expectation that, after answering correctly, you will once again experience your teacher’s praise. In other words, because your response in the situation led to a positive outcome, the likelihood that you’ll repeat your response increases.

Some other examples include:

  • You train hard for a swim meet and win first place, making it more likely you’ll train just as hard for the next meet.
  • You practice your act for a talent show, and following your performance, the audience gives you a standing ovation, making it more likely you will practice for your next performance.
  • You work long hours to ensure you meet a deadline for an important client, and your boss praises your actions, making it more likely you will work long hours when your next deadline is approaching.
  • You get a ticket for speeding on the highway, making it less likely that you will speed in the future. However, based on Thorndike’s modification to the law of effect, the association between driving on the highway and speeding will probably be weakened only slightly.

Influence on Operant Conditioning

Thorndike’s Law of Effect is an early theory of conditioning. It is an unmediated stimulus-response model: nothing intervenes between the stimulus and the response. In Thorndike’s experiments, the cats were allowed to operate freely and made the association between the box and pressing the lever to gain their freedom on their own. Skinner studied Thorndike’s ideas and conducted similar experiments that involved placing animals in his own version of a puzzle box with a lever (typically referred to as a Skinner box).

Skinner introduced the concept of reinforcement into Thorndike’s theory. In operant conditioning, behaviors that are reinforced are more likely to be repeated, while behaviors that are punished are less likely to be repeated. A clear line can be drawn from the Law of Effect to operant conditioning, demonstrating the influence Thorndike had on both operant conditioning and behaviorism as a whole.
