Thorndike’s law of effect, in animal behaviour and conditioning, the postulate developed by American psychologist Edward L. Thorndike in 1905 holding that the probability that a particular stimulus will repeatedly elicit a particular learned response depends on the perceived consequences of the response. New stimulus-response connections are strengthened only if the response is followed by a satisfying result, and the responses most closely followed by satisfaction are those most likely to become established and to recur in the same situation.
This concept can be illustrated by an experiment in which a rat presses a lever to receive either a reward or a punishment. A rat whose operation of a lever is followed by the delivery of a food pellet will press the lever again. In contrast, if the only consequence of pressing the lever is a painful electric shock, the rat will avoid pressing the lever in the future (illustrating what has been called the negative law of effect). In another example, Thorndike placed a cat in a box fitted with a lever that, when pressed, opened a door through which the cat could escape and reach food placed outside the box. Once the cat discovered the connection between pressing the lever and escaping (and gaining access to the food reward), it became more adept at pressing the lever and opening the door with each succeeding trial.
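The strengthening and weakening of stimulus-response connections described above can be sketched as a toy simulation. The following Python snippet is an illustrative assumption, not a formalism Thorndike proposed: each response has a hypothetical "connection strength" that increases after a satisfying outcome and decreases after an annoying one, and the simulated animal selects responses in proportion to those strengths.

```python
import random

# Toy model of the law of effect (illustrative assumption only).
# Each response carries a connection strength; satisfying outcomes
# strengthen it, annoying outcomes weaken it.
strengths = {"press_lever": 1.0, "do_nothing": 1.0}

def choose_response():
    """Pick a response with probability proportional to its strength."""
    total = sum(strengths.values())
    r = random.uniform(0, total)
    for response, s in strengths.items():
        r -= s
        if r <= 0:
            return response
    return response

def apply_consequence(response, satisfying, step=0.5):
    """Strengthen the S-R connection after a reward, weaken it after a punishment."""
    if satisfying:
        strengths[response] += step
    else:
        strengths[response] = max(0.1, strengths[response] - step)

# Conditioning with food pellets: pressing the lever is rewarded.
for _ in range(20):
    response = choose_response()
    apply_consequence(response, satisfying=(response == "press_lever"))

print(strengths)  # "press_lever" now dominates, so it is chosen more often
```

Reversing the consequence (treating a lever press as punishing, as with the electric shock) drives the same strength downward, so the response drops out, mirroring the negative law of effect.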
Thorndike’s law of effect finds less general acceptance today, largely because it remains unclear whether an animal’s responses are always, or only sometimes, modified by their consequences, since other factors may also be at work. The law did, however, influence the development of behaviourism during the first half of the 20th century; American psychologist B.F. Skinner built on Thorndike’s law of effect in formalizing the process of operant conditioning, which he regarded as the explanatory basis of human behaviour.