B.F. Skinner's Life, Theories, and Influence on Psychology

Burrhus Frederic Skinner (March 20, 1904 - August 18, 1990) was an influential American psychologist, behaviorist, inventor, and social philosopher. He was the Edgar Pierce Professor of Psychology at Harvard University from 1958 until his retirement in 1974.
Skinner, along with John B. Watson and Ivan Pavlov, is considered one of the three founders of behaviorism. His work has had a profound impact on the study of learning and behavior, particularly in the context of operant conditioning, and he is widely regarded as one of the pioneers of modern behaviorism. In a June 2002 survey, Skinner was recognized as the most influential psychologist of the 20th century.

1 Biography

Skinner was born in Susquehanna, Pennsylvania, to Grace and William Skinner, the latter a lawyer. He attended Hamilton College in Clinton, New York, where he earned his Bachelor of Arts in English literature in 1926. He then went to Harvard University to study psychology, drawn in particular to John B. Watson's behaviorism, an influence that shaped the development of his own variant of the approach.
In 1931, Skinner obtained his PhD from Harvard and remained there as a researcher for several years. He then moved to the University of Minnesota in Minneapolis in 1936 to begin his career as a professor. By 1945, he had relocated to Indiana University, where he served as chair of the psychology department from 1946 to 1947. Eventually, Skinner returned to Harvard in 1948 as a tenured professor.
Throughout his career, Skinner made significant contributions to psychology, particularly in the study of operant conditioning and behavior modification. His work had a profound impact on both theoretical understanding and practical applications within the field of psychology.

2 Main theories

Operant conditioning

Skinner's ideas about behaviorism were largely set forth in his first book, The Behavior of Organisms (1938). In it, he gave a systematic description of the way environmental variables control behavior and distinguished two sorts of behavior that are controlled in different ways:
Respondent behaviors are elicited by stimuli and can be modified through respondent conditioning, commonly known as classical (or Pavlovian) conditioning, where a neutral stimulus is paired with an eliciting stimulus. These behaviors are often assessed based on their latency or strength.
Operant behaviors, on the other hand, are 'emitted', meaning they are initially not elicited by any specific stimulus. They are strengthened through operant conditioning (also known as instrumental conditioning), where the occurrence of a response results in a reinforcer. These behaviors are typically measured by their rate of occurrence.
Both types of behavior had already been extensively studied experimentally: respondent behaviors by Ivan Pavlov and operant behaviors by Edward Thorndike. Skinner's approach differed in some respects from earlier theories and was among the first to integrate them comprehensively.
Skinner conducted experiments on operant conditioning using an apparatus he designed, famously known as the Skinner box. The differences between Skinner's experiments and Pavlov's classical conditioning experiments are noteworthy:
In Skinner's box, the experimental animals had the freedom to move around, unlike being restrained.
The responses of the experimental animals were not triggered by a known stimulus; operant behaviors (such as pressing a lever or pecking a key) served as means to obtain reinforcement (such as food).
The responses observed were not related to salivary gland activity but rather to skeletal muscle activity.
The objective of these experiments was not to uncover patterns of cerebral cortex activity but rather to illustrate the relationship between stimuli and responses, with the aim of effectively controlling the organism's behavior.

Reinforcement

Reinforcement, a key concept of behaviorism, is the primary process that shapes and controls behavior, and occurs in two ways: positive and negative.
Positive reinforcement involves presenting a reinforcer to strengthen a response, such as a pigeon pecking a key to receive food. Negative reinforcement, on the other hand, strengthens a response through the removal of an aversive stimulus; for example, a pigeon pecks a key to terminate an electric shock.
Punishment can be the application of an aversive stimulus/event (positive punishment or punishment by contingent stimulation) or the removal of a desirable stimulus (negative punishment or punishment by contingent withdrawal). Though punishment is often used to suppress behavior, Skinner argued that this suppression is temporary and has a number of other, often unwanted, consequences.
Extinction occurs when a previously reinforced response is no longer followed by a reinforcer, which weakens the behavior over time.
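
To make these distinctions concrete, the toy sketch below (in Python) models response "strength" as a probability that is nudged up by positive or negative reinforcement and down by punishment or extinction. The update rule, the step size, and the function name update_strength are illustrative assumptions, not anything drawn from Skinner's own work.

```python
# Toy illustration (not Skinner's formalism): response "strength" as a value in
# [0, 1] that consequences push up or down.

def update_strength(strength, consequence, step=0.1):
    """Return an updated response strength after one consequence.

    consequence: 'positive_reinforcement' -> a reinforcer is delivered
                 'negative_reinforcement' -> an aversive stimulus is removed
                 'punishment'             -> an aversive stimulus is delivered
                 'extinction'             -> nothing follows the response
    """
    if consequence in ("positive_reinforcement", "negative_reinforcement"):
        return min(1.0, strength + step * (1.0 - strength))  # strengthen
    if consequence == "punishment":
        return max(0.0, strength - step * strength)          # suppress (often temporarily)
    if consequence == "extinction":
        return max(0.0, strength - 0.5 * step * strength)    # gradual weakening
    return strength

strength = 0.2
for _ in range(20):                      # a run of reinforced responses
    strength = update_strength(strength, "positive_reinforcement")
print(f"after reinforcement: {strength:.2f}")

for _ in range(20):                      # reinforcement is then withheld
    strength = update_strength(strength, "extinction")
print(f"after extinction:    {strength:.2f}")
```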

Schedules of reinforcement

The most notable schedules of reinforcement studied by Skinner were continuous, interval (fixed or variable), and ratio (fixed or variable). All are methods used in operant conditioning.
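
The sketch below shows, under class names and parameters of my own choosing, how a few of these schedules decide whether a given response earns reinforcement; a variable-interval schedule would follow the same pattern with a randomized waiting time.

```python
import random

# Illustrative schedule sketches; the class names and parameters are assumptions
# for this example, not terminology from Skinner's writings.

class FixedRatio:
    """Reinforce every n-th response (FR-n); FR-1 is continuous reinforcement."""
    def __init__(self, n):
        self.n, self.count = n, 0
    def respond(self, t):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True
        return False

class VariableRatio:
    """Reinforce after a varying number of responses, averaging n (VR-n)."""
    def __init__(self, n):
        self.n, self.count = n, 0
        self.required = random.randint(1, 2 * n - 1)
    def respond(self, t):
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self.required = random.randint(1, 2 * self.n - 1)
            return True
        return False

class FixedInterval:
    """Reinforce the first response after a fixed time has elapsed (FI-t)."""
    def __init__(self, t_fixed):
        self.t_fixed, self.last = t_fixed, 0.0
    def respond(self, t):
        if t - self.last >= self.t_fixed:
            self.last = t
            return True
        return False

# One response per second under an FR-5 schedule: every fifth press pays off.
schedule = FixedRatio(5)
print([t for t in range(1, 21) if schedule.respond(t)])   # [5, 10, 15, 20]
```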

3 Scientific inventions

Operant conditioning chamber

An operant conditioning chamber (also known as a "Skinner box") is a laboratory apparatus used in the experimental analysis of animal behavior. It was invented by Skinner while he was a graduate student at Harvard University. As used by Skinner, the box had a lever (for rats), or a disk in one wall (for pigeons). A press on this "manipulandum" could deliver food to the animal through an opening in the wall, and responses reinforced in this way increased in frequency. By controlling this reinforcement together with discriminative stimuli such as lights and tones, or punishments such as electric shocks, experimenters have used the operant box to study a wide variety of topics, including schedules of reinforcement, discriminative control, delayed response ("memory"), punishment, and so on.
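
As a rough illustration of these contingencies, the toy simulation below reinforces a simulated lever press with food only while a light (the discriminative stimulus) is on; over many trials the press probability rises when the light is on and falls when it is off. The learning rule and all numbers are invented for illustration and are not a model Skinner used.

```python
import random

# Toy simulation of discriminative control in an operant chamber (illustrative
# only): pressing is reinforced only while the light is on, so pressing comes
# under the control of the light.
random.seed(0)
press_prob = {"light_on": 0.5, "light_off": 0.5}   # initial response tendencies

for trial in range(2000):
    condition = random.choice(["light_on", "light_off"])
    if random.random() < press_prob[condition]:    # the animal presses the lever
        if condition == "light_on":                # food is delivered only with the light on
            press_prob[condition] = min(1.0, press_prob[condition] + 0.01)
        else:                                      # no food: the response weakens
            press_prob[condition] = max(0.05, press_prob[condition] - 0.01)

print({k: round(v, 2) for k, v in press_prob.items()})
# Pressing ends up frequent when the light is on and rare when it is off.
```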

Cumulative recorder

The cumulative recorder makes a pen-and-ink record of simple repeated responses. Skinner designed it for use with the operant chamber as a convenient way to record and view the rate of responses such as a lever press or a key peck.
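
A cumulative record is a running count of responses plotted against time, so the slope of the trace at any moment is the response rate. The helper below, whose name and binning choice are assumptions for this example, shows how such a record could be computed from a list of response timestamps.

```python
# Sketch of the information a cumulative recorder captures: the pen steps up by
# one unit at each response; steep segments mean a high response rate, flat
# segments mean pauses. Timestamps below are invented for illustration.

def cumulative_record(response_times, duration, bin_size=1.0):
    """Return (time_axis, cumulative_counts) sampled every bin_size seconds."""
    times, counts = [], []
    sorted_times = sorted(response_times)
    total, idx, t = 0, 0, 0.0
    while t <= duration:
        while idx < len(sorted_times) and sorted_times[idx] <= t:
            total += 1
            idx += 1
        times.append(t)
        counts.append(total)
        t += bin_size
    return times, counts

# A burst of lever presses early in the session, then a long pause:
presses = [0.5, 1.0, 1.4, 1.9, 2.3, 2.8, 9.0]
t_axis, record = cumulative_record(presses, duration=10)
print(list(zip(t_axis, record)))
# Steep early segment (high rate), flat middle (pause), one more step near the end.
```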

Air crib

The air crib is an easily cleaned, temperature- and humidity-controlled box-bed intended to replace the standard infant crib. After raising one baby, Skinner felt that he could simplify the process for parents and improve the experience for children.

Teaching machine

The teaching machine was a mechanical device designed to automate the task of programmed learning.
The instructional potential of the teaching machine stemmed from several factors: it provided automatic, immediate and regular reinforcement without the use of aversive control; the material presented was coherent, yet varied and novel; the pace of learning could be adjusted to suit the individual. As a result, students were interested, attentive, and learned efficiently by producing the desired behavior, "learning by doing."
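
The loop below sketches the programmed-instruction cycle described above: present a short frame, require an active response, give immediate feedback, and advance only when the answer is correct, at the learner's own pace. The frames, answers, and function names are invented for illustration.

```python
# Minimal sketch of a programmed-instruction cycle (frames and names invented).

frames = [
    ("A reinforcer ____ the behavior it follows.", "strengthens"),
    ("Behavior emitted without an eliciting stimulus is called ____ behavior.", "operant"),
]

def run_program(frames, answer_fn):
    """Step through frames; repeat each frame until answer_fn gives the right answer."""
    for prompt, correct in frames:
        while True:
            answer = answer_fn(prompt)
            if answer.strip().lower() == correct:
                print("Correct - next frame.")   # immediate reinforcement
                break
            print("Try again.")                  # immediate corrective feedback

# Scripted answers stand in for a learner; replace with `input` for an interactive run.
scripted_answers = iter(["strengthens", "respondent", "operant"])
run_program(frames, lambda prompt: next(scripted_answers))
```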

4 Theoretical limitations

Critics argue that Skinner repeated the error of traditional behaviorists by focusing on describing behavior rather than explaining it. In his later years, Skinner maintained his behaviorist viewpoint, opposing cognitive psychology and cognitive explanations of learning and behavior.
Skinner also advocated "programmed instruction," but its practical outcomes did not match his expectations. In practice, programmed instruction reduced opportunities for direct teacher-student dialogue and hindered timely communication between teacher and student, which many educators found detrimental to student learning.
