Skinner’s Box Experiment (Behaviorism Study)


We receive rewards and punishments for many behaviors. More importantly, once we experience that reward or punishment, we are likely to perform (or not perform) that behavior again in anticipation of the result. 

Psychologists in the late 1800s and early 1900s believed that rewards and punishments were crucial to shaping and encouraging voluntary behavior. But they needed a way to test it. And they needed a name for how rewards and punishments shaped voluntary behaviors. Along came Burrhus Frederic Skinner, the creator of Skinner's Box, and the rest is history.


What Is Skinner's Box?

The "Skinner box" is a setup used in animal experiments. An animal is isolated in a box equipped with levers or other devices in this environment. The animal learns that pressing a lever or displaying specific behaviors can lead to rewards or punishments.

This setup was crucial to how behavioral psychologist B.F. Skinner developed his theories of operant conditioning. It also aided in understanding the concept of reinforcement schedules.

Here, "schedules" refer to the timing and frequency of rewards or punishments, which play a key role in shaping behavior. Skinner's research showed how different schedules impact how animals learn and respond to stimuli.

Who is B.F. Skinner?

Burrhus Frederic Skinner, also known as B.F. Skinner, is considered the “father of Operant Conditioning.” His experiments, conducted in what is known as “Skinner’s box,” are some of the most well-known experiments in psychology. They helped shape the ideas of operant conditioning in behaviorism.

Law of Effect (Thorndike vs. Skinner) 

At the time, classical conditioning was the top theory in behaviorism. However, Skinner knew that research showed that voluntary behaviors could be part of the conditioning process. In the late 1800s, a psychologist named Edward Thorndike wrote about “The Law of Effect.” He said, “Responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation.”

Thorndike tested out the Law of Effect with a box of his own. The box contained a maze and a lever. He placed a cat inside the box and a fish outside it, then recorded how the cat got out of the box and reached the fish.

Thorndike noticed that the cats would explore the maze and eventually find the lever. The lever would let them out of the box, getting them to the fish faster. Once they discovered this, the cats were more likely to go straight to the lever when they wanted the fish.

Skinner took this idea and ran with it. Today, the box in which these animal experiments are performed is called "Skinner's box."

Why Do We Call This Box the "Skinner Box?"

Edward Thorndike used a box to train animals to perform behaviors for rewards, and later psychologists such as Martin Seligman used the same kind of apparatus to observe "learned helplessness." So why is the setup called a "Skinner box"? Skinner not only used the box to demonstrate that operant conditioning exists; he also identified the schedules under which operant conditioning is more or less effective, depending on your goals. That is why he is called the father of operant conditioning.


How Skinner's Box Worked

Inspired by Thorndike, Skinner created a box to test his theory of Operant Conditioning. (This box is also known as an “operant conditioning chamber.”)

The box was typically very simple. Skinner would place a rat in the box with neutral stimuli (ones that produced neither reinforcement nor punishment) and a lever that would dispense food. As the rat explored the box, it would eventually stumble upon the lever, press it, and receive food. Skinner observed that the rat was then more likely to press the lever again, anticipating food. In some boxes, punishments were also administered; Martin Seligman's learned helplessness experiments are a well-known example of using punishment to observe or shape an animal's behavior. Skinner usually worked with rats or pigeons, and he took his research beyond what Thorndike had done: he looked at how reinforcements, and schedules of reinforcement, influence behavior.

About Reinforcements

Reinforcements are rewards that satisfy a need. The fish the cats received outside Thorndike's box was a positive reinforcement. In Skinner box experiments, pigeons or rats also received food. But a positive reinforcement can be anything added after a behavior is performed: money, praise, candy, you name it. Operant conditioning certainly becomes more complicated when it comes to human reinforcements.

Positive vs. Negative Reinforcements 

Skinner also looked at negative reinforcement. Whereas positive reinforcement adds something desirable, negative reinforcement rewards a behavior by taking something unpleasant away. In some Skinner box experiments, he would send an electric current through the box that shocked the rats. If a rat pushed the lever, the shocks would stop. The removal of that pain was a negative reinforcement: the rat gained nothing new when the shocks ended, yet it still sought out the lever. Skinner saw that the rats quickly learned to turn off the shocks by pushing the lever.

About Punishments

Skinner also experimented with punishments, both positive and negative: positive punishment adds something unpleasant after a behavior, while negative punishment takes away something desirable. Both are intended to discourage "bad behavior." For now, let's focus on the schedules of reinforcement.

Schedules of Reinforcement 


We know that not every behavior receives the same reinforcement every single time. Think about tips if you drive for a rideshare service or work as a barista at a coffee shop. You may have a string of customers who tip you generously after you chat with them. At that point, you're likely to strike up a conversation with your next customer, too. But what happens if they don't tip you after a conversation? What happens if you stay silent for one ride and get a big tip?

Psychologists like Skinner wanted to know how quickly a behavior becomes a habit after it is reinforced. In other words, how many trips will it take before you converse with passengers every time? They also wanted to know how quickly the behavior fades once the reinforcement stops. If the rat pulls the lever and doesn't get food, will it stop pulling the lever altogether?

Skinner attempted to answer these questions by looking at different schedules of reinforcement. He would offer positive reinforcement on different schedules, such as every time the behavior was performed (continuous reinforcement) or at random (variable-ratio reinforcement). Based on his experiments, he would measure the following:

  • Response rate (how quickly the behavior was performed)
  • Extinction rate (how quickly the behavior would stop) 

He found that there are multiple schedules of reinforcement, and they all yield different results. These schedules explain why your dog may not respond to the treats you only sometimes give him, or why gambling can be so addictive. Not every schedule is practical in every situation, and that's okay, too.

Continuous Reinforcement

If you reinforce a behavior every single time it is performed, the response rate is medium and the extinction rate is fast. The behavior is performed only as long as the reinforcement keeps coming: as soon as you stop reinforcing a behavior on this schedule, the behavior stops.

Fixed-Ratio Reinforcement

Let’s say you reinforce the behavior every fourth or fifth time. The response rate is fast, and the extinction rate is medium. The behavior will be performed quickly to reach the reinforcement. 

Fixed-Interval Reinforcement

In the above cases, the reinforcement was given immediately after the behavior was performed. But what if the reinforcement were given at a fixed interval, provided that the behavior had been performed at some point during that interval? Skinner found that the response rate is medium, and the extinction rate is medium.

Variable-Ratio Reinforcement

Here's how gambling becomes so unpredictable and addictive. In gambling, you experience occasional wins but frequent losses, and you never know when the next big win, and the dopamine hit that comes with it, will arrive. The behavior is reinforced at random. On this schedule the response rate is fast, but the extinction rate is slow: you keep responding quickly, and it takes a long time to stop wanting to gamble. That randomness is a key reason gambling is so addictive.

Variable-Interval Reinforcement

Finally, the reinforcement can be given out at random intervals, provided that the behavior is performed. Health inspections and secret shoppers are commonly cited examples of variable-interval reinforcement. The reinforcement could be administered five minutes after the behavior is performed or seven hours after. Skinner found that the response rate for this schedule is fast, and the extinction rate is slow.
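If it helps to see these five schedules side by side, here is a minimal, purely illustrative sketch in Python. It encodes only the delivery rules described above; the imaginary subject, the once-per-second lever pressing, and the specific ratio and interval values are assumptions chosen for the example, not parameters from Skinner's experiments.

```python
import random

# Each schedule is a small rule that answers one question about a single
# lever press: "is food delivered right now?"

def continuous():
    # Reward every single response.
    return lambda presses, now: True

def fixed_ratio(n):
    # Reward every nth response (e.g., every 5th press).
    return lambda presses, now: presses % n == 0

def variable_ratio(mean_n, rng=random.Random(0)):
    # Reward responses at random, roughly once every mean_n presses.
    return lambda presses, now: rng.random() < 1.0 / mean_n

def fixed_interval(seconds):
    # Reward the first response made after a fixed amount of time has passed.
    state = {"last": 0.0}
    def rule(presses, now):
        if now - state["last"] >= seconds:
            state["last"] = now
            return True
        return False
    return rule

def variable_interval(mean_seconds, rng=random.Random(1)):
    # Reward the first response made after a randomly varying wait.
    state = {"next": rng.expovariate(1.0 / mean_seconds)}
    def rule(presses, now):
        if now >= state["next"]:
            state["next"] = now + rng.expovariate(1.0 / mean_seconds)
            return True
        return False
    return rule

# Tiny demo: a subject that presses the lever once per second for one minute.
schedules = [
    ("continuous", continuous()),
    ("fixed-ratio 5", fixed_ratio(5)),
    ("variable-ratio ~5", variable_ratio(5)),
    ("fixed-interval 10s", fixed_interval(10)),
    ("variable-interval ~10s", variable_interval(10)),
]
for name, rule in schedules:
    rewards = sum(rule(press, float(press)) for press in range(1, 61))
    print(f"{name:22s} rewards earned in 60 presses: {rewards}")
```

Running the demo makes the contrast concrete: the ratio schedules pay out in proportion to how much the subject responds, while the interval schedules cap the payoff no matter how furiously the lever is pressed, which is part of why the different schedules shape such different response patterns.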

Skinner's Box and Pigeon Pilots in World War II

Yes, you read that right. Skinner's work with pigeons and other animals in his boxes had real-life effects. After some time training pigeons, B.F. Skinner got an idea. Pigeons were easy to train, they see very well in flight, and they are calm creatures that don't panic in intense situations. Their skills, he realized, could be applied to the war raging around him.

B.F. Skinner decided to create a missile that pigeons would operate. That's right. The U.S. military was having trouble accurately targeting missiles, and Skinner believed pigeons could help. He believed he could train the pigeons to recognize a target and peck when they saw it; as the pigeons pecked, Skinner's specially designed cockpit would translate the pecks into steering corrections. Pigeons could be pilots in World War II missions against Nazi Germany.

When Skinner proposed this idea to the military, he was met with skepticism. Still, he received $25,000 to start his work on "Project Pigeon." The device worked: operant conditioning trained the pigeons to guide missiles to their targets. Unfortunately, there was one problem. The pigeons would die when the missile struck, so the program would have required a steady supply of trained birds. The military eventually passed on the project, but prototypes of the pigeon cockpit are on display at the American History Museum. Pretty cool, huh?

Examples of Operant Conditioning in Everyday Life

Not every example of operant conditioning has to end in dropping missiles. Nor does it have to happen in a box in a laboratory! You might find that you have used operant conditioning on yourself, a pet, or a child whose behavior changes with rewards and punishments. These operant conditioning examples will look into what this process can do for behavior and personality.

Hot Stove: If you put your hand on a hot stove, you will get burned. More importantly, you are very unlikely to put your hand on that hot stove again. Even though no one has made that stove hot as a punishment, the process still works.

Tips: If you converse with a passenger while driving for Uber, you might get an extra tip at the end of your ride. That's certainly a great reward! You will likely keep conversing with passengers as you drive for Uber. The same type of behavior applies to any service worker who gets tips!

Training a Dog: If your dog sits when you say “sit,” you might give him a treat. More importantly, he is then more likely to sit the next time you say “sit.” (This becomes a form of variable-ratio reinforcement: you likely only treat your dog 50-90% of the time he sits. If you gave him a treat every time he sat, he probably wouldn't have room for breakfast or dinner!)

Operant Conditioning Is Everywhere!

We see operant conditioning training us everywhere, intentionally or unintentionally! Game makers and app developers design their products based on the "rewards" our brains feel when seeing notifications or checking into the app. Schoolteachers use rewards to control their unruly classes. Dog training doesn't always look different from training your child to do chores. We know why this happens, thanks to experiments like the ones performed in Skinner's box. 


B. F. Skinner Foundation

Project Pigeon


B. F. Skinner wrote about his work during World War Two and about the Orcon Project. Download the article here: Pigeons in a Pelican.pdf.


B.F. Skinner at Harvard

Long before there were grab and go lunches and weekly pub trivia nights, slot machines and pianos filled the basement of Memorial Hall. The lucky gamblers and musicians were not students or faculty, but pigeons.

Established in 1948, the Harvard Pigeon Lab was one of the many Psychological Laboratories occupying the space below Sanders Theater. It was led by a newly tenured professor who had spent the last few years trying to create a pigeon-guided missile program for the US military to use during World War II. Burrhus Frederic Skinner, known to the academic world as B.F. Skinner, would continue to experiment on pigeons, but his years at Harvard proved to be as dynamic and eclectic as his contributions to the fields of psychology and education.

Skinner’s connection to Harvard began in 1928 when he enrolled in the graduate program in psychology within what was then the Department of Philosophy and Psychology. He split most of his time between Emerson Hall, where the department was located, and his house three blocks from the Yard on Harvard Street.

The path that brought Skinner to psychology was an unusual one. Jerome Kagan, Professor Emeritus of Psychology, recalled a lunch date with Skinner, in which the eminent psychologist noted, “when he was an undergraduate, he decided to be a writer because his main goal was to change the world. He had very high ambitions. He wanted to have an effect on the world and decided that writing was the best way to do it.”

Everything changed for Skinner in his second half of college when, according to Kagan, “he read [John] Watson, who is the original behaviorist, and that persuaded him that if you want to change the world, becoming a psychologist was probably more effective.”

Graduate school at Harvard for Skinner was an experience almost completely confined to academics. He would later write in his autobiography, “The Shaping of a Behaviorist,” that “Harvard University takes little or no interest in the private lives of its graduate students,” explaining that all matters of social and residential life were not of concern to the University. He would also reflect on the fact that graduate school pushed him harder than anything before, consuming nearly all of his daytime hours.

“At Harvard I entered upon the first strict regimen of my life,” he wrote. “I would rise at six, study until breakfast, go to classes, laboratories, and libraries with no more than fifteen minutes unscheduled during the day, study until exactly nine o’clock at night, and go to bed. I saw no movies or plays, seldom went to concerts. I had scarcely any dates, and read only psychology and physiology.”

Skinner soon became interested in behaviorism, a school of psychology more concerned with behaviors themselves than the unseen mental processes behind them. Kagan explained that behaviorism is rooted in the belief that in order to teach some behavior to an organism, “all we have to do is control the rewards, the desirable things that the animal or human wants, and punish the behaviors that we don’t want. There’s no mind, there are no thoughts, everything is behavior.” Steven Pinker, Professor of Psychology, described behaviorism as “not a theory of psychology,” but instead, “really a meta-theory or a philosophy of psychology.”

While still a graduate student, Skinner invented the operant conditioning chamber, in which animals are taught certain behaviors by rewarding or punishing the animal’s actions. Later known as the Skinner Box, the apparatus was instrumental in pursuing the study of operant conditioning, an alternative to the more widely studied classical conditioning à la Pavlov’s dog.

Operant conditioning opened the door to a world of new theories and possibilities regarding control and learning. “What Skinner did was say, ‘Well, if we really want to control behavior, we’ve got to control habits that are not innately biological,’” notes Kagan. “If you want to control what people do–control their aggression, control their work habits, control their study habits–that’s all operant conditioning.”

Skinner’s years as a graduate student were spent surrounded by the giants of a field emerging as its own distinct science. He studied under professors like Henry Murray—the developer of a personality psychology called personology—and took classes alongside students like Fred Keller, who would become a renowned champion of scientific education reform. But despite contact with pioneering members of the field, Skinner largely took his own approach.

After passing his preliminary exams, Skinner all but ignored by-the-book psychology. He noted in his autobiography that upon reading through a copy of the American Journal of Psychology, he concluded “there must be a better way to find out what was going on in the field.”

“I never learned how to read the ‘literature’ in psychology, and the literature remained largely unread by me,” he went on to write. Even in his research ventures, he recalled working “entirely without supervision” and that “some kind of flimsy report” would usually suffice.

Nevertheless, Skinner would later reflect that in his late graduate years he “was doing exactly as [he] pleased.” He received his Ph.D. in 1931 and remained at Harvard to do research until 1936. After over a decade of teaching at colleges in the Midwest, he came back to Cambridge in 1948 when Harvard offered him a tenured professorship. He became the Edgar Pierce Professor of Psychology in 1958, a position he maintained until his retirement in 1974.

In his faculty position at Harvard, Skinner was able to continue his research into animal behavior as well as expand to other fields, both in and out of psychology. The Pigeon Lab, set up in Skinner’s first year as a professor, used birds to study behavioral principles that could be applied to humans. A Crimson article from 1949 observed Skinner in his element. “Skinner places his pigeons in a small closed box with a button in one wall,” the article reports. “The birds must peck at this button at least once every five minutes to be paid off with food. The eager but ignorant pigeon, however, not knowing he will get the same reward with less exertion, will hammer away rapidly for great lengths of time to get his dinner.” Skinner saw the experiment as providing valuable insight into the work habits and monetary incentives of human beings.

Working out of his office at the south end of Memorial Hall’s basement, Skinner put forth theories on topics ranging from the superstitious tendencies of rodents to the synthesis of internal emotions and external behaviors—a field of his own that became known as radical behaviorism.

Memorial Hall was a beloved site for experimentation. Skinner and his colleagues had significant freedom in modifying the space because administrators had little concern for the building’s underground level. When William James Hall was built as a new home for the Department of Psychology, Pinker said that Skinner and others “had to be dragged kicking and screaming out of the basement of Memorial Hall.” Scientists, Pinker notes, “love space that you can modify yourself indefinitely.”

Nonetheless, the Department managed to adapt to the move. “The 7th Floor of William James Hall was [Skinner’s] empire,” said Pinker. “When the elevator doors opened there were two bumper stickers that you could see. One of them said ‘Think Behavior,’ and the other one said ‘God is a VI,’ which is a very nerdy in-joke, VI being a variable interval schedule of reinforcement.”

By the time Skinner retired, behaviorism began to see a decline in popularity. A shift towards cognitive elements of psychology was already underway. Kagan describes the move towards study of the brain as occurring because of both technical advances that made it easier to do cognitive imaging and the desire of psychologists to examine people’s inner emotions.

“Humans think. Humans feel. Humans feel guilty if they have a bigoted thought. Operant conditioning can’t explain that,” he says.

Skinner also explored non-psychological fields, contributing to linguistics and philosophy through books like “Verbal Behavior” and “Beyond Freedom and Dignity”. One of the fields outside of psychology that was influenced by Skinner’s work is educational theory. He predicted that technology would play an increasingly greater role in the classroom, theorizing that “audio-visual aids” would come to supplement, and maybe even replace, lectures and textbooks. He developed pedagogical methods based on his conditioning theories.

Skinner’s legacy at Harvard and more broadly in his discipline remains ambiguous. Many of his theories have suffered extensive criticism and even been eclipsed by modern methods. His thoughts on education and philosophy were unique but often controversial. In any case, he was arguably one of the most famous psychologists of his time, with an unmatched drive to leave a mark in some way.

“There are two kinds of scientists,” says Kagan. “I call one kind hunters. Hunters win prizes. Hunters want a victory, they want to establish a fact that’s reliable, unambiguous, replicable…They don’t particularly care what the problem is.” He goes on, “then there are the birdwatchers. I’m a birdwatcher. Birdwatchers fall in love with a particular domain.”

This dichotomy represents, according to Kagan, “the difference between a passion about a domain and a passion to make an important discovery about anything.”

“Skinner,” he concludes, “was a hunter.”

Skinner's Superstition Experiment

What is it?

Skinner's Superstition Experiment was a study conducted by B.F. Skinner in 1947, in which he placed pigeons in a Skinner box and delivered food at moments entirely unconnected to anything the birds were doing. Even so, the pigeons developed their own "superstitions": idiosyncratic behaviors they repeated as if those behaviors were responsible for the delivery of food. The experiment showed that animals (and potentially humans) may develop superstitious behaviors through association and reinforcement, even when there is no actual causal relationship between their behavior and the desired outcome.

Simple Examples

B.F. Skinner was a famous psychologist who studied behaviorism, a theory that explains how we learn behaviors through rewards and punishments. In the 1940s, Skinner conducted an experiment to investigate how superstitions form in animals, particularly pigeons. His experiment is now known as "Skinner's Superstition Experiment."

In layman's terms, imagine you had a group of pigeons in a box. Each pigeon had access to a button. When the button was pressed, food would be released. Skinner wanted to see if pigeons would develop "superstitious" behaviors when food was given randomly, without a clear connection to the button.

Skinner set up the experiment so that food would be released at specific time intervals, regardless of whether the pigeon pressed the button or not. Over time, the pigeons began to associate unrelated actions with the food delivery, believing that their behavior was causing the food to appear.

For example, one pigeon might have been spinning around when the food appeared. The pigeon would then start to believe that spinning around caused the food to come out. Another pigeon might have been pecking at the button when the food appeared, and would continue to peck at ...

Pigeons, Operant Conditioning, and Social Control

Audrey Watters

This is the transcript of the talk I gave at the Tech4Good event I'm at this weekend in Albuquerque, New Mexico. The complete slide deck is here.


I want to talk a little bit about a problem I see – or rather, a problem I see in the “solutions” that some scientists and technologists and engineers seem to gravitate towards. So I want to talk to you about pigeons, operant conditioning, and social control, which I recognize is a bit of a strange and academic title. I toyed with some others.


I spent last week at the Harvard University archives, going through the papers of Professor B. F. Skinner, arguably one of the most important psychologists of the twentieth century. (The other, of course, being Sigmund Freud.)


I don’t know how familiar this group is with Skinner – he’s certainly a name that those working in educational psychology have heard of. I’d make a joke here about software engineers having no background in the humanities or social sciences but I hear Mark Zuckerberg was actually a psych major at Harvard. (So that’s the joke.)

I actually want to make the case this morning that Skinner’s work – behavioral psychology in particular – has had profound influence on the development of computer science, particularly when it comes to the ways in which “programming” has become a kind of social engineering. I’m not sure this lineage is always explicitly considered – like I said, there’s that limited background in or appreciation for history thing your field seems to have got going on.

B. F. Skinner was a behaviorist. Indeed, almost all the American psychologists in the early twentieth century were. Unlike Freud, who was concerned with the subconscious mind, behaviorists like Skinner were interested in – well, as the name suggests – behaviors. Observable behaviors. Behaviors that could be conditioned or controlled.


Skinner’s early work was with animals. As a graduate student at Harvard, he devised the operant conditioning chamber – better known as the Skinner box – that was used to study animal behavior. The chamber provided some sort of response mechanism that the animal would be trained to use, typically by rewarding the animal with food.


During World War II, Skinner worked on a program called Project Pigeon – also known as Project Orcon, short for Organic Control – an experimental project to create pigeon-guided missiles.

The pigeons were trained by Skinner to peck at a target, and they were rewarded with food when they completed the task correctly. Skinner designed a missile that carried pigeons which could see the target through the windows. The pigeons would peck at the target; the pecking in turn would control the missile’s tail fins, keeping it on course, via a metal conductor connected to the birds’ beaks, transmitting the force of the pecking to the missile’s guidance system. The pigeons’ accuracy, according to Skinner’s preliminary tests: nearly perfect.

As part of their training, Skinner also tested the tenacity of the pigeons – testing their psychological fitness, if you will, for battle. He fired a pistol next to their heads to see if loud noise would disrupt their pecking. He put the pigeons in a pressure chamber, setting the altitude at 10,000 feet. The pigeons were whirled around in a centrifuge meant to simulate massive G forces; they were exposed to bright flashes meant to simulate shell bursts. The pigeons kept pecking. They had been trained, conditioned to do so.

The military canceled and revived Project Pigeon a couple of times, but Skinner’s ideas were never used in combat. “Our problem,” Skinner admitted, “was no one would take us seriously.” And by 1953, the military had devised an electronic system for missile guidance, so animal-guided systems were no longer necessary (if they ever were).

This research was all classified, and when the American public were introduced to Skinner’s well-trained pigeons in the 1950s, there was no reference to their proposed war-time duties. Rather, the media talked about his pigeons that could play ping-pong and piano.

Admittedly, part of my interest in Skinner’s papers at Harvard involved finding more about his research on pigeons. I use the pigeons as a visual metaphor throughout my work. And I could talk to you for an hour, easily, about the birds – indeed, I have given a keynote like that before. But I’m writing a book on the history of education technology, and B. F. Skinner is probably the name best known with “teaching machines” – that is, programmed instruction (pre-computer).

Skinner’s work on educational technology – on teaching and learning with machines – is connected directly, explicitly to his work with animals. Hence my usage of the pigeon imagery. Skinner believed that there was not enough (if any) of the right kind of behavior modification undertaken in schools. He pointed out that students are punished when they do something wrong – that’s the behavioral reinforcement that they receive: aversion. But students are rarely rewarded when they do something right. And again, this isn’t simply about “classroom behavior” – the kind of thing you get a grade for “good citizenship” on (not talking in class or cutting in the lunch line). Learning, to Skinner, was a behavior – and a behavior that needed what he called “contingencies of reinforcement.” These should be positive. They should minimize the chances of doing something wrong – getting the wrong answer, for example. (That’s why Skinner didn’t like multiple choice tests.) The reinforcement should be immediate.


Skinner designed a teaching machine that he said would do all these things – allow the student to move at her own pace through the material. The student would know instantaneously if she had the answer right. (The reward was getting to move on to the next exciting question or concept.) And you can hear all this echoed in today’s education technology designers and developers and school reformers – from Sal Khan and Khan Academy to US Secretary of Education Betsy DeVos. It’s called “personalized learning.” But it’s essentially pigeon training with a snazzier interface.

“Once we have arranged the particular type of consequence called a reinforcement,” Skinner wrote in 1954 in “The Science of Learning and the Art of Teaching,” “our techniques permit us to shape the behavior of an organism almost at will. It has become a routine exercise to demonstrate this in classes in elementary psychology by conditioning such an organism as a pigeon.”

“…Such an organism as a pigeon.” We often speak of “lab rats” as shorthand for the animals used in scientific experiments. We use the phrase too to describe people who work in labs, who are completely absorbed in performing their tasks again and again and again. In education and in education technology, students are also the subjects of experimentation and conditioning. In Skinner’s framework, they are not “lab rats”; they are pigeons. As he wrote,

…Comparable results have been obtained with pigeons, rats, dogs, monkeys, human children… and psychotic subjects. In spite of great phylogenetic differences, all these organisms show amazingly similar properties of the learning process. It should be emphasized that this has been achieved by analyzing the effects of reinforcement and by designing techniques that manipulate reinforcement with considerable precision. Only in this way can the behavior of the individual be brought under such precise control.

If we do not bring students’ behavior under control, Skinner cautioned, we will find ourselves “losing our pigeon.” The animal will be beyond our control.

Like I said, I’m writing a book. So I can talk at great length about Skinner and teaching machines. But I want folks to consider how behaviorism hasn’t just found its way into education reform or education technology. Indeed, Skinner and many others envisioned that application of operant conditioning outside of the laboratory, outside of the classroom – the usage (past and present) of behavior modification for social engineering is at the heart of a lot of “fixes” that people think they’re doing “for the sake of the children,” or “for the good of the country,” or “to make the world a better place.”


Among the discoveries I made – new to me, not new to the world, to be clear: in the mid–1960s, B. F. Skinner was contacted by the Joseph P. Kennedy Jr. Foundation, a non-profit that funded various institutions and research projects that dealt with mental disabilities. Eunice Kennedy Shriver was apparently interested in his work on operant behavior and child-rearing, and her husband Sargent Shriver, who’d been appointed by President Johnson to head the newly formed Office of Economic Opportunity, was also keen to find ways to use operant conditioning as part of the War on Poverty.

There was a meeting. Skinner filed a report. But as he wrote in his autobiography, nothing came of it. “A year later,” he added, “one of Shriver’s aides came to see me about motivating the peasants in Venezuela.”

Motivating pigeons or poor people or peasants (or motivating peasants and poor people as pigeons) – it’s all offered, quite earnestly no doubt – as the ways in which science and scientific management will make the world better.

But if nothing else, the application of behavior modification to poverty implies that this is a psychological problem and not a structural one. Focus on the individual and their “mindset” – to use the language that education technology and educational psychology folks invoke these days – not on the larger, societal problems.

I recognize, of course, that you can say “it’s for their own good” – but it involves a great deal of hubris (and often historical and cultural ignorance, quite frankly) to assume that you know what “their own good” actually entails.


You’ll sometimes hear that B. F. Skinner’s theories are no longer in fashion – the behaviorist elements of psychology have given way to the cognitive turn. And with or without developments in cognitive and neuroscience, Skinner’s star had certainly lost some of its luster towards the end of his career, particularly, as many like to tell the story, after Noam Chomsky penned a brutal review of his book Beyond Freedom and Dignity in the December 1971 issue of The New York Review of Books . In the book, Skinner argues that our ideas of freedom and free will and human dignity stand in the way of a behavioral science that can better organize and optimize society.

“Skinner’s science of human behavior, being quite vacuous, is as congenial to the libertarian as to the fascist,” writes Chomsky, adding that “there is nothing in Skinner’s approach that is incompatible with a police state in which rigid laws are enforced by people who are themselves subject to them and the threat of dire punishment hangs over all.”

Skinner argues in Beyond Freedom and Dignity that the goal of behavioral technologies should be to “design a world in which behavior likely to be punished seldom or never occurs” – a world of “automatic goodness.” We should not be concerned with freedom, Skinner argues – that’s simply mysticism. We should pursue “effectiveness of techniques of control” which will “make the world safer.” Or make the world totalitarian, as Chomsky points out.


Building behavioral technologies is, of course, what many computer scientists now do (perhaps what some of you do cough FitBit) – most, I’d say, firmly believing that they’re also building a world of “automatic goodness.” “Persuasive technologies,” as Stanford professor B. J. Fogg calls it. And in true Silicon Valley fashion, Fogg erases the long history of behavioral psychology in doing so: “the earliest signs of persuasive technology appeared in the 1970s and 1980s when a few computing systems were designed to promote health and increase workplace productivity,” he writes in his textbook. His students at his Behavioral Design Lab at Stanford have included Mike Krieger, the co-founder of Instagram, and Tristan Harris, a former Googler, founder of the Center for Humane Technology, and best known figure in what I call the “tech regrets industry” – he’s into “ethical” persuasive technologies now, you see.

Behavior modification. Behavioral conditioning. Behavioral design. Gamification. Operant conditioning. All practices and products and machines that are perhaps so ubiquitous in technology that we don’t see them – we just feel the hook and the urge for the features that reward us for behaving like those Project Pigeon birds pecking away at their target – not really aware of why there’s a war or what’s at stake or that we’re going to suffer and die if this missile runs its course. But nobody asked the pigeons. And even with the best of intentions for pigeons – promising pigeons an end to poverty and illiteracy – nobody asked the pigeons. Folks just assumed that because the smart men at Harvard (or Stanford or Silicon Valley or the US government) were on it, it was surely the right “fix.”

Published 15 Jun 2018

Hack Education

The history of the future of education technology.


Processing Therapy

What Exactly Was the Skinner Pigeon Experiment?


What exactly was the Skinner pigeon experiment?

Pigeons were trained to peck at a tiny, moving point beneath a glass screen, using a device that made a clicking noise. According to Skinner, birds placed in front of a screen inside a missile would quickly start pecking at the screen as soon as they spotted enemy torpedoes. For Skinner, then, the rate of response was the most important indicator of learning.

Skinner used operant conditioning to train birds to behave in ways that parallel human behavior. Starting with pigeons, he trained them to carry out specific tasks, or to peck at specific targets, in order to obtain food, and in doing so worked to demonstrate his theories. Later, the Harvard psychologist Richard Herrnstein developed a well-known series of experiments in the 1960s and 1970s demonstrating that pigeons could categorize objects into groups such as trees or people.

According to Skinner, the aim of psychology is to predict and manage an organism's behavior based on its past reinforcement history and current stimulus situation. In his operant conditioning experiments, Skinner frequently used a technique known as shaping: rewarding successive approximations of a target behavior rather than only the finished behavior itself, much as parents can divide a task into smaller, more manageable steps.

When did the Skinner pigeon experiment occur?

In the superstition study, Skinner placed hungry pigeons on a noncontingent food schedule: food was delivered at brief, fixed intervals regardless of what the birds were doing. Six of the birds, according to his report, exhibited consistent superstitious behaviors, such as circling, head swinging, and pecking. Skinner described the behavior as superstitious and proposed that it persisted as a result of what is now known as adventitious reinforcement, an accidental temporal pairing between whatever the pigeon happened to be doing and the delivery of the food.

The apparatus itself is the Skinner box: an enclosed device, produced by B.F. Skinner, with a bar or key that an animal subject can manipulate to obtain reinforcement. The same logic appears in human superstition. Just as pigeons can be trained to exhibit superstitious behaviors in the hope of being fed, superstition shows up in commonplace human behavior, such as avoiding walking over three grates in a row or under ladders.

In the classic Skinner box experiment, Skinner put a rat inside a box with a lever that released food into the box. After accidentally hitting the lever enough times, the rat came to understand that its behavior (pulling the lever) resulted in a specific outcome (getting food). Skinner's operant conditioning theory rested on two presumptions: first, that a person's environment plays a role in how they behave; and second, that the consequences of a behavior determine the likelihood that it will be repeated.

What is the significance of Skinner’s theory?

Skinner's theory of operant conditioning has greatly aided psychology's understanding of how behavior is learned. It explains how reinforcement schedules can affect the outcome of conditioning and why reinforcement can be used so effectively in the learning process. His work allowed developmental psychologists to study the effects of positive and negative reinforcement. According to Skinner, the environment affects behavior, and when the environment is changed, behavior will change.

Skinner used animals in his experiments in order to develop or extend theories that applied to people; behavior modification, shaping, and schedules of reinforcement are all examples of this work. The Skinner boxes that bear his name were used to train animal behavior with rewards and punishments. For instance, Skinner used the method to teach pigeons to "read": the bird learned through operant conditioning that pecking when a sign said "Peck" would result in food. Skinner (1938) coined the term operant conditioning to describe changing behavior through reinforcement given after the desired response, and he distinguished three types of responses that can follow a behavior.

What is the conclusion of the Skinner theory?

The conclusion of operant conditioning is that, by using either positive or negative reinforcement, we can encourage or discourage a particular trait or behavior; by rewarding or penalizing behavior, we can influence it. Skinner was heavily criticized for underestimating free will, since in his account all actions are dictated by the odds of reinforcement; from such a perspective, any human action can be justified by blaming outside forces.

Among American psychologists, Skinner had the most sway. He was a behaviorist who created the theory of operant conditioning, which holds that actions are determined by their consequences, the rewards or penalties that affect how likely they are to be repeated. Maintaining a learned behavior also depends on reinforcement (Skinner, 1963), of which there are two kinds: positive reinforcement and negative reinforcement.

An especially strong point of Skinner's research is how easily it can be applied, with little formal training, to a variety of social contexts, including the family, the workplace, and education. The idea of positive punishment is also applied in Skinner's operant conditioning: the objective of any form of punishment is to lessen the behavior it follows, and positive punishment does so by adding an unpleasant consequence after the behavior.

What was the title of Skinner’s study?

Skinner's box experiment was developed to test operant conditioning, and it was heavily influenced by Edward Thorndike's experimental puzzle box. Thorndike used his puzzle box as a learning tool in a lab where he trained hungry cats through trial and error. The Skinner box is used to study animal behavior by identifying when an animal exhibits a desired behavior, rewarding it, and measuring how long it takes the animal to learn the behavior.

According to this theory, behavior that is followed by positive consequences is more likely to be repeated than behavior that is followed by negative consequences. B.F. Skinner called this concept operant conditioning; reinforcement was the new concept he added to Thorndike's Law of Effect. The same principles apply in the classroom: teachers want their students to act in a certain way and to understand the rules and procedures of the class, so they use positive reinforcement or negative consequences to increase desired behaviors and decrease undesirable ones. The underlying principles here are B.F. Skinner's principles of human motivation. Shaping, in which a voluntary behavior is associated with a reinforcing stimulus, is a component of operant conditioning learning theory; how this association is learned, and how well it trains behaviors, has been heavily influenced by the work of B.F. Skinner and its strong foundation in animal research.

What is the learning theory put forth by Skinner?

According to Skinner's theory of learning, a person is first exposed to a stimulus, which elicits a response, and the response is then reinforced (stimulus, response, reinforcement). In the end, this is what shapes our behaviors. Skinner's theory of operant conditioning has greatly aided psychology's understanding of how behavior is learned: it explains why reinforcements are so useful in the learning process and how reinforcement schedules can influence the results of conditioning.

The distinction that Skinner draws between verbal and nonverbal operant behavior is based on the different criteria by which the environment selects each response, and a definition of verbal behavior must specify the relevant criterion. Theoretically, according to Skinner, operant behavior should involve a repeatable response, such as pressing a lever for rats or pecking an illuminated disk (key) for pigeons. The Skinner theory of operant conditioning proposes that reinforcement and punishment are the causes of learning and behavior modification: by strengthening a response, reinforcement increases the likelihood that the behavior will recur in the future.

Which technique did Skinner employ to train the birds?

Skinner trained the pigeon by projecting a target onto a screen and, using a method known as shaping, feeding the bird when it pecked at the projected dot. (Skinner also described four characteristics of verbal behavior: relational, mediational, communal, and stipulational.) The Skinner box itself is used to study animal behavior by identifying when an animal exhibits a desired behavior, rewarding it, and measuring how long it takes the animal to learn the behavior.

Reinforcement theory also has its limitations. One of its main tenets is that behavior is influenced solely by consequences; the theory does not take into account deeper motivations or people's internal feelings, which can produce contradictory outcomes. Skinner also found that intermittent (partial) reinforcement produces the slowest rate of extinction, while continuous reinforcement is the kind that extinguishes the fastest. In operant conditioning, a behavior is modified only after it has been demonstrated: after an animal or a human engages in a particular behavior, the consequence that follows is either a reinforcer or a punisher.


MIT Technology Review


Pigeon pilots

Christina Couch, SM ’15


As World War II raged in the 1940s, a showdown between missile guidance systems unfolded in Building 20, home of the recently established MIT Radiation Laboratory. The “Rad Lab” team was secretly building microwave radar technologies to support the Allied troops, focusing on detection: aircraft-locating technologies and target-finding explosives. But in the Midwest, the psychologist B. F. Skinner had been concocting an alternative plan: instead of building guidance technologies, why not let animals with innate homing abilities do the guiding? He began training pigeons to pilot bombs.

Skinner viewed the idea as a logical extension of years spent studying animal behavior. He had already conducted his now-famous rat experiments, which showed that behavior can be controlled by manipulating environmental conditions, and he knew that if complicated jobs were broken into simple components, animals given enough time (and treats) could master them piece by piece. The US Army already used trained pigeons for message delivery. Why not for bomb delivery, too?

In the months following Hitler’s 1939 bombing of Warsaw, Skinner, who was then a professor at the University of Minnesota, wondered whether defensive missiles dropped from high altitudes could take out attack planes in midair. While on a train, he spotted birds flying in formation and thought the solution might be “waiting for me in my own backyard.”

But first, he had to teach birds to recognize targets. Skinner cut the toe out of a sock and slipped a pigeon inside, head poking out the end. Wings restrained and legs tied with shoestring, the bird was strapped to a wooden block and placed in an apparatus attached to an electrical hoist that could move up and down, and to a track that ran across the ceiling and down Skinner’s wall. By nudging strategically placed lightweight rods, the pigeon could steer the contraption up or down, right or left. Skinner set up a hanging bull’s-eye that held a cup of grain and gradually moved the starting point of the pigeon’s steering contraption away from it, eventually teaching the bird to align itself directly in front of the target as it was wheeled across the room.

Meanwhile, MIT researchers were making their own breakthroughs. Research on the microwave radar defense systems had begun at the Rad Lab shortly after the National Defense Research Committee had established it in early 1941. In 1942, NDRC’s successor, the Office of Scientific Research and Development (OSRD)—which was directed by Vannevar Bush, EGD ’16, a former dean of the MIT School of Engineering—launched its guided missile division at the Rad Lab.

The guided missile team focused on developing homing missiles that could be carried externally on high-flying planes. Throughout 1942, they built several experimental systems, including one for the guided glide bomb known as the Pelican. “Project Pelican” was designed to guide a receiver in the bomb’s nose cone to a ground-based transmitter acting as a target. The Rad Lab was making radar breakthroughs in areas like navigation and air-to-surface vessel detection. But hundreds of (bomb-free) test flights later, guided missile systems were still unreliable.

By late 1942, Skinner had requested OSRD funds for pigeon research twice, and twice had been rejected. A fellow researcher mentioned the work to executives at General Mills, and the company gave Skinner $5,000 and space in an old flour mill to develop the project so that its feasibility could be proved to the government. Skinner, several students, and a General Mills engineer began testing how restrained pigeons performed under extreme flight conditions. Upon finding that underfed birds stayed food-focused amid changes in temperature, pressure, acceleration, noise, and vibrations, they also began devising ways to turn precision pecking into precision bombing. 

The team built an apparatus to train four birds simultaneously. Each pigeon was harnessed and placed in a tight enclosure with a four-inch opening near the bird’s beak. The opening contained a translucent plate with an image of a target projected from a lens outside the enclosure. Light beams formed an X where the pigeon was to strike, and as it pecked, grain dropped onto the plate. As the target moved across the field of view, pigeons moved their heads to keep pecking directly on the center of the target—and keep the food coming. (Skinner would later realize that cannabis made pigeons “almost fearless,” and replaced the grain with hemp seed.) The OSRD awarded General Mills $25,000 to develop “Project Pigeon” for the Pelican, which had been designed to fly without radically changing direction. If a pigeon guidance system could move the bomb just a few degrees toward the target as it dropped, that would increase accuracy.

Skinner’s team built a “multiple bird unit” with three lenses, three screens, and pressurized chambers for three restrained birds who’d been trained to recognize aerial footage of a target. Conditioned to expect that grain would eventually drop, the birds diligently pecked at targets projected onto the screens. Each screen had air valves positioned on the north, south, east, and west edges. When the target image moved away from center (and pecking did, too), air on the opposite side of the plate was released, triggering pneumatic sensors and sending directional signals to move the glide bomb’s fins. To prevent errors, two of the three birds had to agree before the system changed direction. The team built a simulator that mimicked Pelican’s steering.

In early 1944, after learning that OSRD had not renewed his contract, Skinner headed to MIT to show the researchers in Building 20 what they were missing. In a live demonstration, the birds successfully followed the target, pecking fast and hard enough to keep the image within 3 degrees of the center of the screen. Reactions were mixed. While servomechanisms engineer (and future MIT Graduate School dean) Harold Hazen ’24, SM ’29, SCD ’31, said the pigeons were “better than radar,” others doubted that Skinner’s test conditions accurately simulated a glide bomb in flight and said it would be too difficult to fine-tune the servomechanism linking the pecking to the steering. Instead of convincing OSRD to renew Project Pigeon’s funding, the demo proved what Skinner himself had once observed: “A pigeon was more easily controlled than a physical scientist serving on a committee.”

Project Pigeon was shelved in favor of another promising animal-inspired bomb. In May 1944 a team led by the National Bureau of Standards, including a Rad Lab research group headed by Ralph Lamm and Perry Stout, began testing the “Bat” bomb—a missile that emitted radio waves and, like its namesake, used the reflected signals to home in on its target. It went on to become the first fully automatic guided missile deployed in combat.

Experimental psychologist Franklin V. Taylor, who headed a psychology research branch at the Naval Research Laboratory in Washington, DC, briefly rebooted Project Pigeon in 1948 after seeing a film of Skinner’s bird pilots. His team attached gold electrodes to the pigeons’ beaks and projected the target images onto conductive screens that transferred their pecks into electronic signals, replacing the pneumatic controls. But the project was canceled in 1953 as other electronic guidance systems became more reliable.

Although pigeon-guided bombs never took flight, Skinner stood by the research. “Call it a crackpot idea if you will,” he wrote in a summary of the project published in 1960. “It is one in which I have never lost faith.”

AT THE SMITHSONIAN

B.F. Skinner’s Pigeon-Guided Rocket

On this date 21 years ago, noted psychologist and inventor B.F. Skinner died; the American History Museum is home to one of his more unusual inventions

Joseph Stromberg

Nose Cone from B.F. Skinner's Pigeon-Guided Missile, on display in "Science in American Life."

It’s 1943, and America desperately needs a way to reliably bomb targets in Nazi Germany. What do we do? For B.F. Skinner, noted psychologist and inventor, the answer was obvious: pigeons.

“During World War II, there was a grave concern about aiming missiles,” says Peggy Kidwell, a curator of Medicine and Science at the American History Museum. “Military officials really wanted to figure out how to aim them accurately.” Skinner approached the National Defense Research Committee with his plan, code-named “Project Pigeon.” Members of the committee were doubtful, but granted Skinner $25,000 to get started.

Skinner had already used pigeons in his psychological research, training them to press levers for food. An obsessive inventor, he had been pondering weapons targeting systems one day when he saw a flock of birds maneuvering in formation in the sky. “Suddenly I saw them as ‘devices’ with excellent vision and extraordinary maneuverability,” he said. “Could they not guide a missile? Was the answer to the problem waiting for me in my own back yard?”

Getting to work, Skinner decided on pigeons because of both their vision and unflappable behavior in chaotic conditions. He built a nose cone for a missile fitted with three small electronic screens and three tiny pigeon cockpits. Onto the screens was projected an image of the ground in front of the rocket.

“He would train street pigeons to recognize the pattern of the target, and to peck when they saw this target,” says Kidwell. “And then when all three of them pecked, it was thought you could actually aim the missile in that direction.” As the pigeons pecked, cables harnessed to each one’s head would mechanically steer the missile until it finally reached its mark. Alas, without an escape hatch, the birds would perish along with their target, making it a kamikaze mission.

Despite a successful demonstration of the trained pigeons, officials remained skeptical and eventually decided to terminate the project. Skinner, of course, would go on to become one of the country’s most influential psychologists, popularizing behaviorism, a conception of psychology that views behavior as a reaction to one’s environment.

He also kept inventing. As part of his research, Skinner designed a number of devices that used feedback processes to encourage learning. “After the war, he became very interested in machines for teaching people to do things,” says Kidwell. “In 1954, he had this machine for teaching arithmetic to young people, and in 1957 he designed a machine for teaching Harvard students basic natural sciences.”

Although Skinner’s machines were purely mechanical, the ideas he developed have been incorporated into many educational software programs in recent years, including some used in distance learning settings. “Many of his ideas are now most frequently seen by people as they have been incorporated in electronic testing. That programmed learning, where you have a series of questions, and responses, and based on the response you gave you are directed to the next question, is very much in a Skinnerian framework,” Kidwell says.
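
Kidwell’s description of programmed learning maps naturally onto a simple branching data structure: each question routes the learner to a different next item depending on the response. The sketch below is purely illustrative; the questions, keys, and branching scheme are invented for this article and do not reproduce Skinner’s machines or any particular software.

```python
# Illustrative sketch of branching programmed instruction.
# Each item names the next item to present for a correct or incorrect answer.
# All content here is hypothetical.

QUESTIONS = {
    "q1": {
        "prompt": "7 x 8 = ?",
        "answer": "56",
        "if_correct": "q2",      # move ahead
        "if_incorrect": "q1r",   # branch to a remedial item
    },
    "q1r": {
        "prompt": "7 x 8 is the same as 7 x 10 minus 7 x 2. What is it?",
        "answer": "56",
        "if_correct": "q2",
        "if_incorrect": "q1r",   # repeat until correct
    },
    "q2": {
        "prompt": "9 x 6 = ?",
        "answer": "54",
        "if_correct": None,      # end of the sequence
        "if_incorrect": "q2",
    },
}

def run_program(start: str = "q1") -> None:
    """Walk the learner through the branching sequence of questions."""
    key = start
    while key is not None:
        item = QUESTIONS[key]
        response = input(item["prompt"] + " ").strip()
        correct = response == item["answer"]
        print("Correct!" if correct else "Not quite, try this:")
        key = item["if_correct"] if correct else item["if_incorrect"]

if __name__ == "__main__":
    run_program()
```

The immediate feedback after every response, and the routing of wrong answers to smaller remedial steps, is the Skinnerian idea Kidwell points to.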

Skinner’s missile prototype, along with other teaching machines, came to the Smithsonian at the end of his career. “Skinner was a teacher of Uta C. Merzbach, who was a curator in this museum,” says Kidwell. “They had a very good relationship, so when he was writing his autobiography, when he had finished writing about a particular machine, he would give it to the museum.” The American History Museum is home to several Skinner teaching machines, as well as the missile, which is on display in the “Science in American Life” exhibition.

As for the pigeons? Skinner held on to them and, just out of curiosity, occasionally tested them to see if their skills were still sharp enough for battle. One, two, four, and even six years later, the pigeons were still pecking strong.
