Why did I do that? A Primer on B.F. Skinner
- May 18, 2016
- Posted by: Administrator
- Category: Psychology Explained
“I did not direct my life. I didn’t design it. I never made decisions. Things always came up and made them for me. That’s what life is.” – B. F. Skinner
Burrhus Frederic (B.F.) Skinner is known as one of the most influential American psychologists. His theory? We are what we do. Not only that, we can change what we do for the better.
What he proposed may not seem remarkable now, but it was at the time. His work initially focused on lab animals; later he applied his findings more and more broadly to people, studying how environments shape human behavior.
Have you ever trained a dog or other animal to perform a trick? Whether it’s rolling over or performing a more unusual trick that lands you on Dave Letterman’s show, you’re following in Skinner’s footsteps. His work was based on the idea that learning is a function of change in overt behavior.
Changes in behavior are the result of an individual’s response to events, or stimuli, that occur in the environment. A response produces a consequence. In other words, you train a dog to change its behavior (e.g., to roll over) by giving a reward.
A simple idea in dog terms, but more complicated when it comes to people. In fact, Skinner once boasted, “Give me a child, and I’ll shape him into anything.”
Pavlov vs. Skinner
Before Skinner came Ivan Pavlov, the father of “classical conditioning.” In classical conditioning, an existing, involuntary response is paired with a new stimulus. The most well-known example is how Pavlov got his dogs to salivate at a neutral stimulus, the sound of a metronome (popularly remembered as a bell). Feed the dog every time the metronome sounds and eventually, when it sounds on its own, the dog salivates.
Skinner’s approach, known as operant conditioning, was different. For him, the reward came after the behavior: operant conditioning reinforces partial behaviors, or random acts that approximate the desired behavior, and in this way can be used to shape behavior.
If the goal is to have a pigeon turn in a circle to the left, a reward is given for any small movement to the left. Think you’re smarter than a pigeon? Have you played the lottery or a slot machine in Las Vegas? The casino owners train you to pour money into slot machines by letting you win just often enough to keep you hoping.
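To make the shaping idea concrete, here is a minimal toy simulation in Python. It is not Skinner’s actual procedure, and the numbers are invented for illustration: a simulated “pigeon” picks random movements, and any movement that brings it closer to a full left turn than before earns a reward, which makes that kind of movement more likely on later trials.

```python
import random

# Toy sketch of shaping by successive approximation (illustrative only).
# Actions are small turns in degrees; positive values mean "to the left."
actions = [-10, -5, 0, 5, 10, 20]
weights = {a: 1.0 for a in actions}   # equal starting tendencies

def choose_action():
    """Pick an action with probability proportional to its current weight."""
    total = sum(weights.values())
    r = random.uniform(0, total)
    for a, w in weights.items():
        r -= w
        if r <= 0:
            return a
    return actions[-1]

best_so_far = 0
progress = 0
for trial in range(500):
    a = choose_action()
    progress += a
    # Reward any act that approaches the goal (a fuller left turn than before);
    # the reward strengthens the tendency to repeat that act.
    if progress > best_so_far:
        best_so_far = progress
        weights[a] += 0.5

print("Learned tendencies:", {a: round(w, 1) for a, w in weights.items()})
print("Total leftward rotation:", progress, "degrees")
```

Run it a few times and the weights for the leftward movements grow while the others stay flat, which is the whole point of shaping: reward approximations, and the full behavior emerges.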
Skinner Box
One of his most well-known inventions was a wooden box commonly referred to as the “Skinner Box.” To study operant conditioning, lab animals such as rats were placed in the box and left free to roam. When a rat accidentally pressed a bar, it was rewarded with a small pellet of food.
As time went on, the rats pressed the bar more and more frequently to receive the reward. Skinner was interested in the rate of responding; he found that when responses are reinforced, their rate of occurrence increases. In human terms: if I happen to be wearing a certain pair of socks on the day I hit a home run, those become my lucky socks, and I have to wear them every time I play baseball.
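Here is a rough sketch of that “rate of responding” idea, again purely illustrative (the probabilities and increments are assumptions, not data from Skinner’s experiments): a simulated rat presses a bar with some probability each second, and each reinforced press nudges that probability upward, so the response rate climbs block by block.

```python
import random

# Hypothetical illustration: each reinforced bar press slightly raises the
# probability of pressing again, so the rate of responding increases.
press_probability = 0.05        # chance of a bar press in any given second
presses_per_block = []

for block in range(10):         # ten blocks of 100 seconds each
    presses = 0
    for second in range(100):
        if random.random() < press_probability:
            presses += 1        # the rat presses the bar and gets a pellet
            # reinforcement strengthens the pressing behavior (capped at 0.9)
            press_probability = min(0.9, press_probability + 0.01)
    presses_per_block.append(presses)

print("Presses per 100-second block:", presses_per_block)
```

The printed counts rise from one block to the next, which mirrors the cumulative-response curves Skinner recorded, though the shape here depends entirely on the made-up reinforcement rule above.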
Skinner’s Legacy
Because Skinner believed that everything we do and are is shaped by our experience of punishment and reward, his ideas have been instrumental in education. He urged school systems to move away from punishment and negative reinforcement because “Positive reinforcement can be as effective as negative reinforcement and has many fewer unwanted byproducts.”
For example, students who are punished when they do not study may study, but they may also stay away from school (truancy), vandalize school property, attack teachers, or stubbornly do nothing. Redesigning school systems so that what students do is more often positively reinforced can make a great difference.
References
B.F. Skinner Foundation. Retrieved 16 March 2010.
The Journal for the Experimental Analysis of Behavior (JEAB). Retrieved 16 March 2010.
The Journal of Applied Behavior Analysis (JABA). Retrieved 16 March 2010.
Skinner, B.F. A Brief Survey of Operant Behavior. Retrieved 16 March 2010.
Skinner, B.F. (1950). Are theories of learning necessary? Psychological Review, 57(4), 193-216.
Skinner, B.F. (1953). Science and Human Behavior. New York: Macmillan.
Skinner, B.F. (1954). The science of learning and the art of teaching. Harvard Educational Review, 24(2), 86-97.
Skinner, B.F. (1957). Verbal Behavior. New York: Appleton-Century-Crofts.
Skinner, B.F. (1968). The Technology of Teaching. New York: Appleton-Century-Crofts.
Skinner, B.F. (1971). Beyond Freedom and Dignity. New York: Knopf.