Shaping is a concept that many pet owners find hard to grasp. We're used to making animals do things by leading them or pushing them into the behavior we want—and it is hard to believe that there is another way. Common sense tells us that there is no possible way to get an animal to do something it has never done before while doing nothing yourself but reinforcing spontaneous movements.
The word "shaping" is scientific slang for building a particular behavior by using a series of small steps to achieve it. Shaping allows you to create behavior from scratch without physical control or corrections, but rather by drawing on your animal's natural ability to learn.
Even B.F. Skinner did not start out training animals by capturing and shaping spontaneously offered behavior. Initially, he taught his laboratory animals to press levers and accomplish other tasks by making small changes in the environment: raising the height of a bar in small increments until an animal had to reach higher, or increasing the "stiffness" of a button so a pigeon learned to peck harder. This method was called successive approximation.
In 1943, while waiting for a government grant to come through, Skinner and two of his graduate students decided to see if they could teach one of their experimental pigeons to bowl in a laboratory on the top floor of a building in Minnesota.
They started by putting the pigeon and a wooden ball in a box rigged with an automatic feeder, planning to trip the feeder when the pigeon swiped at the ball with its beak. But the pigeon did not swipe at the ball as they had hoped, and they grew tired of waiting. Skinner decided to reinforce any movement toward the ball, even just one look toward it. When the pigeon looked in that direction, he clicked the switch, opening the feeder briefly so the pigeon could get a bit of corn.
Skinner later wrote, "The result amazed us. In a few moments, the ball was caroming off the walls of the box as if the pigeon had been a champion squash player." Skinner had made a discovery that astonished even him: It was much easier to shape behavior by hand than by changing the environment.
Skinner's daughter, behavior analyst Julie Vargas, Ph.D., has told me, "His realization at that moment was that if you could do this, you could shape behavior anywhere, in any environment." You did not need to manipulate the task or build elaborate apparatus. You could just reinforce moves in the right direction.
Skinner named this newly discovered method shaping, to differentiate it from the mechanical process of successive approximation.
Shaping depends on reinforcing the desired action instantaneously, as it is happening. A key factor in Skinner's early research setting was that the feeders made noise as soon as they were tripped. This click became the conditioned reinforcer that meant food was coming. It was the marker signal that identified the move being reinforced.
Skinner recognized the value of the conditioned reinforcer. For the cameras of Look magazine, he trained a dog to jump higher and higher up a wall using a sound and some food; in a popular magazine article in 1951, he recommended the toy cricket or clicker as a good conditioned reinforcer for dog training.
Some people in the behavioral and animal communities have taken to using the word "shaping" to describe any training that increases a response in small increments, even though the response may be generated or elicited by luring, force, verbal instruction, environmental manipulation, or other external pressure, rather than being offered spontaneously. However, the correct term for these non-spontaneous methods is successive approximation. Many animal trainers and sports coaches have used successive approximation for years, gradually raising the height of jumps, the distance of a race, and the heaviness of weights, all to improve performance. The terms "free shaping" and "cold shaping" have arisen as additional ways to identify true shaping, in which the animal's volunteered or spontaneous behavior is the key factor in the development of the behavior.
Gifted trainers have also used timely praise and play to reinforce spontaneous behavior, and thus develop new kinds of performance without baiting or forcing the movements. The scientific importance of Skinner's discovery was that these principles became generally applicable by any user and in any learning situation, not just by the rare, intuitive, or masterful individual.
Sometimes faster is better
An important characteristic of shaping is the speed with which new responses can develop. This is not a method that requires a lot of practice and repetition.
Often, as Skinner reported with his ball-playing pigeon, complex new behavior can develop in a few minutes. Francis Mechner, Ph.D., suggests that one explanation for this rapid increase in behavioral topographies is that the marker identifies not only a position—the paw is three inches in the air—but a vector, a movement in a direction. By clicking during the upward movement of the paw, the shaper reinforces not only the current outcome—a three-inch lift—but also the action that is taking place: lifting upward. Reinforcement quickly leads to stronger paw movements and higher lifts, giving the shaper even more and larger behaviors to select.
Birth of clicker training
Keller Breland, one of the graduate students present at the moment of Skinner's discovery, left psychology to develop a business based on animal training. In the 1960s, he was one of several behaviorists who carried shaping by use of a marker signal (usually a whistle) into the relatively new world of marine mammal training. Through the late 1980s and early 1990s, after nearly thirty years of development in oceanariums around the world, marker-based shaping spread further, from the marine mammal world into the zoo world, carried to the management of other species by keepers, curators, and consultants, some of whom began their careers as marine mammal trainers.
Over the next decades, however, the behavioral research community largely dismissed the importance of the marker signal, focusing instead on the value to the learner, whether animal or human, of the primary reinforcer, usually food. In shaping behavior in the modern research setting, cooperation in animals is often still secured by increasing hunger, keeping research animals at 85% of normal body weight. In humans, as when teaching necessary skills to children with developmental deficits, cooperation is sought by identifying and using highly preferred food items.
Clicker training, a popular method of training dogs, horses, and other pets, uses shaping and a marker signal, the clicker, in place of traditional prompting and correction-based training. It dates to two presentations in May of 1992. One, which I organized and led, occurred at the annual meeting of the Association for Behavior Analysis in San Francisco and included dog trainer Gary Wilkes (the first to locate and use a commercially available plastic box clicker with dogs), San Diego Zoo curator Gary Priest, and Sea Life Park head trainer Ingrid Kang Shallenberger. That same weekend, Wilkes, Shallenberger, and I presented a seminar for 250 dog trainers outside of San Francisco. The subsequent rapid expansion of the clicker training community was fueled by the widening availability of the internet.
The shape of things to come
The uses and practices of shaping continue to evolve. In 2001, horse trainer and gymnastics coach Theresa McKeon, together with biochemist and dog trainer Joan Orr and dance teacher Beth Wheeler, began developing the use of the marker signal in teaching physical skills to humans, a practical applications system dubbed TAGteach. As with any emerging technology, new practices raise questions for the basic underlying sciences.
Clicker training and shaping-related studies of both the underlying principles and their applications are underway in behavioral ecology, behavior analysis, sports psychology, and neuroscience.
We've covered a lot of ground in a relatively short amount of time. Stay tuned for the shape of things to come.