Operant Conditioning, the Law of Effect, Successive Approximations, and Shaping
Thorndike developed the second theory to grow out of learning theory: the law of effect, which later became the foundation of Skinner’s operant conditioning. He was interested in the effect that consequences have on a person’s or animal’s behavior.
He took alley cats and placed them in cages. The cats discovered they could lift a latch and escape. What began as an accidental act by one of the cats soon became a learned strategy for all of them.
The Law of Effect
The law of effect states that behavior followed by a satisfying result will be repeated, while behavior followed by unpleasant consequences will eventually stop. Consequences became the key to controlling behavior, and the field called instrumental learning grew out of this concept.
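This trial-and-error dynamic is easy to see in a toy simulation. The sketch below is a computational analogy written for this article, not anything Thorndike formalized; the response names and numbers are invented. A response that is followed by a satisfying result gains strength, and an unrewarded response loses it:

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

# Invented response strengths for a cat in a latched cage.
weights = {"lift_latch": 1.0, "scratch": 1.0, "meow": 1.0}

def choose(weights):
    """Pick a response with probability proportional to its strength."""
    total = sum(weights.values())
    r = random.uniform(0, total)
    for response, w in weights.items():
        r -= w
        if r <= 0:
            break
    return response

def learn(response, satisfying, step=0.5):
    """Strengthen a rewarded response; weaken an unrewarded one (floor 0.1)."""
    if satisfying:
        weights[response] += step
    else:
        weights[response] = max(0.1, weights[response] - step)

# Only lifting the latch opens the cage, so only it is ever "satisfying".
for _ in range(200):
    response = choose(weights)
    learn(response, satisfying=(response == "lift_latch"))
```

After a couple hundred trials the latch response tends to dominate the distribution, mirroring how the cats’ accidental escape hardened into a strategy.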
Skinner modified the law of effect subtly. In his theory of operant conditioning he placed less emphasis on stimuli and responses and more on the environment. To change or form new behavior, the organism must respond to the environment, and for learning to be complete, a consequence must be attached to the organism’s response. Thus he renamed the law of effect the principle of reinforcement.
Skinner conducted his experiments with pigeons. When a pigeon pecked repeatedly at the spot where water appeared, the water acted as a positive reinforcer: a consequence found in the environment that strengthened the pigeon’s pecking behavior.
Negative reinforcement is often misunderstood; it does not mean applying a punishment. It means removing something aversive from the environment, which encourages the pigeon or person to do something they were afraid of doing. For example, if a person has a fear of flying, removing that fear will encourage the person to start flying.
Skinner believed that all behavior could be predicted, observed, encouraged, or extinguished through reinforcements found in society.
Shaping is getting behavior to conform more and more closely to a certain standard or activity. Skinner would place a rat in his famous Skinner box and shape it to pull a lever for food. At first, of course, the rat did not do that; it would simply explore the cage. Skinner would purposely drop a pellet of food, and the rat would remain in that area. Eventually Skinner would require the rat to do more to get the pellet, thereby shaping the animal’s behavior.
To shape the behavior in question, Skinner would reinforce a series of behaviors, each gradually closer to the desired response. This worked with his pigeons and rats, and it works with people as well. Take the case of phobias: a person who has a fear of dogs cannot simply walk up to a dog and pet it. So the therapist starts slowly, perhaps having the person look at pictures of dogs. When that works, the client visits a shelter and looks at dogs in cages. When that works, the next approximation might be standing across the room from an uncaged dog, and finally slowly approaching the dog and petting it. All of these graduated steps are called successive approximations.
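The step-by-step logic of successive approximations can be sketched in a few lines of code. Everything here, the step list, the `shape` function, and the tolerance numbers, is hypothetical, invented purely to illustrate how each reinforced step makes the next, closer approximation attainable:

```python
# Hypothetical approximations toward the target behavior (petting a dog).
steps = [
    "look at pictures of dogs",
    "look at caged dogs at a shelter",
    "stand across the room from an uncaged dog",
    "approach and pet the dog",
]

def shape(tolerance_gained_per_success=1):
    """Walk through the approximations, reinforcing each completed step."""
    tolerance = 0          # how many steps beyond "pictures" the client can handle
    completed = []
    for i, step in enumerate(steps):
        if i > tolerance:  # criterion is too far beyond current behavior
            break
        completed.append(step)                      # client performs this step
        tolerance += tolerance_gained_per_success   # reinforcement extends reach
    return completed
```

With reinforcement after each success (`shape()`), every approximation is eventually completed; with no gain per success (`shape(0)`), the client never gets past the first step, which is why the therapist cannot jump straight to the target behavior.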
Modeling is simply learning by watching others. Children will learn both good and bad traits from other people. It is believed that phobias may be learned behaviors from childhood. Children of parents who have phobias often have the same phobias.