  • In Experiment 1, pigeons' pecks on a green key led to a 5-s delay with green houselights, and then food was delivered on 20% (or, in other conditions, 50%) of the trials. Pecks on a red key led to an adjusting delay with red houselights, and then food was delivered on every trial. The adjusting delay was used to estimate indifference points: delays at which the two alternatives were chosen about equally often. Varying the presence or absence of green houselights during the delays that preceded possible food deliveries had large effects on choice. In contrast, varying the presence of the green or red houselights in the intertrial intervals had no effects on choice. In Experiment 2, pecks on the green key led to delays of either 5 s or 30 s with green houselights, and then food was delivered on 20% of the trials. Varying the duration of the green houselights on nonreinforced trials had no effect on choice. The results suggest that the green houselights served as a conditioned reinforcer at some times but not at others, depending on whether or not there was a possibility that a primary reinforcer might be delivered. Given this interpretation of what constitutes a conditioned reinforcer, most of the results were consistent with the view that the strength of a conditioned reinforcer is inversely related to its duration.
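The adjusting-delay procedure described above can be sketched as a small simulation. This is a minimal illustration, not the authors' implementation: it assumes a chooser governed by hyperbolic discounting with a hypothetical rate parameter `k`, and the function names, step size, and noise level are all illustrative assumptions. The adjusting delay is lengthened after each choice of the adjusting (always-reinforced) alternative and shortened after each choice of the standard alternative, so over trials it titrates toward the indifference point.

```python
import random

def subjective_value(amount, delay, k=0.2):
    # Hyperbolic discounting: value falls with delay (k is an assumed rate).
    return amount / (1 + k * delay)

def titrate(standard_delay=5.0, p_food=0.2, trials=2000, step=1.0, seed=0):
    """Estimate an indifference point with an adjusting-delay procedure.

    Standard alternative: fixed delay, food with probability p_food.
    Adjusting alternative: variable delay, food on every trial.
    Choosing the adjusting side lengthens its delay; choosing the
    standard side shortens it, so the delay homes in on indifference.
    """
    rng = random.Random(seed)
    adjusting_delay = 1.0
    history = []
    for _ in range(trials):
        # Expected amount on the standard side is p_food * 1 unit of food.
        v_standard = subjective_value(p_food * 1.0, standard_delay)
        v_adjusting = subjective_value(1.0, adjusting_delay)
        # Small choice noise keeps the delay oscillating around indifference.
        chose_adjusting = (v_adjusting + rng.gauss(0, 0.01)
                           > v_standard + rng.gauss(0, 0.01))
        adjusting_delay += step if chose_adjusting else -step
        adjusting_delay = max(0.0, adjusting_delay)
        history.append(adjusting_delay)
    # Indifference point: mean adjusting delay over the final block of trials.
    return sum(history[-200:]) / 200

print(round(titrate(), 1))
```

Under these assumed parameters the two sides have equal hyperbolic value when 1/(1 + 0.2d) = 0.1, i.e., near d = 45 s, so the titrated delay should settle in that vicinity; the exact estimate depends on the noise and step size.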

  • Two experiments studied the phenomenon of procrastination, in which pigeons chose a larger, more delayed response requirement over a smaller, more immediate response requirement. The response requirements were fixed-interval schedules that did not lead to an immediate food reinforcer, but that interrupted a 55-s period in which food was delivered at random times. The experiments used an adjusting-delay procedure in which the delay to the start of one fixed-interval requirement was varied over trials to estimate an indifference point: a delay at which the two alternatives were chosen about equally often. Experiment 1 found that as the delay to a shorter fixed-interval requirement was increased, the adjusting delay to a longer fixed-interval requirement also increased, and the rate of increase depended on the duration of the longer fixed-interval requirement. Experiment 2 found a strong preference for a fixed delay of 10 s to the start of a fixed-interval requirement compared to a mixed delay of either 0 or 20 s. The results help to distinguish among different equations that might describe the decreasing effectiveness of a response requirement with increasing delay, and they suggest that delayed reinforcers and delayed response requirements have symmetrical but opposite effects on choice.

  • For several decades, choice has been the focus of considerable research by those who study operant behavior. This is not surprising, because the topics of choice and operant behavior are intimately intertwined. In everyday life, people can choose among a large, almost infinite set of operant behaviors, and they can choose not only which behaviors to perform, but under what conditions, at what rate, and for how long. Because choice is an essential part of human (and animal) life, it has been studied with great interest not only by behavioral psychologists, but also by decision theorists, economists, political scientists, biologists, and others. The research methods used in these different disciplines vary widely, and a review of all of the different methods for studying choice is well beyond the scope and purpose of this chapter. Instead, the chapter will focus on the techniques most frequently used in operant research—techniques that involve single-subject designs, that allow precise control of the reinforcement contingencies, and that produce (in most cases) large and clear effects on each subject’s behavior.

Last update from database: 3/13/26, 4:15 PM (UTC)
