

JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR

A QUANTITATIVE ANALYSIS OF THE RESPONDING MAINTAINED BY INTERVAL SCHEDULES OF REINFORCEMENT¹

A. CHARLES CATANIA AND G. S. REYNOLDS

NEW YORK UNIVERSITY AND UNIVERSITY OF CALIFORNIA, SAN DIEGO

Interval schedules of reinforcement maintained pigeons' key-pecking in six experiments. Each schedule was specified in terms of mean interval, which determined the maximum rate of reinforcement possible, and distribution of intervals, which ranged from many-valued (variable-interval) to single-valued (fixed-interval). In Exp. 1, the relative durations of a sequence of intervals from an arithmetic progression were held constant while the mean interval was varied. Rate of responding was a monotonically increasing, negatively accelerated function of rate of reinforcement over a range from 8.4 to 300 reinforcements per hour. The rate of responding also increased as time passed within the individual intervals of a given schedule. In Exp. 2 and 3, several variable-interval schedules made up of different sequences of intervals were examined. In each schedule, the rate of responding at a particular time within an interval was shown to depend at least in part on the local rate of reinforcement at that time, derived from a measure of the probability of reinforcement at that time and the proximity of potential reinforcements at other times. The functional relationship between rate of responding and rate of reinforcement at different times within the intervals of a single schedule was similar to that obtained across different schedules in Exp. 1. Experiments 4, 5, and 6 examined fixed-interval and two-valued (mixed fixed-interval fixed-interval) schedules, and demonstrated that reinforcement at one time in an interval had substantial effects on responding maintained at other times. It was concluded that the rate of responding maintained by a given interval schedule depends not on the overall rate of reinforcement provided but rather on the summation of different local effects of reinforcement at different times within intervals.

CONTENTS

Experiment 1: Rate of responding as a function of rate of reinforcement in arithmetic variable-interval schedules.

Experiment 2: Effects of a zero-sec interval in an arithmetic variable-interval schedule.

Experiment 3: Effects of the distribution of intervals in variable-interval schedules on changes in the local rate of responding within intervals.

Experiment 4: Overall and local rates of responding within three fixed-interval schedules.

Experiment 5: Effects of the separation in time of opportunities for reinforcement in two-valued interval schedules.

Experiment 6: Effects of the omission of reinforcement at the end of the long interval in two-valued interval schedules.

General Discussion.

Appendix I: Analysis in terms of interresponse times.

Appendix II: Constant-probability variable-interval schedules.

The statement that responses take place in time expresses a fundamental characteristic of behavior (Skinner, 1938, pp. 263-264). Responses occur at different rates, in different sequences, and with different temporal patterns, depending on the temporal relations between the responses and other events. One event of fundamental interest is reinforcement, and the rate at which responses occur and the changes in this rate over time are strongly determined by the schedule according to which particular responses are reinforced (e.g., Morse, 1966).

An interval schedule arranges reinforcement for the first response that occurs after a specified time has elapsed since the occurrence of a preceding reinforcement or some other environmental event (Ferster and Skinner, 1957). In such a schedule, the spacing of reinforcements in time remains roughly constant over a wide range of rates of responding. The schedule specifies certain minimum intervals between two reinforcements; the actual durations of these intervals are determined by the time elapsed between the availability of reinforcement, at the end of the interval, and the occurrence of the next response, which is reinforced. The patterns and rates of responding maintained by interval schedules usually are such that this time is short relative to the durations of the intervals.

¹This research was supported by NSF Grants G8621 and G18167 (B. F. Skinner, Principal Investigator) to Harvard University, and was conducted at the Harvard Psychological Laboratories. Some of the material has been presented at the 1961 and 1963 meetings of the Psychonomic Society. The authors' thanks go to Mrs. Antoinette C. Papp and Mr. Wallace R. Brown, Jr., for care of pigeons and assistance in the daily conduct of the experiments, and to Mrs. Geraldine Hansen for typing several revisions of the manuscript. We are indebted to many colleagues, and in particular to N. H. Azrin, who maintained responsibility for the manuscript well beyond the expiration of his editorial term, and to D. G. Anger, L. R. Gollub, and S. S. Pliskoff. Some expenses of preparation of the manuscript were defrayed by NSF Grant GB 3614 (to New York University), by NSF Grants GB 316 and GB 2541 (to the University of Chicago), and by the Smith Kline and French Laboratories. Expenses of publication were defrayed by NIH Grant MH 13613 (to New York University) and NSF Grants GB 5064 and GB 6821 (to the University of California, San Diego). Reprints may be obtained from A. C. Catania, Department of Psychology, New York University, University College of Arts and Sciences, New York, N.Y. 10453.

1968, 11, 327-383 NUMBER 3 (MAY) PART 2

Ferster and Skinner (1957, Ch. 5 and 6) have described in considerable detail some important features of the performances maintained by interval schedules. In a fixed-interval (FI) schedule, the first response after a fixed elapsed time is reinforced, and an organism typically responds little or not at all just after reinforcement, although responding increases later in the interval. In a variable-interval (VI) schedule, the first response after a variable elapsed time is reinforced, and a relatively constant rate of responding is maintained throughout each interval. Detailed examination shows, however, that this responding may be modulated by the particular durations of the different intervals that constitute the schedule. In other words, the distribution of responses in time depends on the distribution of reinforcements in time. For example, responding shortly after reinforcement increases with increases in the relative frequency of short intervals in the schedule (Ferster and Skinner, 1957, pp. 331-332). Thus, it is important to study not only the rate of responding averaged over the total time in an interval schedule, but also the changes in the rate of responding as time passes within individual intervals. (The former, a rate calculated over the total time in all the intervals of a schedule, will be referred to as an overall rate; the latter, a rate calculated over a period of time that is short relative to the average interval between reinforcements, will be referred to as a local rate. The terms will be applied to reinforcement as well as to responding. The terminology has the advantage of pointing out that both reinforcement and responding are measured in terms of events per unit of time.)

In a VI schedule, a response at a given time after reinforcement is reinforced in some intervals but not in others. The probability of reinforcement at this time is determined by the relative frequency of reinforcement at this time, which may be derived from the distribution of intervals in the schedule. The distribution of intervals in a VI schedule may act upon behavior because the time elapsed since a preceding reinforcement (or since any other event that starts an interval) may function as a discriminable continuum. Skinner (1938, p. 263 ff.), in his discussion of temporal discrimination, included the discrimination of the time elapsed since reinforcement as a factor in his account of the performances maintained by FI schedules. The major difference between FI and VI schedules is that an FI schedule provides reinforcement at a fixed point along the temporal continuum, whereas a VI schedule provides reinforcement at several points. The present account analyzes performances maintained by different interval schedules in terms of the local effects of different probabilities of reinforcement on the local rates of responding at different times.

Within interval schedules, reinforcement may be studied as an input that determines a subsequent output of responses (cf. Skinner, 1938, p. 130). In this sense, the study of the performances maintained by interval schedules is a study of response strength. The concept of response strength, once a reference to an inferred response tendency or state, has evolved to a simpler usage: it is "used to designate probability or rate of responding" (Ferster and Skinner, 1957, p. 733). This evolution is a result of several related findings: that the schedule of reinforcement is a primary determinant of performance; that different measures of responding such as rate and resistance to extinction are not necessarily highly correlated; that rate of responding is relatively insensitive to such variables as amount of reinforcement and deprivation; that rate of responding is itself a property of responding that can be differentially reinforced; and that rate of responding can be reduced to component interresponse times (e.g., Anger, 1956; Ferster and Skinner, 1957; Herrnstein, 1961; Skinner, 1938). Nevertheless, the relationship between reinforcement and responding remains of fundamental importance to the analysis of behavior. Many studies of response strength have been concerned with the acquisition of behavior (learning: e.g., Hull, 1943) or with the relative strengths of two or more responses (choice: e.g., Herrnstein, 1961). The present experiments emphasize reinforcement as it determines performance during maintained or steady-state responding, rather than during acquisition, extinction, and other transition states, and are concerned with absolute strength, rather than with strength relative to other behavior.
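The dependence of local reinforcement probability on the distribution of intervals can be illustrated with a small numerical sketch. The interval lists below are hypothetical (chosen only to contrast a single-valued, FI-like distribution with a many-valued, VI-like one), and the function name is ours; the calculation simply expresses "relative frequency of reinforcement at this time" as intervals ending at a given elapsed time divided by intervals lasting at least that long:

```python
def p_reinforcement(intervals, t):
    """Probability of reinforcement at elapsed time t, given that time t
    has been reached: intervals ending exactly at t divided by intervals
    lasting at least t."""
    survivors = [i for i in intervals if i >= t]
    return survivors.count(t) / len(survivors) if survivors else 0.0

fi_like = [30, 30, 30]   # single-valued: reinforcement at one point only
vi_like = [10, 20, 30]   # many-valued: reinforcement at several points

print(p_reinforcement(fi_like, 30))  # 1.0 at the fixed point, 0 elsewhere
print(p_reinforcement(vi_like, 10))  # one of three intervals ends here
print(p_reinforcement(vi_like, 30))  # 1.0: only the longest interval remains
```

Under this measure an FI-like distribution concentrates all probability at one point on the temporal continuum, while a VI-like distribution spreads it over several points, with the conditional probability rising as fewer intervals remain.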

EXPERIMENT 1: RATE OF RESPONDING AS A FUNCTION OF RATE OF REINFORCEMENT IN VARIABLE-INTERVAL SCHEDULES

The relation between the overall rate of reinforcement and the overall rate of a pigeon's key-pecking maintained by interval schedules may be thought of as an input-output function for the pigeon. In Exp. 1, this function was determined for VI schedules over a range of overall rates of reinforcement from 8.4 to 300 rft/hr (reinforcements per hour). Each schedule consisted of an arithmetic series of 15 intervals ranging from zero to twice the average value of the schedule and arranged in an irregular order. Thus, the relative durations of the particular intervals that made up each schedule were held constant.

METHOD

Subjects and Apparatus

The key-pecking of each of six adult, male, White Carneaux pigeons, maintained at 80% of free-feeding body weights, had been reinforced on VI schedules for at least 50 hr before the present experiments.

The experimental chamber was similar to that described by Ferster and Skinner (1957). Mounted on one wall was a translucent Plexiglas response key, 2 cm in diameter and operated by a minimum force of about 15 g. The key was transilluminated by two yellow 6-w lamps. Two white 6-w lamps mounted on the chamber ceiling provided general illumination. The operation of the key occasionally produced the reinforcer, 4-sec access to mixed grain in a standard feeder located behind a 6.5-cm square opening beneath the key. During reinforcement, the feeder was illuminated and the other lights were turned off.

Electromechanical controlling and recording apparatus was located in a separate room. A device that advanced a loop of punched tape a constant distance with each operation (ratio programmer, R. Gerbrands Co.) was stepped by an electronic timer, and intervals between reinforcements were determined by the spacing of the holes punched in the tape. Thus, the absolute durations of the intervals depended on the rate at which the timer operated the programmer, but the relative durations were independent of the timer.

The punched holes in the tape provided a series of 15 intervals from an arithmetic progression, in the following order: 14, 8, 11, 6, 5, 9, 2, 13, 7, 1, 12, 4, 10, 0, 3. The numbers indicate the durations of the intervals between successive reinforcements in multiples of t sec, the setting of the electronic timer. (To permit the arrangement of a 0-sec interval, in which reinforcement was available for the first peck after a preceding reinforcement, the ratio programmer was stepped at each reinforcement as well as at the rate determined by the electronic timer.) In this series, the average interval of the VI schedule was 7t sec; with t equal to 6.5 sec, for example, the average interval was 45.5 sec.

At the end of each interval, when a peck was to be reinforced, the controlling apparatus stopped until the peck occurred; the next interval began only at the end of the 4-sec reinforcement. Thus, the apparatus arranged a distribution of minimum interreinforcement intervals; the actual intervals were given by the time from one reinforcement to the next reinforced response. In practice, the rates of responding at most VI values were such that differences between the minimum and the actual interreinforcement intervals were negligible.
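The series and its mean can be checked directly; a minimal sketch (the variable names are ours, the numbers are those of the schedule just described):

```python
# Arithmetic VI series of Exp. 1, in multiples of the timer setting t
series = [14, 8, 11, 6, 5, 9, 2, 13, 7, 1, 12, 4, 10, 0, 3]

mean_multiple = sum(series) / len(series)
print(mean_multiple)        # 7.0: the average interval is 7t sec

t = 6.5                     # timer setting in seconds
print(mean_multiple * t)    # 45.5: the VI 45.5-sec schedule
print(sorted(series))       # 0 through 14: an arithmetic progression
```

Because only t changes from one schedule to another, the relative durations of the 15 intervals are identical across all seven VI values studied.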

Stepping switches that stepped with each step of the ratio programmer and that reset after each reinforcement distributed key-pecks to the 14 counters, which represented successive periods of time after reinforcement. The time represented by each counter was t sec, and each counter recorded responses only within interreinforcement intervals equal to or longer than the time after reinforcement that the counter represented. For example, the first counter cumulated responses that occurred during the first t sec of all intervals except the 0-sec interval (the 0-sec interval was terminated by a single reinforced response). Correspondingly, the seventh counter cumulated responses during the seventh t sec of only those intervals 7t sec long or longer. The fourteenth counter cumulated responses only during the fourteenth t sec of the 14t-sec interval, the longest interval in the series. Thus, response rates at early times after reinforcement were based on larger samples of pecking than response rates at later times.
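Because each counter recorded only within intervals at least as long as the time it represented, the sample size behind each t-sec bin shrinks with time since reinforcement. A sketch of those sample sizes, per pass through the series (the dictionary name is ours):

```python
series = [14, 8, 11, 6, 5, 9, 2, 13, 7, 1, 12, 4, 10, 0, 3]

# Counter k (k = 1..14) records during the kth t-sec period, but only
# within intervals lasting at least k multiples of t.
samples = {k: sum(1 for i in series if i >= k) for k in range(1, 15)}

print(samples[1])   # 14: every interval except the 0-sec interval
print(samples[7])   # 8: intervals 7t sec long or longer
print(samples[14])  # 1: only the 14t-sec interval
```

For this arithmetic series the count falls off linearly (15 − k intervals contribute to counter k), which is why local rates at late times rest on far fewer pecks than those at early times.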

Procedure

Seven VI schedules with average intervals ranging from 12.0 to 427 sec (300 to 8.4 rft/hr) were examined. Each pigeon was exposed to VI 12.0-sec, VI 23.5-sec, and VI 45.5-sec, and to a sample of the longer average intervals, as indicated in Table 1. (Occasional sessions in which equipment failed have been omitted; none were within the last five sessions of a given schedule.) Each schedule was in effect for at least 15 daily sessions and until the pigeon's performance was stable, as judged by visual inspection of numerical data and cumulative records, for five successive sessions. With few exceptions, the rate of responding in each of the last five sessions of a given schedule was within 10% of the average rate over those sessions.

The first peck in each session was reinforced and the VI schedule then operated, beginning at a different place in the series of intervals in successive sessions. Thus, each scheduled interval, including the first in the session, began after a reinforcement. Sessions ended after each interval in the series had occurred four times (61 reinforcements). Thus, the duration of a session ranged from about 16 min (12 min of VI 12-sec plus 61 reinforcements) to about 431 min (427 min of VI 427-sec plus 61 reinforcements).
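The stated session durations follow from the schedule arithmetic: 60 completed intervals (each of the 15 intervals four times) plus 61 reinforcements of 4 sec each, the first following the first peck of the session. A sketch of that arithmetic (the function name and parameters are ours):

```python
def session_minutes(mean_interval_sec, reps=4, n_intervals=15,
                    reinforcement_sec=4):
    """Approximate session duration: 60 scheduled intervals at the mean
    interval, plus 61 reinforcements of 4 sec each."""
    n = reps * n_intervals                       # 60 scheduled intervals
    scheduled = n * mean_interval_sec            # time spent in intervals
    reinforced = (n + 1) * reinforcement_sec     # 61 reinforcements
    return (scheduled + reinforced) / 60.0

print(round(session_minutes(12.0)))  # about 16 min for VI 12-sec
print(round(session_minutes(427)))   # about 431 min for VI 427-sec
```

This reproduces the range given in the text; actual sessions ran slightly longer to the extent that pecks did not immediately follow the availability of reinforcement.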

RESULTS

The overall rate of key-pecking as a function of the overall rate of reinforcement is shown for each pigeon in Fig. 1. The functions were, to a first approximation, monotonically increasing and negatively accelerated, perhaps approaching an asymptotic level for some pigeons. With increasing rates of reinforcement, the rate of responding increased more rapidly at low rates of reinforcement (for most pigeons, to roughly 50 rft/hr) than at higher rates of reinforcement. The shapes of the functions differed in detail from pigeon to pigeon: Pigeon 118, for example, produced a fairly smooth increasing function; Pigeon 121, an almost linear function; and Pigeons 278 and 279, a rapid increase to a near invariance in the rate of responding. This near invariance might be called a "locked rate" (Herrnstein, 1955; Sidman, 1960), a term that has been applied to the occasionally observed insensitivity of a given pigeon's rate of responding to changes in the parameters of an interval schedule of reinforcement.

Table 1

Mean intervals (sec) of the arithmetic variable-interval schedules arranged for each pigeon, with number of sessions for each schedule shown in parentheses.

118        121        129        278        279        281
108 (52)   45.5 (52)  108 (29)   23.5 (35)  427 (52)   23.5 (35)
45.5 (29)  23.5 (29)  216 (35)   12.0 (17)  216 (29)   45.5 (17)
23.5 (22)  12.0 (58)  427 (29)   45.5 (29)  108 (22)   12.0 (29)
12.0 (36)  108 (22)   23.5 (22)  216 (22)   23.5 (36)  427 (58)
323 (37)   23.5 (15)  45.5 (36)  108 (36)   12.0 (22)  45.5 (22)
108 (28)   12.0 (22)  45.5 (22)  45.5 (15)  12.0 (43)
23.5 (15)  427 (15)   108 (29)
108 (28)   427* (26)

*Reinstated after interruption.

Fig. 1. Rate of key-pecking as a function of rate of reinforcement for six pigeons. Key-pecking was maintained by VI schedules consisting of 15 intervals in an arithmetic progression of size, but arranged in an irregular order. Each point is the arithmetic mean of the rates of responding over the last five sessions of a given schedule. Numerals 1 and 2 indicate first and second determinations. Some representative average interreinforcement intervals, proportional to reciprocals of the rates of reinforcement (rft/hr), are shown on the scale at the upper right.

Despite the near invariance, the functions appear in general to increase over their entire range. (Reversals, as for Pigeon 129 at 33.3 rft/hr, were within the limits of variability implied by the redeterminations, which generally produced higher rates of responding than the original determinations.) The average rate of responding maintained by 300 rft/hr was higher than that maintained by 153 rft/hr for all pigeons. In addition, rates of responding at higher rates of reinforcement may be spuriously low, because the contribution of the latency of the first response after reinforcement to the overall rate of responding was greatest at the higher rates of reinforcement. A correction for this latency would slightly increase rates of responding at the higher rates of reinforcement (300, 153, and, perhaps, 79 rft/hr), but would have virtually no effect at the lower rates of reinforcement. Despite the small changes at high rates of reinforcement, it seems reasonable to conclude that overall rates of responding increase monotonically (perhaps approaching an asymptote) as overall rate of reinforcement increases.

Within individual intervals between two reinforcements, the rate of key-pecking increased with increasing time since reinforcement, as shown for each pigeon in Fig. 2, which plots local rates of responding against the absolute time elapsed since reinforcement. The functions reflect in their vertical separation the different overall rates maintained by each schedule (Fig. 1).

Data obtained with each arithmetic VI schedule for each pigeon are plotted against relative time since reinforcement in Fig. 3. The functions have been adjusted by multiplying local rates of responding by constants chosen to make the average rate of responding for each function equal to 1.0. When the differences in overall levels of the functions were removed by this adjustment, the local rate of responding within intervals grew as approximately the same function of relative time after reinforcement in most VI schedules studied with most pigeons. The major exceptions were some pigeons' data from the shorter VI schedules: Pigeon 121 at 12.0 and 23.5 sec; Pigeon 278 at 23.5 and 45.5 sec; and Pigeon 281 at 12.0 sec. It may be relevant that only in these schedules were rates of responding sometimes low enough to produce large differences between the minimum and actual interreinforcement intervals. For the remaining functions, there appeared to be no systematic ordering from one pigeon to another of the slopes or degrees of curvature of the several functions (see, however, Exp. 3, Discussion).

As with overall rates of responding (Fig. 1), the functions differed in detail from pigeon to pigeon, even if the atypical data from the shorter VI schedules are ignored. For a given pigeon, however, the functions in Fig. 1 and in Fig. 3 were generally similar: fairly smooth increasing functions for Pigeon 118, almost linear functions for Pigeon 121 except for data from the shorter VI schedules, and rapid increases to a near invariance for Pigeons 278 and 279. The similarity is debatable for Pigeon 281 even when the 12.0-sec function is disregarded, and no simple relationship is evident between the two sets of data for Pigeon 129. The possible significance of the similarities is that the same variables may have operated to produce changes in both the local rate of responding, as time passed within interreinforcement intervals, and in the overall rate of responding, when the overall rate of reinforcement was changed.

A cumulative record of the responding of Pigeon 118 is shown in Fig. 4. Upward concavity, which indicates an increasing rate of responding, is evident in almost every interval between reinforcements. The averaging of rates of responding across intervals assumed that there was no systematic change in the responding within intervals from one interval to another. No consistent sequential effects were evident in the cumulative records; if present, they constituted a relatively minor effect that, for the present purposes, will be ignored.
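The adjustment used for Fig. 3 is a simple normalization: each schedule's local rates are multiplied by a constant chosen so that their mean equals 1.0, which removes differences in overall level while leaving the shape of the within-interval function intact. A sketch with made-up local rates (the function name and the rate values are ours):

```python
def normalize(local_rates):
    """Scale local response rates so their mean equals 1.0, removing
    differences in overall level between schedules (as in Fig. 3)."""
    mean = sum(local_rates) / len(local_rates)
    return [r / mean for r in local_rates]

# Hypothetical local rates (responses/min), early to late in the interval
rates = [20.0, 40.0, 60.0, 80.0]
adjusted = normalize(rates)
print(adjusted)                       # [0.4, 0.8, 1.2, 1.6]
print(sum(adjusted) / len(adjusted))  # approximately 1.0
```

After this scaling, functions from schedules with very different overall rates can be compared directly on a common ordinate.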

Fig. 2. Rate of key-pecking as a function of time since reinforcement in several arithmetic VI schedules. The function for each schedule, composed of averages of the local rates of responding over the last five sessions of the schedule, is identified by the mean interreinforcement interval. Two of the 12.0-sec functions have been displaced on the ordinate, as indicated by the inserted scales (Pigeons 278 and 279). For those schedules arranged twice for a given pigeon, only one function, chosen on the basis of convenience of presentation, has been plotted.

Fig. 3. Rate of key-pecking, adjusted so that the average rate of pecking equals 1.0, as a function of relative time since reinforcement in several arithmetic VI schedules. For those schedules arranged twice for a given pigeon, only the first determination has been plotted.

Fig. 4. Cumulative record of a full session of key-pecking maintained by an arithmetic VI schedule with a mean interreinforcement interval of 108 sec (Pigeon 118). The recording pen reset to baseline after each reinforcement, indicated by diagonal pips as at a, a reinforcement after a zero-sec interval. Curvature can be seen most easily by foreshortening the figure.

DISCUSSION

Overall rates of responding. Individual differences among pigeons were considerable, but the functions relating overall rate of responding to overall rate of reinforcement were generally monotonically increasing and negatively accelerated. The general nature of this relationship is well supported by the literature on both VI and FI schedules. Both pigeons and rats have been studied in a variety of experimental contexts, usually over a narrower range of rates of reinforcement than was studied here. Monotonically increasing and negatively accelerated functions have been obtained from rats by Skinner (1936; data obtained early in the acquisition of FI performance), Wilson (1954; FI schedules), Clark (1958; VI schedules at several levels of deprivation), and Sherman (1959; FI schedules). The same relationship may hold for schedules of negative reinforcement (Kaplan, 1952; FI schedules of escape). Similar functions have been obtained from pigeons by Schoenfeld and Cumming (1960) and by Farmer (1963), whose data are discussed later (General Discussion). Other data have been obtained from pigeons by Cumming (1955) and by Ferster and Skinner (1957). In Cumming's experiment, rates of responding did not increase monotonically with rates of reinforcement, but rates of responding may not have reached asymptotic levels and the VI schedules alternated with a stimulus-correlated period of extinction. Ferster and Skinner presented data in the form of cumulative records selected to show detailed characteristics of responding; the data therefore were not necessarily representative of the overall rates of responding maintained by each schedule.

Monotonically increasing and negativelyaccelerated functions relating total respond-ing to total reinforcement in concurrentschedules (Findley, 1958; Herrnstein, 1961;


Catania, 1963a), in which VI schedules were independently arranged for pigeons' pecks on two different keys, have been discussed by Catania (1963a). Additional data are provided by experiments with chained schedules (Autor, 1960; Findley, 1962; Nevin, 1964; Herrnstein, 1964), in which reinforcement of responding in the presence of one stimulus consists of the onset of another stimulus in the presence of which another schedule of reinforcement is arranged (cf. the review by Kelleher and Gollub, 1962).

Evidence for substantial individual differences among pigeons has been noted in the literature. Herrnstein (1955), for example, varied the overall rate of reinforcement provided by VI schedules in an experiment concerned with the effect of stimuli preceding a period of timeout from VI reinforcement. Monotonically increasing, negatively accelerated functions were obtained from two pigeons (S1, 6 to 120 rft/hr, and S3, 6 to 60 rft/hr), but the third pigeon's rate of responding was roughly constant over the range of reinforcement rates studied (S2, 6 to 40 rft/hr: this pigeon provided the basis for a discussion of "locked rate"). Individual differences among pigeons were also observed by Reynolds (1961, 1963), who obtained monotonically increasing, negatively accelerated functions when different VI schedules in the presence of one stimulus were alternated with a constant VI schedule in the presence of a second stimulus (multiple schedules).

The derivation of a mathematical function

describing the relationship between reinforcement and responding for all pigeons is complicated by the idiosyncratic character of each pigeon's data, particularly if the functions are restricted to those involving simple transformations of the ordinate and/or abscissa and are limited in the number of arbitrary constants. In an earlier version of this paper (Reynolds and Catania, 1961; Catania and Reynolds, 1963), a power function was proposed, on the basis of a fit to average data for the group of pigeons (see also Catania, 1963a). This function, of the form R = kr^0.2, where R is rate of responding, r is rate of reinforcement, and k is a constant depending on the units of measurement, was chosen in preference to a logarithmic function, of the form R = k log r + n, where n is a constant and the other symbols are as above. The choice between these two functions was based more on logical considerations, i.e., that rate of responding should approach zero as rate of reinforcement approaches zero, than on the superiority of the fit of the power function to the data. This mathematical representation, however, does not provide an adequate fit to the data from individual pigeons. Fits to data from individual pigeons are possible (cf. Norman, 1966), but they are not essential for the present purposes and will not be considered further here.
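The logical consideration behind the choice can be made concrete with a short sketch. The constants below (k, the exponent a, and n) are illustrative values only, not fits to the present data:

```python
import math

def power_fn(r, k=10.0, a=0.2):
    # Power function R = k * r^a: R approaches zero as r approaches zero.
    return k * r ** a

def log_fn(r, k=10.0, n=5.0):
    # Logarithmic function R = k log r + n: R diverges toward minus
    # infinity as r approaches zero, violating the logical constraint.
    return k * math.log10(r) + n

print(power_fn(0.0))   # 0.0: zero responding at zero reinforcement
print(log_fn(1e-6))    # a large negative "rate", which is not meaningful
```

Only the power form satisfies the requirement that rate of responding approach zero with rate of reinforcement, which is the consideration stated in the text.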

Local rates of responding. It has been noted (Results) that the idiosyncratic characteristics of the present data from each pigeon were reflected, to some extent, in the changes in the local rate of responding with the passage of time since reinforcement. This relationship is not mathematically determined; a given overall rate of responding could have been produced by a variety of different temporal distributions of responses within the intervals of a given schedule. Aside from a few atypical functions at high rates of reinforcement, local rates of responding generally increased monotonically as time passed since reinforcement (Fig. 3). For a given pigeon, the adjusted local rates of responding at different relative times after reinforcement remained roughly invariant over a wide range of overall rates of reinforcement.

The changes in local rates of responding

cannot be accounted for solely in terms of time since reinforcement. The distribution of responses throughout a given period of time since reinforcement can be manipulated within VI schedules by changing the distribution of intervals (e.g., from an arithmetic to a geometric progression of intervals; Ferster and Skinner, 1957). A variable that may operate together with time since reinforcement, however, is probability of reinforcement or some derivative of this probability. If a responding organism reaches a time after reinforcement equal to the longest interval in a VI schedule, the probability that the next response will be reinforced is 1.0. If, however, the organism has not yet reached that time, the probability is less than 1.0, and depends on the number of intervals that end at or after the time that the organism has reached. In the present arithmetic VI schedules, therefore, probability of reinforcement increased as time passed since reinforcement.


The calculation of probability of reinforcement is considered in greater detail in Exp. 3, in which the probability of reinforcement was explicitly manipulated. It is sufficient to note here that both probability of reinforcement and local rates of responding increased as time elapsed since reinforcement. The overall-rate functions (Fig. 1) and the local-rate functions (Fig. 3) may therefore be similar because the changes in the overall rate of reinforcement provided by an interval schedule also changed the probability of reinforcement for responses within any fixed period of time. Thus, the overall- and the local-rate functions may depend on the same relationship between probability of reinforcement and subsequent responding.

Relationship between overall and local rates of responding. This relationship between local and overall rates of responding suggests that a given overall rate of responding may not be determined directly by an overall rate of reinforcement. Rather, a schedule may produce a given overall rate of responding through its effects on local rates of responding at different times after reinforcement. The way in which local rates of responding contribute to overall rates of responding must therefore be considered.

An overall rate of responding is a weighted

average of the local rates of responding at successive times after reinforcement. The early times after reinforcement are weighted more heavily than the later times because the early times represent a larger proportion of the total time in the schedule. For example, within the first t sec after reinforcement in the arithmetic VI schedules, responding was possible 14 times as often as within the last t sec (first and last points on each function in Fig. 2 and 3; cf. Method). Thus, a consistent change in the local rate of responding early after reinforcement would produce a greater change in the overall rate of responding than the same consistent change late after reinforcement. An alternative measure, therefore, is the average of the successive local rates of responding maintained by a particular schedule (e.g., the average of all the points on a given function in Fig. 2), because this measure does not weight early local rates more heavily.

When local-rate functions are similar at different rates of reinforcement (as to a first approximation for Pigeons 118, 129, and 279 in Fig. 3), the substitution of average local rate for overall rate of responding does not alter the functional relation between rate of responding and overall rate of reinforcement (Fig. 1); the average local rates and the overall rates of responding will differ slightly, by a multiplicative constant. This is not necessarily the case, however, when the local-rate functions are dissimilar. For example, in the 12.0-sec and 23.5-sec functions for Pigeon 121, the 23.5-sec function for Pigeon 278, and the 12.0-sec function for Pigeon 281 in Fig. 3, the local rates of responding shortly after reinforcement were relatively low compared to the local rates within other schedules for the same pigeons. The values of t in the 12.0-sec and 23.5-sec VI schedules were roughly 1.7 and 3.4 sec, respectively, and although rates of responding were high, occasional short pauses that occurred immediately after reinforcement reduced the number of responses in the early t-sec periods after reinforcement. Because these pauses were weighted more heavily in the overall rate of responding than in the average local rate, the overall rate was lower, relative to the average local rate, in these than in the remaining schedules. Inversely, the local rate of responding was relatively high after reinforcement for Pigeon 278 at VI 45.5-sec (Fig. 3), and the overall rate was higher, relative to the average local rate, in this than in the remaining schedules.
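The weighting can be illustrated with a small sketch. The local rates below are hypothetical values of the kind plotted in Fig. 2; the 14:1 weights follow from the arithmetic progression of intervals (0t through 14t sec) described in the Method:

```python
# Hypothetical local rates (resp/min) in successive t-sec bins after
# reinforcement, increasing with time as in the arithmetic VI data.
local_rates = [20, 30, 38, 44, 49, 53, 56, 59, 61, 63, 65, 66, 67, 68]

# The n-th t-sec bin after reinforcement is entered only in intervals at
# least n*t sec long, so bin n is weighted by the number of such
# intervals: 14, 13, ..., 1.
weights = list(range(14, 0, -1))

# Overall rate: weighted average; average local rate: unweighted mean.
overall = sum(w * r for w, r in zip(weights, local_rates)) / sum(weights)
average_local = sum(local_rates) / len(local_rates)

# The overall rate is pulled toward the (lower) early local rates,
# whereas the average local rate weights every bin equally.
print(overall, average_local)
```

With increasing local-rate functions, the overall rate is always below the average local rate, which is why pauses shortly after reinforcement depress the overall rate disproportionately.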

Figure 3 shows data from the initial determination of performance on each schedule. In three of the above cases (Pigeon 121 at VI 23.5-sec, Pigeon 281 at VI 12.0-sec, and Pigeon 278 at VI 45.5-sec), data from a redetermination were available. The redetermined local-rate functions (not shown in Fig. 3) deviated considerably less from other local-rate functions for the same pigeon than did the initial local-rate functions. These three cases represent three of the four largest discrepancies between initial and redetermined overall rates of responding (see Fig. 1), and it is of interest that the three discrepancies are each reduced by about 5 resp/min if initial and redetermined average local rates of responding are substituted for initial and redetermined overall rates of responding.

This observation is consistent with the assumption that the overall rate of responding is not directly determined by an overall rate of reinforcement. Reinforcement does not


produce a reserve of responses that are emitted irrespective of their distribution in time. Rather, a given rate of reinforcement produces a given overall rate of responding through its effects on local rates of responding at different times after reinforcement. The experiments that follow consider the effects of reinforcement on local rates of responding in detail, by varying the distribution of intervals in VI schedules.

EXPERIMENT 2: EFFECTS OF A ZERO-SEC INTERVAL IN AN ARITHMETIC VARIABLE-INTERVAL SCHEDULE

In the schedules of Exp. 1, the first response after a reinforcement was reinforced in one of every 15 intervals. This 0-sec interval may have had an effect on responding both immediately after reinforcement and later. The present experiment directly compared local rates of responding maintained by arithmetic VI schedules with and without a 0-sec interval. One consequence of the 0-sec interval was that the reinforced response was preceded by a latency (timed from the end of reinforcement), whereas the reinforced response in other intervals typically was preceded by an interresponse time (timed from the preceding response).

METHOD

Subjects and Apparatus

In sessions preceding Exp. 1, Pigeons 278 and 281 were exposed to two arithmetic VI schedules in the apparatus described previously. To permit a detailed examination of responding shortly after reinforcement, the recording circuitry cumulated responses separately during successive thirds of the first and second t sec of each interval.

Procedure

In the first schedule, t sec were added to each interreinforcement interval of the arithmetic VI schedule of Exp. 1. This increased the mean interval from 7t to 8t sec and eliminated the 0t-sec interval; the shortest interval in the schedule was 1t sec. The second schedule was the same as the schedule of Exp. 1; it included the 0-sec interval. With t equal to about 15.4 sec, the first schedule was arranged for 29 sessions (arithmetic VI 123-sec with no 0-sec interval) and the second schedule for 21 sessions (arithmetic VI 108-sec with 0-sec interval).

RESULTS

Local rate of responding as a function of absolute time since reinforcement is shown in Fig. 5. The first six points on each function represent local rates during successive thirds of the first and second t sec after reinforcement. For Pigeon 278, rates of responding remained roughly constant after about 50 sec since reinforcement in both schedules, and for Pigeon 281, rates of responding gradually increased up to the longest time since reinforcement in both schedules (cf. Fig. 2 and 3).


Fig. 5. Rate of key-pecking as a function of time since reinforcement in arithmetic VI schedules with and without a 0-sec interreinforcement interval.

The effect of the 0-sec interval was restricted primarily to the time shortly after reinforcement (first two or three points on each function). Relative to the schedule with no 0-sec interval, the 0-sec interval added a larger increment to the local rate of responding than would have been produced if only a single response immediately after reinforcement was added to each interval. One additional response at the beginning of each interval would have raised the local rate of responding immediately after reinforcement by about 12 resp/min and would have had no effect on subsequent local rates of responding. The actual increment was of the order of 30 resp/min and persisted to some extent in subsequent local rates of responding (second and third points on each function).

The overall rates of responding maintained by the VI 123-sec and VI 108-sec schedules were 63.7 and 63.2 resp/min for Pigeon 278 and 61.1 and 73.7 resp/min for Pigeon 281. Thus, the overall rate of responding was higher for the schedule with no 0-sec interval for Pigeon 278 and lower for Pigeon 281. In Fig. 5, the difference is exaggerated for Pigeon 278 because the figure does not reflect the relatively large contribution of the high rate of responding shortly after reinforcement to the overall rate of responding maintained by the schedule with the 0-sec interval (Exp. 1, Discussion). The schedule with the 0-sec interval (VI 108-sec) provided about 5 rft/hr more than the schedule without the 0-sec interval (VI 123-sec), but the magnitude of the reversal for Pigeon 278 was well within the limits of variability suggested by the redeterminations in Fig. 1.
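The 12 resp/min figure can be checked from the width of the first recording bin. The reconstruction below assumes, per the Method, that the first bin is one third of t sec with t approximately 15.4 sec:

```python
# One extra response per interval, falling in the first recording bin
# (one third of t sec), adds this much to that bin's local rate.
t = 15.4                    # sec; approximate value from the Method
bin_width = t / 3           # width of the first recorded bin, sec
increment = 60 / bin_width  # resp/min added by one response per interval
print(round(increment))     # 12
```

One response per roughly 5.1-sec bin corresponds to about 12 resp/min, as stated in the text.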

DISCUSSION

The increment in the local rate of responding immediately after reinforcement suggests that the 0-sec interval affected both the latency of the first response after reinforcement and the local rate of responding shortly after reinforcement. Continued exposure to the schedules with the 0-sec interval might have reduced the size of the increment, because first responses after reinforcement were occasionally reinforced whereas second and third responses never were. One factor that could have counteracted this effect was the occasional reinforcement of responses that followed the first response (about 15 sec later, at the end of the t-sec interval). Another possibility was that the reinforced first response in the 0-sec interval occasionally may have been preceded by a peck of insufficient force to operate the response key, with an effect on subsequent behavior equivalent to the reinforcement of a second peck after reinforcement. Only pecks of sufficient force, however, produced a feedback click, and the feedback presumably contributed to differentiation of the force of pecks over the course of the present experiment.

The fairly localized effect in time of the 0-sec interval suggests that it is reasonable to compare local rates of responding maintained at later times after reinforcement in VI schedules with different distributions of intervals even if some, but not all, of the schedules include a 0-sec interval. Such comparisons are made in Exp. 3, although the data presented here are limited. The data also suggest that reinforcement of the first response in each session, common to both schedules in Fig. 5, had at best a small effect on responding early in intervals compared to the effect of reinforcement of the first peck after a reinforcement in 0-sec intervals within the session.

EXPERIMENT 3: EFFECTS OF THE DISTRIBUTION OF INTERVALS IN VARIABLE-INTERVAL SCHEDULES ON CHANGES IN THE LOCAL RATE OF RESPONDING WITHIN INTERVALS

Experiment 1 demonstrated that local rate of responding increased as time passed since reinforcement in an arithmetic VI schedule. Evidence in the literature, however, demonstrates that VI schedules with other distributions of intervals have different effects. For example, Ferster and Skinner (1957) showed that local rates of responding decreased as time passed since reinforcement in a VI schedule in which the durations of intervals were derived from a geometric progression. Their demonstration that different distributions of intervals differently affect local rates of responding indicates that local rates are not controlled solely by time since reinforcement. The present experiment examined another variable: the probability of reinforcement at different times since reinforcement, which is determined by the distribution of intervals in a VI schedule.

The present treatment defines the probability of reinforcement as a relative frequency: the number of times the first peck is reinforced after a particular time since reinforcement divided by the number of opportunities for a peck after that time. This


statistic will be called reinforcements per opportunity (rft/op) by analogy to Anger's measure of response probability, interresponse times per opportunity (IRT/op; Anger, 1956). The method of calculation is illustrated in Fig. 6, which diagrammatically shows an arbitrary schedule with intervals of 0, 20, 20, 60, 120, and 200 sec. The intervals are arranged in order of size, although they would be arranged in an irregular order in practice.

The first peck after a reinforcement is reinforced in the shortest interval but not in any of the remaining five intervals. The probability that this peck will be reinforced is therefore one-sixth (0.17). When the peck is reinforced, in the 0-sec interval, the reinforcement terminates the interval and serves as the starting point for another interval. When the peck is not reinforced, in the remaining five intervals, the probability of reinforcement for subsequent pecks becomes zero until the end of the next longer interval. In the example, the next opportunity for reinforcement occurs at 20 sec, when two of the five remaining intervals end. Thus, the first peck after 20 sec is reinforced on two of five opportunities, or with a probability of 0.40. Similarly, the first peck after 60 sec is reinforced with a probability of 0.33, the first peck after 120 sec with a probability of 0.50, and the first peck after 200 sec with a probability of 1.0. As Fig. 6 illustrates, the statistic can be calculated by dividing the number of intervals that end at a given time after reinforcement by the number that end at that time or later. (Reinforcements per opportunity is defined as the probability of reinforcement for the first response that occurs after a particular time since reinforcement. For convenience, the present discussion sometimes refers to the probability of reinforcement at a particular time.)
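The calculation just described reduces to a short routine. The function name and representation below are ours, introduced only to make the arithmetic explicit:

```python
from collections import Counter

def rft_per_op(intervals):
    # Reinforcements per opportunity: for each time at which at least one
    # interval ends, divide the number of intervals ending at that time
    # by the number of intervals ending at that time or later.
    ends = Counter(intervals)
    probs = {}
    for time in sorted(ends):
        at_or_after = sum(n for end, n in ends.items() if end >= time)
        probs[time] = ends[time] / at_or_after
    return probs

# The schedule of Fig. 6: intervals of 0, 20, 20, 60, 120, and 200 sec.
probs = rft_per_op([0, 20, 20, 60, 120, 200])
# probs[0] is 1/6, probs[20] is 2/5, probs[60] is 1/3,
# probs[120] is 1/2, and probs[200] is 1.0, matching the worked example.
```

Note that the probabilities depend only on how many intervals end at or after each opportunity, not on the absolute durations, a point taken up below.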

Reinforcements per opportunity rests on the assumption that, except at reinforcement, the organism cannot discriminate between a given time since reinforcement in one interval and the same time since reinforcement in an interval of different duration (e.g., such discrimination could be based on the sequence of intervals). Another assumption is that the organism responds rapidly enough, when reinforcement becomes available at the end of one interval, to emit the reinforced response before the time at which the next longer interval ends. For example, the probabilities of reinforcement for the first peck after reinforcement (in the 0-sec interval) and at 20 sec would not be separable if responses never occurred before 25 sec; the relevant probability of reinforcement would be 0.50 for both intervals. In most VI schedules, the rate of responding is high enough, relative to the time separating successive opportunities for reinforcement, to avoid violating this assumption (see, however, VI 12.0-sec and VI 23.5-sec in Exp. 1).

Probability of reinforcement does not necessarily increase monotonically as time passes since reinforcement. In Fig. 6, for example, the probability decreases from 0.40 at 20 sec to 0.33 at 60 sec, and then increases to 0.50 and 1.0 at later times after reinforcement. Each probability, however, occurs at a discrete point in time. The statistic is greater than zero only at times after reinforcement when intervals in the schedule end. An account of performance in terms of probability of reinforcement also must deal with other times, when the probability is zero. In addition, reinforcements per opportunity is independent of the absolute values on the time scale for an interval schedule. Probabilities would be unaffected, for example, if the values of the time scale of Fig. 6 were multiplied by 100. Because performance presumably would be different after this change, probability of reinforcement alone is probably not a sufficient determinant of performance; absolute durations must be taken into account by converting probabilities to local rates of reinforcement. Figure 6 illustrates a technique for computing such local rates. An opportunity for reinforcement is defined as a point on the continuum of time since reinforcement at which the probability of reinforcement is greater than zero, or at which at least one interval ends. The time over which a particular probability of reinforcement is assumed to be effective is arbitrarily taken as the time ranging from halfway back to the preceding reinforcement or opportunity for reinforcement and halfway forward to the next reinforcement or opportunity for reinforcement. This procedure takes into account the observation that a probability of reinforcement at a particular time since reinforcement can affect responding at both earlier and later times.

Consider, for example, the opportunity at


[Figure 6 appears here. The diagram aligns the six interreinforcement intervals from a common reinforcement, marking the reinforcement availabilities, the reinforced responses, and the time periods (A through H) used in the calculations. The rft/sec values shown at the opportunities at 0, 20, 60, 120, and 200 sec are 0.0200, 0.0182, 0.0083, 0.0100, and 0.0250, respectively.]

Fig. 6. Schematic presentation of a VI schedule illustrating the computation of two statistics discussed in the text. The upper part of the figure shows the six interreinforcement intervals of the schedule in order of size: 0-sec, 20-sec, 20-sec, 60-sec, 120-sec, 200-sec. Each interval is shown starting from a preceding reinforcement (rft). The first statistic, reinforcements per opportunity (rft/op), is a measure of probability of reinforcement: the number of occasions that reinforcement becomes available at a particular time since reinforcement divided by the number of occasions that the time since reinforcement is reached (e.g., reinforcement is available at 20 sec on two of five occasions). The second statistic, reinforcements per second (rft/sec), is a measure of local rate of reinforcement: the number of reinforcements within a particular period of time since reinforcement divided by the number of seconds spent in that period of time. The periods of time since reinforcement are arbitrarily taken as centered at a given opportunity for reinforcement and extending halfway back to the preceding reinforcement or opportunity for reinforcement and halfway forward to the next reinforcement or opportunity for reinforcement (e.g., for the two reinforcements at 20 sec, the periods of time marked B and C: five 10-sec periods and three 20-sec periods for a total of 110 sec).

60 sec in Fig. 6. The time over which this probability of reinforcement is considered effective is marked D and E (halfway back to the opportunity at 20 sec and halfway forward to the opportunity at 120 sec). The organism spends 120 sec within the period of time represented by D and E in each full sampling of the six intervals, and one reinforcement is arranged, at the end of the 60-sec interval. In other words, the rate of reinforcement within this period is one reinforcement per 120 sec (0.0083 rft/sec). Correspondingly, the local rate of reinforcement at 20 sec is given by the number of reinforcements arranged at 20 sec divided by the time spent in periods B and C: this local rate is two reinforcements per 110 sec (0.0182 rft/sec). For the opportunity at 0 sec, which immediately follows a reinforcement, the local rate of reinforcement is based only on time period A. For the opportunity at 200 sec, after which a peck always terminates the interval with reinforcement, the local rate of reinforcement is based only on time period H.

With this calculation, the local rates of reinforcement at 0 and 200 sec after reinforcement are almost equal, whereas the probabilities of reinforcement at these times differ by a factor of six (0.166 and 1.0). Other plausible techniques for assigning time to successive opportunities for the purpose of calculating local rates of reinforcement are possible, such as bisection of the time interval separating two successive opportunities using the geometric rather than the arithmetic mean, or the assignment to a given opportunity of the time since the last opportunity. The present technique, though arbitrary, seems to involve the simplest ad hoc assumptions.

To recapitulate, reinforcements per opportunity expresses a conditional probability: the probability that the pigeon's response will be reinforced, given that the pigeon has reached a certain time since the last reinforcement. Defined in this way, the statistic does not take into account the separation in time of different opportunities (ends of intervals). By taking into account the temporal separation of successive opportunities, probabilities of reinforcement can be converted into local rates of reinforcement.

The present experiments compared five VI

schedules, each providing roughly the same overall rate of reinforcement (rft/hr) but with different distributions of intervals. One schedule was the arithmetic VI schedule of Exp. 1. Two of the other four schedules differed from the arithmetic VI schedule primarily by including extra short intervals. The extra short intervals produced a higher probability of reinforcement shortly after reinforcement than was produced at the same time after reinforcement in the arithmetic VI schedule. In another schedule, the distribution of intervals was such that the probability of reinforcement, given an opportunity for reinforcement, was roughly a linearly increasing function of the time since reinforcement. In the last schedule, the probability of reinforcement was held roughly constant over most of the range of time since reinforcement. The relationship between local rates of responding and the probabilities and local rates of reinforcement was examined within each schedule.
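The midpoint rule illustrated in Fig. 6 can also be written out as a routine. This is a sketch under the text's assumptions (each reinforcement is collected as soon as it becomes available, and each opportunity is credited with the time from halfway back to the preceding reinforcement or opportunity to halfway forward to the next); the function name is ours:

```python
def local_rft_rates(intervals):
    # Local rate of reinforcement (rft/sec) at each opportunity:
    # reinforcements arranged at that time, divided by the seconds spent
    # in the window reaching halfway to the adjacent opportunities.
    ends = sorted(set(intervals))
    rates = {}
    for i, time in enumerate(ends):
        lo = (ends[i - 1] + time) / 2 if i > 0 else 0.0
        hi = (time + ends[i + 1]) / 2 if i + 1 < len(ends) else time
        # Per full sampling of the schedule, each interval contributes
        # its overlap with the window [lo, hi].
        spent = sum(min(iv, hi) - lo for iv in intervals if iv > lo)
        rates[time] = intervals.count(time) / spent
    return rates

# The schedule of Fig. 6: intervals of 0, 20, 20, 60, 120, and 200 sec.
rates = local_rft_rates([0, 20, 20, 60, 120, 200])
# Reproduces the worked values: 1/50 (0.0200) at 0 sec, 2/110 (0.0182)
# at 20 sec, 1/120 (0.0083) at 60 sec, 1/100 (0.0100) at 120 sec,
# and 1/40 (0.0250) at 200 sec.
```

The near-equality of the rates at 0 and 200 sec, despite rft/op values of 0.17 and 1.0 at those times, falls directly out of this computation.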

METHOD

Apparatus

The apparatus was as described in Exp. 1 and 2. The recording circuitry subdivided the first and second t sec of each interval so that responses were cumulated separately during successive thirds of these time periods. In addition, the constant-probability schedule, described in detail below, included interreinforcement intervals longer than those in other schedules. For this schedule, therefore, the last three counters grouped responses in the twelfth and thirteenth, the fourteenth and fifteenth, and the sixteenth and seventeenth t sec after reinforcement.

Subjects and Procedure

Four of the six pigeons of Exp. 1 were each assigned an average interreinforcement interval: for Pigeons 118 and 129, VI 108-sec (33.3 rft/hr); for Pigeon 278, VI 427-sec (8.4 rft/hr); and for Pigeon 279, VI 45.5-sec (79 rft/hr). The arithmetic VI schedule of Exp. 1 was then compared with the four other VI schedules, the component t-sec intervals of which are indicated in Table 2. The table lists the schedules in the order in which they were presented. Each session consisted of 61 reinforcements. The sessions of the arithmetic VI schedule were the last sessions of Exp. 1 except for Pigeon 279, for which 29 sessions of arithmetic VI 108-sec intervened between 15 sessions of arithmetic VI 45.5-sec and this pigeon's other schedules in the present experiment (cf. Table 1).

Schedules were changed only after the performance of each pigeon had been stable over a period of at least two weeks. Some schedules were continued for a large number of sessions so that long-term stability of the performances could be examined. Data from this experiment are averages over the last five sessions of each schedule.

In making up the distribution of interreinforcement intervals for the constant-probability VI schedule, it was not convenient to match the mean interval to that of the other VI schedules. Thus, the constant-probability schedule was VI 79-sec (45.5 rft/hr) for Pigeons 118 and 129, VI 379-sec (9.5 rft/hr) for Pigeon 278, and VI 40.5-sec (89 rft/hr) for Pigeon 279.

RESULTS

Two kinds of graphs summarize the VI schedules. Probability of reinforcement (rft/op) plotted as a function of time since reinforcement describes the schedule, and local rate of pecking plotted as a function of time since reinforcement describes the performance maintained by the schedule.


Table 2

Sequence of minimum interreinforcement intervals, mean interval, and number of sessions for five variable-interval schedules. Interreinforcement intervals are expressed in terms of the number of t-sec steps from one reinforcement to the next opportunity for reinforcement.

Schedule                  Sequence of Intervals                  Mean  Sessions

Arithmetic                14, 8, 11, 6, 5, 9, 2, 13, 7, 1,       7     28*
                          12, 4, 10, 0, 3.

Extra short               14, 8, 11, 6, 5, 9, 2, 12, 7, 1,      7     109
interval, I               12, 4, 10, 1, 3.

"Linear"                  13, 10, 10, 7, 4, 7, 7, 4, 10, 13,    7     37
                          1, 4, 10, 7, 10, 4, 7, 7, 10, 7,
                          7, 10, 4, 7, 10, 4, 1, 7, 4, 4.

Extra short               12, 1, 4, 13, 10, 1, 8, 11, 1, 14,    7     95
interval, II              2, 1, 7, 14, 6.

Constant                  2, 10, 6, 17, 3, 5, 14, 3, 8, 15,     8.25  127
probability               1, 13, 10, 9, 2, 3, 8, 2, 1, 2,
(rft/op = 0.1)            11, 5, 16, 9, 17, 6, 17, 7, 3, 4,
                          16, 1, 4, 17, 1, 7, 16, 12, 17, 8,
                          1, 4, 2, 16, 12, 13, 17, 3, 5, 7,
                          6, 11, 4, 1, 6, 14, 9, 16, 5, 15.

*For Pigeon 278, 26 sessions; for Pigeon 279, 15 sessions (see text).


Fig. 7. Probability of reinforcement (upper frames) and the rate of Pigeon 278's key-pecking (lower frames) as a function of relative time since reinforcement in each of three VI schedules. The schedules differed in the number of short interreinforcement intervals and therefore in the probability of reinforcement (reinforcements per opportunity) shortly after reinforcement. Two different times after reinforcement at which probabilities of reinforcement were equal are indicated by arrows (extra short I and II). Dashed horizontal lines show the overall rates of key-pecking maintained by each schedule.


Extra-short-interval schedules. The arithmetic VI schedule and the two schedules with extra short intervals are described in the upper frames of Fig. 7. The arithmetic VI schedule arranged monotonically increasing probabilities of reinforcement (rft/op) at the ends of successive t-sec periods of time after reinforcement. The other two schedules (labeled extra short I and II) included extra t-sec intervals, and therefore provided a higher probability of reinforcement at t sec. In these schedules, the 0-sec interval was omitted (see Exp. 2) and the probabilities at later times also were changed from those in the arithmetic VI schedule, so that the three schedules had equal average values (see Table 2). The data supported the assumption that the changes later after reinforcement would have minor effects compared to those produced by the addition of more short intervals.

The lower frames in Fig. 7, for Pigeon 278, show that the arithmetic VI schedule maintained local rates of responding that increased as time passed since reinforcement (cf. Fig. 2), and that the two schedules with the extra short intervals maintained higher local rates of key-pecking at t sec after reinforcement than did the arithmetic VI schedule. A smaller increment was generated by the schedule with two intervals that ended at t sec after reinforcement (extra short I) than by the schedule with four intervals that ended at t sec (extra short II). Thus, the local rate of responding at t sec depended on the probability of reinforcement at that time. Some independence of the effect of probability of reinforcement from time since reinforcement is suggested by the rates of responding later after reinforcement when the probability of reinforcement was roughly the same as that at t sec (arrows; the later probabilities did not actually occur in the schedules but were interpolated from the adjacent non-zero probabilities).
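The reinforcements-per-opportunity (rft/op) measure used throughout this section can be computed directly from a schedule's list of intervals: the probability at a given opportunity is the number of intervals ending at that time divided by the number still in progress. The sketch below illustrates that definition and is not the authors' procedure; the 15-interval arithmetic progression (0 to 14t, in units of t) is assumed from the description of the Exp. 1 schedules.

```python
def rft_per_opportunity(intervals):
    """Probability of reinforcement (rft/op) at each opportunity:
    intervals ending at time t divided by intervals still running."""
    times = sorted(set(intervals))
    remaining = len(intervals)
    probs = {}
    for t in times:
        ending = intervals.count(t)
        probs[t] = ending / remaining
        remaining -= ending
    return probs

# Arithmetic progression of 15 intervals, 0 to 14t (in units of t):
# the probability rises from 1/15 at the first opportunity to 1.0 at the last.
arithmetic = list(range(15))
p = rft_per_opportunity(arithmetic)
```

With this sequence, p rises monotonically (1/15, 1/14, ..., 1/3 at 12t, 1.0 at 14t), the pattern described for the arithmetic schedule.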

Figure 8 shows the performances of the other three pigeons (118, 129, and 279) in the arithmetic and the extra-short-interval schedules. The local rates of pecking plotted against time since reinforcement are similar to those of Pigeon 278 in Fig. 7. The rate of pecking at t sec after reinforcement increased with the probability of reinforcement at t sec. The one exception was that only the second extra-short-interval schedule produced an increment in the rate at t sec relative to that in the arithmetic VI schedule for Pigeon 129.

For all pigeons, the differences between the first points on each function can be attributed to the inclusion of a 0-sec interval in the arithmetic but not in the VI schedules with extra short intervals.

The overall rates of reinforcement were the same in each of the three schedules. The dashed horizontal lines in Fig. 7 and 8 show the overall rates of responding maintained by each schedule. The addition of extra short intervals produced increments in the overall rate of responding for Pigeon 278, but did not systematically affect overall rates of responding for the other three pigeons.

"Linear" schedule. The schedule in which probability of reinforcement was roughly linearly related to time since reinforcement ("linear" schedule) is compared with the arithmetic VI schedule in the upper frame of Fig. 9. In the "linear" schedule, non-zero probabilities of reinforcement occurred at only five discrete times after reinforcement, but the probabilities of reinforcement at successive opportunities increased more rapidly than in the arithmetic VI schedule.

The lower frame of Fig. 9 shows the performance of Pigeon 278. The local rate of responding increased as time passed since reinforcement in both the "linear" and the arithmetic VI schedules. Overall rate was higher in the "linear" than in the arithmetic VI schedule. The data for the other three pigeons are shown in Fig. 10. For Pigeons 129 and 279, the rate of key-pecking increased over time since reinforcement within both schedules, and for Pigeon 118, the rate of key-pecking decreased at later times after reinforcement in the "linear" schedule. For these three pigeons, overall rate was lower in the "linear" than in the arithmetic VI schedule.

Fig. 8. Data from three VI schedules for three additional pigeons. Details as in Fig. 7.

The general similarity of the performances maintained by the arithmetic and "linear" schedules, given the considerable differences in the probabilities of reinforcement at particular times, appears inconsistent with the findings obtained with the schedules containing the extra short intervals. But the differences between these and arithmetic VI schedules were primarily in the probabilities of reinforcement shortly after reinforcement. These comparisons suggest, therefore, that the effect of a change in the probability of reinforcement may depend on the time since reinforcement or on the separation of different probabilities of reinforcement along the continuum of time since the previous reinforcement.

Constant-probability schedule. The effects of a roughly constant probability of reinforcement over most of the range of time since reinforcement were examined in a schedule related to the random-interval schedules of Farmer (1963) and Millenson (1963), and to the constant-probability interval schedule arranged by Chorney (1960) according to the specifications of Fleshler and Hoffman (1962), both of which are discussed in detail in Appendix II. In the present constant-probability VI schedule, shown in the upper frame of Fig. 11, the probability of reinforcement remained equal to 0.10 ± 0.02 over a range of time since reinforcement within which, in the arithmetic VI schedule, this probability increased almost five-fold, from 0.07 to 0.33. At the very late times since reinforcement, the probability of reinforcement necessarily increased, because the series had to contain a longest interval.



(VI 427-sec)

Fig. 9. Probability of reinforcement (reinforcements per opportunity) and rate of key-pecking as a function of time since reinforcement in a "linear" VI schedule. In this schedule, the probability of reinforcement, when not zero, was roughly proportional to time since reinforcement. The arithmetic VI schedule is presented for comparison.

The performance maintained by the constant-probability and the arithmetic VI schedules is compared, for Pigeon 278, in the lower frame of Fig. 11. When the probability of reinforcement was held constant, the local rate of responding remained roughly constant throughout the interval between reinforcements. The increase in rate was only about 2 resp/min over the time from 2t to 17t sec, or roughly one-tenth the increase over the same range of time in the arithmetic VI schedule. A slight increase in response rate might have been expected, even in the constant-probability VI schedule, because the probability of reinforcement did increase eventually to 1.0 at the latest times after reinforcement. Responding began at a low rate within intervals of the constant-probability schedule, probably because no 0-sec interval had been included in the schedule, but the rate increased rapidly during the first t and part of the second t sec.

(VI 45.5-sec)

Fig. 10. Data from the "linear" VI schedule for three additional pigeons. Details as in Fig. 9.



Fig. 11. Probability of reinforcement (reinforcements per opportunity) and rate of key-pecking as a function of time since reinforcement in a constant-probability VI schedule. In this schedule, the probability of reinforcement remained roughly constant until the latest times after reinforcement, when it increased abruptly to 1.0. The arithmetic VI schedule is presented for comparison.

The performances of the three other pigeons are shown in Fig. 12, again compared with the performances maintained by the arithmetic VI schedule. For all three pigeons, the local rate of responding changed considerably less over most of the range of time since reinforcement in the constant-probability schedule than it did in the arithmetic VI schedule. A transitory high local rate of responding shortly after reinforcement, for Pigeon 279 and to a lesser extent for Pigeon 118, may have persisted from the previous schedule with additional short intervals (Table 2). If so, it is not clear why the peak did not similarly persist in the performances of the other birds. If the very early times after reinforcement have characteristics that affect the local rate of responding maintained by a given probability of reinforcement, it may be relevant that the absolute value of the constant-probability VI schedule, and therefore the duration of the short interval, was shortest for Pigeon 279.

Fig. 12. Data from the constant-probability VI schedule for three additional pigeons. Details as in Fig. 11.

Figure 13 shows a cumulative record of Pigeon 118's performance on constant-probability VI 79-sec, and may be compared with Fig. 4, a record of arithmetic VI 108-sec for the same pigeon. The record in Fig. 13 shows that the constant-probability VI schedule maintained a roughly constant rate of responding within each individual interval. Thus, the constancies in local rate shown in Fig. 11 and 12 were not artifacts of averaging performances over many intervals. Any consistent effects on responding within successive intervals that might have been caused by the particular sequence of intervals were not evident in the cumulative records. If such effects were present, they were small and will be disregarded here.

118 CONSTANT PROBABILITY VI 79-sec

10 MINUTES

Fig. 13. Cumulative record of a full session of key-pecking maintained by a constant-probability VI schedule with a mean interreinforcement interval of 79 sec (Pigeon 118). The recording pen reset to baseline after each reinforcement, indicated by diagonal pips. Compare Fig. 4.

DISCUSSION

Distributions of intervals in variable-interval schedules. In the arithmetic and "linear" VI schedules, two schedules in which the probability of reinforcement increased as time passed since reinforcement, local rates of responding also increased as time passed. The increases in local rate were somewhat comparable in the two schedules despite considerable differences in the way the probability of reinforcement changed over time (Fig. 9 and 10). When the probability of reinforcement early after reinforcement was made high relative to the probability at the same time in the arithmetic VI schedule, by the addition of extra short intervals, the local rate of responding at that time became relatively high (Fig. 7 and 8). Finally, when the probability of reinforcement was held roughly constant over most of the range of time since reinforcement, in the constant-probability VI schedule, local rates of responding remained relatively constant as time passed since reinforcement (Fig. 11 and 12).

Cumulative records presented by Ferster and Skinner (1957, Ch. 6) from schedules roughly equivalent to the arithmetic and the extra-short-interval VI schedules support the present findings: local rates of responding increased as time passed since reinforcement in the former schedules, and were relatively high shortly after reinforcement in the latter schedules. Ferster and Skinner also studied two other schedules, the geometric and the Fibonacci, which supplement the present schedules.

A geometric VI schedule consists of a sequence of intervals in which the duration of a given interval is equal to the duration of the next shorter interval multiplied by a constant (by this specification, Ferster and Skinner's schedules are only approximately geometric). With a constant of 2, for example, one such schedule consists of the following intervals in an irregular order: 1, 2, 4, 8, 16, etc. sec. A Fibonacci VI schedule consists of a sequence of intervals in which the duration of a given interval is equal to the sum of the durations of the next two shorter intervals, as, for example, in an irregular ordering of the following intervals: 1, 1, 2, 3, 5, 8, 13, etc. sec.

In both of these schedules, the probability of reinforcement increases monotonically to 1.0 over successive opportunities for reinforcement (except for the first opportunity after reinforcement in the Fibonacci schedule, because the shortest interval is represented twice in that sequence of intervals). For both of these schedules, Ferster and Skinner's cumulative records show that local rates of responding decreased as time passed since reinforcement. This demonstration, that local rates of responding may decrease even while probabilities of reinforcement increase, indicates again that something more than probability of reinforcement alone must be taken into account in the analysis of performance within intervals of VI schedules.
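The two constructions just described can be sketched directly. This is a hedged illustration of the defining rules only; as noted above, Ferster and Skinner's actual published schedules are only approximately geometric.

```python
def geometric_intervals(shortest, constant, n):
    """Geometric VI sequence: each interval equals the next shorter
    interval multiplied by a constant (e.g., 1, 2, 4, 8, ... sec
    for a constant of 2)."""
    return [shortest * constant ** k for k in range(n)]

def fibonacci_intervals(n):
    """Fibonacci VI sequence: each interval equals the sum of the
    next two shorter intervals (1, 1, 2, 3, 5, 8, 13, ... sec)."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]
```

In practice the resulting intervals would be presented in an irregular order, as the text describes; the sequences above give only the constituent durations.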


Further evidence is provided in Fig. 14, which shows data obtained by Chorney (1960) with arithmetic, geometric, and constant-probability VI 3-min schedules. The upper frames describe the schedules in terms of reinforcements per opportunity; the lower frames present local rates of responding averaged across data from three pigeons for each schedule. Each pigeon was exposed to only one schedule for about 26 sessions of 60 to 80 reinforcements each. Chorney's arithmetic and geometric VI schedules correspond to the examples of these two schedules already discussed: successively longer intervals differed in the arithmetic schedule by an additive constant, and in the geometric schedule by a multiplicative constant.

The constant-probability VI schedule, based on a formula proposed by Fleshler and Hoffman (1962), differed in its derivation from the constant-probability schedule of the present experiments. If a random generator arranged a constant probability of reinforcement within successive equal periods of time since reinforcement, the frequencies of different interreinforcement intervals would decline exponentially as a function of interval duration. In effect, Fleshler and Hoffman took this theoretical frequency distribution of intervals and divided it into equal areas, or, in other words, into successive class intervals in each of which an equal number of intervals ended. These class intervals became larger the longer the time since reinforcement because of the exponentially decreasing form of the frequency distribution. The average intervals of each of these class intervals were then taken as the constituent intervals of Fleshler and Hoffman's constant-probability schedule. One effect of this procedure was that the probability of reinforcement taken over extended periods of time since reinforcement was held roughly constant. For example, in the constant-probability VI schedule arranged by Chorney, 14 of the 25 intervals ended within the first 150 sec after reinforcement, or with a probability of 0.56; six of the remaining 11 intervals ended within the next 150 sec after reinforcement, or with a probability of 0.55; and in the next two 150-sec periods, the probabilities were 0.60 and 0.50, respectively (after 600 sec, when only the longest interval remained, the probability was necessarily 1.0).
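The equal-area derivation just described can be reproduced in a few lines: cut the exponential distribution of interreinforcement times into equal-probability slices and take the mean of each slice as one constituent interval. The function below is an illustrative reconstruction of that procedure, not Fleshler and Hoffman's published formula; the truncated-exponential mean it uses is the standard closed form.

```python
import math

def fleshler_hoffman(n_intervals, mean):
    """Constituent intervals of an equal-area constant-probability VI
    schedule: the exponential distribution of interreinforcement times
    (with the given mean) is cut into n_intervals equal-area slices,
    and the mean of each slice is taken as one schedule interval."""
    m = float(mean)
    # quantile boundaries of the exponential distribution
    bounds = [-m * math.log(1.0 - k / n_intervals) for k in range(n_intervals)]
    bounds.append(float("inf"))
    intervals = []
    for a, b in zip(bounds, bounds[1:]):
        ea = math.exp(-a / m)
        eb = 0.0 if math.isinf(b) else math.exp(-b / m)
        tb = 0.0 if math.isinf(b) else b * eb
        # mean of the exponential distribution restricted to [a, b)
        intervals.append(m + (a * ea - tb) / (ea - eb))
    return intervals
```

For 25 intervals with a 180-sec mean, this yields a shortest interval near 3.6 sec, of the order of Chorney's values; the exact published intervals depend on details of the original derivation not reproduced here.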

The difference between the two kinds of constant-probability VI schedules may be summarized as follows: the Fleshler and Hoffman schedule varied the separation of successive opportunities for reinforcement in time while holding equal the relative frequencies of the intervals ending at each opportunity; the present schedule spaced successive opportunities for reinforcement uniformly in time while varying the relative frequencies of the intervals ending at each opportunity. Some of the implications of those two methods of arranging constant-probability VI schedules are discussed in Appendix II.

Each of the schedules arranged by Chorney

consisted of 25 intervals. In the arithmetic schedule, the intervals ranged from 1.0 to 358.6 sec with an additive constant of 14.9 sec. In the geometric schedule, the intervals ranged from 1.0 to 1150.0 sec with a multiplicative constant of 1.341. In the constant-probability schedule, the intervals ranged from 3.6 to 714.0 sec. The different ranges, produced when the mean value of each schedule was set at 180 sec, are reflected by the different scales for the abscissas of Fig. 14. In each schedule, the same probabilities of reinforcement were represented at successive opportunities: from 0.04 (1/25) at the end of the shortest interval, to 1.0 at the end of the longest interval. The schedules differed only in the spacing of successive opportunities for reinforcement in time. In the arithmetic schedule, the time from one opportunity to the next was constant; in the geometric and constant-probability schedules, the time from one opportunity to the next increased as time passed since reinforcement, but later after reinforcement the increase was more rapid in the geometric than in the constant-probability schedule.

The data, which show the effects of the different temporal spacings of successive opportunities, consisted of average rates of responding in successive thirds of intervals of comparable duration, about 350 sec, in each schedule (unfilled circles), and average rates of responding in all those intervals greater than 100 sec in each schedule (filled circles). The former showed that, during the first 300 sec after reinforcement, local rates of responding increased slightly in the arithmetic schedule and decreased in both the geometric and constant-probability schedules. The latter showed that the local rate of responding decreased markedly as time passed since reinforcement in the geometric schedule, but did not change systematically in the arithmetic and constant-probability schedules. In summary, in the arithmetic schedule, the evidence for an increase in local rate of responding as time passed since reinforcement was ambiguous; in the geometric schedule, local rate decreased as time passed since reinforcement; and in the constant-probability schedule, local rate was fairly constant over most of the range of time since reinforcement, but was relatively high shortly after reinforcement. Taking into account the relatively short experimental histories on which these data are based, they are in reasonable agreement with the present findings and with other findings in the literature.

Fig. 14. Probability of reinforcement (reinforcements per opportunity) and the rate of key-pecking as a function of absolute time since reinforcement in three schedules, from Chorney (1960). Note the different abscissa scales. Details in text.

Local rates of reinforcement. The differences

in performance produced by a fixed sequence of probabilities of reinforcement, when the temporal separations of successive opportunities for reinforcement were varied, indicate that the present analysis must be extended from probabilities of reinforcement to local rates of reinforcement. The basic premise in converting probabilities of reinforcement to local rates of reinforcement is that the effect of a given probability of reinforcement may spread over time and may depend on the closeness in time of other opportunities for reinforcement. A probability of reinforcement of 1.0 at one time, for example, may maintain responding at earlier times and may not maintain as high a rate when it is separated from the preceding opportunity by a long time (e.g., 300 sec in the geometric VI schedule of Fig. 14) as when it is separated from the preceding opportunity by a short time (e.g., 15 sec in the arithmetic VI schedule of Fig. 14). Local rates of reinforcement, illustrated in Fig. 6, take the separation of successive opportunities for reinforcement into account; they are calculated by dividing the number of reinforcements at a given opportunity by the period of time within which that opportunity is isolated. Local rates of reinforcement also can be considered equivalent to probabilities of reinforcement averaged over extended periods of time.
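The local-rate calculation just described can be sketched as follows. The exact isolation periods used for Fig. 6 are not given in this excerpt, so the midpoint rule here (half the distance to each neighboring opportunity, with only the preceding half at the terminal opportunity, as noted later in the text) is an assumption; it yields values close to, but not identical with, those printed in Table 3.

```python
def local_reinforcement_rates(intervals):
    """Local rate of reinforcement (rft/hr) at each opportunity, i.e.,
    each time at which at least one interval of the VI schedule ends.
    Probability at an opportunity is reinforcements per opportunity
    (intervals ending there / intervals still running); the isolation
    period is assumed to run from midway to the preceding opportunity
    to midway to the following one (preceding half only at the
    terminal opportunity)."""
    times = sorted(set(intervals))
    remaining = len(intervals)
    probs = []
    for t in times:
        ending = intervals.count(t)
        probs.append(ending / remaining)   # rft per opportunity
        remaining -= ending
    rates = []
    for i, t in enumerate(times):
        start = (times[i - 1] + t) / 2 if i > 0 else t / 2
        if i + 1 < len(times):
            end = (t + times[i + 1]) / 2
        else:
            end = t                        # terminal: preceding half only
        rates.append(3600.0 * probs[i] / (end - start))
    return times, rates
```

For an arithmetic VI 70-sec sequence with the 0-sec interval omitted (intervals of 10 to 140 sec), this rule gives a monotonically increasing local rate ending at 720 rft/hr at 140 sec, matching the terminal entry of Table 3.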

[Fig. 14 panels: ARITHMETIC VI 180-sec, GEOMETRIC VI 180-sec, CONSTANT-PROBABILITY VI 180-sec; from Chorney (1960). Unfilled circles: intervals of about 350 sec; filled circles: all intervals greater than 100 sec.]


Local rates of reinforcement within the five VI schedules of the present experiment and in a geometric VI schedule (intervals of 10, 20, 40, 80, and 160 sec) are shown in Table 3. The mean values of the schedules were chosen so that each schedule could be represented in terms of intervals that were integral multiples of 10 sec. Local rates of reinforcement are shown only at opportunities for reinforcement (times at which at least one interval in the schedule ended); the local rates take into account the absence of opportunities for reinforcement at times for which there are no entries in Table 3. The opportunity at 0 sec in the arithmetic VI schedule is omitted from the table for reasons to be discussed below.

Table 3 shows that the local rate of reinforcement increased with time since reinforcement in both the arithmetic and "linear" VI schedules. Except for the opportunity at 10 sec after reinforcement, it also increased in the VI schedules with the extra short intervals. In the constant-probability VI schedule, the local rate of reinforcement remained relatively constant, except at the latest times after reinforcement when the probability of reinforcement necessarily increased to 1.0. In the geometric VI schedule, the local rate of reinforcement decreased as time passed since reinforcement, except at the terminal opportunity (160 sec).

The conversion of these local rates of reinforcement to those for equivalent VI schedules with different mean intervals involves only the multiplication of each of the local rates by a constant. For an arithmetic VI schedule with a mean of 35 sec, for example, in which each of the intervals in the first column of Table 3 is halved, the local rates of reinforcement in the table are multiplied by two.

Corresponding changes in local rates of reinforcement over time since reinforcement occur in the schedules arranged by Chorney. Except for some deviations at the earliest and latest opportunities for reinforcement, local rate of reinforcement increased monotonically as time passed since reinforcement in the arithmetic VI schedule, decreased monotonically in the geometric VI schedule, and remained roughly constant in the constant-probability VI schedule. In other words, these directions of change in local rate of reinforcement are characteristic, respectively, of these three classes of distributions of intervals in VI schedules, and changes in local rate of reinforcement correspond in direction to the changes in local rate of responding observed in a given schedule (an increasing local rate of responding in the arithmetic VI schedule, a relatively high local rate of responding shortly after reinforcement in the extra-short-interval VI schedules, and so on).

Table 3
Local rates of reinforcement (rft/hr) at successive opportunities for reinforcement in six variable-interval schedules. Details in text.

Time since rft (sec) | Arithmetic VI 70-sec | Extra Short I VI 70-sec | Extra Short II VI 70-sec | "Linear" VI 70-sec | Constant-Probability VI 82.5-sec | Geometric VI 62-sec

10: 27, 51, 111, 13, 38, 80
20: 29, 29, 23, 35, 72
30: 31, 31, 37
40: 34, 34, 19, 40, 34, 51
50: 38, 38, 38
60: 42, 42, 28, 42
70: 48, 48, 48, 80, 36
80: 55, 55, 41, 39, 45
90: 66, 66, 44
100: 79, 79, 42, 160, 33
110: 103, 103, 80, 36
120: 144, 288, 103, 40
130: 240, 144, 240, 45
140: 720, 360, 720, 51
150: 60
160: 209, 90
170: 720

*Local rate of 55 rft/hr at 0 sec is omitted (see text).

Nevertheless, changes in the local rate of reinforcement are large compared to the changes in the local rate of responding. The local rate of reinforcement in the arithmetic VI schedule, for example, increases by a factor of almost 40 over the time from 10 to 140 sec after reinforcement, whereas the local rate of responding increases by a much smaller factor. In Exp. 1, however, the functions relating overall rate of responding to overall rate of reinforcement were generally monotonically increasing but negatively accelerated; beyond about 50 rft/hr, large changes in the overall rate of reinforcement produced relatively small changes in the overall rate of responding. It is therefore appropriate to compare the local rates of responding maintained by different local rates of reinforcement (Fig. 7-10) with the overall rates of responding maintained by different overall rates of reinforcement (Fig. 1; see also the top graph of Fig. 28, Appendix I).

Figure 15 compares for each pigeon the local rates of responding maintained by the arithmetic, extra-short-interval, and "linear" VI schedules (obtained: connected filled circles) and, from Fig. 1, the overall rates of responding maintained by overall rates of reinforcement that corresponded to the local rates of reinforcement at successive opportunities within each schedule (calculated: unconnected unfilled circles). The correspondence between the two sets of data is by no means perfect, but several features are encouraging. The two sets of data tend to increase and decrease together, even when they differ considerably in absolute value. In several cases, both sets of data agree fairly well in absolute rate as well as in the direction of change over time since reinforcement. Finally, some of the idiosyncratic characteristics of the performances of different pigeons, as in the larger rate changes for Pigeon 118 than for Pigeon 279, are reflected in both sets of data.

The disagreements between the two sets of data stem from several sources. One of the most important is the adequacy of the overall-rate data from Fig. 1. The data in Fig. 1 show average rates maintained by different overall rates of reinforcement, but the range of variation at a given overall rate of reinforcement is indicated only by the extent to which redeterminations differed from original determinations. Redetermined overall rates of responding were generally higher than the original rates. It therefore may be important that the local rates of responding in the schedules in Fig. 15 were generally higher, for three of the four pigeons, than the overall rates derived from Fig. 1, because the schedules in Fig. 15 were presented later than those in Fig. 1.

In addition, most of the overall rates of responding plotted in Fig. 15 were obtained indirectly, by interpolation between actual data points in Fig. 1, or, in the case of Pigeon 278 at VI 427-sec, by the linear extrapolation to zero from the lowest rate of reinforcement (8.4 rft/hr). For example, most of the local rates of reinforcement within the VI 427-sec schedules were lower than any of the overall rates of reinforcement arranged for this pigeon in Exp. 1, and this extrapolation yielded most of the overall rates of responding that were considerably lower than the local rates within each schedule for this pigeon in Fig. 15. Furthermore, the overall rates of responding in Fig. 1 were obtained with arithmetic VI schedules, in which local rates varied with time since reinforcement, rather than with constant-probability VI schedules. As indicated in Exp. 1 (Discussion), this characteristic of arithmetic VI schedules may have affected the overall-rate functions.

Variability in the performances maintained by the schedules in Fig. 15 also contributed to the disagreement between the two sets of data. The two most obvious cases are the idiosyncratic performance of Pigeon 118 in the "linear" schedule, within which the local rate of responding decreased at later times since reinforcement, and of Pigeon 129 in the first extra-short-interval schedule, within which the local rate of responding remained low shortly after reinforcement.

Finally, some systematic disagreement is evident at the earliest and the latest times since reinforcement. In the arithmetic schedule, the two sets of data disagree most at short times after reinforcement for Pigeons 118 and 279, and would do so also for Pigeons 129 and 278 if the two sets of data were adjusted to eliminate the differences in absolute value. In the extra-short-interval schedules, the overall rate of responding plotted at the latest time since reinforcement is relatively high in both schedules for Pigeon 118, in the first schedule for Pigeon 278, and in the second schedule for Pigeon 129. These differences probably depend on the calculation of local rates of reinforcement (Fig. 6) rather than on the properties of the performances and their controlling variables. There is no reason to believe that the arbitrary method of calculation of local rates of reinforcement takes into account either the properties of the 0-sec interval (Exp. 2), which was arranged in the arithmetic but not the other schedules of Fig. 15, or of the latest time after reinforcement, at which the probability of reinforcement necessarily increases to 1.0 and at which the calculation of the local rate of reinforcement is unique in that it is based on periods of time preceding but not following the opportunity for reinforcement.

Fig. 15. Comparison of local rates of responding obtained in four VI schedules (Fig. 7-10) and local rates of responding calculated from overall rates of responding (Fig. 1) and local rates of reinforcement (Fig. 6 and Table 3). Details in text.

Adjustments could be made in the method of calculating local rates of reinforcement that would reduce the disagreements in the two sets of data at the earliest and latest times since reinforcement (cf. discussion of Fig. 6). In view of the sources of disagreement, however, and in the absence of additional data, such adjustments seem premature. It is sufficient to note that, given the qualifications outlined, the agreement between the two sets of data is good enough to suggest that both the overall rate of responding maintained by different interval schedules and the local rates of responding as time passes within a particular interval schedule are controlled, at least in part, by the same


variable: rate of reinforcement. Because overall rates of reinforcement and overall rates of responding are simply weighted averages of the local rates, it also follows that the overall rate of responding is indirectly determined by the effects of different local rates of reinforcement as time passes, rather than directly determined by the overall rate of reinforcement.

Some additional evidence, supplementing

that in Fig. 15, is relevant to the relationshipbetween overall and local rates of respond-ing. In Exp. 1 (Discussion), it was suggestedthat, except for some of the arithmetic VIschedules with shorter mean values, the localrate of responding for a particular pigeonchanged in roughly the same way as timepassed since reinforcement in most of theschedules studied (Fig. 3). This was approxi-mately so for most schedules and mostpigeons, but it is possible, furthermore, to ac-

count for some of the deviations by compar-

ing the overall-rate functions (Fig. 1) and thelocal rates of reinforcement within differentarithmetic VI schedules.Compare, for example, Pigeon 278's per-

formance in the arithmetic VI 108-sec andVI 427-sec schedules (Fig. 3, but more easilyseen by comparing Fig. 5 and 7). The localrate of responding became fairly constantafter about 50 sec in the VI 108-sec schedulewhereas it increased over almost the entirerange of time since reinforcement in the VI427-sec schedule. The local rates of reinforce-ment ranged from 17.4 to 465 rft/hr in theformer schedule and from 4.4 to 118 rft/hr inthe latter schedule (lt to 14t sec, with t equalto 15.4 and 61.0 sec in the two schedules, re-

spectively). Looking at the difference betweenthe two schedules in another way, note thatthe local rate of reinforcement reached 25rft/hr at about 5t sec in the VI 108-sec sched-ule and at about 12t sec in the VI 427-secschedule. The two schedules therefore cov-

ered different parts of the overall-rate func-tion shown for Pigeon 278 in Fig. 1. Thisfunction was fairly flat beyond about 25 rft/hr or over most of the range of rates of rein-forcement locally represented in the VI 108-sec schedule, but it increased steeply up toabout 25 rft/hr or over most of the range ofrates of reinforcement locally represented inthe VI 427-sec schedule.In some other cases in which local rates of

responding changed differently in different

arithmetic VI schedules for a particular pi-geon (Fig. 3), the differences can be similarlyrelated to the form of the overall-rate func-tion for that pigeon. The major deviationsfor the arithmetic VI schedules with shortmean values (e.g., Pigeon 121 at VI 12.0-sec,Fig. 3) cannot be assessed in this way, for thereasons outlined in Exp. 1 and because thelocal rates of reinforcement at later timesafter reinforcement in those schedules ex-ceeded the overall rates of reinforcement inFig. 1 (in the arithmetic VI 12.0-sec schedule,for example, the local rate of reinforcementexceeded 300 rft/hr by 8t sec after reinforce-ment).
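The relation noted above, that an overall rate is a time-weighted average of local rates, can be illustrated with a minimal numeric sketch. The local rates and weights below are invented for illustration; they are not data from these experiments.

```python
# Sketch: overall response rate as a time-weighted average of local rates.
# Values are illustrative only, not data from the experiments.

def overall_rate(local_rates, time_weights):
    """Time-weighted average of local rates (responses/hr), where each
    weight is the fraction of post-reinforcement time spent at that rate."""
    assert abs(sum(time_weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(local_rates, time_weights))

# Hypothetical local rates early, midway, and late in the interval:
rates = [20.0, 45.0, 60.0]
weights = [0.25, 0.50, 0.25]  # fraction of time at each local rate
print(overall_rate(rates, weights))  # 42.5
```

On this view, changing where reinforcement falls in time redistributes the weights and local rates, and the overall rate follows only indirectly.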

EXPERIMENT 4: OVERALL AND LOCAL RATES OF RESPONDING WITHIN THREE FIXED-INTERVAL SCHEDULES

The fixed-interval (FI) schedule is the limiting case of the VI schedule: the distribution of intervals is narrowed down to a single value. The performance maintained by an FI schedule is usually characterized by a pause before the first response in an interval, and then by a gradually increasing rate of responding as the end of the interval approaches. Occasionally, the rate passes through a maximal value some time before the end of the interval (Ferster and Skinner, 1957) or, after extended exposure to a fixed-interval schedule, the responding after the initial pause may be maintained at a relatively constant rate throughout the remainder of the interval (Cumming and Schoenfeld, 1958).

The present analysis cannot easily be applied to the FI schedule, which includes a single opportunity at which the probability of reinforcement is 1.0. This single opportunity combines two difficulties in the analysis: that of the earliest opportunity (end of the shortest interval), which is preceded by a reinforcement rather than by another opportunity, and that of the latest opportunity (end of the longest interval), at which the probability of reinforcement necessarily increases to 1.0. The present experiment therefore examined some properties of responding within fixed intervals. Three FI schedules were studied: FI 30-sec, FI 50-sec, and FI 200-sec.


METHOD

Apparatus

The standard experimental chamber was similar to that described in the preceding experiments. The response key was illuminated by an orange 6-w bulb. Reinforcement duration was 3 sec. The controlling and recording apparatus was located in a separate room, and included a stepping switch that arranged the FI schedules and distributed responses to 10 counters representing successive tenths of the interval.

Subjects and Procedure

Four adult, male White Carneaux pigeons, maintained at about 80% of free-feeding body weight, were exposed to FI 50-sec, FI 200-sec, and FI 30-sec schedules in that order. Sessions of a different procedure (temporal discrimination: Reynolds and Catania, 1962) preceded FI 50-sec and intervened between FI 200-sec and FI 30-sec. Each FI schedule was maintained for approximately two months of daily sessions, at which time the performance of each pigeon had appeared stable (visual inspection of the data) for at least two weeks. Sessions of FI 50-sec and FI 30-sec consisted of 61 reinforcements: reinforcement of the first response of the session followed by reinforcement at the end of 60 intervals. Sessions of FI 200-sec consisted of only 31 reinforcements (30 intervals). Each interval was timed from the end of the preceding reinforcement. Data presented are averages over the last five sessions of each FI schedule.

RESULTS

Figure 16 shows local rates of responding in successive tenths of the fixed interval as a function of both absolute (left column) and relative (right column) time since reinforcement. In absolute time, the rate of responding increased most rapidly in the shortest fixed interval (FI 30-sec). In both the FI 30-sec and the FI 50-sec schedules, local rates of responding increased monotonically as time passed since reinforcement. In the FI 200-sec schedule, the local rate of responding began to decrease slightly about halfway through the interval for Pigeon 68, and decreased slightly shortly before reinforcement for Pigeon 236. About halfway through the 200-sec interval, the local rate of responding became roughly constant for Pigeon 237. Local rate of responding as a function of absolute time since reinforcement was typically lower in FI than in arithmetic VI schedules, but the two sets of data have some similar characteristics (cf. Fig. 2, VI 12.0-sec, VI 23.5-sec, and VI 108-sec).

Replotting the data against relative time since reinforcement (right column) shows that, for all four pigeons, the FI 200-sec schedule maintained a relatively higher rate of responding early after reinforcement and a relatively lower rate of responding later after reinforcement than the other two schedules. This change in the pattern of responding in relative time contrasts with that observed in VI schedules (Fig. 3). Despite the duration for which the schedules were continued, the relatively high rate of responding maintained early after reinforcement in the FI 200-sec schedule may have persisted from reinforcement at an earlier time in the preceding FI 50-sec schedule. It is also possible that reinforcement of the first response in each session had effects similar to those of a 0-sec interval, but this interpretation is contradicted by the results of Exp. 2, and it is unlikely that such effects would be most marked in FI 200-sec.

Overall FI rates of responding are plotted against reinforcements per hour in Fig. 17 (left). No systematic relationship is evident. In general, however, the terminal rate of responding (the local rate of responding just before reinforcement) increased as the overall rate of reinforcement increased (right). These data differ considerably from those obtained with VI schedules (Fig. 1), in which both overall and local rates of responding increased with increases in the overall rate of reinforcement over a roughly comparable range.

Fig. 16. Local rates of key-pecking maintained by three FI schedules as a function of absolute and relative times since reinforcement (four pigeons).

Fig. 17. Overall and terminal rates of key-pecking as a function of the rates of reinforcement provided by three FI schedules (four pigeons).

DISCUSSION

The FI schedules demonstrate the extensive effects of reinforcement at one time after reinforcement on local rates of responding at other times. For example, reinforcement at 200 sec after reinforcement for Pigeon 68 maintained a considerable rate of responding over most of the range of time since reinforcement. Thus, the calculation of local rates of reinforcement, in the introduction of Exp. 3 (Fig. 6), probably underestimates the effect of reinforcement on local rates of responding at remote points in time. The calculation of the local rate of reinforcement was based on a time extending only halfway back to a previous reinforcement or opportunity for reinforcement and halfway forward to the next reinforcement or opportunity for reinforcement. In many of the VI schedules discussed, this calculation involved a time of the order of only a few seconds, whereas reinforcement at one point in time affected rates of responding over a period of the order of 2 or 3 min in the FI 200-sec schedule.
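The window-based calculation described above can be sketched numerically. The function and variable names below are ours, not the authors'; the treatment of the first and last opportunities follows the verbal description in the text. With t = 15.4 sec and fourteen equiprobable intervals at 1t through 14t, the sketch roughly reproduces the 17.4 to 465 rft/hr range quoted earlier for the arithmetic VI 108-sec schedule; the small discrepancies at the endpoints presumably reflect the edge-case choices discussed in the text.

```python
# Sketch of the local-rate-of-reinforcement calculation: at each
# opportunity, the probability of reinforcement is divided by a window
# reaching halfway back to the previous opportunity (or reinforcement)
# and halfway forward to the next opportunity.

def local_reinforcement_rates(opportunity_times, probabilities):
    """Return local rates (rft/hr) at each opportunity time (sec)."""
    rates = []
    n = len(opportunity_times)
    for i, (t, p) in enumerate(zip(opportunity_times, probabilities)):
        back = (t - opportunity_times[i - 1]) / 2 if i > 0 else t / 2
        # The latest opportunity has no following opportunity, so its
        # window is based only on the preceding period (see text).
        fwd = (opportunity_times[i + 1] - t) / 2 if i < n - 1 else 0.0
        rates.append(3600.0 * p / (back + fwd))
    return rates

# Arithmetic VI with t = 15.4 sec and opportunities at 1t, ..., 14t;
# with 14 equiprobable intervals, the conditional probability of
# reinforcement at the k-th opportunity, given that it is reached,
# is 1 / (15 - k).
t = 15.4
times = [k * t for k in range(1, 15)]
probs = [1.0 / (15 - k) for k in range(1, 15)]
rates = local_reinforcement_rates(times, probs)
print(round(rates[0], 1), round(rates[-1], 1))  # 16.7 467.5
```

The halving of the final window is what makes the local rate jump at the latest opportunity, the feature the text identifies as a difficulty for the FI case.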

In an FI performance, the time since reinforcement may function as one discriminable property of the many aspects of the experimental situation (cf. Skinner, 1938, Ch. 7). Times since reinforcement that are consistently correlated with nonreinforcement may come to control low rates of responding in much the same way as do other stimulus properties (e.g., intensity, wavelength). Discriminable periods of nonreinforcement in FI schedules probably contribute to the fact that an FI schedule generally maintains a lower overall rate of responding than a VI schedule providing the same overall rate of reinforcement. For example, except for Pigeon 121 in Fig. 1, the overall rates of responding maintained by arithmetic VI schedules were consistently higher than the overall rates maintained by corresponding FI schedules in Fig. 17.

Despite the possibility of temporal control, the performances maintained by FI schedules do not appear to be as well under the control of time since reinforcement as the capacity of the pigeon to discriminate durations would suggest (Reynolds and Catania, 1962; Reynolds, 1966; Stubbs, 1968). In an FI performance, responding occurs at appreciable rates even at times since reinforcement well before the opportunity for reinforcement at the end of the interval. Other factors presumably operate to attenuate temporal control in an FI schedule.

Fixed-interval reinforcement sets the occasion for the incidental but consistent correlation of responding at one time in an interval and subsequent reinforcement at the end of the interval. The early responding may be maintained by the later reinforcement (cf. Dews, 1962, who suggests that responding over time in an FI schedule reflects a delay-of-reinforcement gradient). Such delayed reinforcement must operate in conjunction with temporal discrimination: the time in the interval at which responding occurs must be to some extent discriminated if the responding is to be consistently controlled by the time that separates it from reinforcement at the end of the interval.

The incidental properties of the FI schedule may even produce effects in relatively trivial ways. For example, a decrease in response rate toward the end of an interval (Pigeon 68 at FI 200-sec, Fig. 16) may have its origin in an increase in the frequency with which the pigeon looked toward the feeder as the time approached when a response would operate the feeder.

Procedurally, the FI schedule is the simplest of the interval schedules but, paradoxically, the variables that appear to operate in FI schedules suggest that, in at least one respect, FI performance is more complex than VI performance. The FI schedule is at one extreme of a continuum of schedules that differ in the degree to which they allow discriminative control by time since reinforcement; at the other extreme is the constant-probability VI schedule, which simplifies performance by eliminating the temporal patterning of reinforcement as a controlling variable. The implication is that, although FI schedules show that effects of reinforcement extend over a considerable period of time since reinforcement, the contribution of FI performance to the quantitative analysis of VI schedules may not be simple and direct.


The next experiment explores the effects of combining two FI schedules. In terms of procedure, such a combination produces the simplest VI schedule, which consists of only two intervals, and permits further examination of the spread of the effects of reinforcement over the continuum of time since reinforcement.

EXPERIMENT 5: EFFECTS OF THE SEPARATION IN TIME OF OPPORTUNITIES FOR REINFORCEMENT IN TWO-VALUED INTERVAL SCHEDULES

This experiment addressed two separate but related questions. First, what is the effect of one probability of reinforcement on the local rate of responding maintained by a second probability as the period of time that separates them is varied? Second, what is the effect of the magnitude of their separation on the local rates of responding during the period of time between them?

The experiment compared the performance maintained by a single-valued interval schedule, FI 240-sec, with the performances maintained by several two-valued interval schedules. One interval in the two-valued schedules was 240 sec; the other was shorter (30, 90, 150, or 210 sec) and was presented with a relative frequency of 0.05 or 0.50. For example, in a two-valued schedule of 90-sec and 240-sec intervals with a relative frequency of 0.05 for the 90-sec interval, reinforcement was available 90 sec after the previous reinforcement in one of every 20 intervals, and at 240 sec after reinforcement in the remaining intervals. According to the terminology of Ferster and Skinner (1957, Ch. 11), this schedule is a mixed FI 90-sec FI 240-sec schedule. The present purposes, however, require a specification not only of the durations of the scheduled intervals but also of their relative frequencies (cf. Millenson, 1959).

The proximity in time of the two opportunities in the two-valued schedules changed when the duration of the short interval was varied over the range from 30 to 210 sec. Thus, the temporal separation of the two opportunities was necessarily confounded with the time since reinforcement at which the earlier opportunity occurred.

METHOD

Subjects and Apparatus

Four adult, male White Carneaux pigeons were maintained at 80% of free-feeding body weights. The key-pecking of each pigeon had previously been maintained by FI schedules of reinforcement for at least 40 hr.

The experimental chamber was similar to that described in Exp. 1. The schedules were arranged by stepping switches operated every 10 sec by an electronic timer and reset after each reinforcement. The stepping switches arranged reinforcement either after 240 sec (24 steps of the stepping switches) or, according to the scheduled occurrences of the shorter interval, at the time specified for this interval. Each interval began only after the 4-sec reinforcement at the end of the preceding interval. The stepping switches also served to distribute responses to 24 counters that represented the twenty-four 10-sec periods of time since reinforcement.
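The response-recording arrangement just described can be given a rough software analogue: a clock that advances every 10 sec routes each response to one of 24 counters, one per 10-sec period since the last reinforcement. This sketch is our reconstruction for illustration, not the authors' relay circuit.

```python
# Sketch: binning responses into twenty-four 10-sec counters, as the
# stepping switches did (our reconstruction, not the original apparatus).

STEP_SEC = 10.0
N_BINS = 24  # twenty-four 10-sec periods cover a 240-sec interval

def bin_responses(response_times):
    """Tally responses (times in sec since reinforcement) into 24 bins."""
    counters = [0] * N_BINS
    for t in response_times:
        i = min(int(t // STEP_SEC), N_BINS - 1)
        counters[i] += 1
    return counters

counters = bin_responses([3.2, 12.5, 15.0, 239.9])
print(counters[0], counters[1], counters[23])  # 1 2 1
```

Dividing each counter by the time its bin was in effect converts these tallies into the local response rates plotted in the figures.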

Procedure

The schedules and sessions for each pigeon are summarized in Table 4. In each two-valued interval schedule, the long interval was 240 sec. The table shows the duration of the short interval and its probability or relative frequency of occurrence, in rft/op. An entry of 240 sec indicates that no short interval was arranged.

Table 4
Sequence of two-valued interval schedules for each pigeon. Entries show the duration (sec) and relative frequency (rft/op) of the short interval. The long interval was held constant at 240 sec.

           Pigeon 85        Pigeon 86        Pigeon 34        Pigeon 35
Short   Rft/Op    Short   Rft/Op    Short   Rft/Op    Short   Rft/Op    Sessions
240     1.00*     240     1.00      240     1.00*     240     1.00*     40
30      0.50      210     0.50      30      0.05      210     0.05      52
30      0.05      210     0.05      90      0.05      150     0.05      102
150     0.05      30      0.05      150     0.05      150     0.50      61
240     1.00*     30      0.50      210     0.05      90      0.50      37
240     1.00      90      0.50      240     1.00*     90      0.05      39
240     1.00*     240     1.00*     240     1.00*     240     1.00*     47

* Fixed-interval schedule (FI 240-sec).

Each daily session consisted of 21 reinforcements: reinforcement of the first response of the session, followed by 20 intervals. When the relative frequency of the short interval was 0.05, a single short interval occurred at a different place in the sequence of 20 intervals in each session. When the relative frequency of the short interval was 0.50, an irregular sequence of short and long intervals varied from one session to the next. The sequence never included more than four successive occurrences of either interval, and was balanced so that the relative frequencies of the short and long intervals were independent of the duration of the preceding interval.

The session durations varied from about 45 min (short interval of 30 sec with a relative frequency of 0.50, or mixed FI 30-sec FI 240-sec) to about 80 min (no short interval, or FI 240-sec). The data presented are averages over the last five sessions of each schedule.
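One way to draw an irregular session sequence of the kind described, with equal numbers of short and long intervals and no more than four successive occurrences of either, is rejection sampling. This sketch is ours, not the authors' procedure, and it enforces only the run-length constraint; the authors' additional balancing of conditional frequencies would require a further check.

```python
import random

# Sketch (not the authors' procedure): draw an irregular sequence of
# 10 short and 10 long intervals with no run longer than four, by
# shuffling and rejecting sequences that violate the run constraint.

def session_sequence(short, long_iv, rng, max_run=4, n_each=10):
    while True:
        seq = [short] * n_each + [long_iv] * n_each
        rng.shuffle(seq)
        run, ok = 1, True
        for a, b in zip(seq, seq[1:]):
            run = run + 1 if a == b else 1
            if run > max_run:
                ok = False
                break
        if ok:
            return seq

seq = session_sequence(30, 240, random.Random(0))
print(seq)
```

Each accepted sequence has exactly 10 intervals of each duration, so the relative frequency of the short interval is 0.50 within every session.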

RESULTS

Local rates of responding. Figure 18 shows local rates of key-pecking as a function of time since reinforcement for Pigeon 34 in five interval schedules. The performance maintained by the single-valued schedule (FI 240-sec) is represented in the bottom panel (filled circles). The performances maintained by the two-valued schedules, which consisted of 240-sec intervals and a shorter interval, are represented in the top four panels. The relative frequency of the shorter interval, here equal to rft/op, was always 0.05, and its duration was 30, 90, 150, or 210 sec. Reading the graphs from top to bottom shows the effect of moving the end of the short interval from an early time after reinforcement up to coincidence with the end of the 240-sec interval. Differences in the performances were therefore due at least in part to changes in the time separating the probability of reinforcement of 0.05 from the terminal probability of 1.0. These differences were of two sorts. First, when the probability of 0.05 was at 30 sec, the rate of responding declined for some time before it increased as the terminal probability of 1.0 at 240 sec was approached, whereas when the probability of 0.05 was at 90, 150, or 210 sec, the rate of responding did not decline.

Second, the rate of responding maintained by the probability of 0.05 was lower the earlier it occurred or, in other words, the longer the period of time that separated it from the terminal probability at 240 sec. These two variables, the time at which the probability of 0.05 occurred and its separation from the terminal probability, were confounded: the later the probability of 0.05, the less its temporal separation from the terminal probability. Data from Exp. 1 and 3, however, suggest that the temporal proximity of the terminal probability is the more relevant variable. In those experiments, increases in time since reinforcement generally produced decreases in the rate of responding maintained by a given probability. In the arithmetic VI schedule, for example, the response rate maintained by the probability of reinforcement at 1t sec decreased as t increased (the left-hand point on each function in Fig. 2, Exp. 1). Because this change in the local rate of responding maintained by a given probability with changes in its location in time was opposite to that obtained in the present experiment, the present findings cannot be attributed solely to the changes in the location in time of the probability of 0.05 in the two-valued interval schedules. A similar point can be made based on the FI data of Exp. 4 (Fig. 17, terminal rate).

Additional data are shown in Fig. 19. The data from Pigeon 86 (left column, upper three panels) show the effect of a probability of reinforcement of 0.50 at various times since reinforcement; the data from Pigeons 85 (left column, bottom panel) and 35 (right column) show the effect of a probability of 0.05 at various times. The responding maintained by the two-valued interval schedules (unfilled circles) is compared with that maintained by the one-valued schedule, FI 240-sec (filled circles).


Fig. 18. Rate of key-pecking as a function of time since reinforcement in five interval schedules (Pigeon 34). With a probability (rft/op) of 0.05, a shorter interval was introduced into an FI 240-sec schedule at the times since reinforcement indicated by the arrows in the top four panels. The bottom panel shows the performance maintained by FI 240-sec without a shorter interval.

The local rate of responding maintained by the probability of 0.50 (Pigeon 86) at 30 sec was approximately equal to the rate maintained 210 sec later by the terminal probability at 240 sec. This is consistent with findings of Exp. 1 and 3 (Fig. 3, 7, and 8), which showed that a probability of reinforcement of 0.50 maintained about the same rate of responding as a probability of 1.0. The equality of the rates cannot be attributed to the closeness of the two probabilities in time, because they were separated by 210 sec and because within these 210 sec the local rate of responding decreased and then increased.

The local rates of responding maintained by the probabilities of 0.50 and 1.0 were also about equal when the probability of 0.50 was at 90 sec (left column, panel b), but there was little if any decrease in response rate between 90 and 240 sec. Finally, with the probability of 0.50 at 210 sec (panel c), the performance was scarcely distinguishable from that maintained by the FI 240-sec schedule.

The performance of Pigeon 85 partially confirms the conclusions drawn from the performance of Pigeon 34 (Fig. 18). The rate maintained by the probability of 0.05 was lower, relative to the terminal rate at 240 sec, when this probability occurred at 30 sec than when it occurred at 150 sec. Even when this probability occurred at 30 sec, however, the rate of responding did not decline during the time between this and the terminal probability of 1.0. The performance of Pigeon 35 (right column) was atypical in that the local rate of responding consistently passed through a maximum earlier than 240 sec after reinforcement, even in the FI 240-sec schedule. Nevertheless, the maximum in the local rate of responding maintained by the probability of reinforcement of 0.05 occurred later when this probability was moved from 90 to 150 sec (panels a and b), and the performance became more like that maintained by the FI 240-sec schedule when the probability was moved to 210 sec.

Figure 20 directly compares the responding maintained by a probability of reinforcement of 0.05 at 30 sec after reinforcement and the responding maintained by a probability of 0.50 at the same time after reinforcement (data from Pigeon 85). Relative to the performance maintained by FI 240-sec (filled circles), the probability of 0.05 produced an increase in the local rate of responding at 30 sec. This rate was considerably lower than that at 240 sec, when the probability became 1.0. The probability of 0.50 produced a relatively higher response rate at 30 sec, and the rate increased slightly throughout the remainder of the interval. This is consistent with the findings of Exp. 1 and 3 (Fig. 3, 7, and 8), in that the rate of responding maintained by a probability of 0.05 was low relative to that maintained by a probability of 0.50.

Fig. 19. Rate of key-pecking as a function of time since reinforcement (Pigeons 86, 85, and 35). Filled circles show the performance maintained by an FI 240-sec schedule. Unfilled circles show the performances maintained when a shorter interval was introduced with the probabilities (rft/op) and at the times since reinforcement indicated by the arrows.

Fig. 20. Rate of key-pecking as a function of time since reinforcement in three interval schedules (Pigeon 85). Filled circles show the performance maintained by an FI 240-sec schedule. Unfilled circles show the performances maintained when an interval of 30 sec was introduced with a probability (rft/op) of 0.05 or 0.50 into the FI schedule.

When the two intervals, 30 sec and 240 sec, occurred with equal relative frequencies for Pigeon 85, they maintained a performance that was, over most of the range of time after reinforcement, similar to those maintained by some of the VI schedules examined in Exp. 1 and 3. Thus, two-valued interval schedules can sometimes sustain responding over a considerable period after reinforcement as effectively as can many-valued (VI) schedules. Similar findings have been discussed by Millenson (1959), who examined a mixed FI 30-sec FI 120-sec schedule in which the relative frequency of the shorter interval was 0.40. These data may indicate that different probabilities of reinforcement (e.g., 0.05 and 0.50) vary in the extent to which their effects spread in time.

A comparison of the probabilities of 0.05 and of 0.50 at four different times after reinforcement is shown in Fig. 21 (data from Pigeons 86 and 35). At 30 sec, the difference between the rates of pecking maintained by these two probabilities was considerable (Pigeon 86, lower left panel). This pigeon's rate of responding, unlike that of Pigeon 85 in Fig. 19, decreased before again increasing during the time from 30 to 240 sec. A difference in the effects of the two probabilities was also evident at 90 sec (Pigeon 35, upper left panel) and, to a lesser extent, at 150 sec (Pigeon 35, upper right panel). At 210 sec (Pigeon 86, lower right panel), both probabilities maintained rates of responding about equal to those maintained 30 sec later, at 240 sec.

Overall rates of reinforcement. The introduction of short intervals increased the overall rate of reinforcement relative to the 15 rft/hr provided by the single-valued FI 240-sec schedule. The probability of 0.50 at the end of the short interval made available rates of reinforcement ranging from 16 (short interval of 210 sec) to 26.7 (short interval of 30 sec) rft/hr. The probability of 0.05 at the end of the short interval, however, made available only 15.1 (short interval of 210 sec) to 16 (short interval of 30 sec) rft/hr. To some extent, the changes in the overall rates of responding maintained by these schedules may have been determined by these changes in the overall rate of reinforcement. The changes in overall rate of responding, however, were not consistent with those predicted from the general form of the functions of Exp. 1 (Fig. 1) and therefore must have depended at least in part on the changes in the distribution of intervals in time. This is particularly the case for the probability of 0.05, for which a change in rft/hr of only about 6% (from 15 to 16 rft/hr) produced 50% to 90% increases in overall rates of responding.
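The overall reinforcement rates quoted above follow directly from the mean programmed interval: with the long interval fixed at 240 sec, the overall rate is 3600 divided by the weighted mean of the two interval durations (time spent in reinforcement ignored, consistent with the quoted figures). A quick check; the quoted 16 rft/hr for the 30-sec short interval at 0.05 appears to be rounded from 15.7.

```python
# Quick check of the overall reinforcement rates quoted in the text,
# computed from the mean programmed interval.

def overall_rft_per_hr(short, p_short, long_iv=240):
    """Overall reinforcement rate (rft/hr) for a two-valued schedule with
    a short interval of the given duration and relative frequency."""
    mean_interval = p_short * short + (1 - p_short) * long_iv
    return 3600.0 / mean_interval

print(round(overall_rft_per_hr(240, 1.00), 1))  # FI 240-sec: 15.0
print(round(overall_rft_per_hr(30, 0.50), 1))   # 26.7
print(round(overall_rft_per_hr(210, 0.50), 1))  # 16.0
print(round(overall_rft_per_hr(210, 0.05), 1))  # 15.1
print(round(overall_rft_per_hr(30, 0.05), 1))   # 15.7 (quoted as 16)
```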

Fig. 21. Rate of key-pecking as a function of time since reinforcement (Pigeons 35 and 86). Filled circles show the performance maintained by an FI 240-sec schedule. Unfilled circles show the performances maintained when a shorter interval (30, 90, 150, or 210 sec) was introduced with a probability (rft/op) of 0.05 or 0.50 into the FI schedule.

Cumulative records. The relative consistency of the performances maintained within individual intervals by VI schedules (Fig. 4 and 13) was not a characteristic of the FI 240-sec and the mixed FI FI schedules of the present experiment. Figure 22 shows cumulative records of the performance of Pigeon 86 in five of the present schedules, and indicates that the average rates of responding illustrated in Fig. 18 through 21 are not necessarily representative of responding within individual intervals. The performance maintained by FI 240-sec, for example, in the top record of Fig. 22, is fairly typical of the variability in output from interval to interval in FI performances as reported by Ferster and Skinner (1957), among others. The temporal patterning of responding also varied considerably from interval to interval. A fairly constant rate of responding was maintained throughout most of some intervals in the record, for example, whereas a gradually increasing rate of responding was observed in other intervals.

The record illustrating the performance maintained with a probability of 0.05 at 30 sec shows that a transition from a high to a low rate, followed by a return to a higher rate, occurred in most intervals of the schedule. Intervals in which this pattern was absent tended to occur early in the session. Again, the record indicates that the smoothness of the average curve in Fig. 21 was not representative of the performances in individual intervals. Slow changes in local rates of responding were observed, as in the second full interval after the short interval in the illustrative record, but rather abrupt transitions from a fairly high rate to an almost zero rate, followed by a return to a higher rate, were also fairly common, as in the next-to-last interval of the record.

Fig. 22. Cumulative records of full sessions of Pigeon 86's key-pecking maintained by an FI 240-sec schedule and by four schedules in which a shorter interval (30, 90, or 210 sec) was added with a probability (rft/op) of 0.05 or 0.50. The recording pen reset to baseline after each reinforcement, indicated by diagonal pips.

This pattern of responding, in which the rate decreased and then increased within individual intervals, also occurred when the probability at 30 sec was raised to 0.50, as illustrated by the third cumulative record. Within this schedule, however, the performance from interval to interval seemed somewhat more variable and, again, the temporal patterning was typically absent in the early intervals of the session.

When the probability of 0.50 was moved to 90 sec (fourth record), a fairly uniform rate of responding was maintained within each interval, consistent with the average data presented in Fig. 19 (86b). Although this pattern of responding was fairly regular, the particular rate of responding maintained throughout each interval tended to vary from interval to interval.

When the probability of 0.50 was moved to 210 sec (fifth record), the performance became more like that maintained by FI 240-sec (see Fig. 19, 86c, and Record 1 of Fig. 22), with perhaps somewhat more variability in the total output from interval to interval than was maintained by the FI schedule.

DISCUSSION

When the two probabilities of reinforcement in the two-valued schedules were separated by a considerable period of time, the probability of 0.05 at the end of the short interval maintained lower local rates of responding than did the probability of 1.0 at the end of the long interval. The difference between the local rates maintained by the two probabilities became smaller as the two probabilities moved closer in time. The probability of 0.50 at the end of the short interval, on the other hand, maintained about the same local rate of responding as the later probability of 1.0 even when the temporal separation of the two probabilities was large. The findings are consistent with the effects of different probabilities of reinforcement in the VI schedules of Exp. 1 and 3. The exceptions to these generalizations again demonstrate the consistency of individual differences among pigeons. For each of the schedules studied with Pigeon 35, for example, the local rate of responding passed through a maximal value at some time before the end of the 240-sec interval (Fig. 19 and 20), a characteristic of performance not noted for the other pigeons.

To some extent, the performances maintained by the present schedules can be

considered simple combinations of the performances separately maintained by the component FI schedules. Compare, for example, Pigeon 69 (Fig. 16) and Pigeon 86 (Fig. 19), whose performances on FI 200-sec and FI 240-sec, respectively, were similar in both absolute level and the temporal patterning of responding. Data for Pigeon 69 show local rates of responding maintained separately by intervals of 30 and 200 sec (Fig. 16, left); data for Pigeon 86 show local rates of responding maintained by almost the same intervals, 30 and 240 sec, in combination (Fig. 19, upper left). The agreement between the two sets of data is fairly good, when it is considered that reinforcement at 30 sec was arranged with a probability of 1.0 for Pigeon 69 and 0.50 for Pigeon 86, and that the FI 30-sec schedule for Pigeon 69 necessarily prohibited responding after 30 sec since reinforcement. Some differences include the lower local rates of responding shortly after reinforcement in the FI 30-sec schedule for Pigeon 69 than in the two-valued schedule for Pigeon 86, and the somewhat larger difference between the terminal rates in the two FI schedules for Pigeon 69 than between the rates at the end of the two intervals in the schedule for Pigeon 86 (cf., however, the later performance of Pigeon 86 in the same schedule: Fig. 23, Exp. 6). Greater disagreement can be found by making the same kind of comparison between the FI data for Pigeon 69 and the corresponding data for Pigeon 85 (Fig. 20, rft/op = 0.50). Pigeon 85 differed from Pigeon 86 primarily in the higher local rates of responding maintained between the two opportunities for reinforcement in the two-valued schedule.

The relevance of the present findings to the

analysis in terms of local rates of reinforcement in Exp. 3 lies mainly in their indication of the extensive period of time since reinforcement over which a particular probability of reinforcement can have its effect (cf. Exp. 4), and of the degree to which a later high probability of reinforcement can influence the local rate of responding maintained by an early low probability of reinforcement (e.g., rft/op = 0.05). The calculation of local rates of reinforcement described in Exp. 3 (Fig. 6) contributes little to the analysis of the two-valued schedules. By this calculation, the local rate of reinforcement remains constant at the end of the short interval and increases


about seven-fold at the end of the long interval as the short interval is moved from 30 to 210 sec since reinforcement. This inconsistency with the observed local rates of responding in the schedules again demonstrates the limited applicability of the method of calculating local rates of reinforcement.

EXPERIMENT 6: EFFECTS OF THE OMISSION OF REINFORCEMENT AT THE END OF THE LONG INTERVAL IN TWO-VALUED INTERVAL SCHEDULES

Experiment 5 suggested that the responding at and near the end of the short interval in two-valued interval schedules is maintained not only by reinforcement at the end of the short interval but also by reinforcement at the end of the long interval. The present experiment examined the role of the long interval in two-valued interval schedules by substituting timeout, an event that generally does not serve as a reinforcer, for reinforcement at the end of the long interval.

METHOD

The two-valued FI schedules of Exp. 5 were modified by substituting a 4-sec period of timeout (no key light or house light) for the 4-sec reinforcement at the end of the 240-sec interval. Reinforcement remained available for the first response of each session and at the end of the short interval. Table 5 summarizes the procedure and indicates the duration and relative frequency of the short intervals for each pigeon. When reinforcement was available at 240 sec (conditions 1 and 4 in Table 5), details were the same as in Exp. 5; each session consisted of 21 reinforcements. With timeout substituted for reinforcement at 240 sec (conditions 2 and 3 in Table 5), sessions consisted of 20 intervals: two reinforcements per session when the short interval occurred with a relative frequency of 0.05 (reinforcement of the first response of the session and at the end of the single short interval), and about 11 reinforcements per session when the short interval occurred with a relative frequency of 0.50 (reinforcement of the first response of the session and at the end of about 10 short intervals).

During the first 25 sessions of the second condition, timeout was produced by the first response after the end of the 240-sec interval. Thereafter, timeout occurred independently of responses at the end of the 240-sec interval.
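The reinforcement counts just quoted follow from simple arithmetic. The sketch below (Python, purely illustrative; the function name is ours) computes the expected count under the stated conditions: 20 intervals per session, reinforcement of the session's first response, and short intervals ending in reinforcement with the given relative frequency.

```python
def expected_reinforcements(n_intervals, rel_freq_short):
    # One reinforcement for the first response of the session, plus
    # one for each short interval (expected count: n_intervals * rel_freq_short).
    return 1 + n_intervals * rel_freq_short

# Relative frequency 0.05: 1 + 20 * 0.05 = 2 reinforcements per session.
# Relative frequency 0.50: 1 + 20 * 0.50 = 11 reinforcements per session
# (the text's "about 11", since the sampled number of short intervals varied).
```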

RESULTS

For each pigeon, Fig. 23 shows the local rates of responding maintained by a probability of reinforcement of 0.50 when a response was reinforced with a probability of 1.0 at 240 sec (filled circles) and when timeout occurred at 240 sec (unfilled circles). The schedules with reinforcement at 240 sec maintained performances roughly comparable to those maintained by the equivalent schedules in Fig. 5. One exception, in the performance of Pigeon 86, was that the local rate of responding maintained by the probability of 0.50 at 30 sec was considerably higher than the rate maintained by the higher probability of 1.0 at 240 sec (Fig. 19, 86a).

The substitution of a 4-sec timeout for the 4-sec reinforcement had only small effects on local rates of responding. For all pigeons, the local rate of responding immediately after

Table 5
Sequence of interval schedules for each pigeon. Entries show the relative frequency (rft/op) of the short interval. The terminal event at 240 sec was either reinforcement of a response (Rft) or a 4-sec timeout (TO).

Pigeon                   85      86      34      35
Short interval (sec)     30      30      90      90

Terminal event
(240-sec)              Rft/Op  Rft/Op  Rft/Op  Rft/Op   Sessions
1. Rft                  0.05    0.50    0.05    0.50       44
2. TO                   0.05    0.50    0.05    0.50       57*
3. TO                   0.50    0.05    0.50    0.05       54
4. Rft                  0.50    0.05    0.50    0.05       41

*The 4-sec timeout was response-dependent during the first 25 sessions and response-independent thereafter.


Fig. 23. Rate of key-pecking as a function of time since the start of an interval, for four pigeons. In one schedule (filled circles), reinforcement was available at 240 sec. In the other schedule (unfilled circles), a response-independent 4-sec timeout (TO) was presented at 240 sec. In both schedules, reinforcement was available with a probability (rft/op) of 0.50 at either 30 (Pigeons 85 and 86) or 90 (Pigeons 34 and 35) sec after the start of an interval.

timeout (early times in each interval) was higher than the rate in the equivalent schedules with reinforcement at 240 sec. Local rates of responding after the end of the short interval (rft/op = 0.50) were fairly similar in the two types of schedules. One factor that may have contributed to the small effect of substituting timeout for reinforcement was that even with timeout at 240 sec, reinforcement occasionally followed 30 or 90 sec later (in the short interval). This does not seem to account for the higher local rates early in intervals, however, because rates of responding immediately after timeout were, for Pigeons 86, 34, and 35, lower than local rates of responding immediately preceding timeout (initial and terminal local rates). An alternate possibility is that timeout did not fully acquire control as a temporal reference point for subsequent responding within intervals, so that the substitution of timeout for reinforcement produced higher local rates of responding early in intervals. In the preceding experiments, the durations of intervals had been timed consistently from a preceding reinforcement. The effects, however, were evident in the performances of Pigeons 85 and 34, for which the data presented are based on over 100 sessions with timeout at 240 sec, as well as in the performances of Pigeons 86 and 35. Finally, since timeout is sometimes followed by relatively high rates of responding in interval schedules, the rise in local rate early in intervals may reflect a direct effect of timeout on subsequent responding (e.g., Ferster, 1958; Neuringer and Chung, 1967).

Figure 24 compares, for each pigeon, the performances maintained with reinforcement at


240 sec (filled circles) and with timeout at 240 sec (unfilled circles) when the probability of reinforcement at the end of the short interval was 0.05. Again, with the exception of the elevated local rate of responding at 30 sec for Pigeon 86, the performances maintained by the schedules with reinforcement at 240 sec were roughly comparable to the equivalent performances in Exp. 5. When timeout was substituted for reinforcement at 240 sec, however, the schedules lost control over the distribution of responses in time. In other words, a relatively low and constant local rate of responding was maintained, independent of the time elapsed in the interval. For Pigeons 85 and 34, the local rate of responding was about equal to the local rate maintained by the probability of 0.05 with reinforcement at 240 sec. For Pigeons 86 and 35, the local rate of responding was considerably lower


than the local rate maintained by the probability of 0.05 with reinforcement at 240 sec. The low local rates of responding, however, do not represent a stable and relatively continuous low rate of responding, but rather an average over higher rates of responding alternating irregularly with long periods of no responding. Thus, the probability of 0.05, unlike the probability of 0.50, was less effective in maintaining responding when the higher probability of reinforcement at a later time was removed. In other words, the rate of responding maintained by the probability of 0.05 probably was supported in part by the probability of 1.0 at 240 sec.

Fig. 24. Rate of key-pecking as a function of time since the start of an interval, for four pigeons. Same as Fig. 23, except that reinforcement at either 30 (Pigeons 85 and 86) or 90 (Pigeons 34 and 35) sec after the start of the interval was available with a probability (rft/op) of 0.05.

DISCUSSION

When reinforcement was eliminated at 240 sec, the temporal pattern of responding was maintained when the earlier probability was


0.50, but not when the earlier probability was 0.05. It is possible that the temporal pattern could have been maintained by the probability of 0.05 under different circumstances. The elimination of reinforcement at 240 sec not only changed the stimulus from which intervals were timed, but also, when the earlier probability was 0.05, drastically decreased the overall rate of reinforcement. Responding might have been maintained more consistently at and near the end of the short interval if the probability at 240 sec had been gradually rather than abruptly reduced from 1.0 to zero or if, in the absence of reinforcement at 240 sec, a high probability at the end of the short interval (e.g., 0.90) had been gradually reduced to 0.50 and then to 0.05. It may also be relevant that the number of intervals and number of reinforcements per session were relatively small, although the present results were obtained over a reasonably large number of sessions (cf. Method).

The finding that reinforcement at the end

of a long interval may support the responding maintained by earlier opportunities for reinforcement has implications for the analysis of interval schedules in terms of local rates of reinforcement. The calculation of local rates of reinforcement should not be limited only to the opportunities for reinforcement within a particular schedule. Instead, the local rate of reinforcement at any time since reinforcement should be based on the opportunities that occur over an extended period of time, with the probabilities of reinforcement at the different opportunities weighted as a function of their proximity to the time in question. The period of time over which opportunities for reinforcement contribute to the local rate of reinforcement at a particular time probably should grow as a function of the absolute time since reinforcement. Local rates of reinforcement calculated for successive points of time since reinforcement, therefore, would be a kind of moving weighted average of the probabilities of reinforcement over successive overlapping ranges of time. Such a calculation would take into account some of the properties of the interaction of different probabilities of reinforcement and different times since reinforcement explored in Exp. 4, 5, and 6. But the formulation of the quantitative details, their application to the VI schedules of the earlier experiments,

and the coordination of the overall-rate functions from VI schedules with the local rates in FI and two-valued interval schedules are beyond the scope of this paper. In the present research, different pigeons served in different experiments, and the magnitude of the individual differences among pigeons suggests that such an analysis would not stand up well to comparisons across pigeons. For the present, then, the formulation in Exp. 3 must be considered a first approximation, with its application limited to many-valued interval schedules.
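Although the quantitative details are left open in the text, the kind of calculation being proposed can be sketched. The function below is a hypothetical implementation, not the authors' formulation: the window width and the triangular weighting are arbitrary assumptions chosen only to illustrate a moving weighted average whose range widens with time since reinforcement.

```python
def local_rate_of_rft(t, opportunities, window_factor=0.5):
    """Hypothetical local rate of reinforcement at time t (sec) since
    reinforcement. `opportunities` maps each time since reinforcement at
    which reinforcement may occur to its probability (rft/op). Weights
    decay linearly with distance from t, over a window that grows in
    proportion to the absolute time since reinforcement."""
    window = max(window_factor * t, 1.0)
    num = den = 0.0
    for t_op, p in opportunities.items():
        w = max(0.0, 1.0 - abs(t_op - t) / window)  # triangular weight
        num += w * p
        den += w
    return num / den if den else 0.0

# Two-valued schedule of Exp. 5: rft/op = 0.05 at 30 sec, 1.0 at 240 sec.
two_valued = {30: 0.05, 240: 1.0}
```

With these assumptions, the estimate at 30 sec reflects only the nearby 0.05 opportunity, while late in the interval the 1.0 opportunity dominates; a sufficiently wide window (larger window_factor) would let the late high probability reach back toward earlier times, as the data of Exp. 5 and 6 suggest it should.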

GENERAL DISCUSSION

The present experiments examined the effects on responding of a variety of characteristics of interval schedules of reinforcement. Experiment 1 established that the overall rate of responding maintained by arithmetic VI schedules was a monotonically increasing, negatively accelerated function of the overall rate of reinforcement (Fig. 1). The local rate of responding at a given time since reinforcement was, correspondingly, a monotonically increasing, negatively accelerated function of the probability of reinforcement at that time (Fig. 3). Experiments 2 and 3 examined VI schedules with different distributions of intervals. Experiment 2 demonstrated that reinforcement of a response immediately after a preceding reinforcement affected responding over only a relatively short period of time since reinforcement. In Exp. 3, two VI schedules with extra short intervals arranged various probabilities of reinforcement at a fixed time early in interreinforcement intervals (Fig. 7 and 8), and a constant-probability VI schedule arranged a fixed probability at various times within interreinforcement intervals (Fig. 11 and 12). These schedules demonstrated that the effect of a given probability of reinforcement could be independent of the time since reinforcement at which it occurred, but another schedule in Exp. 3, the "linear" VI schedule, suggested that the effect of a given probability also depended on its proximity in time to other probabilities. The "linear" VI schedule separated different probabilities widely enough in time to change the relationship between rate of responding and probability of reinforcement (Fig. 9 and 10). These effects, plus data in the literature


on geometric and Fibonacci VI schedules, led to the conclusion that responding is not simply controlled by the probability of reinforcement at a particular time within an interval, but rather by the probability taken over a period of time or, in other words, by the local rate of reinforcement. Limitations on a preliminary formulation of the control by local rates of reinforcement were indicated by Exp. 4, 5, and 6. Experiment 4 examined fixed-interval schedules, Exp. 5 showed in detail the combined effects of two probabilities of reinforcement as a function of their values and their separation in time in two-valued schedules (mixed FI FI), and Exp. 6 demonstrated that reinforcement at the end of the longest interval in a two-valued schedule supported the responding maintained by an earlier opportunity for reinforcement. These experiments suggested that the period of time within which reinforcement could contribute to a particular local rate of responding was large relative to the time since reinforcement. The spread of the effect of reinforcement at one time since reinforcement to local rates of responding at other times could be interpreted in terms of a gradient of temporal generalization. The performance maintained by an FI schedule may reflect such a gradient, but by its nature the FI schedule can provide only one side of such a gradient: up to the time at which reinforcement is made available but not beyond that time. The elimination of reinforcement at the end of the long interval in the two-valued schedules of Exp. 6 might have provided, but in fact did not provide, complete gradients (see the performance of Pigeon 86 in Fig. 19, upper left, for a suggestive example of a gradient of responding around 30 sec since reinforcement in Exp. 5).

Despite the ubiquitous individual differences among pigeons, the monotonically increasing, negatively accelerated form of the input-output function for interval schedules was consistent with many aspects of the data from the several experiments. The form of the function implies that the overall rate of responding maintained by a particular overall rate of reinforcement may be critically determined by the distribution of intervals in a schedule. The overall rate of responding is a weighted average of local rates of responding, and local rates of responding depend on

local rates of reinforcement within intervals. If the distribution of intervals in a schedule is changed while the overall rate of reinforcement is held constant, the decrease in the local rate of reinforcement at one time after reinforcement must be accompanied by an increase at some other time after reinforcement. Because the local rate of responding is a negatively accelerated function of the local rate of reinforcement, the decreased local rate of reinforcement at one time will not necessarily be compensated, in rate of responding, by the increased local rate of reinforcement at some other time.

The dependence of overall rate of responding on the distribution of intervals is most easily demonstrated by the comparison of FI and VI schedules, as illustrated in Fig. 25. The FI schedule includes discriminable periods of time during which the local rate of reinforcement, as inferred from performance, is at or near zero (e.g., the responding of Pigeon 34 between 0 and 90 sec in the FI 240-sec schedule: Fig. 18). Such performance, which results in a large proportion of time when low rates of responding occur during each interval, produces an overall rate of pecking lower than that maintained by a schedule that provides no discriminable periods of nonreinforcement (e.g., the constant-probability VI schedule).

The overall rates of responding maintained

by VI schedules are higher than those maintained by FI schedules that provide the same overall rate of reinforcement, except at 1800 rft/hr (VI or FI 2-sec), when the schedules approach continuous reinforcement and when responding is more appropriately treated as a series of latencies from reinforcement than as a rate. (A reversal may also occur at very low rates of reinforcement. For example, FI 24-hr may maintain higher rates of responding than VI 24-hr. Cf. Morse, 1966.) It seems reasonable to assume that the random-interval (constant-probability) VI schedule, in which the correlation between probability of reinforcement and time since reinforcement is minimal, and the FI schedule, in which the correlation between probability of reinforcement and time since reinforcement is maximal, represent the full range of overall rates of responding, at each overall rate of reinforcement, that can be maintained by interval schedules of reinforcement. The


random-interval and arithmetic VI data are in fair agreement, suggesting that effects of the distribution of intervals on the overall rate of responding are small provided that opportunities for reinforcement are reasonably closely and uniformly spaced along the continuum of time since reinforcement. On the basis of Fig. 25, it is not possible to say whether or not the VI and FI functions have the same form, and the form of the FI function would in any case depend on the way in which it is a derivative of the more fundamental function relating local rates of responding and local rates of reinforcement.
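The role of the negatively accelerated input-output function in the argument above can be illustrated numerically. The particular function below is a generic concave form chosen for illustration only; the paper does not commit to a specific equation, and the constants are arbitrary.

```python
def response_rate(rft_rate, k=100.0, c=50.0):
    # A generic monotonically increasing, negatively accelerated
    # (concave) function of local reinforcement rate; illustrative only.
    return k * rft_rate / (rft_rate + c)

# Two hypothetical schedules with the same overall rate of reinforcement
# (60 rft/hr), spread over two equal portions of each interval:
uniform = 0.5 * response_rate(60) + 0.5 * response_rate(60)
uneven = 0.5 * response_rate(10) + 0.5 * response_rate(110)

# Because the function is concave, the loss in responding where the local
# rate of reinforcement falls is not made up where it rises, so the
# uniform distribution sustains the higher weighted-average response rate.
```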


APPENDIX I: ANALYSIS IN TERMS OF INTERRESPONSE TIMES

A number of accounts of the performances maintained by interval schedules of reinforcement have been concerned with the differential reinforcement of interresponse times, or IRTs (Skinner, 1938; Newman and Anger, 1954; Anger, 1956; Morse, 1966; Shimp, 1967). This section relates the present findings to IRT analyses in a treatment that is an alternative to, but is not necessarily incompatible with, the treatment developed in the main body of the paper.

[Fig. 25 legend: present FI data; present VI data; Schoenfeld and Cumming (1960): FI; Farmer (1963): FI; Farmer (1963): VI (P = 0.50); Farmer (1963): VI (P = 0.25).]

Fig. 25. Rates of pigeons' key-pecking as a function of the rates of reinforcement provided by FI and VI schedules. The present FI data are averaged across four pigeons (Fig. 17). The present VI data are averaged across six pigeons (Fig. 1); only the three VI schedules providing the highest rates of reinforcement were common to all six pigeons, so the average rates of responding on these three schedules were determined first, and the other rates were averaged only after they had been multiplied by a constant to adjust for the differences between birds in absolute levels maintained by the three common schedules. The FI data presented by Schoenfeld and Cumming (1960) come from different groups of two to four pigeons in different experiments (Hearst, 1958; Cumming and Schoenfeld, 1958; Clark, 1959), in all of which intervals were timed from the end of the preceding interval, rather than from the preceding reinforcement, and in which reinforcement, once arranged, was held available only for a time equal to the duration of the FI (limited hold). The FI and VI data from Farmer (1963) are averages across either two or three different pigeons at each point. Farmer arranged random-interval schedules (cf. Discussion, Exp. 3, or Appendix II) in which the probability of reinforcement, P, in each recycling time interval, T, was 1.0 (FI) or a lower value (VI). In each of Farmer's groups, T was constant and P was varied. Data were selected from Farmer's groups so that P was constant and T varied, and therefore the distributions of intervals were comparable within each set of connected points.


An IRT is the time separating two consecutive responses; the first response initiates the IRT, and the second terminates it (technically, the boundaries of an IRT should be defined in terms of response onsets, but response durations will be assumed negligible for the present purposes). An IRT is said to be reinforced when the response that terminates it is reinforced. For a given IRT, therefore, the probability of reinforcement is the probability that responses will be reinforced when they terminate an IRT of this duration.

Within schedules of reinforcement, IRTs

and latencies are sometimes not distinguished, but the distinction may be important. The first response after reinforcement terminates a latency, timed from the end of the reinforcement. This response does not terminate an IRT but, so long as it is not itself reinforced, it initiates the IRT terminated by the next response. In other words, two consecutive responses define the temporal boundaries of an IRT only if the first of the two responses is not reinforced. This logical distinction is consistent with the interpretation of reinforcement as an event that not only maintains responding but also provides a discriminative stimulus for subsequent responding. It may also have a bearing on the special characteristics of the earliest times after reinforcement (see Exp. 2).

The Probability of Reinforcement for an Interresponse Time

Figure 26 illustrates the probabilities with which different IRTs are reinforced in several schedules of reinforcement (cf. Anger, 1956, p. 152; Morse, 1966, p. 69). To emphasize differences among the schedules, the figure shows a considerable range of IRTs; in practice, the left-most portion is usually the most relevant, because the longer IRTs occur relatively infrequently in most schedules.

In an FI schedule, the probability of reinforcement varies linearly with IRT, reaching 1.0 at a duration equal to the fixed interval. Consider, for example, a 50-sec IRT in the FI 100-sec schedule illustrated in Fig. 26. This IRT will be reinforced only if it begins during the last 50 sec of the 100-sec interval, and its probability of reinforcement is therefore 0.5. No IRT can begin 100 or more sec after reinforcement in this schedule, because the

response that would initiate such an IRT would necessarily be reinforced.

This calculation of probabilities of reinforcement in an FI schedule assumes that, for any IRT, its distribution of starting times in the interval is uniform or rectangular, or, in other words, the probability that a given IRT will occur is independent of the time since reinforcement. This assumption usually is not satisfied within FI performances; for example, no 50-sec IRT would ever be reinforced if 50-sec IRTs never began after the first 25 sec of the 100-sec interval. The probabilities of reinforcement in Fig. 26, therefore, may be considered relative frequencies only with respect to all possible starting times of each IRT, and not necessarily with respect to actual relative frequencies in a particular FI performance. This observation imposes limitations on the present treatment, as discussed below, and indicates the importance of comparing recorded distributions of all IRTs with recorded distributions of reinforced IRTs; such data are not available for the present experiments.

When the availability of FI reinforcement

is limited to a specified period of time (limited hold), the function relating probability of reinforcement to IRTs is altered for all IRTs longer than the limited hold, as illustrated in Fig. 26 by the 20-sec limited hold added to FI 100-sec. In that schedule, any IRT between 20 and 100 sec long will be reinforced only if it begins within a particular 20-sec period of time within the 100-sec interval; for these IRTs, therefore, the probability of reinforcement is 0.20. Any IRT of more than 120 sec cannot end before the limited hold is over and so cannot be reinforced. (The effects of reinforcement available in the next interval, after the limited hold is over, have been ignored in this computation.)

The effect of a limited hold on performance is similar to the effect of a ratio schedule (Ferster and Skinner, 1957; Morse, 1966), and in a ratio schedule, as in an interval schedule with limited hold, the probability of reinforcement is constant over a considerable range of IRTs. The variable-ratio and fixed-ratio schedules in Fig. 26 show that when every tenth response on the average is reinforced (VR 10) or when exactly every tenth response is reinforced (FR 10), the probability of reinforcement is 0.10 and is independent of IRT (assuming numerical positions of IRTs within the ratio can be ignored).

An FI schedule provides differential reinforcement for long IRTs, in that the probability of reinforcement is higher for long than for short IRTs. Such differential reinforcement is arranged more explicitly in a DRL (differential-reinforcement-of-low-rate) schedule. In the DRL schedule illustrated in Fig. 26, the probability of reinforcement is zero for IRTs shorter than 100 sec and 1.0 for IRTs equal to or longer than 100 sec.
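The functions in the upper frame of Fig. 26 can be written out directly. In the sketch below (illustrative, and assuming throughout the text's uniform distribution of IRT starting times), the linear decline of the limited-hold function between 100 and 120 sec is inferred from that same assumption rather than stated in the text.

```python
def p_rft_fi(irt, fi):
    # FI: probability of reinforcement grows linearly with IRT,
    # reaching 1.0 at a duration equal to the fixed interval.
    return min(irt / fi, 1.0)

def p_rft_fi_lh(irt, fi, lh):
    # FI with limited hold: a reinforced IRT must end while the
    # reinforcer is held, i.e., between fi and fi + lh sec after the
    # preceding reinforcement. The length of the permissible window
    # of starting times, divided by all possible starting times (fi),
    # gives the probability.
    window = min(fi, fi + lh - irt) - max(0.0, fi - irt)
    return max(0.0, window) / fi

def p_rft_ratio(n):
    # VR n or FR n: every nth response (on the average, or exactly)
    # is reinforced, so the probability is 1/n regardless of IRT.
    return 1.0 / n

def p_rft_drl(irt, t_min):
    # DRL: only IRTs at least t_min sec long are reinforced.
    return 1.0 if irt >= t_min else 0.0
```

Here p_rft_fi(50, 100) reproduces the 0.5 of the text's 50-sec example, and p_rft_fi_lh(60, 100, 20) the 0.20 plateau for IRTs between 20 and 100 sec.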


Fig. 26. Probability of reinforcement as a function of interresponse time in several schedules of reinforcement. The upper frame shows the functions for fixed-interval (FI 100-sec), fixed-interval with limited hold (FI 100-sec LH 20-sec), variable-ratio (VR 10) and fixed-ratio (FR 10), and reinforcement of all responses terminating interresponse times that exceed a minimum value (DRL 100-sec). The lower frame shows the functions for three different types of VI schedules: arithmetic, constant-probability, and geometric.

In VI schedules, the relationship between IRTs and their probabilities of reinforcement depends on the distribution of interreinforcement intervals. Figure 26 (bottom frame) shows illustrative functions for three VI schedules with roughly equal mean intervals: an arithmetic VI schedule (11 intervals from 0 to 200 sec, with an additive constant of 20 sec); a geometric VI schedule (10 intervals from 1 to 512 sec, with a multiplicative constant of 2); and a constant-probability VI schedule (in which, at the end of successive 10-sec periods of time since reinforcement, reinforcement is scheduled with a probability, rft/op, of 0.10; cf. Exp. 3 and Appendix II).

The probabilities of reinforcement for

IRTs in the arithmetic and geometric schedules in Fig. 26 were calculated by dividing all starting times of an IRT such that the IRT would be reinforced by all possible starting times of the IRT. Consider, for example, a 10-sec IRT in the arithmetic schedule. This IRT cannot occur in the 0-sec interval, in which the first response after reinforcement is reinforced. It will be reinforced if it begins during the last 10 sec of any of the 10 other intervals, from 20 to 200 sec; the sum of all reinforced starting times, therefore, is 100 sec. The IRT can begin at any time within an interval; the sum of all possible starting times, therefore, is the sum of all intervals, or 0 + 20 + 40 + ... + 200 = 1100 sec. Thus, the probability of reinforcement for the 10-sec IRT is 100 sec divided by 1100 sec, or 0.091.
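This calculation can be expressed compactly for any IRT and any finite list of intervals. The sketch below assumes, as the text does, a uniform distribution of starting times, and counts the whole interval as reinforced starting time whenever an interval is shorter than the IRT.

```python
def p_rft_vi(irt, intervals):
    # Reinforced starting times (the last `irt` sec of each interval,
    # or the whole interval if it is shorter than the IRT) divided by
    # all possible starting times (the sum of the intervals).
    reinforced = sum(min(irt, t) for t in intervals)
    return reinforced / sum(intervals)

arithmetic = list(range(0, 201, 20))     # 11 intervals, 0 to 200 sec
geometric = [2 ** k for k in range(10)]  # 10 intervals, 1 to 512 sec

# round(p_rft_vi(10, arithmetic), 3) -> 0.091, as in the text
# round(p_rft_vi(10, geometric), 3)  -> 0.073
```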

Correspondingly, the reinforced starting times of a 10-sec IRT in the geometric schedule consist of the 1-, 2-, 4-, and 8-sec intervals plus the last 10 sec of each of the six longer intervals, or 75 sec; all possible starting times consist of the sum of the intervals, or 1 + 2 + 4 + ... + 512 = 1023 sec. Thus, the probability of reinforcement for the 10-sec IRT is 75 sec divided by 1023 sec, or 0.073.

The constant-probability schedule, as specified, does not consist of a finite number of intervals over which all reinforced starting times and all possible starting times of an IRT can be summed. The probabilities of reinforcement for IRTs, however, can be derived from the probability of reinforcement (rft/op) of 0.10 at the end of each 10-sec period of time since reinforcement. For example, any 10-sec IRT must end at or after the end of one 10-sec period and before the end of a second 10-sec period. Its probability of reinforcement, therefore, is 0.10. On the assumption of a uniform distribution of starting times for each IRT, the probability of reinforcement for all IRTs of less than 10 sec increases linearly with IRT from 0 to 0.10. A 5-sec IRT, for example, can begin during the first 5 sec or the last 5 sec of a given 10-sec period, and its probability of reinforcement, therefore, is 0.50 times 0.10, or 0.05.

For IRTs longer than 10 sec, the probability that reinforcement had become available at the end of either of two consecutive 10-sec periods must be taken into account. For a 20-sec IRT, for example, this probability is 0.19: the probability of 0.10 at the end of the first 10-sec period plus the conditional probability, 0.90 times 0.10, at the end of the second 10-sec period. Again, the probabilities of reinforcement increase linearly, from 0.10 for a

10-sec IRT to 0.19 for a 20-sec IRT. (If a 10-sec limited hold were added to the schedule, as in the random-interval schedules of Farmer, 1963, the probability of reinforcement would remain constant at 0.10 for all IRTs longer than 10 sec).

A comparable procedure for calculating probabilities of reinforcement for IRTs could have been used for the arithmetic and geometric schedules, but would have been more complicated because of the different probabilities of reinforcement (rft/op) at each opportunity in those schedules. The VI functions in Fig. 26 appear to be smooth curves, but each is actually made up of linear segments. Changes in slope occur at IRTs equal to the durations of the intervals in a given schedule.

For IRTs up to about 40 sec, the probabilities of reinforcement in Fig. 26 are slightly higher in the constant-probability schedule than in the arithmetic schedule. For longer IRTs, the probabilities become higher in the arithmetic than in the constant-probability schedule. The probabilities in the geometric schedule are consistently the lowest. As mentioned above, the shorter IRTs are most significant in analyzing performance because the longer IRTs occur relatively infrequently.

The comparisons in Fig. 26 may imply that overall rates of responding maintained by a given overall rate of VI reinforcement should be slightly higher in constant-probability schedules than in arithmetic schedules and lowest in geometric schedules, but they neglect the possible role of different starting times of IRTs. Thus, they do not contribute to an account of how local rates of responding increase with time since reinforcement in arithmetic schedules, decrease in geometric schedules, and remain roughly constant in constant-probability schedules (Exp. 3).
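The constant-probability values worked out above (0.05 for a 5-sec IRT, 0.10 for a 10-sec IRT, 0.19 for a 20-sec IRT) follow from a single general expression. The sketch below, with an illustrative function name, assumes the same uniform distribution of starting times:

```python
def p_constant_prob(irt, T=10.0, P=0.10):
    """Probability of reinforcement for an IRT in a constant-probability
    schedule that arranges reinforcement with probability P at the end
    of each T-sec period of time since reinforcement."""
    k = int(irt // T)       # period boundaries the IRT is certain to cross
    frac = (irt % T) / T    # chance of crossing one additional boundary
    p_none = (1 - P) ** k   # no reinforcement set up at the k certain boundaries
    return (1 - p_none) + frac * p_none * P

print(round(p_constant_prob(5), 3))   # 0.05
print(round(p_constant_prob(10), 3))  # 0.1
print(round(p_constant_prob(20), 3))  # 0.19
```

For IRTs up to 2T the expression reduces to the linear segments described in the text; with a limited hold of T sec added, the probability would instead stay fixed at P for all IRTs longer than T.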

The Probability of Reinforcement for Interresponse Times as a Function of Time Since Reinforcement

For an IRT that begins within a particular period of time since reinforcement in an interval schedule, its probability of reinforcement is given by two independent probabilities: the probability that the IRT will end at or after an opportunity for reinforcement and the probability of reinforcement at that opportunity (rft/op). The former can be calculated on the assumption of a uniform distribution of starting times of the IRT within the period of time considered; the latter can be calculated from the distribution of intervals in the schedule.

With time since reinforcement as a parameter, Fig. 27 shows the probability of reinforcement for IRTs in the three VI schedules and the FI schedule of Fig. 26. The periods of time represented for the arithmetic and geometric VI schedules are those between successive opportunities for reinforcement.
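The two-factor computation just described can be sketched as follows, using the FI 100-sec schedule of Fig. 27 as an example (the function and argument names are illustrative):

```python
def p_irt(irt, start_lo, start_hi, opportunity, rft_per_op=1.0):
    """Probability of reinforcement for an IRT starting uniformly within
    the period [start_lo, start_hi) sec since reinforcement, given the
    next opportunity for reinforcement at `opportunity` sec and the
    probability of reinforcement at that opportunity (rft/op)."""
    if start_hi + irt <= opportunity:    # the IRT always ends before the opportunity
        p_ends_at_or_after = 0.0
    elif start_lo + irt >= opportunity:  # the IRT always reaches the opportunity
        p_ends_at_or_after = 1.0
    else:                                # uniform distribution of starting times
        p_ends_at_or_after = (start_hi + irt - opportunity) / (start_hi - start_lo)
    return p_ends_at_or_after * rft_per_op

# FI 100-sec (rft/op = 1.0), IRTs starting within the first 20 sec:
print(p_irt(80, 0, 20, 100))   # 0.0
print(p_irt(90, 0, 20, 100))   # 0.5
print(p_irt(100, 0, 20, 100))  # 1.0
```

For this period the probability is 0 for all IRTs shorter than 80 sec and rises linearly to 1.0 at an IRT of 100 sec, matching the description of the FI functions given for Fig. 27.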

In the constant-probability VI schedule, the probability of reinforcement at each opportunity is, by definition, independent of whether reinforcement became available at the previous opportunity. The probabilities of reinforcement for IRTs, therefore, also remain independent of time since reinforcement. (For the constant-probability schedule illustrated, this is true as long as the probabilities are calculated over periods of at least 10 sec. For example, if the first and second 5 sec after reinforcement were considered separately, the function for the first 5 sec would be displaced to the right in Fig. 27, to begin at an IRT of 5 sec; the function for the second 5 sec would be the same as that in the figure.)

Within each period of time in the arithmetic schedule, the probability of reinforcement increases linearly from 0 to 1.0 with increasing IRT. The later the time since reinforcement, the steeper the function. In other words, for a given IRT (vertical cut through the functions), the probability of reinforcement increases as time passes since reinforcement. Or, for a given probability of reinforcement (horizontal cut through the functions), the IRT reinforced with that probability becomes shorter as time passes since reinforcement.

The broken line superimposed on the arithmetic-schedule functions shows the effect of adding three extra 20-sec intervals to the schedule. From 0 to 20 sec after reinforcement, the probabilities of reinforcement for IRTs up to 20 sec long become almost as high


Fig. 27. Probability of reinforcement as a function of interresponse time, with periods of time since reinforcement as a parameter, in four different interval schedules: constant-probability, arithmetic, and geometric VI schedules and an FI schedule. The dotted line in the second frame shows the effect of adding extra short intervals to the arithmetic VI schedule. Details in text.

as the later probabilities of reinforcement for these IRTs within the period from 140 to 160 sec after reinforcement.

In the geometric schedule, the functions for most periods of time since reinforcement are concave downward, and are lower the longer the time since reinforcement (the function for 256-to-512 sec, omitted from Fig. 27, is a straight line; up to an IRT of 128 sec, it corresponds to the function for 32-to-64 sec). In general, the probability of reinforcement for a given IRT decreases as time passes since reinforcement, or the IRT reinforced with a given probability becomes longer as time passes since reinforcement.

For successive 20-sec periods of time since reinforcement in an FI 100-sec schedule, the functions are linear and parallel. Within the first 20 sec after reinforcement, for example, the probability of reinforcement is 0 for all IRTs of less than 80 sec. The probability then rises linearly to 1.0 for an IRT of 100 sec (cf. Morse, 1966, Fig. 3 and 4, pp. 70-71).

The schedules in Fig. 27 are illustrative, but the directions of change in the probabilities of reinforcement for IRTs as time passes since reinforcement are characteristic of the four classes of interval schedules. The choice of the periods of time over which probabilities were calculated was based in part on ease of computation, but also on the fact that further subdivision of periods of time within the arithmetic and geometric schedules would have produced functions in which the probability of reinforcement was zero for some range of IRTs. In the geometric schedule, for example, the subdivision of 128-to-256 sec into two equal periods would have produced a function in which, for 128-to-192 sec, the probability of reinforcement was zero for all IRTs less than 64 sec. This schedule, however, would probably maintain responding at a moderate rate throughout this period of time.

The problem of choosing periods of time over which probabilities of reinforcement can be calculated for IRTs is analogous to the problem of choosing periods of time within which to calculate local rates of reinforcement (Discussions, Exp. 3 and 6). The similarity of the two problems is illustrated by their common concern with the earliest times since reinforcement, but the treatment in terms of IRTs has the advantage that the difficulty can be related to an observed discrepancy between assumptions and data. Probabilities of reinforcement for IRTs are based on the assumption of a uniform distribution of starting times for each IRT, but local rates of responding, and therefore IRTs, are changing most rapidly during the earliest times after reinforcement (see Exp. 2 and 3). Thus, the difference between probabilities of reinforcement for IRTs, in Fig. 27, and recorded relative frequencies of reinforced IRTs, from a performance, is likely to be greatest at the earliest times since reinforcement (e.g., 0-to-20 sec in the arithmetic schedule in Fig. 27). Another advantage of the treatment in terms of IRTs is that no assumption is made that the rate of responding at one opportunity is high enough to produce an available reinforcement before the time is reached for the next opportunity (see Exp. 3).

In any case, if IRTs become shorter and rates of responding increase as the probabilities of reinforcement for IRTs increase, then the functions in Fig. 27 agree in a general way with the performances maintained by each schedule (Exp. 3): in a constant-probability VI schedule, local rate of responding remains roughly constant over time since reinforcement; in an arithmetic VI schedule, local rate of responding increases, but the addition of extra short intervals produces a relatively high local rate shortly after reinforcement; in a geometric VI schedule, local rate of responding decreases; and in an FI schedule, local rate of responding increases and very long IRTs often occur early in the interval.

The Relationship Between Observed Rates of Responding and the Probabilities of Reinforcement for Interresponse Times

Interresponse times and rate of responding are reciprocally related. For each overall and local rate of responding, there is a corresponding average IRT. The relationship between overall and local rates of responding, therefore, can be expressed in terms of average IRTs. In Exp. 3, overall rates of responding maintained by different overall rates of reinforcement in arithmetic VI schedules were compared with local rates of responding maintained by different local rates of reinforcement within intervals of various VI schedules (Fig. 15). Figure 28 illustrates three procedures for making this comparison, with the data from Pigeon 118 (Fig. 1, Exp. 1) as an example.

The top frame of Fig. 28 shows the procedure used to derive the open circles in Fig. 15, Exp. 3. The data, average overall rates of responding obtained at each overall rate of reinforcement, were connected by straight lines, the function was extrapolated linearly to zero, and rates of responding corresponding to particular rates of reinforcement were read directly from the graph, as illustrated. The middle frame shows the same procedure, except that the ordinate has been converted

Fig. 28. Data from Pigeon 118 (Fig. 1) are replotted to illustrate three techniques for estimating local rates of responding in VI schedules. Dotted lines show estimations from 20 and 100 reinforcements per hour (rft/hr). Details in text.

from rate of responding to average IRT. Although average IRTs corresponding to different rates of reinforcement can be read directly from the graph, the abscissa does not lend itself to comparison with the probability-of-reinforcement functions for IRTs in Fig. 27.


The bottom frame of Fig. 28 illustrates a procedure that makes this comparison possible. Each of the straight lines, with overall rate of reinforcement (rft/hr) as a parameter, represents the probabilities of reinforcement for IRTs within a particular arithmetic VI schedule. The figure represents the schedules for Pigeon 118 in Exp. 1 (heavy lines) and a sample of other schedules (light lines). The functions are linear because longer IRTs are excluded; specifically, none of the functions extends beyond an IRT equal to the shortest non-0-sec interval in that VI schedule. A change in the overall rate of reinforcement provided by the schedule produces a corresponding change in the slope of the function. For example, when the overall rate of reinforcement is doubled (e.g., 20 to 40 rft/hr), the slope doubles.

The data for Pigeon 118 are plotted as intersections of the probability-of-reinforcement function for a given schedule and the average IRT maintained by that schedule. When the data are connected by straight lines, the average IRT maintained by other arithmetic VI schedules can be read from the graph, as illustrated for schedules providing 20 and 100 rft/hr. In addition, the average IRT at a particular time since reinforcement in a given schedule can be compared with the average IRT maintained by a given overall rate of reinforcement by superimposing the data function for Pigeon 118 in the lower frame of Fig. 28 on the appropriate probability-of-reinforcement functions, such as those in Fig. 27, for different times since reinforcement in a particular schedule.

The results of such a procedure, with respect to both general relationships and individual differences, do not differ much from the other two procedures, after average IRTs are converted to rates of responding. The major deviations are in those instances in which it is necessary to extrapolate beyond the range of the overall rates of reinforcement arranged for a given pigeon (cf. Pigeon 278, Fig. 15). In other words, the deficiencies of the analysis in terms of local rates of reinforcement (e.g., the problem of the earliest times after reinforcement) are not eliminated when the analysis is carried out in terms of the probabilities of reinforcement for IRTs. Nevertheless, the extent to which the deficiencies in the IRT analysis can be related to oversimplifications in the underlying assumptions (in particular, that of a uniform distribution of starting times of IRTs) suggests that the IRT analysis can serve as a useful and perhaps a preferable alternative to the analysis in terms of local rates of reinforcement. In addition, data are available on the form of the distribution of IRTs maintained by VI schedules (e.g., Anger, 1956; Farmer, 1963), and, although the average IRT does not provide information about the shape of the IRT distribution, the relationship between rates of responding and IRT distributions is more likely to be clarified if both are expressed in the same dimension, as IRTs.

Implications of Interresponse-Time Analyses

The data function in the bottom frame of Fig. 28 shows that the average IRT decreases as the slope of the probability-of-reinforcement function increases. The average IRT, however, does not change in such a way that its probability of reinforcement remains constant (Morse and Herrnstein, 1955); the probability of reinforcement increases as the average IRT decreases (e.g., from the point on the 33-rft/hr function in Fig. 28 to that on the 79-rft/hr function). It appears that the function, if extrapolated, would asymptotically approach a probability of 0 with increasing IRT, and would reach a finite IRT at a probability of 1.0 (paradoxically, this probability corresponds to a schedule of continuous reinforcement, in which, according to the present account, latencies but no IRTs can occur).

The higher probabilities of reinforcement at which shorter average IRTs are maintained suggest some sort of reciprocal relationship between reinforcement and IRTs. Reinforcement increases responding, and therefore shortens IRTs. Other effects, perhaps including effort and fatigue, decrease responding and therefore lengthen IRTs. Such an account has considerable precedent: in balancing these two opposing factors, the pigeon appears to compromise between obeying the Law of Effect and obeying the Law of Least Effort.

A number of factors, however, may operate to shorten or lengthen IRTs. To speak simply of a reinforced IRT is convenient, but reinforcement has several effects, and some may be antagonistic. The fundamental effect


of reinforcement, and its defining characteristic, is that it enhances the organism's tendency to emit the reinforced response. Reinforcement of responses, regardless of their associated IRTs, tends to shorten IRTs (such phenomena as the high frequencies of short IRTs in DRL performances, e.g., Staddon, 1965, may be an example of this effect of reinforcement).

The tendency for IRTs to shorten even in schedules of reinforcement in which long IRTs are differentially reinforced (e.g., VI schedules: Anger, 1956) has prompted the suggestion that short IRTs are more susceptible to reinforcement than long IRTs (Millenson, 1966). This view seems contradicted by the fact that, in Fig. 28, shorter average IRTs are maintained only at higher probabilities of reinforcement. An alternative account of the tendency for IRTs to shorten is based on another characteristic of reinforcement, its effectiveness even after a delay (cf. Dews, 1960). For example, consider the effect of reinforcement of a 10-sec IRT, and compare it with the effect of the reinforcement of the last of five 2-sec IRTs. In both cases, a response at the end of a 10-sec period of time is followed by immediate reinforcement. In the former case, however, one other response is reinforced incidentally with a delay of 10 sec, whereas in the latter case, five other responses are reinforced incidentally with delays of 10, 8, 6, 4, and 2 sec. It seems reasonable to assume that the reinforcement of more responses within a fixed period of time, even though the reinforcement of some responses is both delayed and incidental, will be more likely to increase subsequent responding or, in other words, to shorten IRTs. (This characteristic of reinforcement may be relevant to the development of short IRTs in other schedules that do not differentially reinforce short IRTs, such as concurrent DRL schedules: Malott and Cumming, 1964, 1966; ratio schedules: Millenson, 1966; and stochastic schedules: Weiss and Laties, 1964; Blough, 1966.)

Another factor that may tend to shorten

IRTs is that long IRTs can produce decreases in the overall rate of reinforcement. When reinforcements are made available during long IRTs, the long IRTs add to the minimum interreinforcement interval. In most interval schedules, however, this factor is usually negligible (except perhaps during acquisition), because the difference between the minimum and the actual interreinforcement intervals is likely to be small.

The lengthening of IRTs may depend on such potential factors as effort or fatigue, but reinforcement itself may also contribute. In interval schedules, reinforcement favors long IRTs because the probability of reinforcement increases with IRT. To the extent that the temporal spacing of responses comes under the control of differential reinforcement, IRTs lengthen and, as a consequence, the number of responses per reinforcement decreases. The complication is that when reinforced IRTs are long, the tendency of reinforcement to shorten IRTs antagonizes the effects of the differential reinforcement of long IRTs. (This complication arises more obviously in DRL performances, in which IRTs too short for reinforcement sometimes preponderate.)

The final interval-schedule performance may emerge as a compromise between antagonistic effects of reinforcement. Reinforcement tends to shorten IRTs, directly and perhaps through an effect of delay of reinforcement. It also tends to lengthen IRTs, through the control produced by the higher probability of reinforcement for long IRTs. As IRTs become longer or shorter, one or the other effect of reinforcement may predominate, but the interaction that comes about because IRTs and their probabilities of reinforcement covary in interval schedules generates a balance between the effects that is reflected in the average IRT maintained for a given pigeon at a given time within a given schedule.

This analysis of the performances maintained by interval schedules of reinforcement treats them in terms of a process: the interaction of IRTs and their probabilities of reinforcement as a function of time since reinforcement. The analysis emphasizes the variables that come into direct contact with behavior, rather than the variables specified in the arrangement of schedules (cf. Schoenfeld, Cumming, and Hearst, 1956). According to this analysis, the power of positive reinforcement lies in its capacity to control not only the occurrence of responses, but also their temporal relationship to other responses and to events such as reinforcement. These temporal constraints, imposed on performance because the differential reinforcement of IRTs is different within each schedule and at different times within the same schedule, may bear on the relative insensitivity of interval-schedule performances to some variables (e.g., magnitude of reinforcement: Catania, 1963b). Sensitivity to such variables is more likely to be obtained with nontemporal measures of performance (e.g., the proportion of changeovers from one interval schedule to another when the schedules operate concurrently: Catania, 1966).

Schedules can be designed either to minimize or to maximize temporal constraints (cf. "synthetic" VI schedules, Newman and Anger, 1954; stochastic reinforcement of IRTs, Weiss and Laties, 1964; reinforcement of "least-frequent" IRTs, Blough, 1966), but eliminating or establishing constraints with respect to some variables is bound to affect constraints with respect to others. In other words, no particular schedule of reinforcement manipulates "response strength"; rather, it controls a particular sample of the various properties of responding.

APPENDIX II: CONSTANT-PROBABILITY VARIABLE-INTERVAL SCHEDULES

A constant-probability VI schedule is one with a minimal correlation between probability of reinforcement and the time since the last reinforcement. In other words, a constant-probability VI schedule provides that time since reinforcement cannot acquire discriminative control over responding through its relationship to the availability of subsequent reinforcement. This condition may be a prerequisite for local rates of responding that do not change with the passage of time since reinforcement. The condition is obviously not satisfied by an FI schedule, which makes reinforcement available at the same time in every interval; it is also not satisfied by a variety of standard VI schedules, including the arithmetic and the geometric (Exp. 3). The present section considers the design of constant-probability VI schedules, a problem significant for the technology of behavior because a constant rate of responding provides a useful baseline against which to assess the effects of many variables.

Two methods of designing constant-probability VI schedules will be considered. One, illustrated by the random-interval schedules of Farmer (1963) and Millenson (1963) and by the constant-probability schedule of Exp. 3, holds constant the separation in time of successive opportunities for reinforcement while varying the relative frequencies of different intervals. The other, illustrated by the schedules of Fleshler and Hoffman (1962; see Exp. 3, Discussion) and by a modified schedule described below, holds constant the relative frequencies of the different intervals while varying the separation in time of successive opportunities for reinforcement.

The random-interval schedules of Farmer and Millenson arranged a constant, recycling time interval, T. Within each T-sec interval, the first response was reinforced with a probability, P, corresponding to the statistic, reinforcements per opportunity. The timing of the T-sec intervals was not interrupted during reinforcement, so that a 0-sec interval (reinforcement of the first response after a reinforcement) was possible if T was less than the duration of reinforcement. As arranged by Farmer, the schedules also included a limited hold: a reinforcement made available within one T-sec interval was not kept available beyond the end of that interval.

Farmer studied a range of T from 1 to 60 sec, and a range of P from 0.0052 to 1.0 (when P equaled 1.0, these schedules corresponded to FI schedules). Cumulative records showed that rate of responding was roughly constant over time since reinforcement at only some combinations of T and P. The deviations can be attributed to at least three factors: the limited hold, particularly when T equaled 1 sec; the time to the first opportunity for reinforcement when T was large (30 or 60 sec), which produced long pauses after reinforcement (cf. constant-probability VI 379-sec for Pigeon 278 in Fig. 11, Exp. 3); and, trivially, the FI character of the schedules when P equaled 1.0.
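A random-interval schedule of this type (with the limited hold omitted) can be sketched as follows; the function name and the particular values of T and P are illustrative:

```python
import random

def random_interval(T, P, rng):
    """One interreinforcement interval from a random-interval schedule:
    at the end of each T-sec period, reinforcement is set up with
    probability P (reinforcements per opportunity)."""
    t = 0.0
    while True:
        t += T
        if rng.random() < P:
            return t

rng = random.Random(1)
intervals = [random_interval(10.0, 0.10, rng) for _ in range(20000)]
mean = sum(intervals) / len(intervals)
# The mean interval approaches T/P = 100 sec (a rate of reinforcement of
# P/T); the nth shortest possible interval, n*T sec, has the relative
# frequency P*(1 - P)**(n - 1) discussed below.
```

Because each opportunity is independent of the last, the intervals follow a geometric distribution over multiples of T, which approaches the exponential distribution given later in this appendix as T and P become small.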

Millenson chose 4 sec as an optimal value of T, and arranged schedules with P equal to 0.0667 and 0.0183. In cumulative records, local rates of responding appeared roughly constant over time since reinforcement, although one of three pigeons showed systematically low rates of responding for some time after reinforcement when P equaled 0.0667, and all three pigeons showed cyclic short-term alternations between high and low rates when P equaled 0.0183.

The average rate of reinforcement in a

random-interval schedule equals P/T. As P and T become small (see Millenson, 1963), the distribution of interreinforcement intervals approaches the exponential distribution:

f(t) = [e^(-t/t̄) / t̄] dt,

where t is the duration of an interval, f(t) is the relative frequency of the interval, t̄ is the mean interval, and e is the base of natural logarithms. The relative frequencies of the discrete intervals in the following distribution with T equal to 10 sec and P equal to 0.10 provide one approximation to the continuous distribution described by the equation (intervals are shown in parentheses): 0.100 (10-sec), 0.090 (20-sec), 0.081 (30-sec), 0.073 (40-sec), 0.066 (50-sec), 0.059 (60-sec), 0.053 (70-sec), 0.048 (80-sec), 0.043 (90-sec), 0.039 (100-sec), and so on. In this sequence, the exact relative frequency of t_n, the nth interval with intervals ranked in order of duration, equals P(1 - P)^(n-1).

The constant-probability schedule of Exp.

3 provided a distribution of intervals similar to the distribution with P equal to 0.10 in the random-interval schedules of Farmer and Millenson. Each interval was an integral multiple of the minimum interval, t (see Table 2 and Fig. 11, Exp. 3). The schedule differed from the random-interval schedules in that the sequence of intervals within each session was predetermined by a punched tape. Consequently, the schedule specified a longest interval at the end of which the probability of reinforcement necessarily became 1.0.

Both the random and predetermined methods of arranging constant-probability schedules have certain advantages. In a random-interval schedule, an interval of any multiple of T-sec is possible, though less likely the larger the multiple. The probability of reinforcement never becomes 1.0 except in the ex post facto sense that there will always have been a longest interval when the relative frequencies of different intervals are tabulated at the end of a particular session.

Under some circumstances, however, a predetermined sequence of intervals may be preferable to a randomly generated sequence. A random generator will occasionally (and unpredictably) produce a long, locally regular sequence, which, through a temporary local effect on the rate of reinforcement or on the correlation between reinforcements and time since reinforcement, may have significant effects on performance, especially if it occurs in the early stages of acquisition. A predetermined sequence (for example, in the form of a loop of punched tape) not only avoids this possibility, but also may simplify data collection because the experimenter can predict the number of times the organism will reach various times since reinforcement within a particular experimental session.

A satisfactory sequence of intervals in

which each interval is an integral multiple of the minimum interval, however, is necessarily long (e.g., the 60-interval sequence in Table 2). With the usual VI programmer (e.g., Ralph Gerbrands Co.), such a sequence requires excessively long tapes and produces the problem of tape breakage and tangling. A desirable sequence for many applications, therefore, would be short and yet would retain the basic characteristics of a constant-probability VI schedule.

The method proposed by Fleshler and Hoffman (1962) for generating a sequence of intervals roughly satisfies these requirements. Their progression of intervals is described by the equation:

t_n = t̄ [1 + ln N + (N - n) ln (N - n) - (N - n + 1) ln (N - n + 1)],

where t_n and t̄ are, again, the durations of the nth and the mean intervals respectively, N is the total number of intervals, and ln represents the natural (base e) logarithm. The equation is derived from the exponential distribution (cf. Discussion, Exp. 3). In effect, as the probability of reinforcement increases from one opportunity to the next (Fig. 14), the temporal separation of successive opportunities increases in such a way that the probability of reinforcement per unit of time (in other words, the local rate of reinforcement) remains roughly constant. In discussing the problem that this distribution provides reinforcement at discrete points in time and the probability of reinforcement at other times is zero, Fleshler and Hoffman state:

"This difficulty would be insurmountable if organisms had perfect temporal discrimination. The fact that they do not means that the effects of rf [reinforcement] at a given point in time will spread to nearby points in time (at least within the difference limen). If the differences between successive terms in the progression were sufficiently small so that within the schedule context, discrimination between these terms were poor, the effective probability distribution would be continuous and would approximate the theoretical distribution" (Fleshler and Hoffman, 1962, p. 530).
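The progression defined by the equation above can be sketched and checked numerically, under the usual convention that x ln x vanishes as x approaches 0 (so the last term drops out for n = N); the function name is illustrative:

```python
import math

def fleshler_hoffman(N, mean):
    """Intervals t_n = mean * [1 + ln N + (N-n) ln(N-n) - (N-n+1) ln(N-n+1)]."""
    def xlnx(x):
        # Convention: x * ln(x) -> 0 as x -> 0.
        return x * math.log(x) if x > 0 else 0.0
    return [mean * (1 + math.log(N) + xlnx(N - n) - xlnx(N - n + 1))
            for n in range(1, N + 1)]

ts = fleshler_hoffman(20, 100)
print(round(ts[0], 1), round(ts[-1], 1))  # 2.5 399.6
print(round(sum(ts) / len(ts), 6))        # 100.0
```

Because the logarithmic terms telescope when the intervals are summed, the mean of the sequence equals t̄ exactly; the first and last terms agree with the sample sequence discussed below.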

This schedule, as arranged by Chorney (1960), maintained a relatively constant local rate of responding over most of the range of time since reinforcement, thus supporting the assumption that this and the preceding constant-probability VI schedules are equivalent and demonstrating the importance of the separation of different opportunities for reinforcement along the continuum of time since reinforcement. Chorney's finding of a higher rate of responding shortly after reinforcement than later within intervals, however, prompts a detailed examination of the early terms of the progression.

A sample sequence of intervals from the progression is the following (20 intervals, mean = 100 sec): 2.5, 7.7, 13.5, 19.3, 25.5, 32.2, 39.3, 47.0, 55.5, 64.5, 74.5, 85.6, 98.2, 112.5, 129.2, 149.4, 174.6, 208.6, 260.9, and 399.6 sec. Local rates of reinforcement are approximately one reinforcement per 100 sec at all opportunities except the first (2.5 sec, at which the local rate of reinforcement is about 35% higher) and the last (399.6 sec, at which the local rate of reinforcement is about 30% lower). The relatively high early local rate of reinforcement might account for Chorney's finding, but this deviation alone cannot be taken too seriously, because the computation of local rates of reinforcement is most arbitrary at early and late times after reinforcement.

The progression can also be evaluated in

terms of interresponse times. Consider a 2.5-sec IRT that begins either within the first 2.5 sec after reinforcement or between 2.5 and 7.7 sec. If the IRT begins within 2.5 sec, its probability of reinforcement is 0.050, because it must end at least 2.5 sec after reinforcement and because reinforcement is arranged at this time in one of every 20 intervals. If this IRT begins between 2.5 and 7.7 sec, however, its probability of reinforcement is 0.053 (reinforcement at 7.7 sec in one of 19 intervals) multiplied by the probability of 0.48 that the IRT will end after 7.7 sec (see Appendix I). This probability equals 0.0252, or about half the earlier probability, and the probability remains at roughly this value through most of the remaining time since reinforcement. In light of Chorney's findings, therefore, its higher value shortly after reinforcement may be significant. The modification suggested below provides a progression similar to Fleshler and Hoffman's, but takes into account the probabilities of reinforcement for IRTs at different times since reinforcement.
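The arithmetic above can be reproduced directly. A sketch, assuming as in Appendix I that the IRT's starting time is distributed uniformly between the two opportunities (the variable names are ours):

```python
# Probability that a 2.5-sec IRT starting between successive opportunities
# at 2.5 and 7.7 sec after reinforcement is reinforced, assuming a uniform
# distribution of starting times over that span (see Appendix I).
irt = 2.5
t1, t2 = 2.5, 7.7            # two successive opportunities for reinforcement
p_opportunity = 1 / 19       # reinforcement at 7.7 sec in one of 19 intervals
# The IRT ends after 7.7 sec only if it starts later than 7.7 - 2.5 = 5.2 sec,
# i.e., in the last 2.5 sec of the 5.2-sec span of possible starting times:
p_ends_after = irt / (t2 - t1)                # 2.5 / 5.2, about 0.48
p_reinforced = p_opportunity * p_ends_after   # about 0.025
```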

In a sequence of intervals such that all intervals occur with the same relative frequency, the probability of reinforcement at the end of a given interval, t_n, is given by the reciprocal of the number of intervals equal to or greater than t_n; that is to say, reinforcements per opportunity grows with time since reinforcement as the reciprocal of the number of intervals ending at or after the given time since reinforcement. But given that a particular IRT is shorter than the time between successive opportunities, the probability of reinforcement of the IRT is directly proportional to reinforcements per opportunity (see Appendix I). Thus, to hold constant the probability of reinforcement for a given IRT, the increments in the durations of successive intervals in the progression must grow directly with reinforcements per opportunity.

For example, in a progression of 20 intervals, the probability of reinforcement at the end of the shortest interval, t1, is 0.050, and so the probability of reinforcement for an IRT of t1 sec is also 0.050. But the probability of reinforcement (rft/op) at t2, at the end of the next shortest interval, is 0.053 (1/19). The duration of the increment added to t1, therefore, must be 1.053 (20/19) times t1 if the probability of reinforcement for this IRT at t2 is to be held equal to 0.050. Correspondingly, the increment added to t2 must be 1.056 (19/18) times the increment added to t1, and so on. A general progression that satisfies these requirements, and so holds constant the probability of reinforcement for any IRT less than or equal to t1 sec, is:


t_n = \sum_{i=1}^{n} \frac{\bar{t}}{N + 1 - i},

where the symbols are the same as those in the Fleshler and Hoffman equation. The following sequence of 20 intervals with a mean of 100 sec is an example of the progression: 5.0, 10.3, 15.8, 21.7, 28.0, 34.6, 41.8, 49.4, 57.8, 66.9, 76.9, 88.0, 100.5, 114.8, 131.4, 151.4, 176.4, 209.8, 259.8, and 359.8 sec. This progression, with a shortest interval longer than that in the Fleshler and Hoffman progression, is likely to generate relatively lower rates of responding shortly after reinforcement than were observed in Chorney's experiment.

With the qualification of a uniform distribution of starting times for a particular IRT (see Appendix I), the sequence holds exactly constant the probability of reinforcement for any IRT less than or equal to the duration of the shortest interval. The sequence also incidentally holds local rates of reinforcement roughly constant at one reinforcement per 100 sec, with the sole exception of the last opportunity (end of the longest interval), when the local rate of reinforcement is higher.
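The modified progression can be generated by accumulating increments of t̄/(N + 1 − i), each proportional to reinforcements per opportunity. A sketch (the function name is ours), which also makes it easy to confirm that the sequence's mean equals t̄:

```python
def constant_probability_progression(N, t_bar):
    """Intervals t_n = sum over i = 1..n of t_bar / (N + 1 - i): each
    increment grows with reinforcements per opportunity, 1 / (N + 1 - i)."""
    intervals, total = [], 0.0
    for i in range(1, N + 1):
        total += t_bar / (N + 1 - i)
        intervals.append(total)
    return intervals

seq = constant_probability_progression(20, 100)  # 20 intervals, 100-sec mean
```

For N = 20 and t̄ = 100 this reproduces the printed sequence (5.0, 10.3, 15.8, ..., 359.8 sec) to within rounding.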

In practice, the performances maintained by the Fleshler and Hoffman schedule and by the present modification would probably not be appreciably different at any but the shortest times after reinforcement. Both schedules, as short sequences with the approximate characteristics of constant-probability VI schedules, have been in laboratory use and, on inspection of cumulative records, appear to maintain fairly constant local rates of responding over most of the range of time since reinforcement.

Additional study may suggest further refinement of the schedules. For example, if the earliest times after reinforcement have special characteristics, it may be desirable to find out whether a 0-sec interval could be included in the sequence without excessively raising the local rate of responding shortly after reinforcement (see Exp. 2).

The above progressions provide no information about the maximally effective order of the intervals that make up a constant-probability VI schedule. Sequential properties can produce systematic local changes in the rate of responding. For example, if short intervals are always followed by long intervals, the local rate of responding may become relatively low immediately after reinforcement at the end of a short interval. Only informal data on the effects of the order of intervals are available. One schedule (colloquially, the golden tape) was developed over a period of years by several investigators at the Harvard Pigeon Laboratories. One feature that persisted among several variations in the schedule was that the two shortest intervals were separated by exactly two of the intermediate intervals. One version of the schedule is the following (15 intervals, mean of 180 sec): 560, 60, 220, 5, 140, 120, 5, 260, 500, 60, 300, 20, 60, 350, and 140 sec. The order of the intervals was assumed to contribute to the schedule's success in maintaining roughly constant local rates of responding with only minor sequential effects (as observed in cumulative records). It is interesting to note that this schedule and a similar one designed by Anger (1956) had the property that local rates of reinforcement at successive opportunities, though not constant, varied considerably less than in arithmetic or geometric VI schedules.
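Reinforcements per opportunity — defined earlier as the number of intervals ending at a given time divided by the number at least that long — can be tabulated for the golden tape. The helper below is illustrative only; converting its output into local rates of reinforcement would further require taking the spacing between opportunities into account:

```python
from collections import Counter

def rf_per_opportunity(intervals):
    """Reinforcements per opportunity at each distinct time since
    reinforcement: (# intervals ending at t) / (# intervals >= t)."""
    counts = Counter(intervals)
    remaining = len(intervals)
    probs = {}
    for t in sorted(counts):
        probs[t] = counts[t] / remaining
        remaining -= counts[t]
    return probs

golden_tape = [560, 60, 220, 5, 140, 120, 5, 260, 500, 60,
               300, 20, 60, 350, 140]
probs = rf_per_opportunity(golden_tape)
```

For instance, reinforcement occurs at 5 sec in 2 of the 15 intervals (2/15), at 20 sec in 1 of the 13 remaining (1/13), and with certainty at 560 sec, the end of the longest interval.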

REFERENCES

Anger, D. The dependence of interresponse times upon the relative reinforcement of different interresponse times. Journal of Experimental Psychology, 1956, 52, 145-161.

Autor, S. M. The strength of conditioned reinforcers as a function of frequency and probability of reinforcement. Unpublished doctoral dissertation, Harvard University, 1960.

Blough, D. S. The reinforcement of least-frequent interresponse times. Journal of the Experimental Analysis of Behavior, 1966, 9, 581-592.

Catania, A. C. Concurrent performances: Reinforcement interaction and response independence. Journal of the Experimental Analysis of Behavior, 1963, 6, 253-263. (a)

Catania, A. C. Concurrent performances: A baseline for the study of reinforcement magnitude. Journal of the Experimental Analysis of Behavior, 1963, 6, 299-300. (b)

Catania, A. C. Concurrent operants. In W. K. Honig (Ed.), Operant behavior: areas of research and application. New York: Appleton-Century-Crofts, 1966. Pp. 213-270.

Catania, A. C. and Reynolds, G. S. A quantitative analysis of the behavior maintained by interval schedules of reinforcement. Paper presented at the meeting of the Psychonomic Society, Bryn Mawr, 1963.

Chorney, H. Variable-interval schedules: the behavioral consequences of the probability of reinforcement as a function of time since reinforcement. Unpublished Master's thesis, Pennsylvania State University, 1960.


Clark, F. C. The effect of deprivation and frequency of reinforcement on variable-interval responding. Journal of the Experimental Analysis of Behavior, 1958, 1, 221-228.

Clark, R. Some time-correlated reinforcement schedules and their effects on behavior. Journal of the Experimental Analysis of Behavior, 1959, 2, 1-22.

Cumming, W. W. Stimulus disparity and variable-interval reinforcement schedule as related to a behavioral measure of similarity. Unpublished doctoral dissertation, Columbia University, 1955.

Cumming, W. W. and Schoenfeld, W. N. Behavior under extended exposure to a high-value fixed interval reinforcement schedule. Journal of the Experimental Analysis of Behavior, 1958, 1, 245-263.

Dews, P. B. Free-operant behavior under conditions of delayed reinforcement: I. CRF-type schedules. Journal of the Experimental Analysis of Behavior, 1960, 3, 221-234.

Dews, P. B. The effect of multiple S∆ periods on responding on a fixed-interval schedule. Journal of the Experimental Analysis of Behavior, 1962, 5, 369-374.

Farmer, J. Properties of behavior under random interval reinforcement schedules. Journal of the Experimental Analysis of Behavior, 1963, 6, 607-616.

Ferster, C. B. Control of behavior in chimpanzees and pigeons by time out from positive reinforcement. Psychological Monographs, 1958, 72 (8, Whole No. 461).

Ferster, C. B. and Skinner, B. F. Schedules of reinforcement. New York: Appleton-Century-Crofts, 1957.

Findley, J. D. Preference and switching under concurrent scheduling. Journal of the Experimental Analysis of Behavior, 1958, 1, 123-144.

Findley, J. D. An experimental outline for building and exploring multi-operant behavior repertoires. Journal of the Experimental Analysis of Behavior, 1962, 5, 113-166.

Fleshler, M. and Hoffman, H. S. A progression for generating variable interval schedules. Journal of the Experimental Analysis of Behavior, 1962, 5, 529-530.

Gollub, L. R. The relations among measures of performance on fixed-interval schedules. Journal of the Experimental Analysis of Behavior, 1964, 7, 337-343.

Hearst, E. The behavioral effects of some temporally defined schedules of reinforcement. Journal of the Experimental Analysis of Behavior, 1958, 1, 45-55.

Herrnstein, R. J. Behavioral consequences of the removal of a discriminative stimulus associated with variable-interval reinforcement. Unpublished doctoral dissertation, Harvard University, 1955.

Herrnstein, R. J. Relative and absolute strength of response as a function of frequency of reinforcement. Journal of the Experimental Analysis of Behavior, 1961, 4, 267-272.

Herrnstein, R. J. Secondary reinforcement and rate of primary reinforcement. Journal of the Experimental Analysis of Behavior, 1964, 7, 27-36.

Hull, C. L. Principles of behavior. New York: Appleton-Century-Crofts, 1943.

Kaplan, M. The effects of noxious stimulus intensity and duration during intermittent reinforcement of escape behavior. Journal of Comparative and Physiological Psychology, 1952, 45, 538-549.

Kelleher, R. T. and Gollub, L. R. A review of positive conditioned reinforcement. Journal of the Experimental Analysis of Behavior, 1962, 5, 543-597.

Malott, R. W. and Cumming, W. W. Schedules of interresponse time reinforcement. Psychological Record, 1964, 14, 221-252.

Malott, R. W. and Cumming, W. W. Concurrent schedules of interresponse time reinforcement: probability of reinforcement and the lower bounds of the reinforced interresponse time intervals. Journal of the Experimental Analysis of Behavior, 1966, 9, 317-325.

Millenson, J. R. Some behavioral effects of a two-valued, temporally defined reinforcement schedule. Journal of the Experimental Analysis of Behavior, 1959, 2, 191-202.

Millenson, J. R. Random interval schedules of reinforcement. Journal of the Experimental Analysis of Behavior, 1963, 6, 437-443.

Millenson, J. R. Probability of response and probability of reinforcement in a response-defined analogue of an interval schedule. Journal of the Experimental Analysis of Behavior, 1966, 9, 87-94.

Morse, W. H. Intermittent reinforcement. In W. K. Honig (Ed.), Operant behavior: areas of research and application. New York: Appleton-Century-Crofts, 1966. Pp. 52-108.

Morse, W. H. and Herrnstein, R. J. The analysis of responding under three different forms of fixed interval reinforcement. Paper delivered at the meeting of the Eastern Psychological Association, Philadelphia, 1955.

Neuringer, A. J. and Chung, S.-H. Quasi-reinforcement: control of responding by a percentage-reinforcement schedule. Journal of the Experimental Analysis of Behavior, 1967, 10, 45-54.

Nevin, J. A. Two parameters of conditioned reinforcement in a chaining situation. Journal of Comparative and Physiological Psychology, 1964, 58, 367-373.

Newman, E. B. and Anger, D. The effect upon simple animal behavior of different frequencies of reinforcement. Report PLR-33, Office of the Surgeon General, 1954. (Document #7779, ADI Auxiliary Publications Project, Photoduplication Service, Library of Congress, Washington, D.C. 20025.)

Norman, M. F. An approach to free-responding on schedules that prescribe reinforcement probability as a function of interresponse time. Journal of Mathematical Psychology, 1966, 3, 235-268.

Reynolds, G. S. Relativity of response rate and reinforcement frequency in a multiple schedule. Journal of the Experimental Analysis of Behavior, 1961, 4, 179-184.

Reynolds, G. S. Some limitations on behavioral contrast and induction during successive discrimination. Journal of the Experimental Analysis of Behavior, 1963, 6, 131-139.

Reynolds, G. S. Discrimination and emission of temporal intervals by pigeons. Journal of the Experimental Analysis of Behavior, 1966, 9, 65-68.

Reynolds, G. S. and Catania, A. C. Response rate as a function of rate of reinforcement and probability of reinforcement in variable-interval schedules. Paper presented at the meeting of the Psychonomic Society, New York, 1961.

Reynolds, G. S. and Catania, A. C. Temporal discrimination in pigeons. Science, 1962, 135, 314-315.

Schoenfeld, W. N. and Cumming, W. W. Studies in a temporal classification of reinforcement schedules: Summary and projection. Proceedings of the National Academy of Sciences, 1960, 46, 753-758.

Schoenfeld, W. N., Cumming, W. W., and Hearst, E. On the classification of reinforcement schedules. Proceedings of the National Academy of Sciences, 1956, 42, 563-570.

Sherman, J. G. The temporal distribution of responses on fixed interval schedules. Unpublished doctoral dissertation, Columbia University, 1959.

Shimp, C. P. The reinforcement of short interresponse times. Journal of the Experimental Analysis of Behavior, 1967, 10, 425-434.

Sidman, M. Tactics of scientific research. New York: Basic Books, 1960.

Skinner, B. F. The effect on the amount of conditioning of an interval of time before reinforcement. Journal of General Psychology, 1936, 14, 279-295.

Skinner, B. F. The behavior of organisms. New York: Appleton-Century-Crofts, 1938.

Skinner, B. F. "Superstition" in the pigeon. Journal of Experimental Psychology, 1948, 38, 168-172.

Staddon, J. E. R. Some properties of spaced responding in pigeons. Journal of the Experimental Analysis of Behavior, 1965, 8, 19-27.

Staddon, J. E. R. Attention and temporal discrimination: factors controlling responding under a cyclic-interval schedule. Journal of the Experimental Analysis of Behavior, 1967, 10, 349-359.

Stubbs, A. The discrimination of stimulus duration by pigeons. Journal of the Experimental Analysis of Behavior, 1968, 11, 223-238.

Weiss, B. and Laties, V. G. Drug effects on the temporal patterning of behavior. Federation Proceedings, 1964, 23, 801-807.

Weiss, B. and Moore, E. W. Drive level as a factor in the distribution of responses in fixed-interval reinforcement. Journal of Experimental Psychology, 1956, 52, 82-84.

Wilson, M. P. Periodic reinforcement interval and number of periodic reinforcement as parameters of response strength. Journal of Comparative and Physiological Psychology, 1954, 47, 51-56.

Received 10 June 1963.