Schedules of Reinforcement
A schedule of reinforcement is the response requirement that must be met in order to obtain reinforcement. In other words, a schedule indicates exactly what has to be done for the reinforcer to be delivered. Different response requirements can have very different effects on behavior, and they can explain aspects of human behavior that are often attributed to inner desires or traits. (lecture notes from Theories)
Continuous Versus Intermittent Schedules
- Continuous Reinforcement occurs when each and every instance of the response is reinforced.
- Intermittent Reinforcement occurs when only some responses are reinforced.
Four Basic Intermittent Schedules
- Fixed Ratio (FR) schedule: reinforcement is contingent upon a fixed, predictable number of responses. (lecture notes from Theories)
- example: doing a certain number of math problems correctly
- Ratio strain is a disruption in responding due to an overly demanding response requirement. (lecture notes from Theories)
- example: a student stops doing homework when the number of problems required for the same credit is raised too high
- Variable Ratio (VR) schedule: reinforcement is contingent upon a varying, unpredictable number of responses. (lecture notes from Theories)
- example: working as a server, you never know how many customers you must wait on before one leaves a tip; fishing is another example
- Fixed Interval (FI) schedule: reinforcement is contingent upon the first response after a fixed, predictable period of time. (lecture notes from Theories)
- example: payday, which comes on the 1st and 16th of every month
- Variable Interval (VI) schedule: reinforcement is contingent upon the first response after a varying, unpredictable period of time. (lecture notes from Theories)
- example: waiting for a bus, you know it will arrive but not exactly when
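The four response requirements above can be made concrete with a small simulation. This is a minimal sketch, not from the lecture notes: the function names and the uniform draw used for the variable-ratio schedule are my own illustrative choices, and a variable-interval schedule would look like `fixed_interval` with a randomly redrawn interval.

```python
import random

def fixed_ratio(n):
    """FR-n: reinforce every n-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcer delivered
        return False
    return respond

def variable_ratio(mean_n):
    """VR-mean_n: reinforce after an unpredictable number of responses
    that averages mean_n (here drawn uniformly from 1..2*mean_n - 1)."""
    count, target = 0, random.randint(1, 2 * mean_n - 1)
    def respond():
        nonlocal count, target
        count += 1
        if count >= target:
            count, target = 0, random.randint(1, 2 * mean_n - 1)
            return True
        return False
    return respond

def fixed_interval(interval):
    """FI: reinforce the first response made after `interval` time
    units have elapsed since the last reinforcer."""
    last = 0.0
    def respond(t):
        nonlocal last
        if t - last >= interval:
            last = t
            return True
        return False
    return respond

# FR-3: every third response pays off, on a predictable count.
fr3 = fixed_ratio(3)
print([fr3() for _ in range(6)])  # [False, False, True, False, False, True]
```

Note that the ratio schedules count responses while the interval schedule only checks the clock when a response occurs; responses made before the interval elapses are simply wasted, which is why FI schedules produce the characteristic pause after each reinforcer.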
Simple Schedules of Reinforcement
- Duration Schedules: reinforcement is contingent on a behavior being performed continuously throughout a period of time.
- Fixed duration (FD) is when the behavior is performed continuously for a fixed, predictable amount of time. (lecture notes from Theories)
- Variable duration (VD) is when the behavior is performed continuously for a varying, unpredictable amount of time. (lecture notes from Theories)
- Response-Rate Schedules: reinforcement is directly contingent upon the organism’s rate of response.
- Differential reinforcement of high rates (DRH): reinforcement is contingent upon emitting at least a certain number of responses in a certain period of time. (lecture notes from Theories)
- example: athletic events
- Differential reinforcement of low rates (DRL): a minimum amount of time must pass between successive responses before the reinforcer will be delivered. (lecture notes from Theories)
- example: praising a child for taking his/her time on homework to get good results
- Differential reinforcement of paced responding (DRP): reinforcement is contingent upon emitting a series of responses at a set rate, neither too fast nor too slow. (lecture notes from Theories)
- example: swimming or running competitively; one must pace oneself to keep enough energy for a final burst to the finish
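A DRL contingency boils down to a timestamp check: a response is reinforced only when enough time has elapsed since the previous response. A minimal sketch, with the session start treated as time 0 and the helper name my own:

```python
def drl(min_gap):
    """DRL: reinforce a response only when at least min_gap time units
    have elapsed since the previous response (session starts at t = 0)."""
    last = 0.0
    def respond(t):
        nonlocal last
        reinforced = (t - last) >= min_gap
        last = t  # the timer resets on every response, reinforced or not
        return reinforced
    return respond

d = drl(10)
print(d(5))   # False -- responded too soon
print(d(20))  # True  -- 15 units since the last response
print(d(25))  # False -- only 5 units elapsed
```

The reset on every response is the key design point: responding too quickly not only goes unreinforced, it postpones the next opportunity for reinforcement, which is what pushes the response rate down.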
- Noncontingent Schedules: the reinforcer is delivered independently of any response.
- Fixed time (FT) schedule: the reinforcer is delivered following a fixed, predictable period of time, regardless of the organism’s behavior. (lecture notes from Theories)
- example: Christmas gifts
- Variable time (VT) schedule: the reinforcer is delivered following a varying, unpredictable period of time, regardless of the organism’s behavior. (lecture notes from Theories)
- example: Skinner’s pigeons
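The contrast between noncontingent and interval schedules can be stated in code: FT applies the same clock test as FI but drops the response requirement entirely. A minimal sketch (the helper names are mine):

```python
def ft_due(t, last_delivery, interval):
    """FT: the reinforcer is due whenever the interval has elapsed,
    regardless of the organism's behavior."""
    return (t - last_delivery) >= interval

def fi_due(t, last_delivery, interval, responded):
    """FI: the same clock test, but a response is also required."""
    return responded and (t - last_delivery) >= interval

print(ft_due(10, 0, 10))         # True  -- delivered on the clock alone
print(fi_due(10, 0, 10, False))  # False -- interval elapsed, but no response yet
```

Because the organism happens to be doing *something* when an FT reinforcer arrives, whatever behavior precedes delivery gets accidentally strengthened, which is the usual account of Skinner's "superstitious" pigeons.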
Complex Schedules of Reinforcement
A complex schedule is a combination of two or more simple schedules.
- Conjunctive schedules: the requirements of two or more simple schedules must be met before the reinforcer is delivered. (lecture notes from Theories)
- example: how much you earn monthly at your job depends on the number of hours you spend working
- Adjusting schedules: the response requirement changes as a function of the organism’s performance while responding for the previous reinforcer. (lecture notes from Theories)
- example: when the whole class does badly on an exam, the teacher puts less material on the next one
- Chained schedules consist of a sequence of two or more simple schedules.
- example: taking classes to obtain a degree
- A goal gradient effect is an increase in the strength and/or efficiency of responding as one draws near to the goal. (lecture notes from Theories)
- example: studying harder during the final semesters as completion of the degree draws near