
Wednesday, March 5, 2014

Schedules of Reinforcement

Skinner investigated how behavior changes when the rule for delivering reinforcement is varied; these rules are called schedules of reinforcement. Among the schedules he tested are the following:
- Fixed interval
- Fixed ratio
- Variable interval
- Variable ratio

Fixed interval: A fixed-interval schedule of reinforcement means that the reinforcer is presented following the first response that occurs after a fixed time interval has elapsed. That interval might be 1 minute, 3 minutes, or any other fixed period of time. The timing of the reinforcement has nothing to do with the number of responses. This is much like a person's weekly or bi-monthly salary.

Fixed ratio: The fixed-ratio schedule of reinforcement delivers the reinforcer only after the organism has made a specified number of responses. For example, the experimenter could reinforce after every 10th or 20th response. This type of schedule becomes an incentive to work harder, as for a salesperson who earns a commission (a fixed amount of money) for each car sold.

Variable interval: Under the variable-interval schedule of reinforcement, the reinforcer follows the first response made after a time interval that varies unpredictably around an average. Preparing for pop quizzes given at unpredictable points throughout a semester works this way; a person has to stay alert and ready to respond because the next reinforcer could arrive at any time.

Variable ratio: The final schedule of reinforcement is the variable-ratio schedule, which delivers the reinforcer after an average number of responses, but with great variability around that average. Various types of gambling operate under this schedule of reinforcement.
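To make the differences concrete, here is a minimal Python sketch that simulates how each of the four schedules decides when to deliver a reinforcer to a steady stream of responses. It is only an illustration: the function names, the one-response-every-5-seconds stream, and the particular interval and ratio values are assumptions chosen for the example, not Skinner's actual procedures.

```python
import random

def fixed_interval(responses, interval=60):
    """Reinforce the first response that occurs after `interval` seconds have elapsed."""
    reinforced, last_reinforcement = [], 0
    for t in responses:  # each response is a timestamp in seconds
        if t - last_reinforcement >= interval:
            reinforced.append(t)
            last_reinforcement = t
    return reinforced

def fixed_ratio(responses, ratio=10):
    """Reinforce every `ratio`-th response, regardless of timing."""
    return [t for i, t in enumerate(responses, start=1) if i % ratio == 0]

def variable_interval(responses, mean_interval=60):
    """Like fixed interval, but the required wait varies randomly around an average."""
    reinforced, last_reinforcement = [], 0
    required = random.expovariate(1 / mean_interval)
    for t in responses:
        if t - last_reinforcement >= required:
            reinforced.append(t)
            last_reinforcement = t
            required = random.expovariate(1 / mean_interval)
    return reinforced

def variable_ratio(responses, mean_ratio=10):
    """Reinforce after a varying number of responses that averages `mean_ratio`."""
    reinforced, count = [], 0
    required = random.randint(1, 2 * mean_ratio - 1)  # uniform, averages mean_ratio
    for t in responses:
        count += 1
        if count >= required:
            reinforced.append(t)
            count = 0
            required = random.randint(1, 2 * mean_ratio - 1)
    return reinforced

# One response every 5 seconds for 10 minutes
responses = list(range(5, 600, 5))
for schedule in (fixed_interval, fixed_ratio, variable_interval, variable_ratio):
    print(schedule.__name__, "->", len(schedule(responses)), "reinforcers")
```

Running the script prints how many reinforcers each schedule delivers for the same stream of responses, which highlights the key contrast: interval schedules depend on elapsed time, while ratio schedules depend on the count of responses.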
