PSY 200 Chapter 8

What is intermittent reinforcement? Give an example.

An arrangement in which a behavior is positively reinforced only occasionally rather than every time it occurs.
Example: Jan's problem-solving behavior was not reinforced after every math problem she solved, but after a fixed number of problem-solving responses.

What is a response rate? Give an example.

Refers to the number of instances of a behavior that occur in a given period of time.
Example: John said a bad word 7 times within a 30-minute period.
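A response rate is just a count divided by an observation time. A minimal sketch using the example above (the function name is illustrative, not from the text):

```python
def response_rate(count, minutes):
    """Number of responses per minute over an observation period."""
    return count / minutes

# John: 7 bad words in a 30-minute period
rate = response_rate(7, 30)
print(round(rate, 3))  # about 0.233 responses per minute
```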

What is schedule of reinforcement? Give an example.

Schedule of reinforcement- a rule specifying when a behavior should be reinforced.
Example: an FR 3 schedule reinforces every third response.

What is CRF? Give an example.

CRF= Continuous reinforcement
CRF- every instance of the behavior gets reinforced (reinforcement is delivered every time the target behavior occurs)
Example: turning on the water faucet every time = water running

What are four advantages of intermittent reinforcement for maintaining behavior?

...

What is an FR schedule? Give an example.

FR= Fixed ratio
FR- a predictable # of responses (reinforcement occurs every time a fixed number of responses of a particular type is emitted)
Example: FR 3: a kid's hand-raising is reinforced after every 3 raises in a row
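Because an FR schedule is just a counter rule, it can be sketched in a few lines of Python (the class name and the FR 3 value come from the example above; the code itself is illustrative, not from the text):

```python
class FixedRatio:
    """Reinforce every time a fixed number of responses has been emitted."""
    def __init__(self, ratio):
        self.ratio = ratio
        self.count = 0

    def respond(self):
        """Record one response; return True when the reinforcer is delivered."""
        self.count += 1
        if self.count == self.ratio:
            self.count = 0   # counter resets after each reinforcer
            return True
        return False

fr3 = FixedRatio(3)
print([fr3.respond() for _ in range(6)])
# [False, False, True, False, False, True] -- every 3rd response reinforced
```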

What are three characteristic effects of an FR schedule?

1. Steady response rate
2. Post-reinforcement pause
3. Best for skill acquisition
4. Ratio strain can occur

What is ratio strain?

Deterioration of responding caused by increasing an FR schedule too rapidly (too large a jump at once).
Example: going from 3 problems to 33.

What is a VR schedule? Give an example.

VR= Variable ratio
VR- a reinforcer occurs after a certain number of a particular response, but the # of responses required for each reinforcer changes unpredictably from one reinforcer to the next.
Example: gambling, lottery tickets, scratch-offs
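The only difference from FR is that the response requirement is re-drawn unpredictably after each reinforcer, as in a slot machine. A sketch (the class name and the uniform-integer draw around the mean are assumptions for illustration):

```python
import random

class VariableRatio:
    """Reinforce after an unpredictable number of responses,
    averaging `mean_ratio` responses per reinforcer."""
    def __init__(self, mean_ratio, rng=None):
        self.rng = rng or random.Random()
        self.mean_ratio = mean_ratio
        self.count = 0
        self._new_requirement()

    def _new_requirement(self):
        # requirement varies unpredictably, averaging mean_ratio (illustrative choice)
        self.required = self.rng.randint(1, 2 * self.mean_ratio - 1)

    def respond(self):
        """Record one response; return True when the reinforcer is delivered."""
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self._new_requirement()   # next payoff comes at a new, unknown point
            return True
        return False
```

Over many responses a VR 5 schedule pays off about once per 5 responses on average, but never predictably, which is why responding stays high and steady.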

How is a VR schedule similar to an FR schedule?

Both are ratio schedules, used when one wants to generate a high rate of responding and can monitor each response.

What are three characteristic effects of a VR schedule?

1. Produces the highest, steadiest response rate
2. Little or no post-reinforcement pause
3. Most resistant to extinction
4. Ratio strain can occur, but at a higher number

What is an FI schedule?

FI=fixed interval
FI- a reinforcer is presented following the first instance of a specific response after a fixed period of time.
Example: a treasure chest opens at 9:20 and again at 9:40; it cannot be opened before the 20-minute interval has elapsed.
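An FI schedule is a timer rule rather than a counter rule: responses before the interval elapses do nothing, and the first response afterward is reinforced. A sketch of the treasure-chest example (class name illustrative; times are in minutes):

```python
class FixedInterval:
    """The first response after a fixed interval has elapsed is reinforced."""
    def __init__(self, interval_minutes, start=0.0):
        self.interval = interval_minutes
        self.available_at = start + interval_minutes

    def respond(self, now):
        """Respond at time `now`; return True if the reinforcer is delivered."""
        if now >= self.available_at:
            self.available_at = now + self.interval   # interval restarts
            return True
        return False

chest = FixedInterval(20, start=0)   # 20-minute interval, as in the example
print(chest.respond(10))   # False -- too early, interval not elapsed
print(chest.respond(20))   # True  -- first response after the interval
print(chest.respond(25))   # False -- a new 20-minute interval is running
```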

What is a VI schedule?

VI=variable interval
VI- a reinforcer is presented following the first instance of a specific response after an interval of time, and the length of the interval changes unpredictably from one reinforcer to the next (the response is reinforced after unpredictable intervals of time).
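A VI schedule differs from FI only in that the interval length is re-drawn unpredictably after each reinforcer. A sketch (the class name and the uniform draw around the mean interval are assumptions for illustration):

```python
import random

class VariableInterval:
    """The first response after an unpredictable interval is reinforced;
    the interval length changes after each reinforcer."""
    def __init__(self, mean_interval, rng=None, start=0.0):
        self.rng = rng or random.Random()
        self.mean = mean_interval
        self.available_at = start + self._draw()

    def _draw(self):
        # interval varies unpredictably, averaging `mean` (illustrative choice)
        return self.rng.uniform(0, 2 * self.mean)

    def respond(self, now):
        """Respond at time `now`; return True if the reinforcer is delivered."""
        if now >= self.available_at:
            self.available_at = now + self._draw()   # new, unknown interval
            return True
        return False
```

Because the next reinforcer could become available at any moment, VI schedules produce a moderate but very steady response rate.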

Why are simple interval schedules not often used in training programs?

...

What is a pitfall of intermittent reinforcement? Give an example.

It traps not only the unwary but also those with some knowledge of behavior modification, through inconsistent use of extinction.
Example: parents try to ignore a child's tantrums but fail when the child persists; when they finally give in, they intermittently reinforce tantrumming, making more tantrums likely to occur.

Which schedules tend to produce higher resistance to extinction (RTE), the fixed or the variable schedules?

...