#1
What is a variable reinforcement schedule?
A reinforcement schedule where the reinforcement occurs after a fixed number of responses.
A reinforcement schedule where the reinforcement occurs after a variable number of responses.
A reinforcement schedule where the reinforcement occurs at fixed intervals of time.
A reinforcement schedule where the reinforcement occurs at variable intervals of time.
#2
Which type of reinforcement schedule is most resistant to extinction?
Fixed-ratio schedule
Variable-ratio schedule
Fixed-interval schedule
Variable-interval schedule
#3
Which psychologist is most closely associated with the concept of reinforcement schedules?
B.F. Skinner
Sigmund Freud
Jean Piaget
Carl Rogers
#4
What is the term for the gradual disappearance of a learned behavior when reinforcement is withheld?
Extinction
Reinforcement
Punishment
Generalization
#5
Which of the following is NOT a type of variable reinforcement schedule?
Variable-interval schedule
Variable-ratio schedule
Variable-time schedule
Fixed-ratio schedule
#6
What effect does a variable reinforcement schedule have on response rate compared to a fixed reinforcement schedule?
Variable reinforcement schedules decrease response rate.
Variable reinforcement schedules increase response rate.
Variable reinforcement schedules have no effect on response rate.
It depends on the type of reinforcement schedule.
#7
In a variable reinforcement schedule, reinforcement is delivered:
After every response.
After a fixed number of responses.
After a variable number of responses.
After a fixed interval of time.
#8
Which reinforcement schedule is often associated with gambling behavior?
Fixed-ratio schedule
Variable-ratio schedule
Fixed-interval schedule
Variable-interval schedule
#9
Which reinforcement schedule is characterized by a specific number of responses being required for reinforcement?
Fixed-ratio schedule
Variable-ratio schedule
Fixed-interval schedule
Variable-interval schedule
#10
In a variable-interval schedule, reinforcement is provided after:
A fixed number of responses.
A variable number of responses.
A fixed interval of time.
A variable interval of time.
#11
Which of the following is an example of a variable-ratio schedule?
Receiving a paycheck every month
Winning at a slot machine after an unpredictable number of plays
Getting a snack after every 30 minutes
Receiving a bonus after completing 10 tasks
#12
In a variable reinforcement schedule, the variability in reinforcement leads to:
Decreased motivation
Increased motivation
Consistent behavior
Predictable outcomes
#13
What is the main advantage of variable reinforcement schedules in behavior modification?
They lead to quicker extinction of behavior.
They are easier to implement.
They promote consistent responding.
They are more resistant to extinction.
#14
Which of the following is an example of a variable reinforcement schedule?
Receiving a paycheck every two weeks
Winning a prize after an unpredictable number of attempts
Getting a snack after every five minutes
Receiving a bonus after completing a fixed number of tasks
#15
Which term describes the tendency for behavior to persist when it has been reinforced only some of the time?
Partial reinforcement effect
Intermittent reinforcement effect
Sporadic reinforcement effect
Random reinforcement effect
#16
Which term describes the phenomenon where a behavior is more resistant to extinction when reinforced intermittently rather than continuously?
Variable reinforcement effect
Intermittent reinforcement effect
Fixed reinforcement effect
Continuous reinforcement effect
#17
Which term describes the pattern in which responding increases as the expected time of reinforcement approaches, such as a rat pressing a lever more rapidly toward the end of an interval?
Anticipatory behavior
Preparatory behavior
Excitatory behavior
Contingent behavior