Thorndike referred to the principle of strengthening a behavior by its consequences as __________; in modern terminology, this is called __________.
the Law of Effect, reinforcement
In photographing cats in the puzzle box, Guthrie and Horton found that the behaviors of an individual cat were ____________ from trial to trial, but they were _________ from cat to cat.
similar, different
Superstitious behaviors are more likely to occur when an individual has _________ of the reinforcer.
little or no control
When using food to shape the behavior of a rat, the sound of the food dispenser is a ____________, and the food itself is a _________.
conditioned reinforcer, primary reinforcer
Thorndike's research with the puzzle box is an example of a ________ procedure, whereas Skinner's research used a __________ procedure.
discrete trial, free operant
The three parts of a three-term contingency are the _________, the ____________, and the _________.
the stimulus that precedes the response, the response itself, the reinforcer
Each stimulus in the middle of a response chain serves as a _________ for the previous response and as a __________ for the next response.
conditioned reinforcer, discriminative stimulus
The procedure in which pigeons start to peck at a lighted response key when it precedes food deliveries is called ___________.
autoshaping
The Brelands used the term instinctive drift to refer to cases where an animal stopped performing _________ behaviors and started performing ___________ behaviors as its training progressed.
reinforced, instinctive
Another name for Thorndike's Law of Effect is: (This law is based upon reinforcement)
The Principle of Positive Reinforcement
Which of the following terms refers to the method of successive approximation to a goal? (A method used in behavior modification)
A. Shaping
B. Backward Chaining
C. Resurgence
D. Stimulus Control
Shaping
Which of the following was the apparatus Thorndike used to establish the Law of Effect? (Similar to the Skinner Box)
A. The Columbia Obstruction Box
B. The Puzzle Box
C. The Water Maze
D. The Shuttle Box
The Puzzle Box
A previously neutral stimulus that has acquired the capacity to strengthen responses because that stimulus has been repeatedly paired with food is called: (Type of reinforcer)
Conditioned Reinforcer
In operant conditioning, when an animal displays innate behaviors associated with a given reinforcer, even though these behaviors are not reinforced, this is called: (Type of biological constraint of operant conditioning)
Instinctive drift
Which of the following reflects the idea that different reinforcers evoke different systems or collections of behaviors? (Way to explain autoshaping behavior)
A. Sign Tracking
B. Instinctive Drift
C. Behavior-Systems Analysis
D. Superstitious behavior
Behavior-Systems Analysis
Resurgence is the reappearance of a previously reinforced response that occurs when a more recently reinforced response is extinguished. (Similar to spontaneous recovery)
True
Another name for operant conditioning is instrumental conditioning. (The subject's behavior is instrumental in obtaining the reinforcer)
True
The best example of Skinner's concept of generalized reinforcer is money. (A conditioned reinforcer associated with many different primary reinforcers)
True
Staddon and Simmelhag, in their reanalysis of Skinner's superstition experiment, referred to ____ behaviors as those that occur frequently early in the interval between reinforcers. (Examples include pecking toward the floor or moving along the front wall of the Skinner box.)
Interim
A ____ reinforcer is a stimulus that naturally strengthens any response it follows. (Critical component for shaping)
Primary
In ____ chaining, the teacher starts by reinforcing the first response of the chain, then gradually adds the second response, and so on. (Often used to train animals for sequence behaviors)
Forward
The procedure in which pigeons start to peck at a lighted response key when it precedes food deliveries is called ______: (The pecking develops even though it is not required for reinforcement)
Autoshaping
In instrumental conditioning, procedures that make use of lever pressing or similar responses are called:
Free operant procedures
A sequence of behaviors that must occur in a specific order, with the primary reinforcer being delivered after the final response of the sequence.
response chain
The broad topic of how stimuli that precede a behavior can control the occurrence of that behavior.
stimulus control
The method of successive approximation.
shaping
If a response is followed by a reinforcer, the frequency of that response will increase.
law of effect
The subject's behavior is instrumental in obtaining the reinforcer.
instrumental conditioning
A previously neutral stimulus that has acquired the capacity to strengthen responses due to its repeated pairing with a primary reinforcer.
conditioned reinforcer
A reinforcer is anything that
increases the frequency of the response it follows
Thorndike demonstrated his principle of
Law of Effect using the puzzle box experiment
Superstitious behavior is the result of
accidental reinforcement
Successive approximation or shaping has many applications specifically with regard to
behavior modification in the classroom
Skinner used the term operant conditioning or instrumental conditioning to
describe behaviors that were strengthened by reinforcement
Three components of operant conditioning are:
1) the stimulus that precedes the response, 2) the response itself, and 3) the reinforcer
Like classical conditioning, the reappearance of a previously reinforced response that occurs when a more recently reinforced response is extinguished is called
resurgence
A response chain consists of
an alternating series of stimuli and responses, in which only the last response is followed by the primary reinforcer
The phenomenon of autoshaping is used to
challenge the principle of reinforcement, because the key pecking develops without being reinforced
puzzle box
The Law of Effect:
Thorndike’s experiment
Cats could escape by making a response (e.g., pulling on a string)
The Law of Effect:
Thorndike’s experiment
Measure of performance was escape latency
o On the 1st trial escape occurred by luck; by the 11th trial the cats all escaped quickly
The Law of Effect:
Thorndike’s experiment
Thorndike’s version of the principle of reinforcement, which states that responses that are followed by pleasant or satisfying stimuli will be strengthened and will occur more often in the future.
o Positive Reinforcement
The Law of Effect
This principle states that there is a parallel between the _____ of the camera and the ________ in the experiments by ________
Law of Effect and the Stop-Action Principle:
action, reinforcer, Guthrie and Horton.
The specific bodily position and the muscle movements occurring at the moment of reinforcement will have a
Law of Effect and the Stop-Action Principle:
higher probability of occurring on the next trial
Skinner’s (1948) superstition experiment
Superstitious Behaviors
Whatever behavior happened to be occurring when the reinforcer was delivered was strengthened.
Skinner’s (1948) superstition experiment
A behavior that occurs because, by accident or coincidence, it has previously been followed by a reinforcer.
superstition
common among athletes
superstition
Superstitions that are widely held are probably due to
communication with others
Some superstitions were originally
valid beliefs (e.g., bad luck to light 3 cigarettes with 1 match)
A procedure for teaching new behavior in which closer and closer approximations to the desired behavior are reinforced.
Shaping, or Successive Approximations
Shaping lever pressing in a rat
Hypothetical distribution of height of rat’s head
(shaping example)
Shaping behaviors in the classroom
Shaping as a ______ in behavior modification (e.g., teaching self-care skills to the mentally disabled)
tool
Different from the discrete trial procedure used by Thorndike: the operant response can occur at any time, and it can occur repeatedly for as long as the subject remains in the Skinner box.
Research of B.F. Skinner:
The Free Operant Procedure
Response rate is the primary measure of behavior in
The Free Operant Procedure
According to Skinner, operant conditioning involves a ___________
Three-Term Contingency
1) the context in which the response occurs
- the discriminative stimulus
Three-Term Contingency
2) the response itself
- lever pressing
Three-Term Contingency
3) the reinforcer
- e.g., food
Three-Term Contingency
This is the reappearance of a previously reinforced response that occurs when a more recently reinforced response is extinguished
Operant Conditioning:
Resurgence
a stimulus that naturally strengthens any response it follows (e.g., food, water, comfort)
Operant Conditioning:
Primary Reinforcer
This reinforcer acts as a surrogate for the primary reinforcer
Operant Conditioning:
Conditioned Reinforcer
A class of conditioned reinforcers that are associated with a number of different primary reinforcers
Example: money
Operant Conditioning:
Generalized Reinforcers
learning to respond to one stimulus but not another
Operant Conditioning:
Discrimination Learning
a stimulus that indicates whether or not responding will lead to reinforcement
Operant Conditioning:
Discriminative stimulus
Extinction, discrimination, and generalization in operant conditioning =
work the same way as in classical conditioning
A sequence of behaviors that must occur in a specific order, with the primary reinforcer being delivered only after the final response of the sequence
Response Chains
Of forward chaining, backward chaining, and the total task method, which is most effective?
It depends on the task and the training situation
Forward chaining, backward chaining, and the total task method are three ways of training
Response Chains
With extensive experience, the subject’s performance drifts away from the reinforced behaviors toward instinctive behaviors that occur when the animal is seeking the reinforcer.
Biological Constraints on Operant Conditioning:
Instinctive Drift
Experiment by Brown and Jenkins (1968); similarity with superstitious behavior
Autoshaping