Behaviour #5: learning

  1. Learning
    • changes in behavior that result from an animal's interaction with its environment
    • More precisely:
    • -specific to the event experienced
    • -adaptive: species that can learn survive better than those that cannot
    • -lasting
    • -a change of, modification in, or creation of a single behavior
    • -involving the nervous system
    • -a result of experience with an external event or series of events in an individual's life
    • -excludes fatigue, hormonal changes, other developmental or maturational processes, and injury
  2. Can you observe the learning process?
    • no, you can only see the results
    • (you see the expression of responses as a result of learning)
  3. memory
    the changes in the nervous system by which information is stored during learning
  4. coincidence
    contingency
    • coincidence: 2 events happen together by chance
    • contingency: two events happen together more often than expected by chance
  5. Information
    predictable departures from randomness
  6. Brains and info processing
    • Brains are like contingency-detecting and contingency-exploiting machines
    • -perception is the filtering of info from randomness
    • -learning is the ability to adapt responses to new contingencies (info) that are not programmed in the genes (see the sketch after this card)
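
    A minimal Python sketch of the "contingency vs coincidence" idea from cards 4-6 (not from the lecture; the feed-truck example, probabilities, and threshold are invented for illustration): count how often two events co-occur and compare that against the rate expected if they were independent.

```python
import random

random.seed(0)

# Simulated observation log: on each trial, does the feed truck arrive,
# and does food appear? Here food follows the truck far more often than
# independence would allow, so the pairing is a contingency.
trials = 10_000
truck = [random.random() < 0.2 for _ in range(trials)]
food = [(random.random() < 0.9) if t else (random.random() < 0.05) for t in truck]

p_truck = sum(truck) / trials
p_food = sum(food) / trials
p_both = sum(t and f for t, f in zip(truck, food)) / trials

# Under pure coincidence (independence), P(truck and food) ~= P(truck) * P(food).
expected_by_chance = p_truck * p_food
print(f"observed co-occurrence: {p_both:.3f}")
print(f"expected by chance:     {expected_by_chance:.3f}")
print("contingency" if p_both > 2 * expected_by_chance else "coincidence")
```
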
  7. What are the two main approaches to learning in non-human animals?
    • Ethologists
    • Comparative psychologists
  8. Ethologists' view on learning
    • the function that learning plays in the animal's natural environment
    • phylogenetic differences in learning abilities
    • role of learning in development
  9. Comparative psychologists' view on learning
    • elucidate mechanisms underlying learning
    • general "law" of learning
  10. Categories of Learning in ethology and comparative psychology
    • Non-associative learning: animal is exposed repeatedly to a single stimulus, and behavior changes
    • Associative learning: the animal is exposed to 2 or more stimuli that have a particular relationship to one another. The animal demonstrates associative learning if its response to one stimulus is altered by exposure to the second.
    • complex learning: insight, reasoning
  11. Non-associative learning: habituation
    • waning of a response, when a stimulus is repeated, even though the full response could still be made
    • most primitive and universal form of learning
    • eg a dog in a new home may be startled by the sound of a door opening, but over time this startle response slowly disappears
  12. Classic example of habituation (non associative learning)
    • Aplysia (marine slug)
    • -given a tactile stimulus
    • a weak stimulus habituates more quickly than a strong one
    • (however, maybe the muscles in the slug were just too exhausted to respond?)
    • after a rest, full recovery of the response is seen
  13. Dishabituation
    recovery of response strength in response to a strong novel stimulus (e.g. an increase in intensity)
  14. Associative Learning
    and ex
    • An association between one stimulus and either another stimulus or the animal's own behavior leads to a change in behavior
    • ex. pigs in a swine unit may associate feed with the sound of the feed truck, the time of day, or the stockman's presence, and begin salivating before food actually arrives
    • ex. rats in a maze may associate food with the act of entering a particular corridor, and enter that corridor repeatedly each time they run the maze
    • ex. cows become nervous when the vet's truck arrives
  15. Varieties of Associative Learning
    • Classical conditioning:
    •   -associations between events over which the animal has no direct control (e.g. the feed truck and food)
    •   -requires multiple exposures
    • Instrumental Conditioning
    •   -the animal's behavior is instrumental in learning (eg choosing the correct corridor in a maze)
    •   -requires multiple exposures
    • Passive avoidance learning (taste aversion learning)
    •   -eg rats can eat a novel food, become sick several hours later, and will avoid the food in the future
    •   -probably very different from classical and instrumental conditioning, not least because it requires only ONE exposure
  16. Give a classical conditioning example
    Blue jays eat a monarch butterfly, throw up, and then avoid eating monarchs again
  17. Classical Conditioning: stimulus and response terms (Pavlovian conditioning)
    • unconditional stimulus (UCS): puff of air
    • unconditional response (UCR): eye blink
    • neutral stimulus (NS): tone (sound)
    • the UCS and NS are presented together; after repeated presentations, the animal responds to both the UCS and the NS
    • the NS is now referred to as a conditional stimulus (CS), and the UCR is termed the conditional response (CR)
  18. Classical Conditioning
    • an existing behavior comes to be elicited by a new, previously neutral stimulus
    • the neutral stimulus can come to elicit the unconditioned response if paired with the UCS
    • long lasting
    • "pavlovian learning"
  19. 2nd order conditioning
    • the CS can act like a UCS for a new NS
    • eg a flashing light (new NS) can be paired with the bell (CS) and the dog will come to salivate to the flashing light
  20. Extinction
    • disrupt the association between the UCS and the CS: present the CS with no UCS
    • eg ring the bell without giving the dog food
    • ie present a stimulus without reinforcing (supporting) it
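
    A toy numerical sketch of acquisition (cards 17-18) and extinction (card 20), assuming a simple error-correcting update of associative strength; this is a standard textbook-style learning rule, not something stated in these cards, and the learning rate and trial counts are arbitrary.

```python
# Toy associative-strength model (illustrative only; parameters are arbitrary).
# V = strength of the CS-UCS association; the response to the CS scales with V.
alpha = 0.3   # learning rate
V = 0.0

# Acquisition: bell (CS) paired with food (UCS) on every trial.
for trial in range(10):
    ucs_present = 1.0
    V += alpha * (ucs_present - V)     # strength grows toward 1
print(f"after acquisition, V = {V:.2f}")

# Extinction: ring the bell with no food.
for trial in range(10):
    ucs_present = 0.0
    V += alpha * (ucs_present - V)     # strength decays back toward 0
print(f"after extinction, V = {V:.2f}")
```
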
  21. Instrumental Learning
    • Classical Conditioning (pavlovian learning)
    • -the animal has no control over events that are changing its environment
    • -it just comes to respond to one event as if it were another
    • Instrumental conditioning (instrumental learning, operant conditioning, trial-and-error learning)
    • -animal can behave spontaneously, its behavior is instrumental in the learning process
    • -learn from results of action

    • This is occurring all the time, even when you don't want it to!
    • -dog jumps up and receives social interaction
    • -cat scratching furniture
  22. Instrumental learning: teaching a puppy to sit
    • learns more readily if response occurs spontaneously rather than forced (pushing on back end)
    • food reward in hand, lift hand up, with one hand behind puppy to prevent 'backing up' and puppy naturally sits
    • as the puppy sits, say "sit" and reward (reinforce) immediately (food and praise are the reinforcing stimuli) - DO NOT reward non-sitting
    • eventually the raised hand becomes a visual cue for sitting, just as "SIT" becomes an auditory one

    animal is active and specifically involved in process
  23. B.F. Skinner
    • believed that instrumental conditioning was behind most if not all complex behaviour: his point was pivotal to modern behaviorism (because almost all complex acquired behavior could in theory be explained using instrumental learning)
    • Skinner referred to instrumental conditioning as operant conditioning
    • -Skinner box: a very simple, barren environment for studying operant conditioning in a pure, uncorrupted form
    • eg a mouse has a lever that dispenses food when pressed
  24. Operants
    • Operant: a behavior whose chance of being performed the experimenter wishes to alter
    • (operants: e.g. pecking a key, pressing a lever, sitting)
    • The animal must learn association between stimuli, responses, and consequences 

    • shaping behavior: when the animal gets close to doing what it should, give a reward; then only reward again when it gets even closer (see the sketch after this card)
    • with shaping, very complex sequences can be taught
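
    A minimal sketch of shaping by successive approximation (the numbers for the target, criterion, and improvement step are invented for illustration): the reward criterion starts loose and is tightened after each rewarded attempt, so only behavior ever closer to the full sit keeps earning reinforcement.

```python
import random

random.seed(1)

target = 1.0          # e.g. a complete sit
criterion = 0.1       # start by rewarding even a very rough approximation
skill = 0.1           # how close the puppy's behavior currently tends to get

for trial in range(1, 41):
    # The puppy's attempt varies around its current skill level.
    attempt = max(0.0, min(1.0, random.gauss(skill, 0.15)))
    if attempt >= criterion:
        # Reinforce, then raise the bar; reinforced behavior also improves.
        skill = min(target, skill + 0.1)
        criterion = min(target, criterion + 0.1)
        print(f"trial {trial:2d}: rewarded (attempt {attempt:.2f}), "
              f"new criterion {criterion:.2f}")
        if criterion >= target:
            print("full behavior shaped")
            break
```
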
  25. Clicker training
    • works with operant behavior
    • the clicker is associated with a positive reinforcer (usually food)
    • Click (CS) and treat (UCS) - the click becomes classically conditioned as a secondary reinforcer
    • the click acts as if it were a treat
    • used to shape behavior - a click gives the animal a small reward for each approximation
    • so the sound becomes positive (reinforcing) on its own, without food
  26. Ecological constraints on learning
    • Does shaping really make sense? How can animals learn in the wild if there is no one there to shape them?
    • in fact, few operants in a Skinner box reflect anything an animal would do in the wild

    • For example:
    • pigeons can be taught extremely complex tasks (such as telling Picasso from Monet) if the operant is to peck a key and the reward is food
    • in fact, they were even able to generalize from Picasso to other cubist painters and from Monet to other impressionists
    • BUT it was almost impossible to teach a pigeon to peck a key for a non-food reward such as access to a mate
  27. Key definitions of reinforcement and punishment
    • 1. positive reinforcement: a stimulus is presented and the behavior increases
    • 2. negative reinforcement: an already-applied stimulus is removed and the behavior increases
    • 3. positive punishment: a stimulus is presented and the behavior decreases
    • 4. negative punishment: a stimulus already present is taken away and the behavior decreases
  28. Punishment
    • 'punishment' in common speech means "positive punishment": performing a behavior results in a noxious stimulus, intended to prevent the behavior
    • timing is critical
    • Intensity: the optimal level is the minimum required to suppress the response, which is very difficult to judge in practice
    • positive punishment has side effects: unpredictable behavior, fear, suppression of all behaviors; it also makes learning harder if it is not applied correctly the first time

    hard to apply appropriately because animal needs to associate behavior with punishment
  29. Reinforcement schedule
    • Animals work harder when reinforcements are
    • a) few and far between
    • b) relatively irregular and unpredictable

    extinction- friend and foe


    • continuous
    • fixed interval
    • variable interval
    • fixed ratio
    • variable ratio
  30. KNOW
    different types of reinforcement schedules
    **Go into notes review chart

    • continuous reinforcement (CRF)
    • Fixed interval (FI)
    • Variable interval (VI)
    • Fixed ratio (FR)
    • Variable ratio (VR)
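
    A minimal sketch of the five schedules listed above, with made-up parameters (a ratio of 5 responses, an interval of 10 seconds); each rule simply answers whether a given response earns reinforcement.

```python
import random

def reinforced(schedule, response_count, seconds_since_reward):
    """Is the current response reinforced? Parameters are illustrative only."""
    if schedule == "CRF":   # continuous reinforcement: every response is rewarded
        return True
    if schedule == "FR":    # fixed ratio: every 5th response
        return response_count % 5 == 0
    if schedule == "VR":    # variable ratio: each response has a 1-in-5 chance,
        return random.random() < 1 / 5      # so on average every 5th response
    if schedule == "FI":    # fixed interval: first response after 10 s since last reward
        return seconds_since_reward >= 10
    if schedule == "VI":    # variable interval: first response after an unpredictable wait
        return seconds_since_reward >= random.uniform(5, 15)
    raise ValueError(schedule)

for s in ("CRF", "FR", "VR", "FI", "VI"):
    print(s, reinforced(s, response_count=5, seconds_since_reward=12))
```
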
  31. Do classical and operant conditioning offer evidence for higher mental processes in animals?
    • NO: they are simply associations between stimulus and response
    • the ability to learn stimulus-response associations is not enough to infer any sort of comprehension
    • goal directed trial and error
  32. Examples of complex forms of learning
    • thinking/prediction/mental simulation
    • latent learning
    • cognitive maps
    • learning-set learning
    • insight learning
    • tool use, and tool making
  33. Mental simulation
    Latent Learning
    • Mental simulation: is the ability to run a simulation of trial-and-error learning, and to learn from it
    • -obvious advantages: less costly, quicker
    • -Basis of "thinking"

    • Latent Learning: putting together 2 separate experiences that have never occurred together, as if they had been instrumentally conditioned
    • (so taking an event that happened in the past and using that info later on, but in a different environment)
    • -may involve mental simulation
    • -again obvious biological advantages

    latent learning ex: a mouse in a maze gets shocked on a black square... later on, in a new maze, there is a black square and the mouse avoids it