
B.F. Skinner, The Behavior of Organisms (1938), Science and Human Behavior (1953)
rejects implicit "S-O-R" psychology of Hull and the classical behaviorists: no
appeals to implicit unobservable physiological responses inside the organism, or to
underlying neural connections in organism; rejects "S-C-R" psychology of Tolman: no
appeals to intervening cognitive phenomena or mental states; accepts only
observable "S" and "R" events, moreso than any other behaviorist: "empty
organism" view
view of science: no appeal to any kind of theory proposing an underlying cause
of behavior - method is to just catalog and summarize observations about behavior;
similar to Hume's view of causation: no knowable "cause-effect" relation, just
observation of certain events reliably following others
Skinner vs. Thorndike on operant conditioning: (1) Skinner assumes no neural model
or brain states explaining S-R connections; (2) Skinner does not believe
reinforcement strengthens an S-R connection - responses are not caused by stimuli,
but rather are selected and produced for their reinforcing consequences
Skinner's operant conditioning:
1) goal is perfect prediction / control of behavior; emphasis on technology /
engineering of behavior (practical) rather than on science / explanation of behavior
(theoretical); lawfulness must be found in the individual, not in groups of subjects;
any individual differences remaining in behavior must be due to differences in reinforcement
histories
2) cumulative record: a learning curve plotting cumulative number of responses
against time (so it can only go up or stay flat) - slope is "response rate", the main
Skinnerian dependent variable; emphasizes maintenance of behavior: the end
product of learning rather than the actual process of learning (how many responses
were made in total thus far)
"Skinner box" captures lots of behavior in little time with little fatigue;
response is bar press for rats, key peck for pigeons
The apparatus is easily modified across experiments and automatically records
many responses on the cumulative record; response rate is how the animal's
performance is ultimately measured (see the sketch after this point).
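a minimal illustrative sketch (not from the notes; the timestamps and function names are made up) of how a cumulative record could be built and response rate read off as its slope:

    # hypothetical response timestamps in seconds (made-up data)
    response_times = [2.0, 3.5, 4.1, 7.8, 8.0, 8.3, 12.9]

    def cumulative_record(times):
        # (time, cumulative count) points - the curve can only rise or stay flat
        return [(t, i + 1) for i, t in enumerate(sorted(times))]

    def response_rate(times, start, end):
        # slope of the record over [start, end]: responses per second
        n = sum(start <= t <= end for t in times)
        return n / (end - start)

    print(cumulative_record(response_times))
    print(response_rate(response_times, 0.0, 13.0))   # about 0.54 responses/s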
3) reinforcement increases rate of responding; positive = delivering a stimulus the
animal "wants" (e.g., food), negative = taking away a stimulus the animal "doesn't
want" (e.g., shock)
A reinforcer is defined operationally as anything that increases the rate of
responding: if the rate of responding increases, the consequence counts as a reinforcer.
punishment decreases rate of responding; positive = delivering a stimulus the
animal "doesn't want" (e.g., shock), negative = taking away a stimulus the animal
"wants" (e.g., parental attention in "time-out" procedure)

according to Skinner, punishment causes at best a temporary suppression of responding
note: reinforcement and punishment are both defined solely in terms of their
effect on behavior (and not in terms of "drive reduction", "goals", etc.): anything
that increases the rate of responding is considered a reinforcer, anything that
decreases the rate of responding is considered a punishment
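a tiny sketch (my illustration, not Skinner's, with made-up rates) of this purely behavioral definition - a consequence is classified only by the observed change in response rate:

    def classify_consequence(rate_before, rate_after):
        # classify a consequence solely by its effect on response rate
        if rate_after > rate_before:
            return "reinforcer"   # responding went up
        if rate_after < rate_before:
            return "punisher"     # responding went down
        return "neutral"          # no change in rate

    print(classify_consequence(rate_before=5.0, rate_after=12.0))  # reinforcer
    print(classify_consequence(rate_before=5.0, rate_after=1.0))   # punisher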
4) response - molar: an "operant" is a class of behaviors which includes any
response that is controlled by the reinforcement, i.e., any response that brings
about a given consequence; same behavior may be instance of different operants in
different contexts
"superstition" in pigeons develops when some behavior
is accidentally reinforced and then controlled by its apparent consequences
5) stimulus - event correlated with the production of a response; stimulus
is occasion for, not cause of, response
"stimulus control": discriminative stimulus SD indicates response will be
reinforced; S∆ (S-delta) is the stimulus indicating the response will not be reinforced;
example: light turned on or off in Skinner box - bar press only reinforced when light
is on
6) conditioned reinforcement ("secondary reinforcement" for Hull) - a stimulus
associated with reinforcement eventually becomes reinforcing itself; works like
higher order conditioning: must be backed up with primary reinforcement or
extinction will result
in a Skinner box, "magazine training" is the first step - click of "magazine"
(food delivery mechanism) when food is delivered acts as SD for response of
approaching the food tray; because the click always accompanies food delivery, it
becomes a conditioned reinforcer; food may not be consumed immediately, but
click does follow response immediately (improves reinforcement timing)
generalized reinforcer - a stimulus associated with many primary reinforcers,
not tied to any particular motivational state - for example, money, social approval,
etc.
chaining: note that all SD are conditioned reinforcers because responding in
their presence always leads to reinforcement!; thus a complex behavior pattern can
be conceived as a chain of simple responses: each response "link" reveals a new
SD which indicates that the next response will be reinforced, while at the same time
acting as an SR (reinforcement) for the previous response
7) shaping - method for producing new responses in an animal, consisting of
differential reinforcement of successive approximations to a desired response; using
shaping techniques, pigeons have been taught to play ping-pong, a response that is
obviously not in their initial operant repertoire; whereas stimulus control is based on
discrimination of stimuli, shaping is based on discrimination of responses
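a crude illustrative simulation of shaping (my sketch, not from the notes; all numbers are arbitrary): only responses meeting a criterion are reinforced, and the criterion is raised gradually toward the target:

    import random

    target = 100.0     # desired response (e.g., bar-press force, arbitrary units)
    criterion = 15.0   # initially, any response at or above this is reinforced
    behavior = 10.0    # the animal's current typical response

    for trial in range(500):
        response = behavior + random.gauss(0, 5)    # responses vary from trial to trial
        if response >= criterion:                   # differential reinforcement:
            behavior = max(behavior, response)      #   reinforced variants become more typical (crudely modeled)
            criterion = min(target, criterion + 2)  #   and the criterion moves toward the target
    print(round(behavior), round(criterion))        # behavior typically ends near the target; criterion reaches 100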

8) schedules of reinforcement - partial reinforcement effect says that response is
stronger when animal is not reinforced on every trial; measuring strength of the
response by its resistance to extinction, the basic schedules in order of increasing
effectiveness are: CR (continuous reinforcement; ex.: soda machine); FI (fixed
interval; ex.: cramming for quizzes; "scallop" occurs because time is an SD); VI
(variable interval; ex.: checking e-mail; no scallop); FR (fixed ratio; ex.:
piecework); VR (variable ratio; ex.: slot machine)
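a minimal sketch (my illustration, with made-up parameter values) of how the ratio and interval schedules decide whether a given response is reinforced; VI would be the FI logic with a randomly varying interval:

    import random

    def fixed_ratio(n):
        # FR n: reinforce every nth response
        count = 0
        def respond():
            nonlocal count
            count += 1
            if count >= n:
                count = 0
                return True
            return False
        return respond

    def variable_ratio(mean_n):
        # VR mean_n: reinforce after a randomly varying number of responses
        count, required = 0, random.randint(1, 2 * mean_n - 1)
        def respond():
            nonlocal count, required
            count += 1
            if count >= required:
                count, required = 0, random.randint(1, 2 * mean_n - 1)
                return True
            return False
        return respond

    def fixed_interval(t):
        # FI t: reinforce the first response made at least t seconds after the last reinforcement
        last = 0.0
        def respond(now):
            nonlocal last
            if now - last >= t:
                last = now
                return True
            return False
        return respond

    press = fixed_ratio(5)
    print([press() for _ in range(12)])   # True on every 5th press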
9) other learning phenomena as treated by Skinner:

motivation: no "drive-reduction" or other theoretical entity is hypothesized there is just an empirical observation that food-deprived rats respond at a higher
rate for food reinforcement
extinction: not necessarily the disappearance of a response, but rather a return
to the response's "operant level" (the rate at which the response appears without
any reinforcement)
spontaneous recovery: recognized as an empirical phenomenon without much
explanation of its mechanism
generalization: when a stimulus complex sets the occasion for a response, the
response also occurs when the animal encounters stimulus complexes which share
elements with the original stimulus or are otherwise similar
inhibition: the unobserved theoretical entity / intervening variable employed by
Pavlov and Hull plays no role for Skinner - there is only a tendency to produce one
response or another, depending on reinforcement history
