
Monday, August 18, 2014

B. F. Skinner


From Wikipedia, the free encyclopedia
B. F. Skinner
Born: Burrhus Frederic Skinner, March 20, 1904, Susquehanna, Pennsylvania, United States
Died: August 18, 1990 (aged 86), Cambridge, Massachusetts, United States[1]
Nationality: American
Fields: Psychology, linguistics, philosophy
Institutions: University of Minnesota; Indiana University; Harvard University
Alma mater: Hamilton College; Harvard University
Known for: Operant conditioning
Influences: Charles Darwin, Ivan Pavlov, Ernst Mach, Jacques Loeb, Edward Thorndike, William James, Jean-Jacques Rousseau, Henry David Thoreau
Notable awards: National Medal of Science (1968)

Burrhus Frederic (B. F.) Skinner (March 20, 1904 – August 18, 1990) was an American psychologist, behaviorist, author, inventor, and social philosopher.[2][3][4][5] He was the Edgar Pierce Professor of Psychology at Harvard University from 1958 until his retirement in 1974.[6]

Skinner invented the operant conditioning chamber, also known as the Skinner Box.[7] He firmly believed that human free will is an illusion and that human action is shaped by the consequences of previous actions: if the consequences are bad, there is a high chance the action will not be repeated, whereas if the consequences are good, the actions that led to them are likely to be repeated.[8] He called this the principle of reinforcement.[9]

He developed his own philosophy of science, which he called radical behaviorism,[10] and founded his own school of experimental research psychology, the experimental analysis of behavior, coining the term operant conditioning. His analysis of human behavior culminated in his work Verbal Behavior, as well as his philosophical manifesto Walden Two, both of which[citation needed] have recently seen an enormous increase in interest, both experimentally and in applied settings.[11] Contemporary academia considers Skinner a pioneer of modern behaviorism, along with John B. Watson and Ivan Pavlov.

Skinner discovered and advanced the rate of response as a dependent variable in psychological research. He invented the cumulative recorder to measure rate of responding as part of his highly influential work on schedules of reinforcement.[12][13] In a June 2002 survey, Skinner was listed as the most influential psychologist of the 20th century.[14] He was a prolific author who published 21 books and 180 articles.[15][16]

Biography


The Skinners' grave at Mount Auburn Cemetery

Skinner was born in Susquehanna, Pennsylvania, to William and Grace Skinner. His father was a lawyer. Skinner became an atheist after a Christian teacher tried to assuage his fear of the hell that his grandmother had described.[17] His brother Edward, two and a half years younger, died at age sixteen of a cerebral hemorrhage. Skinner attended Hamilton College in New York with the intention of becoming a writer, where his intellectual attitude put him at a disadvantage with many of his fellow students.[18] While attending, he joined the Lambda Chi Alpha fraternity and wrote for the school paper, but as an atheist he was critical of the religious school he attended. After receiving his B.A. in English literature in 1926, he attended Harvard University, where he would later research, teach, and eventually become a prestigious board member. While at Harvard, he invented his prototype for the Skinner Box, and a fellow student, Fred Keller, convinced him that an experimental science could be made of the study of behavior; Skinner joined Keller in creating various instruments for small experiments.[18]

After graduation, Skinner unsuccessfully tried to write a great novel while living with his parents, a period he later called the Dark Years.[18] He soon became disillusioned with his literary skills, despite encouragement from the renowned poet Robert Frost, and concluded that he had little world experience and no strong personal perspective from which to write. His encounter with John B. Watson's Behaviorism led him into graduate study in psychology and to the development of his own operant behaviorism.[19]

Skinner received a Ph.D. from Harvard in 1931, and remained there as a researcher until 1936. He then taught at the University of Minnesota in Minneapolis and later at Indiana University, where he chaired the psychology department from 1946 to 1947, before returning to Harvard as a tenured professor in 1948. He remained at Harvard for the rest of his life. In 1973 Skinner was one of the signers of the Humanist Manifesto II.[20]

In 1936, Skinner married Yvonne Blue. The couple had two daughters, Julie (m. Vargas) and Deborah (m. Buzan). He died of leukemia on August 18, 1990,[21] and is buried in Mount Auburn Cemetery, Cambridge, Massachusetts.[22] Skinner continued to write and work until just before his death. A few days before Skinner died, he was given a lifetime achievement award by the American Psychological Association and delivered a 15-minute address concerning his work.[23]

A controversial figure, Skinner has been depicted in many different ways. Much of the criticism of Skinner derived from his penchant for applying findings established in the laboratory to human behavior at a universal level.

Theory

Skinner called his particular brand of behaviorism "radical" behaviorism.[24] Radical behaviorism is the philosophy of the science of behavior. It seeks to understand behavior as a function of environmental histories of reinforcing consequences. Such a functional analysis makes radical behaviorism capable of producing technologies of behavior (see Applied behavior analysis). This applied behaviorism lies on the opposite side of the ideological spectrum from the field of cognitive science. Unlike less austere behaviorism, it does not accept private events such as thinking, perceptions, and unobservable emotions in a causal account of an organism's behavior:
The position can be stated as follows: what is felt or introspectively observed is not some nonphysical world of consciousness, mind, or mental life but the observer's own body. This does not mean, as I shall show later, that introspection is a kind of psychological research, nor does it mean (and this is the heart of the argument) that what are felt or introspectively observed are the causes of the behavior. An organism behaves as it does because of its current structure, but most of this is out of reach of introspection. At the moment we must content ourselves, as the methodological behaviorist insists, with a person's genetic and environmental histories. What are introspectively observed are certain collateral products of those histories.
...
In this way we repair the major damage wrought by mentalism. When what a person does [is] attributed to what is going on inside him, investigation is brought to an end. Why explain the explanation? For twenty five hundred years people have been preoccupied with feelings and mental life, but only recently has any interest been shown in a more precise analysis of the role of the environment. Ignorance of that role led in the first place to mental fictions, and it has been perpetuated by the explanatory practices to which they gave rise.[25]
Skinner stood in opposition to humanistic psychology for his whole career, and denied that humans possess freedom and dignity, as argued in his book Beyond Freedom and Dignity.[26] Many of his theories were based on self-observation, which made him a proponent of behaviorism. Much of this self-observed theory stemmed from Thorndike's puzzle box, a direct antecedent of the Skinner Box. Skinner expanded on Thorndike's earlier work by introducing the concept of reinforcement into Thorndike's law of effect.[27] Skinner was an advocate of behavioral engineering and thought that people should be controlled through the systematic allocation of external rewards.[28] He believed that behavior is maintained from one situation to another through similar or identical consequences across those situations; in short, behaviors produce consequences and are in turn influenced by them. His contribution to the understanding of behavior influenced many other scientists in explaining social behavior and contingencies.[29]

Reinforcement is a central concept in behaviorism, and is seen as a primary mechanism in the shaping and control of behavior. A common misconception is that negative reinforcement is synonymous with punishment. This misconception is rather pervasive and is commonly found even in scholarly accounts of Skinner and his contributions. To be clear, while positive reinforcement is the strengthening of behavior by the application of some event (e.g., praise after some behavior is performed), negative reinforcement is the strengthening of behavior by the removal or avoidance of some aversive event (e.g., opening and raising an umbrella over your head on a rainy day is reinforced by the cessation of rain falling on you).

Both types of reinforcement strengthen behavior, or increase the probability of a behavior reoccurring; the difference is in whether the reinforcing event is something applied (positive reinforcement) or something removed or avoided (negative reinforcement). Punishment and extinction have the effect of weakening behavior, or decreasing the future probability of a behavior's occurrence, by the application of an aversive stimulus/event (positive punishment or punishment by contingent stimulation), removal of a desirable stimulus (negative punishment or punishment by contingent withdrawal), or the absence of a rewarding stimulus, which causes the behavior to stop (extinction).
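
The two-by-two distinction above (whether a stimulus is applied or removed, and whether the target behavior is strengthened or weakened) can be summarized in a few lines of code. The sketch below is purely illustrative and is not drawn from Skinner's writings; the function name and its string arguments are arbitrary choices.

    # Toy illustration of the four consequence types described above,
    # keyed by whether a stimulus is applied or removed and whether the
    # target behavior is strengthened or weakened.
    def classify_consequence(stimulus_change: str, effect_on_behavior: str) -> str:
        """stimulus_change: 'applied' or 'removed'; effect_on_behavior: 'strengthened' or 'weakened'."""
        table = {
            ("applied", "strengthened"): "positive reinforcement",
            ("removed", "strengthened"): "negative reinforcement",
            ("applied", "weakened"): "positive punishment",
            ("removed", "weakened"): "negative punishment",
        }
        return table[(stimulus_change, effect_on_behavior)]

    # The umbrella example: rain falling on you is removed and umbrella-raising
    # is strengthened, so the consequence is negative reinforcement.
    print(classify_consequence("removed", "strengthened"))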

Skinner also sought to understand the application of his theory in the broadest behavioral context as it applies to living organisms, namely natural selection.[30]

Schedules of reinforcement

Part of Skinner's analysis of behavior involved not only the power of a single instance of reinforcement, but the effects of particular schedules of reinforcement over time.
The most notable schedules of reinforcement presented by Skinner were interval (fixed or variable) and ratio (fixed or variable).
  • Continuous reinforcement: constant delivery of reinforcement for an action; every time a specific action is performed, the subject instantly and always receives a reinforcement. This method is impractical to use, and the reinforced behavior is prone to extinction.
  • Interval schedules: based on the time intervals between reinforcements.[31]
    • Fixed interval schedule (FI): an operant conditioning principle in which reinforcement is presented after a fixed time period, provided that the appropriate response is made.
    • Variable interval schedule (VI): an operant conditioning principle in which behavior is reinforced based on an average time that has elapsed since the last reinforcement.
Both FI and VI tend to produce slow, methodical responding because the reinforcements follow a time scale that is independent of how many responses occur.
  • Ratio schedules: based on the ratio of responses to reinforcements.[31]
    • Fixed ratio schedule (FR): an operant conditioning principle in which reinforcement is delivered after a specific number of responses have been made.
    • Variable ratio schedule (VR): an operant conditioning principle in which the delivery of reinforcement is based on a particular average number of responses (e.g., slot machines).
VR schedules produce slightly higher rates of responding than FR schedules because the organism does not know when the next reinforcement will come; the higher the ratio, the higher the response rate tends to be.[32]
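
As a rough illustration of how these schedules differ mechanically, the following sketch (written in Python) works out which responses would earn reinforcement under FR, VR, and FI rules for a subject assumed to respond about once per second. The specific parameters (a ratio of 5, a 10-second interval, a constant response rate) are assumptions chosen only for the example, not values taken from Skinner's experiments.

    import random

    def fixed_ratio(responses, n=5):
        """FR-n: reinforce every n-th response."""
        return [i for i in range(1, responses + 1) if i % n == 0]

    def variable_ratio(responses, mean_n=5, seed=0):
        """VR-mean_n: reinforce after a random number of responses averaging mean_n."""
        rng = random.Random(seed)
        reinforced, count = [], 0
        target = rng.randint(1, 2 * mean_n - 1)
        for i in range(1, responses + 1):
            count += 1
            if count >= target:
                reinforced.append(i)
                count, target = 0, rng.randint(1, 2 * mean_n - 1)
        return reinforced

    def fixed_interval(response_times, interval=10.0):
        """FI: reinforce the first response after `interval` seconds have elapsed."""
        reinforced, next_available = [], interval
        for t in response_times:
            if t >= next_available:
                reinforced.append(t)
                next_available = t + interval
        return reinforced

    if __name__ == "__main__":
        print("FR-5 reinforced responses:", fixed_ratio(20))
        print("VR-5 reinforced responses:", variable_ratio(20))
        print("FI-10s reinforced response times:", fixed_interval([float(t) for t in range(1, 31)]))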

Air crib

In an effort to help his wife cope with the day-to-day tasks of child rearing after the birth of their second child, Skinner thought he might be able to improve upon the standard crib. He invented the "air crib" to meet this challenge. An air crib[33] (also known as a "baby tender" or, humorously, as an "heir conditioner") is an easily cleaned, temperature- and humidity-controlled crib that Skinner designed to assist in the raising of babies.

Skinner designed this preliminary prototype of the air crib because he thought it would help parents who were awakened at night by babies crying from cold or from the need for clothing or sheets; he thought the crib would alleviate such "troublesome" environmental issues.[34] Despite allegations to the contrary, Skinner's daughter Deborah has said that she never felt abused or neglected by the use of the air crib.[27]

It was one of his most controversial inventions, and it was popularly mischaracterized as cruel and experimental.[35] The crib was often compared to his operant conditioning chamber, known crudely as the "Skinner Box," and this association with a system of experimentation and pellet rewards undermined its acceptance. In fact, the crib was designed to make early childcare simpler (by reducing laundry, diaper rash, cradle cap, etc.) while keeping the baby more confident, mobile, comfortable, and healthy, and therefore less prone to cry. Babies sleep, and will sometimes play, in air cribs, but apart from newborns, most of a baby's waking hours are spent out of the crib. Reportedly the crib had some success in these goals.[35] Air cribs were later manufactured commercially by several companies, without success.[35][36]

A 2004 book by Lauren Slater, entitled Opening Skinner's Box: Great Psychology Experiments of the Twentieth Century,[37] caused much controversy by mentioning the common rumors that Skinner had used his baby daughter Deborah in some of his experiments and that she had subsequently committed suicide. Although Slater's book stated that the rumors were false, Slater also allowed the reader to believe that Deborah had disappeared. A reviewer in The Observer in March 2004 then misquoted Slater's book as supporting the rumors. The review was read by Deborah Skinner (now Deborah Buzan, an artist and writer living in London), who wrote a vehement riposte in The Guardian.[38]

Operant conditioning chamber

While a researcher at Harvard,[39] B. F. Skinner invented the operant conditioning chamber, popularly referred to as the Skinner box,[40] to measure the responses of organisms (most often rats and pigeons) and their orderly interactions with the environment. The box had a lever and a food tray, and a hungry rat could get food delivered to the tray by pressing the lever. Skinner observed that when a rat was put in the box, it would wander around, sniffing and exploring, and would usually press the bar by accident, at which point a food pellet would drop into the tray. After that happened, the rate of bar pressing would increase dramatically and remain high until the rat was no longer hungry.
Skinner discovered that the consequences of an organism's behavior played a large role in how the organism responded in similar situations.[41] For instance, when the rat pressed the lever it received food, and it subsequently pressed the lever frequently.[42] Negative reinforcement was also exemplified by Skinner placing rats into an electrified chamber that delivered unpleasant shocks, with levers to cut the power placed inside. By running a current through the operant conditioning chamber, Skinner observed that the rats, after accidentally pressing the lever in a frantic bid to escape, quickly learned the effect of pressing it and consequently used the lever to stop the current both during and before an electric shock. These two learned responses are known as escape learning and avoidance learning.[27] The operant chamber for pigeons involves a plastic disc that the pigeon pecks in order to open a drawer filled with grain.[43] The Skinner box led to the principle of reinforcement: the probability of a behavior occurring again depends on its consequences.[44]
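
To make the lever-press account concrete, here is a toy simulation of the scenario just described, written in Python. The learning rule (nudging the press probability up after each reinforced press and letting it drift down otherwise) and all numeric values are illustrative assumptions, not a model Skinner himself proposed.

    import random

    def simulate_skinner_box(steps=200, seed=1):
        """Toy sketch: a rat occasionally presses a lever; food follows each press."""
        rng = random.Random(seed)
        p_press = 0.05          # initially the rat presses the lever only by accident
        presses = 0
        for _ in range(steps):
            if rng.random() < p_press:
                presses += 1
                p_press = min(1.0, p_press + 0.1)    # press followed by food: behavior strengthened
            else:
                p_press = max(0.01, p_press - 0.001) # unreinforced time: behavior slowly weakens
        return presses, p_press

    total, final_p = simulate_skinner_box()
    print(f"lever presses: {total}, final press probability: {final_p:.2f}")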

This device was one example of Skinner's lifelong ability to invent useful devices, which ranged from whimsical gadgets in his childhood[45] to the cumulative recorder, used to measure the rate of response of organisms in an operant chamber. Even in old age, Skinner invented a "Thinking Aid" to assist in writing.[46]

Cumulative recorder

The cumulative recorder is an instrument used to record behavior graphically and automatically. Its graphing mechanism consisted of a rotating drum of paper equipped with a marking needle. The needle started at the bottom of the page, and the drum turned the roll of paper horizontally at a constant speed; each response stepped the needle up slightly, so the slope of the resulting line reflected the rate of responding.
The cumulative recorder was used with the Skinner box to record the rat's behavior,[39] and it produced consistent and accurate records of behavior.[39]
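
The following sketch shows, in plain Python, what a cumulative record encodes: time along one axis, cumulative response count along the other, with steeper segments indicating faster responding. The response times used here are made up purely for illustration.

    def cumulative_record(response_times, duration, bin_width=1.0):
        """Return (time, cumulative responses) pairs sampled every bin_width seconds."""
        record, count, idx = [], 0, 0
        times = sorted(response_times)
        t = 0.0
        while t <= duration:
            while idx < len(times) and times[idx] <= t:
                count += 1
                idx += 1
            record.append((t, count))
            t += bin_width
        return record

    # A burst of fast responding, a pause, then slower responding.
    responses = [0.5, 0.8, 1.1, 1.3, 1.6, 2.0, 7.5, 9.0]
    for t, total in cumulative_record(responses, duration=10.0):
        print(f"t={t:4.1f}s  cumulative responses={total:2d}  " + "#" * total)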

Teaching machine


The teaching machine, a mechanical invention to automate the task of programmed instruction

The teaching machine was a mechanical device whose purpose was to administer a curriculum of programmed instruction. In one incarnation, it housed a list of questions, and a mechanism through which the learner could respond to each question. Upon delivering a correct answer, the learner would be rewarded.[47]

Skinner advocated the use of teaching machines for a broad range of students (e.g., preschool aged to adult) and instructional purposes (e.g., reading and music). Another of the multiple machines he envisioned could teach rhythm:
A relatively simple device supplies the necessary contingencies. The student taps a rhythmic pattern in unison with the device. "Unison" is specified very loosely at first (the student can be a little early or late at each tap) but the specifications are slowly sharpened. The process is repeated for various speeds and patterns. In another arrangement, the student echoes rhythmic patterns sounded by the machine, though not in unison, and again the specifications for an accurate reproduction are progressively sharpened. Rhythmic patterns can also be brought under the control of a printed score. (Skinner, 1961. Teaching machines. Scientific American, 205, 90-112. doi:10.2307/1926170, p. 381).
The teaching machine had such instructional potential because it provided immediate and regular reinforcement that maintained students’ interest, as the “material in the machine [was] always novel” (Skinner, 1961, p. 387). In this way, a student’s attention could be maintained without the use of aversive controls. The efficiency of the teaching machine resulted from its automatic provision of reinforcement, individualized pace setting, and a coherent instructional sequence for the student. It engaged students and allowed them to learn by doing.

Teaching machines, though perhaps rudimentary, were not rigid instruments of instruction. They could be adjusted and improved based upon reports of students' performance. For example, if a student's report showed numerous incorrect responses, the machine could be reprogrammed to provide less advanced prompts or questions, the idea being that students acquire behaviors most efficiently when their error rate is minimized. Along these lines, multiple-choice formats were not well suited for teaching machines, because contingencies of reinforcement would be left to chance; moreover, this format could increase student mistakes and induce erroneous behaviors.
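
As a rough illustration of the adjustment loop just described (present a frame, check the constructed response immediately, and fall back to less advanced material when errors accumulate), consider the Python sketch below. The frames, their content, and the error threshold are hypothetical placeholders, not material from Skinner's machines.

    # Hypothetical programmed-instruction frames, from less to more advanced.
    FRAMES = {
        "basic":    [("2 + 2 = ?", "4"), ("3 + 1 = ?", "4")],
        "advanced": [("12 x 3 = ?", "36"), ("45 / 9 = ?", "5")],
    }

    def run_session(answers, level="advanced", max_errors=1):
        """`answers` is the learner's constructed responses, consumed in order."""
        errors, transcript = 0, []
        answer_iter = iter(answers)
        for prompt, correct in FRAMES[level]:
            given = next(answer_iter, "")
            if given == correct:
                transcript.append((prompt, given, "reinforced"))   # immediate confirmation
            else:
                errors += 1
                transcript.append((prompt, given, "incorrect"))
                if errors > max_errors and level != "basic":
                    # Too many errors: switch to the less advanced frames.
                    return transcript + run_session(list(answer_iter), "basic", max_errors)
        return transcript

    for step in run_session(["36", "4"]):
        print(step)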

Not only useful in teaching explicit skills, machines could also promote the development of a repertoire of behaviors that Skinner called self-management. Self-management refers to how students think, that is, how they attend to the environment with a view to responding appropriately to stimuli. Machines give students the opportunity to pay attention first and receive a reward as reinforcement afterward. This is in stark contrast with what Skinner saw as common classroom practice: initially capturing students' attention (e.g., with a lively video) and delivering a reward (e.g., entertainment) before they have actually attended, a practice that works against the development of self-management and fails to apply reinforcement to the correct behavior.

What Skinner referred to as a teaching machine would probably be akin to a computer software program today that provides highly structured and incremental instruction. Skinner's influence on such machines is undeniable: he was the first to pioneer the use of machines in the classroom, especially at the primary level, and teaching machines such as language labs have since been incorporated into modern education. Though it was just one of a number of his inventions, the teaching machine embodies much of Skinner's theory of learning and has wide-reaching implications for education in general and classroom instruction in particular.[48]

There has been a resurgence of interest in the notion of the teaching machine and its relationship to the adaptive learning systems of the early 21st century.[49]

Pigeon-guided missile

The US Navy required a weapon effective against the German Bismarck-class battleships. Although missile and TV technology existed, the size of the primitive guidance systems available rendered any such weapon ineffective. Project Pigeon[50][51] was potentially an extremely simple and effective solution, but despite an effective demonstration it was abandoned when more conventional solutions, in particular radar, became available. The project centered on dividing the nose cone of a missile into three compartments and encasing a pigeon in each. Each compartment used a lens to project an image of what was in front of the missile onto a screen. The pigeons would peck toward the object on the screen, thereby directing the missile.[52]
Skinner complained "our problem was no one would take us seriously."[53] The point is perhaps best explained in terms of human psychology (i.e., few people would trust a pigeon to guide a missile no matter how reliable it proved).[54]

 Verbal summator

Skinner experimented with a device for discovering "latent speech," which he called the "verbal summator."[55] Although Skinner was not a supporter of personal assessment or projective testing, he had in effect created a tool similar to an auditory version of the Rorschach inkblots.[55] The device was designed to elicit subconscious thoughts, much as the inkblots were, and Skinner used it to generate data for his theory of verbal behavior. Other researchers later saw the device as an opportunity for both basic and applied research. The concept of the verbal summator sparked interest within the scientific community and eventually led to other tests, including the "tautophone test, the auditory apperception test, and the Azzageddi test," and has inspired many others.[55]

Verbal Behavior

Challenged by Alfred North Whitehead, during a casual discussion while at Harvard, to provide an account of a randomly chosen piece of verbal behavior,[56] Skinner set about attempting to extend his then-new functional, inductive approach to the complexity of human verbal behavior.[57]
Developed over two decades, his work appeared as the culmination of the William James Lectures in the book Verbal Behavior. Although Noam Chomsky was highly critical of Verbal Behavior, he conceded that its reputation as "S-R psychology" was a reason for giving it "a review"[58] (a label Skinner's system most certainly did not fit: in operant conditioning an organism emits a response (R), which becomes more or less likely in the future depending on its consequence (C), often in the presence of an antecedent stimulus (S)). Verbal Behavior had an uncharacteristically slow reception, partly as a result of Chomsky's review and partly because of Skinner's failure to address or rebut any of Chomsky's criticisms.[59] Skinner's peers may also have been slow to adopt and consider the ideas in Verbal Behavior because it lacked the experimental evidence and empirical density that marked Skinner's previous work.[60] However, Skinner's functional analysis of verbal behavior has seen a resurgence of interest in applied settings.[61]

Influence on education

Skinner influenced education as well as psychology in both his ideology and literature. In Skinner’s view, education has two major purposes: (1) to teach repertoires of both verbal and nonverbal behavior; and (2) to encourage students to display an interest in instruction. He endeavored to bring students’ behavior under the control of the environment by reinforcing it only when particular stimuli were present. Because he believed that human behavior could be affected by small consequences, something as simple as “the opportunity to move forward after completing one stage of an activity” could prove reinforcing (Skinner, 1961, p. 380). Skinner favored active learning in the sense that students were not merely passive recipients of information doled out by teachers. He was convinced that a student had to take action; “to acquire behavior, the student must engage in behavior” (Skinner, 1961, p. 389).

Moreover, Skinner was quoted as saying "Teachers must learn how to teach ... they need only to be taught more effective ways of teaching." Skinner asserted that positive reinforcement is more effective at changing and establishing behavior than punishment, with obvious implications for the then widespread practice of rote learning and punitive discipline in education. Skinner also suggests that the main thing people learn from being punished is how to avoid punishment.

In The Technology of Teaching, Skinner has a chapter on why teachers fail (pages 93–113). Essentially, he says that teachers have not been given an in-depth understanding of teaching and learning. Without knowing the science underpinning teaching, teachers fall back on procedures that work poorly or not at all, such as:
  • using aversive techniques (which produce escape and avoidance and undesirable emotional effects);
  • relying on telling and explaining ("Unfortunately, a student does not learn simply when he is shown or told." p. 103);
  • failing to adapt learning tasks to the student's current level;
  • failing to provide positive reinforcement frequently enough.
Skinner suggests that any age-appropriate skill can be taught. The steps are
  1. Clearly specify the action or performance the student is to learn to do.
  2. Break down the task into small achievable steps, going from simple to complex.
  3. Let the student perform each step, reinforcing correct actions.
  4. Adjust so that the student is always successful until finally the goal is reached.
  5. Transfer to intermittent reinforcement to maintain the student's performance.
Skinner's views on education are extensively presented in his book The Technology of Teaching. They are also reflected in Fred S. Keller's Personalized System of Instruction and Ogden R. Lindsley's Precision Teaching.

Skinner associated punishment with avoidance. For example, he thought a child might be forced to practice an instrument as a form of seemingly productive discipline; the child would then associate practicing with punishment and learn to hate and avoid playing the instrument. Likewise, teachers who use educational activities to punish children could produce inclinations toward rebellious behavior such as vandalism and opposition to education.[34]

Walden Two and Beyond Freedom and Dignity

Skinner is popularly known mainly for his books Walden Two and Beyond Freedom and Dignity, for which he made the cover of TIME Magazine.[62] The former describes a visit to a fictional "experimental community"[63] in 1940s United States, where the productivity and happiness of the citizens are far greater than in the outside world because of the community's practice of scientific social planning and its use of operant conditioning in the raising of children.

Walden Two, like Thoreau's Walden, champions a lifestyle that does not support war or foster competition and social strife. It encourages a lifestyle of minimal consumption, rich social relationships, personal happiness, satisfying work and leisure.[64] In 1967, Kat Kinkade founded the intentional community Twin Oaks, using Walden Two as a blueprint. The community is still in existence today and continues to use the Planner-Manager system and other aspects of the original book in its self-governance.

In Beyond Freedom and Dignity, Skinner suggests that a technology of behavior could help to make a better society. We would, however, have to accept that an autonomous agent is not the driving force of our actions. Skinner offers alternatives to punishment and challenges his readers to use science and modern technology to construct a better society.

Political views

Skinner's political writings emphasized his hope that an effective and humane science of behavioral control, a technology of human behavior, could help solve problems left unsolved by earlier approaches or aggravated by advances in technology such as the atomic bomb. One of Skinner's stated goals was to prevent humanity from destroying itself.[65] He saw political control as either aversive or non-aversive, both serving the purpose of controlling a population. Skinner supported the use of positive reinforcement as a means of control, citing Jean-Jacques Rousseau's novel Emile: or, On Education as an example of freedom literature that "did not fear the power of positive reinforcement".[3]
Skinner's book, Walden Two, presents a vision of a decentralized, localized society, which applies a practical, scientific approach and futuristically advanced behavioral expertise to peacefully deal with social problems. Skinner's utopia is both a thought experiment and a rhetorical piece. In his book, Skinner answers the problem that exists in many utopian novels – "What is the Good Life?" In Walden Two, the answer is a life of friendship, health, art, a healthy balance between work and leisure, a minimum of unpleasantness, and a feeling that one has made worthwhile contributions to a society in which resources are ensured, in part, by a lack of consumption.
If the world is to save any part of its resources for the future, it must reduce not only consumption but the number of consumers.
—B. F. Skinner,  Walden Two, p. xi.
This world ethos was to be achieved through behavioral technology, which could offer alternatives to coercion,[3] as good science applied correctly would help society[4] and allow all people to cooperate with each other peacefully.[3] Skinner described his novel as "my New Atlantis", in reference to Bacon's utopia.[66] He opposed corporal punishment in schools, and wrote a letter to the California Senate that helped lead to a ban on spanking.[67]
When Milton's Satan falls from heaven, he ends in hell. And what does he say to reassure himself? 'Here, at least, we shall be free.' And that, I think, is the fate of the old-fashioned liberal. He's going to be free, but he's going to find himself in hell.
—B. F. Skinner,  from William F. Buckley Jr, On the Firing Line, p. 87.

Superstitious pigeons

One of Skinner's experiments examined the formation of superstition in one of his favorite experimental animals, the pigeon. Skinner placed a series of hungry pigeons in a cage attached to an automatic mechanism that delivered food to the pigeon "at regular intervals with no reference whatsoever to the bird's behavior." He discovered that the pigeons associated the delivery of the food with whatever chance actions they had been performing as it was delivered, and that they subsequently continued to perform these same actions.[68]
One bird was conditioned to turn counter-clockwise about the cage, making two or three turns between reinforcements. Another repeatedly thrust its head into one of the upper corners of the cage. A third developed a 'tossing' response, as if placing its head beneath an invisible bar and lifting it repeatedly. Two birds developed a pendulum motion of the head and body, in which the head was extended forward and swung from right to left with a sharp movement followed by a somewhat slower return.[69][70]
Skinner suggested that the pigeons behaved as if they were influencing the automatic mechanism with their "rituals" and that this experiment shed light on human behavior:
The experiment might be said to demonstrate a sort of superstition. The bird behaves as if there were a causal relation between its behavior and the presentation of food, although such a relation is lacking. There are many analogies in human behavior. Rituals for changing one's fortune at cards are good examples. A few accidental connections between a ritual and favorable consequences suffice to set up and maintain the behavior in spite of many unreinforced instances. The bowler who has released a ball down the alley but continues to behave as if she were controlling it by twisting and turning her arm and shoulder is another case in point. These behaviors have, of course, no real effect upon one's luck or upon a ball half way down an alley, just as in the present case the food would appear as often if the pigeon did nothing—or, more strictly speaking, did something else.[69]
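
The fixed-time procedure Skinner describes can be sketched in a few lines of Python: food arrives on a clock regardless of what the bird does, and whichever action happens to coincide with delivery is strengthened. The strengthening rule and the action names below are illustrative assumptions, not Skinner's own analysis.

    import random

    ACTIONS = ["turn", "head-thrust", "toss", "peck-floor"]

    def superstition(ticks=600, interval=15, seed=2):
        """Food every `interval` ticks, independent of behavior; coincident actions get stronger."""
        rng = random.Random(seed)
        weights = {a: 1.0 for a in ACTIONS}
        for t in range(1, ticks + 1):
            action = rng.choices(ACTIONS, weights=[weights[a] for a in ACTIONS])[0]
            if t % interval == 0:
                weights[action] += 1.0   # adventitious reinforcement of whatever just occurred
        return weights

    print(superstition())   # often one arbitrary action ends up dominating
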
Modern behavioral psychologists have disputed Skinner's "superstition" explanation for the behaviors he recorded. Subsequent research (e.g., Staddon and Simmelhag, 1971), while finding similar behavior, failed to find support for Skinner's "adventitious reinforcement" explanation of it. By looking at the timing of different behaviors within the interval, Staddon and Simmelhag were able to distinguish two classes of behavior: the terminal response, which occurred in anticipation of food, and interim responses, which occurred earlier in the interfood interval and were rarely contiguous with food. Terminal responses seem to reflect classical (as opposed to operant) conditioning rather than adventitious reinforcement, guided by a process like that observed in 1968 by Brown and Jenkins in their "autoshaping" procedures. The causation of interim activities (such as the schedule-induced polydipsia seen in a similar situation with rats) also cannot be traced to adventitious reinforcement, and its details are still obscure (Staddon, 1977).[71]

This experiment was also repeated on humans, in a less controlled manner, on the popular British TV series Trick or Treat, leading to similar conclusions to Skinner.[72]

B.F. Skinner Quotations

"I do not admire myself as a person. My successes do not override my shortcomings"[73]

"We must delegate control of the population as a whole to specialists—to police, priests, teachers, therapies, and so on, with their specialized reinforcers and their codified contingencies"[74]

"It is a mistake to suppose that the whole issue is how to free man. The issue is to improve the way in which he is controlled"[75]

"Education is what survives when what has been learnt has been forgotten"[76]

"As the senses grow dull, the stimulating environment becomes less clear. When reinforcing consequences no longer follow, we are bored, discouraged and depressed."[73]

Criticism

J. E. R. Staddon

As understood by Skinner, ascribing dignity to individuals involves giving them credit for their actions. To say "Skinner is brilliant" means that Skinner is an originating force. If Skinner's determinist theory is right, he is merely the focus of his environment. He is not an originating force and he had no choice in saying the things he said or doing the things he did. Skinner's environment and genetics both allowed and compelled him to write his book. Similarly, the environment and genetic potentials of the advocates of freedom and dignity cause them to resist the reality that their own activities are deterministically grounded. J. E. R. Staddon (The New Behaviorism, 2001) has argued the compatibilist position, that Skinner's determinism is not in any way contradictory to traditional notions of reward and punishment, as he believed.[77]

Noam Chomsky

Perhaps Skinner's best-known critic, Noam Chomsky, published a review of Skinner's Verbal Behavior two years after its publication.[78] The 1959 review became better known than the book itself.[79] Chomsky's review has been credited with launching the cognitive movement in psychology and other disciplines. Skinner, who rarely responded directly to critics, never formally replied to Chomsky's critique. Many years later, Kenneth MacCorquodale's reply[80] was endorsed by Skinner.
Chomsky also reviewed Skinner's Beyond Freedom and Dignity, using much the same line of argument as in his Verbal Behavior review. Among Chomsky's criticisms were that Skinner's laboratory work could not be extended to humans; that when it was extended to humans it represented "scientistic" behavior, attempting to emulate science without being scientific; that Skinner was not a scientist because he rejected the hypothetico-deductive model of theory testing; and that Skinner had no science of behavior.[81]

Psychodynamic psychology

Skinner has been repeatedly criticized for his supposed animosity towards Freud, psychoanalysis, and psychodynamic psychology. There is clear evidence, however, that Skinner shared several of Freud's assumptions, and that he was influenced by Freudian points of view in more than one field, among them the analysis of defense mechanisms, such as repression.[82] To study such phenomena, Skinner even designed his own projective test.[83]

TV Guide

In the controversial books Beyond Freedom and Dignity (1971) and Walden Two (1948), Skinner laid out his vision of a utopian society in which behavior is controlled by the judicious application of the principle of reinforcement.[84] In those books, he put forth the simple but stunning claim that our subjective sense of free will is an illusion and that when we think we are exercising free will, we are actually responding to present and past patterns of reinforcement. We do things in the present that have been rewarding in the past, and our sense of "choosing" to do them is nothing more than an illusion. Skinner argued that his insights could be used to increase human well-being and solve social problems. Not surprisingly, that claim sparked an outcry from critics who believed that Skinner was denying one of our most cherished attributes, free will, and calling for a repressive society that would manipulate people for its own ends. The criticism extended to TV Guide, which featured an interview with Skinner and called his ideas "the taming of mankind through a system of dog obedience schools for all."[citation needed]

List of awards and positions

  • 1926 A.B., Hamilton College
  • 1930 M.A., Harvard University
  • 1930−1931 Thayer Fellowship
  • 1931 Ph.D., Harvard University
  • 1931−1932 Walker Fellowship
  • 1931−1933 National Research Council Fellowship
  • 1933−1936 Junior Fellowship, Harvard Society of Fellows
  • 1936−1937 Instructor, University of Minnesota
  • 1937−1939 Assistant Professor, University of Minnesota
  • 1939−1945 Associate Professor, University of Minnesota
  • 1942 Guggenheim Fellowship (postponed until 1944-1945)
  • 1942 Howard Crosby Warren Medal, Society of Experimental Psychologists
  • 1945−1948 Professor and Chair, Indiana University
  • 1947−1948 William James Lecturer, Harvard University
  • 1948−1958 Professor, Harvard University
  • 1949−1950 President of the Midwestern Psychological Association
  • 1954−1955 President of the Eastern Psychological Association
  • 1958 Distinguished Scientific Contribution Award, American Psychological Association
  • 1958−1974 Edgar Pierce Professor of Psychology, Harvard University
  • 1964−1974 Career Award, National Institute of Mental Health
  • 1966 Edward Lee Thorndike Award, American Psychological Association
  • 1966−1967 President of the Pavlovian Society of North America
  • 1968 National Medal of Science, National Science Foundation
  • 1969 Overseas Fellow in Churchill College, Cambridge
  • 1971 Gold Medal Award, American Psychological Foundation
  • 1971 Joseph P. Kennedy, Jr., Foundation for Mental Retardation International award
  • 1972 Humanist of the Year, American Humanist Association
  • 1972 Creative Leadership in Education Award, New York University
  • 1972 Career Contribution Award, Massachusetts Psychological Association
  • 1974−1990 Professor of Psychology and Social Relations Emeritus, Harvard University
  • 1978 Distinguished Contributions to Educational Research Award and Development, American Educational Research Association
  • 1978 National Association for Retarded Citizens Award
  • 1985 Award for Excellence in Psychiatry, Albert Einstein School of Medicine
  • 1985 President's Award, New York Academy of Science
  • 1990 William James Fellow Award, American Psychological Society
  • 1990 Lifetime Achievement Award, American Psychology Association
  • 1991 Outstanding Member and Distinguished Professional Achievement Award, Society for Performance Improvement
  • 1997 Scholar Hall of Fame Award, Academy of Resource and Development

Honorary degrees

Skinner received honorary degrees from:

In popular culture

The Simpsons writer Jon Vitti named the character Principal Skinner after behavioral psychologist B. F. Skinner.[85]

Bibliography
