
Saturday, January 7, 2017

Struggling with Radical Behaviorism: Ideological Barriers to Mainstream Acceptance

I used to be more interested in consciousness.  The question of what it was and how it happened seemed fundamental to understanding why we do what we do.  The "problem" of consciousness was key to the question of free will, on which all broader questions of social politics seemed to hinge.
It was a decades-long, rambling trip which ultimately - quite by chance - led me to behaviorism, the actual science of behavior, which generally puts this question to bed.  Or at least tucks it in nicely. Not that the explanation is complete, but there is plenty of basic science from which to derive a solid foundation on the matter.

Of course, this understanding is far from mainstream, for a variety of reasons.  In the main, it is an unintuitive understanding: "I" plainly choose my behavior, do I not?  Free will seems self-evident. But as is often the case with "common-sense" intuition, this evidence is a cultural construct.  We live in a world in which the individual is assumed to be the master of his own destiny.  In the majority of Judeo-Christian religions, the common interpretation views man as a free actor in a morality play, choosing between the temptations of the devil and religious teaching, each moment the crux of an epic, metaphysical struggle.  Our legal system follows suit, as it has tended to since its founding: the "guilty" is he who could have acted differently but chose not to.  Our economic system also follows, assuming the profits of man's economic actions to be his own responsibility - whether they leave him destitute or in gilded chambers.

The intuition-based concept of the Free man is thus reinforced through social institutions at every level. But the meat of the intuition, fundamental to these larger structures, is a philosophical game we have all learned to play.  Behaviorists call it "mentalism", and it is as essential to our early formation as the milk in our baby bottles.  In his paper Behavior Analysis, Mentalism, and the Path to Social Justice (2003), Jay Moore writes:

...Mentalism may be defined as an approach to the study of behavior which assumes that a mental or "inner" dimension exists that differs from a behavioral dimension. This dimension is ordinarily referred to in terms of its neural, psychic, spiritual, subjective, conceptual, or hypothetical properties. Mentalism further assumes that phenomena in this dimension either directly cause or at least mediate some forms of behavior, if not all.

Examples of mentalism are rife in our language.  People get in fights because they are "angry".  People don't do their work because they are "lazy".  People do great things because they are "driven".  The list of adjectives supposedly describing causative inner states is endless.  People act because they are: smart, dumb, ambitious, shy, calculating, cruel, evil, compassionate, kind, generous, stingy, clever, funny, quiet, rambunctious, etc.

Yet what are these words actually describing?  People certainly behave in ways that have these characteristics.  However, this is not an explanation but rather a description of past behavior, and an educated guess as to how they might behave in the future given similar circumstances.  The problem with mentalisms is that they can easily become circular:  a person acts a certain way, is described with a mentalistic term, and the term is then purported to be the cause of the behavior.

The so-called "cognitive revolution" in the social sciences, ushered in by Noam Chomsky's (1959) famously vicious critique of Skinner's landmark work, Verbal Behavior, was predicated on the notion that mental events are indeed causative.  To this day, cognitivists use the architectural language of the personal computer to seek out causation, hypothesizing mental events in computational terms like memory, processing and algorithms.  To Skinner, however, all of this is merely further description.  Even if one were to develop a precise catalogue of every possible rotation of the smallest molecular particle involved in the process of, say, my daydreaming about fishing for trout, it would still have nothing to say about what actually causes my thinking behavior.

Here, the behaviorist has the advantage of being informed by science - specifically, the science of behavior.  A core principle of radical behaviorism is that a science of behavior is possible.  That is, behavior is a deterministic process which can be understood without appealing to non-physical events.  In short, to quote William Baum (1994), "A science of human behavior is possible". To the behaviorist, the structure of the moving parts - while certainly an honorable and interesting subject phenomenologically - is secondary to the larger truth of causation: that behavior is a product of an environment acting upon the genetic make-up of an organism over time.  Behaviorists design experiments that manipulate environmental variables in order to find controlling relationships with dependent variables in the organism.

However, society is still firmly in the camp of the structuralist.  While I realize there is an element of simplicity to the notion that to completely understand a thing is to account for all of its parts, I've long been suspicious that the zealous embrace of Chomsky's attack on Skinner was ultimately more about a cultural zeitgeist than anything else.  (In 1971, Chomsky showed his cards a bit when he wrote a statement so absurd it offers a clue to his sense of deep ideological resentment: "At the moment we have virtually no scientific evidence and not even the germs of an interesting hypothesis about how human behavior is determined.")

America was entering the 1960's, and libertarian rebellion was fomenting against the strictures of the past.   Nothing less than a quasi-religious awakening was occurring, which sought to bust the shackles of old institutional dogma and paint a road to enlightenment upon the canvas of the expanding mind.  In the eyes of many on the left, institutional knowledge had brought us the atom bomb, Vietnam, sexism, racism, and the suit and tie.  To many on the right, scientific knowledge was less suspect, but to the extent that it encroached upon the established order of institutions such as the church, marriage, and capitalism (communism being an existential threat that almost no sacrifice was too great to prevent), it was dangerous for different reasons.

Skinner's Verbal Behavior could not have come at a worse time.  In it, he laid out the most detailed and cogent argument yet for a radical behaviorism in which all of human behavior - including thought itself - was under the control of physical contingencies.  In his suit and tie, with his cumulative records and operant chambers, he represented everything the left despised.  As Camille Paglia (2003) argued in her essay Cults and Cosmic Consciousness: Religious Vision in the American 1960s, the 1960's was a time of "spiritual awakening" and "rebellious liberalization", just one of many religious revivals in American history.  She likens the period to Hellenistic Rome, in which "mystery religions" rose up in response to an oppressive institutional order.  Dionysian practice emphasized "a worshipper's powerful identification with and emotional connection" to God.  She goes on to note the context in which a certain long-haired man in sandals rose to prominence:
The American sixties, I submit, had a climate of spiritual crisis and political unrest similar to that of ancient Palestine, then under Roman occupation.
In the 20th century, the cultural moment was projected through popular media icons such as Frank Sinatra, Elvis, Jim Morrison and the Beatles: each embodied the generation's desire for personal emotional liberation and sexual independence.  Describing a strange episode in which rumors circulated of Paul McCartney's premature death, Paglia writes:

The hapless McCartney had become Adonis, the dying god of fertility myth who was the epicene prototype for the deified Antinous: after Antinous drowned in the Nile in 130 ad, the grief-stricken Hadrian had him memorialized in shrines all over the Mediterranean, where ravishing cult statues often showed the pensive youth crowned with the grapes and vines of Dionysus.

Burrhus Frederic Skinner, with his measured demeanor and supremely rationalistic style of communication, was the very opposite of Adonis.

On the right, his argument was often viewed as nothing less than paving the way for godless totalitarianism.  Indeed, in his 1971 Beyond Freedom and Dignity, he writes:

A free economy does not mean the absence of economic control, because no economy is free as long as goods and money remain reinforcing.  When we refuse to impose controls over wages, prices, and the use of natural resources in order to not interfere with individual initiative, we leave the individual under the control of unplanned economic contingencies. (emphasis added)

The critique misses the mark: whether or not one fears that radical behaviorism leads to a state-controlled economy is quite irrelevant to Skinner's point: if human behavior is controlled by contingencies, then they will be in effect no matter what type of economic system one chooses.

On campuses across America (Europe had never quite embraced behaviorism to begin with), young students (future professors) of psychology took up the banner of cognitivism and never looked back.  Never mind that most of them likely never bothered to read Verbal Behavior.  Granted, it is a difficult book.  Radical behaviorism is a concept which requires a good degree of open-mindedness, and the courage to go where the evidence takes you rather than relying upon the safety of old cultural intuitions.  It no more paves the way to totalitarianism than Darwin's theory of evolution paves the way for eugenics.  But like evolution, radical behaviorism is rather unintuitive.  Both are selectionist.  In evolution, the organism is the product of a biological shaping process extending back through time, generation by generation.  There is nothing in the structure of the organism per se that "is" evolution.  The only way to understand evolution is by examining the relationships between organisms - which have been selected - over long periods of time.  Similarly, radical behaviorism says there is no thing in the organism that "is" behavior.  Rather, behavior is selected for over the course of the organism's lifespan.

Just as the genetic configuration that most suits the organism to its environment is selected for, so the organism's most reinforced patterns of behavior are selected for.  Just as the genes for a white coat have been selected for as most beneficial for polar bears hunting on the arctic ice, the behavior of speaking the phrase "Where is the restroom?" has been selected for as most beneficial in English verbal communities.  Once one is familiar enough with the basic science of evolution, the concept isn't too difficult to grasp.  I think the same can be said for radical behaviorism.
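The selectionist analogy can be made concrete with a toy simulation.  The sketch below is purely illustrative - the response names, the starting strengths, the 0.05 increment and the proportional-emission rule are my own assumptions, not anything from Skinner: responses are emitted in proportion to their current strength, only the reinforced one gains strength, and so it gradually comes to dominate the repertoire.

```python
import random

def select_by_consequences(responses, reinforced, trials=2000, seed=0):
    """Toy model of selection by consequences: responses are emitted in
    proportion to their current strength; an emitted response that is
    reinforced gains strength, so it is emitted more and more often."""
    rng = random.Random(seed)
    strength = {r: 1.0 for r in responses}  # equal starting strengths
    for _ in range(trials):
        weights = [strength[r] for r in responses]
        emitted = rng.choices(responses, weights=weights)[0]
        if emitted == reinforced:      # the verbal community "pays off"
            strength[emitted] += 0.05  # only this response is strengthened
    return strength

final = select_by_consequences(
    ["Where is the restroom?", "grunt", "point"],
    reinforced="Where is the restroom?")
# The reinforced phrase ends up far stronger than the alternatives,
# which never grow past their starting strength.
print(max(final, key=final.get))
```

Nothing "in" the simulated organism is the behavior; the dominance of one response is only visible in the history of its consequences, which is the point of the analogy.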

Most people never have to fully grasp the complexities of the science of evolution - radiocarbon dating, genetic drift, sedimentary rock, random mutation, etc. - in order to embrace it.  Instead, they can rely upon an environment in which the "settled" science immerses them from grade school onward, instilling an intuitive grasp of geologic time and natural selection.  The science of behavior has no such mainstream acceptance.  Therefore concepts such as discriminative stimuli, schedules of reinforcement, the matching law, respondent versus operant behavior, extinction bursts, establishing operations, etc. are not considered "settled" outside the field, and no such intuition can be built.

Rather, mentalistic accounts of behavior rule the day with nearly the degree of vigor that they did a hundred or even a thousand years ago.  In this sense, society operates with a basic psychological outlook that could quite easily be considered medieval.  Indeed, one only need look towards subjects such as criminal justice or income disparity to see where such thinking leads - in which "driven" men claim moral right to mansions, and "evil" men are delivered to concrete cells of solitary confinement.

So too in our daily lives do we encounter the suffering and anxiety caused by confusion over the basic principles of behavior.  Intuiting the actions of others as being caused by them, we become resentful and intolerant, blinded to the reality that their actions are the result of the contingencies in their lives.

Further still, we turn this false mentalism upon ourselves, believing falsely that there is something in us that is responsible for our actions, as opposed to the contingencies within which we are shaped.  Just as we develop toxic emotions in response to others, we develop them in response to our own "self".  We imagine this entity as responsible for actions we would rather not have occur.  This leads us down the fruitless path of "becoming better people", looking only into our own thoughts and feelings rather than examining the functional relationships between our environment and our history of responding within it.  We have been sold on the notion that there is something wrong with how we "process" the environment, rather than our behavior being a perfectly natural, learned response to environmental contingencies.

The cognitive revolution did not represent a shift from a centuries-old deterministic, mechanistic view of behavior in which Free man did not exist, to a new view in which Free man existed as a function of a "self" which processed information and chose to act through some emergent, metaphysical system.  Rather, for hundreds or even thousands of years, Free man was commonly assumed to exist as an independent actor responsible for his own lot in life, and it was only for a brief period - a few decades - that behaviorism developed and held sway in psychological study.  To the extent that cognitivism rejects a behavior analytic approach - a mature, complex field of study with numerous insights into human behavior - in favor of appeals to mentalism, the cognitive revolution would better be described as a "cognitive reversion" to the old, intuitive conception of "self" that has always been foundational to religious, economic and civic institutions.

However, as fitting for a revolution, cognitivist mentalism indeed led to a widespread purging of behaviorism as a respectable science.  In The Structure of Scientific Revolutions (1970), Thomas Kuhn writes of this process:

When it repudiates a past paradigm, a scientific community simultaneously renounces, as a fit subject for professional scrutiny, most of the books and articles in which that paradigm had been embodied. Scientific education makes use of no equivalent for the art museum or the library of classics, and the result is a sometimes drastic distortion in the scientist's perception of his discipline's past. More than the practitioners of other creative fields, he comes to see it as leading in a straight line to the discipline's present vantage. In short, he comes to see it as progress. No alternative is available to him while he remains in the field.

To the hapless psychology student, there is simply no point in engaging with behaviorism beyond the most primitive level.  Textbooks routinely dismiss Skinner's work as describing an important but limited part of human behavior, antiquated when it comes to dealing with its true complex nature.  While it is sometimes suggested that cognitive science hasn't abandoned behaviorism but rather quietly subsumed it, David Palmer (The Behavior Analyst, 2006) argues the contrary:

....Such examples suggest that, instead of building principles of behavior into its foundation, cognitive science has cut itself loose from them. Cognitive psychology textbooks neither exploit nor review reinforcement, discrimination, generalization, blocking, or other behavioral phenomena. By implication, general learning principles are peripheral to an understanding of cognitive phenomena. Even those researchers who have rediscovered the power of reinforcement and stimulus control hasten to distance themselves from Skinner and the behaviorists. For example, the authors of a book that helped to pioneer the era of research on neural networks were embarrassed by the compatibility of their models with behavioral interpretations: “A claim that some people have made is that our models appear to share much in common with behaviorist accounts of behavior … [but they] must be seen as completely antithetical to the radical behaviorist program and strongly committed to the study of representations and process”.

In my personal experience, I routinely encounter psychology graduates who possess little more than a rudimentary understanding of behavioral principles.  If the general education teachers I worked with in public schools were consciously applying behavioral principles in their classrooms, they certainly never spoke of it.  In my own training, as an undergraduate in social sciences and as a graduate student in elementary education, Skinner's work received at most a total of one lecture in an undergraduate course, and a paragraph or two in graduate school.  His work on operant conditioning, while acknowledged as important to understanding learning at rudimentary levels, is quickly passed over in favor of the work of cognitive theorists such as Vygotsky (zone of proximal development, scaffolding), Piaget (schema), Bandura (social learning) and Erikson (psychosocial development).  Their work is commonly viewed as refuting behaviorism and as taking our understanding of learning further, in ways thought to be impossible under a behavior analytic approach.  While their insights are indeed valid and useful, to view them as in any way a refutation of behavioral principles would be a serious error.  Each of these theorists' work can easily be accounted for via the application of behavior analytic principles.  Ironically, to the extent that these cognitive theories fail to engage with the behavioral principles underlying the phenomena they describe, they are in their own way reductionist; to discuss zones of proximal development or schemas without taking into consideration principles such as establishing operations, generalization, learning histories or schedules of reinforcement is to reduce these phenomena to vague simplifications.
Yet simplification, especially when presented in the context of a compatible reinforcement history, is itself highly reinforcing.  To an individual raised to believe in an all-powerful God whose word is communicated in an inerrant Bible, the notion of the divine creation of man in a short period of time is much easier to embrace than a chaotic process of natural selection over hundreds of millions of years.

The first edition of On the Origin of Species was published in 1859, but the theory of evolution wasn't widely accepted until decades later.  Widespread public acceptance wasn't gained until perhaps the 1940's, with the Catholic church eventually allowing, in 1950, that evolution is at least compatible with the bible.  Still, to this day evolution remains controversial, accepted by only around 60% of the populations of the U.S. and Latin America, according to Pew Research (2015).  In many respects, the evidence for evolution is more clear-cut, in that developments in multiple areas of science - from biology to geology to particle physics - have played a key role in its understanding.  The structure of DNA was not even understood until a century after Darwin published.  In many respects, our understanding of the brain, the most complex object known in the universe, is much less far along.  For behavior skeptics, an emphasis on structuralism combined with mentalistic bias points toward an almost unfathomable complexity.  Indeed, consciousness has famously been dubbed "the hard problem" - a rather mythical designation.  Behaviorists who question whether the problem is all that hard are often labeled "reductionists" - too easily seduced by a naively simplistic account of a complex phenomenon.

But the radical behaviorist does not deny the complexity of the moving parts (environmental stimuli, biological molecules, and past history).  Rather, he merely insists that at its core there is a deterministic, functional relationship at work.  I'm often struck by the similarity with the "Intelligent Design" argument put forth by evolution skeptics.  Biological organisms are claimed to be "irreducibly complex", such that they could never have originated without an intelligent designer.  Yet this argument also misdirects attention to the structure of the organism, seeking an understanding of it removed from the context of history.  And just as evolution can only be understood as a function of geologic time and the interplay between genes and environment, so too can behavior only be understood as the interplay between the phylogeny (genetic history) and ontogeny (environmental, life history) of an organism.

Compared with Darwinian evolution, the rate of acceptance of radical behaviorism over cognitivist mentalism may not be in terrible shape.  Maybe by the 2040's we'll have seen a steady shift towards a behavior analytic approach.  However, I have my doubts.  Evolution's largest direct social implication might have been a sound refutation of biblical literalism.  But that was never so central to our institutions.  Religious freedom, after all, had long been enshrined in our constitution.

The threat the radical behavioral perspective poses to the established institutional order is, in my view, much greater, in that it provides scientific justification for the moral claim that, as social products, ultimate accountability lies with the system we build for man, not with man's actions within that system.  How to redraw our institutions so as to align with this truth is the real challenge.  But we must begin with the premise that, to the extent that it is founded in mentalistic notions of human behavior, the current system is not only unjust, but misguided and philosophically corrupt.  There are a great many aspects of the current order that reinforce the behavior that preserves it, not the least of which is simple human greed (the tendency to accumulate wealth in a manner that is unjust).  But the opposite of greed is generosity, and generous acts are simple to argue for.  What is more difficult is the untangling of the mentalistic rationale for systems that permit the behaviors of human greed.


Baum, W. M. (1994). Understanding Behaviorism: Science, Behavior, and Culture. New York: HarperCollins.

Chomsky, N. (1959). A Review of B. F. Skinner's Verbal Behavior. Language, 35(1), 26-58.

Chomsky, N. (1971). The Case Against B.F. Skinner. The New York Review of Books, 17(11), 3. http://www.nybooks.com/articles/1971/12/30/the-case-against-bf-skinner/

Kuhn, T. (1970). The Structure of Scientific Revolutions. Chicago: University of Chicago Press.

Moore, J. (2003). Behavior Analysis, Mentalism, and the Path to Social Justice. The Behavior Analyst, 26(2), 181. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2731454/pdf/behavan00006-0003.pdf

Paglia, C. (2003). Cults and Cosmic Consciousness: Religious Vision in the American 1960s. Arion, 10(3), 60-61. http://www.bu.edu/arion/files/2010/03/paglia_cults-1.pdf

Palmer, D. (2006). On Chomsky's Appraisal of Skinner's Verbal Behavior: A Half Century of Misunderstanding. The Behavior Analyst, 29(2), 260.

Skinner, B. F. (1957). Verbal Behavior. MA: Copley Publishing Group.

Skinner, B. F. (1971). Beyond Freedom and Dignity. New York: Knopf.

"Religion in Latin America: Widespread Change in a Historically Catholic Region." Pew Research Center, Washington, D.C. (Nov. 13, 2014). http://www.pewforum.org/2014/11/13/chapter-8-religion-and-science/ Retrieved 07/02/2016.

"U.S. Becoming Less Religious." Pew Research Center, Washington, D.C. (Nov. 3, 2015).










Monday, August 22, 2016

Don't Tell Us Any More About Dualism

Hand prints in Pettakere cave, Sulawesi, Indonesia, est. 35,000 to 40,000 years old
Apparently Tom Wolfe is not a big fan of Noam Chomsky either (see my previous post).  In his August 2016 essay in Harper's Magazine titled The Origins of Speech, Wolfe sets out to take Chomsky down a notch or twelve.   While I've personally always admired Chomsky's vocation as thorn in the side of Western imperialism and neo-liberal capitalist dogma, he's far too often a polemicist of the cheapest order.  Instead of keeping an eye peeled for nuance, irony and self-skepticism, his fervor seems to compel him to lazily troll the waters of simplistic, conspiratorial global plotting, in which there are rarely situations with simply no good answers, but rather always good guys and bad guys - and his own uncanny ability to spot who is who.

Wolfe too bristles at Chomsky's ad hominem attacks - "The epithets ("fraud", "liar", "charlatan") were Chomsky's way of sentencing opponents to Oblivion".  But while he spends a good deal of time examining Chomsky's anarchist roots and his rise to fame in the anti-war movement through harsh criticism of the Vietnam war, the real focus of the piece is Chomsky's invention, in the late 1950's, of something called Universal Grammar.  UG is a theory of language in which humans are thought to possess a "language organ" somewhere deep in the brain, one that
"could use the "deep structure", "universal grammar" and "language acquisition device" that [the child] was born with to express what he had to say, no matter whether it came out of his mouth in English, Urdu or Nagamese."
Regular readers of this blog will note that I am a Board Certified Behavior Analyst.  In my practice I work with children with developmental disabilities (primarily autism), who suffer severe language deficits.  As such, I am trained in the science of behavior, the philosophy of which is called Radical Behaviorism.  From Wikipedia:
Radical behaviorism differs from other forms of behaviorism* in that it treats everything we do as behavior, including private events such as thinking and feeling. Unlike John B. Watson's behaviorism, private events are not dismissed as "epiphenomena," but are seen as subject to the same principles of learning and modification as have been discovered to exist for overt behavior. Although private events are not publicly observable behaviors, radical behaviorism accepts that we are each observers of our own private behavior.
(*these other forms are largely no longer practiced; radical behaviorism is today the predominant philosophy at the root of applied behavior analysis, the experimental analysis of behavior, organizational behavior management and relational frame theory)

As a behavior analyst, I am not an expert in language per se.  I'm certainly not trained in linguistics, nor speech pathology.  That said, inasmuch as language is behavior, I understand how it works quite well.  Behaviorism as a field draws much of its foundation from the work of B.F. Skinner.  More than anyone else, he developed and expanded upon the notions of reinforcement and punishment, antecedents and consequences, discriminative stimuli, motivating operations and schedules of reinforcement to establish a robust theory of language.  In 1957, he too published a landmark book: Verbal Behavior.  Again, from Wikipedia:
For Skinner, the proper object of study is behavior itself, analyzed without reference to hypothetical (mental) structures, but rather with reference to the functional relationships of the behavior in the environment in which it occurs. 
Within Behaviorism, Chomsky is famous for his harsh criticism of Verbal Behavior, in which he generally misunderstands its fundamental concepts and levels baseless attacks.  Outside Behaviorism, however, Chomsky is famous for utterly refuting Skinner, igniting the "Cognitive Revolution", and generally assigning Behaviorism to the dustbin of history.  Behaviorists are endlessly baffled by a dismissal of Verbal Behavior that continues to this day, as psychology, education and language students are routinely taught Skinner only in passing, and with nowhere near the depth required to truly understand his work.

I must pause.  I realize that sounds like a suspicious plea.  Oh, if only they understood our work, they would agree with it.  The implication is, well, kind of Chomskian in its dismissal of disagreement: your objections are not valid because you haven't taken the time to understand the subject.  Maybe you have become too accustomed to the decadence of your bourgeois hegemony, non, comrade?

Behaviorism is actually quite hard.  For starters, it is deterministic, and thus in opposition to traditionally dominant notions of free will and agency that are the very pillars of entire religious and political dogmas.  It is also highly technical, reliant on an elaborate network of scientific principles that must each be understood in their own right before the larger whole can be assembled into a coherent theory.  And finally, it is often quite unintuitive.  Aside from its refutation of free will (an assertion many will reject outright as contradicting what they describe as self-evident), its principles describe interactions between events across a timeline that isn't easily grasped at first.  In this way it is like the theory of evolution, which requires pulling together a number of concepts (mutation, natural selection, change over time) and can't easily be pointed to as a process unfolding right before our eyes.

In another piece I've been working on, I plan to go into more detail about what I think Chomsky's motivations were and, possibly as important, what drove the larger public embrace of his supposed refutation of behaviorism's explanation of language.  I hope to have that up soon.  But as one might imagine, Verbal Behavior and Chomsky's notion of a Universal Grammar were bound to conflict.  Now, contrary to common understanding, behaviorism does not subscribe to the tabula rasa view of humans, in which we are entirely a product of our environment.  Rather, behaviorism understands that every organism has certain genetic proclivities for certain stimuli - these drive our basic wants and needs (food, sex, shelter, touch, etc.).  And within our species there is no doubt a range of individual differences in proclivity - some people will be more sensitive to certain stimuli than others.

However, this is only the beginning.  We are learning creatures, and possess a basic tendency to behave in certain ways more or less, depending on physiological desires and the ways in which the environment does or does not stimulate them.  In this way, we are no different than most (if not all) other mammals.  If a baby chick makes a certain sound near her mother, and is rewarded by more attention, she will be more likely to make that sound again when her mother is near.  If a dolphin eats a certain type of fish and is disgusted by the taste, it will be less likely to eat that fish again.  If a bear hears the sound of a car and follows it to a campground filled with tasty potato chips, it will be more likely to follow that sound again (however, if following the sound is followed by no chips - or worse, a gunshot - it will be less likely).
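The chick, the dolphin and the bear all illustrate a single functional relation: a consequence changes the future probability of the behavior it follows.  A minimal sketch in code, assuming an invented update rule and step size of my own (an illustration of the idea, not a model from the behavior-analytic literature):

```python
def update_probability(p, outcome, step=0.1):
    """Toy learning rule: reinforcement nudges the probability of
    repeating a behavior toward 1; punishment nudges it toward 0."""
    if outcome == "reinforced":
        return p + step * (1 - p)
    if outcome == "punished":
        return p - step * p
    return p  # neutral outcome: no change

# The bear: following the car sound pays off in potato chips twice,
# then is punished (a gunshot) once.
p0 = 0.5
p_after_chips = update_probability(update_probability(p0, "reinforced"), "reinforced")
p_after_gunshot = update_probability(p_after_chips, "punished")
# Following the sound is now more likely than it was before the chips,
# but less likely than it was before the gunshot.
```

The same rule covers all three animals; only the outcomes attached to the behavior differ.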

When we drive past a sign in a store window announcing "75% OFF ALL ITEMS"*, we will be more likely to go into the store.  Of course, as humans, our behaviors will be somewhat more complex.  We will be likely, in fact, to begin a behavioral chain that we have learned (that has been REINFORCED - previously followed by benefits): for starters, the words on the sign have previously been reinforced ("75% off" = wow!; "all items" = VERY available).  Seeing that the store is one in which many items you enjoy are found, you become extra excited.  You then engage in the behavior of thinking about what you might want there.  When you realize you could really use the antique Buddha lamp you've been eyeing in the window (a behavior previously reinforced by a college course in which you discovered the Buddha, and by how he reminded you (conditioning) of Santa Claus, whose appearance was consistently followed by presents), you are even more likely to pull over.  You now engage in a series of responses - checking your mirror, turning on the signal, turning the wheel - all of which you have learned have the desired effect of safely navigating the car where you want it to go (previously engaging in that behavior has had precisely the same result).

(*In fact, just reading "75% off" likely evoked in you a bit of conditioned warm feeling, as you have likely been conditioned by our society to respond to large discounts.)

Have you ever walked into a room and realized you forgot why you went there in the first place?  Have you ever been reading a book and realized you hadn't even noticed the last couple of paragraphs because your mind was elsewhere?  These seem like strange moments because they are unexpected moments in which "mindless" behavior is obvious.  But in reality, how much of our daily behavior is "mindless" anyway?  In behaviorism, this "mindlessness" is easily explained as part of our normal conditioning.  In fact, even our awareness of our actions is the product of conditioning; the language we use to describe it is Verbal Behavior.  The "radical" in Radical Behaviorism emphasizes this completeness: it deals not only with observable behavior, but with the verbal behavior that takes place within our own minds as well.

So what to make of Chomsky's Universal Grammar?  It isn't necessarily inconsistent with behaviorism.  One could imagine the human brain possessing some structure that finds certain patterns of stimulus more reinforcing.  We do seem to see this in universal preferences for certain symmetrical facial features, or for compositional patterns in art.  The real issue, however, is more philosophical, and has to do with emphasis.  Is language something created by a physiological structure in the brain - a sort of computational device - or is it something that emerges over time, out of repeated interactions between the brain and the environment?

Chomsky preferred the former.  While he couldn't point to any such structure yet discovered, he assumed it was only a matter of time before one would be.  Mainly, he argued, the fact that humans were capable of generating utterly new thoughts and ideas refuted the notion that we were bound by our experiential interactions with the world.  He pointed out that small children were quickly able to imagine and speak of things of which they had no direct knowledge.  He noted as well that all human languages had certain patterns in common.  This, he argued, was clear evidence that there was something universal about language, something that had evolved in our species.

Wolfe describes how influential Chomsky's position became.  Despite a lingering lack of evidence for any organic grammar structure, his theory was massively popular, and helped propel linguistics as a field into new prominence.
"Thanks to Chomsky's success, linguistics rose from being merely a satellite orbiting around language studies and became the main event on the cutting edge... the number of full, formed departments of linguistics soared."
The so-called "cognitive revolution" ushered in an era in which the context of the speaker (history, stimuli, motivations, etc.) mattered less than the structures presumed to exist within the brain.  Still without much evidence of what or where those structures were, researchers had to content themselves with inventing metaphors for these supposed mental processes.  Chomsky's emphasis on physical structure inspired elaborate conceptualizations that were themselves rooted in three-dimensional intuitions: sorting, shifting, stacking, storing, building, etc.  With the rise of the computer - a physical system in which code could indeed be written and organized to do all of these things - a perfect analogy was there for the taking.  What was the mind, it could intuitively be imagined, but a computer of flesh and blood?

But we had made this mistake before.  Rene Descartes, writing in 1641, reasoned that the mind could not be part of the body: the mind was ethereal and indivisible, while the body was corporeal and thus divisible.

[T]here is a great difference between the mind and the body, inasmuch as the body is by its very nature always divisible, while the mind is utterly indivisible. For when I consider the mind, or myself in so far as I am merely a thinking thing, I am unable to distinguish any parts within myself; I understand myself to be something quite single and complete….By contrast, there is no corporeal or extended thing that I can think of which in my thought I cannot easily divide into parts; and this very fact makes me understand that it is divisible. This one argument would be enough to show me that the mind is completely different from the body…. 
The notion of the "mind" being an entity separate from causality, and therefore from the body, is known as Cartesian Dualism, and it is a fallacy.  As Daniel Dennett writes in Consciousness Explained:
Cartesian materialism is the view that there is a crucial finish line or boundary somewhere in the brain, marking a place where the order of arrival equals the order of "presentation" in experience because what happens there is what you are conscious of. [...] Many theorists would insist that they have explicitly rejected such an obviously bad idea. But [...] the persuasive imagery of the Cartesian Theater keeps coming back to haunt us—laypeople and scientists alike—even after its ghostly dualism has been denounced and exorcised.

Somehow, however, the dualist trappings of cognitivism persisted.  While behaviorism explicitly rejects it as mentalistic - and, by definition, non-deterministic and unscientific - the idea "that the mind and mental states exist as causally efficacious inner states of persons" has been a major assumption of many cognitive theories for the last half-century.

Wolfe fast-forwards us to 2005.  A paper by the linguist Daniel L. Everett, titled Cultural Constraints on Grammar and Cognition, described a tiny Amazonian tribe, the Pirahã, with whom Everett had spent a great deal of time as a Christian missionary, and whose language broke all the Chomskian rules of Universal Grammar.  Indeed, it was the culture of the Pirahã themselves that defined their language.
"...Their unique ways of living shaped the language - not any 'language organ', not any 'universal grammar' or 'deep structure' or 'language acquisition device' that Chomsky said all languages had in common."
This was a people who lived radically different lives from most everyone else on the planet.  Intensely isolated, they lived in a world most of us wouldn't recognize.  Because of their cultural patterns, they simply had no use for concepts and ideas most of us take for granted.  They used almost no tools, kept few material goods, neither read nor wrote, and had no mathematical concepts beyond "a lot" and "a little".  When Everett showed them black and white images, they struggled to make sense of them, so unaccustomed were they to that kind of pattern recognition.  Furthermore, they had no words for yesterday or tomorrow, but rather referred to them as "other days".  With little conception of the past, Everett's attempts at converting them to the Christian faith were hopeless.  In a hilarious (and, personally, quite triumphant) passage of the article, Wolfe writes of what happened when Everett tried to teach them about Jesus:
"How tall is he?" the Pirahã would ask.
"Well, I really don't know, but -"
"Does he have hair like you?" meaning red hair.
"I don't know what his hair was like, but -"
The Pirahã lost interest in Jesus immediately.... After about a week.... one of the Pirahã, [named] Kohoi, said to Everett politely but firmly, "We like you Dan, but don't tell us any more about Jesus."
 
To a behaviorist, this makes perfect sense.  Culture evolves, and with it language.  Why would a language develop if it wasn't being reinforced?  To the Pirahã, who had no use for concepts of tomorrow or yesterday, no such language was needed.  The universal structures Chomsky proposed to have evolved in all humans were curiously absent in the Pirahã.  Everett argued that this was, as the New Scientist described his claim, "the final nail in the coffin for Chomsky's hugely influential theory of universal grammar... [though] most linguists still hold to its central idea."

In my work, I regularly encounter children who completely lack language, or as we say, a verbal behavior repertoire.  There can be many reasons for this, but often the most salient is a lack of learning opportunities.  What this generally means is that, because of either their genetic make-up or their specific environmental history, they require more exposure to certain stimuli before relationships develop.  We apply the principles developed by Skinner in Verbal Behavior, in which different contingencies of reinforcement are targeted for different language skills.  We teach echoic (vocal) imitation to begin to develop in the learner the ability to produce certain sounds.  We find motivating items and make their availability contingent on certain requesting ("manding") behavior - usually handing over a correct image of the item, signing, or making a correct vocalization.  We teach receptive language skills by saying the name of an item or activity and reinforcing when the child indicates the correct response.  We teach labeling by asking what an item is and reinforcing the correct response.  We teach what Skinner called "intraverbal" behavior, in which a response is controlled only by other verbal behavior, i.e. not simply by what is in the room (e.g. "What do you wear on your head?"  "Hat").  We teach matching and categorizing items into groups by the features they share, what they do, or their logical classification.
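The core teaching loop - wait for a response, reinforce the target, withhold reinforcement otherwise - can be sketched as a toy simulation.  Everything here is hypothetical illustration, not clinical procedure: the response names ("cookie", "mago", "reach"), starting strengths, and update numbers are invented for the example.

```python
import random

# Toy model of mand training: three candidate responses start with equal
# strength; only the target mand ("cookie") is ever reinforced.
random.seed(1)
strengths = {"cookie": 1.0, "mago": 1.0, "reach": 1.0}

def emit(strengths):
    # The learner emits each response with probability proportional
    # to its current strength.
    responses, weights = zip(*strengths.items())
    return random.choices(responses, weights=weights)[0]

for trial in range(200):
    response = emit(strengths)
    if response == "cookie":          # correct mand -> item delivered
        strengths[response] += 0.5    # reinforcement strengthens it
    else:                             # no reinforcement -> gradual extinction
        strengths[response] = max(0.1, strengths[response] - 0.05)

print(max(strengths, key=strengths.get))  # the dominant response after training
```

The point of the sketch is the shape of the process, not the numbers: because only the target response is ever followed by the motivating item, it comes to dominate the repertoire while the alternatives extinguish.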

All of these skills are slowly built up in a child until the skills, as a whole, represent a language repertoire.  The child is now talking.  The child is now thinking.  Now, by definition, these children have a disability.  They were not raised in an environment completely lacking in language.  For whatever reason, they were not able to "pick up" language as easily as their typical peers.  There are many reasons for this, and much is still not understood.  However, a common feature of the autism diagnosis is that what is sensorially pleasing to a typical child might not be so to a child with autism.  For instance, eye contact, a critical component of social connection, is often lacking in autistic children.

And yet it is critical for learning.  When a baby makes eye contact with you and you smile back, indicating joy, the baby experiences social reinforcement.  This will go on to become one of the most powerful mechanisms in the child's life as it grows and begins interacting with the world.  Some of the first behaviors of an infant are shaped by differential social reinforcement.  That is, when a baby says "mago mago", we likely show little excitement, and there is little reward for the child; the sound will be no more likely to occur again.  However, if the child says "mama mama", and our eyes light up and our cheeks rise and the corners of our mouths widen, that behavior has just been rewarded, and will be more likely to occur again.  The baby wants that response, and so will say "mama" over and over.  Soon, it realizes that when it says "mama", we stop what we are doing and pay it attention.  That little word is like a giant invisible rope it can now use to turn us around.  Yet if a child is uninterested in your facial expressions, it will miss the difference in the responses you show.  It won't be able to learn to say words it can use to control its environment.

This process of conditioning extends its reach into every aspect of our lives.  More words mean more leverage in the world.  They mean more access to new items and activities.  They mean more thoughts and ideas.  Children without language live in very simple worlds, with access only to rudimentary sensory activities.  But as language develops, so too does their world: a toy car is no longer a thing with wheels that spin, it is a people mover that drives on a road and crashes into things!  Little people panic and move out of the way.  Categories of community helpers need to come and help put out the fire.  Friendships are forged in trust over the terrible disaster.  Social relationships are explored and complex emotions are developed.

Nature versus nurture has always been a puzzle.  As soon as babies are born - even within the womb, really - they begin to experience environmental stimuli and the process of conditioning.  What would it be like if a child were raised without humans around?  We've never been able to do such an experiment, for obvious ethical reasons.  However, these children, who lack a typical means of conditioning, in some ways offer a kind of control group for what it would look like.  It is as if they weren't exposed to the normal contingencies that day-to-day social life presents to typical children.  Like the Pirahã people, they are cut off from that process of socialization - though unlike the Pirahã, the barrier is not geographic but physiological.

Apparently Chomsky wasn't moved in the slightest by Everett's revelations.  Wolfe writes that Chomsky told him Everett's opinion "amounts to absolutely nothing, which is why linguists pay no attention to it."  A reaction strikingly similar to the one he gave Skinner's Verbal Behavior - though at least then he wrote a scathing review.  The original idea - which assumed some mysterious physiological organ capable of generating new thought independent of one's environmental context, which threw out the notion of language as a behavior conditioned like any other, which revolutionized the fields of psychology, linguistics and education, among others - was finally refuted by a small tribe in Brazil.  Meanwhile behaviorism, the field Skinner was instrumental in developing, for which he received the Humanist of the Year Award in 1972, which spawned multiple journals and today provides the pre-eminent treatment for autism, having helped millions of individuals begin to speak, still struggles to find mainstream acceptance.

Wolfe closes by noting that despite Chomsky's theory never finding any hard evidence, he
"had made the most ambitious attempt since Aristotle's in 350B.C. to explain what language exactly is.  And no one else in human history had come even close.  It was dazzling in its own flailing way..."

Actually, nothing could be further from the truth.  Behaviorists have not only a much better explanation, but we put it into practice on a daily basis.  While it may be rather complex, a bit unintuitive, and downright subversive to many traditional ways of thinking, its results are indeed quite dazzling.  Those of us in the field can hardly keep from imagining how transformative a larger, more widespread adoption of our techniques and principles might be in other areas of human development, such as education or criminal justice.  And something tells me the foundation of our science is a lot more solid than might be overturned by researching an obscure population in a remote part of the world.  When I taught high school biology, I used to tell my students that all it would take to overturn evolutionary theory would be to find the fossilized remains of a human in a stratum of rock from the age of the dinosaurs.  So too might we someday find an ancient culture whose language was not explainable by conditioning or reinforcement.  But I won't hold my breath.

Saturday, February 13, 2016

A Simple Thing Like Behavior


A BCBA, Adam Ventura, wonders about the future of applied behavior analysis, and whether something like a singularity might one day arrive, with technology displacing real therapists.  But whether it does or not, a more immediate question is how to raise awareness of the field in the face of decades of (my words) deliberate skepticism.

From first learning about ABA three years ago, to getting my BCBA last October, to this day, I've been fascinated by this question of mainstream acceptance of our field.  Or, rather, why there isn't more of it.

I have always been fascinated by human behavior.

One of my first jobs was delivering meals to people with AIDS all over the city of San Francisco, from luxury penthouse apartments to squalid projects.  At the time I was also attending college and taking social science courses, learning about political history, economics, philosophy, psychology, etc.  I was thinking very deeply about the injustice I saw all around me, and wanted to understand both how it had come to be and what could be done about it.  One of the ways I passed the long drives in my delivery van was listening to political talk radio.  I was struck, over and over, by the emphasis on the right - by personalities like Rush Limbaugh - on the behavior of the poor.  Welfare abuse, laziness, criminality, poor parenting, and failure to clean up their neighborhoods were a constant refrain.

How could we, as a society, be expected to help these people (who were so often minorities) if they seemingly refused to help themselves?  I was disgusted by the generalizations, the smugness, the lack of empathy.  But what they described was often quite true.  I saw up close so many of the same behaviors.  Not by everyone in the neighborhoods, of course, but by too many.

On the left, the emphasis was on structural problems - racism by banks and employers, school teachers with low expectations, a history of oppression and disenfranchisement that left generations of families with few resources and psychopathologies like physical abuse or addiction that were a function of growing up in wretched conditions.

So I thought deeply about how these narratives conflicted, and how they were at the root of political disagreement in the country.  This was back in the 1990s, but little has changed.  The history of conservative and progressive thought can almost be defined by this conflict: why do people do what they do?  Is it because they freely make different choices, or because their choices are constrained by larger social structures?  If they are free, do we simply blame them for their lot in life because it is one they chose?  Or do we help them out of an obligation, because their choices are a function of the environment in which they have lived - an environment we tacitly support as fellow Americans?  It all seemed to come down to free will: do we have it or not?

This was a HARD problem.  The more I read, the more difficult I understood the problem to be.  For centuries people had been struggling with it.  I, however, felt the answer was relatively simple.  From what I saw around me, people increasingly seemed to be entirely a product of their genes and environment.  I was by this time working with schizophrenics and people with traumatic brain injuries in different group homes.  I saw just how fragile the brain is, and how much we take for granted the role it plays in our emotional and cognitive abilities.

I obtained an undergraduate degree in social science and a Master’s in Elementary Education.  I wanted to help children maximize their potential.  Yet from the beginning I could see how trapped kids were.  Even at the poor school where I did my internship, I could see the stratification beginning: regardless of income, what seemed to matter most was the support the children were receiving at home.  The teachers were doing their best – I saw greatness and I saw frustration.  But in a class of 30 students there was only so much a teacher could do.  No matter, I would find a way.

I read Maslow and Bandura, Piaget and Vygotsky.  Skinner - I was terrified to find out much, much later - was entirely absent.  We learned about “schema” and “multiple intelligences”.  We learned that what mattered was making lessons “fun”, and that through high expectations and diligence, all of our kids could go to college.  (Given my social sciences background, I was skeptical that we were ignoring larger structural forces - who, for example, would clean the bathrooms and wash the dishes when everyone was attending college?  But no matter.  I pressed on.)

My first experiences were as a substitute teacher in Reading, PA, a post-industrial, post-white-flight city in which poor, misbehaved children were the norm.  Gunshots at night and drug deals translated with palpable immediacy into children dropped off at school too tired to work: angry, frustrated, resentful, mistrustful of authority, and with a deep need for attention.  As a substitute, I struggled.  But I assumed that when I had my own classroom, I would be able to reach all of them and give them what they needed.

When I finally did - a smaller kindergarten class of around 22 - I had a vindicating year.  Despite the 5-to-6-year-old children coming to school with extremely low academic readiness, I was able to get them all to basic grade-level standards before the year was done.  Many of them could barely recognize letters, shapes or numbers.  Many had never been read to.  Many couldn’t hold a pencil properly, having been given few opportunities to use one.  I gave out homework to try to make up for this lack.  I told silly stories to engage them.  I danced.  I illustrated letter sounds with fanciful cartoons.  I brought in books by the cartful from the local library to stimulate their curiosity.

The parents were for the most part loving, caring, and devoted.  Yet many simply did not have an academic mindset.  Many had not thought to read books to their children, much less provide a cognitively enriched environment.  They showed their love with hugs, food, kisses, and freedom - one parent told me her daughter (beautiful baby-teeth smile filled with metal caps, yet highly inattentive and at the bottom of her class academically) would not do her homework because, as soon as she got home from school, she would strip to her underpants and run outside for the rest of the day.  Some struggled to get their children to school on time.  One child missed 2 months of school because of a gunshot wound from a careless cousin.  More than a few children spoke of the horror movies they loved to watch - one dressed as Chucky for Halloween.  Their world was rough and unkempt.  Parents were struggling.  There were stories of incarceration, of parents unfit because of drugs, so grandparents took over.  Most parents worked low-pay jobs - gardeners, clerks, maids.  One child spoke very little English but excelled academically; his Mexican immigrant parents had been professionals in Mexico.

But the next year was worse.  For financial reasons I had to teach a double class, and was now responsible for juggling both a kindergarten and a first-grade curriculum.  I struggled with classroom management in my attempt to provide differentiated instruction to children with a functional grade-level range of 3-4 years.  Mid-year, I was asked by the principal to leave my classroom and fill in for the high school science teacher who had quit.  It was a K-12 school, and I was felt to be the only one of the elementary teachers who would be a good fit for the older kids.

But these weren’t just older kids.  Ours was a charter school.  It had originally been established as an alternative homeschool site for a group of largely white, Christian parents in a majority poor, Hispanic neighborhood.  But over the years it expanded enrollment, and the local demographic (poor, minority) began to edge in.  In my time there, I saw the last few families - better off, organized, involved - pull their children out.  To make up for decreasing enrollment, younger classes were consolidated, and high school students were recruited by accepting more and more students who had dropped out of regular education schools - even continuation schools.  This meant a host of behavioral problems.

These kids hated school.  The most successful teachers seemed the meanest.  One teacher told me “what these kids understand best is meanness, so don’t be nice to them.”  I was horrified.  I tried to make learning fun.  I had little success.  These kids weren’t interested in success.  What they were interested in was fighting, getting high and having unprotected sex.  I met my first teen mothers.  One of them, at the age of 17, had a 2-year-old son, and appeared mainly interested in gossiping and surreptitiously painting her nails or trying to plug in her curling iron.  It dawned on me that her child would soon be in kindergarten.  Some desperate teacher would be trying to make up for this young girl’s complete lack of parenting skills. 

What the hell was going on?  I dug deeper.  I read Hart and Risley’s Meaningful Differences, a watershed study that was one of the first attempts to collect data on parent-child interactions across different socio-economic classes.  (Interestingly, Todd Risley, I was to discover later, was a co-author of the Journal of Applied Behavior Analysis’ landmark 1968 paper, “Some Current Dimensions of Applied Behavior Analysis”.)  The story was gripping.  It explained everything I saw happening.  It described stressed-out families interacting with their children in ways that were mostly loving, yet lacking in the cognitive elements that provide opportunities for expanded verbal development.  Looking at the 17-year-old mom in my class, I was not surprised in the least.  This was generational poverty at work.

I kept on.  Enrollment eventually dropped even further, and the school could no longer afford to keep me on as a science teacher.  By this time I had earned credentials in Earth and Biological Science.  I took a job at a continuation high school.  Here was a population in exponentially greater need of intervention.  Kids got high in the bathroom.  Many were foster children or lived in group homes.  Any work at all was regularly refused, no matter how much support was given.  Instead, students shared stories of horrific abuse, rape, and violence.  Outside my classroom door each morning, the lunch counter opened early for the teen mother program.  When they were done with their work, I allowed some of my male students to leave and spend time with their children.  But teen fathers usually wanted nothing to do with their children.  Students regularly showed up only for the first few weeks or months of class, often just enough to satisfy a court mandate.  Fights broke out in my room.  Hair was pulled.  Extensions were torn off.  Children fled out the back door, security chasing after them.  I would hear stories later about parents getting involved and searching the streets to facilitate their children’s revenge.  One student came to class flying high on what seemed likely to be methamphetamine.  He lived in a local group home and I never saw him again.

I poured myself into my blog.  I wrote more than a thousand pages, combining philosophy, psychology, cognitive science, economics - everything I read that helped me better understand just what in our system had broken.  I studied school performance maps and their relationship to demographics and geographic housing patterns.  I looked at average incomes and crime rates, education levels and quality of libraries by zip code.  I began thinking about the concept of financial capital leverage - how it takes money to make money.  I thought about this concept in a global sense: what about human capital?  Social capital?  Education, safety, parenting, sanitation, peers, parks, transportation, infrastructure - all forms of capital at one’s disposal.  All of these were determinants of one’s self-efficacy.  Forget about the income gap.  The reality of human growth involves something greater.  I needed a better term, so I began referring to this as “societal capital”: everything that a society provides to a person that s/he is able to leverage into the exponential attainment of more capital.  There was negative capital as well.  Drug abuse, mental illness, and discrimination were all negative forms of capital that had the opposite effect: they decreased one’s capital exponentially.  I doubted many people were reading.  But I had to write.  I needed an outlet.  What I was writing about wasn’t commonly found on either the right or the left.  It was either all the kids’ fault, or it was the teachers’ fault.  The conversation wasn’t advanced enough to look deeper.

At school, my principal, in his infinite wisdom, was convinced that what was needed was strict adherence to curricular standards and test preparation.  These kids needed to learn.  No matter that many were being beaten and abused at home, suffering PTSD from secret tragedies, or only coming to school because it was the one relatively safe place in their lives where they could sit quietly or engage a peer in friendly conversation.  The old continuation model - in which students were given hands-on, therapeutic course work like sculpting, art, or poetry, a recognition of their broken state - was long gone.  Now we were all about state test performance.  I remember the principal once confessing to his staff his love of “data”.  Data was going to drive our teaching.  Data was going to drive performance.

When yearly testing came around, my students drew patterns in the bubbles.

I then moved on again, this time to teach general education in Yucca Valley, named after the beautiful, somehow alien-looking cactus trees.  I hoped the students would at least be in something more than survival mode.  At this point, 5 years into my career, I knew I was.  I had two small children and a wife at home to support.  I was used by now to laying my head down at night on a pillow of guilt, doubt, anger, and fear.  But the next year was hardly better.  In some ways it was worse.  These kids - mostly low-SES or with various troubles - were satisfied with D’s, something they were used to acquiring by the skin of their teeth at the end of the year, after doing little work and pleading with their teachers.  Toward the end of the first semester, 2/3 were failing, despite every possible prompt I could have given them.  The 4-5 daily calls home I made were of little use.  The parents had long since given up hope of shaping the behavior of their children.  The assistant principal, however, pulled me aside one day and explained that I couldn’t fail this many kids.  There simply wasn’t enough capacity in the summer school program.

I was slowly going mad.  During my long commute up into the wretched dust of the high desert, I fantasized about driving my car into the divider.  Some nights I cried.  I couldn’t take it anymore.  Now not only did my students hate me; I hated them.

So this was it, I thought.  As the end of the year came I packed my things, fully expecting to never teach again.  I thought about going into special education, where I hoped (naively) at least I might have the support to meet my students’ needs.   I soon learned that the burnout rate in special ed was actually higher than anywhere else in education.  But I was determined.  Maybe I could make it work.

On my blog, I was realizing I had said most of what I needed to say about our broken system of education - that in reality what we had was a broken economic system built upon a flawed philosophical system that didn’t understand human behavior.  In a special education course I learned about the history of disability rights in the US in the latter half of the 20th century.  After a procession of legal victories, a social conception of rights had formed around the idea that people with special needs deserved larger society’s active effort to provide them the support they needed to be as successful as possible.  No longer were they shoved away into a corner, out of sight and out of mind; society now had a responsibility to these individuals.  In 1975, the Education for All Handicapped Children Act (later renamed the Individuals with Disabilities Education Act) guaranteed children with special needs the right to a "free and appropriate education".  Schools were now required by law to address students’ needs and make accommodations for them.

It occurred to me that what we had come to realize was morally correct with regard to physical disabilities, we had not yet realized for the merely “disadvantaged”.  Sure, we offer the bare minimum through a patchwork of programs such as free/reduced lunch or Title I funds for special tutoring.  But these can’t possibly make up for the problems these kids live with.  And they are nothing like the large-scale, sustained, legally binding mandate we have for students with disabilities.

Why not?  Is not a child with numerous risk factors deserving of special assistance in a comprehensive, cohesive fashion?  When you look at all of the factors that go into a child’s emotional, physical and academic success, the very concept of disadvantage defines an unequal future.  Is this, to use the language of the 1975 law, “appropriate”?

I realize providing a comprehensive, individualized education to disadvantaged students is a radical notion.  It implies a complete restructuring of our entire educational system, as well as a rather invasive and powerful legal intrusion into the family.  Yet the larger problem is philosophical: kids are born with disabilities, and there is little we can do to change that.  But kids are not born disadvantaged; they are born into an environment of disadvantage.  We can change this, but it requires a reshaping of our economic and social institutions.

Let’s look at a real-world example of what I mean.  Take the two-year-old child of a seventeen-year-old mother.  If this child were taken and placed in an upper-SES family, its risk factors removed, it would likely go on to college and stand a high chance of success.  Of course, this would be entirely immoral, not to mention impractical.  The child might be fine, but what about the structure within which the parent exists?  She will spend the rest of her life working for low wages, providing society with cheap labor.  In this scenario society has not changed.  The sector of the economy that requires an underclass to operate will continue.  And who is to say the parent won’t simply have another child?  She likely will.  She will be living in poverty, in a neighborhood with property values matching what she can afford, inhabited by people of similar means.  They will all send their children to the same school, exponentially de-leveraging each other, while on the other side, the families in high-property-value districts will be leveraging away between rounds of tennis and political caucusing.

We’re not going to take children away from their families.  That’s barbaric.  But is the current neo-liberal agenda, to “fix the schools”, much better?  

It sounds good, and has a bipartisan ring.  But over the past decades we’ve tried: charters, school closings, hiring “the best and brightest”, union-busting... and we’ve made next to zero progress.  After a sober analysis of the functional relationships among poor communities, children and education, is it any surprise?  At best, schools are an extra leg up for those unlucky enough to have low levels of “societal capital”.  At worst they are a continued excuse for a society that wants to pretend it is helping those left behind by the system, all the while depending on classes of poor people for its cheap labor.   Given that most of our political eggs are in such a hopeless basket, this alternative seems pretty barbaric too.

So what to do?  Well, we can start by understanding the problem.  And to understand it, we must return to Skinner.  As I mentioned previously, what is broken in society is our philosophy of human behavior.  In 2016, enormous numbers of people still believe in the magical concept of “free will”.  What this means is that they believe all people, once they reach a certain age, are “free” to make their own decisions, regardless of life experience.  If, at that point, they make poor choices, the responsibility begins and ends with them.  It is not society’s fault.  Therefore, society is not responsible either for investing in special programs to help them, or for altering its institutions so as to stop creating the environments that produce individuals who would make such decisions.  This is the raw core of the disagreement between right and left, Democrat and Republican, “big government” versus “small government”.  You can have policy debates about the efficacy of different responses, but less government necessarily means leaving people on their own to suffer whatever contingencies naturally arise in any given environment.  For some, this may work fine.  But for many, it will mean a life shaped by a lack of resources.  And just as with financial resources, rents will develop: those with capital will be in a better position to profit off of those with less.   Humans are selfish by nature, and without strong contingencies keeping us in check, we are very good at living with inequality.  We are very good at building walls, both literal and figurative.

So to continue the metaphor, the first wall that must come down is the notion of free will.  Skinner’s work, along with that of countless others in the natural science of behavior, has devastated it once and for all.  Radical behaviorism is the most rigorous, empirical, and parsimonious account available of human behavior - both verbal (including private thoughts) and non-verbal.  Its laws are irrefutable. 

The political history of behaviorism versus cognitive science is long and not well enough understood.  But it is apparent that those who decades ago began proclaiming its death were ignorant of the science.  Not only is behaviorism flourishing today, with practical applications that show results in certain populations - namely those with autism - that would be almost unthinkable in any other field of psychology, but it offers great promise in many other areas as well, from business to politics, mental health to urban planning.  (And of course we can't forget education.)

A common critique of behaviorism is that it is too simple, too reductionist.  Sure, it is said, it describes well the behavior of simple animals, or some simple human behaviors, but it doesn’t come close to explaining the complexity of human thought.  This is both true and false.  True, many human thoughts are the product of incalculable functional relations between an individual’s genetic make-up and the schedules of punishment and reinforcement that have acted upon them in a process of unfolding, never-ending contingencies.  We cannot account for specific instances at such levels of resolution.  But that would be like asking how one molecule of H2O got to be just where it is in the middle of a hurricane, or how one molecule of pigment ended up where it did in a Rembrandt. 

At the same time, it would be false to say we don’t have a clear explanation of the principles by which molecules of water or paint operate.  They are part of highly complex systems, yes, but systems that obey fundamental laws.  The same can be said of behavior.  We know the principles of respondent and operant conditioning are responsible for every measurable human behavior, no matter how complex.  As a system, the human “mind” is entirely behavioral, and operates according to the same principles and laws as every other organism.  I can’t possibly tell you whether or not you will drink coffee tomorrow morning, or why you didn’t today.  But I can explain the principles that will, over time, have come to create your coffee-drinking behavior.

But why, people would always say, does it just FEEL like we are in control of our actions?  There must be something more.  Maybe an emergent property, quantum entanglement, or some yet-to-be-discovered phenomenon that allows us - the most sophisticated things in the known universe - to act free from the constraints of determinism.

I’ve been studying behaviorism for only about three years now, and have been a certified behavior analyst for less than one.  But I'm still struck by the gap between behaviorism's vast scientific body of knowledge and popular discussions of psychology and philosophy - not to mention its implications for every other realm of human endeavor.  I still love to read non-behaviorist writing on the subject, but I find it increasingly difficult to take anyone seriously who doesn’t at least understand the basic principles of behaviorism.  It would be like reading a book on geology by someone who has never heard of radiometric dating.  To be honest, the discussions of the “mind” one often hears from non-behaviorists bear a striking resemblance to evolution skeptics discussing intelligent design.  One of the first things I do now when picking up any book on a related subject is check the index for references to Skinner.  There usually are none.

I’m now a BCBA, with a successful practice, doing work I love each day with beautiful families and their children with autism.  I am deeply gratified by the constant miraculousness of the work – the principles of “verbal behavior” that Skinner developed, and that others built on to provide the effective treatments I deliver, are literally teaching the behavior of thought to children and allowing them to more fully meet their potential as individuals.

But while the work I do makes a profound difference in the lives of children with special needs, I can’t help but think of the larger world, with so much sadness, suffering and inequality.  I can’t help but think of the impact we could have – not just with the practical application of the principles of behaviorism, but the philosophical implications its science has for how we structure and organize our society.  With a behavioral lens, so many old mythologies that have kept us from real political and social progress slip away, exposed as illusions.  In their place, a vast humility takes form that sees each individual as unique, yet inseparably tied to everything else in the universe, shaped by the contingencies of day to day life, interacting through his breath, blood, organs and nerve endings in complete harmony.

It is through our objective philosophy of science that we reach out across four thousand millennia and gaze upon ourselves, in all we could be and all we may one day yet become.  This reaching is our “will”, though it is anything but free.  It provides the reinforcement.  It provides the punishment.  All we can ever do is go along for the ride, from these words to your history of associated memories.  What will be will be; in this humble endeavor, I try my best to carry light from that gaze into the darkness of the unknown.