Saturday, August 27, 2016

LA Traffic



Things aren't looking too good out there on the freeways this afternoon, Bob.

Madness, sadness and slow death continue to be an issue as drivers make the slow crawl home.

Southbound on the 405 we have a head-on collision of mid-life panic.  A driver in the left lane is frantically twisting the knob on his stereo.  So far nothing yet to soothe his creaking husk of a corporeal vessel as it hurtles along the expressway, sealed within a 2004 Honda Civic with deteriorating gaskets, 45 feet above the Glenn Reynolds apartments, a Maxwell's mini-storage and Donuts 4 U.

Celine Dion is only slightly brightening the day of the driver of a green Saturn Vue on the westbound 10 just before you get to La Cienega.  Her daughter is in the backseat pouting because she hates her new teacher, who yelled at her to sit down even though she really had a question about problem #12 and the stupid girls behind her were the ones who dared Oscar to make the sound of an elephant.  But how many times has she heard this song before - she saw Titanic 4 times in 2 weeks when it came out and her girlfriends would squeal because Leo's hair was so cute and she would fantasize about him at night and dream of meeting someone who would take her hand just like Rose and run with her and she would follow him wherever he went.  But all she does now is sit at the front desk of L8 radiology and enter patient ID numbers into the booking server.

Not much movement on the 110 at Staples Center.  It appears AM 1070 is advertising gold again.  Mike rolls his eyes and spits another sunflower shell into his travel mug.  He's still thinking about the god-damn Mexicans he passed at the onramp, standing around the low wall in front of Grayson's Appliances.  The faded 1960s font reminds him of how this country used to be, when a man could open up his own business and compete.  Now the x-stretch iron extender bars are locked over the front windows, a pathetic For Lease sign barely visible.  And these guys loitering in the parking lot for what, to low-ball some actual Americans who love their country and deserve to work for a decent living, not like these dirty illegals who will take anything and then go back to their gang-banger kids who have zero respect for anyone.

A bright spot on the 5 though at Long Beach.  Things are opening up and a bleached-blonde sales rep in a black Mercedes S-Class just slowed to let a blue Ford Fiesta merge in front of her.  Her test results just came back from the biopsy and she's negative.  Thank God for small miracles.  Maybe she should call Geoff and have him pick up a bottle of something good to celebrate.  Who gets cancer when they're 28?  Jesus, that was scary.  One minute you're meeting performance targets, paying down the AmEx, planning a trip back East to visit family in Vermont - the next you're crying in the waiting room in a faded gray hospital gown.

Thanks, Barb.


Friday, August 26, 2016

Thinking About Critique

Harold Pollack posted a heart-breaking photo on his blog in which a child bride is in tears as she is forced into the hands of her husband-to-be.  He writes that
We sometimes hear the argument: Who are you outsiders to criticize someone else’s culture? One answer can be seen in the picture below. The women most intimately affected often object.
The photo is indeed deeply saddening.

Political correctness is complicated, and often gets reduced to an easy caricature (which ironically is used as a blunt weapon to shut down debate).  But one of its key elements is the notion that we all have cultural baggage and biases, that we each have relative social privileges based on our histories of gender, sexuality, race, income, education, and so on, and that we should be cognizant of this.  The whole movement rose out of a long, slow cultural transformation - reaching its pinnacle in the civil rights movement - in which social inequality and prejudice were understood to be a function of cultural assumptions borne out of ignorance and exploitation (women not having the vote gave men more power, blacks and minorities having to do menial labor gave whites more power, and thus the relationship was exploitative).

It's obviously much more complex, and I feel like I am merely stating the obvious here, but to the degree that political correctness is lamented, I feel there is an obvious historical ignorance at work.  Ironically, those who despise political correctness are often the same people who feel that these imbalanced power relationships no longer exist and so critical self-awareness is unnecessary.  Yet the act of being self-critical is the very thing which prevents these cultural inequalities and exploitative relationships from developing in the first place!  The basic truth of historical discrimination is that it was so often less a top-down, explicit, rational oppression than an unconscious, bottom-up acceptance of traditional assumptions and views that went unexamined - and to the extent that they were examined, they were rationalized and justified post-hoc.

Obviously, self-awareness can be taken to extremes, and become neurotic.  It can also be used as an unfair, ad hominem bludgeon to scold those whose opinions you disagree with.  But the basic idea that we should A) acknowledge our historical tendency toward prejudice, B) acknowledge that it could be a factor in our thinking at any present moment, and C) spend a little time thinking before we speak, is, I think, perfectly reasonable.

That said, specific matters of cultural analysis are complex.  At issue here is what kinds of issues might come up in the discussion of child brides in an outsider culture.  I personally am rather unfamiliar with the terrain of cultural criticism on either side of the moral/cultural relativity fence.  The photographer sheds some light in an interview.  But I do know there are good arguments on both sides.  On the one hand, a crying young girl being forced to marry (and, presumably, to be raped by her groom) is despicable.  On the other, there is a long history of (white) Westerners' outrage at such practices being amplified and intensified not merely by the act itself but by its use as a rationalization of self-superiority through a dehumanization of the other.  An example of this closer to home might be the outrage at black wayward youths and the characterization of them as "thugs" - a word that too easily erases the obvious social conditions from which the behavior arises.

Monday, August 22, 2016

Don't Tell Us Any More About Dualism

Hand prints in Pettakere cave, Sulawesi, Indonesia, est. 35,000 to 40,000 years old
Apparently Tom Wolfe is not a big fan of Noam Chomsky either (see my previous post).  In his August 2016 essay in Harper's Magazine titled The Origins of Speech, Wolfe sets out to take Chomsky down a notch or twelve.  While I've personally always admired Chomsky's vocation as thorn in the side of Western imperialism and neo-liberal capitalist dogma, he's far too often a polemicist of the cheapest order.  Instead of keeping an eye peeled for nuance, irony and self-skepticism, his fervor seems to compel him to lazily troll the waters of simplistic, conspiratorial global plotting, in which there is rarely a situation with simply no good answers, but always good guys and bad guys, and his own uncanny ability to spot which is which.

Wolfe too bristles at Chomsky's ad hominem attacks - "The epithets ('fraud', 'liar', 'charlatan') were Chomsky's way of sentencing opponents to Oblivion."  But while he spends a good deal of time examining Chomsky's anarchist roots and rise to fame in the anti-war movement through harsh criticisms of the Vietnam War, the real focus of the piece is Chomsky's invention, in the late 1950s, of something called Universal Grammar.  UG is a theory of language in which humans are thought to essentially possess a "language organ" somewhere deep in the brain, one that
"could use the "deep structure", "universal grammar" and "language acquisition device" that [the child] was born with to express what he had to say, no matter whether it came out of his mouth in English, Urdu or Nagamese."
Regular readers of this blog will note that I am a Board Certified Behavior Analyst.  In my practice I work with children with developmental disabilities (primarily autism), who suffer severe language deficits.  As such, I am trained in the science of behavior, the philosophy of which is called Radical Behaviorism.  From Wikipedia:
Radical behaviorism differs from other forms of behaviorism* in that it treats everything we do as behavior, including private events such as thinking and feeling. Unlike John B. Watson's behaviorism, private events are not dismissed as "epiphenomena," but are seen as subject to the same principles of learning and modification as have been discovered to exist for overt behavior. Although private events are not publicly observable behaviors, radical behaviorism accepts that we are each observers of our own private behavior.
(*these other forms are largely no longer practiced, radical behaviorism today being the predominant philosophy at the root of applied behavior analysis, experimental analysis of behavior, organizational behavioral management and relational frame theory)

As a behavior analyst, I am not an expert in language per se.  I'm certainly not trained in linguistics, nor speech pathology.  That said, inasmuch as language is behavior, I understand how it works quite well.  Behaviorism as a field draws much of its foundation from the work of B.F. Skinner.  More than anyone else, he developed and expanded upon the notions of reinforcement and punishment, antecedents and consequences, discriminative stimuli, motivating operations and schedules of reinforcement to establish a robust theory of language.  In 1957, he published a landmark book of his own, Verbal Behavior.  Again, from Wikipedia:
For Skinner, the proper object of study is behavior itself, analyzed without reference to hypothetical (mental) structures, but rather with reference to the functional relationships of the behavior in the environment in which it occurs. 
Within Behaviorism, Chomsky is famous for his harsh criticism of Verbal Behavior, in which he generally misunderstands its fundamental concepts and levels baseless attacks.  Outside Behaviorism, however, Chomsky is famous for utterly refuting Skinner, igniting the "Cognitive Revolution", and generally assigning Behaviorism to the dustbin of history.  Behaviorists are endlessly baffled by a dismissal of Verbal Behavior that continues to this day, as psychology, education and language students are routinely taught Skinner only in passing, and with nowhere near the depth required to truly understand his work.

I must pause.  I realize that sounds like a suspicious plea.  Oh, if only they understood our work, they would agree with it.  The implication is, well, kind of Chomskian in its dismissal of disagreement: your objections are not valid because you haven't taken the time to understand the subject.  Maybe you have become too accustomed to the decadence of your bourgeois hegemony, non, comrade?

To start with, behaviorism is actually quite hard.  It is deterministic, and thus in opposition to traditionally dominant notions of free will and agency that are the very pillars of entire religious and political dogmas.  It is also highly technical, reliant on an elaborate network of scientific principles that must each be understood in their own right before the larger whole is assembled into a coherent theory.  And finally, it is often quite unintuitive.  Aside from its refutation of free will (an assertion many will reject outright on grounds of what they will describe as self-evidence), its principles describe interactions between events that take place across a timeline that isn't easily grasped at first.  In this way it is like the theory of evolution, which requires a pulling-together of various concepts (mutation, natural selection, change over time), and which can't easily be pointed to as a process unfolding right before our eyes.

In another piece I've been working on, I plan to go into more detail as to what I think Chomsky's motivations were - and, possibly as important, the reasons for the larger public embrace of his supposed refutation of behaviorism's explanation of language.  I hope to have that up soon.  But as one might imagine, Verbal Behavior and Chomsky's notion of a Universal Grammar were bound to conflict.  Now, contrary to common understanding, behaviorism does not hold the tabula rasa view of humans, in which we are entirely a product of our environment.  Rather, behaviorism understands that every organism has certain genetic proclivities for certain stimuli - these drive our basic wants and needs (food, sex, shelter, touch, etc.).  And within our species there is no doubt a range of individual differences in proclivity - some people will be more sensitive to certain stimuli than others.

However, this is only the beginning.  We are learning creatures, and possess a basic tendency to behave in certain ways more or less, depending on physiological desires and the ways in which the environment does or does not stimulate them.  In this way, we are no different than most (if not all) other mammals.  If a baby chick makes a certain sound near her mother, and is rewarded by more attention, she will be more likely to make that sound again when her mother is near.  If a dolphin eats a certain type of fish and is disgusted by the taste, it will be less likely to eat that fish again.  If a bear hears the sound of a car and follows it to a campground filled with tasty potato chips, it will be more likely to follow that sound again (however, if following the sound is followed by no chips - or worse, a gunshot - it will be less likely).
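The chick, dolphin and bear examples above all share one mechanism: a behavior followed by reinforcement becomes more probable, and one followed by punishment (or nothing) becomes less probable.  As a rough illustration - this is purely my own toy model, not anything from the behavior-analytic literature, and real conditioning is far more complex - the core idea can be sketched in a few lines of code:

```python
import random

class Behavior:
    """A toy model of operant conditioning: the behavior is just a
    probability of being emitted, nudged up by reinforcement and
    down by punishment or extinction."""

    def __init__(self, p=0.2):
        self.p = p  # current likelihood of emitting the behavior

    def emit(self):
        return random.random() < self.p

    def consequence(self, reinforced, step=0.1):
        # Reinforcement makes the behavior more likely;
        # its absence (or punishment) makes it less likely.
        if reinforced:
            self.p = min(1.0, self.p + step)
        else:
            self.p = max(0.0, self.p - step)

# The bear: five trips toward the sound of cars all end in potato chips...
follow_sound = Behavior(p=0.2)
for _ in range(5):
    follow_sound.consequence(reinforced=True)
print(round(follow_sound.p, 1))  # 0.7 - much more likely to follow the sound
```

The arithmetic is trivial, but the shape of the idea is the point: nothing inside the organism needs to be consulted; the probability of the behavior is a function of its history of consequences.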

When we drive past a sign in a store window announcing "75% OFF ALL ITEMS"*, we will be more likely to go into the store.  Of course, as humans, our behaviors will be somewhat more complex.  We will be likely, in fact, to begin a behavioral chain that we have learned (that has been REINFORCED, or has previously been followed by benefits): for starters, the words of the sign have previously been reinforced ("75% off" = wow!; "all items" = everything!).  Seeing that the store is one in which many items you enjoy are found, you become extra excited.  You then engage in the behavior of thinking about what you might want there.  When you realize you could really use that antique Buddha lamp you've been eyeing in the window (a behavior previously reinforced by a college course in which you discovered the Buddha and how he reminded you (conditioning) of Santa Claus, whose appearance was consistently followed by presents), you are even more likely to pull over.  You now engage in a series of responses - checking your mirror, turning on the signal, turning the wheel - all of which you have learned have the desired effect of safely navigating the car where you want it to go (previously engaging in that behavior has had precisely the same result).

(*In fact, just reading "75% off" likely evoked in you a bit of conditioned warmth, as you have likely been conditioned by our society to respond to large discounts.)

Have you ever walked into a room and realized you forgot why you went there in the first place?  Have you ever been reading a book and realized that you hadn't even noticed the last couple of paragraphs because your mind was elsewhere?  These seem strange because they are unexpected moments in which "mindless" behavior is obvious.  But in reality, how much of our daily behavior is indeed "mindless" anyway?  In behaviorism, this "mindlessness" is easily explained in behavioral terms, as a part of our normal conditioning.  In fact, even our awareness of our actions is the product of conditioning; the language we use to describe it is Verbal Behavior.  The "radical" in Radical Behaviorism emphasizes this completeness: it deals not only with observable behavior, but with the verbal behavior that takes place within our own minds as well.

So what to make of Chomsky's Universal Grammar?  It isn't necessarily inconsistent with behaviorism.  One could imagine the human brain possessing some structure that finds certain patterns of stimulus more reinforcing.  We do seem to see this in universal preferences for certain symmetrical facial features, or compositional patterns in art.  However, the real issue is more philosophical, and has to do with emphasis.  Is language something that is created by a physiological structure in the brain - a sort of computational device - or is it something that emerges over time, out of repeated interactions between the brain and the environment?

Chomsky preferred the former.  While he couldn't point to any such structure yet discovered, he assumed it was only a matter of time before it would be.  Mainly, he argued, the fact that humans were capable of generating utterly new thoughts and ideas refuted the notion that we were bound by our experiential interactions with the world.  He pointed out that small children were quickly able to imagine and speak of things of which they had no direct knowledge.  He noted as well that all human languages had certain patterns in common.  This was clear evidence that there was something universal about language, something that had evolved in our species.

Wolfe describes how influential Chomsky's position became.  Despite a lingering lack of evidence for any organic grammar structure, his theory was massively popular, and helped propel linguistics as a field into increasing popularity.
"Thanks to Chomsky's success, linguistics rose from being merely a satellite orbiting around language studies and became the main event on the cutting edge... the number of full, formed departments of linguistics soared."
The so-called "cognitive revolution" ushered in an era in which the context of the speaker (history, stimuli, motivations, etc.) was less important than the structures presumed to exist within the brain.  Still without much evidence of what or where those structures were, researchers had to be content with inventing metaphors for these supposed mental processes.  Chomsky's emphasis on physical structure inspired elaborate conceptualizations that were themselves rooted in three-dimensional intuitions: sorting, shifting, stacking, storing, building, etc.  With the rise of the computer, which was indeed a physical system in which code could be written and organized to do all these things, a perfect analogy was there for the taking.  What was the mind, it could intuitively be imagined, but a computer of flesh and blood?

But we had made this mistake before.  Rene Descartes, writing in 1647, reasoned that the mind could not be part of the body because it was ethereal and indivisible, as the body was corporeal and thus divisible.

[T]here is a great difference between the mind and the body, inasmuch as the body is by its very nature always divisible, while the mind is utterly indivisible. For when I consider the mind, or myself in so far as I am merely a thinking thing, I am unable to distinguish any parts within myself; I understand myself to be something quite single and complete….By contrast, there is no corporeal or extended thing that I can think of which in my thought I cannot easily divide into parts; and this very fact makes me understand that it is divisible. This one argument would be enough to show me that the mind is completely different from the body…. 
The notion of the "mind" as an entity separate from causality, and therefore from the body, is known as Cartesian Dualism, and it is a fallacy.  As Daniel Dennett writes in Consciousness Explained:
Cartesian materialism is the view that there is a crucial finish line or boundary somewhere in the brain, marking a place where the order of arrival equals the order of "presentation" in experience because what happens there is what you are conscious of. [...] Many theorists would insist that they have explicitly rejected such an obviously bad idea. But [...] the persuasive imagery of the Cartesian Theater keeps coming back to haunt us—laypeople and scientists alike—even after its ghostly dualism has been denounced and exorcised.

Somehow, however, the dualist trappings of cognitivism persisted.  While behaviorism explicitly opposes them as mentalistic - and by definition non-deterministic and unscientific - the idea "that the mind and mental states exist as causally efficacious inner states of persons" has been a major assumption of many cognitive theories for the last half-century.

Wolfe fast-forwards us to 2005.  A paper by linguist Daniel L. Everett, titled Cultural Constraints on Grammar and Cognition, described a tiny Amazonian tribe, the Pirahã, with whom Everett had spent a great deal of time as a Christian missionary, and whose language broke all the Chomskian rules of Universal Grammar.  Indeed, it was observed that it was the culture of the Pirahã themselves that defined their language.
"...Their unique ways of living shaped the language - not any "language organ", not any "universal grammar" or "deep structure" or "language acquisition device" that Chomsky said all languages had in common."
This was a people who lived radically different lives than most everyone else on the planet.  Intensely isolated, they lived in a world most of us wouldn't recognize.  Because of their cultural patterns, they simply had no use for concepts and ideas most of us might take for granted.  They used almost no tools, they kept few material goods, neither read nor wrote, and had no mathematical concepts beyond "a lot" and "a little".  When Everett showed them black and white images, they struggled to make sense of them, so unaccustomed were they to pattern recognition.  Furthermore, they had no words for yesterday or tomorrow, but rather referred to them as "other days".  With little conception of the past, Everett's attempts at converting them to the Christian faith were hopeless.  In a hilarious (and, for me, honestly quite triumphant) passage of the article, Wolfe writes of what happened when Everett tried to teach them about Jesus:
"How tall is he?" the Pirahã would ask.
"Well, I really don't know, but -"
"Does he have hair like you?" meaning red hair.
"I don't know what his hair was like, but -"
The Pirahã lost interest in Jesus immediately.... After about a week.... one of the Pirahã, [named] Kohoi, said to Everett politely but firmly, "We like you Dan, but don't tell us any more about Jesus."
 
To a behaviorist, this makes perfect sense.  Culture evolves, and with it language.  Why would a language develop if it wasn't being reinforced?  To the Pirahã, who had no use for concepts of tomorrow or yesterday, no such language was needed.  The universal structures Chomsky proposed to have evolved in all humans were curiously absent in the Pirahã.  Everett argued that this was, as the New Scientist described his claim, "the final nail in the coffin for Chomsky's hugely influential theory of universal grammar... [though] most linguists still hold to its central idea."

In my work, I regularly encounter children who completely lack any language - or, as we say, a verbal behavior repertoire.  There can be many reasons for this, but often the most salient is a lack of learning opportunities.  What this generally means is that, either because of their genetic make-up or their specific environmental history, they require more exposure to certain stimuli before relationships develop.  We apply the principles developed by Skinner in Verbal Behavior, in which different contingencies of reinforcement are targeted for different language skills.  We teach echoic (vocal) imitation to begin to develop in the learner the ability to produce certain sounds.  We find motivating items and make their availability contingent on emitting certain requesting ("manding") behavior, usually by handing over a correct image of the item, signing, or making a correct vocalization.  We teach receptive language skills by saying the name of an item or activity and reinforcing when the child indicates the correct response.  We teach labeling by asking what an item is and reinforcing the correct response.  We teach what Skinner called "intraverbal" behavior, in which a response is controlled only by other verbal behavior, i.e. not simply what is in the room (e.g. What do you wear on your head?  Hat).  We teach matching and categorizing items into groups by what features they share, what they do, or logical classification.

All of these skills are slowly built up in a child until the skills, as a whole, represent a language repertoire. The child is now talking.  The child is now thinking.  Now, by definition, these children have a disability.  They were not raised in an environment completely lacking in language.  For whatever reason, they were not able to "pick up" language as easily as their typical peers.  There are many reasons for this, and much is still not yet understood.  However, a common feature of the autism diagnosis is that what is sensorially pleasing to a typical child might not be so to a child with autism.  For instance, eye contact, a critical component of social connection, is often lacking in autistic children.

And yet it is critical for learning.  When a baby makes eye contact with you and you smile back, indicating joy, the baby experiences social reinforcement.  This will go on to become one of the most powerful mechanisms in the child's life as it grows and begins interacting with the world.  Some of the first behaviors of an infant are shaped by differential social reinforcement.  That is, when a baby says "mago mago", we likely show little excitement, and there is little reward for the child.  The sound will be no more likely to occur again.  However, if the child says "mama mama", and our eyes light up and our cheeks rise and the corners of our mouth widen, that behavior has just been rewarded, and will be more likely to occur again.  The baby wants that response, and so will say mama over and over.  Soon, it realizes that when it says "mama", we stop what we are doing and pay it attention.  That little word is like a giant invisible rope that it can now use to turn us around.  Yet, if the child is uninterested in your facial expressions, it will miss the difference in response you show.  It won't be able to learn to say words it can use to control its environment.

This process of conditioning extends its reach into every aspect of our lives.  More words mean more leverage in the world.  They mean more access to new items and activities.  They mean more thoughts and ideas.  Children without language live in very simple worlds, with only the simplest access to rudimentary, basic sensory activities.  But as language develops, so too does their world:  a toy car is no longer a thing with wheels that spin, it is a people mover that drives on a road and crashes into things!  Little people panic and move out of the way.  Categories of community helpers need to come and help put out the fire.  Friendships are forged in trust over the terrible disaster.  Social relationships are explored and complex emotions are developed.

Nature versus nurture has always been a puzzle.  As soon as babies are born - even within the womb, really - they begin to experience environmental stimuli and the process of conditioning.  What would it be like if a child were raised without humans around?  We've never been able to do such an experiment, for obvious ethical reasons.  However, in these children, who lack a typical means of conditioning, you have in some ways a control group for what it would look like.  It is as if they weren't exposed to the normal contingencies that day-to-day social life exposes typical children to.  Like the Pirahã people, they are cut off from that process of socialization, even if, unlike the Pirahã, the barrier is not geographic but physiological.

Apparently Chomsky wasn't moved in the slightest by Everett's revelations.  Wolfe writes that Chomsky told him Everett's opinion "amounts to absolutely nothing, which is why linguists pay no attention to it."  It is a reaction strikingly similar to the one he gave Skinner's 1957 book - although at least then he wrote a scathing review.  The original idea - which assumed some mysterious physiological organ capable of generating new thought independent of one's environmental context, which threw out the notion of language as a behavior conditioned like any other, which revolutionized the fields of psychology, linguistics and education, among others - was finally refuted by a small tribe in Brazil.  Meanwhile Behaviorism, the field Skinner was instrumental in developing, for which work he received the Humanist of the Year Award in 1972, which spawned multiple journals and is today the pre-eminent treatment for autism and has helped millions of individuals begin to speak, still struggles to find mainstream acceptance.

Wolfe closes by noting that despite Chomsky's theory failing to ever find any hard evidence, he
"had made the most ambitious attempt since Aristotle's in 350B.C. to explain what language exactly is.  And no one else in human history had come even close.  It was dazzling in its own flailing way..."

Actually, nothing could be further from the truth.  Behaviorists have not only a much better explanation, but we put it into practice on a daily basis.  While it may be rather complex, a bit unintuitive, and downright subversive to many traditional ways of thinking, its results are indeed quite dazzling.  Those of us in the field can hardly keep from imagining how transformative a larger, more widespread adoption of our techniques and principles might be in other areas of human development, such as education or criminal justice.  And something tells me the foundation of our science is a lot more solid than could be overturned by research on one obscure population in a remote part of the world.  As I used to tell my students when I taught high school biology, all it would take to overturn evolutionary theory would be to find the fossilized remains of a human in a stratum of rock from the age of the dinosaurs; similarly, all it would take to overturn our theory of language would be to find a culture whose language was not explainable by conditioning or reinforcement.  But I won't hold my breath.


Sunday, August 14, 2016

Vote Hillary for Prez, Play Hacky Sack with Rednecks

I have a pretty low opinion of Noam Chomsky.  I tend to find him a cranky, paranoid a-hole.  So I was pleasantly surprised to read this cogent argument for voting for the lesser-of-two-evils candidate, or "LEV".  Especially this bit:
The left should devote the minimum of time necessary to exercise the LEV choice then immediately return to pursuing goals which are not timed to the national electoral cycle.

So, quickly vote LEV, then get on with the larger process of political change.
I think many arguers against LEV have deluded themselves into thinking that the masses of unheard/unvoting people are really with them, but afraid to vote their conscience.  This is a common fantasy on both the left and the right: that there is more support for their side than is reflected in elections.
I think this might be true among a small number of voters. But far more are not at all "with them" - either because they don't care or simply disagree.
People's voting habits generally reflect their political beliefs.  You can see this as polling matches up with geography.  In places like Berkeley, CA, with high numbers of progressive voters, you don't see many center-left, neo-liberals elected to local office. Neither, in rural, right-wing conservative towns like Bristol, TN, do you see many moderate Republicans.
Yet in races involving large, politically diverse electorates, such as senate or presidential races, that is exactly what you get: candidates who appeal both to their more politically extreme base, as well as the moderates and swing voters.
If you want a more extreme candidate - a socialist, or a creationist - you shouldn't be withholding your vote in general elections. Instead, you should be working to convince people in Berkeley to accept Jesus Christ as their lord and savior and that abortion is murder, or people in Bristol, TN to raise the top marginal tax rate back up to 90% and rewrite the second amendment.
Of course, that's a bit more difficult than the faux sense of personal pride in not voting for the LEV.

Wednesday, August 3, 2016

Caring Less (or What's at the Top of Mt. Conservatism)

Voting, our most precious right, is also our most important civic duty.  The idea of someone being able to fraudulently vote in an election - effectively nullifying our own vote, is justifiably worrisome.  Why not, then, pass strict requirements - such as a driver's license or state-issued ID - to make sure people are who they say they are?

Because many do not have a state-issued ID, and would thus not make it to the voting booth.  This is an empirical fact, and we have examples of just this sort of thing happening.  A study of Indiana's 2008 general election found that 
 out of the roughly 2.8 million persons who cast ballots during Indiana's 2008 general election, 1,039 arrived at the polls without valid identification and then cast a provisional ballot. Of those 1,039 persons without valid identification who cast a provisional ballot, 137 ultimately had their provisional ballot counted.
So a balance is being struck between lower requirements and higher voter participation, versus higher requirements and lower participation.
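To put the study's figures in perspective, a quick back-of-the-envelope calculation (a sketch only; the 2.8 million total is approximate, as quoted):

```python
# Figures quoted from the Indiana 2008 general election study above.
total_ballots = 2_800_000      # roughly 2.8 million ballots cast
no_valid_id = 1_039            # arrived without valid ID, cast provisional ballots
provisional_counted = 137      # provisional ballots ultimately counted

# Share of all voters who showed up without valid identification
no_id_share = no_valid_id / total_ballots
print(f"Arrived without valid ID: {no_id_share:.3%} of all ballots")

# Of those, the fraction whose provisional ballots were ultimately counted
counted_share = provisional_counted / no_valid_id
print(f"Provisional ballots counted: {counted_share:.1%}")

# Ballots effectively lost to the ID requirement among those who still showed up
lost = no_valid_id - provisional_counted
print(f"Uncounted provisional ballots: {lost} ({lost / total_ballots:.3%} of all ballots)")
```

So fewer than four hundredths of a percent of voters hit the ID barrier at the polls, and roughly one in eight of those ultimately had their vote counted.  Note this counts only people who still showed up; it says nothing about those deterred from showing up at all.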

The interesting thing to me about the Voter ID debate is that it comes down to one's sympathy for those who would or would not be affected by passage of more stringent requirements. One side says the requirements are reasonable and that if people end up not voting it is their fault. The other says that it is unreasonable and that it is not their fault. 

Which, of course, is the original issue at the core of the left/right divide: what personal responsibility means; how much control each of us has over our lives; what freedom means. 

I sometimes wonder about the chicken or the egg. Does this philosophical position lead one to a political stance, or does a political stance lead one to this philosophical position?  But without knowing the details of one's learning history - or history of reinforcement - it would be hard to know either way.


As I've argued on this blog before here, here, and here, I think the logic of a conservative philosophical position, rooted in a free will notion which views the individual as free to act unless physically restrained, actually pushes one towards less sympathy for those with disadvantage.  This aligns with ancient patterns of bigotry.

A stereotype is defined as a "fixed and oversimplified image or idea of a particular type of person or thing".  Stereotypes feed bigotry when they essentialize an observed behavior into an innate trait of a group (blacks are violent, Mexicans lazy, women dumb, Jews conniving, etc.).  A view that eschews historical social pressures in favor of people acting of their own accord is more susceptible to buying into a bigoted narrative.  Bigoted stereotypes are incorrect because they take a kernel of truth - a generalized behavioral observation - and justify a blanket view of a group as possessing some inherent trait.  Modern, civilized people aren't supposed to hold such beliefs.

If one believes group differences are a product of their learning history, then individual stereotypes are at best only an example of one individual's own learning history.  A "dumb" blonde is not an example of her gender's inferior intellect, but of her lack of socialization and education.  The "truth" of a stereotype (e.g. a supposed pattern of "dumb" blondes) is only evidence of a larger socialization process which is apparently not adequately educating blonde women.  The same could be said for Hispanics disproportionately performing menial labor, blacks in prison, or successful Jews.  In other words, you could take any member of the supposed stereotypic class, drop them into the learning history of any other, and they would turn out similarly.  This is the liberal, scientific worldview, rooted in behavioral and social science.

If one discounts this view, and instead believes group differences are a product of individual choices, then individual stereotypes are at best only an example of an individual's own choices.  The "dumb" blonde is choosing to be dumb.  The violent black man is choosing to be violent.  However, what then to make of general stereotypes - patterns of behavior such as repeated encounters with "dumb" blondes?  Since no socialization structure is at fault, and individuals are free to act of their own accord, how to explain this phenomenon?

It is here, deep, deep, deep into the philosophical weeds of political philosophy, tangled in the vines of intuitions and examinations of human nature, that - in my view - conservatism becomes incoherent.  Two roads can be taken: A) soft (liberal) conservatism, or B) hard (right wing) conservatism.

Soft conservatism concedes the point on human nature to science - we are indeed a product of our environment.  What then to do about obvious disparities and inequities among groups?  It is a moral duty to help, seeing as moral responsibility (duty and obligation to fairness) is now logically forced to acknowledge disparity and privilege, and act.  How, as a society, then, do we do this?  We can't involve the government, as it is (per conservative assumptions) corrupt and inefficient.  Government intervention will only make things worse.  Here welfare is pointed to as promoting poverty - reinforcing it, to use a scientific term.  Instead, the soft conservative must rely on the promotion of faith and family as the source of intervention.  The privileged individual must fulfill his or her own moral obligation by direct contribution to charity.  Whether or not this is the more efficacious intervention becomes at this point wrapped in first assumptions embedded deep in ideology.  Some soft conservatives acknowledge that indeed certain government actions are necessary - such as public education, but draw the line at others, such as food stamps, which foster dependency (how this is true of government charity, and not private charity, I've never understood).  

However, in practice this position results in the large-scale abandonment of millions of underprivileged people.  Private charities are nowhere near up to the task of the interventions required to actually rectify inequalities and allow individuals to live up to their full potential, much less acquire a basic minimum income, health insurance, job training, neighborhood safety, etc.  In fact, soft conservatives would claim, government intervention (financed by taxation) actively stands in the way of the most powerful intervention for personal fulfillment: jobs created by business.  Left unmentioned is the fact that enormous sectors of the economy depend on disenfranchised, low-skilled populations working for poverty wages.  I'm not sure how a soft conservative squares this inevitably Darwinian aspect of capitalism with his view that inequality is a product of social forces.  Is there some magical capitalism in which all low-skill labor is undertaken only by teenagers on summer jobs while preparing for lucrative careers after college or trade apprenticeships?  That's a lot of dishes washed, boxes loaded, lettuce picked and asses wiped.  Here, I fear, is the logical end-point of soft conservatism, shrouded in an airy, high-altitude mist that few will ever dare to climb.

Rather, I imagine many soft conservatives, sensing dangerous contradiction, adopt a useful version of hard conservatism, in which an appeal to individual choice and free will is leaned on in times of rhetorical necessity, i.e. "Well, it is true that he is a product of his environment, but he should have known better.  There is still some aspect of freedom in all of us. This accounts for his ultimate personal responsibility for his actions.  I don't care if he grew up poor in a single-parent household, dropped out, fell in with a gang... he made a choice."  This is a clever technique, but unfortunately also a rhetorical fallacy known as "ignoratio elenchi":
....presenting an argument that may or may not be logically valid, but fails nonetheless to address the issue in question. More colloquially, it is also known as missing the point.
The soft conservative is forced into this error because his larger conservative ideology of limited government clashes with his acceptance of the science of deterministic human behavior.  I'm also curious what role a basic temperamental aversion to norm-breaking might play in his resisting determinism and wanting to continue to place blame on the individual.  It has been argued that authoritarian tendencies, as well as sensitivity to disgust, are correlated with conservatism.  These could be genetic, or even epigenetically latent in each of us, but brought out by some form of conditioning.  Regardless, the ignoratio elenchi simply appeals to a non-scientific viewpoint, which is the foundation of hard conservatism.

Hard (right wing) conservatism refuses to accept the science on human nature, and instead appeals to a mentalistic notion of causality as existing within the individual.  We are all free to make choices, and thus are ultimately responsible for our own actions.  However unscientific - the most parsimonious evidence overwhelmingly favors human behavior as a product of the interaction between genetic (phylogenic) and environmental (ontogenic) processes - hard conservatism dovetails perfectly with an ideology of limited government and personal responsibility.  Because we are free to make choices, no government (or even charitable) interventions are morally required, as individuals are responsible for their lot in life.

At first glance, hard conservatism would not seem especially susceptible to stereotypes.  If people are free to act of their own accord, individual "dumb" blondes, or violent black men, have no relation to their respective group.  However, what to make of stereotypic patterns of behavior among groups?  One "dumb" blonde is understandable, but what to make of a larger pattern?  What to make of a disproportionate number of black men committing violent crimes?  Now, it appears, we are back at square one: what causes these behaviors?  If we all have free choice, wouldn't every group engage in patterns of behavior at roughly the same rates?  Wouldn't there be as many wealthy Jews as Mexican immigrants?  Why aren't there more female physicists?

Remember: as hard conservatives we aren't allowed to appeal to social structures and environmental learning (we especially don't want to because that might lead to our own moral culpability, which could be aversive).  So what is a hard conservative to do?  Either accept some of the evidence of social learning, and soften your conservatism, or double down, and assume some essential aspect of the group that would explain why the individual might be more likely to engage in certain behaviors, i.e. racism and bigotry.

It is no coincidence that bigoted ideology is correlated with right wing politics.  Groups classified as hate groups by the Southern Poverty Law Center are overwhelmingly conservative.  Of course, it is perfectly possible to be a liberal and a racist.  If I view blacks as inferior, I may want a more active government trying to control them (ironically, some conservatives indeed favor an active role for government when it comes to oppressing gays).  But, to the point of this article, one can accept the scientific view of human behavior as genetically and socially determined, as well as the idea that some groups are genetically superior to others.  Such people are a vanishing minority, though, likely because once you begin accepting the science, you are already down the road to a more objective epistemology, and have been reinforced by the process of becoming open to new ideas.

But isn't all of this stuff rather high-level?  These may be the metaphysical assumptions upon which our ideologies are built, but how many of us ever examine them so deeply?  Rather, the average citizen simply has beliefs, attitudes and behaviors towards others.  Political positions are taken, stances adopted, feelings felt and then explained accordingly.  As noted, the vast majority of citizens identify as non-racists, as non-bigots.  Heck - even avowed racists sometimes deny being racist; they merely identify as pro-white, pro-separation.  It is attitudes and actions, then, that we must examine.

In my last blog post, I described a process by which we can identify as non-bigots, yet actively engage in practices, through our language and actions, that indeed privilege certain groups over others.  Someone says - believes - they are not racist, and yet crosses the street to avoid an unknown black man.  Someone is irritated by the slang or enunciation of a minority, yet fails to recognize their own (i.e. critiques a black girl for saying "axe" instead of "ask", but themselves says "appointmeh" instead of "appointment", or "Immanah" instead of "I'm going to").  Instances of these types of failures to treat everyone fairly - to offer the same sort of compassion and understanding to other groups as you might your own - are myriad.  They are difficult to identify and quantify, and thus to properly point out.  It thus becomes difficult to argue for their existence, or that someone has engaged in them, especially when what they indicate is bigotry, maybe the most socially unacceptable behavior one can be accused of.

But they do exist, and represent real barriers for members of these groups.  They certainly aren't as bad as the institutionally codified strictures which in the past explicitly discriminated against disadvantaged groups.  But they can, individually, be just as damaging.  A boss who discounts his female employee's intelligence, a landlord who favors white tenants, a teacher who disciplines his Hispanic student more harshly - all have very direct consequences.

The process is unconscious.  Studies show over and over how subtly biases creep into our thoughts and actions.  This is the nature of bias, and it is a quite natural process.  Each of us is biased in ways that are values-neutral.  We might be biased towards certain types of movies, music or foods.  We might be biased towards certain types of personalities, or even interest groups - such as readers, hunters, or church-goers.  It's generally a useful and benign behavior.  Scientifically, it is how we function in the world.  Certain behaviors such as hunting or reading Jane Austen bring us pleasure, and thus reinforce the behavior of hunting or reading.  We then associate those activities with pleasure, and can now be considered to have a "bias", in that we will be more likely to engage in those activities in the future.  Likewise, we can eat a rotten banana, or watch a certain type of film, and experience displeasure.  The act of banana-eating or watching that kind of movie has been punished, and we will be less likely to engage in it in the future - we are biased against it.
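The reinforcement loop just described can be caricatured in a few lines of code.  This is a toy sketch, not a claim about real behavioral modeling; the update rule and rate are purely illustrative:

```python
# Toy sketch of the operant process described above: each activity has a
# propensity that rises when engaging in it is reinforced (pleasant) and
# falls when it is punished (unpleasant).  "Bias" is just the resulting
# propensity to engage in the activity again.

def update(propensity, pleasant, rate=0.2):
    """Nudge propensity toward 1 after reinforcement, toward 0 after punishment."""
    target = 1.0 if pleasant else 0.0
    return propensity + rate * (target - propensity)

austen = 0.5         # start indifferent to reading Jane Austen
rotten_banana = 0.5  # and to eating rotten bananas

for _ in range(10):
    austen = update(austen, pleasant=True)                 # reading is reinforced
    rotten_banana = update(rotten_banana, pleasant=False)  # eating is punished

print(f"bias toward Austen: {austen:.2f}")                 # climbs toward 1
print(f"bias toward rotten bananas: {rotten_banana:.2f}")  # sinks toward 0
```

The point of the sketch is that nothing in the loop requires awareness: the propensities drift with each consequence whether or not the "agent" ever inspects them, which is the sense in which the process is unconscious.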

The process takes place whether we are "conscious" of it or not.  We can learn to become conscious of the process, by experiencing pleasure after (or being reinforced by) the behavior of thinking about our experiences.  But most of our biases go unexamined, for good reason.  We wouldn't want to have to constantly be aware of our preference for every little thing we do.  Fortunately, we can "turn it off".

Yet if we don't know that we are doing it, it can come as a shock to have it pointed out.  We may not want to admit it if it clashes with our values of fairness and justice.  Furthermore, it can be difficult to account for.  Because so much of our daily behavior occurs without us actively, explicitly being aware of it, when we rely upon our memory of past behavior, we are highly unreliable.  In some ways, we are the best authority on our past behaviors, as we are at least always "within our skin" (as Skinner liked to put it).  But we are only as good as our memory, and few of us have a very clear record of how we have behaved in the past.  At best, we have fragments of experiences, missing great portions of the experience as it actually was at the time.

Others can oftentimes be much better authorities on our past behaviors.  They may recall words and actions we engaged in that we ourselves do not recall, or even misremember.  They may be more attuned to certain of our behaviors that we were never even aware of at all.  We have all had the experience of identifying ways in which others are biased that they themselves are not aware of.  We are probably also aware of times in which, when we confronted them with examples of their biases, they refused to believe us.  Indeed, if one's image of oneself is dependent on not acting in certain ways, evidence of such actions can be quite aversive.

Conservatism, whether hard or soft, inevitably faces the dilemma of what to make of apparent patterns of behavior among groups that must be explained, or at least reacted to.  As an ideology, it is ill-equipped to deal with the moral component.  The duty to respond, under the scientific view of human nature, requires a personal duty to intervene that limited government and an emphasis on others' personal responsibility can't abide.  As such, conservatism is forced into advocating policy positions that inevitably devalue the experience of the disadvantaged: to the extent that the unfairness of the disadvantaged's position is lessened, the moral duty of the conservative is lessened.

If the plight of the hurricane refugee is lessened, so too need be the conservative response.  If the plight of the teenage parent is lessened, so too the response.  So too for the immigrant, the gay, the drug addict, the homeless, etc.  Conservatism can almost be defined by a continued assault on the moral obligation we might owe the disadvantaged via a continued assault on the disadvantage itself - whether on the actual lived experience or on who is to blame for it: it isn't really so bad, or they brought it on themselves.

In this light, so much conservative rhetoric can be seen as simple moral deflection.  Recently, conservative Bill O'Reilly gave a crisp example.  After Michelle Obama invoked the moral historicity of her lived experience,
“I wake up every morning in a house that was built by slaves, and I watch my daughters — two beautiful, intelligent, black young women — playing with their dogs on the White House lawn.”
O'Reilly, clearly defending himself (and fellow whites) from any implied historical moral culpability, felt the need to point out that the slaves were indeed
"well fed and had decent lodgings provided by the government."
Other notable examples of this conservative tendency are simply too many to list.  But one would be forgiven for getting the feeling, listening to conservatives, that they are constantly under attack for holding unpopular beliefs.  In fact, this is true.  Their beliefs are incompatible with scientific notions of human nature, which establish a case for a moral culpability that the rest of us share and that they deny.  They are thus defensive, perceiving (correctly) that they are viewed as immoral actors.  In order to remain consistent with a self-conception as being in favor of fairness and justice, they spend much of their time trying to demonstrate that they are not immoral, by either arguing against the existence of the disadvantage or explaining how it is the fault of the victim of disadvantage, for which they bear no personal responsibility.

Is it any wonder, then, that a law which seeks to compensate for the disadvantage some groups experience in their ability to vote would be argued against by conservatives - especially when said groups tend not to be conservative, precisely because of conservatism's inherent aversion to feeling a moral obligation to help disadvantaged groups?