Years ago, I was walking down a street in Brooklyn with my family, on our way to the subway museum. My wife and I were talking to each other, trying to follow the directions on a map, when suddenly an uneasy feeling came upon me. It slowly dawned on me that lots of eyes were staring at us, and soon the realisation occurred that we were the only White people in a neighbourhood where everyone else seemed Black. While most people were going about their usual business, there were definitely eyes of contempt directed at us. Almost as soon as I became aware of our situation, my wife seemed to have become aware of it as well, and a sense of unease settled over both of us. Given that we had the children in tow, and the subway was only about 300 meters away, we agreed that it would be better to nonchalantly cross the street and get back to the subway. We got there without incident. I imagine everyone has a similar story.
What was interesting about this situation is that the sense of threat posed to my family--assessed either rightly or wrongly--was first appreciated by an unconscious processing of the environmental situation, which eventually asserted itself onto my consciousness. Clearly, what had happened whilst my wife and I were walking down the street, talking about our day's plan, was that visual information from my immediate environment was being assessed by subconscious subroutines and a threat analysis was being made, all without any deliberate awareness on my part. Eventually, the subroutine determined that action was required and impressed itself on my consciousness by eliciting a sense of unease.
Now, the traditional approach of Western anthropology divided man into the rational and the irrational, or more specifically, into biological and cognitive elements. What has become increasingly apparent is that this anthropology is wrong, and that the division should be made along the lines of biological, biocomputative and cognitive elements, with the biocomputative layer able to process information most of the time without our conscious awareness of it, making itself noticed only through the sensations of intuition, attention, attraction and repulsion.
If you think about a simple task such as walking, you'll realise just how complicated a task it is to accomplish in real life. Neurons must fire to increase the tension in some muscles and decrease it in others; balance must be maintained, objects avoided, speed maintained, and so on. If these actions were the result of a conscious deliberative process they would overwhelm the mind; instead, subconscious subroutines take care of all the work, and walking simply becomes an issue of command. Robotics engineers looking at the problem have termed it Moravec's Paradox.
The main lesson of thirty-five years of AI research is that the hard problems are easy and the easy problems are hard. The mental abilities of a four-year-old that we take for granted – recognizing a face, lifting a pencil, walking across a room, answering a question – in fact solve some of the hardest engineering problems ever conceived... As the new generation of intelligent devices appears, it will be the stock analysts and petrochemical engineers and parole board members who are in danger of being replaced by machines. The gardeners, receptionists, and cooks are secure in their jobs for decades to come.

Our conscious rationality rests upon a deep foundation of biocomputative processing of which we are largely unaware. Now, most people instinctively grasp the fact that biologically automatic processes are involved in motor movements, but what needs to be recognised is that automatic processes are also involved in our perception and cognition, producing unconscious biases in our thinking. The influence of these biases is most marked at the level of System 1 thinking, in other words the thinking of the cognitive miser: the average man. These biases are not bugs but features of the system, designed to ensure our long-term biological survival in a primitive environment. For instance, the preference for blue amongst Europeans may be "hard coded" in our DNA as a result of evolutionary--or whatever else--selection. It's not a learned behaviour.
and:
Encoded in the large, highly evolved sensory and motor portions of the human brain is a billion years of experience about the nature of the world and how to survive in it. The deliberate process we call reasoning is, I believe, the thinnest veneer of human thought, effective only because it is supported by this much older and much more powerful, though usually unconscious, sensorimotor knowledge[Ed]. We are all prodigious olympians in perceptual and motor areas, so good that we make the difficult look easy. Abstract thought, though, is a new trick, perhaps less than 100 thousand years old. We have not yet mastered it. It is not all that intrinsically difficult; it just seems so when we do it.
The point of all this is for people to recognise that we come built in with biases and preferences which are not deliberately chosen but which are "hard wired" into our biocomputative architecture. Sometimes our preferences are not a choice but a feature of our innate biology, and therefore intrinsic to what it means to be human. Stereotyping, colour association and homophily are examples of human biases which may provide a foundation for thinking about human relations in a way which avoids the pernicious effects of Darwinianism, which has so plagued the Right in the 20th Century.
For those who are interested, some scientific papers:
Link 1.
Link 2.
Link 3.
Link 4.
Link 5.
Link 6.
Years ago, Arthur Koestler observed that if the centipede had to think consciously about which order to move his legs in, he wouldn't be able to walk at all.
Someone else observed that a large organization is "a hierarchy of wastebaskets." It HAS to be this way, as the CEOs who are going overboard on de-managing are going to find out.
Has like... nobody read CS Lewis Abolition of Man? It's all right there.
One more thing:

If you think about a simple task such as walking, you'll realise just how complicated a task it is to accomplish in real life. Neurons must fire to increase the tension in some muscles and decrease it in others; balance must be maintained, objects avoided, speed maintained, and so on. If these actions were the result of a conscious deliberative process they would overwhelm the mind; instead, subconscious subroutines take care of all the work, and walking simply becomes an issue of command.
And if you've watched babies, you'll see the effort it takes to walk with conscious deliberative processing. Which must be so: unlike the horse, man doesn't come out able & ready to walk. Instead he must be taught to walk. And like all things, man then practices - using consciousness - over and over until eventually the commands & routine reach the subconscious level, and then walking becomes "innate." But it's not just something done automatically at first. The ability to self-program (with effort) new instinctive commands is probably the most powerful feature of humanity.
I wonder if rational and irrational are just poorly formed labels. Most inborn biases, emotional responses and various autonomous subroutines have some basis in adaptive fitness. And we know this because it's obvious to *our rational minds*. So then calling autonomous subroutines "irrational" is just a bad name for them. Perhaps subrational, or pre-cognitive is a better term. It's hard to imagine Aristotle or Aquinas being overly flummoxed by this categorization. "Irrational" would better serve to describe misuses of the rational mind.
@David

Agree.
@Nate.
The thing about human automatic functioning is that it is capable of self-learning. No child says to themselves, "move the rectus femoris muscle by this much, then relax the hamstring muscles by that much, then counterbalance on the opposite leg." Walking comes intuitively to kids.
One of the most amazing things I ever saw was a blind baby get up and walk. Think about that for a moment. How do you balance without visual cues? A lot of human behaviour is automatic in the fine detail.
@Nick,
Thanks for dropping by.
I think arational is a better term.
Stanovich is beginning to define rationality properly, recognising that the Aristotelian notion--which has influenced Western philosophy and Christianity--is a fairly weak definition of it, i.e. rational vs irrational, whereas modern cognitive science tends to view it as a spectrum, i.e. rational------irrational vs arational. There is a whole category that the Aristotelian view does not incorporate. Would Aristotle, were he alive today, have any issue with it? I doubt it. But this whole layer of "machine cognition" which operates under the rational---irrational spectrum is ignored.
In days of yore, this machine intelligence went under the name of human nature, but where exactly does human nature sit? As I understand it, the Aristotelian concept of rationality viewed human nature as something implanted in our "rationality" instead of something that existed outside of it. The implications of this subtle shift in understanding are profound. If human nature is something that subsists within our rationality, it is amenable to learning and change. On the other hand, if it subsists outside our rationality, it is only capable of control. Let me explain the significance of this.
Take fat acceptance, for instance. Under the old Aristotelian notion, education and argument can perfect human nature to make a man find fat women attractive. Aristotle's definition of rationality seems to feed the blank slate notion of human nature by positing that it is malleable through instruction. On the other hand, if our machine programming is designed to find fat women repulsive, then the best we can hope for is a suppression of machine-generated disgust by rationality. We aren't a blank slate but are "front loaded with values" simply by being human. The "Law written upon the heart" is not made up in the head but is embodied in corporeal machine intelligence.
As Stanovich puts it:
"The great debate about human rationality is a "high stakes controversy" because it involves nothing less than the models of human nature that underlie economics, moral philosophy, a personal (folk theories) theories about human behaviour."
Political theory and social theory become transformed as well when we realise that political man is not a "rational animal"; rather, he is a cognitive miser, in whom machine intelligence, with all its 'irrational' biases, exerts a greater influence on his decisions than informed thought.
This shit is important, especially for NRx.
Great post, I think about this a lot.
We have a problem in that we have conflated "reason" with purely the logical reasoning side of affairs; however, logic alone provides no premises to plug into your syllogisms. All of our premises are a product of direct perception.
I may be getting this wrong, but I believe this direct perception is what the Romans referred to as mens, the Greeks as sophroneos, and what Chesterton called "common sense". It is the part of the mind that directly perceives some aspect of reality. It is not "irrational", as you have pointed out, but is the basis (in the sense of foundation) of reason.
Common sense is rightly called a "sense" because it is a kind of intellectual sight. Logic exists to help correct this sense, since this sense is touched by original sin as are all the others, and so logic is invaluable.
Thomas didn't have the highest honor because he did not believe until he had seen, but he DID believe after he HAD seen. Leftism is focused on not believing what you do see. You perceived danger when you perceived hostility. The good Leftist would hang out until he got mugged or worse because he wouldn't believe the evidence of his own eyes for fear of thinking himself a racist.
To emphasize the point, I have been in a white hellhole or two in my time (contrary to leftist propaganda, many "privileged" whites live in circumstances of terrifying poverty and violence). I perceived danger and was impatient to leave. The point isn't white or black, it's being around men who intend you harm, perceiving that, and leaving. I doubt you would have the same danger response in a Missionary Baptist Church on Sunday, nor would I.
Stereotyping...are examples of human biases which may provide a foundation for thinking about human relations in a way which avoids the pernicious effects of Darwinianism, which has so plagued the Right in the 20th Century.
Stereotyping doesn't fit here. That's just IQ, or pattern recognition. Reminds me of Sailer's idea of the thoughtcrime of noticing patterns...
I don't see that Darwinianism has plagued the Right in this or last century. It's more a blight on the Left.
The Right tended to accept Darwinianism for what it is, and deal with the "pernicious effects" through religion and education. The Left simply denies Darwinism for humans, period, as if they are somehow magical creatures.
I read an article years ago in which it was pointed out that every person who got attacked had an uneasy feeling something bad was going to happen to them. Most of them ignored the feeling, to their detriment.
Stereotyping is like many type 1 systems: natural, intuitive, automatic, and wrong.

The fact that type 1 thinking can't handle correlations other than 0 or 1 causes a *huge* amount of grief and always will.
However, to ennoble it as anything more than a system that proved marginally evolutionarily useful millennia ago, and is intensely maladaptive in a contemporary environment, seems silly.

In a modern world where the chances of attack are almost nil, the cost of a protective instinct going off inappropriately 99.999% of the time vastly outweighs the 0.001% of the time it might provide a modicum of help.

We're stuck with stereotyping in the same way we're stuck with any number of suboptimal brain systems that encourage crime, obesity, etc. But the fact that we're stuck with it doesn't mean we don't try to structure society to prevent or minimize it.
@Tom
We're stuck with stereotyping in the same way we're stuck with any number of suboptimal brain systems that encourage crime, obesity, etc. But the fact that we're stuck with it doesn't mean we don't try to structure society to prevent or minimize it.
The problem, though, Tom, is that system 1 thinking is the default operative mode of human thought. Furthermore, for most people, system 2 thought requires conscious effort, which can be exhausted with overuse, and therefore we resort to system 1 cognition under fatigue or stress.
The thing about human automatic functioning is that it is capable of self-learning. No child says to themselves, "move the rectus femoris muscle by this much, then relax the hamstring muscles by that much, then counterbalance on the opposite leg." Walking comes intuitively to kids.

One of the most amazing things I ever saw was a blind baby get up and walk. Think about that for a moment. How do you balance without visual cues? A lot of human behaviour is automatic in the fine detail.
No, we really don't know. When babies are first born, they obviously lack motor control and a substantial amount of their early efforts are spent developing this. For all we know most of that time is spent developing the habitual commands of automatic functions, i.e. "I need to move my rectus femoris muscle - Oh this mental command does that - I need to do it again and again until I don't need to activate the command consciously."
** Yes, OBVIOUSLY a newborn doesn't strictly think like that, but you don't have to watch kids long to see that there's clearly something along those lines going on there. Why do you think they're so often big on repetition? Of doing the same task (even if it's just moving a block) over and over endlessly? Practice in developing the software & hardware.**
What do you mean, how do you balance without a visual cue? You do know balance is a function of the inner ear, right? You could blindfold yourself and balance easily. That wasn't the point; the point is, did the baby (even blind) walk perfectly the first time? Did it get up and run a small marathon on the first try? No. You're looking at the end and ignoring the transition, namely that the child had to make a conscious command to walk. It isn't until after several conscious efforts that the walking command gets pushed down into the arational system.
If behavior is automatic, then there's the obvious question: why physical therapy? Yes, there is a bit of a component of strength needing to be increased, but there are still those whose strength is sufficient while their command lines are not; they essentially have to rebuild their automatic functions. If it were all "self-learning", never needing conscious command input, then this should never happen. OR it should be impossible to recover after the command inputs have been damaged.
What you're missing is that both sides are right. The blank slatists are right in that we have an immense capability with training, so immense we haven't figured out what the limits of it are (save beyond the "chest" issue, which is why, again, I recommend the Abolition of Man). Obviously no baby is born with an intricate knowledge of computer coding, but give it some years and it can learn. On the other hand, the "it's all nature" folks are right in that we clearly come preloaded with some software/hardware features. But try and say that one side is completely right/wrong and we end up looking foolish and missing the "whole man." We create men without chests and... well, like I said, read that book by CS Lewis. It will at least give you some better vocabulary to discuss man's duality.
Tom, We're stuck with stereotyping in the same way we're stuck with any number of suboptimal brain systems that encourage crime, obesity, etc.
Stereotype: widely held but oversimplified image or idea of a particular type.
Stereotyping is actually super-optimal. It uses the intelligence of the group to make the best decisions possible with limited information, because we always have limited information in every data set. It's what makes primates so effective, sharing group knowledge, and what makes humans so powerful working in groups. Hell, it's what insurance companies use. It's saved more people's lives than seatbelts or bullet-proof vests.
@SP, I'm not disagreeing that we use Type 1 most of the time. But the framework in which most people live must be designed with type 2 thinking.
Of course, homophily exists, but given the massive harm it causes to humanity as a whole, I see no reason to mold society to it any more than I see a need to cater to other natural emotions such as violent jealousy.
There are a ton of human impulses that society suppresses, and because man is a product of both nature and nurture, such impulses are heavily modified by experience. I'm sure that if I wasn't raised in Toronto, a heavily multicultural city, I'd be slightly discomfited by my workplace where there is no majority race, I hear Arabic, Hindi and Russian spoken around me, and the clothing styles, while majority jeans and polo shirt, also include Hijab/Abaya and the occasional sari.
Nobody here (a programming shop) blinks an eye because this is what you see every day on the street.
Yes, there are fewer cultural touchstones with my coworkers, so I exercise some cultural homophily with my personal friends (we're all hard-core computer geeks from a variety of races), but surely homophilic tendencies are no basis for the structure of society.
Or should I be agitating for a society where everyone must have read the science fiction greats before getting citizenship?
@Anonymous, Stereotyping is actually super-optimal

Super-optimal? No, it isn't, and if you use it in most jobs you get fired, because you're expected to look at what is *actually there*, not what a lazy system suggests is the least amount of work you can get away with and still have some personal positive benefit.
It produces a slightly better *personal* result most of the time (which is why it persists), bad results overall some of the time, and occasionally catastrophic results.
And yes, insurance companies are in the market of aggregating risk, but they use statistical correlation. They don't deny coverage "because youngsters can't drive". They know exactly what extra risk is involved.
The simplification of stereotyping is sub-optimal, and since we're not hive-minded groups, almost always unjust. If and when I'm called to account for my sins, I'm pretty certain it'll be done on the basis of who *I* am, and not on the basis of some vague stereotypes of white people, or men, or computer nerds.
And if stereotyping is anathema in the kingdom of God, then at least as an ideal, I'm not going to embrace it here.