Saturday, December 31, 2005

The God of Death

Uncertainty makes most people very uncomfortable.

This is why we have traditions. This is why people are willing to trade freedoms for a perception of security. This is why we have religion.

This is why we have gods.

It is natural for any organism to develop a profound aversion to death. The more averse to death an organism is, the more likely it will live long enough to breed and pass on these aversions (biologically and culturally) to the next generation. Organisms that don't have this aversion are more likely to die, and by dying fail to pass on their lack of fear. Over time, natural selection ensures that the vast majority of individuals in a species will hate death, hate it deep in their minds, bones, and DNA. Fear of death becomes a deeply set neurosis across the species, irrational but ever-present.

The most likely explanation of what happens to an individual after they die is - nothing. Nothing happens. They're gone. Dead. Kaput. Pushing up daisies. Worm food.

I say this because there is really zero evidence that anything beyond this exists. Although there is a long history of "after death experiences", ghosts, etc., none of these instances is ever verifiable. And believe me, given the fascination everyone has with what happens "after", if there were any chance of verifying some truth of an afterlife, it would have been pounced upon and shouted from the highest towers. Instead you have rumors, myths, and charlatans capitalizing on the fascination and dread associated with this lack of evidence to feed people some form of hope that after they die, they're not really dead. (It is these who shout from the highest towers, offering no verifiable evidence, but only a demand for faith.)

And religion is just the most organized system of rumor, myth, and charlatans trying to provide some certainty for people about an afterlife.

God (or gods, for the polytheistic) is a very handy concept for addressing this deeply held neurosis. Don't understand life, how we came into being, from whence came the Universe? God did it, he has it well in hand, don't worry about it. Don't understand what happens after death (and does this uncertainty profoundly disturb you)? Ask God, and he'll tell you whatever makes you more comfortable. Don't understand why your wife got cancer? Why your child was paralyzed? God had his reasons, and his reasons are mysterious, not for men to ken.

God is a very useful invention for a species that must be, by the very fact that it has survived on earth for millennia, neurotic about death.

Like the atomic bomb, it is an invention that can be very destructive if used poorly. But if we can understand the inner workings of our minds the way we learned to understand the inner workings of the atom, we can perhaps free ourselves to create new, wonderful inventions akin to the many beneficial developments that resulted from our understanding of the atom. Perhaps it's possible to derive comfort from our neuroses with facts instead of superstition.

It is possible to break out of this dead-end paradigm. We take comfort, and practical benefit, from understanding that diseases are caused by bacteria and viruses rather than satanic influences. It only took a couple thousand years to discover that Demokritos' "atom" (meaning "unable to be divided") really could be divided after all.

Maybe now that we're a few thousand years after the invention of monotheism, it's time we looked a little deeper.

Friday, December 30, 2005

Pointers For Success

Joel Spolsky just posted an article, The Perils of JavaSchools, which makes the claim that because schools teach students Java instead of a "lower level" language like 'C' or Scheme, he now has a harder time weeding out the "true" developer minds from the wannabes.

I realize that Joel is spelled G-O-D in some developers' workbooks, and that his blog is surely read by a lot more people than mine. But he is just wrong, wrong, wrong on this topic. (Well, maybe he's right in that he has trouble sorting out his idea of a good developer. But that's a personal problem. And a little-known and lesser-used language like Scheme? Lisp maybe, or Smalltalk - I think even Pascal got more academic use than Scheme. What little Private Idaho of a CS world did he grow up in, anyway?)

The argument Joel makes is that, since most students' programming experience these days is in a language that doesn't directly represent pointers, they can't have ever handled a truly hard problem. (He goes on to make a point that this is similar to learning Latin or Greek, in that the experience of learning those languages somehow makes you smarter. Now I don't know if Joel took Latin or Greek, or can read or speak either now, but I know he considers himself pretty darn smart - heck, I do too. Perhaps learning Scheme filled his ancient language requirement.)

Let's examine this a bit.

Like Joel, I started my programming career using punch cards. Writing programs was a long, laborious, tedious affair. Much planning was done before even writing the program (flowcharts, etc.), because producing a deck of punch cards that implement a poorly designed logic flow was a huge waste of time (not to mention trees).

Fortunately, some smart people (who still must have been taking Latin) came up with the idea of abstraction in computer languages. Each generation of computer language further abstracted the developer away from the bits being toggled in the computer. This abstraction, like any abstraction, allowed the developer to fit a broader scope into his limited working memory.

Most people have a working memory of about 7 'chunks' of information. If those 7 chunks are filled with seven octal representations of byte instructions, then it's hard to think about much else until you've worked through putting those seven bytes into a more persistent form, like punch cards. If those 7 chunks are filled with seven SOAP services, each of which performs large, complex operations, then the breadth of scope at which the developer is working is dramatically larger. This is why Abstraction Is Good.
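
To make the chunk argument concrete, here's a toy illustration of my own (in Java, since that's the language under debate): the same summation written at two levels of abstraction. The low-level version spends chunks of working memory on an index, a bound, and an accumulator; the high-level version spends one chunk on "sum these", leaving the rest of the mind free for the actual problem.

    import java.util.stream.IntStream;

    public class AbstractionDemo {
        public static void main(String[] args) {
            int[] data = {3, 1, 4, 1, 5, 9, 2, 6};

            // Low abstraction: index, bound, accumulator - three chunks
            // of working memory spent on pure mechanics.
            int sum = 0;
            for (int i = 0; i < data.length; i++) {
                sum += data[i];
            }

            // High abstraction: one chunk ("sum these").
            int sum2 = IntStream.of(data).sum();

            System.out.println(sum + " == " + sum2);
        }
    }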

Learning the seven assembler addressing schemes of the DEC RSTS and VMS operating systems, learning how to "cold start" (boot) our old IBM by entering the correct address space via toggle switches - none of this made me more equipped to deal with J2EE, SOAP, or (god forbid) Visual Studio.

In fact, I would be so bold as to say that developers who work in Java today are dramatically more productive than the 'C' developers I used to have on my teams. This isn't because they're smarter. They're just working at higher levels of abstraction.

Is it useful to ask a candidate to demonstrate basic concepts like pointers, linked lists, and recursion? Maybe. It would at least sort out those with some CS education from those who just read Visual Basic For Dummies. But these concepts are first-year CS concepts, so they're mostly just insulting to anyone who actually studied in the field.
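
(For the record, here's about the level of question we're talking about - a first-year exercise sketched by me in Java, where the "pointers" are object references. Which rather proves my point: the concept survives the abstraction just fine.)

    public class ListReversal {
        // A hand-rolled singly linked list; 'next' plays the role of the pointer.
        static class Node {
            int value;
            Node next;
            Node(int value, Node next) { this.value = value; this.next = next; }
        }

        // Pointers, linked lists, AND recursion in one question:
        // reverse the list and return the new head.
        static Node reverse(Node head) {
            if (head == null || head.next == null) return head; // base case
            Node newHead = reverse(head.next); // reverse the tail first...
            head.next.next = head;             // ...then hook this node onto its end
            head.next = null;
            return newHead;
        }

        public static void main(String[] args) {
            Node list = new Node(1, new Node(2, new Node(3, null)));
            for (Node n = reverse(list); n != null; n = n.next) {
                System.out.print(n.value + " "); // prints: 3 2 1
            }
        }
    }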

Are there good Java programmers and bad Java programmers? You bet. The old 10x productivity ratio between the best and the mediocre still holds. But the way to evaluate good from bad isn't to see if they can demonstrate a great facility with pointers.

(What is the way then, you ask? Ah - that's for my next post.)

(BTW - one of the best hard data cases for the classic 10x difference in programmer productivity actually comes from Joel in this article...)

The Deforestation of Requirements

Requirements documents are an unnecessary waste of time and trees.

Let's look at the typical process of developing a requirements document. This differs a bit between custom development and product development efforts, but the end result of each is typically a formal requirements document. (For my Reader's Digest version of the typical requirements document process, see the bottom of this post.)

The classic requirements document consists of words. Lots and lots of words. They may be organized functionally, by feature, by category (system, business, market...). But it's all words.

But words don't compile into working software. And here is our first huge impedance mismatch. Words can have multiple meanings. Put them into sentences, and the multiple meanings multiply. Put them into paragraphs, and usually instead of helping to clarify, it just adds more room for interpretation.

And because software developers need to somehow translate these words into bytes, there is a lot of further discussion. Many of these discussions are identical to the discussions that went into making the requirements document in the first place. "What does it mean when it says the system shall present the list of search results? What data from the results do you want displayed? Do you want it in a tabular list, or some other presentation? And how about 'must respond in a timely manner'? Is that 1 second? Or 10 seconds? Is that every time? Or just most of the time? How much is most of the time?"
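
(Compare that with pinning the words down in something that does compile. Here's a minimal sketch - the names and the one-second threshold are mine, purely illustrative - where "timely" becomes a number a machine can check, and the meeting can end.)

    public class TimelinessCheck {
        // Stand-in for the real search call; here we just simulate the work.
        static void runSearch() throws InterruptedException {
            Thread.sleep(120);
        }

        public static void main(String[] args) throws InterruptedException {
            // "Must respond in a timely manner" becomes:
            // all of 20 consecutive searches complete within 1000 ms.
            final long limitMs = 1000;
            for (int i = 0; i < 20; i++) {
                long start = System.currentTimeMillis();
                runSearch();
                long elapsed = System.currentTimeMillis() - start;
                if (elapsed > limitMs) {
                    System.err.println("FAIL: search took " + elapsed + " ms");
                    return;
                }
            }
            System.out.println("PASS: all searches within " + limitMs + " ms");
        }
    }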

Most of the Agile methodologies replace the requirements document with something a little more useful. Whether called use cases, story cards, or UI wireframes, these artifacts at least try to look at things from an end user perspective. (Which is great for business software, but a bit harder for embedded or infrastructure software.)

But the biggest stride forward in these agile methodologies wasn't the introduction of yet another way to capture words. It was the introduction of the concept of iteration between the developers and the users. It was the concept that it is almost impossible to get requirements right the first time, and that you'll need a few passes to refine some ideas, to recognize dead ends, and to find areas no one thought to include the first time (or two) around.

It was the recognition that no one is smart enough to see it all and get it right just by thinking about it and writing words in a document. It bears repeating - No One Is That Smart. Not users, business analysts, product managers, or software developers. No one. So don't start a process that assumes this is even possible.

(Here is where the advocates for requirements documents shout "it's an evergreen document - it evolves with the project!" Oh great - you recognize that iteration and refinement will be necessary, yet you introduce a piece of paper that must be continuously updated and reconciled with the software, and that at the end of the project is tossed out. I realize that the additional overhead implies a lot of extra dollars for the consultant, or a much bigger budget for the product development team, but let's pretend that's not our true goal here.)

Should a team make some sort of stab at figuring out what they want to build (and just as importantly, why) before starting on these iterations? Absolutely. It's a great idea to have an idea of what you want to build before spending more money to build it.

My point is that the initial stab should be just that - a stab, an outline, a set of notecards. Capturing the associated business reasons for each idea, and force-ranking those ideas, has lots of value too. But spending a lot of time trying to refine these and getting "buy-in" across an organization - this has been a waste of time in pretty much every project I've been involved with, because we just don't know enough yet. Not until we start to implement do we have the knowledge to refine.

And with today's IDEs making rapid iteration easy, this is the quicker, better path to good working software.

________________
Side Notes:

(For arguments on the other side, you can read a couple of good blogs from friends of mine here and here that make the case for formal requirements docs. My opinion is that their arguments are logical, but not practical in the real world of real corporations, real developers, and real training. I've tried both, and know which I've found has a better outcome.)

_______________________
Reader's Digest version of the requirements doc process.

For custom software (a "one off" project for a single paying customer):
The customer has someone in charge of the project from their end. Let's call this person the Customer Project Lead (CPL). The CPL is on the hook to help the project team identify and reach all the "stakeholders" in the project (business people, line people, managers, IT, and possibly other departments). A series of meetings is held where the goals of the project are discussed (or at least, the reasons why the customer is paying this vendor and what they hope to get out of it). The stakeholders start tossing off things the system has to do (usually, these are very much in the weeds, and are things about the current system/process that they either love and want to keep, or hate and want to change).

For product (software product designed to be sold and used by many customers):
Product managers take the role of a proxy CPL. They are supposed to talk to real, paying customers to find out what they want, like, and don't like. They are supposed to be familiar with competitor products and what they do and how they do it. And they are supposed to distill all this down into a requirements document. (Some of the better ones iterate a bit with engineering - "can you make it do this? How about this?" - but in the end, all the knowledge is supposed to end up in a requirements document.)

Quis custodiet ipsos custodes?

In the classic political tactic of diversion, the U.S. Department of Justice today announced that they were forming a probe to investigate the alleged leak of information regarding the Bush administration's executive order authorizing domestic wiretaps without a court order.

With most of the nation expressing concern to outrage regarding what appear to be illegal wiretaps by the Bush administration, the DoJ is choosing not to investigate or even defend said wiretaps. Instead, it is going on a witch hunt after the whistleblower who let the black cat out of the bag.

Even the ancient Romans knew that the most dangerous hole in a society based upon the rule of law is with the very enforcers of that law. As the title above says, Who Watches the Watchers?

Mirror, Mirror

We're starting to get closer to some brain functions that have a lot to do with what makes us human, a subject that continues to fascinate me. (prior posts)

At the heart of so many behaviors we tend to associate with "being human" is a cluster of neurons called mirror neurons. This is a hot topic in neuroscience, because the mirror neuron system seems to underlie so many advanced cognitive functions.
...mirror neurons play a major explanatory role in the understanding of a number of human features, from imitation to empathy, mindreading and language learning. It has also been claimed that damages in these cerebral structures can be responsible for mental deficits such as autism. - European Science Foundation
These same structures are found in some evolutionarily related species, but seem to be more developed in humans. Some experiments by Derek Lyons at the Yale Cognition and Development Lab start to indicate that imitation really is the sincerest form of flattery - so much so that it is hard-wired into our brains (discussed in Children Learn by Monkey See, Monkey Do. Chimps Don't, New York Times, registration required). For those who don't want to register with the NYT to see the article, here is the gist.
[The experiments are] evidence that humans are hard-wired to learn by imitation, even when that is clearly not the best way to learn. If he is right, this represents a big evolutionary change from our ape ancestors. Other primates are bad at imitation. When they watch another primate doing something, they seem to focus on what its goals are and ignore its actions.
One of my heroes of neuroscience, V.S. Ramachandran, has an interesting article on the myriad ways mirror neurons affect us. Language, cultural transmission of knowledge, empathy, autism, and many other "human" traits emerge from these special neuron clusters.

From my readings, I have formed an impression that this system is critical to our ability to transmit knowledge from one person to another, our ability to understand another person and to cooperate with them, and our very ability to internally model human behavior, including our own (which is at the core of my definition of consciousness).

The idea goes like this. Over time (and evolution), the brains of our ancestors took the basic modeling capabilities inherent in physical systems (such as the ability to tell how far out to reach a hand to grasp an apple) and used some of this capacity to model other "systems". At some point - and I'm not certain which would have come first; perhaps they came together, since they use the same machinery - we developed the ability to model other beings' internal states, and we also gained the ability to hold a continuous model of our own inner states. This state of self-modeling is what we call self-awareness, and it is at the heart of what we mean when we say we are conscious.

Consciousness is the continual modeling of the world, our relationship to it, and the state of these models themselves. Disruptions in consciousness are disruptions to any of the myriad subsystems necessary to perform continuous modeling - short term memory, mirror neurons, etc.

If the field of Artificial Intelligence is going to make any true progress, it will be in understanding how this modeling works to such a degree that we can model it.

Wednesday, December 28, 2005

Do The Limbo

The front page of the Austin American-Statesman today had an article about how the Catholic church is reconsidering its stance on the concept of Limbo.

Apparently it's not a Caribbean dance with a stick you like to watch women do when drunk (how low can I go). Instead, it is a place where dead babies go (after they die) if they hadn't had a bath yet. Also resident in Limbo are maybe some people who might have been saints if they had been born after approximately 32 A.D., but had the misfortune to be born earlier than that. (Dumb asses - I'm sure if they were truly saintly they would have waited till after Jesus died...)

Anyway, The Church, in its infallible wisdom, has decided to give the whole thing a second look. It seems that even though there has never been an official Papal Bull issued on the subject, it has long been viewed in some Catholic circles that the unbaptised babies can't go to heaven, since they haven't had the special dispensation for Original Sin (you know, when Adam showed Eve his trouser snake and Eve took a bite...or something like that. Anyway, the sin of everyone's daddy is apparently visited upon the sons and daughters unless some asexual man holding a death cross pours some special H2O in your direction).

Since there were only Heaven and Hell as choices (because if you give man a third way out, he'll take it), this meant that Limbo was some sort of "outermost" ring of Hell - kind of like living in Detroit, but in the good part of town.

However, this category includes aborted fetuses (in another view of this church, when the seed of man quickens the egg of woman, a soul is created). And for some reason condemning these aborted fetuses to hell just seemed a little too harsh. So they created Limbo. (What made it even worse was that fetuses who didn't make it to term and were aborted naturally died due to God's Will, and people's heads really started spinning when they determined this meant that The Big Guy himself was purposefully condemning new souls to Hell...)

I could go on, but really, what's the point. The remarkable efforts of Theology (which really shouldn't be an -ology at all) to try to reconcile the myriad contradictions and inconsistencies inherent in pretty much all religious dogma just kill me. (And so I'm probably already living in Hell, which come to think of it would go a long way toward explaining some things...)

Wednesday, December 21, 2005

Illogical Guilt

I stand accused of using cynicism as a lazy man's adaptation to a barrage of poor thinkers and liars.

Guilty.

I henceforth resolve to use the brain given to me (by God, evolution, my parents, or intelligent alien designers) to seek objective truth rather than just distrust or dismiss everything that seems fishy.

I've just been reading Crimes Against Logic by Jamie Whyte. This is a good book, a quick read, and one everyone should pick up and absorb. Although at times logically flawed itself in its periodic hyperbolic storytelling to make a point, it is fundamentally sound in its premise and its points.

The premise is that politicians, priests, journalists, and your friends regularly use verbal jujitsu to avoid any direct confrontation with facts and evidence in a discussion. Whyte identifies 12 techniques that can be recognized and discounted by those who are truly interested in the pursuit of truth.

He also condemns our educational system (and parents) for not teaching basic logical thinking skills to our young so that they can be equipped to deal with the many obfuscations that serve the pedantic pontificators so well.
Alas, most know next to nothing about the ways reasoning can go wrong. Schools and universities pack their minds with invaluable pieces of information...but leave them incapable of identifying even basic errors of logic. Which makes for a nation of suckers, unable to resist the bogus reasoning of those who want something from them, such as votes or money or devotion.
And he makes a point that drives it all home to me.
Many instead defend themselves with cynicism, discounting everything said by anyone in a position of power or influence. But cynicism is a poor defense, because it won't help tell good reasoning from bad. Believing nothing is just as silly as believing everything. Cynicism, like gullibility, is a symptom of underdeveloped critical faculties.


I need to work harder at thinking.

(Which, according to this article, helps me lead a longer, more productive life as a bonus...)

Tuesday, December 20, 2005

Topical Cream

Judge Bans Teaching Intelligent Design
U.S. District Judge John Jones (appointed by President W) has issued 139 pages of pure gold. I'm sure you'll be reading many posts for days to come on this one, but a few brief (heh) comments...
  • Intelligent Design is religion, not science, since it doesn't come anywhere close to the tenets of a scientific theory.

  • Intelligent Design may even be the correct explanation of life on this planet, but that doesn't make it science.

  • As a religious belief, it doesn't belong in a Biology curriculum.
Why the constant efforts to mask religion in the guise of science if not to evangelize and establish one religion over others? I actually think schools should teach a comparative religion class, where speakers from each religion could even come in to proselytize their position. Let's expose kids to religion, and religious beliefs. But no, religion advocates say that the "teaching" of religion belongs at home. Unless it's in science class. And it's their religion.

Administration Caught Spying On U.S. Citizens, Declares Jihad on New York Times
It's ludicrous. Sen. John Cornyn (the standard mouthpiece for comments Bush wants to make but probably can't without mispronouncing) attacked the New York Times for publishing some investigative journalism findings.

The issue is domestic spying. The FUD is editorial privilege.

On the FUD:
  • The NYT only released the story now to promote a book by one of their reporters. Except the book isn't out yet, and they don't have a financial interest in it. It was most probably the other way around - the fact that a book outing the NSA spying program comes out next month forced their hand: either get scooped (and be accused of partisanship for holding the story) or publish (and be accused of partisanship for publishing). The editors should have published the article when it was first written. They supposedly didn't because unnamed administration officials told them it would damage national security.

  • It was only this New York Times article that caused many Senators, including some Republicans, to stall the renewal of the Patriot Act. How friggin' short is our attention span, anyway? It was only last week that reports about possible filibusters and senatorial concerns came out - before the NYT article.


On Domestic Spying:

What, you're surprised? (Brace yourself - I have some really tough news about Santa for you...)

Don't blame Bush. This administration has run roughshod over rights and due process for years, and 'we' voted it back in. That's the great thing about democracy - the people always get the government they deserve.

Iraqi Elections Bring Peace and Goodwill

Oh wait...never mind.

Sunday, December 18, 2005

A Measure Of Understanding

I just watched an old video, and it prompted an interesting thought (okay, I realize that you may not think it's interesting, but I do, and it's my blog).

The video showed our current ability to image from about 10^24 meters (about 100 million light years) down to about 10^-14 m. Actual images using today's technology really run from about 10^26 meters (about 12 billion light years) down to about 10^-9 m (the atom) for practical imaging, with the rest - down to the makeup of a proton below 10^-14 m - representing our theoretical understanding.
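
(A quick sanity check of my own on those conversions, taking one light year as roughly 9.46 x 10^15 meters:)

    public class ScaleCheck {
        public static void main(String[] args) {
            double metersPerLightYear = 9.461e15;
            // 10^24 m is on the order of 100 million light years...
            System.out.printf("1e24 m = %.1e light years%n", 1e24 / metersPerLightYear);
            // ...and 10^26 m is on the order of 10 billion light years.
            System.out.printf("1e26 m = %.1e light years%n", 1e26 / metersPerLightYear);
        }
    }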

For most of history, we were limited to the resolution of the human eye. Ancient astronomers could make out some of the near planets out to Jupiter (about 10^11 m). On the low end, close examination could make out skin cells (maybe 10^-2 m).

In the late 1200's, with the invention of magnifying lenses, we took a jump. The high end went to about 10^12 m, and the low end with a single lens went down to about 10^-3 or 10^-4 m. In the 1600's, with the inventions of the telescope and microscope, the high end went to about 10^13 or 10^14 m, and the low end went down to the cellular level, around 10^-4 or 10^-5 m.

So a table of progress kind of looks like this (all powers of 10, which means I should have plotted this on a log/log scale, but was too lazy to take the time).

Years ago    Low end (m)    High end (m)
10^5         10^-2          10^11
10^4         10^-3          10^12
10^3         10^-5          10^14
10^2         10^-14         10^24

With these trends (in both scale and time to reach the next order of magnitude), we've about reached certain "known" limits (the edge of the Universe on the high end, and quarks on the low end). Is this the end of progress in the dimension of sight? Or just the edge of another singularity, where we literally cannot yet see what's on the other side?

It smells like a singularity. I hope it is. Because our current theories just aren't cutting it. And without some fundamental changes in our understanding of space (and its constant companion - or should I say relative? - time), we're not ever going to get off this planet.

And it is a certainty that someday Alice won't be able to live here anymore.

Friday, December 16, 2005

The C Word

Thanks to the hard-hitting, fair and balanced investigative reporting of Bill O'Reilly, we have learned that those Liberals Who Hate America have struck again, launching a jihad against Christmas. Apparently Dr. Evil Dean, in cahoots with Arch Fiend Soros, has managed to delete the word 'Christmas' from all e-books across the world.

According to "highly placed government sources who had nothing to do with that Valerie Plame thing, whoever she is", The League Of Evil was able to exploit a "back door" installed into library computers around the world to commit their heinous crime against God and Nature. Operating from deep in their underground lair in Dover, PA, where they already "voted out God", they were apparently able to exploit something called a "rootkit hack", which allowed them total access to the library computers.

The critical security flaw came about from a copyright protection scheme originally put onto the CDs of "Burl Ives' Christmas Jingles". When placed into a computer, the CD would install a hidden program to keep the CD from being copied. "They knew that God fearing Christians around the world love listening to this CD at this holy time of year," said sources. Apparently, this "rootkit" software interacted with another back door already installed onto library computers by the FBI as part of the Patriot Act "Operation Suck This", and created a hole in the computer security that the demonic duo could exploit.

(Operation Suck This, as you'll remember, was the program mandated by the Patriot Act to allow the FBI to monitor suspicious activity on library computers. When a patron would access target material, such as "Winning Back America" by Dr. Evil Dean, or "Tour of Duty: John Kerry and the Vietnam War", the program would quickly notify agents nearby, who would quickly whisk the patron away to overseas prisons where they could be questioned more "thoroughly".)

"First they kill Christmas. Next, they'll be killing Christians...it's Rome all over again, " O'Reilly noted. "Pretty soon, the U.S. will be just like Holland, where they've already killed God and replaced Him with Drugs and Homosexuality. I've heard they actually have Dikes in the streets there, with little Dutch boys performing perverted acts with their fingers in public. Did you know that?"

Fellow Fox News Host John Gibson explained. "Everyone knows Baby Jesus was born on December 25, ever since Pope Julius I rediscovered his birth certificate in the 4th century. Before that, Liberals were in control, celebrating the birth of Mithra on that same day, dancing with goats, drinking wine, and f-f-f-fornicating," Gibson stuttered. Pausing for a moment in prayer, he then continued. "They've always longed to go back to those days, and now they're making their move."

O'Reilly continued. "You won't see this story on the Liberal Controlled Media. That's why we're trying to get the word out." When asked if the fact that his television show was shown in all major outlets and was by his own claim "by far the highest rated prime time daily news program in the country" showed that the liberals no longer controlled the media, he replied "That's a stupid question. Who told you to ask that question? Get off my show. Cut the mike - cut the mike..."

Monday, December 12, 2005

Cultural Evolution

Evolution theory consists of an extremely powerful set of concepts. While these concepts have clearly been demonstrated in terms of their explanatory and predictive powers in such fields as biology (hopefully even Intelligent Design proponents accept that they too can get bird flu as it evolves), it seems that the constant religiously inspired bickering has limited further thinking on the application of these ideas to other fields of study.

One field that hasn't been sufficiently explored is the evolution of cultures and societies. Why do some cultures produce wealth and expansion while others stagnate or die off? Chance? God?

These latter two explanations lack predictive power. So let's look at the time-tested set of ideas we call evolution to see if we can gain better understanding, and perhaps a set of tools that allows us to predict which societies will enhance the well-being of their populace, and which will not.

Historical Dynamics

First of all, are there patterns that recur in history with regard to societies and cultures? There seems to be a fair amount of research to support this basic concept.

Peter Turchin's War & Peace & War: The life cycles of imperial nations identifies recurring societal patterns that explain how new, cooperative groups emerge and form an expansionist culture, and how "competition and conflict between groups" eventually undermine the success of these cultures.

Cooperation. This single, powerful concept is the primary driver of the creation of a society, and the forms this cooperation takes determine the culture of that society. I've written before on cooperation being one of the primary "laws of the universe", and here is yet another example.

I've also referred on occasion to another apparent natural law, the inverse power law. This "inverse power curve" or Pareto distribution is a common natural function - we see it all the time in settings from biology to blogs. This "law" seems to play a strong role in the demise of cultures. Turchin theorizes that a successful culture grows rich via whatever non-zero-sum cooperative values the culture has adopted. However, over time, inequalities in wealth and power naturally emerge among its people. The very success of the culture and the expression of this inverse power curve create the conditions for the fall of that culture via the "corrosive effect that glaring inequality has on the willingness of people to cooperate."
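
(To see just how lopsided an inverse power curve gets, here's a toy simulation of my own - the shape parameter 1.16 is an arbitrary but traditional choice, being the value that yields the famous 80/20 split. We sample a population's "wealth" from a Pareto distribution and measure the share held by the top fifth.)

    import java.util.Arrays;
    import java.util.Random;

    public class ParetoDemo {
        public static void main(String[] args) {
            Random rng = new Random(42);
            int n = 100000;
            double alpha = 1.16; // shape; ~1.16 gives the classic 80/20 rule

            // Inverse-CDF sampling: x = x_min / u^(1/alpha), u ~ Uniform(0,1]
            double[] wealth = new double[n];
            for (int i = 0; i < n; i++) {
                double u = 1.0 - rng.nextDouble(); // in (0, 1], avoids divide-by-zero
                wealth[i] = 1.0 / Math.pow(u, 1.0 / alpha);
            }

            Arrays.sort(wealth); // ascending
            double total = 0, topFifth = 0;
            for (int i = 0; i < n; i++) {
                total += wealth[i];
                if (i >= n - n / 5) topFifth += wealth[i]; // top 20% of the population
            }
            // Typically prints a share of around 80%.
            System.out.printf("Top 20%% hold %.0f%% of the wealth%n",
                    100 * topFifth / total);
        }
    }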

We see this lack of cooperation as a common element in the collapse of societies all the time. Collapse: How Societies Choose to Fail or Succeed by Jared Diamond highlights that societal failures of economics are often correlated with (and he would say driven by) environmental failures. In other words, time and time again societies ignore the environmental impact of their culture, and negatively impacting the environment eventually leads to societal collapse. The tragedy of the commons is all too common.

Bottom line: Some cultures win and grow, and some lose and die off. The discussion above talks about some reasons why. But I think there are interesting "meta-principles" at work as well.

Cultural Selection

I believe that the concepts of evolution, including natural selection, mutation, variability within a species, punctuated equilibrium, and other concepts included underneath the umbrella of evolution apply just as readily to cultures as they do to bacteria.

Cultures that win and grow are those that thrive in a particular environment. Cultures that survive a long time are those that adapt to changes in that environment, and that adaptive ability comes from the amount of mutation and variability of ideas supported by the culture.

Rome is a classic example. Its culture lasted over a thousand years because it supported a wide variability in certain dimensions (styles of governance, tolerance of diversity in race and religion, rewarding novel ideas in technology and economics). When it died, it was due to economic inequalities (among other things) driving down the amount of cooperation taking place within the society.

Today's example is American culture. Although only around for a couple hundred years, there is much about the culture of the U.S. that aligns with the "natural laws" of evolutionary theory. Our cultural "memes" self-propagate outside our culture, expanding our cultural boundaries and growth. Why is rock-and-roll a worldwide phenomenon? Why are there McDonald's in every country of the world?

Why do I use "pop" culture examples instead of "meaningful" examples, such as democracy, innovation, tolerance? Because they're easier, for one, and they make the point. And we've seen time and time again that other societies who want the cultural sugar our culture generates learn that they also need to adopt some of the sugar cane agriculture and processing that allows that sugar to spread.

American culture has great promise as a growing wealth generator, bringing a higher standard of living to more and more of the world. But only if it maintains those aspects that support its continued evolutionary adaptability. Diminishing tolerance, less cooperation, and expansion by force of arms all work against our culture's staying power.

(I was going to get into contrasting Western/American culture with Islamic culture in the context of evolutionary theory, but this post is already too long and rambling. And not funny at all. So I'll save that for another post, hopefully when my writing style gets back to being more humorous than pedantic).

Sunday, December 11, 2005

Rebels With A Cause

I just saw the movie Syriana. This movie lays out the Middle East situation as Traffic did the drug situation. Both written by Stephen Gaghan, they each present the intractability of a complex set of problems.

By a strange twist of fate, I also happen to be reading 1776 by David McCullough. For those in the USA, I needn't explain the significance of that date, nor that the book portrays a story of a band of rebels who used any means at their disposal to attack and expel an occupying military force.

And so I started thinking about Iraq. And about the rebels. And the occupying military force. And I got really uncomfortable.

Why? Because I think terrorists are scum of the earth, animals of destruction who by their actions against civilians abrogate any "inalienable rights" a human may have. And while I know some of the "Iraqi insurgents" are of this ilk, it also occurs to me that some may be of the "rebel" persuasion - nationals fighting against an occupying military force to regain the right of self-representation and self-determination.

Some of these rebels may believe that they cannot be fairly represented in a democratization process heavily influenced (if not controlled) by that occupying power. Some of these rebels may believe that the occupiers are not protecting them from the terrorists themselves, or from crime lords and local militia leaders. Some may have seen their families accidentally killed, or had relatives falsely imprisoned, tortured, and killed (not necessarily by the U.S., Abu Ghraib style - but by the current Iraqi administration placed into power and supported by the U.S.).

They may believe that they need to take up arms to protect themselves and to regain their right to life, liberty, and the pursuit of happiness.

And it really disturbs me to think that, while the scum who kill civilians are terrorist animals, the enemy combatants who attack and kill our own service men may be the patriots with a cause.

What a bloody mess.

Get Out of Dodge


I agree with Congressman Murtha. There is not a military solution to Iraq. It is time to start an orderly, but rapid, drawdown of troops from Iraq. Our service members are targets of everyone - terrorists and patriots alike. It is our responsibility to our own people to get them out of harm's way.

We cannot impose democracy at the end of a gun barrel. Whether or not Iraq adopts a western style democracy is up to them, not us. They may not be ready for such a government - their citizens certainly haven't had the years of education and rule of law such a government requires. I frankly doubt their culture and history provide the necessary soil for the growth of our form of government. But it is up to them, not us. What possible business do we have there?

Only one business, really. Oil. As Syriana alluded, about the only real reason to keep troops there is to ensure that we control access to oil resources, by ensuring that weak, corrupt governments and chaos continue in the Middle East.

But whether you buy this premise or not, it is hard to make a coherent, moral argument for why 2,000 American troops or 30,000 Iraqi civilians need to die, and even harder to justify why this tragedy should continue. "Stay the course" is meaningless, when your course doesn't lead to a destination.

Time for the military to leave, and for the advisors, if asked, to help with the reconstitution of a government, an infrastructure, and the rule of law where citizens can safely go about their business.