Really Bad Math: Evolution and Information
To me, one of the most irritating trends in creationists' attacks on evolution is the misappropriation of math to dress up totally bogus arguments in fancy jargon. These arguments rely on most people's fear of math: they count on the fact that most readers, on seeing mathematical jargon, won't look any closer.
The two key ways they do this are to misuse information theory and to misuse probability. Probability is a matter for another day; today, I'm going to do my best to show, quickly and easily, how wretched the whole line of argument about information theory is.
First, let's take a quick look at a couple of examples of this argument, picked entirely at random. (I did a Google search on "information", "evolution", and "create", and in the list of hits that came up, these were the lucky first two creationists.)
The Creation Science Homepage, "Do You Believe in Evolution", question:
Information from Randomness?
Information theory states that "information" never arises out of randomness or chance events. Our human experience verifies this every day. How can the origin of the tremendous increase in information from simple organisms up to man be accounted for? Information is always introduced from the outside. It is impossible for natural processes to produce their own actual information, or meaning, which is what evolutionists claim has happened. Random typing might produce the string "dog", but it only means something to an intelligent observer who has applied a definition to this sequence of letters. The generation of information always requires intelligence, yet evolution claims that no intelligence was involved in the ultimate formation of a human being whose many systems contain vast amounts of information.
Science Against Evolution
Evolutionists must explain where the genetic information in DNA came from. They can't do it. Here is how one evolutionist tries to dance around the problem:
It is simply not possible to change a hemoglobin gene into an antibody gene in one step. To understand how evolution really works, we have to abandon the notion that such mutations can happen. Instead we must think of mutations as small changes affecting the functions of preexisting genes that already have long and complex histories. Usually, new mutations tend to damage genes in which they occur because they upset their precise functioning and their finely honed interactions with other genes. But sometimes they change them in ways that increase the fitness of their carriers, or might increase the fitness of their carriers farther down the line if the environment should alter in a particular way.2 [italics in the original]
Dobhzhansky's view [is] that much of the variation needed to accomplish the transition was already present in the gene pool 3 [italics in the original]
There are two fallacies in this argument. The first is that random changes in existing information can create new information. Random changes to a computer program will not make it do more useful things. It doesn't matter if you make all the changes at once, or make one change at a time. It will never happen. Yet an evolutionist tells us that if one makes random changes to a hemoglobin gene that after many steps it will turn into an antibody gene. That's just plain wrong.
Anyone who's read my posts on information theory should immediately see what's bogus about this. In information theory, the measure of information is, quite literally, a measure of randomness: the more random and unpredictable a sequence is, the more information it contains.
So of course a random process can "create" information: in information theory, a random process is, by definition, precisely an information generator.
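To make that concrete, here's a quick sketch, a toy example of my own rather than anything from the sources above, that measures the Shannon entropy of two equal-length strings: one generated by fair coin flips, one completely constant. The random one carries nearly the maximum one bit of information per symbol; the constant one carries none.

```python
import math
import random
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy of a string, in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

rng = random.Random(0)
noise = "".join(rng.choice("01") for _ in range(10_000))  # fair coin flips
constant = "0" * 10_000                                   # totally predictable

print(round(shannon_entropy(noise), 3))   # very close to 1.0 bit per symbol
print(shannon_entropy(constant) == 0)     # True: no randomness means no information
```

The same point holds in the Kolmogorov/Chaitin framing: a maximally random string is incompressible, so it takes the most bits to describe.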
I'm not saying that it's easy to create meaningful proteins randomly: it really is difficult. But evolution works because of the sheer quantity of information being generated: when you're talking about billions upon billions of living things, constantly reproducing and constantly mutating, even if only one mutation in a billion generates information that is in any way valuable, that's no problem. But that's really a subject for another post.
But as you can see, the whole "information theory" argument isn't an argument at all. It's an attempt to make an argument from authority by using the words "information theory" and "entropy", without any real reference to the actual theory beyond its name. It relies on the fact that most of the people they're trying to reach aren't going to go and read Shannon's papers or Chaitin's books. Even a glance, a trivial passing skim of a page or two of any real information (pun intended) about the theory, will reveal the utter fraud of their so-called arguments.
9 Comments:
"Information theory states that "information" never arises out of randomness or chance events."
You know, I have heard variations on this over and over - thanks for pointing out how bogus it is. I wonder where the idea came from, since it is somewhat pervasive in bad ID arguments. It would be interesting to trace it back somehow. I can't even think of a starting point. Maybe some thermodynamic argument, which is just as bogus?
By Anonymous, at 11:10 PM
I think the information theory "argument" is just a rehash of the old entropy argument: Thermodynamics says entropy increases. Entropy is disorder, hence disorder increases. Life is order, not disorder, hence life cannot arise from non-life.
Something like that, anyway. This is of course bogus, since the law in question only says entropy increases in an isolated system. You try being an isolated system for an extended time period, and we'll see if your entropy doesn't increase, just like it should. In fact, as animals we are very effective generators of entropy: We eat highly organized vegetable and animal matter, and excrete you-know-what. Total entropy increases, but in the process we manage to skim off some negative entropy to use for our own good.
Entropy is an information theoretic as well as a physics concept. The two are analogous, but too commonly conflated, which I think is wrong. But it is not too surprising that creationists will turn to the other when their use of the first is debunked.
By Anonymous, at 5:38 AM
I think the "information doesn't increase without intelligent intervention" idea is actually more fundamental than that - it's the key to why the creationist mob can't accept that evolution works. They feel intelligence should hold some sort of privileged position in the universe and, by inference, that natural forces shouldn't be able to do all the cool stuff we can.
When you look in more detail at their position, you see that their logic is generally reliant on conflating two layman's definitions of information - firstly as data that means stuff and secondly as data that does stuff. The first is indeed something that can't exist without some sort of intelligence to see the meaning in it. Hence, it's fairly unlikely for it to appear otherwise - you're not going to find a bacterium with the complete works of Shakespeare encoded into its genome.
The second, however, is precisely what natural selection is tailor-made to produce. It's also an objective enough quality that no intelligent observer is necessary for it to make sense. So there's absolutely no reason for this type of information not to arise. However, by conflating the two, Dembski manages to "prove" evolution Just Plain Wrong in the eyes of his audience.
And then they wonder why I see the need to ask them for a rigorous definition...
By Lifewish, at 8:47 AM
markk:
I've seen the argument come from two different routes.
One is just deliberate obfuscation. People like Dembski know perfectly well that they're creating a bullshit argument, but they're coming from a perspective where the fundamental value is religious: if lying can save someone's soul, then lying is OK. They know they can appeal to people's intuition with a phony argument from authority, so they do.
The other is cluelessness. There are a lot of people out there who get thrown off by the fact that there's an intuitive connection between thermodynamic entropy and information entropy - and thermodynamic entropy can naively be understood as tearing down order and structure, which intuitively seems to most people to correspond with losing information.
I find the second route a lot less disagreeable than the first. It's one thing to be wrong because you don't understand things as well as you think you do. It's bad: ignorance is not a good excuse in my book. But it's not deliberate lying. People coming from that perspective don't know that they're wrong - they genuinely believe that they're making a valid argument. People like Dembski don't have that excuse - they know full well that they're using a false argument. They just don't care.
By MarkCC, at 12:53 PM
The claim that evolution does not have an intelligent guide is also, in my opinion, misleading. I've worked just a little bit with genetic algorithms. The problems approached have been nasty NP-hard multi-dimensional problems, and good solutions are those at, say, a local minimum. Genetic algorithms can be very good at searching these solution spaces.
Life's current genetic engine is a good deal more complicated than anything I've cobbled together. The genetic engine has evolved along with the rest of the organism. There is every reason to suspect that it is really, really smart.
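For readers who haven't seen one, here's a minimal genetic algorithm, a toy sketch of my own rather than the code described above, minimizing the classic Rastrigin test function, which is riddled with local minima:

```python
import math
import random

def rastrigin(xs):
    """Classic rugged test function: many local minima, global minimum 0 at the origin."""
    return 10 * len(xs) + sum(x * x - 10 * math.cos(2 * math.pi * x) for x in xs)

def evolve(pop_size=60, dims=3, generations=200, seed=1):
    rng = random.Random(seed)
    # Start from a completely random population of candidate solutions.
    pop = [[rng.uniform(-5, 5) for _ in range(dims)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=rastrigin)            # selection: rank by fitness (lower is better)
        survivors = pop[: pop_size // 2]   # the better half reproduces
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = [rng.choice(genes) for genes in zip(a, b)]  # uniform crossover
            i = rng.randrange(dims)
            child[i] += rng.gauss(0, 0.3)                       # random mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=rastrigin)

best = evolve()
# Nothing in the loop "knows" where the optimum is, yet random variation
# filtered by selection reliably finds a very good region of the search space.
```

The parameter choices here (population size, mutation width, and so on) are arbitrary; the point is only that blind variation plus selection does useful search.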
The human genome has more than a billion bits of information. That's on the order of magnitude of the complexity of the information you might carry in your brain. These are comparable systems.
I'm sure that if a human had a billion years to design something as complex as life, she could do as good a job. Individual humans are not immortal. Since the time that modern life arose on Earth, life has been immortal, and therefore has had the time to devote to the problem.
By Stephen, at 1:35 PM
stephen:
I don't necessarily agree with your phrasing, but your point is dead on, and I'll have a post about that coming later this week.
I think that saying evolution has an intelligent guide is wrong. Intelligence implies a sense of conscious, deliberate guidance, which I think is going too far.
But when people make arguments about evolution not being able to work because it's an unguided random process - well, that's not correct either.
Evolution *is* a random process; but that doesn't mean it's unguided. The nature of an expanding reproductive system creates its own kind of guidance. Not all outcomes are equally likely - you have a sort of bumpy search space being traversed by the process of evolution - there are definitely certain areas of that search space that are attractive.
The closest metaphor I can think of off the top of my head is Lagrange points in orbital systems. You tend to end up with piles of junk at the Lagrange points. There's no intelligent process pushing things towards them, but as things drift through an orbital system, the nature of the system makes Lagrange points "attractive", so that you end up with a higher density of material at those points than at other random locations in the system.
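That kind of guidance-without-a-guide is easy to demonstrate. In the toy Python below (my own illustration, with made-up names), every mutation is a purely random bit flip, but selection discards changes that lower fitness; the genome climbs to the all-ones optimum in a few hundred steps, where blind chance alone would need on the order of 2^64 tries to hit that exact string.

```python
import random

def evolve_ones(n=64, seed=7, max_steps=1_000_000):
    """Random one-bit mutations, kept only when they don't reduce fitness (count of ones)."""
    rng = random.Random(seed)
    genome = [rng.randint(0, 1) for _ in range(n)]
    steps = 0
    while sum(genome) < n and steps < max_steps:
        steps += 1
        i = rng.randrange(n)
        candidate = genome[:]
        candidate[i] ^= 1                  # the mutation itself is pure chance...
        if sum(candidate) >= sum(genome):  # ...but selection filters the results
            genome = candidate
    return genome, steps

genome, steps = evolve_ones()
# The walk is random at every step, yet the all-ones "peak" acts as an
# attractor, much like junk accumulating at a Lagrange point.
```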
By MarkCC, at 10:49 AM
I like the term "directed". Saying something's "directed" doesn't imply a director in the same way that "guided" tends to.
By Lifewish, at 8:37 PM
This is probably a bit late, but there are computer programs that do generate beneficial mutations. Including "irreducibly complex" ones.
By Bronze Dog, at 10:01 PM