Uncommon Dissent

Posted 27 March 2011 by Joe Felsenstein
http://evolution.gs.washington.edu/felsenstein.html
Over at Uncommon Descent an unusual discussion has erupted. A commenter named "MathGrrl", who has been occasionally active there as a critic of ID, has actually been allowed to make a guest posting. She gave several examples of situations where one could make a specification of which genotypes were best, and asked how Complex Specified Information could be defined in those cases. She has handled the discussion with great restraint. Several hundred comments later, no consensus has emerged. Commenters at anti-ID blogs (here, here, here, here, and here) have concluded from this that the concept of CSI is vacuous.

I'd like to give a perspective that may be unpopular here. I don't think Complex Specified Information is a vacuous concept, though we usually do not have enough information to actually calculate numbers for it. Simply put, birds fly and fish swim. They do so a lot better than organisms coded by random strings of DNA (formed by mutation without natural selection -- organisms coded for by monkeys typing with four-key typewriters). If we could imagine looking at all possible such organisms with the same length genome as (say) a bird, the fraction of them that would fly as well as a bird, or better, would be incredibly tiny. So tiny that if every particle in the universe were a monkey with an ATGC typewriter, there would not have been enough time by now to produce anything as good as a bird even once since the time of the Big Bang. That is the essence of William Dembski's argument.

Note that getting technical about information theory is not required. People love to contradict each other about information theory, but we can set most of that part of the argument aside. A simple definition of Specified Information would be that it is the negative log (to the base 2) of the fraction of those sequences that are better at flying than a bird.
We don't have enough information to actually calculate it, but we can be sure that it is big enough to pass Dembski's threshold of 500 bits, and thus CSI is present.

So am I saying that CSI is present in examples of life? Yes, I am. So does that mean that it follows that design is present in those cases? No, it does not. As I have explained before (here), Dembski draws the conclusion that the presence of CSI proves design because he has a theorem, the Law of Conservation of Complex Specified Information (LCCSI), which supposedly proves that an amount of specified information large enough to constitute CSI cannot arise by natural processes, even once in the history of the universe. In fact, he is wrong, for two reasons:

* His theorem is not proven. Jeffrey Shallit and Wesley Elsberry pointed out (here) that Dembski violated one of the conditions of his own theorem when he gave his proof that this large an amount of SI could not arise by deterministic processes.

* In any event, to use his theorem (even if it were proven) to rule out natural selection, you have to use the same specification (say, "flies as well as or better than this bird") both before and after evolutionary processes act. And this Dembski does not do. His conservation theorem involves changing the specification in midstream. When you require that the specification stay the same, you can immediately see that the amount of SI cannot be conserved. Natural processes such as natural selection can improve the flight of birds.

Advocates of ID endlessly repeat the mantra that the presence of CSI proves that design is present. They are relying on Dembski's LCCSI, whether they know it or not. But natural selection can put Specified Information into genomes, and when it acts repeatedly, it can easily exceed the threshold that Dembski uses to define CSI. The issue is not CSI, it is the conservation law, one that has not been proven in any form that is relevant to detecting design.
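Stripped of jargon, the quantity in question is just a tail probability on a log scale. A minimal sketch of the bookkeeping, using a hypothetical fraction (for a real bird the true fraction is unknown and only argued to be tiny):

```python
import math

def specified_information_bits(fraction):
    """Specified Information in bits: the negative log (base 2) of the
    fraction of sequences that meet the specification."""
    return -math.log2(fraction)

DEMBSKI_THRESHOLD_BITS = 500  # Dembski's threshold for calling SI "complex"

# Hypothetical number: suppose only 1 in 2^600 random genomes flies
# as well as a bird.
si = specified_information_bits(2.0 ** -600)
print(si)                           # 600.0
print(si > DEMBSKI_THRESHOLD_BITS)  # True: CSI present by this criterion
```

The point of the sketch is only that the arithmetic is trivial once the fraction is known; the whole difficulty lies in estimating (or bounding) that fraction.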

170 Comments

RBH · 27 March 2011

In a way this is a flashback to the reason "Febble," a British neuroscientist, was banned from UD four years ago. She argued that on Dembski's definition of the behavior of an "intelligent" designer, natural selection qualifies as an intelligent designer.

The take-home is that to the extent that ID notions like "intelligence" or "complex specified information" can be made operationally explicit and testable, naturalistic processes are entirely adequate to produce the phenomena those notions are invoked by creationists to explain (where "explain" in the latter case is used very loosely, of course).

Glen Davidson · 27 March 2011

The biggest problem is that design is quite easily detected without it having to be complex at all. Handaxes are not very complex, yet are readily understood to have been designed--by humans, oddly enough, not God (really, if God were busily operating in the environment, what right would we have to ascribe intelligently-made ancient artifacts to humans?).

That's why Dembski attempts to redefine simple design as being "complex," because all life is actually complex, while design per se can be either simple or complex. Dembski wanted to conflate design and life, so he calls simple "unlikely" (via "natural" means) artifacts complex when they are in fact simple.

Only modern life can be said to be reliably complex in every known instance. "Elegance" and "simplicity" mark many designed objects, and these objects are typically as readily demonstrated to be designed as the latest computer chips are. What is strikingly obvious in most designed objects is the rationality of their construction, regardless of their simplicity or complexity, while known non-engineered (or artificially selected) life is invariably both complex and without rational design and construction.

Design is not characterized by CSI, rather by rationality. Life is complex, but without rational design behind it (evolution can sometimes come close to what we might (wrongly, in fact) consider to be rational ends, yet the beginnings are obviously not rationally chosen--they are simply what is available to evolution).

ID's conflation of design's frequent simplicity with life's complexity exists only to obscure the importance of rationality's existence within designs, and its lack in all "wild-type" life.

Glen Davidson

Douglas Theobald · 27 March 2011

The definition of CSI you provide was described in a 2007 PNAS paper by Jack Szostak (2009 Nobel laureate).

Proc Natl Acad Sci U S A. 2007 104 Suppl 1:8574-81.
"Functional information and the emergence of biocomplexity."
Hazen RM, Griffin PL, Carothers JM, Szostak JW.

http://www.pnas.org/content/104/suppl.1/8574.long

Szostak calls it "functional information". (It's identical to a definition I came up with and submitted to the Journal of Theoretical Biology back in the early 90's, but the paper was rejected -- blah, blah, blah).

mrg · 27 March 2011

I'm getting very leery of the word "information". The word is perfectly valid in an informal sense, of course: "Boyo, there's a lot of good information in this book!" However, in a technical discussion its usage has to be defined relative to the subject at hand.

That is, unlike a concept like "energy", it has little general applicability, and outside of the specific discussions for which "information" is variously defined, it simply causes confusion. We know that one of the basic features of life (as opposed to nonlife) is heredity; we know that heredity is embodied in the sequences of the genome.
If we ask the question of whether there is "information" in the genome, what do we know when we get an answer that we didn't before?

The CSI argument of ID seems identical to Paley's notions of organized complexity, as in a watch, with the same conclusion, that it is a trademark of Design. Modern evolutionary theory disagrees; the CSI argument of ID is simply reiterating Paley's assertion, under a smokescreen of technical verbosity, and claiming it as a proof when it hasn't moved any of the old pieces on the chessboard.

Of course, any discussion of the merits or lack thereof of the use of "information" as a scientific term is beside the point when it comes to the dubious individuals who show up in Pandaland, and throw the term around as a measure of what in military terms would be called "noise jamming".

fnxtr · 27 March 2011

mrg said: (snip) The CSI argument of ID seems identical to Paley's notions of organized complexity, as in a watch, with the same conclusion, that it is a trademark of Design. (snip) Of course, any discussion of the merits or lack thereof of the use of "information" as a scientific term is beside the point when it comes to the dubious individuals who show up in Pandaland, and throw the term around as a measure of what in military terms would be called "noise jamming".
Exactly, mrg. They've updated their jargon from "pocket watch" to "computer code" and from "elan vital" to "information", but it's still the same arguments from ignorance and incredulity. SSDC: same shite, different century.

OgreMkV · 27 March 2011

Don't forget that they can't even define 'C', 'S', or 'I' in a consistent way that makes sense. And just to reiterate what RBH pointed out, there is nothing in any aspect of "Intelligent Design Theory" that requires... well... intelligence.

One Pro-ID commenter has even claimed that termites are intelligent, at least as ID defines the term. (http://ogremk5.wordpress.com/2011/03/05/what-is-intelligent-design/#comment-81)

Further, I maintain that it is impossible, even in theory, to determine whether a genetic sequence or protein was designed or the result of pure randomness or randomness + natural selection, which renders the entire ID 'argument' moot.

The stuff I've seen that provides some semblance of a positive argument for ID has zero difference from what would be expected if evolution were the designer.

They cannot detect design, even in theory. So, yeah, ID is vacuous nonsense. It's totally not needed or useful.

harold · 27 March 2011

Sorry, I'm still confused by how CSI is meaningful.
A simple definition of Specified Information would be that it is the negative log (to the base 2)
The log or negative log to the base 2 is a common function to come across in discussions of information that touch on the binary system, because, among other things, it relates the minimum number of digits of string length needed (number of "bits") to convey numerical information. For example, the log base 2 of 8 is 3, and the log base 2 of 16 is 4. To express the numerical information "eight" through the numerical information "fifteen", you need four binary bits (1000 through 1111). For fractions it works the same way, except with negative log 2.
of the fraction of those sequences that are better at flying than a bird.
I'm seriously not sure what you mean here. I'll ask the easy questions first - which birds are we talking about, and how do we measure the quality of their flying in a reproducible way? Which sequences are we talking about? I'm guessing you're talking about randomly generated genome-sized sequences of DNA, and imagining what would happen if those sequences were substituted for the genome of a bird at earliest embryonic stage, or something. Is that right?
We don’t have enough information to actually calculate it, but we can be sure that it is big enough to pass Dembski’s threshold of 500 bits, and thus CSI is present.
At one level, obviously, we are in complete agreement that either CSI is a meaningless term, or it is a trivial term that refers to something, but that has no relevance to the theory of evolution. I'm just not sure it is in any way a meaningful term.
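The log-base-2 bookkeeping described above can be checked in a couple of lines (a quick standard-library sketch, nothing more):

```python
import math

# log2 relates a count of alternatives to the bits needed to index them:
print(math.log2(8))   # 3.0 -- three bits distinguish eight alternatives
print(math.log2(16))  # 4.0 -- four bits distinguish sixteen

# For a fraction the sign flips: a 1-in-16 event carries 4 bits.
print(-math.log2(1 / 16))  # 4.0
```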

SAWells · 27 March 2011

"A simple definition of Specified Information would be that it is the negative log (to the base 2) of the fraction of those sequences that are better at flying than a bird. We don’t have enough information to actually calculate it, but we can be sure that it is big enough to pass Dembski’s threshold of 500 bits, and thus CSI is present."

This is a better piece of work on CSI than any creationist has ever managed to produce.

SAWells · 27 March 2011

Oh, unless the bird is a penguin. Or a moa.

Joe Felsenstein · 27 March 2011

Douglas Theobald said: The definition of CSI you provide was described in a 2007 PNAS paper by Jack Szostak (2009 Nobel laureate). ... (It's identical to a definition I came up with and submitted to the Journal of Theoretical Biology back in the early 90's, but the paper was rejected -- blah, blah, blah).
... and a special case of it was described by me in a paper in American Naturalist in 1978. I wasn't a reviewer on your 1990 submission but if I had been I might have noted that!

Joe Felsenstein · 27 March 2011

mrg said: I'm getting very leery of the word "information". The word is perfectly valid in an informal sense, of course: "Boyo, there's a lot of good information in this book!" However, in a technical discussion its usage has to be defined relative to the subject at hand. That is, unlike a concept like "energy", it has little general applicability, and outside of the specific discussions for which "information" is variously defined, it simply causes confusion. We know that one of the basic features of life (as opposed to nonlife) is heredity; we know that heredity is embodied in the sequences of the genome. If we ask the question of whether there is "information" in the genome, what do we know when we get an answer that we didn't before?
I agree that, in this case, calling it information does not actually accomplish much. One could as easily say that one is measuring, not Complex Specified Information, but "farness out into the upper tail of the distribution of fitness". Organisms are manifestly nonrandomly extremely far out there, too far to just be there by pure mutation. In Dembski's argument the work is supposed to be done by the additional Law of Conservation. Except it doesn't work. Taking logarithms and calling the result information is unnecessary.
The CSI argument of ID seems identical to Paley's notions of organized complexity, as in a watch, with the same conclusion, that it is a trademark of Design. Modern evolutionary theory disagrees; the CSI argument of ID is simply reiterating Paley's assertion, under a smokescreen of technical verbosity, and claiming it as a proof when it hasn't moved any of the old pieces on the chessboard.
In this case it is degree of adaptation, not the way complexity is or is not organized, that is supposed to be decisive. So I don't see a direct parallel to Paley. But I haven't actually read Paley.

Joe Felsenstein · 27 March 2011

OgreMkV said: ... Further, I maintain that it is impossible, even in theory, to determine whether a genetic sequence or protein was designed or the result of pure randomness or randomness + natural selection, which renders the entire ID 'argument' moot. The stuff I've seen that provides some semblance of a positive argument for ID has zero difference from what would be expected if evolution were the designer. ...
If the Law of Conservation of Complex Specified Information did the job it was intended to do, then the consequences would be huge, and we would be forced to acclaim Dembski as the greatest figure in evolutionary biology (perhaps ahead of Darwin). Alas for him, the LCCSI doesn't do this, and as a result you are correct -- the nonrandom degree of adaptation of organisms is explicable from natural selection, as you note.
harold said: Sorry, I'm still confused by how CSI is meaningful.
A simple definition of Specified Information would be that it is the negative log (to the base 2)
The log or negative log to the base 2 is a common function to come across in discussions of information that touch on the binary system, because, among other things, it relates the minimum number of digits of string length needed (number of "bits") to convey numerical information. For example, the log base 2 of 8 is 3, and the log base 2 of 16 is 4. To express the numerical information "eight" through the numerical information "fifteen", you need four binary bits (1000 through 1111). For fractions it works the same way, except with negative log 2.
of the fraction of those sequences that are better at flying than a bird.
I'm seriously not sure what you mean here. I'll ask the easy questions first - which birds are we talking about, and how do we measure the quality of their flying in a reproducible way? Which sequences are we talking about? I'm guessing you're talking about randomly generated genome-sized sequences of DNA, and imagining what would happen if those sequences were substituted for the genome of a bird at earliest embryonic stage, or something. Is that right?
We don’t have enough information to actually calculate it, but we can be sure that it is big enough to pass Dembski’s threshold of 500 bits, and thus CSI is present.
At one level, obviously, we are in complete agreement that either CSI is a meaningless term, or it is a trivial term that refers to something, but that has no relevance to the theory of evolution. I'm just not sure it is in any way a meaningful term.

Matt G · 27 March 2011

Their hypothesis (if you can call it that) seems to be that they can use CSI to detect design, and by extension a designer (i.e., God). Do they really want to make God a testable hypothesis which they would be forced to reject if/when their arguments fail?

Joe Felsenstein · 27 March 2011

Sorry for the preceding folks: I and the comment system appended harold's comment to my reply to OgreMkV, and named it a reply to harold. Aarghh!! anyway ...
harold said: Sorry, I'm still confused by how CSI is meaningful. ,,,
me: of the fraction of those sequences that are better at flying than a bird.
I'm seriously not sure what you mean here. I'll ask the easy questions first - which birds are we talking about, and how do we measure the quality of their flying in a reproducible way?
I'm being informal, but any old (flying) bird species. The point is that even without going into detail about how to measure flying ability, they are way better at it than any bird whose genome was a random string of DNA.
Which sequences are we talking about? I'm guessing you're talking about randomly generated genome-sized sequences of DNA, and imagining what would happen if those sequences were substituted for the genome of a bird at earliest embryonic stage, or something. Is that right?
Precisely, and almost all of the time we would not even get one living cell.
me: We don’t have enough information to actually calculate it, but we can be sure that it is big enough to pass Dembski’s threshold of 500 bits, and thus CSI is present.
At one level, obviously, we are in complete agreement that either CSI is a meaningless term, or it is a trivial term that refers to something, but that has no relevance to the theory of evolution. I'm just not sure it is in any way a meaningful term.
It simply is a way of saying that real organisms are so far out into a tail of fitness (or if you prefer, flying ability) that performance this good or better would occur less than 1 time in 2-to-the-500 sequences. And that much is fairly obviously true. Then Dembski trots out the Law of Conservation and concludes that you can't get there by natural selection. But, the way he formulates that and uses it, he's wrong.

Flint · 27 March 2011

CSI appears to be a designation applied to whatever the cdesign proponentsists have already determined is Designed, using some theological test never made explicit.

Joe Felsenstein · 27 March 2011

Matt G said: Their hypothesis (if you can call it that) seems to be that they can use CSI to detect design, and by extension a designer (i.e., God). Do they really want to make God a testable hypothesis which they would be forced to reject if/when their arguments fail?
Well, that is an argument that might persuade a person of faith not to use a god-of-the-gaps approach. However in this case they start by thinking that they have a Law of Conservation that does not allow adaptation by natural selection. And since there is tons of adaptation everywhere in life, they think they have a very big Gap. However, as the Law turns out not to do the job, all they are left with is that the adaptation could instead be due to natural selection. That doesn't prove that it is natural selection, so the Designer is not refuted, just nowhere near proved to be at work.

Joe Felsenstein · 27 March 2011

Flint said: CSI appears to be a designation applied to whatever the cdesign proponentsists have already determined is Designed, using some theological test never made explicit.
I disagree -- I think it is just a way of arguing that there is so much adaptation that it could not be explained by pure mutation (without natural selection). So it is just a mathematicized version of the explosion-in-a-junkyard analogy. As many people here have noted, that analogy has no counterpart to natural selection. If Dembski's Law of Conservation did the job, it would rule out natural selection. But, alas for them ...

Shebardigan · 27 March 2011

The purpose of the concept "Complex Specified Information" is to sneak the concept of a Specifier into the discussion. In the context in which it was inserted, the unstated assumption is that the Specifier is an active entity, i.e. The Designer.

As noted before, a more honest terminology would be "Complex Specifying Information". But this can obviously arise from natural processes, and therefore is a non-starter as an IDC propagational tool.

Complex Specified Information could take practically any form, and have any number of purposes, the number of which could, practically speaking, be zero.

Examples: a grocery shopping list tucked into my shoe by my wife yesterday morning. The painting The Persistence of Memory by Salvador Dali.

Douglas Theobald · 27 March 2011

Joe Felsenstein said:
Douglas Theobald said: The definition of CSI you provide was described in a 2007 PNAS paper by Jack Szostak (2009 Nobel laureate). ... (It's identical to a definition I came up with and submitted to the Journal of Theoretical Biology back in the early 90's, but the paper was rejected -- blah, blah, blah).
... and a special case of it was described by me in a paper in American Naturalist in 1978. I wasn't a reviewer on your 1990 submission but if I had been I might have noted that!
I imagine it's been "invented" independently many times -- it seems a natural measure of functional information to me. In my paper I actually argued that, from a biological perspective, a better definition would be the log of the fraction of sequences that provide the same absolute fitness (conditional on some specified finite sequence space). It's then trivial to show that this type of "CSI" can increase due to natural selection.
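The same-or-greater definition makes that last claim easy to check by simulation. A toy sketch, with an invented fitness function (the count of 1s in a short binary genome) standing in for anything biological:

```python
import math
import random
from math import comb

L = 12  # toy genome length; fitness = number of 1s (a made-up fitness function)

def si_bits(fitness):
    """-log2 of the fraction of all 2**L binary sequences whose fitness
    is the same or greater than the given value."""
    tail = sum(comb(L, k) for k in range(fitness, L + 1))
    return -math.log2(tail / 2 ** L)

random.seed(1)
genome = [random.randint(0, 1) for _ in range(L)]
history = [si_bits(sum(genome))]

for _ in range(40):
    child = genome[:]
    child[random.randrange(L)] ^= 1  # one point mutation
    if sum(child) >= sum(genome):    # selection keeps the fitter variant
        genome = child
    history.append(si_bits(sum(genome)))

print(history[0], "->", history[-1])
```

Because selection never accepts a less fit variant here, the same-or-better tail fraction shrinks and the measured SI can only rise, which is the point: this flavor of "CSI" increases under mutation plus selection.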

REC · 27 March 2011

Unless I'm mistaken, you can even see casual admissions that the stuff of evolution produces "active information" at least in an attempt to critique the digital organism Avida:

"Mutation, fitness, and choosing the fittest of a number of mutated offspring [5] are additional sources of active information in Avida we have not explored in this paper."

"Evolutionary Synthesis of Nand Logic: Dissecting a Digital Organism," by Ewert, Dembski, and Marks

http://evoinfo.org/papers/2009_EvolutionarySynthesis.pdf

I can't think of a compelling reason that if those are sources of active information in Avida, that they are not also in life.

Flint · 27 March 2011

Joe Felsenstein said: I disagree -- I think it is just a way of arguing that there is so much adaptation that it could not be explained by pure mutation (without natural selection).

I don't think this is a disagreement. Start with the assumption of Design, theologically required. Now, how do we determine whether the Designer did it? Well, if doctrine implies it was created as-is, then it was. Evolution is, from this view, nothing more than an incorrect interpretation of what scripture makes obvious. How can we show it's incorrect? Well, maybe it's too complicated to have happened that way. Maybe there's no obvious evolutionary pathway. But these aren't more than ad hoc justification in support of foregone conclusions.

mrg · 27 March 2011

Douglas Theobald said: I imagine it's been "invented" independently many times -- it seems a natural measure of functional information to me.
From what I've seen, Leslie Orgel is usually said to have invented the term "CSI", though he didn't have any "Design" agenda behind it. The whole notion of "functional information" is a quagmire. It is possible to come up with ad-hoc definitions of it for specific circumstances, but there's no general definition that allows a measure of it in one circumstance to be compared to a measure in another circumstance.

Chris Lawson · 27 March 2011

The CSI concept fails on the C, the S, and the I. But its greatest failing is the "Specified" -- because gene sequences are not specific. Not just in theory, but observably so. There are many mutations which make no difference at all to a gene's function. There are other mutations that reduce a gene's function, but not enough to make it completely non-functional. There are genes that vary in function depending on whether they are homozygous or heterozygous (e.g., heterozygous sickle cell malaria resistance good, homozygous sickle cell anaemia bad). There are species that lack enzymes of related species but do just fine (jawless fish can form clots just fine despite having a much simpler clotting cascade than we do, with six enzymes instead of our ten).

The idea that the enormous variation in genetic sequences in the biosphere is *specified* is wrong from the start. Even if one posits for the sake of argument that God created life pretty much as it is today, it is demonstrable that this fictional God did not do so with specified genetic information.

Not that I mind Dembski trying to enumerate and measure the complexity of genetic information. It could be an interesting project if he wasn't obsessed with squaring the circle to prove that God made circles.

Renee Marie Jones · 27 March 2011

Simply put, birds fly and fish swim. They do so a lot better than organisms coded by random strings of DNA (formed by mutation without natural selection – organisms coded for by monkeys typing with four-key typewriters). If we could imagine looking at all possible such organisms with the same length genome as (say) a bird, the fraction of them that would fly as well as a bird, or better, would be incredibly tiny. So tiny that if every particle in the universe were a monkey with an ATGC typewriter, there would not have been enough time by now to produce anything as good as a bird even once since the time of the Big Bang.

Do you really know how small that fraction is? How did you estimate it? Or are you just making this up?

Douglas Theobald · 27 March 2011

a better definition would be the log of the fraction of sequences that provide the same absolute fitness
That should read "the same or greater".

John Kwok · 27 March 2011

Joe Felsenstein said:
Flint said: CSI appears to be a designation applied to whatever the cdesign proponentsists have already determined is Designed, using some theological test never made explicit.
I disagree -- I think it is just a way of arguing that there is so much adaptation that it could not be explained by pure mutation (without natural selection). So it is just a mathematicized version of the explosion-in-a-junkyard analogy. As many people here have noted, that analogy has no counterpart to natural selection. If Dembski's Law of Conservation did the job, it would rule out natural selection. But, alas for them ...
Of course they argue that purely "random" mutation can't do the job, neglecting the importance of prior phylogenetic history in constraining the types of mutations that would be possible, or, more broadly, neglecting something akin to Gould's notion of contingency. This is the fundamental problem I see with Meyer's absurd attempt at "testing" deviations from a "true" Design via data in the fossil record, since there is no prior accounting of that phylogenetic history.

harold · 27 March 2011

Joe Felsenstein -

Would it be reasonable to say that you are pointing out that the term "CSI" as used by Dembski is essentially synonymous with the more common terms "fitness" and "adaptation" (which are themselves more or less synonymous in many contexts)?

To put it another way, Dembski invented the term CSI, claimed that it is a feature of modern living organisms (presumably of those pesky plants that nobody in creationism cares about, too, as well as "sexy" mammals and motile bacteria), and then claimed that it was a feature that could not have evolved.

There are two possible ways to process this, in the light of the strong evidence for the theory of evolution and Dembski's failure to rebut any of the positive evidence.

1) The term is meaningless and there is no reason to think organisms or anything else "have" it, or

2) The term is meaningful but the claim that evolution could not have created the feature is false.

You make a fairly good case for "2)", but it is dependent on a coherent and reproducible definition of "CSI". If ID/creationists start dissembling about the definition of CSI, and my dime says they will, because you have drawn attention to it, that might be a weak argument in favor of "1)".

stevaroni · 27 March 2011

I think the real crux to this thing is whether or not a certain type of "information", let's call it "Dembskian Information" can be created or must be conserved.

How does Dembski address the effects of selection?

We all know, and I think that even Dembski himself cannot successfully gloss over, the fact that selection actually exists.

A population of gazelle is born. They are all somewhat different. Some of them are better at surviving to maturity than others. Though there are statistical blips (some otherwise very fit gazelle will still be struck by lightning, for example), in general the environment makes a decent decision about what works.

This works, that doesn't. Speed, yes. Tendency to stand their ground against the lions... not so much.

The environment has made a selection based on experimentation with what works, therefore adding information to the genome. Certain choices worked, and were kept, certain choices didn't and were discarded in a pile of bloody fur.

Did Dembski ever get around to addressing what effect selection might have on Dembskian Information?

Ryan Cunningham · 27 March 2011

Bullshit.

This is the definition of a scientifically vacuous concept. You can't precisely define (let alone measure) any of the terms involved in his equations. He's asserting that something is "impossible" by assigning an arbitrary threshold on an unmeasurable quantity. Dembski is just inconsistently flinging technical terms around to create an intellectual smoke screen. In the end, he's made a decades-long career out of ineptly obfuscating an argument from ignorance.

Think about it, Joe. You wouldn't even let an undergrad get away with this kind of crap.

Douglas Theobald · 27 March 2011

Ryan Cunningham said: Think about it, Joe. You wouldn't even let an undergrad get away with this kind of crap.
Joe is not defending Dembski's argument; he is defending the reasonableness of at least one possible definition of CSI. This is no different than the analogous creationist argument against evolution using the second law of thermodynamics. The argument is BS; the concept of entropy is not. To take the analogy a bit further -- the definition of CSI that Joe proposes is measurable in some simple situations (see the Szostak PNAS paper I cited http://www.pnas.org/content/104/suppl.1/8574.long ). In real, complicated organisms it will of course be impossible to measure, but that doesn't mean that we should conclude that a value for it does not exist. Analogously, entropy changes are measurable in simple systems, but try to measure the entropy change for the development of a chicken from an egg to an adult hen.

John Kwok · 27 March 2011

Ryan Cunningham said: Bullshit. This is the definition of a scientifically vacuous concept. You can't precisely define (let alone measure) any of the terms involved in his equations. He's asserting that something is "impossible" by assigning an arbitrary threshold to an unmeasurable quantity. Dembski is just inconsistently flinging technical terms around to create an intellectual smoke screen. In the end, he's made a decades-long career out of ineptly obfuscating an argument from ignorance. Think about it, Joe. You wouldn't even let an undergrad get away with this kind of crap.
Thanks for stating the obvious, Ryan. I am disappointed that Joe Felsenstein, of all people, might be willing to concede that Dembski and his fellow Dishonesty Institute "scientifically-trained" mendacious intellectual pornographers may have a point. They have none, as Jeffrey Shallit and Wesley Elsberry, among others, have demonstrated ever since Dembski started pimping his breathtaking inanity.

John Kwok · 27 March 2011

Douglas Theobald said:
Ryan Cunningham said: Think about it, Joe. You wouldn't even let an undergrad get away with this kind of crap.
Joe is not defending Dembski's argument; he is defending the reasonableness of at least one possible definition of CSI. This is no different than the analogous creationist argument against evolution using the second law of thermodynamics. The argument is BS; the concept of entropy is not. To take the analogy a bit further -- the definition of CSI that Joe proposes is measurable in some simple situations (see the Szostak PNAS paper I cited http://www.pnas.org/content/104/suppl.1/8574.long ). In real, complicated organisms it will of course be impossible to measure, but that doesn't mean that we should conclude that a value for it does not exist. Analogously, entropy changes are measurable in simple systems, but try to measure the entropy change for the development of a chicken from an egg to an adult hen.
I don't claim to understand the mathematical equations, but I think Jeffrey Shallit and Wesley Elsberry have been quite thorough in debunking Dembski's mathematical absurdities and have been so effective that I don't think anyone, especially Joe Felsenstein, should offer them even a proverbial "fig-leaf" of support like this.

Henry J · 27 March 2011

Don't ask if something was designed - that's the wrong question. Ask if it was engineered, and by what method - different methods of engineering leave evidence that can be identified. To ask if something was designed requires prior knowledge about the intended purpose or function of the design; without that prior knowledge the question is useless, and maybe even meaningless. ----------------

natural selection qualifies as an intelligent designer.

A better way of putting that (imnsho) is to point out that an evolving gene pool has at least two of the attributes that we associate with intelligence - it tries experiments (produces varieties in addition to those already present), and it records the variations that increased production, simply by having more copies of them left in the pool. It's sort of analogous to what a neural network does.

mrg · 27 March 2011

Informally speaking, there's no reason not to talk about "genetic information" or "biological information", and I certainly will accept that such concepts could be reasonably defined for specific cases.

But only for those specific cases, and more importantly, "is this trip really necessary?" How much would we lose if we didn't pay any attention to "genetic information"? Its utility seems limited, and I worry that any discussions of it are simply contributing to the dedicated efforts of the ID camp to muddy the waters.
I think you could train a parrot to be a creationist if you just taught him to squawk: "No new information!"

Actually, I think you could have always trained a parrot to be a creationist, but these days "information" is the relevant verbiage.

Douglas Theobald · 27 March 2011

I don't claim to understand the mathematical equations, but I think Jeffrey Shallit and Wesley Elsberry have been quite thorough in debunking Dembski's mathematical absurdities and have been so effective ...
As I understand it (and Shallit and/or Wesley should jump in here), their primary arguments have not been against the definition of CSI per se but rather against Dembski's claims about its implications (like the purported "Law of conservation of CSI", which Joe criticizes). And in any case, Joe's definition is a bit different from Dembski's -- he's showing that there is in fact a valid/reasonable definition of the thing Dembski's CSI is supposed to measure. (Whether Joe's CSI, which is equivalent to Szostak's functional information, is of any practical use is a different question and I'm skeptical that it is -- this was one of the criticisms I received of my JTB submission, and I think it's somewhat valid.)

Wesley R. Elsberry · 27 March 2011

I think we covered how to do the sort of useful work Dembski claims his concept is supposed to do, but doesn't. Check out section A.1 in our appendix. We apply the universal distribution to get "Specified Anti-Information". There is no argument over probabilities, since this is out of algorithmic information theory. And there is no appeal to "rarefied design", as what SAI signifies is that something with high SAI is due to a simple computational process.

As we note in section 5 and in the appendix, we believe that what CSI actually identifies, when it can be said to work at all, is the outcome of simple computational processes. That's why our "specified anti-information" (SAI) is a superior approach to "specification" than Dembski's methods. [Posted here.]
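[The compressibility intuition behind SAI is easy to demonstrate. This is not the Elsberry-Shallit formalism itself, just a sketch of the underlying idea, with zlib standing in for the uncomputable Kolmogorov complexity and with made-up example strings:]

```python
import random
import zlib

# The intuition behind "specified anti-information": strings produced by a
# simple computational process compress well, while algorithmically random
# strings do not. zlib is only a crude stand-in here for the (uncomputable)
# Kolmogorov complexity used in the actual formalism.

def compressed_size(s):
    return len(zlib.compress(s.encode(), level=9))

random.seed(0)
patterned = "ATGC" * 250                                   # output of a tiny "program"
scrambled = "".join(random.choice("ATGC") for _ in range(1000))

print(compressed_size(patterned), compressed_size(scrambled))
```

The patterned string compresses to a small fraction of its length; the scrambled one stays near the two-bits-per-base floor, which is what flags it as (pseudo)random rather than the product of a simple process.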

harold · 27 March 2011

Ryan Cunningham said: Bullshit. This is the definition of a scientifically vacuous concept. You can't precisely define (let alone measure) any of the terms involved in his equations He's asserting that something is "impossible" by assigning an arbitrary threshold on an unmeasurable quantity. Dembski is just inconsistently flinging technical terms around to create an intellectual smoke screen. In the end, he's made a decades long career out of ineptly obfuscating an argument from ignorance. Think about it, Joe. You wouldn't even let an undergrad get away with this kind of crap.
I hope it's obvious that I am inclined to this view myself. I am willing to cut Joe some slack here. He thinks he can pin them down to an arguably meaningful definition of CSI. (Note that this is not a question of whether Dembski is FOS, but merely for which reason Dembski is FOS.) My guess is that Joe's own attempt to give ID/creationists some minimal credit will cause them to dissemble and misrepresent him (probably in another forum) and prove to him that they don't even deserve that much credit. But for now, that is just my guess. He is making an honorable effort to assign some kind of meaning to the term "CSI". I'll wait and see how it turns out. And I'll be quite surprised indeed if it does not turn out as I predicted.

John Kwok · 27 March 2011

One of the few "intelligent" posters in reply to MathGrrl's post:

Noesis

03/24/2011

6:40 pm

MathGrrl,

I was not challenging you with my remark about Omega. I was trying to give you a hint.

I’ll say outright this time that Dembski wanted (past tense, because we’re talking about work he seems to have abandoned) dearly to have a probability measure on the space of possible biological forms, so he could take the negative logarithm of the probability of a form to get information.

As Stuart Kauffman has pointed out, there can be no such probability measure, because none of us can know the space of possible biological forms (or phase space, as he puts it).

Dembski does not know the phase space. He has often complained that evolutionary biologists won’t give him the probabilities that he needs. He has indicated that evolutionary theory is deficient because it does not yield those probabilities. He seems to believe that if the theory says that there are chance contributions to biological evolution, then it should provide probabilistic models. This does not follow logically. If I see you flipping an apparently fair coin to select inputs to a “black box,” then I know that there is a chance contribution to the behavior of the system. But there is no way for me to provide a detailed probabilistic model. In particular, I do not know the range of responses of the black-box system.

Dembski promised long ago to produce an upper bound on the probability of evolution of the bacterial flagellum. He has yet to get back to us with that. If he should ever claim to have that bound, it will be bogus. Again, he cannot measure probability on a set he cannot hope to define. And without probability, there is no CSI.

Joe Felsenstein · 28 March 2011

Greetings to all the people who hollered “bullshit” and insisted that all parts of Dembski's argument are completely wrong and that I was naïve. No, I'm not naïve, and yes, I've put a lot of work into understanding Dembski's arguments over the years, and yes, I know who I'm dealing with.

And no, not all parts of his argument are meaningless or wrong. Specified Information is not a silly concept -- not when Leslie Orgel invented it, not when I invented a relative of it, not when Doug Theobald invented it, and yes, not when Dembski used it. We can't really compute it in real cases, but we can easily see that the amount of it (500 bits) that Dembski needs to call it Complex Specified information is a lot less than the amount of it in any real life form. That is, any old life form you care to study has a genotype whose fitness is certainly in the upper 1-in-10-to-the-150th part of the distribution of fitnesses of all possible DNA sequences of the same length as that genome.

And that's all you need to show that a genotype that fit cannot happen even once in a number of trials equal to the number of events that have happened in the whole universe, if each trial is just mutations of all bases, with no natural selection. For Dembski that seals it, the explanation must be design. For the rest of us, no, it could well be repeated rounds of natural selection. The reason Dembski thinks it couldn't is that he has his Law of Conservation of Complex Specified Information. This law has been shown to be (a) unproven, and (b) of the wrong form to do what he needs it to do. And if it is put in the right form it is easily seen to be wrong.

That's where the real body is buried in Dembski's argument, and all the attempts to show that everything else is wrong too actually don't help -- they just distract from understanding where the real problem is in the argument. I would be interested in seeing whether Dembski could make any defense of his argument on this point (so far he has declined to, pointing instead to later arguments of his that are not about the same thing at all).

The fact that MathGrrl, as admirably as she has conducted herself, felt that the point to concentrate on was whether CSI could be precisely defined shows to me that anti-ID people have not understood the importance of concentrating on the issues involving the LCCSI.

Paul King · 28 March 2011

I think that something that is clearly illustrated by the whole affair is that CSI is NOT well understood, even by the people who allege that it is a "problem" for evolutionary biology. And a lot of the blame has to be placed on Dembski's shoulders - in fact even the name is misleading.

I would suggest that the specific measure proposed in this article is another error. I have suspicions that it is close to Dembski's original idea, but if so, he had deviated from that by the time The Design Inference was published. By my reading Dembski does not insist on taking a uniform probability for all possible sequences but instead says that all relevant factors should be taken into account. Which would include the influence of natural selection. But in closing off an obvious error he has made his method impractical and even unworkable in more complex cases, which is why it is never used correctly in any non-trivial case.

Mike Elzinga · 28 March 2011

Douglas Theobald said: Analogously, entropy changes are measurable in simple systems, but try to measure the entropy change for the development of a chicken from an egg to an adult hen.
Some of the problems here with information are similar to the misconceptions regarding entropy. For example, it is not clear what anybody is saying when referring to "the entropy of a chicken at various stages in its development." If we compare just a young chicken with an adult chicken, the adult is approximately a young chicken with a larger volume. If they have approximately the same body temperature, then the adult has more total energy as well as more volume to contain more microstates. Thus the adult has more entropy (entropy scales with volume). So it is not as if the adult is somehow "more organized" and more advanced. In fact, if the misconceptions commonly associated with the term entropy are used, adults are "less advanced" than the younger versions of themselves. One can, in principle, account for all the energy inputs and outputs and temperatures of an organism over its entire life span. That would give you the entropy change; but what does that tell you about the organism and its complexity and organization? Living systems are energy-driven assemblies of many subsystems, each comprised of complex assemblies of atoms and molecules. The entropy of such a system gives you no handle on how the system behaves or how it relates to other living systems. Trying to tack "information" onto living systems appears to be no more helpful than trying to attach "the entropy of the system" to a living organism. If anything, such an exercise seems to misdirect the focus from just what makes a living system what it is. Given the billions of living systems that have existed on this planet, there are better ways to compare them than with a number specifying "information" or entropy.

Chris Lawson · 28 March 2011

Joe,

I agree with most of what you say and certainly don't think you should be accused of being soft on ID for analysing their claims at face value. So I agree -- except for the "Specified Information is not a silly concept" bit. As soon as the word "Specified" is in there, it ceases to have bearing on evolution AND it plays into the hands of Dembski's rhetorical tactics.

A term like "genetic probability of fitness" would be better. In this case, words really matter.

Rolf Aalberg · 28 March 2011

Law of conservation? Law?? Of conservation? Sorry, I am out of my depth here, but out of what does that law emerge? Why and how would something abstract be conserved?

I have to turn my back on the transcendental world, turn to the world in which I live, and ask: WTF has that got to do with the real world? It looks so outlandish to me. Is it really that hard to make sense of the real world, to explain in plain language what it is all about?

IMHO, it seems to me that Dembski is somewhere out in another world; otherwise he might perhaps have done what I often find in books: attempted to make science accessible to the general public and people like me.

OTOH, if it isn't science it may be hard to popularize - that might reveal a fundamental flaw in the subject.

(Most) birds fly, don't they?

Rolf Aalberg · 28 March 2011

Just have to add that Joe Felsenstein's comments help a lot to give me an inkling of what it is all about.

k.e., · 28 March 2011

The only thing interesting happening here is Joe seems to have Stockholm Syndrome.

Hooray for Chris Lawson with "A term like “genetic probability of fitness” would be better. In this case, words really matter"

Indeed, creationists, when they see the word "information", think of Genesis and are trying to shoehorn (wedge) purely subjective beliefs into some muddy metric that doesn't stink.

As far as "The Law of Conservation of Information"?

Want to buy a nice swamp or a free energy machine?

get real.

Venture Free · 28 March 2011

There is one comment that really stands out for me in that entire post, and it is of course conspicuously ignored by everyone. It's by a visitor going by the moniker Tulse. Here it is in its entirety (with a little bit of formatting by me for readability).
I’m a visitor here, so perhaps I’m not familiar with the conventions of this blog. But if this were a physics blog and an Aristotelian asked how to calculate the position of an object from its motion, I wouldn’t expect the respondents to spend time arguing about the motives of the poster, or whether objects remain in motion or naturally come to rest — I’d expect someone to simply post:
y = x + vt + 1/2at**2 where:
  • y = final position
  • x = initial position
  • v = initial velocity
  • a = acceleration
  • t = time
If an alchemist asked on a chemistry blog how one might calculate the pressure of a gas, one wouldn’t argue about the nobility of gold or the Philosopher’s Stone — one would simply post:
p=(NkT)/V where:
  • p = absolute pressure of the gas
  • N = number of gas molecules
  • k = Boltzmann’s constant
  • T = temperature of the gas
  • V = volume of the gas
And if a young-earth creationist asked on a biology blog how one can determine the relative frequencies of the alleles of a gene in a population, one wouldn’t argue about the literal interpretation of Genesis — one would simply post:
p² + 2pq + q² = 1 where:
  • p = population frequency of allele 1
  • q = population frequency of allele 2
These are examples of clear, detailed ways to calculate values, the kind of equations that practicing scientists use all the time in quotidian research. Providing these equations allows one to make explicit quantitative calculations of the values, to test these values against the real world, and even to examine the variables and assumptions that underlie the equations. Is there any reason the same sort of clarity cannot be provided for CSI?
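[Tulse's point is easily demonstrated: the Hardy-Weinberg expression he quotes really is directly computable. A minimal sketch, using an arbitrary allele frequency for illustration:]

```python
# The Hardy-Weinberg expression quoted above, computed directly. Given the
# frequency p of allele 1 (and q = 1 - p), the expected genotype frequencies
# under random mating are p^2, 2pq and q^2, which necessarily sum to 1.

def hardy_weinberg(p):
    q = 1.0 - p
    return {"A1A1": p * p, "A1A2": 2 * p * q, "A2A2": q * q}

freqs = hardy_weinberg(0.5)
print(freqs)   # {'A1A1': 0.25, 'A1A2': 0.5, 'A2A2': 0.25}
assert abs(sum(freqs.values()) - 1.0) < 1e-12
```

Nothing comparable has ever been offered for CSI, which is what makes the comment so pointed.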

Wesley R. Elsberry · 28 March 2011

Joe Felsenstein:

We can’t really compute it in real cases, but we can easily see that the amount of it (500 bits) that Dembski needs to call it Complex Specified information is a lot less than the amount of it in any real life form.

I've pointed out before (still looking for the link) that given Dembski's analysis of a bacterial flagellum in "No Free Lunch", any functional protein of more than a certain relatively low number of peptides exceeds his "universal probability bound", and thus Dembskian "design" would be just about everywhere in biology. In various challenges where people have offered two strings and asked the IDC contingent to distinguish between the randomly generated one and the non-randomly assigned one, I've been able to apply SAI successfully to pick out the non-random instance. SAI is precisely the way to account for the sort of orderliness that Dembski appeals to on an intuitive level but fails to address technically. So I am still of the opinion that Dembski's approach is incoherent and unworkable, but SAI shows that one can explain the examples Dembski uses in a technically competent way that has nothing to do with "designer"-talk.

Joe Felsenstein · 28 March 2011

k.e., said: The only thing interesting happening here is Joe seems to have Stockholm Syndrome.
And what am I expected to say to that? That you've got Oppositional Defiant Disorder? Such an exchange would really help a lot ...
k.e., said: Hooray for Chris Lawson with "A term like “genetic probability of fitness” would be better. In this case, words really matter"
In fact, I have been saying more or less the same thing. The point that Dembski believes leads to his Design Inference is that living organisms are very far out in the upper tail of the fitness distribution. So far that it is not plausible that pure mutation (unaided by natural selection) caused that. "Information" does not have to come into this at all. Dembski can go ahead with his Design Inference if this point is admitted (and I think it is obviously true). To do that, all he needs is to have a conservation law for how far out you are in the tail of the distribution of fitnesses. This he thinks he has (though it turns out he doesn't have a law that will conserve that). Describing the quantity as "information" of any kind is unnecessary. A side effect of making that point is that all the arguments about who understands information theory better than who are not important.
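[The claim that repeated selection drives genotypes far into the upper tail of the fitness distribution can be illustrated with a toy mutation-plus-selection loop. Every detail below -- the target string, the mutation rate, the population sizes -- is invented for the sketch:]

```python
import random

# Toy illustration of the point above: repeated rounds of mutation plus
# truncation selection drive a population far into the upper tail of the
# fitness distribution. Every parameter here is invented.

random.seed(1)
BASES = "ACGT"
TARGET = "ACGTACGTACGTACGT"   # hypothetical optimal genotype

def fitness(seq):
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.05):
    return "".join(random.choice(BASES) if random.random() < rate else b
                   for b in seq)

pop = ["".join(random.choice(BASES) for _ in TARGET) for _ in range(100)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:20]      # "the environment has made a selection"
    pop = [mutate(random.choice(survivors)) for _ in range(100)]

# A random 16-base sequence matches about 4 positions on average; after
# repeated selection the best genotype sits close to the full 16.
print(max(fitness(s) for s in pop))
```

Pure mutation (drop the sort-and-truncate step) stays near the random expectation indefinitely, which is the contrast Joe is drawing.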
k.e., said: As far as "The Law of Conservation of Information"?
Look at my article (cited in the original post). The LCI is an invention of the late Peter Medawar, not William Dembski. Medawar was not only no creationist, his Nobel Prize was for the discovery of the analogy between natural selection and the cellular processes in the immune system. Dembski tries to extend Medawar's LCI to Complex Specified Information, and as pointed out in my article (and previously by Elsberry and Shallit), this doesn't work. Medawar's LCI is for one-to-one mappings, and does not work in a real world that has many-to-one mappings.
k.e., said: Want to buy a nice swamp or a free energy machine? get real.
That must have been addressed to someone else, given what I said about the Law of Conservation of Complex Specified Information.

Joe Felsenstein · 28 March 2011

Wesley R. Elsberry said: Joe Felsenstein:

We can’t really compute it in real cases, but we can easily see that the amount of it (500 bits) that Dembski needs to call it Complex Specified information is a lot less than the amount of it in any real life form.

I've pointed out before (still looking for the link) that given Dembski's analysis of a bacterial flagellum in "No Free Lunch", any functional protein of more than a certain relatively low number of peptides exceeds his "universal probability bound", and thus Dembskian "design" would be just about everywhere in biology.
Agreed -- what I am saying amounts to the same thing. And if one had a conservation law for how far out in the tail of the fitness distribution the organism is, one could then rule out processes like natural selection causing that. You would not need to discuss it in terms of "information". In that sense Dembski's Design Inference argument survives all the criticism that the information theory is incoherent. However, it does not survive the collapse of his conservation law.

SWT · 28 March 2011

Douglas Theobald said: Analogously, entropy changes are measurable in simple systems, but try to measure the entropy change for the development of a chicken from an egg to an adult hen.
This can be measured using whole-body calorimetry. The easiest portion of the process to measure would be from the time one acquires the freshly fertilized egg to the time the chick emerges. If someone out there is interested in funding such a study, let me know and we'll talk $$, since I don't have a calorimeter big enough or properly equipped to do this. Also, the calorimeters I do have are in nearly constant use.

Douglas Theobald · 28 March 2011

SWT said:
Douglas Theobald said: Analogously, entropy changes are measurable in simple systems, but try to measure the entropy change for the development of a chicken from an egg to an adult hen.
This can be measured using whole-body calorimetry. The easiest portion of the process to measure would be from the time one acquires the freshly fertilized egg to the time the chick emerges.
Could you explain, please? I see how you could perhaps get the enthalpy change by enclosing an egg in a calorimeter and measuring the heat given off until the time the chick hatches (isothermal I suppose) -- but how could you get the change in entropy?

co · 28 March 2011

Douglas Theobald said:
SWT said:
Douglas Theobald said: Analogously, entropy changes are measurable in simple systems, but try to measure the entropy change for the development of a chicken from an egg to an adult hen.
This can be measured using whole-body calorimetry. The easiest portion of the process to measure would be from the time one acquires the freshly fertilized egg to the time the chick emerges.
Could you explain, please? I see how you could perhaps get the enthalpy change by enclosing an egg in a calorimeter and measuring the heat given off until the time the chick hatches (isothermal I suppose) -- but how could you get the change in entropy?
Indeed. The Clausius formulation of entropy is going to be fantastically difficult to measure here, and I can't imagine doing an experiment carefully enough to get useful numbers on a living thing like a chicken. You'd have to isolate it so carefully that even things like removing waste would be nearly impossible. But I'd love to see it done right!
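[For reference, the Clausius bookkeeping being discussed is itself trivial; the difficulty is entirely experimental. A sketch with made-up calorimeter readings (the function and data are hypothetical):]

```python
# The Clausius definition under discussion: dS = dq_rev / T. Given a log of
# (heat in joules, absolute temperature in kelvin) readings from a
# calorimeter, the entropy transferred is the sum of q/T over the readings.
# The numbers are made up; as noted above, doing this reversibly for a
# living chicken (waste removal and all) is the fantastically hard part.

def entropy_change(readings):
    # readings: list of (heat q in joules, absolute temperature T in kelvin)
    return sum(q / t for q, t in readings)

readings = [(120.0, 310.0), (95.0, 311.0), (110.0, 309.5)]   # hypothetical data
print(round(entropy_change(readings), 4), "J/K")
```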

JohnK · 28 March 2011

I agree with Paul King's entire comment. Including:
Paul King: I would suggest that the specific measure proposed in this article is another error.
Yes, Dembski in defining CSI in The Design Inference refers to the best estimate of the probability including all known priors, not just simply "the fraction" as Felsenstein suggests based on assumptions of uniform prob distribution/the principle of indifference, which is merely what Dembski is typically forced to fall back on in examples.
The Design Inference, Chap 3, pg 69: ...the probability of an event is never the probability of an event simpliciter, but always the probability of an event in relation to certain background information... Change the background information and the probability changes as well... Definition: The probability of an event E with respect to background information H, denoted by P(E|H) and called "the probability of E given H," is the best available estimate of how likely E is to occur under the assumption that H obtains.

Terenzio the Troll · 28 March 2011

Advocates of ID endlessly repeat the mantra that the presence of CSI proves that design is present.
Please excuse the stupidity of my question: I am pretty sure this point has already been discussed over and over again here at Pandasthumb; being the troll I am, though, I will try and ask nonetheless. How do the advocates of ID define "design"? As others have mentioned in earlier comments, design is something that all of us have an intuitive grasp of, but I don't think it has ever been formally defined by anybody. I might be wrong, of course. As Mr. Felsenstein stated, even if one can formally define CSI, and even if a LCCSI does exist and we have a mathematical form for it, if we cannot specify "design" in any measurable (numeric) form, then we cannot derive any useful inference linking CSI and "design". As Venture Free (or rather Tulse) puts it, the very first step should be something like D = f(i(t)) integral over time T of (i(t)/dt) = K (btw, is it possible to write mathematical symbols like integrals in a comment?)

John Kwok · 28 March 2011

Terenzio the Troll said:
Advocates of ID endlessly repeat the mantra that the presence of CSI proves that design is present.
Please excuse the stupidity of my question: I am pretty sure this point has already been discussed over and over again here at Pandasthumb; being the troll I am, though, I will try and ask nonetheless. How do the advocates of ID define "design"? As others have mentioned in earlier comments, design is something that all of us have an intuitive grasp of, but I don't think it has ever been formally defined by anybody. I might be wrong, of course. As Mr. Felsenstein stated, even if one can formally define CSI, and even if a LCCSI does exist and we have a mathematical form for it, if we cannot specify "design" in any measurable (numeric) form, then we cannot derive any useful inference linking CSI and "design". As Venture Free (or rather Tulse) puts it, the very first step should be something like D = f(i(t)) integral over time T of (i(t)/dt) = K (btw, is it possible to write mathematical symbols like integrals in a comment?)
I have yet to see any meaningful, quantifiable definition for determining "good" Design from leading ID "savants" like Behe, Dembski, Marks, Meyer or Wells. I don't think one exists, IMHO.

OgreMkV · 28 March 2011

I have two issues with the material being discussed.

1) The Law of Conservation of Complex Information (or whatever it is). I think that the reason no one bothers even talking about it is that it is so demonstrably wrong. There is no law. While I agree that this is a critical error in Dembski's arguments, it's not the thrust of Dembski's arguments.

He claimed that it was possible to detect design in a particular sequence of information. That claim is impossible. It cannot be done.

This is a much easier argument than what you are suggesting. It deals directly with both his claims and his processes[sic] and shows simply that he doesn't have a clue what he's talking about.

2) On the calculation of specified information. I think that this, too, is a useless task as applied to genetics/proteins. There is no simple way to extract useful information about a sequence. Mathematically speaking, there is no difference between

AAA GGG CCC UUU
and
AUG CCG GUC UAA

One codes for a valid protein (start and stop included), one does not. Yet, in the search space of protein sequences, they have the same value.

One sequence is more compressible than the other, but for biological purposes that's meaningless.
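[The compressibility remark is easy to check with the crudest possible compressor, run-length encoding. The twelve-base strings are too short for a general-purpose compressor to separate, but counting runs makes the asymmetry plain:]

```python
from itertools import groupby

# Run-length view of the two sequences above: the run-heavy string is
# "simpler" under this crude notion of compressibility, yet that says
# nothing about which sequence encodes a working protein.

def run_length_encode(s):
    return [(base, len(list(group))) for base, group in groupby(s)]

print(len(run_length_encode("AAAGGGCCCUUU")))   # 4 runs
print(len(run_length_encode("AUGCCGGUCUAA")))   # 9 runs
```

The biologically meaningful string is the less compressible of the two, which underlines the point that compressibility and biological function are independent properties.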

I think Joe's example of the bird wing is somewhat useful and somewhat disingenuous. Yes, it would be theoretically possible to search for a system (genome) that provided for 'better' flight (for some definition of better). But that doesn't help us, because we still could not determine if the sequence was designed, evolved, or random.

Complexity, fine, I think we can grasp that at some level. I think it needs a lot more detail though. Information, again, possibly understood by some (and the meaning of the information has nothing to do with it), but again needs a lot more detail. Specified: totally useless -- specified by what, to do what? If you start with a genome that is a bird and modify it through the search space and end up with a penguin, is that more specified or less specified? It's a valid genome (it results in an organism), it just can't fly... but then the original genome probably can't swim and survive Antarctic temps either.

Does this help or am I just saying the same things...

DS · 28 March 2011

Seems to me that the issue is quite simple. Dembski needs to define complex specified information. Then he needs to define the "Law" of Conservation of Information. Preferably he should do both of these things before publishing books on the subject. In particular, he needs to provide some experimental evidence, or at least some rationale, for the latter concept.

Now here is the important part. After defining his terms, Dembski needs to demonstrate conclusively that random mutations and natural selection cannot increase information. So basically, he needs to show that adaptation cannot occur. Until then, all you have is the bold assertion that natural selection cannot produce adaptations. So the argument ultimately comes down to refusing to accept that Darwin was right.

Of course the last one hundred and fifty years of research have shown conclusively that Darwin was right, sometimes in spectacular ways that he himself could never have imagined. Dembski is simply two hundred years behind the times. Maybe that is why he never publishes in real journals. Natural selection is the designer; no intelligence is required. Deal with it.

harold · 28 March 2011

Mike Elzinga -
If we compare just a young chicken with an adult chicken, the adult is approximately a young chicken with a larger volume. If they have approximately the same body temperature, then the adult has more total energy as well as more volume to contain more microstates. Thus the adult has more entropy (entropy scales with volume).
Interestingly, this is more or less a big part of what my initial response was the very first time I heard a creationist say "entropy". Putting aside that there are spontaneous, local entropy decreases in systems all the time, why do you even think that the process of biological evolution is necessarily associated with a net decrease in the entropy of anything? That was during my early "giving creationists too much credit" phase. I soon realized that they did not know or care what entropy was, did not know or care that anyone else knew what entropy was, and would not learn anything or respond to any feedback except with personal attacks and repetition of false assertions. I realized that this question assumed that the creationist being asked had some actual knowledge of entropy, so I stopped using it.

mrg · 28 March 2011

OgreMkV said: I think that the reason no one bothers even talking about it is because it is so demonstrably wrong. There is no law.
I think the physicists have a good saying about it: "This is not right. This is not even wrong." One of the characteristics of the ID crowd is their inability to show they are right in practice while trying to prove the other side is wrong in theory. I keep telling that to creationists and all I get is a blank stare.

John Kwok · 28 March 2011

DS said: Maybe that is why he never publishes in real journals.
No, DS, here's Dembski's rationale as to why he doesn't publish in scientific journals: "I've just gotten kind of blase about submitting things to journals where you often wait two years to get things into print. And I find I can actually get the turnaround faster by writing a book and getting the ideas expressed there. My books sell well. I get a royalty. And the material gets read more." These comments of his have been posted elsewhere on the web on many occasions, but I got the direct quotes here: http://www.physics.smu.edu/pseudo/ID/ However, I don't quite "buy" Dembski's argument, since I have offered him assistance in writing the definitive textbook on Klingon Cosmology, which I am certain would be an instant bestseller. Much to my surprise, he has refused my most generous offer (of course I have stipulated that he must desist from publishing anything further on his favorite mendacious intellectual pornography: Intelligent Design cretinism). He's interested in getting royalties from his Xian flock and from those in the public who are gullible enough to think that he is a credible mathematician.

Douglas Theobald · 28 March 2011

harold said: Mike Elzinga -
If we compare just a young chicken with an adult chicken, the adult is approximately a young chicken with a larger volume. If they have approximately the same body temperature, then the adult has more total energy as well as more volume to contain more microstates. Thus the adult has more entropy (entropy scales with volume).
Interestingly, this is more or less a big part of what my initial response was the very first time I heard a creationist say "entropy".
I intentionally made the question ambiguous, but for me the more interesting question is to consider an adult chicken plus everything it has ever eaten or breathed, and call that the "system". Now what is the entropy change of the system? I know that egg->chicken is spontaneous, so the overall entropy change must be positive (I'm invoking the second law -- somebody who believes in elan vital may think that development violates the second law and won't be convinced by me assuming its validity in this case). But I actually have no clue what to predict about the entropy change of the system here. And regarding the egg-in-a-calorimeter experiment -- I don't even know what to expect for the enthalpy change. Is chicken development exothermic or endothermic? If it were endothermic, that would actually be fun because that would mean that the egg increased in entropy during the process.

Stanton · 28 March 2011

Dembski sez: "I've just gotten kind of blase about submitting things to journals where you often wait two years to get things into print. And I find I can actually get the turnaround faster by writing a book and getting the ideas expressed there. My books sell well. I get a royalty. And the material gets read more."
Of course, any intelligent person can recognize this, from 2000 paces, as sour grapes over not being able to publish any Intelligent Design-themed research in any reputable science journal. On the other hand, that he should be bitter over that when he never did, never does, and never will do any Intelligent Design-themed research... It makes him look very childish.

mplavcan · 28 March 2011

John Kwok said: here's Dembski's rationale as to why he doesn't publish in scientific journals: "I've just gotten kind of blase about submitting things to journals where you often wait two years to get things into print. And I find I can actually get the turnaround faster by writing a book and getting the ideas expressed there. My books sell well. I get a royalty. And the material gets read more."
Let's just translate that for the audience. "I can't get published in scientific journals because my papers get rejected, which hurts my ego. Nevertheless my target audience as an evangelical Christian is not the scientific community, but the public at large. Besides, I would rather get paid money for evangelizing." Hypocrisy^3

fnxtr · 28 March 2011

Joe Felsenstein said: The point is that even without going into detail about how to measure flying ability, they are way better at it than any bird whose genome was a random string of DNA.
Stop the presses. ID has discovered selection.

DS · 28 March 2011

Claiming that publishing in journals takes too long will only work for a couple of years. It has now been over ten years and still no real publications. By now that defense is getting pretty weak, especially since he knows that he will never be taken seriously by the scientific community unless and until he publishes in real journals. If there were anything at all in any of the books it would be in journals by now.

The really funny thing is that if you asked him, I'm sure Dembski would not deny that natural selection can produce adaptations. He just won't get the connection of how that simple admission destroys all of his pseudoscience. Oh well, at least he can still find some rubes to buy his nonsense.

Helena Constantine · 28 March 2011

Glen Davidson said: ...Handaxes are not very complex, yet are readily understood to have been designed--by humans, oddly enough, not God (really, if God were busily operating in the environment, what right would we have to ascribe intelligently-made ancient artifacts to humans?). ...
Actually, farmers in the Roman Empire turned up handaxes in their fields quite frequently and recognized that they were not natural objects. They in fact believed that they were made by the god Jupiter -- specifically, that they were the cinders left over from thunderbolts. Whenever a temple of Jupiter is excavated, it's always found to contain a collection of hand-axes that were dedicated to the god by the farmers who found them. There can hardly be a clearer example of ancient ignorance inspiring a mythological solution, only to be supplanted by a rational one after the scientific revolution.

Venus Mousetrap · 28 March 2011

I still have a challenge for ID people that hasn't been attempted. Find the CSI of a Garden of Eden pattern in the Game of Life cellular automaton.

Unlike the real world, all the rules of the Game of Life are known, which makes it perfect to experiment in. And, unlike in the real world, a Garden of Eden is known to be designed - it cannot exist naturally in the Game of Life, since there is no combination of cells that can result in that pattern. It has to be placed by something outside the automaton.

It's the perfect test subject! Everything about these patterns should show design, if we can detect it as Dembski suggests. But we can't. Garden of Eden patterns don't have a function beyond 'be an impossible pattern'. To calculate CSI we'd need to find all other impossible patterns of the same size. In other words, we need to know everything that's designed before we can tell if something is designed! That's the whole trick of CSI - the only information it finds is what you put into it. If you specify a function, the information you find comes from your specification, not from the organism.

Of course, the explanation for the above is that ID is just nonsense intended to forward a Christian agenda, but I always like to point out their inexplicable refusal to focus on cellular automata, which I think would be the perfect testing ground for something like ID.
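The comment's premise -- that, unlike the real world, every rule of the Game of Life is fully known -- is easy to demonstrate. Here is a minimal, self-contained sketch of the standard B3/S23 update rule in Python (the Garden of Eden search itself is far harder; this only shows that the forward dynamics are trivially computable):

```python
from collections import Counter

def life_step(cells):
    """Advance a set of live (x, y) cells by one Game of Life generation."""
    # Count live neighbors for every cell adjacent to at least one live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # B3/S23: a dead cell with exactly 3 live neighbors is born;
    # a live cell with 2 or 3 live neighbors survives.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in cells)}

# A "blinker" (three cells in a row) oscillates with period 2 under these rules.
blinker = {(0, 0), (1, 0), (2, 0)}
assert life_step(blinker) == {(1, -1), (1, 0), (1, 1)}
assert life_step(life_step(blinker)) == blinker
```

A Garden of Eden pattern is simply a configuration that `life_step` can never output for any input -- proving that a given pattern has that property is the hard, and fully well-posed, part.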

k.e., · 28 March 2011

Joe
Thank you for the clarifications.

Your article is the best explanation I have read yet for Dembski's 'C', 'S', and 'I', particularly where he co-opted those terms from. It's a pity the UD crowd don't take more notice; they might learn something.

The most problematic term is "I". Dembski and his followers conflate "Information" as defined by Shannon, "Knowledge" as in Fundamentalist Christian Theology, and "Function or Fitness".

Dembski has stated "Intelligent design is just the Logos theology of John's Gospel restated in the idiom of information theory."

Dembski's private definitions of Information aside, I don't think even the above assertion will prove much use for advancing the study of evolution.

His agenda clearly is one of social engineering through the use of rhetoric. And that’s the problem with offering him a fig leaf. ID will co-opt honest players and spin it for their own uses.

Such as here, with Hazen et al.'s Functional Information.

http://www.metacafe.com/watch/3995236/mathematically_defining_functional_information_in_molecular_biology_kirk_durston/

Using unambiguous language and clearly establishing terms where consensus can be agreed is not in ID’s interest.

On the word "Law": I think it's unfortunate that such a powerful word is used this way. The law of gravity cannot be argued with; Poe's law, on the other hand, fills blogs. Equally "Conservation" with a capital 'C': in Medawar's LCI it is just an operation followed by an error comparison, not an actual conservation of potential energy. The terms certainly don't bear the same weight (or should that be mass?) in the real world. If Shannon information, i.e. a string of data in a noisy channel, could be made to conserve energy by invoking Medawar's LCI, we really would have a breakthrough worthy of a Newton. Obviously he had no such intention.

I understand why you are going about this the way you are so good luck.

Ted Herrlich · 28 March 2011

My blog was one of the ones Joe pointed out -- BTW, thanks Joe, my hits shot up nicely -- My issue wasn't so much that CSI is vacuous, but more to the point, worthless. I see the problem as two-fold. The first, hilariously pointed out by MathGrrl and all of the commenters, is that no one seems to be able to calculate this number. The idea of CSI isn't a bad one, but it doesn't mean much if no one seems to be able to calculate it. Plus, how can the idea even be supported if no one can explain how to get there?

The second problem is how W. Dembski keeps trying to use it. It's core to his Explanatory Filter, the filter used to detect design. It's been MIA since Dembski first said he would be delivering it to the world, just like PZ pointed out with Paul Nelson and ontogenetic depth. So currently we can detect design -- no, that's not right -- we can detect Intelligent Design by comparing something to a numeric index that no one can calculate. Oh yeah, that takes us down the road a ways. [sarcasm intended]

Dembski also offers CSI as a supporting leg of the whole concept of Intelligent Design, along with Behe's Irreducible Complexity (IC), another completely unsupported idea. So now we have this ID concept standing on two unsupported legs, CSI and IC. Don't know about you, but two-legged stools are pretty unstable, and it's even worse when the two legs cannot provide any support.

I think Joe makes a valid point, something Behe has admitted in one of his many responses to criticisms of IC. I can't imagine Dembski being brave enough to admit that even if CSI were found, and could be calculated to a repeatable degree, it still would not be evidence countering evolution. In reality it's not the CSI alone; it's the acceptance that CSI cannot come about through natural causation -- another yet-to-be-supported idea from Dembski.

And this is something Dembski and Co want to teach in HS biology classes? It sure as hell isn't ready for primetime, let alone a classroom.

Ted Herrlich
tedhohio@gmail.com
http://sciencestandards.blogspot.com

eric · 28 March 2011

SAWells said: Oh, unless the bird is a penguin. Or a moa.
There's a good point here that I think a lot of people missed. In Joe's original post, he assumes flight is (one of many) good measures of CSI-ness. But that's begging the question: you are defining what counts as CSI before you then determine what counts as CSI! The premise (if we use flight...) contains the conclusion (flying things have CSI). The circularity becomes obvious when we use a different premise. Instead of flight-worthiness, how about we use genome incompressibility? In which case, CSI becomes very similar or equivalent to Shannon entropy, and random strings will almost certainly have much more CSI than a bird's. So I have to disagree that Joe has come up with a valid variation on Dembski's idea. Using this methodology, one can define the same string as having CSI or not having CSI based on the choice of initial criteria.
Harold said: Would it be reasonable to say that you are pointing out that the term “CSI” as used by Dembski is essentially synonymous with the more common terms “fitness” and “adaptation”
Fitness measures are inherently relative. Dembski's CSI fundamentally is not - or at least it's not supposed to be. It shouldn't matter in what context we find some string of characters, we should be able to determine designed/not designed solely on the internal characteristics of the string itself. That is the whole point of CSI! So to the extent that Joe's definition works, he has killed the patient to save the patient. He has thrown out one of the defining, fundamental principles of CSI (design can be determined without consideration of other evidence) to try and make it work. Evolutionary concepts of fitness recognize that it is both relative and context-dependent. As the joke goes, "I don't have to outrun the bear, I only have to outrun YOU." But creationist concepts of CSI want to say that your legs are either designed or they aren't regardless of the existence of bears. So I would say, no, not the same.

harold · 28 March 2011

Here is my summary of how I understand this conversation, so far.

1) Dembski claimed that CSI is a factor which, if detected, rules out biological evolution as an explanation for whatever "has CSI". However, a) he cannot accurately define CSI, b) he cannot measure CSI, and c) he cannot offer any rationale to support his claim that even if he could define and measure it, its presence would rule out biological evolution, even in the presence of abundant positive evidence for evolution.

2) Joe is generously offering to help out with issues "a)" and "b)" above. He offers a definition and calculation for CSI. His definition and calculation are not pragmatic (as he concedes), and rely on subjective evaluation of features of organisms, such as "how well a bird flies".

Having said that, he defines CSI, in essence, by estimating the frequency of DNA sequences that would produce a "genome of a well-flying bird", out of the total number of sequences of the same nucleotide length* that can be generated if nucleotides are randomly selected one at a time. *Of course, the actual exact number of nucleotides in a bird genome varies from individual to individual, but let's put that aside for now.

So if some species of bird that flies has ~3 billion base pairs, the total number of random possible "genomes" is about 4 to the power of 3 billion, which is far greater than the number of elementary particles in the universe.

The number of variations that still represent a genome of a viable bird that "flies as well or better" than an example of a living bird of that species is probably surprisingly high, but it is an infinitesimal fraction of the larger number. Assuming inter-observer reliability with respect to flight quality, Joe takes the negative log (base 2) of this fraction and generates a CSI number (in theory).

3) However, Joe then points out that bird flight did evolve, so that, even after Joe doing some of Dembski's work for him, the actual rationale for Dembski's CSI - to contradict evolution - is invalid.

4) Joe's purpose seems to be to emphasize part "c)" of section "1)" above - that even if you do make a reasonable effort to assign a meaning to the term CSI, it still has nothing to do with contradicting evolution.

5) I finally add that, in my opinion, creationists will either ignore Joe, or contradict his definition of CSI without offering a coherent alternative.
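The arithmetic in step 2) above can be sketched in a few lines of Python. This is a toy illustration only, not Dembski's actual procedure, and the count of "good" genomes below is a deliberately absurd hypothetical placeholder; the point is that even a wildly generous estimate leaves the result far above the 500-bit threshold:

```python
import math

genome_length = 3_000_000_000      # base pairs, as in the summary above
total_log2 = 2 * genome_length     # log2(4**L) = 2L equally likely genomes

# Hypothetical, absurdly generous assumption: 10**1_000_000 distinct
# genomes would yield a bird that "flies as well or better".
good_log2 = 1_000_000 * math.log2(10)

# SI = -log2(good / total) = log2(total) - log2(good), in bits.
si_bits = total_log2 - good_log2
print(f"SI is roughly {si_bits:.3e} bits")
assert si_bits > 500               # vastly above the 500-bit threshold
```

Note that the computation works entirely in log space, since 4 to the power of 3 billion cannot be represented as an ordinary floating-point number.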

harold · 28 March 2011

Also, I should note that, in my opinion, Dembski's whole CSI spiel is nothing but the "747 created by a tornado in a junkyard" meme, expanded to many hundreds of expensive pages.

Ironically, creationists are attacking themselves when they use this meme. They contradict their own claim. Substitute "God" for "tornado" above, and you have the creationist claim.

No scientist has ever claimed that living organisms magically appeared in some modern, highly developed state suddenly, so the probability of a living organism being "randomly" assembled all at once is not relevant to science.

Dembski's claims about information are just a variation of the "it is improbable that such and such appeared out of nothing" meme.

Mike Elzinga · 28 March 2011

Douglas Theobald said: And regarding the egg-in-a-calorimeter experiment -- I don't even know what to expect for the enthalpy change. Is chicken development exothermic or endothermic? If it were endothermic, that would actually be fun because that would mean that the egg increased in entropy during the process.
It has to be, on average, endothermic. This can be inferred from what happens when the temperature of the system is reduced. It stops operating. Energy input from a heat bath is necessary not only for keeping the "signaling processes" in the range in which the chemical reactions occur, but also in the range in which the bonds among molecules and atoms are loose enough for matter to flow through the system. Matter flows in on average and forms bonds, which release energy. Some of that energy contributes to the temperature bath. The process cannot be exothermic for long. Energy flowing out would eventually reduce the temperature of the system below where any chemical reactions that sustain the signaling processes would continue. And any bonds would then settle deeper into their mutual potential wells and the system would become rigid. Matter would stop flowing in the system.

harold · 28 March 2011

Mike Elzinga -

Another interesting point.

Of course, all biological systems are endothermic over any reasonably defined scale or time period. All life requires energy input or dies. Chicken embryos utilize stored energy.

The primary energy for life comes almost exclusively from the sun (I'm aware of chemotrophic bacteria; I said "almost" exclusively).

This is actually one of two reasons why the Matrix movies are absurd. You can't "harvest" energy from comatose humans, you have to feed them.

The other reason being that if our lives are "illusions", but the illusions are sustainable and there is no particular advantage to the non-illusory state, there is no logical reason to care.

Tulse · 28 March 2011

eric said: Fitness measures are inherently relative. Dembski's CSI fundamentally is not - or at least it's not supposed to be. It shouldn't matter in what context we find some string of characters, we should be able to determine designed/not designed solely on the internal characteristics of the string itself. That is the whole point of CSI!
Excellent points. The problem with the notion of "information" is that it is always relative to some system of interpretation -- the same string of digits could describe chess moves, or longitude and latitude, or a photo, or the Treaty of Westphalia. Or nucleotide sequences. Or, they could be just a random string. What determines their information content is the system of interpretation that is applied to them -- the string does not inherently "contain" information, free of that context. So the notion that one can identify design solely by looking at purely abstract sequences is absurd.
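Tulse's point can be made concrete with a few lines of Python: the very same four bytes, read under three different interpretive schemes, yield three different "messages". (The byte values are of course a made-up example.)

```python
# Four raw bytes -- their "information content" depends on how we read them.
data = bytes([0x41, 0x54, 0x47, 0x43])

as_text = data.decode("ascii")               # as a character string
as_ints = list(data)                         # as a list of small integers
as_pair = (int.from_bytes(data[:2], "big"),  # as two 16-bit numbers,
           int.from_bytes(data[2:], "big"))  # e.g. coordinate-like values

print(as_text)   # ATGC -- reads like a DNA fragment
print(as_ints)   # [65, 84, 71, 67]
print(as_pair)   # (16724, 18243)
```

Nothing in the bytes themselves says which reading is "the" information; that choice lives entirely in the interpretive system, which is exactly the problem for a context-free design detector.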

Douglas Theobald · 28 March 2011

Mike Elzinga said:
Douglas Theobald said: And regarding the egg-in-a-calorimeter experiment -- I don't even know what to expect for the enthalpy change. Is chicken development exothermic or endothermic? If it were endothermic, that would actually be fun because that would mean that the egg increased in entropy during the process.
It has to be, on the average, endothermic. This can be inferred from what happens when the temperature of the system is reduced. It stops operating.
This reasoning doesn't seem sound -- the system stops operating if I fry the egg, too.

harold · 28 March 2011

Douglas Theobald -

Of course the fact that a system is endothermic doesn't mean that the rate of energy input doesn't matter.

All life requires SOME energy input to survive, of the right type and at the right rate.

Your egg-frying system won't work very well if you try to fry an egg at the temperature of the surface of the sun, either. That doesn't mean that you don't need energy to fry an egg.

Biochemical reactions in modern organisms are very temperature sensitive. You'll find organisms adapted to arctic waters and blazing deserts, but that's actually a very narrow temperature range in the grand scheme of things.

Douglas Theobald · 28 March 2011

harold said: Douglas Theobald - Of course the fact that a system is endothermic doesn't mean that the rate of energy input doesn't matter. All life requires SOME energy input to survive, of the right type and at the right rate.
Harold -- in thermodynamics, rates are irrelevant. Enthalpy is a state function. You seem to be confusing enthalpy with free energy (Gibbs free energy is the relevant state function here, being the measure of spontaneity at constant temperature and pressure, which is roughly the condition for a chicken, especially an egg in our calorimeter). You can have energy input in both endothermic and exothermic reactions. I was gently correcting Mike: the fact that a reaction stops at a lower or a higher temperature does not tell us anything directly about its enthalpy change. And in this case, both reducing and increasing the temperature kill the reaction, so something is wrong with his reasoning.
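Douglas's criterion can be stated compactly. At constant temperature and pressure, spontaneity is governed by the Gibbs free energy, which is why a spontaneous endothermic process would force a positive entropy change for the egg itself:

```latex
\Delta G = \Delta H - T\,\Delta S < 0
\quad\Longrightarrow\quad
\Delta S > \frac{\Delta H}{T} > 0 \quad (\text{when } \Delta H > 0).
```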

Gabriel Hanna · 28 March 2011

I'd love to ask Dembski about the Noether theorems.

For every quantity that is conserved, there is some associated invariance in physical law.

conservation of energy requires that physical laws don't change with time
conservation of momentum requires that physical laws don't change with position

If "specified information" is conserved, what invariance in physical law corresponds to it?

Kevin B · 28 March 2011

Gabriel Hanna said: If "specified information" is conserved, what invariance in physical law corresponds to it?
Judge Jones' Law: Creationist arguments don't change with time.

Mike Elzinga · 28 March 2011

Douglas Theobald said: This reasoning doesn't seem sound -- the system stops operating if I fry the egg, too.
Yes; that is correct. Now you are breaking bonds and tearing the system apart. It is important to recognize that living systems are examples of systems that work within a very narrow temperature range. The systems have to be "soft," and the heat bath in which they are immersed cannot drive them so hard that they come apart.

Douglas Theobald · 28 March 2011

Douglas Theobald said: This reasoning doesn't seem sound -- the system stops operating if I fry the egg, too.
Mike Elzinga said: Yes; that is correct.
So you agree that we can't predict endothermic vs exothermic? Or what?

Gabriel Hanna · 28 March 2011

Kevin B said: Judge Jones' Law: Creationist arguments don't change with time.
Nice! So to prove that specified information is conserved, all creationists have to do is never change their arguments. And they don't. QED. I'm going out right now to lobby my school board to Teach the Controversy!

DS · 28 March 2011

Kevin B said: Judge Jones' Law: Creationist arguments don't change with time.
The Law of conservation of information: The information content in creationist arguments can never increase. In other words, they are immune to information input. They represent a closed system with regards to information.

Mike Elzinga · 28 March 2011

Douglas Theobald said: So you agree that we can't predict endothermic vs exothermic? Or what?
ENDOTHERMIC. And the systems operate within an energy window of 0.01 eV to about 0.04 eV. Below that range they "freeze out", and above that range they start coming apart.

mrg · 28 March 2011

Kevin B said: Judge Jones' Law: Creationist arguments don't change with time.
Qualifier: They can, however, alter their coloration. "Evolution is forbidden by the Second Law of Thermodynamics ... scratch that, make it: evolution is forbidden by the Law of Conservation of Information." I do have to hand them the cookie for sheer brass in simply making up a fundamental law of physics.

Gabriel Hanna · 28 March 2011

mrg said: I do have to hand them the cookie for sheer brass in simply making up a fundamental law of physics.
That's the problem, you can't just invent conservation laws. So let Dembski define CSI rigorously, and then we can find the physical symmetry that corresponds. Then his conservation law can be empirically tested. I will begin holding my breath now; surely Dembski will jump right on this before I die of anoxia.

Douglas Theobald · 28 March 2011

Mike Elzinga said: ENDOTHERMIC. And the systems operate within an energy window of 0.01 eV to about 0.04 eV. Below that range they "freeze out", and above that range they start coming apart.
So, let's get this straight. An egg sits in a calorimeter, held at constant temperature and pressure, and as it develops into a chick you think it takes in heat? What again is the reason? What you've given so far provides absolutely no justification for that belief.

mrg · 28 March 2011

Gabriel Hanna said: That's the problem, you can't just invent conservation laws.
Sure you can. They make up physics all the time in comics and B-videos. Now actually validate them? That's a different story. ID has a car that can never win races because all its engine can do is produce noise. But they don't have any intention of even entering races, much less winning them; making the noise is all that's important. Indeed, the Law of Conservation of Information is defined as vaguely and situationally (direction of wind, phase of moon, etc.) as possible to make sure it cannot be validated. It would lose all its confusion value otherwise.

Tulse · 28 March 2011

Gabriel Hanna said:you can't just invent conservation laws.
If you invent one conservation law, does another have to go away?

Henry J · 28 March 2011

mrg said: It would lose all its confusion value otherwise.
Is that the law of conservation of confusion?

David Utidjian · 28 March 2011

I prefer my eggs poached and runny.

mrg · 28 March 2011

Henry J said: Is that the law of conservation of confusion?
No. Confusion spontaneously grows exponentially.

Mike Elzinga · 28 March 2011

Douglas Theobald said:
Mike Elzinga said:
Douglas Theobald said:
Mike Elzinga said:
Douglas Theobald said:
Mike Elzinga said:
Douglas Theobald said: And regarding the egg-in-a-calorimeter experiment -- I don't even know what to expect for the enthalpy change. Is chicken development exothermic or endothermic? If it were endothermic, that would actually be fun because that would mean that the egg increased in entropy during the process.
It has to be, on the average, endothermic. This can be inferred from what happens when the temperature of the system is reduced. It stops operating.
This reasoning doesn't seem sound -- the system stops operating if I fry the egg, too.
Yes; that is correct.
So you agree that we can't predict endothermic vs exothermic? Or what?
ENDOTHERMIC. And the systems operate within an energy window of 0.01 eV to about 0.04 eV. Below that range they "freeze out", and above that range they start coming apart.
So, let's get this straight. An egg sits in a calorimeter, held at constant temperature and pressure, and as it develops into a chick you think it takes in heat? What again is the reason? What you've given so far provides absolutely no justification for that belief.
As far as I know, all eggs are incubated at some temperature. The temperature differs with species. And as far as I know, most living systems can store energy temporarily. But ultimately they must take in energy and matter in order to keep functioning. They die when taken out of the heat bath in which they normally reside, or when some of their subsystems no longer respond properly to energy input.
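Mike's "energy window" numbers are easy to sanity-check: they are on the order of the thermal energy kT at temperatures where organisms actually live. A minimal sketch (the constant and the incubation temperature are my additions, not from the thread):

```python
K_B_EV = 8.617e-5  # Boltzmann constant in eV/K (CODATA value, rounded)

def thermal_energy_ev(temp_kelvin):
    """Thermal energy kT in electron-volts at a given absolute temperature."""
    return K_B_EV * temp_kelvin

# Hen eggs are incubated near 37.5 C (310.65 K); kT there is about 0.027 eV,
# comfortably inside the quoted 0.01-0.04 eV window.
print(f"{thermal_energy_ev(310.65):.4f} eV")
```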

JimNorth · 28 March 2011

Douglas Theobald said: So, let's get this straight. An egg sits in a calorimeter, held at constant temperature and pressure, and as it develops into a chick you think it takes in heat? What again is the reason? What you've given so far provides absolutely no justification for that belief.
I dunno nothing about calorimetry, but the egg farmer in me says that a 60-watt light bulb hanging approximately one foot above a fertilized egg will eventually lead to the development of a hatched chick. That sounds like an input of energy to me.

Gary Hurd · 28 March 2011

My apologies to all the people smarter than I am, like Joe, who can accept the existence of fantasy stuff like Dembski’s “Complex Specified Information.” Because for the life of me, I cannot get past the fact that even Joe’s attempt to salvage some sense for it fails miserably on the very example he used to help it along. It had two parts:

“Simply put, birds fly and fish swim.”

“A simple definition of Specified Information would be that it is the negative log (to the base 2) of the fraction of those sequences that are better at flying than a bird.”

You see the problem?

Let’s act like good ostriches and pull our heads out of somewhere, and consider the example of the rhea, or the emu, or a slew of other flightless birds. They fly much, much worse than Exocoetidae, but flying fish are said not to fly for some values of “fly.” So, we could substitute bats, who do fly much better than many species of birds. (I’ll skip the thousands of flying insects, or the other thousands that cannot fly.) Bats cannot run worth a damn, so the emu pulls ahead on that score. Which raises another question for CSI: do we need to recalculate (not that we can really calculate it) a new “value” of CSI for birds “Running,” as opposed to “Flying”? Or is it in addition?

The 20 or so species of penguin swim very well, an advantage they share with a dozen or so other bird species, and with nearly all fish, over most other birds (and bats). (I would say that penguins actually do swim much better than some species of fish, i.e., many sculpins and gobies.) So, do we calculate a new CSI score for swimming, too?

It seems we can exclude ostriches from the "created kinds" based on the flying or swimming CSI, but include them based on the running CSI. It makes Genesis much simpler, don't it?
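Gary's point can be made numerically. Joe's quoted definition is a one-liner, but it takes a specification as input, so each specification yields its own number for the same organisms. A sketch (the fractions below are made up purely for illustration):

```python
import math

def specified_information_bits(fraction_meeting_spec):
    """Joe's quoted definition: SI is the negative log (base 2) of the
    fraction of sequences that meet or beat the chosen specification."""
    return -math.log2(fraction_meeting_spec)

# Invented fractions, one per specification -- "flying", "running", and
# "swimming" are different specifications, so each gives a different SI
# value for the very same set of genomes.
for spec, fraction in [("flying", 1e-200), ("running", 1e-35), ("swimming", 1e-80)]:
    print(spec, round(specified_information_bits(fraction), 1), "bits")
```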

OgreMkV · 28 March 2011

The temperature of the environment only affects the temperature range of the interior of the egg (providing for optimum enzyme reactions).

The energy used for the growth and development of the chick is entirely contained within the yolk of the egg. I think the shell is not a gas barrier. If that is correct, then the egg will lose energy (most as heat) as the structure within develops (the chick).

Venus Mousetrap · 28 March 2011

Tulse said:
Gabriel Hanna said:you can't just invent conservation laws.
If you invent one conservation law, does another have to go away?
Well, I laughed. XD

JimNorth · 28 March 2011

OgreMkV said: The temperature of the environment only affects the temperature range of the interior of the egg (providing for optimum enzyme reactions). The energy used for the growth and development of the chick is entirely contained within the yolk of the egg. I think the shell is not a gas barrier. If that is correct, then the egg will lose energy (most as heat) as the structure within develops (the chick).
That was my initial impulse but Mike persuaded my easily distracted mind. My choice is now exothermic. Oh, and the mass of the egg does decrease during incubation by around 2.5%, indicating a loss of energy through the eggshell.

Mike Elzinga · 28 March 2011

OgreMkV said: The temperature of the environment only affects the temperature range of the interior of the egg (providing for optimum enzyme reactions). The energy used for the growth and development of the chick is entirely contained within the yolk of the egg. I think the shell is not a gas barrier. If that is correct, then the egg will lose energy (most as heat) as the structure within develops (the chick).
Energy storage may be part of the confusion. I don't know of any example of a living organism that does not exist within an energy cascade and within a narrow energy window within that cascade. Any system that is driven by some form of energy flow is not going to violate any laws of physics. It is not possible to have any kind of system that puts out more energy than it takes in. Whatever stored energy it has eventually gets used up unless replenished. Replenishing that energy takes at least as much energy as what is replenished. I think most living systems are quite inefficient in their use of energy. The net flow through them over their lifetimes may be zero, but in the overall process, more energy is spread around and the overall entropy of the universe increases.

Douglas Theobald · 28 March 2011

JimNorth said: the egg farmer in me says that a 60-watt light bulb hanging approximately one foot above a fertilized egg will eventually lead to the development of a hatched chick. That sounds like an input of energy to me.
It of course takes energy to maintain constant temperature in an open system, where heat is always dissipating away. But a calorimeter is very very close, by design, to an isolated system (it is extremely well-insulated and will hold a constant temp for a very long time with no heat input). In fact, it takes no energy input (zero) to maintain constant temp in an isolated system at equilibrium (a perfectly insulated system). If an egg is sitting in our extremely well-insulated calorimeter, you don't need a heater to incubate it, as long as it all starts out at the right temp -- the insulation keeps heat from entering or leaving. Now what happens to the temp in the calorimeter as the egg sits there? If the egg is mostly just oxidizing carbs, then it gives off a little heat itself (carb oxidation is exothermic), and the temp goes up a little bit. On the other hand, many reactions take in a little heat, and those are endothermic. If the development of that egg is endothermic, the temp will drop a bit. But just saying "you gotta maintain temp for the reaction to go" tells you nothing about whether the reaction is exothermic or endothermic.
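The sign logic in Douglas's thought experiment reduces to one line: in a well-insulated calorimeter, heat released by the egg has nowhere to go, so temperature rises for an exothermic process and falls for an endothermic one. A trivial sketch with invented numbers (the heat capacity and heats are illustrative, not measured):

```python
def calorimeter_delta_t(heat_released_j, heat_capacity_j_per_k):
    """Temperature change of well-insulated calorimeter contents.
    heat_released_j > 0 means the process inside is exothermic."""
    return heat_released_j / heat_capacity_j_per_k

# Exothermic: 500 J released into contents with C = 1000 J/K -> temp rises.
print(calorimeter_delta_t(500.0, 1000.0))   # +0.5 K
# Endothermic: the same magnitude absorbed -> temp drops.
print(calorimeter_delta_t(-500.0, 1000.0))  # -0.5 K
```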

Reed A. Cartwright · 28 March 2011

I really bet this has been measured. If not, it should be. I think this paper has some on topic citations:

http://jp.physoc.org/content/150/1/239.full.pdf

Douglas Theobald · 28 March 2011

Mike Elzinga said:
Douglas Theobald said: So, let's get this straight. An egg sits in a calorimeter, held at constant temperature and pressure, and as it develops into a chick you think it takes in heat? What again is the reason? What you've given so far provides absolutely no justification for that belief.
As far as I know, all eggs are incubated at some temperature.
But what does that have to do with the question I asked? As I explained to Jim, the fact that a reaction has an optimal temperature tells you nothing whatsoever about whether the reaction is exothermic or endothermic. Here's a real example I know well, as I teach this in my graduate thermodynamics class. Protein folding (say, for lysozyme) is spontaneous, as long as you do it at the right temperature. Most proteins prefer something around room temperature. If you try to fold a protein at too low of a temperature, it doesn't work (it's called cold-unfolding) and if you try it at too high a temperature, it also doesn't work (then it's called heat denaturation -- it's what happens to an egg when you boil it, you denature the proteins and they turn white). But this tells you nothing about whether protein folding is exothermic or endothermic -- some are one, some are the other. Most proteins (not all) are exothermic when they fold. And this is true even though you have to incubate them at the right temp to get them to fold, and it's also true even though if you cool them down enough they no longer fold (a fact that directly contradicts Elzinga's rationale).

Mike Elzinga · 28 March 2011

Douglas Theobald said:
JimNorth said: the egg farmer in me says that a 60-watt light bulb hanging approximately one foot above a fertilized egg will eventually lead to the development of a hatched chick. That sounds like an input of energy to me.
It of course takes energy to maintain constant temperature in an open system, where heat is always dissipating away. But a calorimeter is very very close, by design, to an isolated system (it is extremely well-insulated and will hold a constant temp for a very long time with no heat input). In fact, it takes no energy input (zero) to maintain constant temp in an isolated system at equilibrium (a perfectly insulated system). If an egg is sitting in our extremely well-insulated calorimeter, you don't need a heater to incubate it, as long as it all starts out at the right temp -- the insulation keeps heat from entering or leaving. Now what happens to the temp in the calorimeter as the egg sits there? If the egg is mostly just oxidizing carbs, then it gives off a little heat itself (carb oxidation is exothermic), and the temp goes up a little bit. On the other hand, many reactions take in a little heat, and those are endothermic. If the development of that egg is endothermic, the temp will drop a bit. But just saying "you gotta maintain temp for the reaction to go" tells you nothing about whether the reaction is exothermic or endothermic.
This is exactly why it is important to recognize living systems, as we know them, as existing within an energy window between about 0.01 eV and 0.04 eV. The systems are "soft" and any processes that catalyze chemical reactions from matter taken in or stored are going to be driven by the energy in that heat bath. Life is an extremely hair-triggered set of processes taking place within extremely delicately coupled complex systems. But none of it violates any laws of physics at any time in its history; and that includes the laws of thermodynamics.

SWT · 28 March 2011

I only have time for a quick hit and run right now, hopefully I'll get back to this later tonight.

Turns out there have been a couple of calorimetric studies of hen egg gestation (see citations in Lamprecht, Thermochimica Acta 405 (2003) 1-13). The gestation process is exothermic; it's driven by the metabolism of the embryo/fetus/whatever you call it for birds.

To get a handle on the entropy production of a subject in a calorimeter (in this case the egg and its inhabitant), you have to work simultaneously with the mass, energy, and entropy balances for the system. (I have a truly marvelous proof of this, which the margin of this comment is too narrow to contain.)

Doc Bill · 28 March 2011

Actually, penguins fly in the water and birds swim in the air. Ironically, birds do the butterfly stroke.

I've never seen a fish do a legal breaststroke, although turtles qualify.

Flies have been reported to do the backstroke, but only in soup.

Douglas Theobald · 28 March 2011

Reed A. Cartwright said: I really bet this has been measured. If not, it should be. I think this paper has some on topic citations: http://jp.physoc.org/content/150/1/239.full.pdf
That paper does report exactly what we're talking about. They observed very clear exothermic development of the egg over 21 days. They also show that their measurements are extremely close to what you would predict simply assuming aerobic metabolism (like carb oxidation that I mentioned, which is of course exothermic). As an aside, Jim, the weight loss of the egg is apparently due to slight evaporation (water loss). So I consider it pretty much settled -- it's demonstrably exothermic. But this still tells us nothing about the entropy change of the egg.

John Vanko · 28 March 2011

Gary Hurd said: "You see the problem?"
Well, I think the ID crowd sees the problem now. Just look at MathGrrl's forum on UD. They can't give her a decent answer. She's an ID skeptic who is asking for a mathematically rigorous description of CSI a la Dembski. One respondent, vjtorley, has given her something, but it's not strictly 'a la Dembski'. It's something else, with impossible-to-compute parameters that must be estimated. http://www.uncommondescent.com/intelligent-design/on-the-calculation-of-csi/#comment-375469 I gotta believe in the ID-ers' heart-of-hearts they keep hoping that some skeptic can formulate their theory for them, equations and all. And for ID-ers hope springs eternal - they never give up.

Mike Elzinga · 28 March 2011

It is not particularly enlightening to look at several individual processes. The overall processes within a living organism have to have a net energy cascade from high to low. You can look in on any particular stage of development of an organism and find one or the other of endothermic or exothermic processes.

If you want to point to a particular stage and show that it is exothermic, you have to ask how that energy got stored there.

Whether some individual processes are endothermic (need an energy kick over a potential barrier) or are exothermic (are releasing energy as they drop into lower potential energy states) is not the issue I understood was being discussed.

Of course there are both kinds of process taking place. But to release energy stored in a dimple at the top of a hill, some input of energy is required. Whatever benefit the organism gains from that depends on the organism.

But overall, living organisms require an energy cascade and a heat bath that keeps their internal systems within the temperature range at which they can function.

The phenomena of hypothermia and hyperthermia are telling us about the functioning temperature ranges of the critical systems that coordinate all this.

TomA · 28 March 2011

Douglas Theobald said: As an aside, Jim, the weight loss of the egg is apparently due to slight evaporation (water loss).
Wouldn't there be loss due to metabolism of carbohydrates for energy production in the form of CO2? This also implies that an egg can't be a closed system because of the gas exchange required for metabolism.

Douglas Theobald · 28 March 2011

Mike Elzinga said: Whether some individulal processes are endothermic (need an energy kick over a potential barrier) or are exothermic (are releasing energy as they drop into lower potential energy states) is not the issue I understood was being discussed.
I did keep asking you about an egg in a calorimeter. But in any case I think you're probably wrong. For most animals, I'm going to bet that overall they are exothermic (put me or a dog or chicken or a snake or a cricket or a ... in a calorimeter, and we'll give off heat due to respiration). Probably the only major endothermic metabolic process on the planet is photosynthesis, and from there on for everybody else it's downhill exothermy.

Douglas Theobald · 28 March 2011

TomA said: Wouldn't there be loss due to metabolism of carbohydrates for energy production in the form of CO2? This also implies that an egg can't be a closed system because of the gas exchange required for metabolism.
Yeah, you'd also lose some weight from the CO2. I bet it's less than the weight lost from water, though. And yes, an egg is surely not a closed system. But an egg in a calorimeter is. (In fact, it's an isolated system.)

Paul Burnett · 28 March 2011

JimNorth said: Oh, and the mass of the egg does decrease during incubation by around 2.5% indicating a loss of energy through the eggshell.
Do you mean that 2.5% of the mass of an egg is converted into energy? E=mc2 and all that? Can we harness that effect? Here I always thought that the egg loses a bit of weight because it loses moisture through the eggshell...?

mrg · 28 March 2011

Paul Burnett said: Do you mean that 2.5% of the mass of an egg is converted into energy? E=mc2 and all that? Can we harness that effect?
Somehow I am reminded of Isaac Asimov's PATE DE FOIE GRAS: http://en.wikipedia.org/wiki/P%C3%A2t%C3%A9_de_Foie_Gras_%28short_story%29

mrg · 28 March 2011

John Vanko said: Well, I think the ID crowd sees the problem now. Just look at MathGrrl's forum on UD. They can't give her a decent answer.
That's never been an obstacle before, and I doubt they see the problem now. If they had been inclined to, they would have never gone down this road to begin with.

Stanton · 28 March 2011

mrg said:
John Vanko said: Well, I think the ID crowd sees the problem now. Just look at MathGrrl's forum on UD. They can't give her a decent answer.
That's never been an obstacle before, and I doubt they see the problem now. If they had been inclined to, they would have never gone down this road to begin with.
That is true: Creationists and Intelligent Design proponents have never seen how honesty could ever be profitable.

Mike Elzinga · 28 March 2011

Douglas Theobald said:
Mike Elzinga said: Whether some individulal processes are endothermic (need an energy kick over a potential barrier) or are exothermic (are releasing energy as they drop into lower potential energy states) is not the issue I understood was being discussed.
I did keep asking you about an egg in a calorimeter. But in any case I think you're probably wrong. For most animals, I'm going to bet that overall they are exothermic (put me or a dog or chicken or a snake or a cricket or a ... in a calorimeter, and we'll give off heat due to respiration). Probably the only major endothermic metabolic process on the planet is photosynthesis, and from there on for everybody else it's downhill exothermy.
I thought the egg was a bit off the direct path of discussion. I am currently traveling and not always near a computer. I am not sure why you are saying life is exothermic based on only one of the processes in the cascade. How long can any process remain exothermic? The laws of thermodynamics require energy cascades. Condensing matter starts with matter in higher energy states that then release energy in order to drop into mutual potential energy wells. But it requires energy input to form some stable compounds, some of which are the building blocks of life. That energy comes from cascades. And when we get to the level of complexity of the weakly coupled complex systems in living organisms, a heat bath is needed to drive them and maintain them within a very narrow energy window. Such systems are constructed of building blocks that were built in much more energetic environments, and for which the products got shuttled into less energetic environments. I would not have chosen to single out endothermic and exothermic processes in describing living systems. They are only a very small part of what goes on in such systems; and they don't form a major characterization of such systems. In stepping back and looking at the fact that living systems exist in a very narrow energy window and cease to function outside that window, these systems exhibit all of the same characteristics of any energy-driven complex system maintained within a range where it can "flop around" yet not come all apart.

John Vanko · 28 March 2011

mrg said: "If they had been inclined to, they would have never gone down this road to begin with."
Alas, I have been guilty of acting, and speaking, before thinking more times than I care to remember. A human flaw, or rather trait, without regard for religious or scientific inclinations. But when shown my error, I usually back away or change my mind. Not so ID-ers, or IBIG, or FL.

Douglas Theobald · 28 March 2011

SWT said: To get a handle on the entropy production of a subject in a calorimeter (in this case the egg and its inhabitant), you have to work simultaneously with the mass, energy, and entropy balances for the system.
I'm really skeptical -- but if you are doing something exceptionally clever, I'm quite interested. As I understand it, the entropy change of the system is never directly measured, only inferred from other state function changes that are directly measured (from, for instance, ΔG or Keq and ΔH). If we have a calorimetric enthalpy change, and we can get the equilibrium constant for a reaction (and there's usually some experimentally tractable way to do that), then getting the entropy change of the system is easy. The problem here is getting an equilibrium constant or something analogous -- it's hard to even imagine what it could be in the case of an egg hatching a chick.
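The chain of inferences Douglas describes (ΔG from an equilibrium constant, then ΔS from ΔH and ΔG) is short enough to sketch. The numbers below are invented purely for illustration, not measurements:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def entropy_change(delta_h_j_mol, k_eq, temp_k):
    """Infer the system entropy change from a calorimetric enthalpy and an
    equilibrium constant, via dG = -RT ln K and dS = (dH - dG)/T."""
    delta_g = -R * temp_k * math.log(k_eq)
    return (delta_h_j_mol - delta_g) / temp_k

# Made-up folding-like reaction at 298 K: dH = -50 kJ/mol, K = 1e5.
ds = entropy_change(-50_000.0, 1e5, 298.0)
print(f"dS = {ds:.1f} J/(mol K)")  # roughly -72 J/(mol K) for these inputs
```

This is exactly why the egg is so hard: the calorimeter can hand you ΔH, but there is no obvious "equilibrium constant" for egg → chick, so the second input is missing.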

Gary Hurd · 28 March 2011

I can only think of (most) fish, and some amphibian eggs, that don't require external warming. Not even all amphibian eggs can survive without external heat during incubation, and some fish (esp. Surf Perch: Embiotocidae) internally provide O2 and nutrition to embryos/larvae.

Why is this still interesting?

Again today, I suspect you are far smarter than I am, and are enjoying a joke I have failed to understand.

Gary Hurd · 28 March 2011

David Utidjian said: I prefer my eggs poached and runny.
Runny, but not too runny. And, I prefer using a poaching liquid of beef stock and sherry - a Jerez solera of course.

k.e., · 28 March 2011

Tulse said:
Gabriel Hanna said:you can’t just invent conservation laws.

If you invent one conservation law, does another have to go away?

Yup .....The Lemon Test/Law

Joe Felsenstein · 29 March 2011

Thanks, folks, for an interesting thread so far. No trolls disturbed it (even though one commenter used Troll in their name), unless you count the boiled chicken subthread.

A few thoughts:

1. Maybe I should not have tried to formally define CSI. My point was not that there is some way to make a formula, but that Dembski's argument can skip all the information theory and just rely on the self-evident fact that real organisms are far out in the tails of the distribution of fitness (or flying ability, or swimming speed), so far that mutation alone (no natural selection allowed) could not have got them there. I think my definition will be useful in some future model, but it got everybody focused on whether one could precisely define it. So let's forget it for now. My point was really that one didn't need to. And that the fatal problem of the Design Inference lies elsewhere, in the Law that was supposedly able to rule out natural selection as the explanation for the high level of adaptation.

2. Dembski's conservation law sort-of isn't a conservation law. For evolutionary forces that carry out a deterministic one-to-one mapping in genotype sequence space, he sort-of sketched a proof (a flawed one, as it turned out). For evolutionary forces that had randomness involved, such as random mutation or genetic drift, he didn't get as far, but one can see where he was headed. He chose the 500-bit threshold (for calling the result Complex) on the grounds that random trials in this universe, by all particles since it started, could not be expected to get you any farther than that. (He never dealt with many-to-one mappings, as far as I know.) Of course the whole thing, even if formalized and corrected somehow, fails on the grounds that it changes the specification in midstream. Anyway, with the threshold of 500 bits, it is not really conserving anything. More like trying to put a bound on it.
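The 500-bit figure is not mysterious: Dembski's universal probability bound is usually stated as 10^-150 (from estimates of particles in the observable universe, the Planck-time event rate, and the age of the universe), and 500 bits is just that bound expressed in base-2 logs. A quick check (the calculation is mine, not from the comment):

```python
import math

# Dembski's universal probability bound, expressed in bits.
bound = 10.0 ** -150
bits = -math.log2(bound)
print(f"{bits:.0f} bits")  # ~498, which Dembski rounds up to 500
```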

I notice that at Uncommon Descent there is a new post by vjtorley arguing that they don't have to precisely define CSI. However they are still avoiding dealing with the criticisms that Shallit, Elsberry, and I made of the Law of Conservation of Complex Specified Information. And those criticisms are lethal. And any attempt to declare that seeing CSI (even if you can formally define it) implies Design (or anyway, not-natural-selection) founders in the absence of a Law like that. Something that rules out natural selection in explaining why fitness is so high, or why birds can fly, or why fish can swim.

Now I will let you go back to Kentucky Fried endothermic chickens and eggs.

Rolf Aalberg · 29 March 2011

Gabriel Hanna said: I'd love to ask Dembski about the Noether theorems. For every quantity that is conserved, there is some associated invariance in physical law: conservation of energy requires that physical laws don't change with time; conservation of momentum requires that physical laws don't change with position. If "specified information" is conserved, what invariance in physical law corresponds to it?
From my layman's perspective, is that another way of expressing the question I've been asking myself: Since life is a physical thing, what use is (conservation of) CSI if it can't be tied to physics?

Rolf Aalberg · 29 March 2011

Mike Elzinga said:
OgreMkV said: The temperature of the environment only affects the temperature range of the interior of the egg (providing for optimum enzyme reactions). The energy used for the growth and development of the chick is entirely contained within the yolk of the egg. I think the shell is not a gas barrier. If that is correct, then the egg will lose energy (most as heat) as the structure within develops (the chick).
Energy storage may be part of the confusion. I don't know of any example of a living organism that does not exist within an energy cascade and within a narrow energy window within that cascade. Any system that is driven by some form of energy flow is not going to violate any laws of physics. It is not possible to have any kind of system that puts out more energy than it takes in. Whatever stored energy it has eventually gets used up unless replenished. Replenishing that energy takes at least as much energy as what is replenished. I think most living systems are quite inefficient in their use of energy. The net flow through them over their lifetimes may be zero, but in the overall process, more energy is spread around and the overall entropy of the universe increases.
There's got to be something to that: I sleep both summer and winter with only a thin sheet as cover but quite often I'll wake up hot as hell during the night and have to let some of that heat out. But there's more to it: My wife complains of freezing even with the $1000 down duvet I bought her. Now that's a thermodynamic riddle far more important than CSI to me. (This thread needs a little diversion, doesn't it?)

Frank J · 29 March 2011

I’d like to give a perspective that may be unpopular here. I don’t think Complex Specified Information is a vacuous concept, though we usually do not have enough information to actually calculate numbers for it.

— Joe Felsenstein
Pardon my late 2c, especially if this was covered in the 126 comments so far: That perspective is not unpopular with me. While I find Dembski's filter worse than vacuous, in that it exploits public misconceptions that chance, regularity and design are mutually exclusive, I still can't rule out that there may someday be some utility for the CSI concept itself. What I have seen constantly over 10+ years, though, are two things, one good, and one bad. The good one is that, even if successful, CSI would not provide any alternative to evolution. And no one makes that clearer than the DI gang itself. Either they fully concede ~4 billion years of common descent with modification, or spin vague arguments against it that are unrelated to their "complexity" games, or just play dumb about it. It's like they are collectively shouting "we have nothing, but if we ever do, it will be indistinguishable from evolution." The bad part is that it keeps the discussion on the "complexity" issue, which is usually too technical for most people. That's part of the DI's game, because it keeps the focus away from basic "what happened when" questions that would clearly show that the DI has absolutely nothing that would be any comfort to the average evolution-denier (who's usually some "kind" of Biblical literalist). What those deniers do hear is the occasional sound bite of incredulity that makes them erroneously conclude that evolution is "weak," and that that somehow validates their particular fairy tale - even though the DI knows it doesn't and sometimes even admits it! To be clear, I understand that technical responses to the DI's "complexity" arguments (including the "irreducible" ones) are necessary and valuable. But once in a while the public needs to be reminded of how the DI runs from simple "what happened when" questions that they would like to see answered.

Venture Free · 29 March 2011

Wow, check out the massively obfuscatory post made by Mr. Torley: http://www.uncommondescent.com/intelligent-design/why-theres-no-such-thing-as-a-csi-scanner-or-reasonable-and-unreasonable-demands-relating-to-complex-specified-information/ I was going to point out all of the contradictions and undermining of his own thesis, but frankly that requires more time than I'm willing to put in. A few highlights (I'll paraphrase to limit the verbosity a bit):
  • Calculating CSI without any historical knowledge whatsoever can be done with just the barest minimum of historical knowledge.
  • Lack of knowledge can lead to inflated estimates of CSI. But of course there isn't anything we don't know about biology, so we can't make that mistake.
  • Our calculations of the amount of CSI are reliable because even if they're wrong the real value of CSI hasn't changed.
  • Our calculations of the amount of CSI are reliable because even if different people arrive at different values they will eventually just come to agree on which answer is the right one.
  • Quote: ...while the CSI of a complex system is calculable, it is not computable, even given a complete physical knowledge of the system.
I really want to point out the ridiculousness of a lot of his post, but I've been banned for a long time now, and I really don't feel like making a sock puppet.

harold · 29 March 2011

Douglas Theobald -
Harold – in thermodynamics, rates are irrelevant. Enthalpy is a state function. You seem to be confusing enthalpy with free energy (Gibbs free energy is the relevant state function here, being the measure of spontaneity at constant temp and pressure, which is roughly the condition for a chicken, especially an egg in our calorimeter). You can have energy input in endothermic and exothermic reactions. I was gently correcting Mike, as the fact that a reaction runs at a lower or higher temperature does not tell us anything directly about its enthalpy change. And in this case, both reducing and increasing the temperature kill the reaction, so something is wrong with his reasoning.
Thank you for the clarification. It has been a long time since I had to do calculations involving enthalpy and Gibbs free energy, but it is important to recall that biomedical science sits on and is consistent with basic physics and chemistry. I am so used to seeing Mike's basic points about thermodynamics contradicted by ignoramuses that I mistakenly jumped to the conclusion that you were wrong if you were disputing him. I see now that you were providing helpful feedback. I don't think this discussion is entirely off track, because I think the analogy between creationist misuse of the term "information" and creationist misuse of the terms "entropy" and "thermodynamics" is a valid one. In fact, I would add creationist misuse of the term "probability" to this list. In all cases, the creationist is attempting to distract from the evidence for evolution by claiming to "disprove evolution from above", ostensibly with claims from physics or mathematics. Yet cursory examination reveals that the claims about math and physics are false as well.

Douglas Theobald · 29 March 2011

Mike Elzinga said:
Douglas Theobald said: For most animals, I'm going to bet that overall they are exothermic (put me or a dog or chicken or a snake or a cricket or a ... in a calorimeter, and we'll give off heat due to respiration). Probably the only major endothermic metabolic process on the planet is photosynthesis, and from there on for everybody else it's downhill exothermy.
I am not sure why you are saying life is exothermic based on only one of the processes in the cascade. How long can any process remain exothermic?
A long time, given the proper energy source. You give me carbs and protein and fat and I'll remain exothermic until I die (and even after, if you count the bacteria and whatever that makes use of me then). It all goes back to photosynthesis, which is the main endothermic injection in life. The rest of us are riding on the coat-tails.

Douglas Theobald · 29 March 2011

harold said: I don't think this discussion is entirely off track, because I think the analogy between creationist misuse of the term "information" and creationist misuse of the terms "entropy" and "thermodynamics" is a valid one. In fact, I would add creationist misuse of the term "probability" to this list.
I guess I'm Joe's thermo-troll. But I agree, it's not completely off-topic. To retrace the thread here and bring it round -- some people have criticized Joe's CSI concept because it's impossible to measure, in practice, in a bird (you brought up birds first, Joe :). I countered that it is actually measurable in simple systems (and has been measured). This is analogous to the similar creationist argument using the second law of thermo. The entropy change of a system is also measurable in simple systems, but once you get past trivial reactions it's practically impossible to even estimate. Like the entropy change of a bird during its development. Entropy, however, is not a vacuous scientific concept, I think all would agree, and so this particular argument against Joe's CSI doesn't show that it is vacuous either. It's the misuse of these concepts that's the problem. I'll argue that it's all misuse of probability, at core. Information theory is just a sub-discipline of probability, and entropy is also a probability concept (being exactly proportional to the log of the probability that you can predict the exact microstate a system is in at any given moment).
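
Theobald's point that entropy is at bottom a probability concept can be illustrated with a toy calculation. A sketch in Python, using Shannon's formula rather than the thermodynamic one (so the units are bits): a system whose exact microstate is hard to predict has high entropy, and a more predictable one has less.

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A "system" with N equally likely microstates: the chance of
# predicting the exact microstate is 1/N, and the entropy is
# exactly -log2(1/N) = log2(N) bits.
N = 1024
uniform = [1.0 / N] * N
assert abs(shannon_entropy_bits(uniform) - math.log2(N)) < 1e-9

# A biased system is more predictable, so its entropy is lower.
biased = [0.9] + [0.1 / (N - 1)] * (N - 1)
print(shannon_entropy_bits(uniform))  # 10.0 bits
print(shannon_entropy_bits(biased))   # well under 10 bits
```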

mrg · 29 March 2011

Venture Free said: Wow, check out the massively obfuscatory post made by Mr. Torley: http://www.uncommondescent.com/intelligent-design/why-theres-no-such-thing-as-a-csi-scanner-or-reasonable-and-unreasonable-demands-relating-to-complex-specified-information/
What's really amusing is that silly person Torley honestly thinks he knows what he's talking about. I glanced over it and saw a hodgepodge of gibberish.

Gabriel Hanna · 29 March 2011

Rolf Aalberg said: From my layman's perspective, is that another way of expressing the question I've been asking myself: Since life is a physical thing, what use is (conservation of) CSI if it can't be tied to physics?
I was thinking about it this way: Dembski says something like "natural laws can't produce new complex specified information". A snowflake is produced by natural laws, for example, so whatever information it has, I guess, isn't "complex specified information". Well, natural laws are fundamentally physical laws, so if natural laws conserve CSI, whatever it is, there must be an invariance in physical laws that corresponds. So if it were rigorously defined, we could figure out what the invariance would have to be and see if the laws of physics have that invariance. But I guess that Dembski's "conservation" law isn't one really. Insofar as there is any sense to what he says, I think he's saying that natural laws "can't" produce information in the same sense that you "can't" throw a glass of water into the ocean, wait a few weeks, and scoop up in your glass the very same molecules you put in. It's POSSIBLE, but so unlikely it can't be expected to happen in any amount of time the universe has had available.
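
That "possible but absurdly unlikely" intuition is exactly what a probability bound formalizes. As a hedged illustration (the 10^150 figure is Dembski's often-quoted universal probability bound, taken here at face value rather than derived), the arithmetic linking it to the 500-bit threshold mentioned in the post is one line:

```python
import math

# Dembski's universal probability bound is commonly quoted as roughly
# 1 in 10^150 -- an estimate (assumed here, not derived) of the number
# of elementary events available since the Big Bang.
bound = 1e-150

# Specified information of an event with probability p, in bits:
# SI = -log2(p). The 10^-150 bound corresponds to about 500 bits.
si_bits = -math.log2(bound)
print(round(si_bits))  # ~498, i.e. the neighborhood of the 500-bit threshold
```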

OgreMkV · 29 March 2011

Rolf Aalberg said: There's got to be something to that: I sleep both summer and winter with only a thin sheet as cover but quite often I'll wake up hot as hell during the night and have to let some of that heat out. But there's more to it: My wife complains of freezing even with the $1000 down duvet I bought her. Now that's a thermodynamic riddle far more important than CSI to me. (This thread needs a little diversion, doesn't it?)
What's even worse is that my wife insists on transferring some of my carefully stored heat to her feet by conduction. Back to the topic at hand: So, since everyone acknowledges that CSI can't be calculated, should we start demanding proof of the conservation of information, or is it irrelevant now? ID is built on some seemingly simple concepts, all of which are false. Me, I'm just going to keep asking for the same stuff just to show anyone who asks that, no, IDers can't do what they claim to do.

mrg · 29 March 2011

Gabriel Hanna said: I was thinking about it this way: Dembski says something like "natural laws can't produce new complex specified information". A snowflake is produced by natural laws, for example, so whatever information it has, I guess, isn't "complex specified information".
If you go along with the argument long enough, you find out that the only supposedly natural structure that contains "information" is the genome. Essentially, even on their own turf a better work would be that the genome contains "instructions", like a blueprint or a recipe or computer program or whatever. Basically, the idea is that somebody had to design the blueprint or recipe or program ... and so, reasoning by analogy, the genome being kinda sorta like those things is necessarily Designed, with the reasoning by analogy concealed by mathematical handwaving and inventive use of terminology. The beauty of this argument from their point of view is that heredity, as embodied in the genome, is a distinct feature of life not found in nonlife, and so they can declare that life is Designed while nonlife is not.

mrg · 29 March 2011

Make that "on their own turf a better word would be".

k.e., · 29 March 2011

mrg

...maybe self designed, as per requests from the customer - the environment, which includes the self-designers' offspring.

Why invoke actors?

Natural Selection may be an agricultural pre-modern term but at least it doesn't invoke ghosts.

The whole ID argument is a stage trope invoking Deus ex machina to satisfy the stupidest in the audience.

Free entry and "Please buy my book/CD/DVD"

DS · 29 March 2011

mrg said: The beauty of this argument from their point of view is that heredity, as embodied in the genome, is a distinct feature of life not found in nonlife, and so they can declare that life is Designed while nonlife is not.
Gee, now let's think about this. What else can genomes do that nothing else can do? I know, random mutations. What else can life do that nothing else can do? I know, cumulative selection. So, once again, the argument just boils down to refusing to admit that random mutation and natural selection can produce adaptations. Once again, unless it can be shown that adaptations do not increase information, then the whole thing falls apart. How could random mutation and natural selection NOT produce adaptation? How could adaptations NOT increase information? How could anyone argue that microevolution is just fine, but adaptations are not? How can anyone take any of this nonsense seriously?

mrg · 29 March 2011

DS said: How can anyone take any of this nonsense seriously?
You don't mean me, do you?

John Kwok · 29 March 2011

DS said:
mrg said: The beauty of this argument from their point of view is that heredity, as embodied in the genome, is a distinct feature of life not found in nonlife, and so they can declare that life is Designed while nonlife is not.
Gee, now let's think about this. What else can genomes do that nothing else can do? I know, random mutations. What else can life do that nothing else can do? I know, cumulative selection. So, once again, the argument just boils down to refusing to admit that random mutation and natural selection can produce adaptations. Once again, unless it can be shown that adaptations do not increase information, then the whole thing falls apart. How could random mutation and natural selection NOT produce adaptation? How could adaptations NOT increase information? How could anyone argue that microevolution is just fine, but adaptations are not? How can anyone take any of this nonsense seriously?
The problem with IDiots and other creos is that they assume that mutations are truly "random", without understanding that such mutations are constrained by the prior phylogenetic history of the population in which they occur. That is why no "random" mutation would ever produce a "crocoduck", especially when the lineages leading to crocodiles and birds had diverged from each other over the span of tens of millions of years.

mrg · 29 March 2011

"Random variation, directed selection."

"Directed by whom?"

"The Grim Reaper."

aagcobb · 29 March 2011

What I'm wondering is, whatever has happened at UD that they would allow a thread like mathgrrl's? Any theories?

Stanton · 29 March 2011

aagcobb said: What I'm wondering is, whatever has happened at UD that they would allow a thread like mathgrrl's? Any theories?
Either they've spontaneously developed a lack of quality control, or they're attempting to show the Scientific Community that Intelligent Design proponents can ask questions, too.

OgreMkV · 29 March 2011

aagcobb said: What I'm wondering is, whatever has happened at UD that they would allow a thread like mathgrrl's? Any theories?
My working theory is that Denyse O'Leary is the deepest sock puppet EVER!!!!

Joe Felsenstein · 29 March 2011

aagcobb said: What I'm wondering is, whatever has happened at UD that they would allow a thread like mathgrrl's? Any theories?
I think they are embarrassed by the way UD has become an echo chamber -- with voluminous irrelevancies dumped into the comments section by commenters like "bornagain77" and an endless stream of off-the-wall posts by Denyse O'Leary. Not a very good showpiece for the way ID is supposed to lead to advances in research. It makes the failings here at PT -- such as all the troll-chasing -- look good.

Mary H · 29 March 2011

The egg is a self-contained unit with sufficient nutrition to develop a chick. The incubation temperature is needed for enzyme function until late in the incubation, when the temperature must be dropped a little because the chick begins to make its own heat. The weight loss is primarily evaporation through the shell. Of course the shell is a gas exchange medium; how else would the aerobic chick breathe?

Henry J · 29 March 2011

I have to wonder, what's the point of trying to figure out the entropy of something as complicated as an egg?

Scott F · 29 March 2011

DS said: How could adaptations NOT increase information?
These are the same people who assert that duplicating an entire gene does not increase the "information" in a genome. So to answer your question, adaptation could not increase information by definition. Any "apparent" increase in "information" is only a misunderstanding of the definition of "information".

Henry J · 29 March 2011

Yeah, they like to point out that one step of a multiple-step process does not by itself complete the entire process. To that one might say "so what?".

Duplication of a DNA sequence can increase redundancy, in that subsequent changes to one copy won't break the other one, so that the original function remains intact. (At least that's how I understand it.)

OgreMkV · 29 March 2011

Depends on what you mean by 'information'. It does take slightly more effort to transmit two copies of something, even using compression, than it does to transmit one copy of the same thing (using the same compression).

By that definition, it is an increase in information. What ID always does is conflate 'information' with 'meaning', which is totally incorrect.

Of course, a simple point mutation can then alter the meaning of one sequence which changes the information content as well, resulting in more information (by any definition) and more meaning (using the conflated version of meaning).

eric · 29 March 2011

OgreMkV said: Depends on what you mean by 'information'. It does take slightly more effort to transmit two copies of something, even using compression, than it does to transmit one copy of the same thing (using the same compression).
Its not only that; sequence duplications can change the amount of protein produced, which can dramatically change development. IIRC, there's a couple of birth defects that result from having either too few or too many repeats. In this case the DNA = recipe analogy is useful...if you remember that your cells are very dumb bakers. When they see "add a cup of flour add a cup of flour" they aren't always smart enough to interpret the repeat as a typo that should be ignored. Instead, sometimes they read "add a cup of flour add a cup of flour" and add two cups of flour. Which can have a very dramatic effect on the cake. :)

Henry J · 29 March 2011

Is that what one might call flour power?

mrg · 29 March 2011

Get a haircut, hippie.

harold · 29 March 2011

What I’m wondering is, whatever has happened at UD that they would allow a thread like mathgrrl’s? Any theories?
It was the only way they could get a female (other than Denyse (blanking on last name) from Toronto) to associate with the site. And I'm not 100% sure that Denyse posts there.

Mike Elzinga · 29 March 2011

Mary H said: The egg is a self contained unit with sufficient nutrition to develop a chick. The incubation temperature is needed for enzyme function until late in the incubation when the temperature must be dropped a little because the chick begins to make its own heat. The weight loss is primarily evaporation through the shell. Of course the shell is a gas exchange medium how else would the aerobic chick breath?
I’ve been traveling, so it has been hard for me to keep up with the thread. Now that I am home, I should clarify an apparent disagreement about the egg thing.

My thinking about the energy cascades in which life exists and functions led me to say that the egg-to-chick transition was endothermic. I have no argument with Douglas Theobald’s point that there are exothermic reactions going on with stored materials in the egg. But I wonder if a chick would really be produced if the egg were kept in a calorimeter. It seems to me that, while there are certainly exothermic reactions that generate heat, this isn’t sufficient to sustain the process through to the complete development of a chick. That egg really does need to be held within a certain temperature range, and oxygen has to be brought in and any waste gases removed. After that the chick has to grow.

Now elementary physics of the first and second laws of thermodynamics applied to any energy-using device or organism requires that the energy be conserved and spread around according to the second law. We can’t get around that. How that energy input gets divided up - for triggering exothermic reactions that make the energy available from already stored chemicals, for bringing in matter that will be used in further processes later as well as adding to the growth of the system, for eliminating waste materials, for stimulating the electrical activity that coordinates interactions among subsystems, etc. - does of course depend on the particular system.

My argument is that all living systems require a net energy input over their lifetimes, thus giving them the net appearance of being endothermic over that period. I recognize that I can be temporarily strictly exothermic as I hold my breath and type, for example. But that is not sustainable. The egg-to-chick transition might be one of these, but I’m not up on the details.

Any living organism that endures ultimately takes in more energy than it uses in growth, physical activity, shuttling food and waste, and triggering stored energy releases. Any heat that is generated as a result of those energy dumps from stored chemicals can, of course, be used in any such processes as well as contributing to energizing other electrical signals and processes of coordination; and then the rest goes off as heat into the surrounding environment. The details are obviously system dependent; but the first and second laws hold for every individual living system as well as the entire collection of living systems taken as a whole system in itself.

As I mentioned in a previous comment, I wouldn’t apply the terms endothermic or exothermic to a living system. Those terms apply to specific subsystems; and they change with time.

And, related to the topic of the thread, I agree that the uses of entropy, “information,” and probability by the ID/creationists are some of the most egregious mistakes by anyone pretending to speak as a scientist. Entropy, information, order/disorder, probability, and the asserted need for some “higher law” to countermand the laws of chemistry and physics are at the heart of all ID/creationist “scientific” arguments. Their constant word-gaming with medieval literature and authority keeps their thinking medieval. Apparently nothing of the Enlightenment has ever penetrated their thinking.

harold · 30 March 2011

Mike -
My argument is that all living systems require a net energy input over their lifetimes
This is 100% correct. It has been years since I was required to calculate things like Gibbs free energy (I enjoyed that stuff at the time). I do retain a very strong interest in the "economy" of the biosphere.

All forms of life, over any reasonable time scale, net consume energy. That is unequivocal. All cellular life net consumes either direct solar energy, or chemical energy, or both. There is a great deal of inefficiency and a great deal of not-perfectly-efficient transformation from one form of energy to another along the way. It is very common for organisms to exude heat, but this should not be mistaken for net energy production.

In fact, the better basic physical measurement for understanding the energy economy of the biosphere is power. A very large amount of solar energy hits the earth's surface per unit time. A proportion of that is harvested by photosynthesis, much of which is transformed into chemical energy (with imperfect efficiency). Although only a fraction of solar energy is consumed, photosynthetic organisms must compete with each other for that fraction.

The chemical energy is then consumed by what can be very crudely conceived of as a vast number of "pyramids" of organisms, often with a high biomass of relatively small organisms that directly consume photosynthetic organisms on the bottom, and then layers of decreasing biomasses of larger organisms that increasingly consume solar energy less directly. However, large organisms that directly consume photosynthetic material are also present, so the concept of a pyramid is useful but very crude. Proportion of biomass tends to be related to individual size rather than directness of consumption, and even this approximation may only be true of multicellular organisms.

At any rate, all life needs a power source for sustainability. Life has other requirements, such as water supply and ambient temperature ranges, which are independent of the need for power supply from photosynthesis or "food".

Obviously, life cannot directly "consume" heat energy, but only solar energy and certain types of chemical energy. Ambient temperature can only have an impact through its effect on the chemical reactions which keep organisms alive. It can be "ideal", "suboptimal" (causing them to suspend metabolism/growth/reproduction and/or to consume MORE energy to fuel adaptations to the temperature), or it can be fatal.

While it is true that a human with a limited set of clothing and shelter options needs to consume more food per unit time, all else being equal, if working in cold weather versus working at an "ideal" temperature, this is because extra energy (actually power) is required to maintain the human body at an internal temperature that is compatible with life. It is not at all because heat energy from the environment can be harvested to drive energy-input-requiring biochemical reactions.

harold · 30 March 2011

Mike -

Also note that Douglas Theobald is, as well, 100% correct.

Exothermic reactions release heat (I'm bothering to make obvious statements because someone other than the regular posters may read my comments). Many, many biochemical processes are unequivocally net exothermic. This is not at all at odds with the fact that the biochemistry of the biosphere ALSO requires a power source, and that all living cells, studied at any reasonable scale of space and time, consume, rather than generate power.

Again, those aliens in the Matrix movies are idiots. Of course you can feed a human being (or other homeotherm) and then use the human as a weak heat source. However, it would be far more efficient just to burn the food directly.

harold · 30 March 2011

That should be "net consume" rather than "net generate", of course.

eric · 30 March 2011

Shorter Theobald: if you put a charged battery in a calorimeter, it can release stored energy into that closed system.

Shorter Elzinga: right, but since no battery is 100% efficient at storing energy, it's always going to take more energy to charge the battery than you get out of it.

harold · 30 March 2011

eric -
Shorter Theobald: if you put a charged battery in a calorimeter, it can release stored energy into that closed system.
Your summary is great and insightful, but I would change "stored energy" to the more specific "heat", since Theobald is talking about whether reactions are endothermic or exothermic. But basically, great summary. Now to stick it to the ID/creationists one final time for this thread -

1) We're talking about thermodynamics and energy because we find their false claims about information, the actual topic here, to be highly analogous to their false claims about entropy.

2) There is no reason to think that biological evolution represents a net decrease of the entropy of anything. (It is actually borderline meaningless to make statements about something so impossible to determine.)

3) If it did, it wouldn't matter, because local decreases in entropy are common and do not at all require magic to occur.

Mike Elzinga · 30 March 2011

eric said: Shorter Theobald: if you put a charged battery in a calorimeter, it can release stored energy into that closed system. Shorter Elzinga: right, but since no battery is 100% efficient at storing energy, it's always going to take more energy to charge the battery than you get out of it.
Very nice summary! Thank you. :-)

Douglas Theobald · 30 March 2011

Mike Elzinga said:
eric said: Shorter Theobald: if you put a charged battery in a calorimeter, it can release stored energy into that closed system. Shorter Elzinga: right, but since no battery is 100% efficient at storing energy, it's always going to take more energy to charge the battery than you get out of it.
Very nice summary! Thank you. :-)
No, that's not a good summary at all. Many people here, including Elzinga, are confusing enthalpy (H) with energy (E) and with Gibbs free energy (G). Contrary to popular belief, spontaneous reactions do not require energy input -- what they require is free energy (and I'm using this in the technical sense). For any and all reactions, including the development of a chick in an egg in a calorimeter, energy is always conserved -- that's the first law of thermo. The total energy is the same at the beginning and at the end of the reaction. Period. On the other hand, a charged battery can release free energy to power some reaction, but that DOES NOT mean that the reaction is releasing heat (i.e., that it is undergoing an exothermic process). Endothermic processes can be spontaneous, and they can release just as much free energy as an exothermic process (or more).
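
Theobald's claim that an endothermic process can be spontaneous is easy to check numerically, since spontaneity at constant T and P is governed by ΔG = ΔH − TΔS, not by ΔH alone. A sketch using melting ice (the ΔH and ΔS figures are approximate textbook values, not taken from the thread):

```python
# Ice melting is endothermic (dH > 0) yet spontaneous above 0 C,
# because dG = dH - T*dS governs spontaneity, not dH alone.
# Approximate textbook values: dH_fus = +6.01 kJ/mol,
# dS_fus = +22.0 J/(mol K).
dH = 6010.0   # J/mol, heat absorbed on melting
dS = 22.0     # J/(mol K)

def dG(T):
    """Gibbs free energy change of melting at temperature T (kelvin)."""
    return dH - T * dS

assert dG(298.0) < 0   # above 0 C: spontaneous despite being endothermic
assert dG(263.0) > 0   # below 0 C: water freezes instead
print(dG(298.0), dG(263.0))
```

Note that dG is close to zero near 273 K, the melting point, which is exactly where the process sits at equilibrium.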

Mike Elzinga · 31 March 2011

Douglas Theobald said:
Mike Elzinga said:
eric said: Shorter Theobald: if you put a charged battery in a calorimeter, it can release stored energy into that closed system. Shorter Elzinga: right, but since no battery is 100% efficient at storing energy, it's always going to take more energy to charge the battery than you get out of it.
Very nice summary! Thank you. :-)
No, that's not a good summary at all. Many people here, including Elzinga, are confusing enthalpy (H) with energy (E) and with Gibbs free energy (G). Contrary to popular belief, spontaneous reactions do not require energy input -- what they require is free energy (and I'm using this in the technical sense). For any and all reactions, including the development of a chick in an egg in a calorimeter, energy is always conserved -- that's the first law of thermo. The total energy is the same at the beginning and at the end of the reaction. Period. On the other hand, a charged battery can release free energy to power some reaction, but that DOES NOT mean that the reaction is releasing heat (i.e., that it is undergoing an exothermic process). Endothermic processes can be spontaneous, and they can release just as much free energy as an exothermic process (or more).
First law: Energy is conserved. Second law: Energy gets spread around.

Enthalpy and Gibbs free energy are convenient constructions that allow for work that is done against ambient pressure, as well as for changes in entropy and other phase changes in which energy is stored or released in the rearrangement and/or binding of atoms. One can also account for particles going in and out of a system. There is an entire set of Maxwell relations one can use depending on which state variables one is dealing with and what proportion of the total energy one is attempting to get a handle on. They allow one to use various system state variables in measurements that one can actually do in the lab. Chemists make good use of these things; and they should.

I know the difference; and I meant energy. Physicists tend to think in terms of the energy and are aware of that part of it that goes into molecular bonding and rearrangements, or escapes through radiation, conduction, and convection, or works against the surrounding environment. Those various Maxwell relations simply allow one to get at the proportions of the energy sent into whatever channels are available.

But in the end, it’s all about the energy. The energy is contained in the kinetic energies of particle motions and stored in the fields with which these particles interact. Every system is comprised of matter (particles) interacting with fields. And the laws of thermodynamics apply to any such system no matter the level at which one looks.
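
For reference, the "convenient constructions" in question have standard textbook definitions built from the internal energy E (nothing here is specific to the thread):

```latex
H = E + PV                 % enthalpy
F = E - TS                 % Helmholtz free energy
G = E + PV - TS = H - TS   % Gibbs free energy

% One of the Maxwell relations alluded to above, from dE = T\,dS - P\,dV:
\left(\frac{\partial T}{\partial V}\right)_{S}
  = -\left(\frac{\partial P}{\partial S}\right)_{V}
```

Each potential is the natural bookkeeping device for a different experimental situation - G, for instance, for processes at constant temperature and pressure, which is why chemists lean on it.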

JimNorth · 31 March 2011

C.P. Snow had something to say about thermodynamics:

Zeroth: "You must play the game."
First: "You can't win."
Second: "You can't break even."
Third: "You can't quit the game."

Douglas Theobald · 31 March 2011

JimNorth said: C.P. Snow had something to say about thermodynamics:
So did Arnold Sommerfeld: "Thermodynamics is a funny subject. The first time you go through it, you don't understand it at all. The second time you go through it, you think you understand it, except for one or two small points. The third time you go through it, you know you don't understand it, but by that time you are so used to it, it doesn't bother you anymore. " But what he forgot to mention is that the process cycles :)

Mike Elzinga · 31 March 2011

Douglas Theobald said:
JimNorth said: C.P. Snow had something to say about thermodynamics:
So did Arnold Sommerfeld: "Thermodynamics is a funny subject. The first time you go through it, you don't understand it at all. The second time you go through it, you think you understand it, except for one or two small points. The third time you go through it, you know you don't understand it, but by that time you are so used to it, it doesn't bother you anymore. " But what he forgot to mention is that the process cycles :)
:-) Many years ago, thermodynamics and statistical mechanics were separate courses in most physics departments. One of the frequent comments by students was, “Thermodynamics made no sense to me until I took statistical mechanics.” But, as it turned out, it often didn’t matter which order one took those two courses; so students who took statistical mechanics first would say, “Statistical mechanics made no sense to me until I took thermo.”

Henry J · 31 March 2011

But, as it turned out, it often didn’t matter which order one took those two courses; so students who took statistical mechanics first would say, “Statistical mechanics made no sense to me until I took thermo.”

Interesting! It sounds like one of those courses focuses on the theory, and the other on the methods used in dealing with it. Henry J

Mike Elzinga · 1 April 2011

Henry J said:

But, as it turned out, it often didn’t matter which order one took those two courses; so students who took statistical mechanics first would say, “Statistical mechanics made no sense to me until I took thermo.”

Interesting! It sounds like one of those courses focuses on the theory, and the other on the methods used in dealing with it. Henry J
That was approximately the case. The courses have since been combined into a single course (at both the undergraduate and graduate levels), with the result that it generally makes more sense.

Thermodynamics courses were often taught axiomatically; and such courses tended to leave out applications. Those thermodynamics courses that did teach applications often left students wondering what all those thermodynamic potentials were all about. It was often the case that students were encountering multivariable functions and partial derivatives for the first time in this course. Without the statistical mechanics insights, many of those state variables and Legendre transformations were very mysterious. On the other hand, without the thermodynamics applications, there was no reference to what was being clarified by the statistical mechanics derivations and concepts.

And often the thermodynamics that chemists learned was pretty much restricted to what they would typically use in the lab. So there was a lot of emphasis on things like enthalpy and the Gibbs and Helmholtz free energies, but little insight into how those connected through statistical mechanics to the atomic and molecular level. And as we have noted before, Frank L. Lambert has been active in getting misconceptions about entropy out of the chemistry textbooks.

The current major weakness in these courses in physics appears to be relating all this to the elementary concepts of kinetic energy, potential energy, total energy, and matter-matter interactions. It is implicit in the derivations and development of concepts that matter interacts with matter and with fields, but my own opinion is that many textbooks treat this almost as an aside without pointing out its fundamental importance. One doesn’t really become thoroughly aware of this fact unless one has to get into the lab and deal with eliminating as many of these interactions as possible while still being able to probe the system to learn what is going on.

SWT · 2 April 2011

Douglas Theobald said:
SWT said: To get a handle on the entropy production of a subject in a calorimeter (in this case the egg and its inhabitant), you have to work simultaneously with the mass, energy, and entropy balances for the system.
I'm really skeptical -- but if you are doing something exceptionally clever, I'm quite interested. As I understand it, the entropy change of the system is never directly measured, only inferred from other state function changes that are directly measured (from, for instance, ΔG or Keq and ΔH). If we have a calorimetric enthalpy change, and we can get the equilibrium constant for a reaction (and there's usually some experimentally tractable way to do that), then getting the entropy change of the system is easy. The problem here is getting an equilibrium constant or something analogous -- it's hard to even imagine what it could be in the case of an egg hatching a chick.
I want to expand a little on my previous comments about this, add a little information, and in the process modify my position a bit.

My original comment that one should be able to measure the change in the entropy of an egg during gestation was based on knowing that measuring the entropy production during such a process is well-documented. (It is in fact the entropy production that is calculated by merging material, energy, and entropy balances.) Engineer that I am, my natural response was that if I know the entropy production, I can integrate that over the course of the process to get the entropy change for the system.

I spent some time this week looking into this more thoroughly. I am reasonably familiar with the basic NEQ development of the dissipation function, but not with this particular application -- I knew that a former colleague did some work with entropy production, calorimetry, and aging, but never got into the details. For processes that are changing slowly during the calorimetric measurement, I think the entropy production measurements are fine for their intended uses. However, there is an implicit pseudo-steady-state assumption for the total rate of entropy change built into the analysis that makes the methodology inappropriate for measuring the total entropy change of the system as I'd envisioned. I'm sort of bummed about this, because I think it would be a cool study to do.

I did want to make a comment about the calorimetric measurements in the literature. The few gestation experiments I found were done in isothermal flow calorimeters, which are very well controlled open systems. It would be difficult to do these studies in a closed, adiabatic calorimeter, since the gestating chick needs oxygen and will overheat if the excess metabolic heat isn't dumped somehow. I'm not convinced that the experiment can't be done, but I have to admit that I haven't found a way through the analysis to the result of interest.
More to come if I need more distraction from things I'm actually supposed to be working on ...
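The open-system bookkeeping SWT describes can be sketched numerically. This is an illustrative toy (all rates and specific entropies are made-up constants, not measured values): the system's entropy balance is dS_sys/dt = (flow terms) + sigma, where sigma is the entropy production. Under the pseudo-steady-state assumption dS_sys/dt = 0, the measurable flow terms determine sigma directly -- which is the documented measurement -- but integrating sigma alone does not give the system's entropy change unless the flow terms are integrated too.

```python
# Toy open-system entropy balance (illustrative numbers only):
#   dS_sys/dt = m_in*s_in - m_out*s_out + Q_dot/T_b + sigma
# where sigma >= 0 is the entropy production rate.

m_in, s_in = 1.0e-6, 6.5      # kg/s and J/(kg K): inflow rate, specific entropy
m_out, s_out = 1.0e-6, 7.0    # kg/s and J/(kg K): outflow
Q_dot, T_b = -0.05, 310.0     # heat flow (W, negative = leaving), boundary temp (K)
sigma = 0.005                 # assumed entropy production rate, W/K

# Integrate the full balance over a 1000 s window (Euler steps).
dt, t_end = 1.0, 1000.0
S_change, t = 0.0, 0.0
while t < t_end:
    dS_dt = m_in * s_in - m_out * s_out + Q_dot / T_b + sigma
    S_change += dS_dt * dt
    t += dt

# Pseudo-steady-state route: set dS_sys/dt = 0 and solve for sigma from
# the flow terms alone.  This is what the entropy-production measurements
# yield; note it says nothing about the accumulated S_change above.
sigma_pss = -(m_in * s_in - m_out * s_out + Q_dot / T_b)

print(f"integrated entropy change of the system = {S_change:.3f} J/K")
print(f"sigma inferred under pseudo-steady state = {sigma_pss:.6f} W/K")
```

The gap between the two printed quantities is exactly SWT's point: the pseudo-steady-state assumption that makes sigma measurable is the same assumption that forecloses using those measurements for the total entropy change of the gestating egg.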

Mike Elzinga · 5 April 2011

SWT said: ... I'm not convinced that the experiment can't be done, but I have to admit that I haven't found a way through the analysis to the result of interest. More to come if I need more distraction from things I'm actually supposed to be working on ...
Between my traveling and getting behind on everything else, I haven’t had much time to comment.

Living organisms are difficult to study not only because they are so complicated, but also because they are much more “delicate” (i.e., bound together by much smaller potential energies). One not only has to account for all matter and energy flowing into and out of the system; the system itself has to be kept operational within rather narrow temperature and pressure limits. Those narrow temperature and pressure limits are needed to keep the internal processes functioning.

So this suggests approximately isothermal and isobaric measurements; and these measurements are going to have to capture flow rates of gases, food, and waste (their energy equivalents), as well as heat flow and temperatures at a designated “input” and “output” of the system. The need to provide an ambient temperature and pressure makes those heat flow and temperature difference measurements very difficult. Temperature differences between “input” and “output” are going to be small, and this will require sufficient sampling rates along with proper averaging and standard deviation estimates. The same can be said for the flow rates.

There are other issues to consider as well. Many processes in living systems are thermally driven. There is nothing strange about thermally driven processes in condensed matter systems; they happen at nearly every level, even in some of the simplest condensed matter systems. Thermocouples and other phonon-driven voltage gradients depend on temperature gradients. Within polarized molecular assemblies, electrons and charged or polarized molecules can be made to flow just by providing an ambient temperature and a tiny gradient. Any slight differences in mobility from one region to another can result in thermally activated flow of matter.
And because the depths of the mutual potential wells in condensed matter are so shallow (on the order of tenths of an eV for solid materials at room temperature, down to a few hundredths of an eV for matter near its liquid state at room temperature), there are many ways that processes can be initiated or driven just by maintaining them within a narrow temperature range.

We have the additional issue of energy being released from stored chemicals previously shuttled into such a system, as well as any activation energies required to release that energy. Those activation energies can come from thermally driven processes if the barrier heights are on the order of the binding energies of the constituents of the system. For living systems, those will be on the order of a few hundredths of an eV.

The calorimetric measurements done in a chemistry lab typically deal with far higher energies, especially when they are used to measure chemical reactions that take place in the range of an eV or so. The kind of calorimetry one has to bring to bear on processes taking place in the range of a few hundredths of an eV is going to have to be much more delicate and clever. Some of these kinds of measurements are already done with animals and humans that can cooperate with the experimental arrangements that measure the flow of matter and energy into and out of their systems; but still, these measurements are relatively crude. Alternatively, one can study individual subsystems of living organisms and then reconstruct the energy flows of the assembled total organism.
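The energy scales above can be made concrete with a quick Boltzmann-factor comparison (the barrier values are illustrative, but the thermal energy kT at physiological temperature is a standard number): a barrier of a few hundredths of an eV is crossed readily by thermal fluctuations, while an eV-scale chemistry barrier is astronomically suppressed.

```python
import math

k_B = 8.617e-5          # Boltzmann constant in eV/K
T = 310.0               # roughly physiological temperature, K
kT = k_B * T            # the thermal energy scale, ~0.027 eV

# Boltzmann factors exp(-E/kT) for a weak condensed-matter barrier
# (illustrative 0.03 eV) versus a typical eV-scale chemical barrier.
E_weak, E_chem = 0.03, 1.0   # eV
f_weak = math.exp(-E_weak / kT)
f_chem = math.exp(-E_chem / kT)

print(f"kT at 310 K            = {kT:.4f} eV")
print(f"exp(-0.03 eV / kT)     = {f_weak:.3f}")    # a sizable fraction: easily thermally driven
print(f"exp(-1.0 eV / kT)      = {f_chem:.2e}")    # vanishingly small without a catalyst or input
```

This ~16 orders-of-magnitude gap between the two factors is why calorimetry aimed at hundredth-of-an-eV processes has to be so much more delicate than ordinary reaction calorimetry.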