Since we have seen some poorly argued claims about entropy and its relevance to evolution, I will explore the concepts of entropy as they apply to genome evolution and show that simple processes like variation and selection are sufficient to explain the evolution of complexity, or information, in the genome.
While various ID authors (here and elsewhere) have argued that such natural processes cannot explain the evolution of information in the genome, the actual evidence contradicts any such suggestion.
In the past I have argued with various people on the topic of entropy. Jerry Don Bauer, a.k.a. Chronos, has shown some interesting confusions about the concept of entropy and believes that, using the laws of entropy, he has shown that macroevolution could not have happened.
First, some background information.
Jerry defines entropy and shows that entropy is always positive (no surprise here, since entropy is the log of a number larger than or equal to 1). Based on the fact that entropy is positive, he concludes that the tendency is positive and thus that complex macroevolution has been disproven:
S = log2W, S = log2(100,000,000), S = 26.5754247590989, therefore S is positive showing a tendency of disorder. Complex macroevolution would have violated one of the most basic and well proven laws of science. And since we know that nothing violates a law of science as a tendency, we can most assuredly conclude that complex macroevolution never occurred.
Jerry can be seen backtracking in later responses:
I certainly do not mean to imply that this is my work: “if W, the number of states by some measure, is greater than 1 then S will be positive by your formula. Thus any number of states will be “showing a tendency of disorder.” This is not my work and was done much earlier by such greats as Boltzmann and Feynman et al.
Further backing up and further obfuscating:
I did state that if S is positive, entropy is increased. And this is not a tendency in this case. It’s a fact of this specific example. I would ask you to examine your logic. If entropy increases then disorder has occurred. If S is positive then entropy has increased because S IS the entropy we are considering. If you are going to continue in this vein of logic, then I will have to ask you to show how the tenets of thermodynamics are just wrong in that everyone has it backward. Rising entropy denotes order and decreasing entropy denotes disorder.
Another whopper:
P1: With every generation in homo sapien, entropy increases in the genome.
P2: Complex macroevolution requires that the genome have a lower entropy over time through the generations.
Therefore, complex macroevolution did not occur
Gedanken quickly exposes the fallacies in Chronos’s argument
By the way, Chronos has not demonstrated either of his premises, P1 or P2.
He has not demonstrated that the entropy must be increasing, simply because his argument confuses the positive value of entropy with a delta or change of entropy in a positive direction. Even if there were an argument that demonstrated this was a positive delta, Chronos has decided not to give such an argument and relies on the value being positive — an irrelevant issue.
Then Chronos has not demonstrated that change over time requires a decrease in entropy. (Or any particular change in entropy — for example, changes occur and they are different, but they have the same number of informational states or microstates and thus S has not changed.)
Can anyone decipher this one?
Begging your pardon, but it’s not me saying that when entropy is positive it “tends” toward disorder. When entropy is positive there is no longer a tendency involved. It has already happened. The reaction is over and a passed event. Therefore the term tendency no longer applies. And anytime entropy is positive the system has disordered:
Gedanken explains what is wrong with Chronos’s argument
So what is wrong with Jerry's claims? Other than the confusion of tendency and value, that is.
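That confusion is easy to check numerically. Here is a minimal sketch in Python; the before/after probabilities are made up purely for illustration:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum_i p_i log2 p_i."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before selection: all four nucleotides equally likely at a site.
before = shannon_entropy([0.25, 0.25, 0.25, 0.25])  # 2.0 bits

# After selection: one nucleotide strongly favored.
after = shannon_entropy([0.85, 0.05, 0.05, 0.05])   # ~0.85 bits

print(before, after, after - before)
```

Both values of S are positive, yet the change, deltaS = -1.15 bits, is negative. A positive value of entropy by itself says nothing about the direction in which entropy is moving.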
In fact some excellent papers have been published by Schneider and by Adami and colleagues, which show how, contrary to Jerry's claims, entropy in the genome can decrease through the simple processes of variation and selection.
Despite the fact that Jerry seems to be blaming Feynman for his errors, it should be clear, or soon become clear, that Jerry is wrong.
I encourage readers to pursue the thread I pointed out, in which one can see several people make a significant effort to address the confusions exhibited by Jerry. If anything, it shows how widespread the abuse of mathematics has become.
As I have shown in some detail above, a correct application of entropy is not that complicated.
The following is a more in-depth introduction to the exciting findings about entropy and information/complexity.
Schneider provides us with some interesting data:
[Figure: information/entropy increase and decrease over time in Schneider's ev simulation]
Note how the information increases from zero to about 4 bits.
From PNAS we find:
Fig. 3. (A) Total entropy per program as a function of evolutionary time. (B) Fitness of the most abundant genotype as a function of time. Evolutionary transitions are identified with short periods in which the entropy drops sharply, and fitness jumps. Vertical dashed lines indicate the moments at which the genomes in Fig. 1 A and B were dominant.
In "Evolution of biological complexity", Adami et al. show:
To make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent information-theoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a sequence stores about its environment. We investigate the evolution of genomic complexity in populations of digital organisms and monitor in detail the evolutionary transitions that increase complexity. We show that, because natural selection forces genomes to behave as a natural “Maxwell Demon,” within a fixed environment, genomic complexity is forced to increase.
The approach is very simple: first, assume a genome with a site i which has probabilities p_A, p_C, p_G and p_T for the four nucleotides involved. One can show that the entropy for this site can be calculated to be

H(i) = - sum_{j in {A,C,G,T}} p_j log2 p_j

And the entropy tendency or information can be defined as

I(i) = Hmax - H(i) = 2 - H(i) bits

Now sum over all sites i and you find that the complexity or information is given by

C = sum_i I(i) = sum_i (2 - H(i))
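A minimal sketch of this calculation in Python (the aligned population of sequences is made up purely for illustration):

```python
import math
from collections import Counter

def site_information(column):
    """Information at one site, in bits: I(i) = 2 - H(i), where
    H(i) = -sum_j p_j log2 p_j over j in {A,C,G,T}, and 2 bits is
    the maximum entropy of a four-letter site."""
    counts = Counter(column)
    n = len(column)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return 2.0 - entropy

# A toy population of aligned 5-site sequences (illustrative only).
population = ["ACGTA", "ACGTT", "ACGAA", "ACGTC"]
info = [site_information(col) for col in zip(*population)]
print([round(i, 2) for i in info])  # [2.0, 2.0, 2.0, 1.19, 0.5]
complexity = sum(info)              # C = sum_i (2 - H(i)) ~ 7.69 bits
```

Fully conserved sites carry the maximum 2 bits of information each, variable sites carry less, and selection that fixes a nucleotide at a site drives that site's entropy down and its information up.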
Figure 3 above shows how entropy, after an initial increase, decreases at the same time that the fitness increases. This information increase/entropy decrease is exactly what happens when selection and variation are combined. Figure 3 shows some beautiful examples of evolutionary transitions.
I am not the only one who has reached this obvious conclusion.
Andya Primanda addresses the question "Can mutations increase information content?" from Chapter 3 of The Evolution Deceit by Harun Yahya.
Some excellent websites which expand on the materials presented here can be found below:
Adami: Evolutionary Biology and Biocomplexity
and
ev: Evolution of Biological Information
A recent paper which identifies some problems with Schneider's approach can be found here. Despite the problems, the authors recover most of the same conclusions:
Empirically, it has been observed in several cases that the information content of transcription factor binding site sequences (Rsequence) approximately equals the information content of binding site positions (Rfrequency). A general framework for formal models of transcription factors and binding sites is developed to address this issue. Measures for information content in transcription factor binding sites are revisited and theoretic analyses are compared on this basis. These analyses do not lead to consistent results. A comparative review reveals that these inconsistent approaches do not include a transcription factor state space.
Therefore, a state space for mathematically representing transcription factors with respect to their binding site recognition properties is introduced into the modelling framework.
Analysis of the resulting comprehensive model shows that the structure of genome state space indeed favours equality of Rsequence and Rfrequency, but the relation between the two information quantities also depends on the structure of the transcription factor state space. This might lead to significant deviations between Rsequence and Rfrequency.
However, further investigation and biological arguments show that the effects of the structure of the transcription factor state space on the relation of Rsequence and Rfrequency are strongly limited for systems which are autonomous in the sense that all DNA binding proteins operating on the genome are encoded in the genome itself. This provides a theoretical explanation for the empirically observed equality.
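For readers unfamiliar with the two quantities: Rsequence is the information content of the aligned binding-site sequences (computed per site as above), while Schneider's Rfrequency measures how much information is needed merely to locate the sites in the genome. A minimal sketch, with made-up numbers for the genome size and site count:

```python
import math

G = 4_600_000  # possible binding positions; roughly an E. coli-sized genome
gamma = 2_000  # hypothetical number of binding sites to be located

# Bits needed to pick out gamma sites among G positions: Rfrequency = -log2(gamma/G)
R_frequency = -math.log2(gamma / G)
print(round(R_frequency, 1))  # ~11.2 bits
```

The question the paper addresses is why the evolved Rsequence tends to match this Rfrequency.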
121 Comments
~DS~ · 25 May 2004
TY, that was excellent!
charlie wagner · 25 May 2004
Pim van Meurs · 25 May 2004
Contrary to Charlie's suggestion that these issues are a red herring, let me point out that it is the ID movement that is claiming that evolutionary mechanisms cannot explain the origin of information and complexity. Is Charlie suggesting that we blame Dembski for introducing the concept of entropy/information? Or in this case is Charlie arguing that Jerry's comments are irrelevant?
As far as organization is concerned, scale-free networks and gene duplication all help us understand such issues as modularity, degeneracy, robustness, and evolvability.
The real red herring may be the suggestion that the issue is one of 'organization'. But let's focus on the issue at hand in this thread, which addresses the arguments by ID proponents about information/entropy.
Jim Anderson · 25 May 2004
"Unfortunately, neither entropy, complexity or information has anything at all to do with evolution. So I guess you could call this a 'red herring'."
Thermodynamics has nothing to do with biology? Call me an uninformed, ignorant layman--and you should, because I am--but that bald assertion strikes me as more than a little, um, radical.
Jim Anderson · 25 May 2004
Oh, and replace "biology" with "evolution." Duh.
charlie wagner · 25 May 2004
Pim van Meurs · 25 May 2004
Pim van Meurs · 25 May 2004
zed · 25 May 2004
I was fairly amazed by Chronos' assertion that "With every generation in homo sapien, entropy increases in the genome."
Now my thermodynamics is a bit rusty, but wouldn't that mean our offspring would progressively degenerate into piles of primordial goo? Isn't the continued presence of life predicated on at least a zero net change in "entropy"?
And does anyone actually take this seriously?
Panda Bear · 25 May 2004
A good explanation of problems with common creationist arguments based on the SLOT can be found here or here
charlie wagner · 25 May 2004
chris · 25 May 2004
I think I have to agree that the issue of entropy in evolution is a bit of a red herring. First of all (and most importantly), the order of nucleic acids in the genome is not actually subject to the laws of thermodynamics. It certainly has aspects that bear a resemblance to familiar concepts in thermo, but others are unfamiliar. A current major area of research in physics is to develop a framework to describe certain nonequilibrium phenomena (of which evolution is an example) with familiar concepts of thermodynamics and statistical mechanics. I want to be clear that I'm not saying there isn't some definition of "entropy" that applies to evolution and maybe it always increases or decreases or whatever - I think the hope is that there *is* such a thing, actually - I'm only saying that you can't just lift thermodynamics and apply it to anything you want.
Besides which, increasing entropy is not really associated with increasing disorder at all. That's just an analogy. There are several famous examples of situations where increasing entropy INCREASES ORDER.
Town Crier · 25 May 2004
Pim van Meurs · 25 May 2004
Ben · 25 May 2004
Thank you, Town Crier. I was just about to slash my wrists. Christ I'm sick of this "design" crap.
Jim Harrison · 25 May 2004
Unless the designer of an organism can suspend the laws of nature, he or she remains subject to the Second Law. If naturally evolved organisms are impossible because of thermodynamic considerations, designed organisms are no less impossible.
Reed A. Cartwright · 25 May 2004
Art · 25 May 2004
About "Nelson's Law":
From the URL (and its "destinations"): "Life can be described as organization and Nelson's Law states that 'things do not organize themselves'. The evolution of life involves an increase in organization, a decrease in logical entropy. Nelson's law forbids this. Things cannot organize themselves without input from outside."
I don't know what "Nelson's Law" really is, but Charlie (or anyone else reading this) can refute the characterization seen in this thread in their own kitchen. Pour some salad oil in a bottle, add some water (or vinegar), and mix thoroughly. Then let the mixture (which should be the disorganized state that "Nelson's Law" predicts will be the final state) sit - untouched, completely isolated from all influences. (Heck, we can be really anal and put it in total darkness.)
We all know what will happen - the completely disorganized mixture will spontaneously, completely of its own accord, without any input of energy, information, design, or any other influence, organize into two perfectly-separated phases. The "thing" will most definitely "organize itself". It'll happen each and every time, without fail.
What's really neat is that the same chemical principles underlie the majority (IMO, at least) of organization in biology.
Reed A. Cartwright · 25 May 2004
Art,
Nelson's Law is a tautology that Charlie came up with and named after his middle name.
charlie wagner · 25 May 2004
charlie wagner · 25 May 2004
Town Crier · 25 May 2004
Reed A. Cartwright · 25 May 2004
charlie wagner · 25 May 2004
Town Crier · 25 May 2004
Reed A. Cartwright · 25 May 2004
Nomen Nescio · 25 May 2004
does it seem to anyone else that naming a made-up "law" after one's own self-assigned nickname is an even more pretentious social faux pas than simply naming it after oneself?
not, of course, that that alone is any argument against the "law" in question. nonetheless, bad taste, n'est-ce pas?
...
as for the "law" itself, i cannot say with certainty that it is flawed - frankly, it's too vaguely and verbosely stated for me to make a whole lot of sense of it. however, it does occur to me that - if it is to be of any use - it really ought to be able to determine whether or not the Oklo reactors were intelligently designed. could anybody with a better sense of what Charlie's trying to say perhaps take a stab at working out whether it does or not?
Art · 25 May 2004
Charlie,
Thanks for the clarification (I think). But I don't buy your distinction between "order" and "organization". At least in its entirety.
But even if I did - I think it's easy to see how storms (tornadoes, hurricanes, etc.) are organized according to some of your rules (those that do not reduce your terminology to tautology), and I could probably dissect the oil-water system into an organized, as opposed to ordered, one as well.
So, either way, I don't see how "Nelson's Law" is of any particular use, except as a vehicle to wander into semantic hair-splitting. It's much easier (and more correct) to accept that SLOTs, regardless of their derivation, simply cannot rule out evolution.
Jerry Don Bauer · 25 May 2004
LOL . . . This has got to be the silliest thing I've ever seen posted on the Net on "thermodynamics"-- To start with it's genetics and has nothing to do with anything I have ever discussed concerning devolution of the genome.
What's even more telling is the rest of the forum is like cool, go man, go . . . ..we are really doing some science here. PvM hasn't done a thing here with this nonsensical 'math' and I've seen him laughed out of other forums with this same stuff. Observe:
*******The approach is very simple first assume a genome with site i which has the following probabilities for the four nucleotides involved******
I did not assume ANY genome. I introduced a study by evolutionary biologists where all deleterious mutations were already identified. There is no probability involved with that. Sheeeze.
******One can show that the entropy for this site can be calculated to be******
LOL . . . OK, what is it, PvM? People he didn't calculate any entropy. He threw out an empty formula with no numbers in it and thinks he has calculated entropy. And everyone else: Hey good job, man. I understand this now.
*****And the entropy tendency or information can be defined as*****
You didn't define anything and again you didn't calculate anything because there are no numbers in your formula.
******Now sum over all sites i and you find that the complexity or information is given by******
You cannot sum over anything! You don't have any figures in this formula to sum over.
Of course, he then graphs all of this stuff he never calculated to start with and he does this before he doesn't calculate anything.
LOL . . . You people don't know he is just cutting and pasting stuff he doesn't even understand?
Jerry Don Bauer · 25 May 2004
******I was fairly amazed by Chronos' assertion that "With every generation in homo sapien, entropy increases in the genome."*******
Don't believe anything Francis (PvM, the very confused YECer) posts. He'll have you so confused you won't know whether you're coming or going because he is totally confused.
This came not from me (Chronos) but from a peer reviewed study published in Nature by two well respected evolutionary biologists named Eyre-Walker of Sussex University and Keightley of Edinburgh.
http://homepages.ed.ac.uk/eang33/
They found in this study that in man's evolutionary walk from the chimp, the genome devolved at the rate of 1.6 deleterious mutations per generation AFTER natural selection had weeded out others of these mutations. This is the number that accumulates in the genome. But normally when we discuss this study, the figures are rounded off to two.
Here is the abstract:
High genomic deleterious mutation rates in hominids.
Eyre-Walker A, Keightley PD.
Centre for the Study of Evolution and School of Biological Sciences, University of Sussex, Brighton, UK. A.C.Eyre-Walker@susx.ac.uk
It has been suggested that humans may suffer a high genomic deleterious mutation rate. Here we test this hypothesis by applying a variant of a molecular approach to estimate the deleterious mutation rate in hominids from the level of selective constraint in DNA sequences. Under conservative assumptions, we estimate that an average of 4.2 amino-acid-altering mutations per diploid per generation have occurred in the human lineage since humans separated from chimpanzees. Of these mutations, we estimate that at least 38% have been eliminated by natural selection, indicating that there have been more than 1.6 new deleterious mutations per diploid genome per generation. Thus, the deleterious mutation rate specific to protein-coding sequences alone is close to the upper limit tolerable by a species such as humans that has a low reproductive rate, indicating that the effects of deleterious mutations may have combined synergistically. Furthermore, the level of selective constraint in hominid protein-coding sequences is atypically low. A large number of slightly deleterious mutations may therefore have become fixed in hominid lineages.
http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=9950425
Those who don't care to read through the science can find a good read here:
http://www.open2.net/truthwillout/evolution/article/evolution_walker.htm
Pim van Meurs · 25 May 2004
Pim van Meurs · 25 May 2004
As far as Charlie is concerned, his appeal to personal incredulity, combined with a lack of any supporting argument, makes his Nelson's law quite meaningless and forces him to ignore the empirical data, experimental data, and theoretical arguments that disprove his claims. Until Charlie can show that he can correctly represent evolutionary theory, his claims have to be rejected due to their strawman nature and their unproven, in fact fallacious, assumptions.
Pim van Meurs · 25 May 2004
Pim van Meurs · 25 May 2004
Pim van Meurs · 25 May 2004
Jerry Don Bauer · 25 May 2004
*****With all due respect, this is not true. How do you measure the entropy in the genome and how do you demonstrate that natural selection reduces this entropy?*****
The thread would do well to stay away from vague concepts such as Nelson's law and go with the well-proven work of Boltzmann and Feynman. Of course, Boltzmann was the first to term the W in his S = K log W "the opposite of information", but Feynman refined this into a formula without Boltzmann's constant messing with Joules and degrees Kelvin, which really just clutters everything.
Feynman honed this down: "So we now have to talk about what we mean by disorder and what we mean by order. ... Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure "disorder" by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the "disorder" is less."
http://www.panspermia.org/seconlaw.htm
Feynman also gave us the formula with which we could calculate this: S = log2(W) where S is entropy and W is "those numbers of ways" or the total possible microstates of any given system.
We can find W, because the researchers tell us there are about 100 million possibilities that could mutate, so four mutations is not that big a number, relatively speaking. (Please read the BBC article I previously posted for this figure.) But this will be positive entropy, thus we can surmise that entropy has risen in each, or at least most, generations for the last six million years, and there is no evidence at all to suggest it hasn't been this way throughout the entire 2 billion year process.
S = log2W, S = log2(100,000,000), S = 26.5754247590989, therefore S is positive showing a positive tendency of disorder as we would expect.
But this is only statistical entropy and if we are to figure reactional entropy, we will have to calculate actual deleterious mutations from generation to generation.
We can view the deleterious mutations as actual entropy because, in this case, this is the actual disorganization. Eyre-Walker tells us that the human genome is estimated to carry 1000 negative mutations, so let's get that entropy: S = log2 W, S = log2(1000), S = 9.96578428466209 --
Now let's calculate the entropy after two more genes mutate: S = log2 W, S = log2(1002), S = 9.96866679319521 --
It is here that we do that subtraction you so badly wanted to do. deltaS, the actual change in entropy, is: deltaS = S2 - S1 = 9.96866679319521 - 9.96578428466209 = 0.00288250853312
Questions, comments, or just ignorant trolls: it's up to you guys.
Jerry Don Bauer · 25 May 2004
NOTE: I'm going to answer exactly one more of your ignorant posts, Francis. So after this one, you sum it all together and know that you had your chance to debate this. You somehow seem to have become one of my groupies. Go troll Mike Gene or something.
*****I was responding to your 'calculations' of entropy which you concluded were positive and thus showed a tendency to disorder. This incorrect application of mathematics is what I am addressing here. And contrary to your claims that
P1: With every generation in homo sapien, entropy increases in the genome.
P2: Complex macroevolution requires that the genome have a lower entropy over time through the generations.
Therefore, complex macroevolution did not occur*****
No, you are not answering 'my calculations' or my 'claims.' I'm sending you to papers by reputable scientists. You will not read them even when I post them verbatim for you, and you always go back to "Jerry claims this and that." Will you pull your head out of your arse and know that it's not me saying this stuff but scientific studies? These calculations are just math describing them.
*****I understand that your only response to solid mathematics is ridicule. But we often ridicule that which we do not comprehend. In fact not only is this math not nonsensical but it is also supported by actual examples (did Jerry miss the graphs) which show the evolution of entropy under the influence of mutation and selection.******
LOL . . . Solid mathematics? Is this a joke? You calculated nothing. This is the stupidest forum I've ever been in as no one bothered to muse . . . well gee, what were the conclusions? Um . . . are there actually any figures in those formulas? What were the entropic conclusions and the sums? Nah . . . it's just, well gee whiz, this seems anti-ID. Let's run with it. And look at all the PhDs listed as the sponsors. Unbelievable. You people are just deceiving yourselves. The American public is nowhere near as stupid as you have become.
******It is trivially simple to replace the parameters in these formulas with actual values which is what was done to create the graphs.*****
Oh. This should be good. Let's see you do it. And how did you create the graph without trivially using actual values? <;0)
The rest of your post is typical Francis the YEC/atheist turned I don't know what the crap I am drivel who spouts Bible scripture while espousing atheism in the same posts conundrum.
You get one more post, troll. Make it good, dude. (or dudette, you post under female names as well)
Steve · 25 May 2004
Slowly I'm learning to come here and read the posted articles, but not the comment sections. Reading 7000 words of arguing with IDers who are using information theory and thermodynamics like a monkey would use an oscilloscope is just not valuable for me. I still read them sometimes out of a macabre fascination, though.
The articles are damned good, btw, kudos to the contributors. I hope a cool article in the future will explain to me how IDers fail to understand what it means that evolution-based research results in more papers in PNAS, Science, and Nature alone every week than I can even read the abstracts of, while IDers never get any research published. How can they think so much about the topic, and not realize their 'science' doesn't produce publishable results? That's really more of a psychology topic than a biological one, though.
steve · 25 May 2004
I have a request, too--articles about antibiotic resistance. For some eventual research on the physics of Syntaxin mutants, I've been hip-deep in E. coli DH5a, plasmids, PCR, sequencing, etc., and there's really some very cool stuff involved in growing bacteria which are resistant to a specific antibiotic like kanamycin, in kanamycin, to prevent contamination. It's a subject laypeople might enjoy hearing about. Just a few months ago I had no idea about the subtle evolutionary bits involved.
Jerry Don Bauer · 26 May 2004
****I have a request, too---articles about antibiotic resistance.*****
Here's the only article you will ever need on it. Certain organisms in any species will always be more susceptible than others to an antibiotic and some will be resistant to it. As more antibiotics are used, the ones that are susceptible are killed off and the ones who live will be the ones who have offspring.
These offspring are then also resistant to that antibiotic.
Others might take a book to explain this. But there you go and that's the truth.
Jack Krebs · 26 May 2004
Steve writes, "Slowly I'm learning to come here and read the posted articles, but not the comment sections. Reading 7000 words of arguing with IDers who are using information theory and thermodynamics like a monkey would use an oscilloscope is just not valuable for me. I still read them sometimes out of a macabre fascination, though."
I hope that the quality of the comments section improves, Steve. Don't give up.
Navy Davy · 26 May 2004
The reason the comment section often degenerates is actually a STRUCTURAL failing on the part of Panda's Thumb. Observe the dynamic:
1. Poster tries to make a point, imbued with scorn and ad hominem, which often specifically NAMES Jerry Don Bauer as the subject;
2. But poster refuses to engage JDB (ask civil questions, respond to civil questions posed);
3. JDB responds, fending off 5-10 persons sniping from the sidelines, sometimes with ad hominem in response and sometimes over-enthusiastically;
4. Thread peters out. Few are satisfied.
Of course, I have offered to mediate a civilized debate, where advocates actually get to TEST their propositions, and we get to learn something. JDB has pledged to participate and abide by standard debating rules. Kudos, Jerry.
But, alas, the "higher-ups" at PT cannot spare a solitary thread for such an orderly debate.
I thought scientists were, you know, into (1) evidence, (2) testing, (3) prediction, (4) logic.
I guess not. Let the brawlin' continue!
Cheers,
Navy Davy
Pim van Meurs · 26 May 2004
Jack Krebs · 26 May 2004
How about taking your idea to ARN, where you or Jerry would be free to start a post and propose details of this debate?
Andrea Bottaro · 26 May 2004
Pim van Meurs · 26 May 2004
Nate Barrister · 26 May 2004
This was good.
Whistle Blower · 26 May 2004
Navy Davy · 26 May 2004
Pim,
"Navy, why not take Jack Krebs' idea and implement such a discussion on ARN, which would be an excellent place for Jerry to discuss his ideas."
Not a bad idea, I might do that. But, you POSTED here an attack specifically on Jerry (not simply his ideas on entropy). I quote:
Jerry Don Bauer, a.k.a. Chronos, has shown some interesting confusions about the concept of entropy.....
So, for the life of me, I cannot understand why the Evolution proponents would rather snipe from the sidelines at ID folks, rather than directly engage their ideas in a civil manner.
To me, it just seems like a wasted opportunity. Particularly, because I see a whole lotta intellectual acumen here that, if channelled properly, would really be influential and informative.
Cheers, Navy Davy
Whistle Blower · 26 May 2004
Navy Davy · 26 May 2004
Whistle Blower,
I give you a resounding, "Ho-hum."
Jerry Don Bauer · 26 May 2004
******Jerry continues to exemplify his lack of understanding when he states: "therefore S is positive showing a positive tendency of disorder as we would expect."
This whopper was debunked in depth on ISCID. That Jerry still repeats this silly notion almost verbatim is quite fascinating.******
I didn't just posit this; unlike you, I actually calculated it. And Francis, why don't you quit sending people to other forums hoping they won't read all the posts. If this whopper were debunked by you before, you wouldn't have much trouble doing it again here, would you?
*****If Jerry cannot even understand the simple mathematical foundations for Shannon entropy as they apply to the genome, it does not come as a surprise that he fails to comprehend how one can calculate the actual entropy from abstract formulas.****
Oh, I think I understand Shannon/Weaver entropy just fine. I at least understand that you have to put some math into a formula in order to get something out of it.
*******Let me know what part of the derivation of Shannon entropy or the calculations based upon these formulas confuse you Jerry.******
You didn't do any calculations. Not one. You only threw out empty formulas that you cut and pasted hoping to confuse people. Works, don't it. ;)
******And then there is Jerry who cannot even calculate the entropy in the genome to show support for P1 (entropy in the human genome always increases).******
But I did calculate it. Did you not read the posts? Now quit beating around the bush hoping this will go away. Refute the math or admit you cannot.
Jerry Don Bauer · 26 May 2004
*****Originally let's assume that the nucleotide at the location of the mutation has an equal probability of being A, C, T or G. This means that this location has maximum entropy.*****
LOL . . . No it doesn't. You think something has maximum entropy just because it exists? All things with microstates such as this have statistical entropy, and Feynman would calculate it as S = log2(4) = 2. But this is not maximum entropy because if a deleterious mutation happened here, disorder can go up. Think these things through before you post them, Francis.
******After the mutation if the probability of the nucleotide becomes 1 for one of the 4 bases and zero for the others because the mutation becomes fixed in the genome, the entropy drops to zero.******
Entropy drops to zero? Calculate this for us. And haven't we just left the second law and went to the third?
Third law of thermodynamics: "The third law of thermodynamics states that the entropy of a pure perfect crystal is 0 at 0 K: S(0K) = 0. At 0K the atoms in a pure perfect crystal are aligned perfectly and do not move. Moreover, there is no entropy of mixing since the crystal is pure. For a mixed crystal containing the atomic or molecular species A and B, there are many possible arrangements of A and B and there is therefore entropy associated with the arrangement of the atoms/molecules."
Quick people, back away from the table. Francis' human genome just froze solid!
*****Before the mutation and fixation there was maximum confusion as to which nucleotide would be found at the location or in other words maximum disorder.*****
And how do we calculate this maximum confusion?
*****After the mutation got fixated, there is full predictability of the basepair at this location or maximum order.******
You just flunked the genetics test too. You don't understand the word 'fixated.' This applies only to populations, not individual genomes. Nothing is fixed in an individual because of a mutation. The reason is that this same nucleotide can always mutate again. In fact, it may even be a beneficial mutation.
Pim van Meurs · 26 May 2004
Jerry Don Bauer · 26 May 2004
What is this deal with you writing posts about me rather than to me? <:0)
******Jerry in the same breath claims he understands Shannon entropy and shows confusion of said entropy with thermodynamical entropy when he states
Entropy drops to zero?
I will show in a follow up posting how to calculate this but it is trivial since p_i log p_i will be zero for this site since either p_i is zero or log p_i is zero (p_i = 1)*******
LOL . . . I don't want formulas, I want math. Do some with real figures. But I do want to know how you can mathematically show your genome as being frozen solid. And if you show statistical entropy as zero in a system with microstates, you will have just revolutionized science, my friend. Deleterious mutations become impossible and genetic defects are just a myth of science.
You'll never have to buy a new car again the rest of your life. Just sit your old one out in the sun and according to your math, it will magically regress into a new one. And you will have just refuted the second law of thermodynamics.
****Jerry still seems to be confused about entropy being a positive number (which it always is) and entropy having a tendency to disorder because it is positive.*****
It is not always positive. Sure, statistical entropy in a system with microstates is, and it grows larger when we are considering more microstates. But wouldn't you expect it to be? The second law states that with any spontaneous reaction entropy will tend to increase. I'm showing this mathematically.
But statistical entropy is not reactional entropy. And in chemical reactions entropy can go down and often does.
*****As Gedanken and various others have shown, this is a meaningless and erroneous concept. That Jerry continues his claims in spite of the facts is further evidence that Jerry may have some problems with entropy, and the mathematics surrounding this at the surface simple concept but oh so tricky in the hands of creationists.*****
LOL . . . .What does this mean and aren't you a YEC? Stick to the subject, sheeze.
*****I apologize for presenting a mathematical foundation for my claims and thus confusing you with 'empty formulas' but no worry, not only will I walk the 'confused people' through how to apply Shannon entropy but it will also support what I have already shown using data from Schneider and Adami, namely that entropy decreases under the influence of mutation and selection.*****
Apology accepted. Now quit trying to dodge the subject and go back and tackle the math I gave you. Remember I gave you real figures in formulas. Not just the formulas. And how do you ever hope to defeat my math using something other than my formulas?
****Well, let me try to explain. Maximum confusion means that no nucleotide was preferred or in other words that the probability for the nucleotide being one of (A,C,T,G) is equal to 1/4. Applying the formula for Shannon entropy we find that for a uniform distribution of probabilities, entropy is maximum.*****
So you mean by this that before your genome was frozen it pretty much became delirious? And gee . . . I'm rather shocked to learn that the probabilities of one out of four nucleotides mutating is 1:4.
*****Jerry ignores most data that shows fixation of beneficial mutations.*****
And this information is found where in that paper? They only considered deleterious mutations. Cut and paste all of this. You're good at that.
****we are here to educate and help out when issues of confusion arise and in a soon to be released posting I will gently walk through these straightforward and, yet to some confusing, calculations to show further support for my thesis.*****
You probably ought to write a thesis that refutes my thesis, don't you think?
****math into a formula? Or does Jerry mean, actual values? If that is the case, could Jerry apply these formulas to his example? As I have shown, if, as Eyre-Walker argues, slightly deleterious mutations become fixated in the human genome then by all measures of entropy, the entropy will decrease.*****
As they say in Arkansas, "Do what?"
*****But we already knew that disorder, entropy and information do not really say anything much about the nature of the fixated mutation. After all this should be obvious to anyone familiar with Shannon entropy.******
You don't even know what Shannon/Weaver entropy is. You are trying to apply entropy that deals with the loss of signal and the addition of noise added into a telephone line to the genome. Oh, you water it down into some genetics you don't understand. And remember that I was personally present when Bill Dembski, a math PhD actually examined this stuff and told you personally that nothing you are cutting and pasting is even germane to the subject. You, my friend, are as full of crap as a Christmas turkey and you spoil the truth of science for both sides.
Well, you trolled me for a debate and since you seem to be the highest intellect I can find on this forum: Your turn.
Frank Schmidt · 26 May 2004
Jerry Don Bauer · 26 May 2004
******All this is from the long-lost recesses of my P. Chem. knowledge, so subject to correction by real physicists and physical chemists, whom I invite to do so.*****
If there's any of them on this forum, I can assure you they won't be addressing this thread.
So what do you think the third law of thermodynamics is? Do you not believe it exists? I mean you didn't try to correct me. You just seemed to deny that there was such a critter.
DS · 27 May 2004
From the Wikipedia:
The 3rd Law-
This states that the entropy of a system at zero absolute temperature is a well-defined constant. This is because a system at zero temperature exists in its ground state, so that its entropy is determined only by the degeneracy of the ground state.
A special case of this is systems with a unique ground state, such as crystal lattices. The entropy of these systems as defined by Nernst's theorem is zero (since ln(1) = 0).
"The entropy of a perfect crystal lattice at absolute 0 is 0"
This is at least one formulation of the 3rd law and is the most common format found in several resources I checked.
Frank Schmidt · 27 May 2004
However, the zero entropy state doesn't imply zero motion, which is what Jerry stated. My old textbook points out that, for example, the entropy of the nucleus is not known when the third law is stated in this formulation. As with all thermodynamic measurements, we can look at changes in the quantities only, and assigning S=0 is a convention that isn't always valid. (Denbigh, The Principles of Chemical Equilibrium, 2nd ed. Cambridge U. Press, 1968) Note that my statement is subject to correction since I don't follow this primary literature at all - hence my request for enlightenment if necessary.
I brought up the point, not to revisit a course that I took long ago but rather to point out that thermodynamics is a specialized subject, involving lots of work and study before one can make the arguments cogently. Pulling quotes from a secondary or tertiary source to make a philosophical point is a dangerous business - whether the subject is thermo, information theory, or dare I say evolutionary biology.
Pim van Meurs · 27 May 2004
Navy Davy · 27 May 2004
Pim,
And remember that I was personally present when Bill Dembski, a math PhD actually examined this stuff and told you personally that nothing you are cutting and pasting is even germane to the subject.
Sorry, the lawyer in me has gotta ask:
1. Did you ever converse with William Dembski?
2. If so, Was Jerry Don Bauer ever present during your conversation with William Dembski?
3. If so, Did you and William Dembski ever discuss entropy?
4. If so, Did William Dembski express any disagreements with your views on entropy?
Cheers, Navy Davy
Pim van Meurs · 27 May 2004
Navy Davy, why don't you ask Jerry. But the answer is that I never met Dembski in person, and I must thus conclude that when Jerry stated 'I was personally present' he is referring to an ISCID discussion.
Perhaps Jerry can clarify these questions?
Navy Davy · 27 May 2004
Pim,
1. So, you were at some ISCID "discussion" where Dembski was speaking/writing about principles of entropy?
2. And Jerry was there, too?
Not that I really care about these peripheral matters, but if your answers above are "yes" and "yes," it sounds like Jerry was pretty close to the mark. IIRC, you told Jerry to "stop making things up."
Cheers,
Detective Navy Davy
Pim van Meurs · 27 May 2004
Navy Davy, I do not remember Dembski contributing to these discussions and in my quick search I found nothing that suggests that Dembski contributed. In fact, Dembski's contributions to ISCID are minimal as it comes to actual discussions.
Hence my question to Jerry because to my best recollection this never took place.
Whistle Blower · 27 May 2004
Navy Davy · 27 May 2004
Pim,
Final question:
1. Was this ISCID "discussion" you attended on-line or in person?
I reckon, if you don't remember Dembski discussing anything at an ISCID "discussion," in which you participated, then, Hell, we got ourselves an impasse.
Much obliged,
Navy Davy
Pim van Meurs · 27 May 2004
I suggest we let Jerry support his claim. I am more than willing to admit that I am wrong if he can provide for supporting evidence for his claims.
These ISCID discussions are on-line; Dembski occasionally contributes but seldom participates. Check out http://www.iscid.org/boards
Navy Davy · 27 May 2004
Pim,
I just did check out the ISCID discussion boards. There seems to be a lot less vitriol and ad hominem than here. Incidentally, I saw some pretty informative comments by both you and Jerry over there.
WB,
Perhaps the bar for moral conduct is far lower in your state than it is in mine.
Perhaps you are a geek, while I am not:)
Cheers,
Navy Davy
Whistle Blower · 27 May 2004
Jerry Don Bauer · 27 May 2004
******I am still confused*****
Yeah, I can tell. ;)
****on one hand Jerry claims he understands Shannon entropy, on the other hand he confuses Shannon entropy with thermodynamic entropy.*******
That is hilarious. It's you trying to use this irrelevant math to determine this entropy, and this is NOT thermodynamic entropy. That kind of entropy deals with heat and energy and is expressed in Joules per degree Kelvin. I used Feynman's formula and sent you to the site showing you what kind of entropy it is. Do you understand the difference between thermodynamic, logical and information entropy? I don't even think you know what you are attempting to calculate.
Also, if you refute my math, you are going to have to use the same math I did and show me where the mistakes in it are. Using your logic, I would try to show that 4 x 4 = 16 to invalidate 2 + 3 = 4. Not a lot of logic there.
******Not only does Jerry not appreciate the meaning of mathematics (formulas are mathematics, Jerry) but he also confuses the concepts of Shannon entropy. Even after providing Jerry with the formula H = -sum_{i} p_i log p_i******
Empty formulas show nothing. Do you really think that people on here are so stupid not to know that in order for a formula to show something it must have some figures in it? And what is it you think you are showing if you manage to show that H = 0? Do you know what H is and what it normally represents in math?
******and pointing out that when a mutation becomes fixated the Shannon entropy for that particular nucleotide becomes zero because p_i becomes one for the mutation and zero for the three other nucleotides (I am assuming for convenience a fixated point mutation). Thus the formula with p_i=0 for i=1..3 and p_4=1 shows that H=0******
Mutations cannot become fixated in an individual genome . . . .Sheeze . . . .this is twice now I've corrected this. Can you not see that a nucleotide that mutates can mutate again or change by breeding? Fixations are only related to individuals as those individuals relate to the entire population. We don't have a population, so how can we have any fixation that relates to one?
If you don't believe me, why don't you just look these words up:
Fixation: "Evolutionarily, a state where every single individual within a population is homozygous for a particular allele (and therefore the phenotype that the allele confers). For example, in a population where everyone has blue eyes, the allele for blue eye color is fixed and everyone will continue to have blue eyes in the future, as long as no new individuals come into the population from elsewhere."
So do you understand that if this individual you are fixating ever happens to breed outside its population that you may no longer have any fixation? Or that this particular nucleotide can always mutate again?
*******Stop making up things Jerry. Your 'imagination' is only outperformed by your inability to apply mathematical concepts in a defensible manner and your grandstanding when caught in the act.********
LOL . . . .Did you miss all that math I posted? Do you want to pretend it was never posted? Well here it is again just so you can't state that you overlooked it:
S = log2W, S = log2(100,000,000), S = 26.5754247590989, therefore S is positive showing a positive tendency of disorder as we would expect.
But this is only statistical entropy and if we are to figure reactional entropy, we will have to calculate actual deleterious mutations from generation to generation.
We can view the deleterious mutations as actual entropy because, in this case, this is the actual disorganization. Eyre-Walker tells us that the human genome is estimated to carry 1000 negative mutations, so let's get that entropy: S = log2 W, S = log2(1000), S = 9.96578428466209 ---
Now let's calculate the entropy after two more genes mutate: S = log2 W, S = log2(1002), S = 9.96866679319521 ---
It is here that we do that subtraction you so badly wanted to do. deltaS, the actual change in entropy, is: deltaS = S2 - S1 = 9.96866679319521 - 9.96578428466209 = 0.00288250853312
Now refute this or lose this portion of the argument. Here is the math. You don't need to bring in unrelated and irrelevant math, as I've presented this now in three forums, the last of which you simply left.
Of course, you'll just go to the fourth using the same old trite trickery and hope no one will come along and notice.
Pim van Meurs · 27 May 2004
Jerry: I don't even think you know what you are attempting to calculate.
Dear Jerry, I was merely pointing out that you were confusing Information Entropy (Shannon) with thermodynamical entropy when you appealed to the third law of thermodynamics. Nothing wrong with that; these are tricky concepts.
Jerry: So, if you refute my math, you are going to have to use the same math I did and show me where the mistakes in it are.
I am showing that your math is irrelevant to entropy calculations for the genome. That's all Jerry.
Jerry: Mutations cannot become fixated in an individual genome . . . .Sheeze . . . .this is twice now I've corrected this. Can you not see that a nucleotide that mutates can mutate again or change by breeding? Fixations are only related to individuals as those individuals relate to the entire population. We don't have a population, so how can we have any fixation that relates to one?
Was it not you, Jerry, who quoted Eyre-Walker's paper to show that slightly detrimental mutations become fixated in the genome? What did you have in mind quoting Eyre-Walker if that is not what you were trying to argue? In fact your own statements once again contradict you: you made claim P1, namely that the entropy in the human genome has been increasing, and pointed to Eyre-Walker's paper.
Jerry, I do not understand why you keep making the same error about positive entropy and tendency. But it was wrong in the past and is wrong now.
Jerry: Now refute this or lose this portion of the argument. Here is the math. You don't need to bring in unrelated and irrelevant math, as I've presented this now in three forums, the last of which you simply left.
Why should I refute irrelevant calculations Jerry? Your calculations have no relevance to entropy in the genome, and when correctly applying such calculations they show that you are wrong. What should I do beyond this Jerry?
Jerry: Of course, you'll just go to the fourth using the same old trite trickery and hope no one will come along and notice.
Who is using nonsensical math and arguments to support his opinions here, Jerry? It seems obvious to all that it is you.
In fact, in the case of detrimental mutations, entropy decreased for the simple reason that the number of possible states has decreased. While before, these 1000 (assume for the moment) point mutations were free to take on various base pairs (A,C,T,G), now they have become fixed as detrimental mutations. Thus, correctly applying the concepts of entropy to the genome, the entropy has decreased significantly.
Not surprisingly, Jerry seems to still believe that positive entropy is an indicator of tendency and that it is enough to take the log of the number of mutations to estimate entropy. Using Jerry's logic and applying it to beneficial mutations, he would have to argue that entropy also increased. No matter what mutates, entropy always increases; neutral, beneficial or detrimental does not matter.
A real calculation would have considered the before state and after state for all nucleotides.
So let's assume that we have a 1000-base-pair genome and that the nucleotides in the genome are initially uniformly distributed, that is, fully random; not surprisingly, we find that the entropy is maximal for this situation. But now let's assume that one base pair becomes fixated. It does not matter whether the mutation is slightly detrimental or beneficial; let's assume that for whatever reason the mutation spreads through the population, as Eyre-Walker has argued. Thus the entropy has to drop, since it was at its maximum value; in fact the entropy drops by 2 bits.
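A minimal sketch of that 2-bit drop in Python, assuming the site goes from a uniform distribution over A, C, T and G to full fixation:

```python
import math

def H(probs):
    # Shannon entropy in bits; zero-probability terms are skipped,
    # since p * log2(p) -> 0 as p -> 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]  # before: maximum entropy at the site
fixed = [1.0, 0.0, 0.0, 0.0]        # after: one nucleotide fixed in the population

print(H(uniform))             # 2.0 bits
print(H(fixed))               # 0.0 bits
print(H(fixed) - H(uniform))  # -2.0 bits: the entropy drops by 2 bits
```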
I know, these are complicated issues, but it is you who has unnecessarily complicated matters by confusing entropy and tendency, or claiming that you understand Shannon entropy and then confusing it with thermodynamical entropy.
Good thing that there are friendly people here who are willing to spend the time correcting your mistakes :-)
Joe P Guy · 27 May 2004
Frank Schmidt · 27 May 2004
As I recall, Shannon used the term "entropy" in his discussion about information because his version (potential information loss - remember he worked for the phone company) had the same mathematical formulation as did the statistical definition of entropy. But it was understood as an analogy only. Hence the confusion - the two terms are not exactly the same. I'm sure Shannon never got it confused, but it sure can be a problem for us non-geniuses.
Pim van Meurs · 27 May 2004
Pim van Meurs · 27 May 2004
A good comparison between Boltzmann entropy and Shannon entropy can be seen here.
On Slide 10 the author points out that when the distribution of the gas particles over the phase space bins is uniform, the entropy is maximal; when only 1 bin has all the molecules, the entropy is minimal.
Not surprisingly, Boltzmann himself derived that
H_b = Sum_k p_k log p_k (slide 11)
And defined S = -k_b H_b (slide 13)
Looks familiar anyone?
Well, things get better. How do we go from the number of allowed states to Shannon's form?
Slide 8
S = k_B log W
Now apply this to a system with n_a particles in one state and n_b in another, and the entropy can be expressed as
S = -N k_B (p_a log p_a + p_b log p_b)
Another example of how correct application of entropy results in the format I presented
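For anyone who wants to check that step numerically: with two kinds of particles, W = N!/(n_a! n_b!), and Stirling's approximation turns log W into the Shannon form. A minimal sketch (the particle numbers are arbitrary, and k_B is set to 1):

```python
import math

N, n_a = 1000, 300
n_b = N - n_a
p_a, p_b = n_a / N, n_b / N

# Exact: ln W for W = N! / (n_a! n_b!), computed via log-gamma
# to avoid evaluating huge factorials directly.
ln_W = math.lgamma(N + 1) - math.lgamma(n_a + 1) - math.lgamma(n_b + 1)

# Stirling's leading order: ln W ~ -N (p_a ln p_a + p_b ln p_b)
approx = -N * (p_a * math.log(p_a) + p_b * math.log(p_b))

print(ln_W, approx)  # ~607.3 vs ~610.9; the relative error shrinks as N grows
```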
Pim van Meurs · 27 May 2004
Tim Downs · 27 May 2004
My only comment would be that Jerry is very confused on this issue. I'm not sure it is worth continuing the discussion with him at this point.
Jerry Don Bauer · 28 May 2004
Is there any honest scientist on this forum willing to admit I am debating simpletons that don't even understand the subject? This is a sad day for science, my friends, when ALL of the scientists listed as the sponsors on this forum will just remain silent in the face of this gross rape of Intelligence. This should tell all reading this what Panda's Thumb is all about.
Panda's Thumb wants to evangelize you to its religion. If you think any differently, get real.
Truth be known, they don't just hate IDists. My experience is that they usually hate blacks and Jews as well.
Bob Maurus · 28 May 2004
You're really coming unhinged, Jerry. That last insult was truly offensive. Crawl back under your rock.
Ed Brayton · 28 May 2004
Truth be known, they don't just hate IDists. My experience is that they usually hate blacks and Jews as well.
Mr. Bauer, the mere fact that we have tolerated you this long is a miracle (yes, that's called "irony"). But you have now crossed over the line. One more comment like this and you will be permanently banned from leaving comments on this page. This is not negotiable, nor do I care in the slightest that it will feed your martyr complex and you'll scream persecution from every mountaintop you can find. You're behaving like every obnoxious ass who ever got thrown out of a chatroom, screaming that you're being censored when the truth is you're just being an ass and that's why you got banned. This is your first, last, and only warning.
Matt Young · 28 May 2004
I'm not black, but I am Jewish and an "honest scientist on this forum." I do not hate IDists, blacks, or Jews, and I resent the suggestion that I do. I'm not a psychologist either, but I recognize projection when I see it, and I'd suggest that Mr. Bauer is projecting his own hates and prejudices onto his opponents. Maybe he should read my column, "I Am Firm, Thou Art Stubborn, He Is Pigheaded," at this URL: http://www.pandasthumb.org/pt-archives/000082.html.
Pim van Meurs · 28 May 2004
Now that the flaws in Jerry's arguments have been fully exposed, what is his response? Calling the contributors to these boards racists? Lovely ad hominem and as poorly supported as Jerry's usual 'claims'.
On the bright side, however, several good things have come from Jerry's presence on this board:
1. An in-depth analysis of entropy as it applies to the genome
2. An in-depth exposé of Jerry's deepest 'thoughts'
Thanks to Jerry, I have been encouraged to write up the details of entropy and how the concept applies to the genome (including common pitfalls).
Mark Perakh · 28 May 2004
Mr. Bauer's last comment, however disgusting, is not really surprising. He has displayed all the features of an incurable self-admiring crank, imperviously confident in his own infallibility, so when he reveals that he is also a spiteful slanderer, this could be expected. He wanted scientists to address his lengthy comments on thermodynamics. Perhaps I can qualify, as I have taught, among other things, thermodynamics and statistical physics for more than half a century, both to undergraduate and graduate students, and have published nearly 300 papers in peer-reviewed media. I did not see a need to address Bauer's diarrhea of pseudo-scientific arguments, for the same reason I did not argue with a Siberian peasant who was confident that he had muzzled me when he asserted that an electric bulb lights up when a plus and a minus meet in it. Now, when Bauer has shown his real character explicitly, I think not only should he be banned from further posting his crock on PT, but all his preceding posts should also be kicked into the garbage can where they belong, except for his last pearl about hating blacks and Jews, which would serve to show what kinds of adversaries PT has to deal with.
Navy Davy · 28 May 2004
Oh, the outrage! Oh, the bleakness! Oh, the humanity!
I actually think Jerry should simply apologize for his one remark and continue posting.
But you all should also apologize for ganging up on him, calling him a liar, and generally not bein' good sports.
Best, Navy Davy
Bob Maurus · 28 May 2004
Navy Davy,
What's to be gained by letting a liar get by with his lies? What useful function does that serve? And why should we apologize for calling him a liar? He's a liar.
Sounds like you've not encountered Mr. Bauer before. He's an infamous haunter of evolution/creationism/IDC boards, under a number of different aliases, and seems to be an object of scorn and ridicule wherever he tries to recycle his ignorance and misinformation.
You had a pretty good introduction to his routine and his dishonesty in several threads here. His last post, while unexpectedly vile, is not, in the end, surprising.
Bob
Jerry Don Bauer · 28 May 2004
*******A good similarity between Boltzmann entropy and Shannon entropy can be seen here*******
Who cares, Pim? What does any of this have to do with a genome in devolution and heading toward mutational meltdown? What does this have to do with fixation or anything else we have discussed?
*****On Slide 10 the author points out that when the distribution of the gas particles is uniformly distributed over the phase-space bins, the entropy is maximal; when only 1 bin has all the molecules, the entropy is minimal.******
<:0)
******Not surprisingly, Boltzmann himself derived that
H_b = Sum_k p_k log p_k (slide 11)
and defined S = -k_B H_b (slide 13)
Looks familiar, anyone?******
Why yes, it does, because it's more cutting and pasting, and it serves as another example that you don't have any idea what you're discussing. BTW, the math you're using is NOT Shannon/Weaver entropy, but just similar. Here is Shannon/Weaver entropy:
******S = k_B log W
Now apply this to the numbers of states n_a and n_b, and the entropy can be expressed as
S = -N k_B (p_a log p_a + p_b log p_b)
Another example of how the correct application of entropy results in the format I presented******
And yet another example of empty formulas with no figures in them. What are the values of S, k_B, W or any of the rest of this stuff? Let me guess, you have no idea?
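[For what it's worth, the quoted formulas are easy to put actual numbers into. Here is a minimal sketch, the values of N and p_a being arbitrary illustrations rather than figures from either side of the thread, that evaluates S = -N k_B (p_a log p_a + p_b log p_b) and checks it against the exact S = k_B log W with W = N!/(n_a! n_b!):]

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def stirling_entropy(N, p_a):
    # S = -N k_B (p_a ln p_a + p_b ln p_b), the large-N (Stirling) form
    p_b = 1.0 - p_a
    return -N * k_B * sum(p * math.log(p) for p in (p_a, p_b) if p > 0)

def exact_entropy(N, n_a):
    # S = k_B ln W with W = N!/(n_a! (N - n_a)!), via lgamma to avoid overflow
    ln_W = math.lgamma(N + 1) - math.lgamma(n_a + 1) - math.lgamma(N - n_a + 1)
    return k_B * ln_W

# Example: N = 1000 two-state particles, half in each state
print(stirling_entropy(1000, 0.5))  # ~9.57e-21 J/K
print(exact_entropy(1000, 500))     # ~9.52e-21 J/K; the two converge as N grows
```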
Whistle Flower · 28 May 2004
Mark
As was posted elsewhere, I believe that Jerry Don's garbage non-science is as useful as his bizarre comment regarding blacks and Jews for illustrating the vastness of the empty space between Jerry Don's headphones (and why did he leave out homosexuals? Jerry Don is showing his age). But certainly we don't need any more proof at this late date.
One question for the peanut gallery: do any ID critics here believe Navy Davy when he claims to have not yet made up his mind about the relative merits of ID? Do any ID critics here believe that Navy Davy can possibly be convinced that ID is a bogus political scheme for teaching creationism in schools?
Personally I think Navy Davy is completely full of baby diaper garbage, just like Jerry Don Bauer. But perhaps I'm too quick to make such judgments.
Jerry Don Bauer · 28 May 2004
*******Sounds like you've not encountered Mr. Bauer before. He's an infamous haunter of evolution/creationism/IDC boards, under a number of different aliases, and seems to be an object of scorn and ridicule wherever he tries to recycle his ignorance and misinformation.******
Nah . . . just on the radical naturalism boards that can't handle the debate. ;)
Navy Davy · 28 May 2004
Whistle Flower,
Are you by any chance related to Whistle Blower? Perhaps a genetic mutant :)
I'm reading a great book, entitled, "The Pleasure of Finding Things Out" by Richard Feynman. I guess that sums up my philosophy on many issues, including this one.
I note that many on PT would retitle the book,
"The pleasure of discussing among ourselves what we believe to be facts, and ignoring and/or stifling those who disagree with us, who must be religious wackos to even think that way"
Cheers, Boys!
Navy Davy
p.s. Have a Good Memorial Day weekend too. Don't be shy about hoisting up 'ole Glory, neither. Lotta brave young folks died, so we could be free to play on the internet:)
Jerry Don Bauer · 28 May 2004
********One question for the peanut gallery: do any ID critics here believe Navy Davy when he claims to have not yet made up his mind about the relative merits of ID? Do any ID critics here believe that Navy Davy can possibly be convinced that ID is a bogus political scheme for teaching creationism in schools?*******
He was very honest with me in stating that he did not buy into my ideas, but just wanted to see a debate on the issues. I also informed him that he probably would not see a debate on the issues as there is no one on here capable of handling the debate from the side of naturalism.
Of course, there has been little real debate. I can do a mathematical calculation for this forum and when I do, I'm a liar. No attempt to address the math at all, no refutations, nothing. IDists are just liars.
I would be most surprised if those who actually do have an open mind and can learn will not do some further research on this issue.
You people badly damage your agenda by even having these forums. As when anyone really decides to disagree with you, you lose it entirely. This shows radical Darwinism as really what it is: religion, with almost no tenet based in science or math.
Whistle Flower · 28 May 2004
Whistle Flower is the new kinder, gentler version of Whistle Blower. Didn't you notice?
Navy Davy · 28 May 2004
Why would I notice a cyber-dweeb who can't even correctly spell his own nom de guerre?
Or, in your case, nom de wuss :)
Cheers, Navy Davy
p.s. I really have to go now, sorry. But, I will be back: in the name of truth, science and finding things out!
"Another of the qualities of science is that it teaches the value of rational thought, as well as the importance of freedom of thought; the positive doubting that the lessons are all true. " (Feynman, Pleasures.... pg. 186.)
Ed Brayton · 28 May 2004
You people badly damage your agenda by even having these forums. As when anyone really decides to disagree with you, you lose it entirely. This shows radical Darwinism as really what it is: religion, with almost no tenet based in science or math.
Jerry Don, that is - to put it kindly - complete and utter horseshit. The only one who has "lost it entirely" here is you, with your obnoxious nonsense about your opponents allegedly hating blacks and jews. Your first mistake is in thinking this is a "forum"; it's not. It's a blog, not a message board and not a forum. We are not obligated to give you a forum to launch ad hominem attacks on us. You were treated just fine until you started with your smug "no one here is smart enough to debate" crap. We have had respectful exchanges on here with Paul Nelson and Frank Beckwith.
Perhaps it's time for you to consider the possibility that the reaction of other people to you is not due to what you perceive to be the overwhelming strength of your position, but is due to the fact that you're a raging asshole. Now go troll somewhere else. You've worn out your welcome here.
Jerry Don Bauer · 28 May 2004
Mine started as respectful as well. One can only take being called names for so long before one begins to slap back. And I cannot imagine that Paul would be treated with respect in here.
Bob Maurus · 28 May 2004
Why don't you ask Paul about that, Jerry?
Jerry Don Bauer · 28 May 2004
******Why don't you ask Paul about that, Jerry?******
I probably will.
Jerry Don Bauer · 28 May 2004
******Any chance you can provide us with support for your statement about Dembski*******
No, I could not find the post. However, if you were honest you'd just admit it. But since we can't expect this, I'll simply retract that statement.
I don't think it takes Dembski to see the silliness in those formulas.
Jerry Don Bauer · 28 May 2004
******S = k log N
Where N******
Oh sheeze. I meant W and you know it. But if I'm using N the same way, what the heck's the difference? Are you now just running away from the little debate we have going, hoping to bury it in smoke and mirrors?
Pim van Meurs · 28 May 2004
Jerry: No, I could not find the post. However, if you were honest you'd just admit it.
ROTFL. You're a gem. Another one for the "bathroom wall". One more and I get the much coveted but rarely awarded "Bronze PandaThumb Beer Mug".
Btw the N in S = k log N is just a mathematical symbol which in fact represents what you called W.
Even Feynman seems to disagree with your interpretation of his statement. Sigh..
Pim van Meurs · 28 May 2004
And despite claims that 'I am running away from a debate', it should be clear that I am fully engaged in a debate and that Jerry is left to utter ad hominems in apparent frustration.
Fine with me.
Jerry Don Bauer · 28 May 2004
*****Btw the N in S = k log N is just a mathematical symbol which in fact represents what you called W.
Even Feynman seems to disagree with your interpretation of his statement.*****
Good Lord. This is not Feynman, this is Boltzmann. Sometimes I wonder if you've actually ever had a freshman chemistry class.
Now back to the debate. You owe me a post, so why are you posting to me without addressing what we were talking about?
Are you conceding that argument?
Pim van Meurs · 28 May 2004
Oh boy, the constant is just that, a constant, and it does not make a difference whether you use Feynman or Boltzmann or Shannon.
And my Feynman reference was to the previous posting, in which Feynman correctly represents the facts, but when Jerry 'calculates', he forgets the relevant (bolded) parts.
I have no idea which argument you want me to concede. Is it:
1. That you confused the positive nature of entropy with tendency?
2. That you called us racists?
3. That you claimed "And remember that I was personally present when Bill Dembski, a math PhD actually examined this stuff and told you personally that nothing you are cutting and pasting is even germane to the subject."?
4. That you appear confused about application of entropy concepts?
1 is self-evident, 2 seems undisputed, 3 is unsupported other than by ad hominem (thus weakening your position), and 4 is well documented on the various threads here.
Bob Maurus · 28 May 2004
I love it, Jerry,
You posted
(PVM)******Any chance you can provide us with support for your statement about Dembski*******
(Jerry) No, I could not find the post. However, if you were honest you'd just admit it. But since we can't expect this, I'll simply retract that statement.
How about this Jerry: if YOU were honest you'd just admit you're full of shit. You made a flat claim that you couldn't validate when called. Get your documentation in order first - kinda like, "First you pillage, then you burn." I do realize though that that might be beyond your comprehension. My offer to pray for you still stands. The goddess might be willing to take pity on you.
Peace,
Bob
Jerry Don Bauer · 28 May 2004
People, take a look at Feynman's famous formula that somehow was mistakenly engraved on Boltzmann's tombstone. :0)
http://www.wellesley.edu/Chemistry/chem120/thermo1.html#boltz
Jerry Don Bauer · 28 May 2004
*****Well, you should get the point by now.******
Was that post to me? Will you please quit sending me to irrelevant web sites hoping they will make your argument for you?
Place your argument in your own words. The problem is that you are out of argument and you know it. This in fact is the third place you have been embarrassed discussing design. I'm curious, where did you get those degrees in physics?
Jerry Don Bauer · 28 May 2004
*****He wanted scientists to address his lengthy comments on thermodynamics. Perhaps I can qualify as I have taught, among other things, thermodynamics and statistical physics for more than half a century both to undergraduate and graduate students and have published nearly 300 papers in peer-reviewed media.******
And you are not up to a friendly debate in your field? I can assure you I'm only rude to those who seek to demean me rather than professionally debate.
Tell me. If you are a thermodynamicist, how do you feel that this law would have allowed macroevolution, which postulates the antithesis of SLOT (the second law of thermodynamics)?
Pim van Meurs · 28 May 2004
Jerry: Place your argument in your own words. The problem is that you are out of argument and you know it. This in fact is the third place you have been embarrassed discussing design. I'm curious, where did you get those degrees in physics?
Very simple: your application of the formula for entropy is incorrect. I have tried to show you in my own words (see Shannon entropy applied), and I have quoted extensively from the literature.
To no avail. And your ad hominems only serve to show that you have not won any argument other than by running away from confronting it.
I have no problem with that, Jerry.
The problem with Jerry's argument is that the logarithm has to be taken of the number of possible states, not the number of mutations.
So let's assume we have a population of 10000 humans and look at a particular location in the genome. Let's assume for the moment that the distribution of the nucleotides at this location is uniformly distributed (2500 individuals each for A, C, G, and T). In other words, we have maximum disorder. The number of accessible states can be calculated to be
W = 10000!/(2500! × 2500! × 2500! × 2500!)
S = log W = 6014.6
Using Sshannon = -N Sum_i p_i log p_i, with N = 10000 and p_i = 0.25, we calculate Sshannon = 6020.6.
Remember that the two measures agree only in the large-number limit.
Now two mutations arise which are slightly deleterious or nearly neutral (it does not matter), and they get fixed in the population, as argued by Eyre-Walker. Now we find a different situation, since p_i is 1 for these two locations,
so now we calculate the entropy to be -9998 × log 0.25 = 9998 × log 4 ≈ 6019.4, a drop of 1.2 in entropy.
Similarly we calculate
log W = log[9998!/(2500! × 2500! × 2500! × 2498!)] = 6013.4, or 1.2 less than before.
Both formulas agree.
QED
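[For readers who would rather check these figures than take either debater's word for them, here is a minimal sketch that reproduces the four numbers above; the only liberty taken is computing the log-factorials with lgamma:]

```python
import math

def log10_multinomial(counts):
    # log10 of W = N!/(n_1! n_2! ... n_k!), via lgamma to avoid huge factorials
    N = sum(counts)
    ln_W = math.lgamma(N + 1) - sum(math.lgamma(c + 1) for c in counts)
    return ln_W / math.log(10)

def shannon_total_log10(counts):
    # N * H with H = -sum_i p_i log10 p_i, i.e. the total (not per-site) entropy
    N = sum(counts)
    return -sum(c * math.log10(c / N) for c in counts if c > 0)

before = [2500, 2500, 2500, 2500]  # uniform over A, C, G, T
after = [2500, 2500, 2500, 2498]   # after the two fixations

print(log10_multinomial(before))    # ~6014.6
print(shannon_total_log10(before))  # ~6020.6
print(log10_multinomial(after))     # ~6013.4
print(shannon_total_log10(after))   # ~6019.4; both measures drop by ~1.2
```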
Jerry Don Bauer · 28 May 2004
*****Very simple: your application of the formula for entropy is incorrect. I have tried to show you in my own words (see Shannon entropy applied), and I have quoted extensively from the literature.*****
LOL . . . Right. Please read up on Shannon entropy so that you will at least know what it is. Shannon deals with data and information. Shannon worked for the telephone company, remember? It cannot possibly be used to say anything about nucleotide mutation.
Shannon Entropy: "This theorem is the foundation of the modern field of information theory ..."
"Information theory is a branch of the mathematical theory of probability and mathematical statistics that quantifies the concept of information. It is concerned with information entropy, communication systems, data transmission and rate distortion theory, cryptography, data compression, error correction, and related topics ..."
http://encyclopedia.thefreedictionary.com/Shannon's%20theorem
Also, please understand that mutations are random, and thus no probabilities can be calculated on exactly when they will mutate, if they will mutate, or what they mutate into. If we could, think about it: we could mathematically calculate what everything is devolving into, if anything at all. We can calculate statistical entropy, but that's it, unless we know what actually happened.
********To no avail. And your ad hominems only serve to show that you have not won any argument other than by running away from confronting it.*******
Yep, you get 'em right back from me when you throw them out, don't you.
********I have no problem with that, Jerry.*********
You have no problem with what?
******The problem with Jerry's argument is that the logarithm has to be taken from the number of possible states, not the number of mutations.*******
I'm just calculating easy-to-understand bits, which can also be viewed as statistical entropy. But what you fail to understand is that whatever math we use, the entropy will still be positive, because 1002 mutations are more than 1000 mutations. How would you calculate bits?
"bits are entropy"
http://64.233.161.104/search?q=cache:uScMj4eddMgJ:www.cise.ufl.edu/help/software/doc/mpeg_encode/doc.ps+%22Bits+are+entropy%22&hl=en
******So let's assume we have a population of 10000 humans and look at a particular location in the genome. Let's assume for the moment that the distribution of the nucleotides at this location is uniformly distributed (2500 individuals each for A, C, G, and T). In other words, we have maximum disorder. The number of accessible states can be calculated to be
W = 10000!/(2500! × 2500! × 2500! × 2500!)
S = log W = 6014.6*******
Oh man, this is a classic. And almost as bad as the last time you argued this with me stating that when one adds positive integers, the total must always be negative. Remember that little jewel? You don't want me to post a link to that tidbit, do you?
First, if anything is uniformly distributed, how is this maximum disorder?
What are you talking about with "the distribution of the nucleotides at this location is uniformly distributed"? How would they not be regularly distributed? Please give an example of an irregularly distributed nucleotide.
I take it you are looking at four nucleotides in 10,000 genomes? Well then that's 40,000 sites; you don't have to do any math to determine this.
Then you take 10000 people and divide them by 2500!. 2500! what? Why? Then you multiply them by 2500! three times. Why? This is simply nonsensical. And if you divide 10000 by 2500! and then multiply it by 2500!, aren't you right back at 10000? So why would you include this at all?
Now, slow down. Do each step one at a time, explain why you are doing it, and every step of the mathematics.
******Using Sshannon = -N Sum_i p_i log p_i, with N = 10000 and p_i = 0.25, we calculate Sshannon = 6020.6******
LOL . . . What is this garbage, Francis? You're just making this up out of thin air hoping someone will believe you've got the math down.
Also, you claimed to have just calculated the Shannon entropy and came up with zero. Remember this: Francis: "H= - sum_{i} p_i log p_i
and pointing out that when a mutation becomes fixated the Shannon entropy for that particular nucleotide becomes zero because p_i becomes one for the mutation and zero for the three other nucleotides (I am assuming for convenience a fixated point mutation). Thus the formula with p_i=0 for i=1..3 and p_4=1 shows that H=0"
How come you get two totally distinct formulas when you calculate the same thing twice? Ahhh . . . you forgot you already did this, I see.
How come when I punch in "-Sum p_i log p_i with p_i " into Google not one page comes up?
What are the values of p_i, and how did you get these values? Show your work.
And now you're trying to show disorder! You are arguing for order, remember? Now you're coming up with positive entropy just as I was showing disorder, except, for some strange reason, you want to show a HUGE disorder. Sheeze.
*****Remember that the two measures agree only in the large-number limit.******
Prove this with references from web sites. I don't believe it.
*******Now two mutations arise which are slightly deleterious or nearly neutral (it does not matter), and they get fixed in the population, as argued by Eyre-Walker. Now we find a different situation, since p_i is 1 for these two locations******
Why is p_i 1, and what was it in your former 'calculation'?
******so now we calculate the entropy to be -9998 × log 0.25 = 9998 × log 4 ≈ 6019.4, a drop of 1.2 in entropy.
Similarly we calculate
log W = log[9998!/(2500! × 2500! × 2500! × 2498!)] = 6013.4, or 1.2 less than before.
Both formulas agree*******
Well what happened to the other two people? Did they fall over dead with a heart attack or something? :0)
Show this calculation step by step. Explain why you are doing it, and every step of the math. Francis, you are the most intellectually dishonest person I have ever met on the Net, and that's saying something.
Pim van Meurs · 28 May 2004
Well, things need some cleaning up here.
First of all, all entropies need to be divided by log 2 (I took log10, not log2, although this is a trivial change). Secondly, we need to divide by 10000 to get the average entropy for this location in the genome.
Not surprisingly, the entropy is 2.06 before the mutation and 1.99 after.
If needed I can provide additional examples, but the results will be the same. If the mutation at the location was not fixed before and afterwards it is, then a decrease in entropy is inevitable, since the number of accessible locations has dropped by 2.
I apologize for the sloppiness in my calculations, but they are merely meant to show that entropy decreases; the actual value is less important, as it is scaled by a constant.
Another thought experiment is to calculate the entropy in 1000 rolls of a single die and then in 1000 rolls where two of the outcomes are guaranteed to be sixes. Again it should be obvious that order has increased and entropy decreased. If this is not clear, assume the roll of 1000 dice that are loaded to always throw a six. Perfect order and thus minimal entropy. Any time a particular location becomes fixed, entropy has to drop.
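[Pim's dice experiment is easy to simulate. A minimal sketch, assuming the plug-in (empirical-frequency) estimate of Shannon entropy; the seed and roll counts are arbitrary choices:]

```python
import math
import random
from collections import Counter

def entropy_bits(outcomes):
    # Empirical Shannon entropy in bits per roll: H = -sum_i p_i log2 p_i
    n = len(outcomes)
    return 0.0 - sum((c / n) * math.log2(c / n) for c in Counter(outcomes).values())

random.seed(1)
fair = [random.randint(1, 6) for _ in range(1000)]  # 1000 rolls of a fair die
forced = fair[:998] + [6, 6]                        # two outcomes guaranteed sixes
loaded = [6] * 1000                                 # dice loaded to always throw six

print(entropy_bits(fair))    # close to log2(6) ~ 2.585 bits
print(entropy_bits(forced))  # typically a touch lower: less uniform, more order
print(entropy_bits(loaded))  # 0.0 bits: perfect order, minimal entropy
```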
Jerry Don Bauer · 28 May 2004
*****Not surprisingly, the entropy is 2.06 before the mutation and 1.99 after.*****
LOL . . . You didn't divide anything by any log-based component. Now go back and address my last post sentence by sentence as I did yours. You're just inventing this stuff by making up a bunch of numbers and formulas that are nonsensical. If you can't dazzle them with brilliance, then baffle them with bull****. If you fail to support your stuff with references, then you have lost this debate for the third time.
*****I apologize for the sloppiness in my calculations, but they are merely meant to show that entropy decreases; the actual value is less important, as it is scaled by a constant.*****
LOL . . . Do you now.
*****Another thought experiment is to calculate the entropy in 1000 rolls of a single die and then in 1000 rolls where two of the outcomes are guaranteed to be sixes. Again it should be obvious that order has increased and entropy decreased.******
Thermodynamic entropy cannot increase or decrease with rolls of dice. It takes the exact same energy for my arm to throw any pair of dice as it does another. This is just more Francis garbage that he picked up from some web site, if he didn't just pull it out of thin air.
*******Perfect order and thus minimal entropy.******
ROFL . . . You have this exactly backward. Perfect order is maximum entropy. Take a physics course. If you've ever had one in your life, I'll eat my hat.
Jerry Don Bauer · 29 May 2004
OK <:0) I found one like it, and it is used to calculate the size of a channel in order to determine how many binary digits can flow through it:
"Shannon defined a measure of entropy:
that, when applied to an information source, could determine the capacity of the channel required to transmit the source as encoded binary digits."
http://www.fastload.org/in/Information_theory.html
Boy, this has a lot to say about mutating nucleotides. LOL, you, my man, are a trip.
Erik 12345 · 29 May 2004
There are lots of things here that could use clarifications and corrections, but I'll just make a few comments about high-abstraction-level things (leaving aside questions about particular calculations):
Point I: Information theory (by which I here mean communication theory, rather than, say, Maximum Entropy Inference) can be applied to DNA sequences. For example, if you are going to store large genomes in a database, you may want to compress your data to save storage space. Information theory can tell you how much you can, at most, compress your DNA sequences by encoding subsequences with bit strings.
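[A minimal illustration of this compression point, a toy code that assumes nothing about real genome databases: a four-letter alphabet needs at most 2 bits per base, and an entropy coder can beat that when base frequencies are skewed.]

```python
# Toy fixed-length code: 2 bits per base, the Shannon maximum for 4 symbols
CODE = {"A": "00", "C": "01", "G": "10", "T": "11"}

def encode(seq):
    return "".join(CODE[base] for base in seq)

def decode(bits):
    rev = {v: k for k, v in CODE.items()}
    return "".join(rev[bits[i:i + 2]] for i in range(0, len(bits), 2))

seq = "GATTACA"
bits = encode(seq)
assert decode(bits) == seq
print(len(bits), "bits instead of", 8 * len(seq), "bits as ASCII text")  # 14 vs 56
```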
A completely different application can be to pick a biological process (e.g. asexual reproduction), and try to imagine this as a communication process by identifying parts that are in some sense analogous to a sender, receiver, communication channel, and messages sent over the communication channel. In the case of asexual reproduction, one could regard the parent as the sender, the children as receivers, the parent genome as the message that is sent, the children genomes as the noise-modified messages that are received, etc. Such an analysis is not an end in itself, but sometimes it may let us infer things about quantities of direct interest to biologists (e.g. the average number of sites at which an asexually produced offspring genome differs from the parent genome).
Point II: Boltzmann entropy, Gibbs entropy, etc. are application-specific concepts. They are restricted to statistical mechanics, which is concerned with the relation between Boltzmann entropy (or Gibbs entropy) and macroscopic physical quantities like energy, pressure, volume, temperature, etc.
Shannon entropy is (by now) a generic concept that can be applied to any quantity to which we have associated a probability distribution. How sensible the application is depends entirely on what you are trying to accomplish and on how sensible your choice of quantity and probability distribution is.
Some readers may be mathematicians who (a) study entropy because they are intrigued by certain formal similarities between statistical mechanics, communication theory, statistical inference, etc., and (b) can say "I like Radon-Nikodym derivatives" and mean it. Readers who are not, however, should be careful not to confuse the generic concept of Shannon entropy with the application-specific concept of Boltzmann entropy (or Gibbs entropy).
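[A small sketch of the genericity Erik is describing: the same Shannon formula accepts any probability distribution, whatever it happens to describe. The three distributions below are arbitrary examples, including the fixed-site case discussed above.]

```python
import math

def shannon_entropy(p, base=2):
    # H = -sum_i p_i log p_i, for any probability distribution p
    return 0.0 - sum(x * math.log(x, base) for x in p if x > 0)

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: uniform four-way split
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits: a fixed, certain outcome
print(shannon_entropy([0.5, 0.3, 0.2]))           # ~1.49 bits: any distribution works
```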
Point III: Jerry Don Bauer writes: "ROFL . . . You have this exactly backward. Perfect order is maximum entropy. Take a physics course. If you've ever had one in your life, I'll eat my hat."
I have taken several physics courses and would generally discourage talk about entropy in terms of metaphors like "disorder". Unfortunately, many physicists are fond of the "disorder" metaphor. Those physicists would say, contra Jerry Don Bauer, that perfect order is indeed minimum entropy and that maximum disorder is the same as maximum entropy.
A good article that deals with Points II and III is "Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms --- Examples of Entropy Increase? Nonsense!". It briefly warns of the dangers of not understanding the difference between Shannon entropy and the entropy of statistical mechanics/thermodynamics, as well as the misleading features of the "disorder" metaphor.
Ed Brayton · 29 May 2004
Jerry Don Bauer has been involuntarily removed from this site. He is free to submit all of his whining, martyr poses and cries of persecution to http://www.wedontcare.com.
Donald A. Syvanen · 10 November 2004
From a chemical engineering undergraduate student: you folks are making a big mess out of this. I flunked a few test questions because I did not do all the calculations to be SURE the entropy is overall negative, INCREASED disorganization. The power of atheism is too great for most of you to handle. God does not need to keep all the laws of thermodynamics like we do. Macro-evolution did not happen.
Neil Johnson · 11 November 2004