PvM posted Entry 228 on May 25, 2004 09:22 AM.
Trackback URL: http://www.pandasthumb.org/cgi-bin/mt/mt-tb.fcgi/227

Since we have seen some poorly argued claims about entropy and its relevance to evolution, I will explore the concepts of entropy as they apply to genome evolution and show that the evidence demonstrates how simple processes like variation and selection are sufficient to explain the evolution of complexity, or information/entropy, in the genome.

While various ID authors (here and elsewhere) have argued that such natural processes are unable to explain the evolution of information in the genome, it should be clear that the actual evidence contradicts any such suggestions.

In the past I have argued with various people on the topic of entropy. Jerry Don Bauer, a.k.a. Chronos, has shown some interesting confusions about the concept of entropy and believes that, using the laws of entropy, he has shown that macro-evolution could not have happened.

First some background information

Jerry defines entropy and shows that entropy is always positive (no surprise here, since entropy is the log of a number larger than or equal to 1). Based on the fact that entropy is positive, he concludes that the tendency is positive and thus that complex macro-evolution has been disproven:

S = log2W, S = log2(100,000,000), S = 26.5754247590989, therefore S is positive showing a tendency of disorder. Complex macroevolution would have violated one of the most basic and well proven laws of science. And since we know that nothing violates a law of science as a tendency, we can most assuredly conclude that complex macroevolution never occurred.

Link

Jerry can be seen backtracking in later responses:

I certainly do not mean to imply that this is my work: “if W, the number of states by some measure, is greater than 1 then S will be positive by your formula. Thus any number of states will be “showing a tendency of disorder.” This is not my work and was done much earlier by such greats as Boltzmann and Feynman et al.

further backing up and further obfuscating

I did state that that if S is positive, entropy is increased. And this is not a tendency in this case. It’s a fact of this specific example. I would ask you to examine your logic. If entropy increases then disorder has occurred. If S is positive then entropy has increased because S’ IS the entropy we are considering. If you are going to continue in this vein of logic, then I will have to ask you to show how that the tenets of thermodynamics is just wrong in that everyone has it backward. Rising entropy denotes order and decreasing entropy denotes disorder.

link

Another whopper

P1: With every generation in homo sapien, entropy increases in the genome.

P2: Complex macroevolution requires that the genome have a lower entropy over time through the generations.

Therefore, complex macroevolution did not occur

Link

Gedanken quickly exposes the fallacies in Chronos’s argument

By the way, Chronos has not demonstrated either of his premises P1 nor P2.

He has not demonstrated that the entropy must be increasing, simply because his argument confuses the positive value of entropy with a delta or change of entropy in a positive direction. Even if there were an argument that demonstrated this was a positive delta, Chronos has decided not to give such an argument and relies on the value being positive — an irrelevant issue.

Then Chronos has not demonstrated that change over time requires a decrease in entropy. (Or any particular change in entropy — for example changes occur and they are different, but they have the same number of informational or microstates and thus S has not changed.)

Link

Can anyone decipher this one?

Begging your pardon, but it’s not me saying that when entropy is positive it “tends” toward disorder. When entropy is positive there is no longer a tendency involved. It has already happened. The reaction is over and a passed event. Therefore the term tendency no longer applies. And anytime entropy is positive the system has disordered:

Link

Gedanken explains what is wrong with Chronos’s argument

So what is wrong with Jerry’s claims, other than the confusion of tendency and value?
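
To make the distinction concrete, here is a minimal Python sketch; the numbers are purely illustrative (they are not taken from any genome) and simply show that S = log2(W) is positive for any W greater than 1, while the direction of change is given by delta S, which can be negative even though both values of S are positive.

    import math

    def entropy_bits(w):
        """Boltzmann-style entropy S = log2(W) for W equally likely microstates."""
        return math.log2(w)

    # Illustrative numbers only: W before and after some change.
    w_before = 100_000_000   # 1e8 microstates
    w_after = 50_000_000     # selection has halved the number of accessible states

    s_before = entropy_bits(w_before)   # ~26.58 bits: positive
    s_after = entropy_bits(w_after)     # ~25.58 bits: still positive
    delta_s = s_after - s_before        # -1.0 bit: entropy *decreased*

    print(f"S_before = {s_before:.2f} bits, S_after = {s_after:.2f} bits, delta S = {delta_s:.2f} bits")
    # Both values of S are positive, yet delta S is negative:
    # a positive entropy value says nothing about the direction of change.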

In fact, some excellent papers published by Adami and Schneider show how, contrary to Jerry’s claims, entropy in the genome can decrease through the simple processes of variation and selection.

Although Jerry seems to blame Feynman for his errors, it should be clear, or soon become clear, that Jerry is wrong.

I encourage readers to pursue the thread I pointed out, in which one can see how several people make a significant effort to address the confusions exhibited by Jerry. If anything, it shows why the abuse of mathematics appears to be so widespread.

As I have shown in some detail above, a correct application of entropy is not that complicated.

The following is a more in-depth introduction to the exciting findings about entropy and information/complexity.

Schneider provides us with some interesting data

Information/entropy increase/decrease

  http://www.lecb.ncifcrf.gov/~toms/paper/ev/dembski/b.xyout.gif

Note how the information increases from zero to about 4 bits

From PNAS we find
  http://www.pnas.org/content/vol97/issue9/images/medium/pq0805620003.gif

Fig. 3.  (A) Total entropy per program as a function of evolutionary time. (B) Fitness of the most abundant genotype as a function of time. Evolutionary transitions are identified with short periods in which the entropy drops sharply, and fitness jumps. Vertical dashed lines indicate the moments at which the genomes in Fig. 1 A and B were dominant.

In “Evolution of Biological Complexity,” Adami et al. show:

 

To make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent information-theoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a sequence stores about its environment. We investigate the evolution of genomic complexity in populations of digital organisms and monitor in detail the evolutionary transitions that increase complexity. We show that, because natural selection forces genomes to behave as a natural “Maxwell Demon,” within a fixed environment, genomic complexity is forced to increase.

The approach is very simple. First, assume a genome with site i, which has the following probabilities for the four nucleotides involved:  http://www.pnas.org/content/vol97/issue9/fulltext/4463/img001.gif

One can show that the entropy for this site can be calculated to be

  http://www.pnas.org/content/vol97/issue9/fulltext/4463/img002.gif

And the entropy tendency, or information, can be defined as

  http://www.pnas.org/content/vol97/issue9/fulltext/4463/img003.gif

Now sum over all sites i and you find that
the complexity or information is given by

  http://www.pnas.org/content/vol97/issue9/fulltext/4463/img004.gif
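
As a rough illustration of these formulas, here is a minimal Python sketch; the probabilities below are made up for illustration, and entropy is measured in bits (base-2 logarithms), so a completely unconstrained site has the maximal 2 bits of entropy and zero information.

    import math

    def site_entropy(probs):
        """Shannon entropy H(i) = -sum_j p_j(i) log2 p_j(i) over the four nucleotides."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def site_information(probs, h_max=2.0):
        """Information I(i) = H_max - H(i); 2 bits is the maximum for a 4-letter alphabet."""
        return h_max - site_entropy(probs)

    # Made-up probabilities for two sites: one unconstrained, one under strong selection.
    unconstrained = [0.25, 0.25, 0.25, 0.25]   # A, C, G, T equally likely
    constrained = [0.91, 0.03, 0.03, 0.03]     # selection strongly favours one base

    genome = [unconstrained, constrained]
    total_information = sum(site_information(site) for site in genome)

    print(site_entropy(unconstrained))   # 2.0 bits -> 0 bits of information
    print(site_entropy(constrained))     # ~0.58 bits -> ~1.42 bits of information
    print(total_information)             # information summed over all sites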

Figure 3 above shows how entropy, after an initial increase, decreases while the fitness increases. This information increase/entropy decrease is exactly what happens when selection and variation are combined. Figure 3 shows some beautiful examples of evolutionary transitions.
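
The mechanism can be illustrated with a toy simulation. This is only a sketch with assumed parameters (a single site, a twofold selective advantage for one base, a 1% mutation rate), not the digital-organism setup used in the paper, but it shows the same qualitative behavior: selection drives the favored base toward fixation and the per-site entropy drops well below its initial value.

    import math
    import random
    from collections import Counter

    random.seed(1)
    BASES = "ACGT"
    TARGET = "G"             # the base favoured by the (fixed) environment
    POP, MU, GENS = 1000, 0.01, 60

    def entropy(pop):
        counts = Counter(pop)
        return -sum((n / len(pop)) * math.log2(n / len(pop)) for n in counts.values())

    pop = [random.choice(BASES) for _ in range(POP)]   # start fully random: ~2 bits

    for gen in range(GENS):
        # Selection: individuals carrying TARGET are twice as likely to reproduce.
        weights = [2.0 if b == TARGET else 1.0 for b in pop]
        pop = random.choices(pop, weights=weights, k=POP)
        # Mutation: each individual mutates to a random base with probability MU.
        pop = [random.choice(BASES) if random.random() < MU else b for b in pop]
        if gen % 20 == 0:
            print(gen, round(entropy(pop), 3))

    print("final entropy:", round(entropy(pop), 3))   # well below 2 bits: entropy fell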

I am not the only one who has reached this obvious conclusion

Andya Primanda addresses the question “Can mutations increase information content?” from Chapter 3 of The Evolution Deceit by Harun Yahya.

Some excellent websites which expand on the materials presented here can be found below:

Adami: Evolutionary Biology and Biocomplexity

and

ev: Evolution of Biological Information

A recent paper which identifies some problems with Schneider’s approach can be found here. Despite the problems, the authors recover most of the same conclusions.

Empirically, it has been observed in several cases that the information content of transcription factor binding site sequences (Rsequence) approximately equals the information content of binding site positions (Rfrequency). A general framework for formal models of transcription factors and binding sites is developed to address this issue. Measures for information content in transcription factor binding sites are revisited and theoretic analyses are compared on this basis. These analyses do not lead to consistent results. A comparative review reveals that these inconsistent approaches do not include a transcription factor state space. Therefore, a state space for mathematically representing transcription factors with respect to their binding site recognition properties is introduced into the modelling framework. Analysis of the resulting comprehensive model shows that the structure of genome state space favours equality of Rsequence and Rfrequency indeed, but the relation between the two information quantities also depends on the structure of the transcription factor state space. This might lead to significant deviations between Rsequence and Rfrequency. However, further investigation and biological arguments show that the effects of the structure of the transcription factor state space on the relation of Rsequence and Rfrequency are strongly limited for systems which are autonomous in the sense that all DNA binding proteins operating on the genome are encoded in the genome itself. This provides a theoretical explanation for the empirically observed equality.
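
For readers who want to see what Rsequence and Rfrequency mean operationally, here is a rough sketch; the alignment and the genome numbers are made up, and Schneider’s small-sample correction is omitted. Rsequence sums, over the columns of an alignment of binding sites, the difference between the 2 bits of a completely unconstrained position and the observed Shannon entropy of that column, while Rfrequency is log2(G/gamma) for gamma sites in a genome of G positions.

    import math
    from collections import Counter

    def r_sequence(aligned_sites):
        """Rsequence = sum over columns of (2 - H_column), in bits (no small-sample correction)."""
        total = 0.0
        for column in zip(*aligned_sites):
            counts = Counter(column)
            h = -sum((n / len(column)) * math.log2(n / len(column)) for n in counts.values())
            total += 2.0 - h
        return total

    def r_frequency(genome_length, n_sites):
        """Rfrequency = log2(G / gamma): bits needed to pick gamma sites out of G positions."""
        return math.log2(genome_length / n_sites)

    # Made-up alignment of four 6-bp binding sites, purely for illustration.
    sites = ["TTGACA", "TTGATA", "TTTACA", "CTGACA"]
    print(round(r_sequence(sites), 2))
    print(round(r_frequency(genome_length=4_600_000, n_sites=2000), 2))  # ~11.2 bits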


Comment #2797

Posted by ~DS~ on May 25, 2004 9:23 AM (e)

TY, that was excellent!

Comment #2799

Posted by charlie wagner on May 25, 2004 10:18 AM (e)

Pim wrote:

Since we have seen some poorly argued claims about entropy and its relevance to evolution, I will explore the concepts of entropy as they apply to genome evolution and will show that the evidence shows how simple processes like variation and selection are sufficient to explain the evolution of complexity or information/entropy in the genome.

Unfortunately, neither entropy, complexity or information has anything at all to do with evolution. So I guess you could call this a “red herring”.
If you explained the evolution of *organization* (not order), now that would really be something!

As for the rest of your post, very interesting. Instead of answering each point, allow me to refer you to my prior responses to this issue:

http://tinyurl.com/2sk5c

The search term is “Nelson’s Law”

Comment #2802

Posted by Pim van Meurs on May 25, 2004 10:31 AM (e)

Contrary to Charlie’s suggestion that the issue is a red herring, let me point out that it is the ID movement that is claiming that evolutionary mechanisms cannot explain the origin of information and complexity. Is Charlie suggesting that we blame Dembski for introducing the concept of entropy/information? Or, in this case, is Charlie arguing that Jerry’s comments are irrelevant?

As far as organization is concerned, scale-free networks and gene duplication all help us understand such issues as modularity, degeneracy, robustness, and evolvability.

The real red herring may be the suggestion that the issue is one of ‘organization’. But let’s focus on the issue at hand in this thread which addresses the arguments by ID proponents about information/entropy.

Comment #2804

Posted by Jim Anderson on May 25, 2004 10:40 AM (e)

Unfortunately, neither entropy, complexity or information has anything at all to do with evolution. So I guess you could call this a “red herring”.

Thermodynamics has nothing to do with biology? Call me an uninformed, ignorant layman–and you should, because I am–but that bald assertion strikes me as more than a little, um, radical.

Comment #2805

Posted by Jim Anderson on May 25, 2004 10:41 AM (e)

Oh, and replace “biology” with “evolution.” Duh.

Comment #2807

Posted by charlie wagner on May 25, 2004 11:01 AM (e)

Pim wrote:

let me point out that it is the ID movement who is claiming that evolutionary mechanisms cannot explain the origin of information and complexity.

I don’t represent the “ID movement”, I don’t speak for the “ID movement”, I don’t defend anyone’s views but my own. I don’t know Bill Dembski, I’ve never read his books and I could care less what he thinks. I speak on my own behalf, and I defend what *I* say.

Comment #2808

Posted by Pim van Meurs on May 25, 2004 11:07 AM (e)

charlie and Nelson's law wrote:

Which is exactly why I have put forth Nelson’s Law. It separates out this problem and allows it to stand alone on it’s own merits. Nelson’s law involves logical entropy and separates it from thermodynamic entropy. It measures the disorder of a system and is a pure number, with no units. Life can be described as organization and Nelson’s Law states that “things do not organize themselves”. The evolution of life involves in increase in organization, a decrease in logical entropy. Nelson’s law
forbids this. Things cannot organize themselves without input from outside. You cannot seek refuge from this dilemma by taking advantage of
the confusion between the two forms of entropy.

Nor can one take refuge from the logical and practical answer to Nelson’s law namely that the input from outside is exactly what natural selection is all about. In fact the entropy in the genome can be shown to be linked to the correlation between the genome and the environment and natural selection tends to increase this correlation, thus reducing the entropy.

Simply arguing that it looks like a machine and that it requires input from the outside does not help eliminate evolutionary processes as ‘designers’. In fact it strengthens the proposed evolutionary mechanisms.

Nelson’s law is not much different from an appeal to entropy, with the same fallacies. Its ‘prediction’ that complex machines require intelligent design is meaningless when intelligent design cannot exclude natural processes as its designer.

Comment #2809

Posted by Pim van Meurs on May 25, 2004 11:12 AM (e)

charlie wrote:

I don’t represent the “ID movement”, I don’t speak for the “ID movement”, I don’t defend anyone’s views but my own. I don’t know Bill Dembski, I’ve never read his books and I could care less what he thinks. I speak on my own behalf, and I defend what *I* say.

That is all nice, but this thread is *not* about Charlie; it is about the confusions exhibited by ID proponents when it comes to entropy and evolution. If you want to be included, fine, and I appreciate that your comments help make my point that ID proponents such as Dembski and Jerry are ‘misguided’.

Comment #2820

Posted by zed on May 25, 2004 2:13 PM (e)

I was fairly amazed by Chronos’ assertion that “With every generation in homo sapien, entropy increases in the genome.”

Now my thermodynamics is a bit rusty, but wouldn’t that mean our offspring would progressively degenerate into piles of primordial goo? Isn’t the continued presence of life predicated on at the least a zero net change in “entropy”?

And does anyone actually take this seriously?

Comment #2825

Posted by Panda Bear on May 25, 2004 2:47 PM (e)

A good explanation of problems with common creationist arguments based on the SLOT can be found here or here

Comment #2826

Posted by charlie wagner on May 25, 2004 3:21 PM (e)

Pim wrote:

Nor can one take refuge from the logical and practical answer to Nelson’s law namely that the input from outside is exactly what natural selection is all about. In fact the entropy in the genome can be shown to be linked to the correlation between the genome and the environment and natural selection tends to increase this correlation, thus reducing the entropy.

With all due respect, this is not true. How do you measure the entropy in the genome and how do you demonstrate that natural selection reduces this entropy?
There is no evidence that natural selection can “design” anything. All it can do is change the frequency of already existing variation.

Its ‘prediction’ that complex machines require intelligent design is meaningless when intelligent design cannot exclude natural processes as its designer.

There is no empirical evidence, either observational or experimental that supports the idea that natural processes are capable of design.

Comment #2827

Posted by chris on May 25, 2004 3:32 PM (e)

I think I have to agree that the issue of entropy in evolution is a bit of a red herring. First of all (and most importantly), the
order of nucleic acids in the genome is not actually subject to the laws of thermodynamics. It certainly has aspects that bear a resemblance to familiar concepts in thermo but others are unfamiliar. A current major area of research in physics is to develop a framework to describe certain nonequilibrium phenomenon (of which evolution is an example) with familiar concepts of thermodynamics and statistical mechanics. I want to be clear that I’m not saying there isn’t some definition of “entropy” that applies to evolution and maybe it always increases or decreases or whatever - I think the hope is that there *is* such a thing, actually - I’m only saying that you can’t just lift thermodynamics and apply it to anything you want.

Besides which, increasing entropy is not really associated with increasing disorder at all. That’s just an analogy. There are several famous examples of situations where increasing entropy INCREASES ORDER.

Comment #2828

Posted by Town Crier on May 25, 2004 3:40 PM (e)

There is no empirical evidence, either observational or experimental that supports the idea that natural processes are capable of design.

For the love of aliens, Charlie, could you at least make the slightest effort to express your own beliefs accurately?

Numerous people have gone around the bend with you numerous times regarding the same issue.

What Charlie really means to say is “No evidence shown to me, alone or in combination with any or all other evidence, will convince me that natural selection is responsible for the diversity of organisms and structures, including all known extinct organisms and all known living organisms.”

This statement, Charlie, unlike yours, is honest and forthright.

The statement you made, highlighted above, merely begs numerous questions, such as what you mean when you use terms such as “empirical,” “evidence,” “observational,” “experimental,” “natural processes” and “design.”

Woe be unto those who would ask you to define these terms in an effort to understand how anyone could be so deluded to assert that **no evidence exists** to support evolution by natural selection.

Woe be unto those!!!

Comment #2829

Posted by Pim van Meurs on May 25, 2004 3:49 PM (e)

Charlie wrote:

There is no empirical evidence, either observational or experimental that supports the idea that natural processes are capable of design.

Of course this is an argument based on personal incredulity contradicted by the actual empirical evidence that shows how natural selection and variation (in other words natural processes) are capable of ‘design’. Charlie may surely argue that he is unfamiliar with the scientific evidence but that is hardly of any interest to me.

Charlie continues his strawman wrote:

There is no evidence that natural selection can “design” anything. All it can do is change the frequency of already existing variation.

Charlie, until you accurately represent evolutionary processes, that is, variation AND selection, your comments have to be rejected. In fact, your argument supports my claim, namely that by changing the frequency of already existing variation, natural selection does increase the correlation of the genome with the environment.

So yes, there is theoretical, empirical, observational, and experimental evidence to support these claims.

Chris wrote:

I think I have to agree that the issue of entropy in evolution is a bit of a red herring. First of all (and most importantly), the order of nucleic acids in the genome is not actually subject to the laws of thermodynamics.

Which is why I use the Shannon entropy.

chris wrote:

There are several famous examples of situations where increasing entropy INCREASES ORDER.

I would be interested in some references to these famous examples. But the discussion is not about order but rather about information/complexity and the claims made by ID proponents as to the limitations of natural processes to increase information/complexity.

Comment #2831

Posted by Ben on May 25, 2004 4:21 PM (e)

Thank you, Town Crier. I was just about to slash my wrists. Christ I’m sick of this “design” crap.

Comment #2833

Posted by Jim Harrison on May 25, 2004 4:55 PM (e)

Unless the designer of an organism can suspend the laws of nature, he or she remains subject to the Second Law. If naturally evolved organisms are impossible because of thermodynamic considerations, designed organisms are no less impossible.

Comment #2834

Posted by Reed A. Cartwright on May 25, 2004 5:06 PM (e)

There is much confusion over the claim that, in Shannon information theory, entropy/uncertainty is a measure of information. However, in reality “information” is defined as “a measure of the decrease of uncertainty at a receiver.” (See here and here.)

Imagine that we are in communication and that we have agreed on an alphabet. Before I send you a bunch of characters, you are uncertain (Hbefore) as to what I’m about to send. After you receive a character, your uncertainty goes down (to Hafter). Hafter is never zero because of noise in the communication system. Your decrease in uncertainty is the information (R) that you gain.

Since Hbefore and Hafter are state functions, this makes R a function of state. It allows you to lose information (it’s called forgetting). You can put information into a computer and then remove it in a cycle.

Many of the statements in the early literature assumed a noiseless channel, so the uncertainty after receipt is zero (Hafter=0). This leads to the SPECIAL CASE where R = Hbefore. But Hbefore is NOT “the uncertainty”, it is the uncertainty of the receiver BEFORE RECEIVING THE MESSAGE.
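
As a small numerical illustration of this point (the probability distributions below are assumed, not taken from any real channel): uncertainty before receipt, uncertainty after receipt over a noisy channel, and the information R as their difference; in the noiseless special case Hafter is zero and R collapses to Hbefore.

    import math

    def uncertainty(probs):
        """Shannon uncertainty H = -sum p log2 p, in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Before receiving a symbol: four equally likely possibilities (assumed).
    h_before = uncertainty([0.25, 0.25, 0.25, 0.25])      # 2 bits

    # After receipt over a noisy channel: mostly sure, but some residual doubt (assumed).
    h_after = uncertainty([0.94, 0.02, 0.02, 0.02])       # ~0.42 bits

    information = h_before - h_after       # R = Hbefore - Hafter
    print(h_before, h_after, information)

    # Noiseless special case: Hafter = 0, so R = Hbefore.
    print(h_before - uncertainty([1.0, 0.0, 0.0, 0.0]))   # 2 bits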

Comment #2835

Posted by Art on May 25, 2004 5:08 PM (e)

About “Nelson’s Law”:

From the URL (and its “destinations”) - “Life can be described as organization and Nelson’s Law states that “things do not organize themselves”. The evolution of life involves in increase in organization, a decrease in logical entropy. Nelson’s law forbids this. Things cannot organize themselves without input from outside. “

I don’t know what “Nelson’s Law” really is, but Charlie (or anyone else reading this) can refute the characterization seen in this thread in their own kitchen. Pour some salad oil in a bottle, add some water (or vinegar), and mix thoroughly. Then let the mixture (which should be the disorganized state that “Nelson’s Law” predicts will be the final state) sit - untouched, completely isolated from all influences. (Heck, we can be really anal and put it in total darkness.)

We all know what will happen - the completely disorganized mixture will spontaneously, completely of its own accord, without any input of energy, information, design, or any other influence, organize into two perfectly-separated phases. The “thing” will most definitely “organize itself”. It’ll happen each and every time, without fail.

What’s really neat is that the same chemical principles underlie the majority (IMO, at least) of organization in biology.

Comment #2839

Posted by Reed A. Cartwright on May 25, 2004 5:25 PM (e)

Art,

Nelson’s Law is a tautology that Charlie came up with and named after his middle name.

Comment #2840

Posted by charlie wagner on May 25, 2004 5:34 PM (e)

Art wrote:

We all know what will happen - the completely disorganized mixture will spontaneously, completely of its own accord, without any input of energy, information, design, or any other influence, organize into two perfectly-separated phases. The “thing” will most definitely “organize itself”

I don’t expect that you’ve read *all* of the messages I wrote in talk.origins, but if you had, you would have seen that I carefully explained the difference between “order” and “organization”. These terms are sometimes interchanged and some confusion occurs, which is why I carefully defined what I meant. Your example with the oil and vinegar is an example of an increase in order, which can and does occur naturally, as you point out, without intelligent intervention. In this case, order is defined as “a condition of logical or comprehensible arrangement among the separate elements of a group”. There are many such examples, from ice crystals forming to the separation of immiscible liquids by density.
But organization is another matter. I define it as “a system made up of elements with varied functions that contribute to the whole and to collective functions of the system”.

Here are two examples of my views:

“The only difference I see is between living systems and non-living
systems. Non-living systems do not adapt means to ends, they do not
adapt structure and process to function and they do not self-organize.
And one must be careful not to confuse organization with order.
There’s a lot of talk about ordered systems in the non-living world,
snowflakes, tornadoes, etc. but this is not the issue. Living systems
are beyond order, which is simply a condition of logical or
comprehensible arrangement among the separate elements of a group.
Like putting files in alphabetical order or using a seive to separate
items by size. Organization is a much different structure in which
something is made up of elements with varied functions that contribute
to the whole and to collective functions, such as exist in living
organisms. Ordered systems can result from non-intelligent processes,
as has been seen many times and cited by the examples given. But
organized systems require intelligent guidance. They need to be put
together with intent and their assembly requires insight. They need to
be the product of intelligence because it is necessary to determine if
they are functioning properly and that can only be acheived by
insight. Since living systems display organization, they display means
adapted to ends and structures and processes assembled to perform
specific functions, it becomes self-evident that they are the product
of a higher intelligence.”

And,

“a mousetrap has a quality
called organization, which is much different from complexity or order.
Each part of the mousetrap, the platform, the holding bar, the spring,
the hammer and the catch each have specific functions. And each of
these functions are organized in such a way that they support the other functions and the
overall function of the mousetrap, which is to catch mice. The
function of the platform is to hold the parts, but it’s there
ultimately to facilitate the process of mouse catching. The function
of the spring is to exert a force on the hammer, but it’s ultimate
goal is to enable the process of mouse catching. All of the parts have
functions that not only support the other functions, but ultimately
support the overall function of the device. This type of organization
is not obtainable without insight, and insight always requires
intelligence. There is no way that these parts could be assembled in
such a manner without insight.
A mousetrap is a simple machine, made up of several structures and
processes and exists for a purpose. The construction of the mousetrap
was initiated with intent, and fashioned for a purpose. Living
organisms are similarly machines, with structures and processes that
work together to create a function. In fact, all complex, highly
organized machines in which means are adapted to ends are the product
of intelligent design. The important point is that the adaption of
means to ends, the adaption of structure and process to function
requires insight.
A moustrap is unevolvable without intelligent input, not because you can’t take it
apart without it losing it’s function, it’s unevolvable because you
can’t put it together in the first place using only random,
non-directed, accidental occurrences. The selection of the parts, the
configuration in which they’re aligned, the assembly into one unit all
require intelligent decisions at every step of the way. Similarly,
living organisms show the same characteristics. It’s not that you
can’t remove parts and lose total function, it’s that you can’t
explain why these particular parts were selected, why they’re
integrated together in just such a way and how they were assembled
from raw materials without invoking an intelligent agent.”

Comment #2842

Posted by charlie wagner on May 25, 2004 5:41 PM (e)

Reed wrote:

Nelson’s Law is a tautology that Charlie came up with and named after his middle name.

Go to my website and scroll down to the very bottom, right corner where you will find my middle name. It is not Nelson. Marshall Nelson is my pen name. I did, in fact, name the Law after myself. And Nelson’s Law is NOT a tautology, it is a falsifiable scientific law.

Comment #2843

Posted by Town Crier on May 25, 2004 5:57 PM (e)

Charlie said:

Organization is a much different structure in which
something is made up of elements with varied functions that contribute
to the whole and to collective functions

Ah, such a clear and useful definition of “organization”! Come to think of it, Charlie, you never were able to explain why Great White Wonder’s swimming hole didn’t satisfy your definition of “organization.” Would you like to take a stab at that now?

Or you could simply endorse what I stated in my earlier post and spare us the backpedaling.

Comment #2846

Posted by Reed A. Cartwright on May 25, 2004 6:12 PM (e)

Charlie wrote:

Go to my website and scroll down to the very bottom, right corner where you will find my middle name. It is not Nelson. Marshall Nelson is my pen name. I did, in fact, name the Law after myself.

My bad, when looking on T.O. I thought I saw you say that it was after your middle name.

And Nelson’s Law is NOT a tautology, it is a falsifiable scientific law.

It is so a tautology: (to paraphrase your logic) “Everything that we’ve observed made by mankind did not arise without intelligent input.”

Your use of “falsifiability” is also flawed. “Falsifiability” does not make something true, only testable. Only after repeatedly failing to falsify a “falsifiable” claim is it considered valuable. I have yet to see “Nelson’s Law” tested on biological entities. Therefore, your attempts to say “Nelson’s Law prevents X” are invalid.

Comment #2847

Posted by charlie wagner on May 25, 2004 6:23 PM (e)

Reed wrote:

It is so a tautology: (to paraphrase your logic) “Everything that we’ve observed made by mankind did not arise without intelligent input.”

Reed, that’s an incorrect characterization of my “logic”. Perhaps it would be better to just quote me, rather than trying to paraphrase my logic.
I said:

Nelson’s law is not an abstract conceptualization, it has clear and easily
observable qualities, chief among which is the adaptation of means to
ends. It also includes the relatedness of structure and function and
the adaptation of structures and processes for the purpose of
accomplishing the function. All of the aspects of a refrigerator meet
these criteria, The physical construction of the cabinet, compressor,
heat exchanger, etc. exist for the purpose of carrying out the cooling
function. The process of compression and expansion of gases in the
refrigerator are adapted to the function of cooling as well. All of
these aspects are assembled in such a way as to work together, each
part and process performing a specific function that contributes to
the overall goal of cooling. All known entities that we have observed,
that have the above mention qualities, are the products of intelligent
design. Living organisms also have these qualities and possesss the
same inherent properties as a refrigerator, or any other functional
machine. My conclusion, therefore, is that they also were designed for
a purpose. Nelson’s Law clearly states that entities of this nature,
fitting this description, and having these properties and qualities,
do not create themselves without the benefit of intelligent
intervention.

Comment #2851

Posted by Town Crier on May 25, 2004 6:40 PM (e)

Nelson’s law is not an abstract conceptualization, it has clear and easily
observable qualities, chief among which is the adaptation of means to
ends. It also includes the relatedness of structure and function and
the adaptation of structures and processes for the purpose of
accomplishing the function.

Charlie, you are clearly refusing to address the fact that your definition of organization means that swimming holes and beaches are intelligently designed (even those without tire swings or showers).

Your refusal to answer can only be taken as an admission that your theory is, at best, half-baked.

Comment #2857

Posted by Reed A. Cartwright on May 25, 2004 7:29 PM (e)

Charlie wrote:

Reed, that’s an incorrect characterization of my “logic”. Perhaps it would be better to just quote me, rather than trying to paraphrase my logic.

Here is what you state in your quote:

All known entities that we have observed,
that have the above mention qualities, are the products of intelligent
design.

Your “all know entities” that you cite are human creations. Thus “Everything that we’ve observed made by mankind did not arise without intelligent input.” is an accurate paraphrase of your logic.

Comment #2859

Posted by Nomen Nescio on May 25, 2004 8:11 PM (e)

does it seem to anyone else that naming a made-up “law” after one’s own self-assigned nickname is an even more pretentious social faux pas than simply naming it after oneself?

not, of course, that that alone is any argument against the “law” in question. nonetheless, bad taste, n’est pas?

as for the “law” itself, i cannot say with certainty that it is flawed - frankly, it’s too vaguely and verbosely stated for me to make a whole lot of sense of it. however, it does occur to me that - if it is to be of any use - it really ought to be able to determine whether or not the Oklo reactors were intelligently designed. could anybody with a better sense of what Charlie’s trying to say perhaps take a stab at working out if it does or not?

Comment #2860

Posted by Art on May 25, 2004 8:22 PM (e)

Charlie,

Thanks for the clarification (I think). But I don’t buy your distinction between “order” and “organization”. At least in its entirety.

But even if I did - I think it’s easy to see how storms (tornadoes, hurricanes, etc.) are organized according to some of your rules (those that do not reduce your terminology to tautology), and I could probably dissect the oil-water system into an organized, as opposed to ordered, one as well.

So, either way, I don’t see how “Nelson’s Law” is of any particular use, except as a vehicle to wander into semantic hair-splitting. It’s much easier (and more correct) to accept that “SLOT’s”, regardless of their derivation, simply cannot rule out evolution.

Comment #2862

Posted by Jerry Don Bauer on May 25, 2004 9:09 PM (e)

LOL … This has got to be the silliest thing I’ve ever seen posted on the Net on “thermodynamics”– To start with it’s genetics and has nothing to do with anything I have ever discussed concerning devolution of the genome.

What’s even more telling is the rest of the forum is like cool, go man, go …..we are really doing some science here. PvM hasn’t done a thing here with this nonsensical ‘math’ and I’ve seen him laughed out of other forums with this same stuff. Observe:

*******The approach is very simple first assume a genome with site i which has the following probabilities for the four nucleotides involved******

I did not assume ANY genome. I introduced a study by evolutionary biologists where all deleterious mutations were already identified. There is no probability involved with that. Sheeeze.

******One can show that the entropy for this site can be calculated to be******

LOL … OK, what is it, PvM? People he didn’t calculate any entropy. He threw out an empty formula with no numbers in it and thinks he has calculated entropy. And everyone else: Hey good job, man. I understand this now.

*****And the entropy tendency or information can be defined as*****

You didn’t define anything and again you didn’t calculate anything because there are no numbers in your formula.

******Now sum over all sites i and you find that the complexity or information is given by******

You cannot sum over anything! You don’t have any figures in this formula to sum over.

Of course, he then graphs all of this stuff he never calculated to start with and he does this before he doesn’t calculate anything.

LOL … You people don’t know he is just cutting and pasting stuff he doesn’t even understand?

Comment #2864

Posted by Jerry Don Bauer on May 25, 2004 9:37 PM (e)

******I was fairly amazed by Chronos’ assertion that “With every generation in homo sapien, entropy increases in the genome.”*******

Don’t believe anything Francis (PvM, the very confused YECer) posts. He’ll have you so confused you won’t know whether you’re coming or going because he is totally confused.

This came not from me (Chronos) but from a peer reviewed study submitted in Nature by two well respected evolutionary biologists name Eyre-Walker of Sussex University and Keightley of EdinBurgh.

http://homepages.ed.ac.uk/eang33/

They found in this study that in man’s evolutionary walk from Chimp, the genome devolved at the rate of 1.6 deleterious mutations per generation AFTER natural selection had weeded out other of these mutations. This is the number that accumulate in the genome. But normally when we discuss this study, the figures are rounded off to two.

Here is the abstract:

High genomic deleterious mutation rates in hominids.

Eyre-Walker A, Keightley PD.

Centre for the Study of Evolution and School of Biological Sciences, University of Sussex, Brighton, UK. A.C.Eyre-Walker@susx.ac.uk

It has been suggested that humans may suffer a high genomic deleterious mutation rate. Here we test this hypothesis by applying a variant of a molecular approach to estimate the deleterious mutation rate in hominids from the level of selective constraint in DNA sequences. Under conservative assumptions, we estimate that an average of 4.2 amino-acid-altering mutations per diploid per generation have occurred in the human lineage since humans separated from chimpanzees. Of these mutations, we estimate that at least 38% have been eliminated by natural selection, indicating that there have been more than 1.6 new deleterious mutations per diploid genome per generation. Thus, the deleterious mutation rate specific to protein-coding sequences alone is close to the upper limit tolerable by a species such as humans that has a low reproductive rate, indicating that the effects of deleterious mutations may have combined synergistically. Furthermore, the level of selective constraint in hominid protein-coding sequences is atypically low. A large number of slightly deleterious mutations may therefore have become fixed in hominid lineages.

http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=9950425

Those who don’t care to read through the science can find a good read here:

http://www.open2.net/truthwillout/evolution/article/evolution_walker.htm

Comment #2865

Posted by Pim van Meurs on May 25, 2004 10:07 PM (e)

It should be obvious by now that Jerry is ‘in over his head’.

jerry wrote:

I did not assume ANY genome. I introduced a study by evolutionary biologists where all deleterious mutations were already identified. There is no probability involved with that. Sheeeze.

I was responding to your ‘calculations’ of entropy which you concluded were positive and thus showed a tendency to disorder. This incorrect application of mathematics is what I am addressing here. And contrary to your claims that

P1: With every generation in homo sapien, entropy increases in the genome.

P2: Complex macroevolution requires that the genome have a lower entropy over time through the generations.

Therefore, complex macroevolution did not occur

I argued that not only have you failed to provide support for P1 and P2, but I also showed that mutation and selection can actually be shown to decrease the entropy in the genome.

Not only can it thus be shown that no support exists for Jerry’s P1 and P2 claims, but also that the actual evidence contradicts Jerry’s conclusions.

All because Jerry seems to be unable to apply the correct concepts.

Jerry wrote:

What’s even more telling is the rest of the forum is like cool, go man, go …..we are really doing some science here. PvM hasn’t done a thing here with this nonsensical ‘math’ and I’ve seen him laughed out of other forums with this same stuff. Observe:

I understand that your only response to solid mathematics is ridicule. But we often ridicule that which we do not comprehend. In fact, not only is this math not nonsensical, but it is also supported by actual examples (did Jerry miss the graphs?) which show the evolution of entropy under the influence of mutation and selection.

Of course Jerry seems easily confused by abstract mathematics

You didn’t define anything and again you didn’t calculate anything because there are no numbers in your formula.

It is trivially simple to replace the parameters in these formulas with actual values which is what was done to create the graphs.

Jerry ironically comments: LOL … You people don’t know he is just cutting and pasting stuff he doesn’t even understand?

So far the only person who does not seem to understand is you, Jerry. And that by your own comments. Remember, you are the person who confused the positive value of entropy with its tendency. People at ISCID are still rolling their eyes at that one.

Jerry then quotes an article by Eyre-Walker without doing ANY calculations to show that entropy has in fact increased. If Jerry’s argument is not about entropy but about the rate of deleterious mutations, then why all his nonsensical postings introducing these concepts?
Are you saying that when you spammed this board with your nonsensical articles about entropy, you were not really arguing entropy at all?

Can we say back pedaling Jerry?

So, Jerry, what part of my introduction to Shannon entropy confused you?

Or shall we consult a scientist on these matters rather than rely on your ‘claims’? Let’s see

How do genetic systems gain information by evolutionary processes? Answering this question precisely requires a robust, quantitative measure of information. Fortunately, fifty years ago Claude Shannon defined information as a decrease in the uncertainty of a receiver. For molecular systems, uncertainty is closely related to entropy and hence has clear connections to the Second Law of Thermodynamics. These aspects of information theory have allowed the development of a straightforward and practical method of measuring information in genetic control systems. Here this method is used to observe information gain in the binding sites for an artificial ‘protein’ in a computer simulation of evolution. The simulation begins with zero information and, as in naturally occurring genetic systems, the information measured in the fully evolved binding sites is close to that needed to locate the sites in the genome. The transition is rapid, demonstrating that information gain can occur by punctuated equilibrium.

Tom Schneider, “Evolution of Biological Information”

To explain the implications for Jerry’s sake: Schneider has shown how, under the simple processes of mutation and selection, the information in the genome can increase. Since an information increase means an entropy decrease, it should be obvious that evolutionary processes can and do in fact decrease entropy and increase information.

But let’s continue with how real scientists address these issues of entropy and the genome

To make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent information-theoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a sequence stores about its environment. We investigate the evolution of genomic complexity in populations of digital organisms and monitor in detail the evolutionary transitions that increase complexity. We show that, because natural selection forces genomes to behave as a natural “Maxwell Demon,” within a fixed environment, genomic complexity is forced to increase.

Adami, Ofria, and Collier, “Evolution of Biological Complexity”

Jerry’s postings and arguments are no match for reality.

Comment #2866

Posted by Pim van Meurs on May 25, 2004 10:12 PM (e)

As far as Charlie is concerned, his appeal to personal incredulity, combined with a lack of any supporting argument, makes his Nelson’s law quite meaningless and forces him to ignore the empirical data, experimental data, and theoretical arguments that disprove his claims. Until Charlie can show that he can correctly represent evolutionary theory, his claims have to be rejected due to their strawman nature and their unproven, in fact fallacious, assumptions.

Comment #2868

Posted by Pim van Meurs on May 25, 2004 10:37 PM (e)

Me: Note that Jerry has yet to present any entropy calculations that show that the entropy of the human genome increases (P1), not to mention any evidence to support P2.

And this is not a trivial calculation, since Eyre-Walker looked only at slightly deleterious mutations, thus ignoring beneficial mutations.

And as far as P2 is concerned, it seems that these findings fall within the realm of evolutionary explanations. Perhaps P2 was just a wishful thinking argument?

A bit of quote mining and a lot of nonsensical and irrelevant claims about entropy, all to show that there is a significant slightly deleterious mutational load in humans.

But what about the entropy Jerry? And what about the fact that the study only considered slightly deleterious mutations?

Shoddy Jerry, shoddy.

Not surprisingly Eyre-Walker presents the answer to this problem

Work has also started to elucidate the role of adaptive evolution at the DNA sequence level. Several studies have recently estimated that a substantial fraction of the amino acid substitutions in higher primates (30) and Drosophila (24, 31, 32) are a consequence of adaptive evolution rather than random genetic drift. However, inferring the number of advantageous mutations is difficult because the number of substitutions is a function of both the mutation rate to advantageous mutations and the strength of selection favoring them. We do not currently have independent estimates of either of these quantities.

Estimating the distribution of fitness effects from DNA sequence data: Implications for the molecular clock, Gwenaël Piganeau and Adam Eyre-Walker PNAS 2003 vol. 100 no. 18 10335-10340

Comment #2869

Posted by Pim van Meurs on May 25, 2004 10:49 PM (e)

Let’s add to this the following data

Fitness effects of advantageous mutations in evolving Escherichia coli populations
Marianne Imhof and Christian Schlötterer

The central role of beneficial mutations for adaptive processes in natural populations is well established. Thus, there has been a long-standing interest to study the nature of beneficial mutations. Their low frequency, however, has made this class of mutations almost inaccessible for systematic studies. In the absence of experimental data, the distribution of the fitness effects of beneficial mutations was assumed to resemble that of deleterious mutations. For an experimental proof of this assumption, we used a novel marker system to trace adaptive events in an evolving Escherichia coli culture and to determine the selective advantage of those beneficial mutations. Ten parallel cultures were propagated for about 1,000 generations by serial transfer, and 66 adaptive events were identified. From this data set, we estimate the rate of beneficial mutations to be 4 × 10⁻⁹ per cell and generation. Consistent with an exponential distribution of the fitness effects, we observed a large fraction of advantageous mutations with a small effect and only few with large effect. The mean selection coefficient of advantageous mutations in our experiment was 0.02.

and

The Distribution of Fitness Effects Among Beneficial Mutations, H. Allen Orr

We know little about the distribution of fitness effects among new beneficial mutations, a problem that partly reflects the rarity of these changes. Surprisingly, though, population genetic theory allows us to predict what this distribution should look like under fairly general assumptions. Using extreme value theory, I derive this distribution and show that it has two unexpected properties. First, the distribution of beneficial fitness effects at a gene is exponential. Second, the distribution of beneficial effects at a gene has the same mean regardless of the fitness of the present wild-type allele. Adaptation from new mutations is thus characterized by a kind of invariance: natural selection chooses from the same spectrum of beneficial effects at a locus independent of the fitness rank of the present wild type. I show that these findings are reasonably robust to deviations from several assumptions. I further show that one can back calculate the mean size of new beneficial mutations from the observed mean size of fixed beneficial mutations.

Comment #2870

Posted by Jerry Don Bauer on May 25, 2004 10:51 PM (e)

*****With all due respect, this is not true. How do you measure the entropy in the genome and how do you demonstrate that natural selection reduces this entropy?*****

The thread would do well to stay away from vague concepts such as Nelson’s law and go with the well proven work of Boltzmann and Feynman. Of course, Boltzmann was the first to term the W in his S = K log W as “the opposite of information” but Feynman refined this into formula without Boltzmann’s constant messing with Joules and degrees Kelvin which really just clutters everything.

Feynman honed this down: “So we now have to talk about what we mean by disorder and what we mean by order…. Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure “disorder” by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the “disorder” is less.”

http://www.panspermia.org/seconlaw.htm

Feynman also gave us the formula with which we could calculate this: S = log2(W) where S is entropy and W is “those numbers of ways” or the total possible microstates of any given system.

We can find W, because the researchers tell us is there are about 100 million possibilities that could mutate, so four mutations is not that big a number, relatively speaking. (please read the BBC article I previously posted for this figure) But this will be positive entropy, thus we can surmise that entropy has risen in each, or at least most, generations for the last six million years and there is no evidence at all to suggest it hasn’t been this way throughout the entire 2 billion year process.

S = log2W, S = log2(100,000,000), S = 26.5754247590989, therefore S is positive showing a positive tendency of disorder as we would expect.

But this is only statistical entropy and if we are to figure reactional entropy, we will have to calculate actual deleterious mutations from generation to generation.

We can view the deleterious mutations as actual entropy because, in this case, this is the actual disorganization. Eyre-Walker tells us that the human genome is estimated to carry 1000 negative mutation, so let’s get that entropy S = log2 W = S = log2 1000 S = 9.96578428466209 –

Now let’s calculate the entropy after two more genes mutate S = log2 W S = log2 1002 S = 9.96866679319521 –

It is here that we do that subtraction you so badly wanted to do: deltaS, the actual change in entropy is: deltaS = S2 - S1 – deltaS = 9.96866679319521 (-) 9.96578428466209 = deltaS = 0.00288250853312

Questions, comments, or just ignorant trolls, its up to you guys.

Comment #2871

Posted by Jerry Don Bauer on May 25, 2004 11:32 PM (e)

NOTE: I’m going to answer exactly one more of your ignorant posts, Francis. So after this one, you sum it all together and know that you had your chance to debate this. You somehow seem to have become one of my groupies. Go troll Mike Gene or something.

*****I was responding to your ‘calculations’ of entropy which you concluded were positive and thus showed a tendency to disorder. This incorrect application of mathematics is what I am addressing here. And contrary to your claims that
P1: With every generation in homo sapien, entropy increases in the genome.
P2: Complex macroevolution requires that the genome have a lower entropy over time through the generations.
Therefore, complex macroevolution did not occur*****

No, you are not answering ‘my calculations’ or my ‘claims.’ I’m sending you to papers by reputable scientists. You will not read them even when I post them verbatim for you and you always go back to Jerry claims this and that. Will you pull your head out of your arse and know that its not me saying this stuff but scientific studies? These calculations are just math describing them.

*****I understand that your only response to solid mathematics is ridicule. But we often ridicule that which we do not comprehend. In fact not only is this math not nonsensical but it is also supported by actual examples (did Jerry miss the graphs) which show the evolution of entropy under the influence of mutation and selection.******

LOL … Solid mathematics? Is this a joke? You calculated nothing. This is the stupidest forum I’ve ever been in as no one bothered to muse … well gee, what were the conclusions? Um..is there actually any figures in those formulas? What were the entropic conclusions and the sums? Nah ….Its just, well gee whiz, this seems anti–Id. Let’s run with it. And look at all the PhDs listed as the sponsors. Unbelievable. You people you are just deceiving yourselves. The American public is nowhere near as stupid as you have become.

******It is trivially simple to replace the parameters in these formulas with actual values which is what was done to create the graphs.*****

Oh. This should be good. Let’s see you do it. And how did you create the graph without trivially using actual values? ;0)

The rest of your post is typical Francis the YEC/atheist turned I don’t know what the crap I am drivel who spouts Bible scripture while espousing atheism in the same posts conundrum.

You get one more post, troll. Make it good, dude. (or dudette, you post under female names as well)

Comment #2872

Posted by Steve on May 25, 2004 11:45 PM (e)

Slowly I’m learning to come here and read the posted articles, but not the comment sections. Reading 7000 words of arguing with IDers who are using information theory and thermodynamics like a monkey would use an oscilloscope is just not valuable for me. I still read them sometimes out of a macabre fascination, though.

The articles are damned good, btw, kudos to the contributors. I hope a cool article in the future will explain to me how IDers fail to understand what it means that evolution-based research results in more papers in PNAS, Science, and Nature alone every week than I can even read the abstracts of, while IDers never get any research published. How can they think so much about the topic, and not realize their ‘science’ doesn’t produce publishable results? That’s really more of a psychology topic than a biological one, though.

Comment #2874

Posted by steve on May 25, 2004 11:58 PM (e)

I have a request, too–articles about antibiotic resistence. For some eventual research on the physics of Syntaxin mutants, I’ve been hip-deep in e. coli DH5a, plasmids, pcr, sequencing, etc., and there’s really some very cool stuff involved in growing bacteria which is resistant to a specific antibiotics like kanamycin, in kanamycin, to prevent contamination. It’s a subject laypeople might enjoy hearing about. Just a few months ago I had no idea about the subtle evolutionary bits involved.

Comment #2876

Posted by Jerry Don Bauer on May 26, 2004 1:28 AM (e)

****I have a request, too—articles about antibiotic resistance.*****

Here’s the only article you will ever need on it. Certain organisms in any species will always be more susceptible than others to an antibiotic and some will be resistant to it. As more antibiotics are used, the ones that are susceptible are killed off and the ones who live will be the ones who have offspring.

These offspring are then also resistant to that antibiotic.

Others might take a book to explain this. But there you go and that’s the truth.

Comment #2879

Posted by Jack Krebs on May 26, 2004 3:47 AM (e)

Steve writes, “Slowly I’m learning to come here and read the posted articles, but not the comment sections. Reading 7000 words of arguing with IDers who are using information theory and thermodynamics like a monkey would use an oscilloscope is just not valuable for me. I still read them sometimes out of a macabre fascination, though.”

I hope that the quality of the comments section improves, Steve. Don’t give up.

Comment #2893

Posted by Navy Davy on May 26, 2004 9:25 AM (e)

The reason the comment section often degenerates is actually a STRUCTURAL failing on the part of Panda’s Thumb. Observe the dynamic:

1. Poster tries to make a point, imbued with scorn and ad hominem, which often specifically NAMES Jerry Don Bauer as the subject;

2. But poster refuses to engage JDB (ask civil questions, respond to civil questions posed);

3. JDB responds, fending off 5-10 persons sniping from the sidelines, sometimes with ad hominem in response and sometimes over-enthusiastically;

4. Thread peters out. Few are satisfied.

Of course, I have offered to mediate a civilized debate, where advocates actually get to TEST their propositions, and we get to learn something. JDB has pledged to participate and abide by standard debating rules. Kudos, Jerry.

But, alas, the “higher-ups” at PT cannot spare a solitary thread for such an orderly debate.

I thought scientists were, you know, into (1) evidence, (2) testing, (3) prediction, (4) logic.

I guess not. Let the brawlin’ continue!

Cheers,

Navy Davy

Comment #2894

Posted by Pim van Meurs on May 26, 2004 9:34 AM (e)

Jerry continues to exemplify his lack of understanding when he states: therefore S is positive showing a positive tendency of disorder as we would expect.

This whopper was debunked in depth on ISCID. That Jerry still repeats this silly notion almost verbatim is quite fascinating.

Jerry wrote:

The rest of your post is typical Francis the YEC/atheist turned I don’t know what the crap I am drivel who spouts Bible scripture while espousing atheism in the same posts conundrum.

Yes Jerry, ad hominem is all that you have left. Thank you for showing your ‘true colors’. And while we all hope that the quality of Jerry’s postings would improve, the opposite seems to have happened.

If Jerry cannot even understand the simple mathematical foundations for Shannon entropy as they apply to the genome, it does not come as a surprise that he fails to comprehend how one can calculate the actual entropy from abstract formulas.

Let me know what part of the derivation of Shannon entropy or the calculations based upon these formulas confuse you Jerry.

As I have shown, real scientists have applied these concepts and shown how entropy in the genome could decrease under the workings of variation and selection.

And then there is Jerry who cannot even calculate the entropy in the genome to show support for P1 (entropy in the human genome always increases).

Comment #2895

Posted by Jack Krebs on May 26, 2004 9:34 AM (e)

How about taking your idea to ARN, where you or Jerry would be free to start a post and propose details of this debate?

Comment #2896

Posted by Andrea Bottaro on May 26, 2004 9:44 AM (e)

Here’s the only article you will ever need on it. Certain organisms in any species will always be more susceptible than others to an antibiotic and some will be resistant to it. As more antibiotics are used, the ones that are susceptible are killed off and the ones who live will be the ones who have offspring.

These offspring are then also resistant to that antibiotic.

Others might take a book to explain this. But there you go and that’s the truth.

Ah, but that’s just a statement of fact, and explains little or nothing. The real questions are, why will certain individuals always be more susceptible, or resistant, to antibiotics than others? And why is it so hard to come up with an antibiotic that kills all individuals, preventing the onset of resistance? And why does taking suboptimal doses of antibiotics increase the chances of inducing resistance?

Comment #2897

Posted by Pim van Meurs on May 26, 2004 9:47 AM (e)

Navy, why not take Jack Krebs’ idea and implement such a discussion on ARN, which would be an excellent place for Jerry to discuss his ideas.

To suggest that I refuse to engage Jerry, or that Jerry is simply asking questions, can easily be shown to be wrong. Jerry started spamming other threads with his nonsensical ideas about entropy. And now we find out that it was all about detrimental mutations and not about entropy at all.

I guess we will never see Jerry support his claims. His application of mathematics does not seem to extend beyond

therefore S is positive showing a positive tendency of disorder as we would expect.

Of course S is positive; it is always positive. That Jerry confuses a positive value with a tendency of disorder is just totally nonsensical.

Jerry still fails to apply the correct formulas in a useful manner. Subtracting nonsense from nonsense will lead to a nonsensical conclusion. The danger of using mathematics without understanding how to apply it correctly is quite real in discussions of the second law of thermodynamics, mostly because people confuse a mutation with an increase in entropy. In fact when a mutation happens and gets fixed in a population, entropy will decrease. Even if the mutation is a slightly detrimental mutation.

Surprised? I bet Jerry is. But as I said these are nontrivial concepts and often somewhat counterintuitive. So let me try to explain.

Originally let’s assume that the nucleotide at the location of the mutation has an equal probability of being A, C, T or G. This means that this location has maximum entropy. After the mutation if the probability of the nucleotide becomes 1 for one of the 4 bases and zero for the others because the mutation becomes fixed in the genome, the entropy drops to zero. This is not surprising when one understands the concept of Shannon entropy. Before the mutation and fixation there was maximum confusion as to which nucleotide would be found at the location or in other words maximum disorder. After the mutation got fixated, there is full predictability of the basepair at this location or maximum order.

Don’t believe me? Do the calculations using the correct formulas; I provide the outline for how to calculate Shannon entropy.
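For concreteness, here is a minimal sketch of that per-site calculation (Python is just a convenient choice; the before/after probabilities are simply the ones assumed in the example above, and a term with p_i = 0 is taken to contribute nothing, as usual):

from math import log2

def shannon_entropy(probs):
    # H = -sum_i p_i log2 p_i, with terms where p_i = 0 contributing nothing
    return -sum(p * log2(p) for p in probs if p > 0)

before = [0.25, 0.25, 0.25, 0.25]   # site equally likely to be A, C, T or G
after = [1.0, 0.0, 0.0, 0.0]        # one nucleotide fixed at this site

print(shannon_entropy(before))   # 2.0 bits: maximum uncertainty over four symbols
print(shannon_entropy(after))    # 0.0 bits: full predictability once fixed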

Comment #2902

Posted by Nate Barrister on May 26, 2004 11:36 AM (e)

This was good.

Comment #2908

Posted by Whistle Blower on May 26, 2004 1:30 PM (e)

Navy Davy, an alleged lawyer, casts his judgment as follows:

I thought scientists were, you know, into (1) evidence, (2) testing, (3) prediction, (4) logic.

I guess not.

For the record, Navy Davy, where do you find an indication in Pim’s posts that he is uninterested in any of the four categories you listed? I’m particularly interested in where you find problems with the evidence or logic Pim used.

Your honest response will be greatly appreciated and recorded for future use, as appropriate.

Finally, the record will show that your statement re:

“poster refuses to engage JDB (ask civil questions, respond to civil questions posed”

is demonstrably false as pointed out by Pim.

In the future, please try to avoid distorting the record by making false, facially inaccurate, or dishonest statements in your posts.

Cheers,
WB

Comment #2915

Posted by Navy Davy on May 26, 2004 2:42 PM (e)

Pim,

Navy, why not take Jack Krebs’ idea and implement such a discussion on ARN, which would be an excellent place for Jerry to discuss his ideas.

Not a bad idea, I might do that. But, you POSTED here an attack specifically on Jerry (not simply his ideas on entropy). I quote:

Jerry Don Bauer a.k.a. Chronos has shown some interesting confusions as to the concept of entropy…..

So, for the life of me, I cannot understand why the Evolution proponents would rather snipe from the sidelines at ID folks, rather than directly engage their ideas in a civil manner.

To me, it just seems like a wasted opportunity. Particularly, because I see a whole lotta intellectual acumen here that, if channelled properly, would really be influential and informative.

Cheers, Navy Davy

Comment #2917

Posted by Whistle Blower on May 26, 2004 3:02 PM (e)

Navy Davy attempts to defend himself, telling Pim –

you POSTED here an attack specifically on Jerry (not simply his ideas on entropy). I quote:

Jerry Don Bauer a.k.a. Chronos has shown some interesting confusions as to the concept of entropy …..

Navy Davy, please refrain from distorting and mischaracterizing the record. Let me know if you need definitions for the terms “distorting” or “mischaracterizing” and I’ll be happy to provide them to you.

In your “summary” of Pim’s post, you failed to mention that Pim has provided substantial (indeed, conclusive) evidentiary support for his conclusion that Chronos had “shown some interesting confusions.” Therefore, obviously, Pim’s statement cannot reasonably be construed as an “attack” on Chronos (aka JDBauer), at least not by a reader without substantial biases.

The record shows, Navy Davy, that you have alleged that you are free of such biases. Please let us know if, in fact, you are retracting your earlier statements (e.g., “I don’t have a dog in this fight,” etc., etc.). Some of us have expressed doubts as to the honesty of those statements, so we would not be surprised or particularly disappointed if you retracted them.

Alternatively, Navy Davy, you are welcome to state where you believe that Pim has unfairly taken Jerry Don Bauer’s words and distorted them, such as would render Pim’s conclusion about Jerry Don’s “interesting confusions” unsubstantiated.

As always, your honest and courteous responses are greatly appreciated. For the record, we note that such responses from Navy Davy are few and far between.

Cheers,
WB

Comment #2918

Posted by Navy Davy on May 26, 2004 3:24 PM (e)

Whistle Blower,

I give you a resounding, “Ho-hum.”

Comment #2921

Posted by Jerry Don Bauer on May 26, 2004 3:44 PM (e)

******Jerry continues to exemplify his lack of understanding when he states: therefore S is positive showing a positive tendency of disorder as we would expect.
This whopper was debunked in depth on ISCID. That Jerry still repeats this silly notion almost verbatim is quite fascinating.******

I didn’t just posit this, unlike you I actually calculated it. And Francis, why don’t you quit sending people to other forums hoping they won’t read all the posts? If this whopper were debunked by you before, you wouldn’t have much trouble doing it again here, would you?

*****If Jerry cannot even understand the simple mathematical foundations for Shannon entropy as they apply to the genome, it does not come as a surprise that he fails to comprehend how one can calculate the actual entropy from abstract formulas.****

Oh, I think I understand Shannon/Weaver entropy just fine. I at least understand that you have to put some math into a formula in order to get something out of it.

*******Let me know what part of the derivation of Shannon entropy or the calculations based upon these formulas confuse you Jerry.******

You didn’t do any calculations. Not one. You only threw out empty formulas that you cut and pasted hoping to confuse people. Works, don’t it. ;)

******And then there is Jerry who cannot even calculate the entropy in the genome to show support for P1 (entropy in the human genome always increases).******

But I did calculate it. Did you not read the posts? Now quit beating around the bush hoping this will go away. Refute the math or admit you cannot.

Comment #2922

Posted by Jerry Don Bauer on May 26, 2004 4:11 PM (e)

*****Originally let’s assume that the nucleotide at the location of the mutation has an equal probability of being A, C, T or G. This means that this location has maximum entropy.*****

LOL …… No it doesn’t. You think something has maximum entropy just because it exists? All things with microstates such as this have statistical entropy and Feynman would calculate it as S = log2(4) = 2. But this is not maximum entropy because if a deleterious mutation happened here, disorder can go up. Think these things through before you post them, Francis.

******After the mutation if the probability of the nucleotide becomes 1 for one of the 4 bases and zero for the others because the mutation becomes fixed in the genome, the entropy drops to zero.******

Entropy drops to zero? Calculate this for us. And haven’t we just left the second law and gone to the third?

Third law of thermodynamics: “The third law of thermodynamics states that the entropy of a pure perfect crystal is 0 at 0 K:  S(0K) = 0. At 0K the atoms in a pure perfect crystal are aligned perfectly and do not move. Moreover, there is no entropy of mixing since the crystal is pure. For a mixed crystal containing the atomic or molecular species A and B, there are many possible arrangements of A and B and there is therefore entropy associated with the arrangement of the atoms/molecules.”

Quick people, back away from the table. Francis’ human genome just froze solid!

*****Before the mutation and fixation there was maximum confusion as to which nucleotide would be found at the location or in other words maximum disorder.*****

And how do we calculate this maximum confusion?

*****After the mutation got fixated, there is full predictability of the basepair at this location or maximum order.******

You just flunked the genetics test too. You don’t understand the word ‘fixated.’ This applies only to populations, not individual genomes. Nothing is fixed in an individual because of a mutation. The reason is that this same nucleotide can always mutate again. In fact, it may even be a beneficial mutation.

Comment #2933

Posted by Pim van Meurs on May 26, 2004 5:31 PM (e)

Syntax Error: mismatched tag 'url'

Comment #2940

Posted by Jerry Don Bauer on May 26, 2004 8:18 PM (e)

What is this deal with you writing posts about me rather than to me? :0)

******Jerry in the same breath claims he understands Shannon entropy and shows confusion of said entropy with thermodynamical entropy when he states
Entropy drops to zero?

I will show in a follow-up posting how to calculate this but it is trivial since p_i log p_i will be zero for this site since either p_i is zero or log p_i is zero (p_i = 1)*******

LOL ….I don’t want formulas, I want math. Do some with real figures. But I do want to know how you can mathematically show your genome as being frozen solid. And if you show statistical entropy as zero in a system with microstates, you will have just revolutionized science, my friend. Deleterious mutations become impossible and genetic defects are just a myth of science.

You’ll never have to buy a new car again the rest of your life. Just sit your old one out in the sun and according to your math, it will magically regress into a new one. And you will have just refuted the second law of thermodynamics.

****Jerry still seems to be confused about entropy being a positive number (which it always is) and entropy having a tendency to disorder because it is positive.*****

It is not always positive. Sure, statistical entropy in a system with microstates is, and it grows larger when we are considering more microstates. But wouldn’t you expect it to be? The second law states that with any spontaneous reaction entropy will tend to increase. I’m showing this mathematically.

But statistical entropy is not reactional entropy. And in chemical reactions entropy can go down and often does.

*****As Gedanken and various others have shown, this is a meaningless and erroneous concept. That Jerry continues his claims in spite of the facts is further evidence that Jerry may have some problems with entropy, and the mathematics surrounding this concept, simple at the surface but oh so tricky in the hands of creationists.*****

LOL ….What does this mean and aren’t you a YEC? Stick to the subject, sheeze.

*****I apologize for presenting a mathematical foundation for my claims and thus confusing you with ‘empty formulas’ but no worry, not only will I walk the ‘confused people’ through how to apply Shannon entropy but it will also support what I have already shown using data from Schneider and Adami, namely that entropy decreases under the influence of mutation and selection.*****

Apology accepted. Now quit trying to dodge the subject and go back and tackle the math I gave you. Remember I gave you real figures in formulas. Not just the formulas. And how do you ever hope to defeat my math using something other than my formulas?

****Well, let me try to explain. Maximum confusion means that no nucleotide was preferred or in other words that the probability for the nucleotide being one of (A,C,T,G) is equal to 1/4. Applying the formula for Shannon entropy we find that for a uniform distribution of probabilities, entropy is maximum.*****

So you mean by this that before your genome was frozen it pretty much became delirious? And gee … I’m rather shocked to learn that the probabilities of one out of four nucleotides mutating is 1:4.

*****Jerry ignores most data that shows fixation of beneficial mutations.*****

And this information is found where in that paper? They only considered deleterious mutations. Cut and paste all of this. You’re good at that.

****we are here to educate and help out when issues of confusion arise and in a soon to be released posting I will gently walk through these straightforward, and yet to some confusing, calculations to show further support for my thesis.*****

You probably ought to write a thesis that refutes my thesis, don’t you think?

****math into a formula? Or does Jerry mean actual values? If that is the case, could Jerry apply these formulas to his example? As I have shown, if, as Eyre-Walker argues, slightly deleterious mutations become fixated in the human genome then by all measures of entropy, the entropy will decrease.*****

As they say in Arkansas, “Do what?”

*****But we already knew that disorder, entropy and information do not really say anything much about the nature of the fixated mutation. After all this should be obvious to anyone familiar with Shannon entropy.******

You don’t even know what Shannon/Weaver entropy is. You are trying to apply entropy that deals with the loss of signal and the addition of noise added into a telephone line to the genome. Oh, you water it down into some genetics you don’t understand. And remember that I was personally present when Bill Dembski, a math PhD actually examined this stuff and told you personally that nothing you are cutting and pasting is even germane to the subject. You, my friend, are as full of crap as a Christmas turkey and you spoil the truth of science for both sides.

Well, you trolled me for a debate and since you seem to be the highest intellect I can find on this forum: Your turn.

Comment #2941

Posted by Frank Schmidt on May 26, 2004 8:19 PM (e)

Jerry wrote

At 0K the atoms in a pure perfect crystal are aligned perfectly and do not move.

Um, no. At 0 deg K, atoms still have zero-point energy which translates into motion. If they were perfectly still, it would be possible for an observer to simultaneously know the position (given by the crystal lattice) and momentum (none, according to Jerry) of an atom. But that would be in violation of the uncertainty principle, so the atoms don’t have zero motion.

All this is from the long-lost recesses of my P. Chem. knowledge, so subject to correction by real physicists and physical chemists, whom I invite to do so.

Comment #2946

Posted by Jerry Don Bauer on May 26, 2004 8:48 PM (e)

******All this is from the long-lost recesses of my P. Chem. knowledge, so subject to correction by real physicists and physical chemists, whom I invite to do so.*****

If there’s any of them on this forum, I can assure you they won’t be addressing this thread.

So what do you think the third law of thermodynamics is? Do you not believe it exists? I mean you didn’t try to correct me. You just seemed to deny that there was such a critter.

Comment #2954

Posted by DS on May 27, 2004 3:53 AM (e)

From the Wikipedia:

The 3rd Law-

This states that the entropy of a system at zero absolute temperature is a well-defined constant. This is because a system at zero temperature exists in its ground state, so that its entropy is determined only by the degeneracy of the ground state.

A special case of this is systems with a unique ground state, such as crystal lattices. The entropy of these systems as defined by Nernst’s theorem is zero (since ln(1) = 0).

“The entropy of a perfect crystal lattice at absolute 0 is 0”

This is at least one formulation of the 3rd law and is the most common format found in several resources I checked.

Comment #2964

Posted by Frank Schmidt on May 27, 2004 7:08 AM (e)

However, the zero entropy state doesn’t imply zero motion, which is what Jerry stated. My old textbook points out that, for example, the entropy of the nucleus is not known when the third law is stated in this formulation. As with all thermodynamic measurements, we can look at changes in the quantities only, and assigning S=0 is a convention that isn’t always valid. (Denbigh, The Principles of Chemical Equilibrium, 2nd ed. Cambridge U. Press, 1968) Note that my statement is subject to correction since I don’t follow this primary literature at all - hence my request for enlightenment if necessary.

I brought up the point, not to revisit a course that I took long ago but rather to point out that thermodynamics is a specialized subject, involving lots of work and study before one can make the arguments cogently. Pulling quotes from a secondary or tertiary source to make a philosophical point is a dangerous business - whether the subject is thermo, information theory, or dare I say evolutionary biology.

Comment #2992

Posted by Pim van Meurs on May 27, 2004 11:40 AM (e)

I am still confused: on one hand Jerry claims he understands Shannon entropy, on the other hand he confuses Shannon entropy with thermodynamic entropy.

I will show in a follow-up posting how to calculate this but it is trivial since p_i log p_i will be zero for this site since either p_i is zero or log p_i is zero (p_i = 1)

LOL ….I don’t want formulas, I want math. Do some with real figures. But I do want to know how you can mathematically show your genome as being frozen solid. And if you show statistical entropy as zero in a system with microstates, you will have just revolutionized science, my friend. Deleterious mutations become impossible and genetic defects are just a myth of science.

Not only does Jerry not appreciate the meaning of mathematics (formulas are mathematics, Jerry), but he also confuses the concepts of Shannon entropy, even after being given the formula

H = - sum_{i} p_i log p_i

and having it pointed out that when a mutation becomes fixated the Shannon entropy for that particular nucleotide becomes zero, because p_i becomes one for the mutation and zero for the three other nucleotides (I am assuming for convenience a fixated point mutation). Thus the formula with p_i=0 for i=1..3 and p_4=1 shows that H=0.

It’s that simple, and I will explore this in a soon-to-be-posted update to Shannon Entropy Applied.
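Written out for the fixated site, using the standard convention that a p_i log p_i term with p_i = 0 is taken as zero:

H_before = -4 * (1/4) * log2(1/4) = 2 bits (all four nucleotides equally likely)
H_after = -(1 * log2(1) + 0 + 0 + 0) = 0 bits (one nucleotide certain)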

Jerry ironically wrote:

You don’t even know what Shannon/Weaver entropy is. You are trying to apply entropy that deals with the loss of signal and the addition of noise added into a telephone line to the genome. Oh, you water it down into some genetics you don’t understand. And remember that I was personally present when Bill Dembski, a math PhD actually examined this stuff and told you personally that nothing you are cutting and pasting is even germane to the subject. You, my friend, are as full of crap as a Christmas turkey and you spoil the truth of science for both sides.

Stop making up things Jerry. Your ‘imagination’ is only outperformed by your inability to apply mathematical concepts in a defensible manner and your grandstanding when caught in the act.

Keep up the good work though since it helps exemplify how creationism/ID abuses concepts of entropy imho of course.

In the meantime, check out how real scientists apply the concept of Shannon entropy in a meaningful manner, despite objections to the contrary by Jerry.

Schneider: Evolution of biological information

Adami et al Evolution of biological complexity

Learn and enjoy.

Comment #2999

Posted by Navy Davy on May 27, 2004 2:01 PM (e)

Pim,

And remember that I was personally present when Bill Dembski, a math PhD actually examined this stuff and told you personally that nothing you are cutting and pasting is even germane to the subject.

Sorry, the lawyer in me has gotta ask:

1. Did you ever converse with William Dembski?

2. If so, Was Jerry Don Bauer ever present during your conversation with William Dembski?

3. If so, Did you and William Dembski ever discuss entropy?

4. If so, Did William Dembski express any disagreements with your views on entropy?

Cheers, Navy Davy

Comment #3001

Posted by Pim van Meurs on May 27, 2004 2:11 PM (e)

Navy Davy, why don’t you ask Jerry? But the answer is that I never met Dembski in person and I must thus conclude that when Jerry stated ‘I was personally present’ he was referring to an ISCID discussion.

Perhaps Jerry can clarify these questions?

Comment #3004

Posted by Navy Davy on May 27, 2004 2:35 PM (e)

Pim,

1. So, you were at some ISCID “discussion” where Dembski was speaking/writing about principles of entropy?

2. And Jerry was there, too?

Not that I really care about these peripheral matters, but if your answers above are “yes” and “yes,” it sounds like Jerry was pretty close to the mark. IIRC, you told Jerry to “stop making things” up.

Cheers,

Detective Navy Davy

Comment #3006

Posted by Pim van Meurs on May 27, 2004 2:45 PM (e)

Navy Davy, I do not remember Dembski contributing to these discussions and in my quick search I found nothing that suggests that Dembski contributed. In fact, Dembski’s contributions to ISCID are minimal as it comes to actual discussions.

Hence my question to Jerry because to my best recollection this never took place.

Comment #3008

Posted by Whistle Blower on May 27, 2004 2:57 PM (e)

Navy Davy–

Not that I really care about these peripheral matters, but if your answers above are “yes” and “yes,” it sounds like Jerry was pretty close to the mark. IIRC, you told Jerry to “stop making things” up.

Once again, we find Navy “unbiased” Davy taking a position that reveals his biases. Specifically, we find Navy Davy taking the position that “I was personally present” can be reasonably construed to mean “posted something to an Internet bulletin board I was reading.”

Navy Davy, the evidence that you are neither unbiased nor particularly intelligent continues to grow. Of course, your claims that your posts were thoughtful or humorous were ignored from the start.

The record shows that there are several questions directed to you on the Panda’s Thumb that you have chosen to ignore, most likely because answering them directly would either prove that you are a liar or cause you to admit your biases.

As always, a modest showing of integrity on your part is greatly appreciated (although our expectations are diminished with every post of yours). I realize that certain lawyers are more inclined to play childish word games than others. Perhaps the bar for moral conduct is far lower in your state than it is in mine.

Best,
WB

Comment #3009

Posted by Navy Davy on May 27, 2004 3:00 PM (e)

Pim,

Final question:

1. Was this ISCID “discussion” you attended on-line or in person?

I reckon, if you don’t remember Dembski discussing anything at an ISCID “discussion,” in which you participated, then, Hell, we got ourselves an impasse.

Much obliged,

Navy Davy

Comment #3011

Posted by Pim van Meurs on May 27, 2004 3:06 PM (e)

I suggest we let Jerry support his claim. I am more than willing to admit that I am wrong if he can provide supporting evidence for his claims.

These ISCID discussions are on-line; Dembski occasionally contributes but seldom participates. Check out http://www.iscid.org/boards

Comment #3013

Posted by Navy Davy on May 27, 2004 3:39 PM (e)

Pim,

I just did check out the ISCID discussion boards. There seems to be a lot less vitriol and ad hominem than here. Incidentally, I saw some pretty informative comments by both you and Jerry over there.

WB,

Perhaps the bar for moral conduct is far lower in your state than it is in mine.

Perhaps you are a geek, while I am not:)

Cheers,

Navy Davy

Comment #3018

Posted by Whistle Blower on May 27, 2004 6:00 PM (e)

Navy Davy –

I saw some pretty informative comments by both you and Jerry

Which comments by Jerry did you find informative, Navy Davy? Also, based on what you’ve read here and on ISCID, why do you choose to believe that Jerry Don Bauer is a reliable source of information about thermodynamics or evolutionary biology?

Comment #3022

Posted by Jerry Don Bauer on May 27, 2004 7:27 PM (e)

******I am still confused*****

Yeah, I can tell. ;)

****on one hand Jerry claims he understands Shannon entropy, on the other hand he confuses Shannon entropy with thermodynamic entropy.*******

That is hilarious. It’s you trying to use this irrelevant math to determine this entropy and this is NOT thermodynamic entropy. That kind of entropy deals with heat and energy and is expressed as Joules-Degrees Kelvin. I used Feynman’s formula and sent you to the site showing you what that kind of entropy it is. Do you understand the difference between thermodynamic, logical and information entropy? I don’t even think you know what you are attempting to calculate.

Also, if you refute my math, you are going to have to use the same math I did and show me where the mistakes in it are. Using your logic, I would try to show that 4 x 4 = 16 to invalidate 2 + 3 = 4. Not a lot of logic there.

******Not only does Jerry not appreciate the meaning of mathematics (formulas are mathematics, Jerry), but he also confuses the concepts of Shannon entropy, even after being given the formula H = - sum_{i} p_i log p_i******

Empty formulas show nothing. Do you really think that people on here are so stupid not to know that in order for a formula to show something it must have some figures in it? And what is it you think you are showing if you manage to show that H = 0? Do you know what H is and what it normally represents in math?

******and having it pointed out that when a mutation becomes fixated the Shannon entropy for that particular nucleotide becomes zero, because p_i becomes one for the mutation and zero for the three other nucleotides (I am assuming for convenience a fixated point mutation). Thus the formula with p_i=0 for i=1..3 and p_4=1 shows that H=0******

Mutations cannot become fixated in an individual genome ….Sheeze ….this is twice now I’ve corrected this. Can you not see that a nucleotide that mutates can mutate again or change by breeding? Fixations are only related to individuals as those individuals relate to the entire population. We don’t have a population, so how can we have any fixation that relates to one?

If you don’t believe me, why don’t you just look these words up:

Fixation: “Evolutionarily, a state where every single individual within a population is homozygous for a particular allele (and therefore the phenotype that the allele confers). For example, in a population where everyone has blue eyes, the allele for blue eye color is fixed and everyone will continue to have blue eyes in the future, as long as no new individuals come into the population from elsewhere.”

So do you understand that if this individual you are fixating ever happens to breed outside its population that you may no longer have any fixation? Or that this particular nucleotide can always mutate again?

*******Stop making up things Jerry. Your ‘imagination’ is only outperformed by your inability to apply mathematical concepts in a defensible manner and your grandstanding when caught in the act.********

LOL ….Did you miss all that math I posted? Do you want to pretend it was never posted? Well here it is again just so you can’t state that you overlooked it:

S = log2W, S = log2(100,000,000), S = 26.5754247590989, therefore S is positive showing a positive tendency of disorder as we would expect.

But this is only statistical entropy and if we are to figure reactional entropy, we will have to calculate actual deleterious mutations from generation to generation.

We can view the deleterious mutations as actual entropy because, in this case, this is the actual disorganization. Eyre-Walker tells us that the human genome is estimated to carry 1000 negative mutations, so let’s get that entropy: S = log2 W = log2(1000) = 9.96578428466209.

Now let’s calculate the entropy after two more genes mutate: S = log2 W = log2(1002) = 9.96866679319521.
It is here that we do that subtraction you so badly wanted to do. deltaS, the actual change in entropy, is: deltaS = S2 - S1 = 9.96866679319521 - 9.96578428466209 = 0.00288250853312.

Now refute this or lose this portion of the argument. Here is the math. You don’t need to bring in unrelated and irrelevant math, as I’ve presented this now in three forums, the last of which you simply left the debate.

Of course, you’ll just go to the fourth using the same old trite trickery and hope no one will come along and notice.

Comment #3028

Posted by Pim van Meurs on May 27, 2004 7:51 PM (e)

Jerry: I don’t even think you know what you are attempting to calculate.

Dear Jerry, I was merely pointing out that you were confusing Information Entropy (Shannon) with thermodynamical entropy when you appealed to the third law of thermodynamics. Nothing wrong with that; these are tricky concepts.

Jerry: So, if you refute my math, you are going to have to use the same math I did and show me where the mistakes in it are.

I am showing that your math is irrelevant to entropy calculations for the genome. That’s all Jerry.

Jerry: Mutations cannot become fixated in an individual genome ….Sheeze ….this is twice now I’ve corrected this. Can you not see that a nucleotide that mutates can mutate again or change by breeding? Fixations are only related to individuals as those individuals relate to the entire population. We don’t have a population, so how can we have any fixation that relates to one?

Was it not you, Jerry, who quoted Eyre-Walker’s paper to show that slightly detrimental mutations become fixated in the genome? What did you have in mind quoting Eyre-Walker if that is not what you were trying to argue? In fact your own statements once again contradict you: when you made claim P1, namely that the entropy in the human genome has been increasing, you pointed to Eyre-Walker’s paper.

Jerry, I do not understand why you keep making the same error about positive entropy and tendency. But it was wrong in the past and is wrong now.

Jerry: Now refute this or lose this portion of the argument. Here is the math. You don’t need to bring in unrelated and irrelevant math, as I’ve presented this now in three forums, the last of which you simply left the debate.

Why should I refute irrelevant calculations Jerry? Your calculations have no relevance to entropy in the genome, and when correctly applying such calculations they show that you are wrong. What should I do beyond this Jerry?

Jerry: Of course, you’ll just go to the fourth using the same old trite trickery and hope no one will come along and notice.

Who is using nonsensical math and arguments to support his opinions here, Jerry? It seems obvious to all that it is you.

In fact, in the case of detrimental mutations, entropy decreased for the simple reason that the number of possible states has decreased. While before these 1000 (assume for the moment) point mutations were free to take on various base pairs (A, C, T, G), now they have become fixed as detrimental mutations. Thus, correctly applying the concepts of entropy to the genome, the entropy has decreased significantly.

Not surprisingly, Jerry seems to still believe that positive entropy is an indicator of tendency and that it is enough to take the log of the number of mutations to estimate entropy. Using Jerry’s logic and applying it to beneficial mutations, he would argue that entropy also increased. No matter what mutates, entropy always increases; neutral, beneficial or detrimental, it does not matter.

A real calculation would have considered the before state and after state for all nucleotides.

So let’s assume that we have a 1000-base-pair genome and the nucleotides in the genome are initially uniformly distributed, that is, fully random; not surprisingly, we find that the entropy is maximal for this situation. But now let’s assume that one base pair becomes fixated in the human genome. It does not matter whether the mutation is slightly detrimental or beneficial; let’s assume that for whatever reason the mutation spreads through the human genome, as Eyre-Walker has argued. Thus the entropy has to drop since it was at its maximum value; in fact the entropy drops by 2 bits.
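A small sketch of the difference, under the toy assumptions above (1000 independent sites, each initially uniform over A, C, T, G, with one site becoming fixed; the numbers are purely illustrative):

from math import log2

def site_entropy(probs):
    # Shannon entropy of a single nucleotide position, in bits
    return -sum(p * log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximally uncertain site
fixed = [1.0, 0.0, 0.0, 0.0]         # site after fixation; detrimental or beneficial makes no difference

sites = 1000
before = sites * site_entropy(uniform)                               # 2000 bits
after = (sites - 1) * site_entropy(uniform) + site_entropy(fixed)    # 1998 bits
print(before, after, after - before)                                 # the genome entropy drops by 2 bits

# Contrast: log2(1000) or log2(1002) is just the logarithm of a count of mutations;
# it is not computed from any before/after distribution over the sites, so it is
# not an entropy of the genome in this sense at all.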

I know, these are complicated issues, but it is you who has unnecessarily complicated matters by confusing entropy and tendency, or by claiming that you understand Shannon entropy and then confusing it with thermodynamical entropy.

Good thing that there are friendly people here who are willing to spend the time correcting your mistakes :-)

Comment #3029

Posted by Joe P Guy on May 27, 2004 7:52 PM (e)

Jerry Don:
Empty formulas show nothing. Do you really think that people on here are so stupid not to know that in order for a formula to show something it must have some figures in it?

E = mc^2

Comment #3032

Posted by Frank Schmidt on May 27, 2004 8:13 PM (e)

As I recall, Shannon used the term “entropy” in his discussion about information because his version (potential information loss - remember he worked for the phone company) had the same mathematical formulation as did the statistical definition of entropy. But it was understood as an analogy only. Hence the confusion - the two terms are not exactly the same. I’m sure Shannon never got it confused, but it sure can be a problem for us non-geniuses.

Comment #3033

Posted by Pim van Meurs on May 27, 2004 8:23 PM (e)

Indeed information is NOT entropy, it’s the change in entropy which describes information.

Jerry seems to like to quote from Feynman so let’s see what he had to say about entropy

“So we now have to talk about what we mean by disorder and what we mean by order…. Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure “disorder” by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the “disorder” is less.”

Since the number of ways the nucleotides can be arranged has not changed, according to Feynman’s concept of entropy, entropy would not have changed. Jerry’s application of these concepts is just frustratingly wrong.
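To make Feynman’s counting concrete, here is a toy count (assuming, purely for illustration, 20 cells holding 10 white and 10 black molecules, one per cell):

from math import comb, log2

N, whites = 20, 10
separated = 1                   # all whites confined to the left half: a single arrangement
unrestricted = comb(N, whites)  # whites free to occupy any 10 of the 20 cells: 184,756 arrangements

print(log2(separated))      # 0 bits
print(log2(unrestricted))   # about 17.5 bits: more ways to arrange means more entropy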

Comment #3034

Posted by Pim van Meurs on May 27, 2004 8:35 PM (e)

A good similarity between Boltzmann entropy and Shannon entropy can be seen here

On Slide 10 the author points out that when the distribution of the gas particles is uniformly distributed over the phase space bins, the entropy is maximal; when only 1 bin has all the molecules, the entropy is minimal.

Not surprisingly Boltzmann himself derived that

H_b = Sum_k p_k log p_k (slide 11)

And defined S = -k_b H_b (slide 13)

Looks familiar anyone?

Well things get better. How do we go from number of allowed states to Shannon’s form?

Slide 8

S=k_B log W
now apply to number of states n_a and n_b and the entropy can be expressed as

S = -N k_b (p_a log p_a+ p_b log p_b)

Another example of how correct application of entropy results in the format I presented
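For readers who want to see where that two-state form comes from, here is a sketch of the standard counting argument (assuming W = N!/(n_a! n_b!) ways to arrange n_a particles of one kind and n_b of the other among N = n_a + n_b slots, and Stirling’s approximation ln N! ≈ N ln N - N):

S = k_B ln W = k_B (ln N! - ln n_a! - ln n_b!)
  ≈ k_B (N ln N - n_a ln n_a - n_b ln n_b)   [the -N, +n_a and +n_b terms cancel since n_a + n_b = N]
  = -N k_B ((n_a/N) ln(n_a/N) + (n_b/N) ln(n_b/N))
  = -N k_B (p_a ln p_a + p_b ln p_b)

which is the form above, up to the constant k_B and the choice of logarithm base.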

Comment #3038

Posted by Pim van Meurs on May 27, 2004 10:00 PM (e)

Oh yes, Jerry, from Eyre-Walker’s paper it is clear that they are discussing population statistics and fixation:

A large number of slightly deleterious mutations may therefore have become fixed in hominid lineages.

Thus my comments not only stand but are further supported by my mathematical arguments presented in the “Shannon Entropy Applied” thread.

Any comments?

Comment #3039

Posted by Tim Downs on May 27, 2004 10:14 PM (e)

My only comment would be that Jerry is very confused on this issue. I’m not sure it is worth continuing the discussion with him at this point.

Comment #3040

Posted by Jerry Don Bauer on May 28, 2004 1:31 AM (e)

Is there any honest scientist on this forum willing to admit I am debating simpletons that don’t even understand the subject? This is a sad day for science, my friends, when ALL of the scientists listed as the sponsors on this forum will just remain silent in the face of this gross rape of Intelligence. This should tell all reading this what Panda’s Thumb is all about.

Panda’s Thumb wants to evangelize you to its religion. If you think any differently, get real.

Truth be known, they don’t just hate IDists. My experience is that they usually hate blacks and Jews as well.

Comment #3041

Posted by Bob Maurus on May 28, 2004 4:39 AM (e)

You’re really coming unhinged, Jerry. That last insult was truly offensive. Crawl back under your rock.

Comment #3043

Posted by Ed Brayton on May 28, 2004 6:25 AM (e)

Truth be known, they don’t just hate IDists. My experience is that they usually hate blacks and Jews as well.

Mr. Bauer, the mere fact that we have tolerated you this long is a miracle (yes, that’s called “irony”). But you have now crossed over the line. One more comment like this and you will be permanently banned from leaving comments on this page. This is not negotiable, nor do I care in the slightest that it will feed your martyr complex and you’ll scream persecution from every mountaintop you can find. You’re behaving like every obnoxious ass who ever got thrown out of a chatroom, screaming that you’re being censored when the truth is you’re just being an ass and that’s why you got banned. This is your first, last, and only warning.

Comment #3048

Posted by Matt Young on May 28, 2004 9:04 AM (e)

I’m not black, but I am Jewish and an “honest scientist on this forum.” I do not hate IDists, blacks, or Jews, and I resent the suggestion that I do. I’m not a psychologist either, but I recognize projection when I see it, and I’d suggest that Mr. Bauer is projecting his own hates and prejudices onto his opponents. Maybe he should read my column, “I Am Firm, Thou Art Stubborn, He Is Pigheaded,” at this URL: http://www.pandasthumb.org/pt-archives/000082.html.

Comment #3054

Posted by Pim van Meurs on May 28, 2004 9:52 AM (e)

Now that the flaws in Jerry’s arguments have been fully exposed, what is his response? Calling the contributors to these boards racists? Lovely ad hominem and as poorly supported as Jerry’s usual ‘claims’.

From the bright side however several good things have come from Jerry’s presence on this board

1. An in-depth analysis of entropy as it applies to the genome

2. An in-depth expose of Jerry’s deepest ‘thoughts’

Thanks to Jerry, I have been encouraged to write up the details of entropy and how the concept applies to the genome (including common pitfalls).

Comment #3058

Posted by Mark Perakh on May 28, 2004 10:54 AM (e)

Mr. Bauer’s last comment, however disgusting, is not really surprising. He has displayed all the features of an incurable self-admiring crank, imperviously confident in his infallibility, so when he reveals that he also is a spiteful slanderer, this could be expected. He wanted scientists to address his lengthy comments on thermodynamics. Perhaps I can qualify, as I have taught, among other things, thermodynamics and statistical physics for more than half a century both to undergraduate and graduate students and have published nearly 300 papers in peer-reviewed media. I did not see a need to address Bauer’s diarrhea of pseudo-scientific arguments for the same reason I did not argue against a Siberian peasant who was confident that he had succeeded in muzzling me when he asserted that an electric bulb lights up when a plus and a minus meet in it. Now, when Bauer has shown his real character explicitly, I think not only should he be banned from further posting his crock on PT, but also all his preceding posts should be kicked into a garbage can where they belong, except for his last pearl about hating blacks and Jews, which would serve to show what kinds of adversaries PT has to deal with.

Comment #3062

Posted by Navy Davy on May 28, 2004 11:13 AM (e)

Oh, the outrage! Oh, the bleakness! Oh, the humanity!

I actually think Jerry should simply apologize for his one remark and continue posting.

But, you all, should also apologize for ganging up on him, calling him a liar, and generally not bein’ good sports.

Best, Navy Davy

Comment #3067

Posted by Bob Maurus on May 28, 2004 12:03 PM (e)

Navy Davy,

What’s to be gained by letting a liar get by with his lies? What useful function does that serve? And why should we apologize for calling him a liar? He’s a liar.

Sounds like you’ve not encountered Mr. Bauer before. He’s an infamous haunter of evolution/creationism/IDC boards, under a number of different aliases, and seems to be an object of scorn and ridicule wherever he tries to recycle his ignorance and misinformation.

You had a pretty good introduction to his routine and his dishonesty in several threads here. His last post, while unexpectedly vile, is not, in the end, surprising.

Bob

Comment #3080

Posted by Jerry Don Bauer on May 28, 2004 1:46 PM (e)

*******A good similarity between Boltzman entropy and Shannon entropy can be seen here*******

Who cares, Pim? What does any of this have to do with a genome in devolution and heading toward mutational meltdown? What does this have to do with fixation or anything else we have discussed?

*****On Slide 10 the author points out that when the distribution of the gas particles is uniformly distributed over the phase space bins, the entropy is maximal, when only 1 bin has all the molecules and entropy is minimal.******

:0)

******Not surprisingly Boltzman himself derived that

H_b = Sum_k p_k log p_k (slide 11)

And defined S = -k_b H_b (slide 13)

Looks familiar anyone?******

Why yes, it does, because it’s more cutting and pasting and serves as another example that you don’t have any idea what you’re discussing. BTW, the math you’re using is NOT Shannon/Weaver entropy, but just similar. Here is Shannon/Weaver entropy:

******S=k_B log W
now apply to number of states n_a and n_b and the entropy can be expressed as
S = -N k_b (p_a log p_a+ p_b log p_b)
Another example of how correct application of entropy results in the format I presented******

And yet another example of empty formulas with no figures in them. What are the values of S, k_B, W or any of the rest of this stuff? Let me guess, you have no idea?

Comment #3083

Posted by Whistle Flower on May 28, 2004 2:04 PM (e)

Mark

As was posted elsewhere, I believe that Jerry Don’s garbage non-science is as useful as his bizarre comment regarding blacks and Jews for illustrating the vastness of the empty space between Jerry Don’s headphones (and why did he leave out homosexuals? Jerry Don is showing his age). But certainly we don’t need any more proof at this late date.

One question for the peanut gallery: do any ID critics here believe Navy Davy when he claims to have not yet made up his mind about the relative merits of ID ? Do any ID critics here believe that Navy Davy can possibly be convinced that ID is a bogus political scheme for teaching creationism in schools?

Personally I think Navy Davy is completely full of baby diaper garbage, just like Jerry Don Bauer. But perhaps I’m too quick to make such judgments.

Comment #3087

Posted by Jerry Don Bauer on May 28, 2004 2:43 PM (e)

*******Sounds like you’ve not encountered Mr. Bauer before. He’s an infamous haunter of evolution/creationism/IDC boards, under a number of different aliases, and seems to be an object of scorn and ridicule wherever he tries to recycle his ignorance and misinformation.******

Nah … just on the radical naturalism boards that can’t handle the debate. ;)

Comment #3088

Posted by Navy Davy on May 28, 2004 2:47 PM (e)

Whistle Flower,

Are you by any chance related to Whistle Blower ? Perhaps, a genetic mutant:)

I’m reading a great book, entitled, “The Pleasure of Finding Things Out” by Richard Feynman. I guess that sums up my philosophy on many issues, including this one.

I note that many on PT would retitle the book,
“The pleasure of discussing among ourselves what we believe to be facts, and ignoring and/or stifling those who disagree with us, who must be religious wackos to even think that way”

Cheers, Boys!

Navy Davy

p.s. Have a Good Memorial Day weekend too. Don’t be shy about hoisting up ‘ole Glory, neither. Lotta brave young folks died, so we could be free to play on the internet:)

Comment #3089

Posted by Jerry Don Bauer on May 28, 2004 2:53 PM (e)

********One question for the peanut gallery: do any ID critics here believe Navy Davy when he claims to have not yet made up his mind about the relative merits of ID ? Do any ID critics here believe that Navy Davy can possibly be convinced that ID is a bogus political scheme for teaching creationism in schools?*******

He was very honest with me in stating that he did not buy into my ideas, but just wanted to see a debate on the issues. I also informed him that he probably would not see a debate on the issues as there is no one on here capable of handling the debate from the side of naturalism.

Of course, there has been little real debate. I can do a mathematical calculation for this forum and when I do, I’m a liar. No attempt to address the math at all, no refutations, nothing. IDists are just liars.

I would be most surprised if those who actually do have an open mind and can learn will not do some further research on this issue.

You people badly damage your agenda by even having these forums. As when anyone really decides to disagree with you, you lose it entirely. This shows radical Darwinism as really what it is: religion, with almost no tenet based in science or math.

Comment #3091

Posted by Whistle Flower on May 28, 2004 2:58 PM (e)

I’m reading a great book, entitled, “The Pleasure of Finding Things Out” by Richard Feynman. I guess that sums up my philosophy on many issues, including this one.

Any of the contributors to this blog believe that Navy Davy just can’t make up his mind because the evidence from both sides is so compelling? Just curious. Don’t be afraid to speak up!

Navy Davy, I would be curious to find support for your view that everyone who disagrees with evolutionary theory must be a religious wacko. Rather, the view expressed most often here is that ID apologists are either religious wackos, liars, or unable to understand elementary scientific arguments (or a combination thereof).

Once again, Navy Davy, you have distorted the record and shown us your true colors (a liar who is unable to understand elementary scientific arguments).

P.S. Whistle Flower is the new kinder, gentler version of Whistle Blower. Didn’t you notice?

Comment #3093

Posted by Navy Davy on May 28, 2004 3:19 PM (e)

Whistle Flower is the new kinder, gentler version of Whistle Blower. Didn’t you notice?

Why would I notice a cyber-dweeb, who can’t even correctly spell his own nom de guerre ?

Or, in your case, nom de wuss :)

Cheers, Navy Davy

p.s. I really have to go now, sorry. But, I will be back: in the name of truth, science and finding things out!

“Another of the qualities of science is that it teaches the value of rational thought, as well as the importance of freedom of thought; the positive doubting that the lessons are all true. (Feynman, Pleasures…. pg. 186.)

Comment #3099

Posted by Ed Brayton on May 28, 2004 3:48 PM (e)

You people badly damage your agenda by even having these forums. As when anyone really decides to disagree with you, you lose it entirely. This shows radical Darwinism as really what it is: religion, with almost no tenet based in science or math.

Jerry Don, that is - to put it kindly - complete and utter horseshit. The only one who has “lost it entirely” here is you, with your obnoxious nonsense about your opponents allegedly hating blacks and jews. Your first mistake is in thinking this is a “forum”; it’s not. It’s a blog, not a message board and not a forum. We are not obligated to give you a forum to launch ad hominem attacks on us. You were treated just fine until you started with your smug “no one here is smart enough to debate” crap. We have had respectful exchanges on here with Paul Nelson and Frank Beckwith.

Perhaps it’s time for you to consider the possibility that the reaction of other people to you is not due to what you perceive to be the overwhelming strength of your position, but is due to the fact that you’re a raging asshole. Now go troll somewhere else. You’ve worn out your welcome here.

Comment #3101

Posted by Jerry Don Bauer on May 28, 2004 3:54 PM (e)

Mine started as respectful as well. One can only take being called names so long before they begin to slap back. And I cannot imagine that Paul would be treated with respect in here.

Comment #3114

Posted by Bob Maurus on May 28, 2004 5:34 PM (e)

Why don’t you ask Paul about that, Jerry?

Comment #3120

Posted by Jerry Don Bauer on May 28, 2004 6:03 PM (e)

******Why don’t you ask Paul about that, Jerry?******

I probably will.

Comment #3122

Posted by Pim van Meurs on May 28, 2004 6:37 PM (e)

Any chance you can provide us with support for your statement about Dembski?

Remember your claim

And remember that I was personally present when Bill Dembski, a math PhD actually examined this stuff and told you personally that nothing you are cutting and pasting is even germane to the subject. You, my friend, are as full of crap as a Christmas turkey and you spoil the truth of science for both sides.

Or were you ‘confused’?

Comment #3124

Posted by Jerry Don Bauer on May 28, 2004 6:52 PM (e)

******Any chance you can provide us with support for your statement about Dembski?*******

No, I could not find the post. However, if you were honest you’d just admit it. But since we can’t expect this I’ll simply retract that statement.

I don’t think it takes Dembski to see the silliness in those formulas.

Comment #3125

Posted by Pim van Meurs on May 28, 2004 6:52 PM (e)

More examples that Jerry misinterpreted entropy calculations

Boltzmann argued:

S = k log N

Where N is the number of ways the system can be arranged in indistinguishable ways as far as an outside observer is concerned (Feynman). More recently, information theory has been applied to entropy.

Link

Or

Comment #3126

Posted by Jerry Don Bauer on May 28, 2004 6:56 PM (e)

******S = k log N

Where N******

Oh sheeze. I meant W and you know it. But if I’m using N the same way, what the heck’s the difference? Are you now just running away from the little debate we have going, hoping to bury it in smoke and mirrors?

Comment #3127

Posted by Pim van Meurs on May 28, 2004 6:59 PM (e)

Even Jerry’s own quote from Feynman shows that he misinterprets what Feynman said:

Feynman honed this down: “So we now have to talk about what we mean by disorder and what we mean by order…. Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure “disorder” by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the “disorder” is less.”

Comment #3128

Posted by Pim van Meurs on May 28, 2004 7:03 PM (e)

Jerry: No, I could not find the post. However, if you were honest you’d just admit it.

ROTFL. You’re a gem. Another one for the “bathroom wall”. One more and I get the much coveted but rarely awarded “Bronze PandaThumb Beer Mug”.

Btw the N in S = k log N is just a mathematical symbol which in fact represents what you called W.

Even Feynman seems to disagree with your interpretation of his statement. Sigh..

Comment #3129

Posted by Pim van Meurs on May 28, 2004 7:05 PM (e)

And despite claims that ‘I am running away from a debate’ it should be clear that I am fully engaged in a debate and that Jerry is left to utter ad hominems in apparent frustration.

Fine with me.

Comment #3132

Posted by Jerry Don Bauer on May 28, 2004 7:15 PM (e)

*****Btw the N in S = k log N is just a mathematical symbol which in fact represents what you called W.

Even Feynman seems to disagree with your interpretation of his statement.*****

Good Lord. This is not Feynman, this is Boltzmann. Sometimes I wonder if you’ve actually ever had a freshman chemistry class.

Now back to the debate. You owe me a post, so why are you posting to me without addressing what we were talking about?

Are you conceding that argument?

Comment #3134

Posted by Pim van Meurs on May 28, 2004 7:24 PM (e)

Oh boy, the constant is just that, a constant, and it does not make a difference whether you use Feynman or Boltzmann or Shannon.

And my Feynman reference was to the previous posting, in which Feynman correctly represents the facts; when Jerry ‘calculates’, he forgets the relevant (bolded) parts.

I have no idea which argument you want me to concede:

1. That you confused the positive nature of entropy with tendency?

2. That you called us racists?

3. That you claimed “And remember that I was personally present when Bill Dembski, a math PhD actually examined this stuff and told you personally that nothing you are cutting and pasting is even germane to the subject.”?

4. That you appear confused about application of entropy concepts?

1 is self-evident, 2 seems undisputed, 3 is unsupported other than by ad hominem (thus weakening your position), and 4 is well documented on the various threads here.

Comment #3138

Posted by Pim van Meurs on May 28, 2004 7:38 PM (e)

Configurational states

Page 7:

The configurational entropy of a crystal refers to the distinguishable ways the atoms can be arranged on the lattice sites. For monoatomic crystals the only origin of configurational entropy is the presence of crystal defects such as vacancies (lattice sites without atoms).

It’s so straightforward. But you can lead a horse to water; it’s harder to make him drink :-)

Probabilities, Microstates and Entropy

Microstates

Entropy and the Second Law

Simple things can pose subtle questions.

Well, you should get the point by now.

Comment #3141

Posted by Bob Maurus on May 28, 2004 7:49 PM (e)

I love it, Jerry,

You posted

(PVM)******Any chance you can provide us with support for your statement about Dembski?*******

(Jerry) No, I could not find the post. However, if you were honest you’d just admit it. But since we can’t expect this I’ll simply retract that statement.

How about this Jerry: if YOU were honest you’d just admit you’re full of shit. You made a flat claim that you couldn’t validate when called. Get your documentation in order first - kinda like, “First you pillage, then you burn.” I do realize though that that might be beyond your comprehension. My offer to pray for you still stands. The goddess might be willing to take pity on you.

Peace,
Bob

Comment #3142

Posted by Jerry Don Bauer on May 28, 2004 7:50 PM (e)

People, take a look at Feynman’s famous formula that somehow was mistakenly engraved on Boltzmann’s tombstone. :0)

http://www.wellesley.edu/Chemistry/chem120/thermo1.html#boltz

Comment #3144

Posted by Jerry Don Bauer on May 28, 2004 7:56 PM (e)

*****Well, you should get the point by now.******

Was that post to me? Will you please quit sending me to irrelevant web sites hoping they will make your argument for you?

Place your argument in your own words. The problem is that you are out of argument and you know it. This in fact is the third place you have been embarrassed discussing design. I’m curious, where did you get those degrees in physics?

Comment #3158

Posted by Jerry Don Bauer on May 28, 2004 8:59 PM (e)

*****He wanted scientists to address his lengthy comments on thermodynamics. Perhaps I can qualify as I have taught, among other things, thermodynamics and statistical physics for more than half a century both to undergraduate and graduate students and have published nearly 300 papers in peer-reviewed media.******

And you are not up to a friendly debate in your field? I can assure you I’m only rude to those who seek to demean me rather than professionally debate.

Tell me. If you are a thermodynamicist, how do you feel that this law would have allowed macroevolution, which postulates the antithesis of SLOT?

Comment #3160

Posted by Pim van Meurs on May 28, 2004 9:06 PM (e)

Jerry: Place your argument in your own words. The problem is that you are out of argument and you know it. This in fact is the third place you have been embarrassed discussing design. I’m curious, where did you get those degrees in physics?

Very simple: your application of the formula for entropy is incorrect. I have tried to show you in my own words (see ‘Shannon entropy applied’) and I have quoted extensively from the literature.

To no avail. And your ad hominems only serve to show that you have not won any argument other than by running away from confronting it.

I have no problem with that Jerry.

The problem with Jerry’s argument is that the logarithm has to be taken of the number of possible states, not of the number of mutations.

So let’s assume we have a population of 10000 humans and look at a particular location in the genome. Let’s assume for the moment that the distribution of the nucleotides at this location is uniform. In other words, we have maximum disorder. The number of accessible states can be calculated to be

W = 10000!/(2500!*2500!*2500!*2500!)

S = log W = 6014.6
Using S_Shannon = -Sum_i p_i log p_i with p_i = 0.25, we calculate S_Shannon = 6020.6.

Remember that the two measures are equal only for large numbers.

Now two mutations arise which are slightly detrimental or slightly neutral (it does not matter), and they get fixated in the population, as argued by Eyre-Walker. Now we find a different situation, since p_i is 1 for these two locations

so now we calculate the entropy to be -9998*log 0.25, or 6019.4, a drop of 1.2 in entropy.

Similarly we calculate

S = log W = log[9998!/(2500!*2500!*2500!*2498!)] = 6013.4, or 1.2 less than before

Both formulas agree

QED
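For anyone who wants to check the arithmetic, here is a minimal Python sketch (an added illustration, not part of the original comment). The only inputs are the assumptions already stated above: 10000 copies of the site, 2500 of each nucleotide before the change, and two copies fixed afterwards.

from math import lgamma, log10, log

def log10_factorial(n):
    # lgamma(n + 1) equals ln(n!); dividing by ln(10) gives log10(n!)
    return lgamma(n + 1) / log(10)

def log10_multinomial(total, counts):
    # log10 of W = total! / (c1! * c2! * ... * ck!)
    return log10_factorial(total) - sum(log10_factorial(c) for c in counts)

# Before: 10000 copies, 2500 of each of the four nucleotides
S_before = log10_multinomial(10000, [2500, 2500, 2500, 2500])  # about 6014.6
H_before = 10000 * log10(4)                                    # about 6020.6

# After: two copies are fixed (p = 1), leaving 9998 uncertain copies
S_after = log10_multinomial(9998, [2500, 2500, 2500, 2498])    # about 6013.4
H_after = 9998 * log10(4)                                      # about 6019.4

print(S_before - S_after, H_before - H_after)                  # both drop by about 1.2

Both measures drop by roughly 1.2, which is the agreement claimed above.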

Comment #3169

Posted by Jerry Don Bauer on May 28, 2004 10:53 PM (e)

*****Very simple: your application of the formula for entropy is incorrect. I have tried to show you in my own words (see ‘Shannon entropy applied’) and I have quoted extensively from the literature.*****

LOL ….Right. Please read up on Shannon entropy so that you will at least know what it is. Shannon deals with data and information. Shannon worked for the telephone company, remember? It cannot possibly be used to say anything about nucleotide mutation.

Shannon Entropy: “This theorem is the foundation of the modern field of information theory
Information theory is a branch of the mathematical theory of probability and mathematical statistics, that quantifies the concept of information. It is concerned with information entropy, communication systems, data transmission and rate distortion theory, cryptography, data compression, error correction, and related topics …. ….”

http://encyclopedia.thefreedictionary.com/Shannon’s%20theorem

Also, please understand that mutations are random and thus no probabilities can be calculated on exactly when they will mutate, if they will mutate, or what they mutate into. If we could, think about it; we could mathematically calculate what everything is devolving into, if anything at all. We can calculate statistical entropy, but that’s it unless we know what actually happened.

********To no avail. And your ad hominems only serve to show that you have not won any argument other than by running away from confronting it.*******

Yep, you get ‘em right back from me when you throw them out, don’t you.

********I have no problem with that Jerry.*********

You have no problem with what?

******The problem with Jerry’s argument is that the logarithm has to be taken of the number of possible states, not of the number of mutations.*******

I’m just calculating easy-to-understand bits, which can also be viewed as statistical entropy. But what you fail to understand is that whatever math we use, the entropy will still be positive because 1002 mutations are more than 1000 mutations. How would you calculate bits?

“bits are entropy”

http://64.233.161.104/search?q=cache:uScMj4eddMgJ:www.cise.ufl.edu/help/software/doc/mpeg_encode/doc.ps+%22Bits+are+entropy%22&hl=en

******So let’s assume we have a population of 10000 humans and look at a particular location in the genome. Let’s assume for the moment that the distribution of the nucleotides at this location is uniform. In other words, we have maximum disorder. The number of accessible states can be calculated to be

W = 10000!/(2500!*2500!*2500!*2500!)

S = log W = 6014.6*******

Oh man, this is a classic. And almost as bad as the last time you argued this with me stating that when one adds positive integers, the total must always be negative. Remember that little jewel? You don’t want me to post a link to that tidbit, do you?

First, if anything is uniformly distributed how is this maximum disorder?

What are you talking about by “the distribution of the nucleotides at this location is uniform”? How would they not be regularly distributed? Please give an example of an irregularly distributed nucleotide.

I take it you are looking at four nucleotides in 10,000 genomes? Well then that’s 40,000 sites; you don’t have to do any math to determine this.

Then you take 10000 people and divide them by 2500!. 2500! what? Why? Then you multiply them by 2500! three times. Why? This is simply nonsensical. And if you divide 10000 by 2500! And then multiply it by 2500! Aren’t you right back at 10000? So why would you include this at all?

Now, slow down. Do each step one at a time, explain why you are doing it, and every step of the mathematics.

******Using S_Shannon = -Sum_i p_i log p_i with p_i = 0.25, we calculate S_Shannon = 6020.6.******

LOL … What is this garbage, Francis? You’re just making this up out of thin air hoping someone will believe you’ve got the math down.

Also, you claimed to have just calculated the Shannon entropy and came up with zero. Remember this? Francis: “H = - sum_{i} p_i log p_i
and pointing out that when a mutation becomes fixated the Shannon entropy for that particular nucleotide becomes zero because p_i becomes one for the mutation and zero for the three other nucleotides (I am assuming for convenience a fixated point mutation). Thus the formula with p_i=0 for i=1..3 and p_4=1 shows that H=0”

How come you get two totally distinct formulas when you calculate the same thing twice? Ahhh … you forgot you already did this, I see.

How come when I punch in “-Sum p_i log p_i with p_i “ into Google not one page comes up?

What are the values of p_i, and how did you get these values? Show your work.

And now you’re trying to show disorder! You are arguing for order, remember?? Now you’re coming up with positive entropy just as I was showing disorder, except, for some strange reason, you want to show a HUGE disorder. Sheeze.

*****Remember that the two measures are equal only for large numbers.******

Prove this with references from web sites. I don’t believe it.

*******Now two mutations arise which are slightly detrimental or slightly neutral (it does not matter), and they get fixated in the population, as argued by Eyre-Walker. Now we find a different situation, since p_i is 1 for these two locations******

Why is p_i 1, and what was it in your former ‘calculation’?

******so now we calculate the entropy to be -9998*log 0.25, or 6019.4, a drop of 1.2 in entropy.
Similarly we calculate
S = log W = log[9998!/(2500!*2500!*2500!*2498!)] = 6013.4, or 1.2 less than before
Both formulas agree*******

Well what happened to the other two people? Did they fall over dead with a heart attack or something? :0)

Show this calculation step by step. Explain why you are doing it, and every step of the math. Francis, you are the most intellectually dishonest person I have ever met on the Net and that’s saying something.

Comment #3170

Posted by Pim van Meurs on May 28, 2004 10:59 PM (e)

Well things need some cleaning up here

First of all, all entropies need to be divided by log 2 to convert to bits (I took log10, not log2, although this is a trivial change). Secondly, we need to divide by 10000 to get the average entropy for this location in the genome.

Not surprisingly, the entropy is 2.06 before the mutation and 1.99 after.

If needed I can provide additional examples, but the results will be the same. If the mutation at this location was not fixated before and afterwards it is, then a decrease in entropy is inevitable, since the number of accessible locations has dropped by 2.

I apologize for the sloppiness in my calculations but they are merely meant to show that entropy decreases, the actual value is less important as it is scaled by a constant.

Another thought experiment is to calculate the entropy in 1000 rolls of a single die, and then in 1000 rolls of a single die where two of the outcomes are guaranteed to be sixes. Again it should be obvious that order has increased and entropy decreased. If this is not clear, now assume the roll of 1000 dice that are loaded to always throw a six. Perfect order and thus minimal entropy. Any time a particular location becomes fixated, entropy has to drop.
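A rough sketch of that die-roll thought experiment in Python (again an added illustration, with the obvious assumptions made explicit: entropy measured in bits per roll, a fair die versus a die forced to show six):

from math import log2

def shannon_entropy(probs):
    # H = -sum p_i * log2(p_i), skipping outcomes with probability zero
    return -sum(p * log2(p) for p in probs if p > 0)

fair = [1.0 / 6] * 6              # any face equally likely
forced_six = [0, 0, 0, 0, 0, 1]   # the outcome is guaranteed to be a six

print(1000 * shannon_entropy(fair))                                   # about 2585 bits for 1000 fair rolls
print(998 * shannon_entropy(fair) + 2 * shannon_entropy(forced_six))  # about 2580 bits when two outcomes are forced
print(1000 * shannon_entropy(forced_six))                             # 0 bits for fully loaded dice: perfect order

Forcing outcomes can only lower the total, and the fully loaded dice sit at the minimum of zero.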

Comment #3171

Posted by Jerry Don Bauer on May 28, 2004 11:37 PM (e)

*****Not surprisingly, the entropy is 2.06 before the mutation and 1.99 after.*****

LOL … You didn’t divide anything by any log based component. Now go back and address my last post sentence by sentence as I did yours. You’re just inventing this stuff by making up a bunch of numbers and formulas that are nonsensical. If you can’t dazzle them with brilliance, then baffle them with bull****. If you fail to support your stuff with references, then you have lost this debate for the third time.

*****I apologize for the sloppiness in my calculations but they are merely meant to show that entropy decreases, the actual value is less important as it is scaled by a constant.*****

LOL … Do you now.

*****Another thought experiment is to calculate the entropy in 1000 rolls of a single die, and then in 1000 rolls of a single die where two of the outcomes are guaranteed to be sixes. Again it should be obvious that order has increased and entropy decreased.******

Thermodynamic entropy cannot increase or decrease with rolls of dice. It takes the exact same energy for my arm to throw any pair of dice as it does another. This is just more Francis garbage that he picked up from some web site, if he didn’t just pull it out of thin air.

*******Perfect order and thus minimal entropy.******

ROFL …..You have this exactly backward. Perfect order is maximum entropy. Take a physics course. If you’ve ever had one in your life, I’ll eat my hat.

Comment #3172

Posted by Jerry Don Bauer on May 29, 2004 12:09 AM (e)

Ok :0) I found one like it, and this is used to calculate the size of a channel in order to determine how many binary digits can flow through it:

“Shannon defined a measure of entropy:

H = - \sum_i p_i \log p_i

that, when applied to an information source, could determine the capacity of the channel required to transmit the source as encoded binary digits.”

http://www.fastload.org/in/Information_theory.html

Boy, this has a lot to say about mutating nucleotides. LOL, you, my man, are a trip.

Comment #3173

Posted by Erik 12345 on May 29, 2004 3:59 AM (e)

There are lots of things here that could use clarification and correction, but I’ll just make a few comments about high-abstraction-level things (leaving aside questions about particular calculations):

Point I: Information theory (by which I here mean communication theory, rather than, say, Maximum Entropy Inference) can be applied to DNA sequences. For example, if you are going to store large genomes in a database, you may want to compress your data to save storage space. Information theory can tell you how much, at most, you can compress your DNA sequences by encoding subsequences with bit strings.

A completely different application can be to pick a biological process (e.g. asexual reproduction), and try to imagine this as a communication process by identifying parts that are in some sense analogous to a sender, receiver, communication channel, and messages sent over the communication channel. In the case of asexual reproduction, one could regard the parent as the sender, the children as receivers, the parent genome as the message that is sent, the children genomes as the noise-modified messages that are received, etc. Such an analysis is not an end in itself, but sometimes it may let us infer things about quantities of direct interest to biologists (e.g. the average number of sites at which an asexually produced offspring genome differs from the parent genome).
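A small, hypothetical Python illustration of the compression remark in Point I (the sequence below is made up, and the figure is only a symbol-by-symbol bound, not a real compressor):

from collections import Counter
from math import log2

def bits_per_base(seq):
    # empirical Shannon entropy of the base frequencies in seq
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

toy_sequence = "ACGTACGTAAAAACCCGT"   # toy data, not a real genome
print(bits_per_base(toy_sequence))     # about 1.9 bits per base for this skewed composition
# a naive fixed-width code spends exactly 2 bits per base, so there is room to compress

A skewed base composition pushes the entropy below 2 bits per base, and that gap is what a compressor can exploit.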

Point II: Boltzmann entropy, Gibbs entropy, etc. are application-specific concepts. They are restricted to statistical mechanics, which is concerned with the relation between Boltzmann entropy (or Gibbs entropy) and macroscopic physical quantities like energy, pressure, volume, temperature, etc.

Shannon entropy is (by now) a generic concept that can be applied to any quantity to which we have associated a probability distribution. How sensible the application is depends entirely on what you are trying to accomplish and on how sensible your choice of quantity and probability distribution is.

Some readers may be mathematicians who (a) study entropy because they are intrigued by certain formal similarities between statistical mechanics, communication theory, statistical inference, etc., and (b) can say “I like Radon-Nikodym derivatives” and mean it. Readers who are not, however, should be careful to not confuse the generic concept of Shannon entropy with the application-specific concept of Boltzmann entropy (or Gibbs entropy).

Point III: Jerry Don Bauer writes: “ROFL …..You have this exactly backward. Perfect order is maximum entropy. Take a physics course. If you’ve ever had one in your life, I’ll eat my hat.”

I have taken several physics courses and would generally discourage talk about entropy in terms of metaphors like “disorder”. Unfortunately, many physicists are fond of the “disorder” metaphor. Those physicists would say, contra Jerry Don Bauer, that perfect order is indeed minimum entropy and that maximum disorder is the same as maximum entropy.

A good article that deals with Points II & III is “Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms — Examples of Entropy Increase? Nonsense!”. It briefly warns of the dangers of not understanding the difference between Shannon entropy and the entropy of statistical mechanics/thermodynamics, as well as the misleading features of the “disorder” metaphor.

Comment #3178

Posted by Ed Brayton on May 29, 2004 8:42 AM (e)

Jerry Don Bauer has been involuntarily removed from this site. He is free to submit all of his whining, martyr poses and cries of persecution to http://www.wedontcare.com.

Comment #3185

Posted by Pim van Meurs on May 29, 2004 11:01 AM (e)

Jerry: Show this calculation step by step. Explain why you are doing it, and every step of the math. Francis, you are the most intellectually dishonest person I have ever met on the Net and that’s saying something.

Translation of Jerry Speak wrote:

I am unable to reject or refute anything you have to say. In fact I have come to realize that my many poorly informed postings on entropy have been a waste of time. Most importantly though, I have come to the conclusion that my arguments against evolution based on entropy were poorly informed and have been refuted.

Some good whoppers

Jerry Whopper wrote:

LOL ….Right. Please read up on Shannon entropy so that you will at least know what it is. Shannon deals with data and information. Shannon worked for the telephone company, remember? It cannot possibly be used to say anything about nucleotide mutation.

Again real scientists seem to disagree with Jerry. See the many applications of Shannon entropy:

ev: Evolution of Biological Information

Evolution of biological complexity
Christoph Adami, Charles Ofria, and Travis C. Collier, PNAS, Vol. 97, Issue 9, 4463-4468, April 25, 2000

Jerry Whopper wrote:

ROFL …..You have this exactly backward. Perfect order is maximum entropy. Take a physics course. If you’ve ever had one in your life, I’ll eat my hat.

Bon appétit, I’d say. If maximum entropy were perfect order, how come we are moving towards disorder AND maximum entropy? Good one, Jerry.

Jerry Whopper wrote:

How come you get two totally distinct formulas when you calculate the same thing twice? Ahhh … you forgot you already did this, I see.

As I showed in the thread “Shannon entropy applied”, the two formulas are NOT totally distinct. In fact I show how the two formulas are related. Sorry for confusing you with math, Jerry.

Jerry Whopper wrote:

Then you take 10000 people and divide them by 2500!. 2500! what? Why? Then you multiply them by 2500! three times. Why? This is simply nonsensical. And if you divide 10000 by 2500! And then multiply it by 2500! Aren’t you right back at 10000? So why would you include this at all?

Seems Jerry is unfamiliar with the usage of “!”, which to mathematicians means factorial.

definition wrote:

n! = n*(n-1)*(n-2)*…*2*1
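A quick, purely illustrative sanity check of the notation (added here), reusing the counts from the earlier calculation:

from math import factorial

print(factorial(5))                            # 5*4*3*2*1 = 120
W = factorial(10000) // factorial(2500) ** 4   # the W from the earlier calculation, as an exact integer
print(len(str(W)))                             # 6015 digits, consistent with log10(W) of about 6014.6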

I hope that by now Jerry has come to the realization that he was ‘in over his head’ when he made these poorly informed assertions. In his rebuttal he shows clearly how unfamiliar he is with basic mathematical concepts relevant to entropy calculations.

I think these issues have been resolved quite successfully. But I can understand that the calculations I provided may seem somewhat complicated to those who believe that it is sufficient to take the log of the number of mutations to calculate entropy, and in a subsequent posting I intend to walk through the calculations in a little more detail, providing supporting evidence from the many websites I have already quoted.

Comment #10157

Posted by Donald A. Syvanen on November 10, 2004 9:32 PM (e)

From a chemical engineering undergraduate student: you folks are making a big mess out of this. I flunked a few test questions because I did not do all the calculations to be SURE the entropy is overall negative, INCREASED disorganization. The power of atheism is too great for most of you to handle. God does not need to keep all the laws of thermodynamics like we do. Macro-evolution did not happen.

Comment #10168

Posted by Neil Johnson on November 11, 2004 8:31 AM (e)

Donald A. Syvanen wrote:

From a chemical engineering undergraduate student…

Now that an undergrad has deigned to tell the authors of his textbooks that they don’t understand thermodynamics, I’m sure we shall all sleep better at night.

Donald A. Syvanen wrote:

God does not need to keep all the laws of thermodynamics like we do.

So, you admit that creationism is a priori untestable and therefore not a topic of scientific investigation. Glad to see that you agree with us.

Welcome to the bigs, Donald.

Neil