Intelligent Reasoning

Promoting, advancing and defending Intelligent Design via data, logic and Intelligent Reasoning and exposing the alleged theory of evolution as the nonsense it is. I also educate evotards about ID and the alleged theory of evolution one tard at a time and sometimes in groups

Tuesday, January 31, 2012

Sporting Events

With the Super Bowl approaching I was wondering: do people whose favorite team is not in the SB still watch it?

If I don't care about the teams I don't care about the game.

I will not watch a World Series if the team I like (Red Sox) is not in it.

I will not watch an NBA Finals if the Celtics are not in it.

I will not watch the Stanley Cup if the Bruins are not in it.

That said, surprisingly I am not a big Pats fan. I used to be until I moved back to the NE only to find out that Pats fans are an uneducated lot- "squish the fish"? Please, the Dolphins are not fish, and I know you think you are being coy but in reality you are just demonstrating your belligerent ignorance. The Pats are the greatest team ever? Please- while it is true that you have won 3 Super Bowls, it was only by a total of 9 points.

When Dallas won their SBs they dominated their opponents. Yes, they have 3 SB losses, but by a measly total of 11 points. In two of the Pats' SB losses they were embarrassed, proving they did not belong in the game.

I am of the feeling that you have to dominate SBs in order to declare yourselves the best team.

So will I watch the game this Sunday? Not all of it, that's for sure...

Wednesday, January 25, 2012

The Design Criteria

One prediction of all design-centric venues is that when agencies interact with nature they tend to leave traces of their involvement behind. Most often those traces are in the form of signs of work. IDists have defined what such traces would look like wrt biology:

The criteria for inferring design in biology is, as Michael J. Behe, Professor of Biochemistry at Lehigh University, puts it in his book Darwin's Black Box: “Our ability to be confident of the design of the cilium or intracellular transport rests on the same principles to be confident of the design of anything: the ordering of separate components to achieve an identifiable function that depends sharply on the components.”

He goes on to say: “Might there be some as-yet-undiscovered natural process that would explain biochemical complexity? No one would be foolish enough to categorically deny the possibility. Nonetheless, we can say that if there is such a process, no one has a clue how it would work. Further, it would go against all human experience, like postulating that a natural process might explain computers.”

Then we have-

Irreducible Complexity:
IC- A system performing a given basic function is irreducibly complex if it includes a set of well-matched, mutually interacting, non-arbitrarily individuated parts such that each part in the set is indispensable to maintaining the system’s basic, and therefore original, function. The set of these indispensable parts is known as the irreducible core of the system. Page 285 NFL

Numerous and Diverse Parts: If the irreducible core of an IC system consists of one or only a few parts, there may be no insuperable obstacle to the Darwinian mechanism explaining how that system arose in one fell swoop. But as the number of indispensable, well-fitted, mutually interacting, non-arbitrarily individuated parts increases in number & diversity, there is no possibility of the Darwinian mechanism achieving that system in one fell swoop. Page 287

Minimal Complexity and Function: Given an IC system with numerous & diverse parts in its core, the Darwinian mechanism must produce it gradually. But if the system needs to operate at a certain minimal level of function before it can be of any use to the organism, & if to achieve that level of function it requires a certain minimal level of complexity already possessed by the irreducible core, the Darwinian mechanism has no functional intermediates to exploit. Page 287

So if what we are investigating fits any of the descriptions above then we have more than enough to check into the possibility that it was designed- meaning we would run it through the explanatory filter. If it gets to the last node and matches any of the above, then we infer it was intentionally designed.
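As a sketch, the filter's node-by-node decision sequence described above can be written out in Python. The boolean inputs are hypothetical judgments supplied by the investigator, not computed quantities:

```python
def explanatory_filter(explained_by_law: bool,
                       explained_by_chance: bool,
                       complex_and_specified: bool) -> str:
    """Sketch of the explanatory filter's three nodes, checked in order."""
    if explained_by_law:
        return "necessity"      # first node: regularity/law accounts for it
    if explained_by_chance:
        return "chance"         # second node: chance plausibly accounts for it
    if complex_and_specified:
        return "design"         # last node: matches a design criterion above
    return "unattributed"       # nothing matched

print(explanatory_filter(False, False, True))  # design
```

Only something that survives the first two nodes and matches one of the design criteria reaches the design inference.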

Dr Behe responds to IC criticisms:
One last charge must be met: Orr maintains that the theory of intelligent design is not falsifiable. He’s wrong. To falsify design theory a scientist need only experimentally demonstrate that a bacterial flagellum, or any other comparably complex system, could arise by natural selection. If that happened I would conclude that neither flagella nor any system of similar or lesser complexity had to have been designed. In short, biochemical design would be neatly disproved.- Dr Behe in 1997

Tuesday, January 24, 2012

Nested Hierarchies and Cladistics- a Primer

Revisiting Nested Hierarchies:
From A Summary of the Principles of Hierarchy Theory:
Nested and non-nested hierarchies: nested hierarchies involve levels which consist of, and contain, lower levels. Non-nested hierarchies are more general in that the requirement of containment of lower levels is relaxed. For example, an army consists of a collection of soldiers and is made up of them. Thus an army is a nested hierarchy. On the other hand, the general at the top of a military command does not consist of his soldiers and so the military command is a non-nested hierarchy with regard to the soldiers in the army. Pecking orders and food chains are also non-nested hierarchies.

For example in the nested hierarchy of living organisms we have the animal kingdom.

To be placed in the animal kingdom an organism must have all of the criteria of an animal.

For example:

All members of the Animalia are multicellular (eukaryotes), and all are heterotrophs (that is, they rely directly or indirectly on other organisms for their nourishment). Most ingest food and digest it in an internal cavity.

Animal cells lack the rigid cell walls that characterize plant cells. The bodies of most animals (all except sponges) are made up of cells organized into tissues, each tissue specialized to some degree to perform specific functions.

The next level (after kingdom) contains the phyla. Phyla have all the characteristics of the kingdom PLUS other criteria.

For example, one phylum under the Kingdom Animalia is Chordata.

Chordates have all the characteristics of the Kingdom PLUS the following:

Chordates are defined as organisms that possess a structure called a notochord, at least during some part of their development. The notochord is a rod that extends most of the length of the body when it is fully developed. Lying dorsal to the gut but ventral to the central nervous system, it stiffens the body and acts as support during locomotion. Other characteristics shared by chordates include the following (from Hickman and Roberts, 1994):

bilateral symmetry
segmented body, including segmented muscles
three germ layers and a well-developed coelom.
single, dorsal, hollow nerve cord, usually with an enlarged anterior end (brain)
tail projecting beyond (posterior to) the anus at some stage of development
pharyngeal pouches present at some stage of development
ventral heart, with dorsal and ventral blood vessels and a closed blood system
complete digestive system
bony or cartilaginous endoskeleton usually present.

The next level is the class. Each class has the criteria of the kingdom, plus all the criteria of its phylum, PLUS the criteria of the class itself.

This is important because it shows there is a direction- one of additive characteristics.

Yet evolution does NOT have a direction. Characteristics can be lost as well as gained. And characteristics can remain stable.

Cladistics is a method of categorizing organisms based on shared characteristics. Each clade (allegedly) consists of a common ancestor and all of its (alleged) descendants:
intro to cladistics
The basic idea behind cladistics is that members of a group share a common evolutionary history, and are "closely related," more so to members of the same group than to other organisms. These groups are recognized by sharing unique features which were not present in distant ancestors. These shared derived characteristics are called synapomorphies.

Cladistics can be distinguished from other taxonomic systems, such as phenetics, by its focus on shared derived characters (synapomorphies).

And also what is cladistics?

The clade is not constructed based on ancestor-descendant relationships; those are assumed. And ancestor-descendant relationships form a non-nested hierarchy- see Eric B. Knox, "The use of hierarchies as organizational models in systematics", Biological Journal of the Linnean Society (1998), 63: 1–49

Each clade- note, not the entire cladogram- can be a nested hierarchy based on shared characteristics in that each descendant node will consist of and contain, ie share, a set of defined characteristics present in the alleged common ancestor. However each clade is also a non-nested hierarchy in that the alleged common ancestor does not consist of nor contain all descendants.

The point is that if your basis for clade construction is to make it conform to a nested hierarchy based on shared characteristics, then yes, you should see that a clade is a nested hierarchy based on shared characteristics, duh.
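The "additive characteristics" containment described above can be sketched with Python sets; the trait lists are illustrative, not exhaustive:

```python
# Each lower level consists of and contains the criteria of the level above.
animalia = {"multicellular", "eukaryote", "heterotroph"}
chordata = animalia | {"notochord", "dorsal nerve cord", "post-anal tail"}
mammalia = chordata | {"hair", "mammary glands"}  # a hypothetical class under Chordata

# Nested hierarchy of characteristics: each level is a superset of its parent level.
print(animalia <= chordata <= mammalia)  # True
```

The containment runs one way only: removing a defining trait at any level breaks the subset relation, which is the "direction of additive characteristics" point made above.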

Thursday, January 19, 2012

Measuring Complex Specified Information with Respect to Biology

Once again, I don't know why this is so difficult, but here it is:

Complex specified information is a specified subset of Shannon information. That means that complex specified information is Shannon information of a specified nature, ie with meaning and/ or function, and with a specified complexity.

Shannon tells us that since there are 4 possible nucleotides, 4 = 2^2 = 2 bits of information per nucleotide. Also there are 64 different coding codons, 64 = 2^6 = 6 bits of information per amino acid, which is the same as the three nucleotides it was translated from.

Take, for example, a 100 amino acid long functioning protein- a protein that cannot tolerate any variation, which means it is tightly specified- and just do the math: 100 x 6 + 6 (stop) = 606 bits of specified information- minimum- to get that protein. That means CSI is present and design is strongly supported.

Now if any sequence of those 100 amino acids can produce that protein then it isn't specified. IOW if every possible combo produced the same resulting protein, I would say that would put a hurt on the design inference.

The variational tolerance has to be figured in with the number of bits.
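The arithmetic above can be sketched directly; this is a minimal sketch of the post's own numbers, with no variational-tolerance adjustment applied:

```python
import math

# Bits per symbol from the number of equiprobable alternatives (Shannon).
bits_per_nucleotide = math.log2(4)   # 4 nucleotides -> 2 bits
bits_per_codon = math.log2(64)       # 64 codons -> 6 bits (= 3 nucleotides x 2 bits)

# A hypothetical 100-residue protein that tolerates no variation,
# plus one stop codon, per the post's arithmetic:
protein_length = 100
total_bits = protein_length * bits_per_codon + bits_per_codon
print(total_bits)  # 606.0
```

If the protein tolerated substitutions at some positions, the per-position bit count would shrink accordingly, which is the "variational tolerance" point above.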

from Kirk K. Durston, David K. Y. Chiu, David L. Abel, Jack T. Trevors, “Measuring the functional sequence complexity of proteins,” Theoretical Biology and Medical Modelling, Vol. 4:47 (2007):
[N]either RSC [Random Sequence Complexity] nor OSC [Ordered Sequence Complexity], or any combination of the two, is sufficient to describe the functional complexity observed in living organisms, for neither includes the additional dimension of functionality, which is essential for life. FSC [Functional Sequence Complexity] includes the dimension of functionality. Szostak argued that neither Shannon’s original measure of uncertainty nor the measure of algorithmic complexity are sufficient. Shannon's classical information theory does not consider the meaning, or function, of a message. Algorithmic complexity fails to account for the observation that “different molecular structures may be functionally equivalent.” For this reason, Szostak suggested that a new measure of information—functional information—is required.

With text we use 5 bits per character, which gives us the 26 letters of the alphabet and 6 other characters. The paper below puts it all together- peer-reviewed. It tells you exactly how to measure the functional information, which is exactly what Dembski and Meyer are talking about wrt CSI. So read the paper- it tells how to do exactly what you have been saying no one knows how to do. It isn't pro-ID and the use of AVIDA as evidence of "emergence" is dubious*, but the math is there for you to misunderstand or not comprehend.

Here is a formal way of measuring functional information:

Robert M. Hazen, Patrick L. Griffin, James M. Carothers, and Jack W. Szostak, "Functional information and the emergence of biocomplexity," Proceedings of the National Academy of Sciences, USA, Vol. 104:8574–8581 (May 15, 2007).
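Hazen et al. define functional information as I(Ex) = -log2[F(Ex)], where F(Ex) is the fraction of all possible configurations that achieve at least the specified degree of function. A minimal sketch, with hypothetical counts:

```python
import math

def functional_information(n_functional: int, n_total: int) -> float:
    """I(Ex) = -log2(F(Ex)), where F(Ex) is the fraction of sequences
    meeting or exceeding the function threshold Ex."""
    return -math.log2(n_functional / n_total)

# Hypothetical: 1 sequence out of 2^20 possible sequences performs the function.
print(functional_information(1, 2**20))  # 20.0 bits
```

The rarer the function among all possible sequences, the higher the functional information, which is what ties this measure to specification.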

See also:

Jack W. Szostak, “Molecular messages,” Nature, Vol. 423:689 (June 12, 2003).

*1- Avida "organisms" are far too simple to be considered anything like a biological organism

2- Avida organisms "evolve" via unreasonable parameters:

The effects of low-impact mutations in digital organisms

Chase W. Nelson and John C. Sanford

Theoretical Biology and Medical Modelling, 2011, 8:9 | doi:10.1186/1742-4682-8-9


Background: Avida is a computer program that performs evolution experiments with digital organisms. Previous work has used the program to study the evolutionary origin of complex features, namely logic operations, but has consistently used extremely large mutational fitness effects. The present study uses Avida to better understand the role of low-impact mutations in evolution.


When mutational fitness effects were approximately 0.075 or less, no new logic operations evolved, and those that had previously evolved were lost. When fitness effects were approximately 0.2, only half of the operations evolved, reflecting a threshold for selection breakdown. In contrast, when Avida's default fitness effects were used, all operations routinely evolved to high frequencies and fitness increased by an average of 20 million in only 10,000 generations.


Avidian organisms evolve new logic operations only when mutations producing them are assigned high-impact fitness effects. Furthermore, purifying selection cannot protect operations with low-impact benefits from mutational deterioration. These results suggest that selection breaks down for low-impact mutations below a certain fitness effect, the selection threshold. Experiments using biologically relevant parameter settings show the tendency for increasing genetic load to lead to loss of biological functionality. An understanding of such genetic deterioration is relevant to human disease, and may be applicable to the control of pathogens by use of lethal mutagenesis.

Kevin R. McCarthy, aka OgreMk5, Thinks He has Refuted Archaeology, Forensics, SETI and Intelligent Design

Yup Kevin is at it again, this time he thinks he has devised a test that will show that archaeology, forensic science, SETI, Intelligent Design, insurance fraud and more are all baseless and without merit.

Of course all he does is erect a strawman because he is an ignorant asshole and that is all he is good for.

I will say it AGAIN, Kevin- Intelligent Design, archaeology, forensics, and insurance-fraud investigation all say they can distinguish what happens by chance and/or necessity from what requires agency involvement. And that means, as I have told you but you ignored, context is everything.

The funniest part about Kevin's test- it also renders his position moot, as his position also requires the ability to distinguish between design and not.

Way to go dumbass...

Tuesday, January 10, 2012

ATP Synthase- All Experiments Point to Design

As I posted earlier:

The ATP Synthase is a system that consists of two subsystems-> one for the flow of protons down an electrochemical gradient from the exterior to the interior and the other (a rotary engine) that generates ATP from ADP using the energy liberated by proton flow. These two processes are totally unrelated from a purely physicochemical perspective*- meaning there isn't any general principle of physics or chemistry by which these two processes have anything to do with each other. Yet here they are.

How is this evidence for Intelligent Design? Cause and effect relationships, as in designers often take two totally unrelated systems and integrate them into one: the ordering of separate subsystems to produce a specific effect that neither can do alone. And those subsystems are composed of the ordering of separate components to achieve a specified function.

ATP synthase is not reducible to chance and necessity and also meets the criteria of design.

* Emergent collective properties, networks, and information in biology, page 23:
In the same vein, ATP synthesis in mitochondria can be conceived of and explained only because there is a coupling between ATP-synthase, the enzyme responsible for ATP synthesis, and the electrochemical potential. Hence ATP synthesis emerges out of this coupling. The activity of ATP-synthase alone could have in no way explained ATP synthesis. It is the merit of Mitchell, to have shown that it is precisely the interaction between two different physico-chemical events that generates this novel remarkable property. (italics in original)

Next we take a look inside ATP synthase-

“Thermodynamic efficiency and mechanochemical coupling of F1-ATPase”:

F1-ATPase is a nanosized biological energy transducer working as part of FoF1-ATP synthase. Its rotary machinery transduces energy between chemical free energy and mechanical work and plays a central role in the cellular energy transduction by synthesizing most ATP in virtually all organisms. However, information about its energetics is limited compared to that of the reaction scheme. Actually, fundamental questions such as how efficiently F1-ATPase transduces free energy remain unanswered. Here, we demonstrated reversible rotations of isolated F1-ATPase in discrete 120° steps by precisely controlling both the external torque and the chemical potential of ATP hydrolysis as a model system of FoF1-ATP synthase. We found that the maximum work performed by F1-ATPase per 120° step is nearly equal to the thermodynamical maximum work that can be extracted from a single ATP hydrolysis under a broad range of conditions. Our results suggested a 100% free-energy transduction efficiency and a tight mechanochemical coupling of F1-ATPase.

Highly efficient, irreducibly complex, and no way- physicochemically- to get the two subunits to come together-> there's no attraction and no coupling.

See also:

Davies et al., “Macromolecular organization of ATP synthase and complex I in whole mitochondria,” Proceedings of the National Academy of Sciences

Tamás Beke-Somfai, Per Lincoln, and Bengt Nordén, “Double-lock ratchet mechanism revealing the role of [alpha]SER-344 in F0F1 ATP synthase,” Proceedings of the National Academy of Sciences

Monday, January 09, 2012

NickM and Zachriel, Unable to Support Their Claim, Abuse Kimura

Yup, life is good and evotards are a bunch of dishonest losers. The latest gaffe is by NickM and Zachriel, who think that 40 previously occurring neutral mutations will become fixed in a population each generation.

To support their claim they cite Kimura and his neutral theory. Unfortunately for them Kimura never says what they do. However it is in the literature that it takes 4N generations, where N = population size, for a neutral mutation (that's ONE) to become fixed. And seeing as NickM suggested a population of 10,000, that would mean 40,000 generations.

So seeing that it is obvious that they are abusing Kimura I asked for experimental evidence that would support their claim.

Did they provide any? Nope. In typical cowardly evotard fashion NickM had a hissy-fit, falsely accused me of misunderstanding population genetics and ran away. OTOH Zachriel kept on bloviating as if his unsupported diatribe would be supported by his dishonesty.

And all that because a 10% difference between chimps and humans is way too much for their position to account for in the number of generations allotted.

(NickM doesn't care because he sez they were fixed- question-begging 101)

As I already posted:

Look, all I am asking for is evidence to support your claim:

"That means if the rate is 40 neutral mutations per birth per generation, then the expect value is 40 (previously occurring) mutations becoming fixed across the population in each generation."

I have provided references that say 4N generations for 1.

NickM started out with a population of 10,000. Do you understand what that means?
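Spelled out, the 4N arithmetic for that population size is:

```python
N = 10_000                    # population size suggested by NickM
generations_to_fix = 4 * N    # literature figure for ONE neutral mutation to fix
print(generations_to_fix)     # 40000
```

That is the per-mutation fixation time being asked about, as opposed to a per-generation fixation count.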

Sunday, January 08, 2012

NickM: Strawman Humper Extraordinaire

Recently a new commenter has graced my bloggings- NickM. Nick has proven to be an extraordinary strawman humper. The evidence for my claim is:

1- Nick's first offering:
To take the rate of point mutation (the type of mutation from A to T or C to A or whatever) and then make conclusions using the difference statistics which include indels is wildly, hopeless illegitimate.

Unfortunately for Nick, I wasn't taking the rate of point mutation. That was never part of my post. Strawman.

After I told him that, Nick kept humping that strawman until he got tired of it.

In the same thread Nick erects another:
You seem to think that you start out with a population with no mutations.

Again, I never said that, but there is the term "common ancestor", which one would think means very close genetically, anyway.

Nick accepts that but then has to erect yet another strawman:
And yet you think 1% divergence over ~6 million years of divergence is impossible.

That is strange seeing that the OP and topic refer to a 10%+ difference/divergence.

What the fuck Nick? Is your PhD thesis in strawman evolution?

Friday, January 06, 2012

Neutral Theory and Substitution rates

There seems to be some confusion about what the neutral theory says about substitution rates. NickM seems to think that if the mutation rate is 1.1x10^-8 then 40 (neutral) mutations will occur with each birth and 40 will become fixed in each generation because rate of substitution = mutation rate.

NickM gives a wikipedia reference for a mutation rate of 1.1 x 10^-8.
Anyhow -- another thing you obviously don't understand at all is that even under completely neutral conditions, with no natural selection acting at all, and with nothing but genetic drift going on, *the substitution rate equals the mutation rate*.

If the genome size is 3.2 billion bases, if the human-chimp divergence time is 6 my ago, and if the generation size is 15 years, 1% divergence in point mutations takes 32 million mutations. That's 40 mutations/generation.

What NickM doesn't seem to realize is the rate of substitution = 1.1 x 10^-8, not 40 mutations/generation. The 40 is the number of mutations we can expect in each birth given a genome of 3.2 billion bp.
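For reference, the arithmetic behind the two numbers in dispute, using only the figures quoted above, can be sketched as:

```python
genome_size = 3_200_000_000   # bases
mutation_rate = 1.1e-8        # per base per generation (NickM's wikipedia figure)

# Expected new mutations per birth = per-base rate x genome size:
per_birth = mutation_rate * genome_size
print(round(per_birth))       # 35, i.e. roughly the "40" figure

# NickM's divergence arithmetic: 1% of the genome over 6 My at 15-year
# generations, counting both lineages since the split:
point_differences = genome_size // 100    # 32,000,000
generations = 2 * 6_000_000 // 15         # 800,000
print(point_differences / generations)    # 40.0
```

Both calculations land near 40 per generation; the disagreement in the post is over what that number describes, not how it is computed.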

Neutral theory pertains to the mutation rate only, not the number of mutations. What is your degree in?

Ya see Nick in order to become fixed every member in the population has to haz it. Every member, biology 101. Well I guess if the population size is one.

EvoTard's finest...

Thursday, January 05, 2012

Chimp/ Human DNA Comparison- Bad News for EvoTards

The evotard bullshit lie is that chimp and human DNA is 98.X% similar. What the evotards do not tell you is that that figure is based on a small sample of similar DNA. And it doesn't take into account that there should be similarities based on the fact that the majority of the expressed genes are used for everyday maintenance and sustaining of the cell/organism. Meaning there should be similarities because those daily chores are also very similar.

But anyway, now that both the chimp and human genomes have been sequenced, scientists are seeing a much greater genetic difference between chimps and humans. What's the point? Well, at 1% difference that would be about 32,000,000 base differences, and for 10% it would be about 320,000,000 base differences.

What does that mean? Well for 10% if the split occurred 7.5 million years ago, that would mean that a great number of mutations would have to become fixed each year. Even at 1% there will still be a need to fix mutations on a regular basis.

7.5 million years since divergence* = 15 million years tip-common ancestor-tip

10 year generation = 1.5 million generations in that 15 million years

1.5 million generations / 80,000 indels = 18.75 generations per fixed indel

Nothing in any peer-reviewed literature supports that. And that is only for indels.

10% is the theory of evolution killer though, as that is just too much diversity for accumulations of random mutations to overcome. And yes there are studies that demonstrate the human/ chimp genomes are over 10% different.

Only imagination can get evotards out of that mess. Can't wait to see what their high priests have to say about it-> should be interesting.

*generous number as the literature has it anywhere from 4 million-7 million years ago

So at 4 million years ago = 8 million @ 10 years/ generation = 800,000 generations and with 80,000 indels that would be 1 fixed every 10 generations.
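The indel arithmetic above, for both divergence dates, can be sketched as:

```python
indels = 80_000
generation_time = 10    # years

# 7.5 My since divergence -> 15 My tip-to-tip:
gens_75mya = 2 * 7_500_000 // generation_time   # 1,500,000 generations
print(gens_75mya / indels)                      # 18.75 generations per fixed indel

# 4 My since divergence -> 8 My tip-to-tip:
gens_4mya = 2 * 4_000_000 // generation_time    # 800,000 generations
print(gens_4mya / indels)                       # 10.0 generations per fixed indel
```

The doubling to tip-to-tip time reflects that differences accumulate along both lineages after the split.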

Tuesday, January 03, 2012

Jason Rosenhouse, Just Another EvoTard Liar

Yup, Jason Rosenhouse is just another typical evotard liar.

In his fact-free blog, Twenty Years after Darwin on Trial, ID is dead, Jason spews the lie:
There have been numerous books and countless magazine and internet postings addressing and refuting all of the major arguments ID has to offer.

Fuck you you lying piece-of-shit. You can't even produce a testable hypothesis for the premise that the bacterial flagellum evolved via accumulations of random mutations. And if you can't do that then you don't have anything that refutes ID.

Then Jason sez that ID doesn't have anything new to offer. Well Jason your position doesn't have anything to offer. Nothing at all.

But anyway what is this alleged evidence that refutes ID? Until you can muster a testable hypothesis that evidence doesn't exist.

Sunday, January 01, 2012

Retard's "Evidence" for Common Ancestry

EvoTards are such clueless pricks. It is entertaining watching them present "evidence" for their claims.

With universal common ancestry there isn't any way to test the claim objectively, as there is way, way too much time involved. This allows evotards to weasel out of doing any science to support their position. However they will have us believe there is circumstantial evidence. Evidence, they say, that can only be explained via common ancestry-> which is just a question-begging fallacy.

For example, I was just visited by an evotard that goes by "human ape". It wants to be an ape so badly it calls itself one. But that isn't the entertainment. On its blog it sez:

Human ape fetuses and whale fetuses are covered with hair which drops off before birth. The hair is a remnant of our ancient hairy ape ancestors and it's a remnant of whale ancestors who lived on land. This is the only possible scientific explanation.

Unfortunately the dumbass has absolutely no way of supporting its question-begging claim. Not only that, the freaking ignorant punk doesn't understand that evidence for common ancestry is not evidence for any mechanism.

But anyway-

Why can't it be that the hair/ fur is just part of the developmental program for mammals? One basic/ common developmental program for mammals, with each getting its own required "personal" code that determines the final archetype.

So untestable question-begging plus the inability to think = evidence for a human ape.

On another note-

2012 will go down as the year in which the myth of 98.x% genetic similarity between chimp and humans is shattered, putting the actual % below 90%. And that may be too large of a difference to be accounted for via stochastic processes.