Measuring Information- Revisited
The causal tie between an artifact and its intended character — or, strictly speaking, between an artifact and its author’s productive intention — is constituted by an author’s actions, that is, by his work on the object.- Artifact

When discussing information, some people want to know: how much information does something contain?
If it is something straightforward, such as a definition, we can count the number of bits in that definition to find out how much information it contains.
aardvark: a large burrowing nocturnal mammal (Orycteropus afer) of sub-Saharan Africa that has a long snout, extensible tongue, powerful claws, large ears, and heavy tail and feeds especially on termites and ants

A simple character count reveals 202 characters, which translates into 1010 bits of information/specified complexity.
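The arithmetic behind that character count can be sketched as follows. The 5-bits-per-character encoding is an assumption on my part (a 2^5 = 32-symbol code covering the alphabet plus a few marks), but it is consistent with the 202 → 1010 figure above:

```python
# Sketch of the character-count approach above.
# Assumption: each character carries 5 bits, since 2^5 = 32 symbols
# is enough for a 26-letter alphabet plus punctuation; the post's
# 202 characters -> 1010 bits figure matches this encoding.
BITS_PER_CHARACTER = 5

def information_bits(num_characters: int) -> int:
    """Translate a character count into bits of information."""
    return num_characters * BITS_PER_CHARACTER

print(information_bits(202))  # the aardvark definition: 1010 bits
```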
Now what do we do when all we have is an object?
One way of figuring out how much information it contains is to figure out the simplest way to make it.
Then you write down the procedure without wasting words or characters and count those bits. The point is that you have to capture the actions required and translate them into bits; that is, if you want to use CSI. However, by doing all of that you have already determined the thing was designed. Now you are just trying to determine how much work was involved.
But anyway, that will give you an idea of the minimal information it contains: data collection and compression (Six Sigma DMAIC: define, measure, analyze, improve, control).
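As a rough illustration of the "write the procedure down without wasting characters" idea: the true minimal description (Kolmogorov complexity) is uncomputable, but a general-purpose compressor gives an upper bound on it. This sketch uses zlib as that stand-in; the example procedures are my own, not the post's:

```python
import zlib

def description_upper_bound_bits(procedure: str) -> int:
    """Upper bound on the bits needed to encode a build procedure,
    using a general-purpose compressor as a stand-in for the
    (uncomputable) minimal description length."""
    return len(zlib.compress(procedure.encode("utf-8"))) * 8

# A wasteful, repetitive procedure compresses far below its raw size;
# a terse one is already close to minimal.
repetitive = "stack one brick. " * 50
terse = "stack 50 bricks."
print(description_upper_bound_bits(repetitive))
print(description_upper_bound_bits(terse))
```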
CSI is a threshold, meaning you don't need an exact number. And it is a threshold that nature, operating freely, has never been observed to come close to. Once CSI = yes, you know it was designed.
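The post does not name a figure here, but Dembski's universal probability bound puts the CSI threshold at 500 bits of specified information. A minimal sketch of the threshold test, assuming that bound (the function name is mine):

```python
# Assumption: Dembski's universal probability bound corresponds to a
# 500-bit threshold for specified information.
CSI_THRESHOLD_BITS = 500

def exhibits_csi(specified_bits: int) -> bool:
    """Return True when the measured specified information
    meets or exceeds the assumed 500-bit threshold."""
    return specified_bits >= CSI_THRESHOLD_BITS

print(exhibits_csi(1010))  # the 1010-bit aardvark definition clears the bar
print(exhibits_csi(120))   # below the threshold: no design inference
```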
On Shannon Information and measuring biological information:
The word information in this theory is used in a special mathematical sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning.- Warren Weaver, one of Shannon's collaborators
Is what Weaver said so difficult to understand?
Kolmogorov complexity deals with, well, complexity. From Wikipedia:
Algorithmic information theory principally studies complexity measures on strings (or other data structures).
Nothing about meaning, content, functionality, prescription. IOW nothing that Information Technology cares deeply about, namely functional, meaningful, and useful information. Not only Information Technology but the whole world depends on that Information Technology type of information, ie the type of information Intelligent Design is concerned with.
And both Creationists and IDists make it clear, painfully clear, that when we are discussing "information" we are discussing that type of information.
And without even blinking an eye, the anti-IDists always, and without fail, bring up the meaningless when trying to refute the meaningful. “Look there is nature producing Shannon Information, you lose!”- ho-hum.
Biological specification always refers to function. An organism is a functional system comprising many functional subsystems. In virtue of their function, these systems embody patterns that are objectively given and can be identified independently of the systems that embody them. Hence these systems are specified in the same sense required by the complexity-specification criterion (see sections 1.3 and 2.5). The specification of organisms can be cashed out in any number of ways. Arno Wouters cashes it out globally in terms of the viability of whole organisms. Michael Behe cashes it out in terms of minimal function of biochemical systems.- Wm. Dembski page 148 of NFL
In the preceding and following paragraphs William Dembski makes it clear that biological specification is CSI- complex specified information.
In the paper "The origin of biological information and the higher taxonomic categories", Stephen C. Meyer wrote:
Dembski (2002) has used the term “complex specified information” (CSI) as a synonym for “specified complexity” to help distinguish functional biological information from mere Shannon information--that is, specified complexity from mere complexity. This review will use this term as well.
In order to be a candidate for natural selection a system must have minimal function: the ability to accomplish a task in physically realistic circumstances.- M. Behe page 45 of “Darwin’s Black Box”
With that said, to measure biological information, ie biological specification, all you have to do is count the coding nucleotides of the genes involved in that functioning system, then multiply by 2 (four possible nucleotides = 2^2, ie 2 bits each) and then factor in the variation tolerance:
from Kirk K. Durston, David K. Y. Chiu, David L. Abel, Jack T. Trevors, “Measuring the functional sequence complexity of proteins,” Theoretical Biology and Medical Modelling, Vol. 4:47 (2007):
[N]either RSC [Random Sequence Complexity] nor OSC [Ordered Sequence Complexity], or any combination of the two, is sufficient to describe the functional complexity observed in living organisms, for neither includes the additional dimension of functionality, which is essential for life. FSC [Functional Sequence Complexity] includes the dimension of functionality. Szostak argued that neither Shannon’s original measure of uncertainty nor the measure of algorithmic complexity are sufficient. Shannon's classical information theory does not consider the meaning, or function, of a message. Algorithmic complexity fails to account for the observation that “different molecular structures may be functionally equivalent.” For this reason, Szostak suggested that a new measure of information—functional information—is required.
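The counting recipe above (2 bits per coding nucleotide, adjusted for variation tolerance) can be sketched as follows. The post does not say exactly how to "factor in" tolerance; here I model it, as an assumption, by subtracting log2 of the number of sequence variants that still perform the function:

```python
import math

BITS_PER_NUCLEOTIDE = 2  # four possible nucleotides -> log2(4) = 2 bits

def biological_information_bits(coding_nucleotides: int,
                                functional_variants: int = 1) -> float:
    """Bits for a functional coding sequence: 2 bits per nucleotide,
    reduced by variation tolerance. `functional_variants` is the number
    of distinct sequences that still perform the function (1 = only one
    sequence works); modeling tolerance as -log2(variants) is an
    assumption, not the post's stated formula."""
    raw_bits = BITS_PER_NUCLEOTIDE * coding_nucleotides
    return raw_bits - math.log2(functional_variants)

print(biological_information_bits(300))        # 600.0 bits, no tolerance
print(biological_information_bits(300, 1024))  # 590.0 bits with tolerance
```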
Here is a formal way of measuring functional information:
Robert M. Hazen, Patrick L. Griffin, James M. Carothers, and Jack W. Szostak, "Functional information and the emergence of biocomplexity," Proceedings of the National Academy of Sciences, USA, Vol. 104:8574–8581 (May 15, 2007).
Jack W. Szostak, “Molecular messages,” Nature, Vol. 423:689 (June 12, 2003).
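The Hazen et al. paper cited above defines functional information as I(Ex) = -log2[F(Ex)], where F(Ex) is the fraction of all possible configurations that achieve at least the degree of function Ex. A minimal sketch of that measure (the toy numbers are mine):

```python
import math

def functional_information_bits(functional_count: int,
                                total_count: int) -> float:
    """Hazen/Szostak functional information: I(Ex) = -log2(F(Ex)),
    where F(Ex) is the fraction of all possible sequences that meet
    the functional threshold Ex."""
    fraction = functional_count / total_count
    return -math.log2(fraction)

# Toy example: if 4 sequences out of the 4^10 possible 10-mers work,
# F = 4 / 4^10 = 2^-18, so the function carries 18 bits.
print(functional_information_bits(4, 4 ** 10))  # 18.0
```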
original posts can be found here, here and here