"Specification- The Pattern that Signifies Intelligence"*
ABSTRACT: Specification denotes the type of pattern that highly improbable events must exhibit before one is entitled to attribute them to intelligence. This paper analyzes the concept of specification and shows how it applies to design detection (i.e., the detection of intelligence on the basis of circumstantial evidence). Always in the background throughout this discussion is the
fundamental question of Intelligent Design (ID): Can objects, even if nothing is
known about how they arose, exhibit features that reliably signal the action of an
intelligent cause? This paper reviews, clarifies, and extends previous work on
specification in my books The Design Inference and No Free Lunch.--Wm Dembski
*Wm. Dembski on Specification
CJYman has also weighed in on specification:
CJYman on Specification
Here is a simple example of measuring for a specification:
Since a specification includes, but is not limited to, function, I will use an example of specification based on compressibility, since compressibility is a way of independently formulating a certain pattern.
A string of 30 1s ("111111111111111111111111111111")
can be independently formulated as:
"print '1' 30x". So let's attempt to find out whether this specified pattern is also a specification. Here's the equation to use:
χ = -log2[number of bit operations * number of specified patterns * probability of pattern]
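As a sketch, this equation can be written as a small Python function (the name `chi` and the argument names are my own labels, not from the original):

```python
import math

def chi(bit_operations: int, specified_patterns: int, probability: float) -> float:
    """Specified complexity: chi = -log2(operations * patterns * probability).
    A value greater than 1 is counted as a specification in this scheme."""
    return -math.log2(bit_operations * specified_patterns * probability)
```

For the worked example that follows, `chi(30, 2, 1/2**30)` comes out just above 24.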
Let's first calculate the number of specified patterns that have the same compressibility (specificity in this case) as the string of 30 1s. If the above string = 30 bits, then there is only one other pattern with the same compressibility -- a string of 30 0s.
So, we multiply 2 by the probability of the pattern in question (any one particular 30-bit string has probability 1/2^30 = 1/1,073,741,824):
2 * 1/1,073,741,824
Now, let's calculate how many bit operations it took to arrive at the pattern in question:
Let's say we started at a random 30-bit string such as "100011101111100010111010000010"
and arrived at the pattern in question (30 1s) in only 30 random bit flips/operations. Then:
χ = -log2[30 * 2 * 1/1,073,741,824]
χ ≈ 24
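The arithmetic can be checked directly (a quick sketch in Python, plugging in the numbers above):

```python
import math

probability = 1 / 2**30   # = 1/1,073,741,824, one particular 30-bit string
patterns = 2              # 30 ones or 30 zeros share the same compressibility
operations = 30           # random bit flips allowed
chi = -math.log2(operations * patterns * probability)
print(round(chi, 2))      # 24.09, i.e. approximately 24
```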
Since 24 is greater than 1, we have a specification, and it is beyond random chance processes to generate the pattern of 30 1s from a random 30-bit string within 30 random bit operations. Thus, we must begin to look at causal options other than chance to arrive at the pattern in question in the above scenario.
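To illustrate the chance hypothesis, here is a small Monte Carlo sketch (my own illustration, not part of the original argument): starting from random 30-bit strings and applying 30 random bit flips, the all-ones pattern essentially never appears, matching the roughly 1-in-2^30 probability per trial.

```python
import random

random.seed(0)                         # fixed seed for reproducibility
trials = 100_000
hits = 0
for _ in range(trials):
    bits = [random.randint(0, 1) for _ in range(30)]   # random 30-bit start
    for _ in range(30):                # 30 random bit flip operations
        i = random.randrange(30)
        bits[i] ^= 1
    if all(b == 1 for b in bits):
        hits += 1
print(hits)   # expected hit rate is about 2^-30 per trial, so almost certainly 0
```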
Now, when measuring for a functional specification (within a set of functional "islands"), you apply the same equation. However, when measuring the specificity, you take into account all other FUNCTIONAL patterns (those able to be processed into function by the system in question) that have the same probability of appearance as the pattern in question, instead of taking into account all equally probable and compressible patterns.
Furthermore, according to the NFL Theorem, an evolutionary algorithm must incorporate problem-specific information in order to achieve better-than-chance performance, which is exactly what a specification measures.
The next question: will a random set of laws cause an information processing system and evolutionary algorithm to randomly materialize?
According to recent work on Conservation of Information Theorems (which I won't get into at the moment, since I'm already taking over joe's blog post -- sorry, joe), ID theorists state that the answer is "NO!" In fact, getting consistently better-than-chance results without prior problem-specific information is to information theory what perpetual-motion and free-energy machines are to physics.
Merely produce an information processing system and evolutionary algorithm from a truly random (high thermodynamic entropy/low information) set of laws and ID Theory is falsified.