Posted by Richard B. Hoppe on May 25, 2004 04:10 AM

Since I seem to be on a roll, I thought I’d take a look at a claim that’s made frequently by ID proponents, namely that archaeology uses a design detection procedure akin to that allegedly formalized by Dembski’s Explanatory Filter.  (Gary Hurd: Feel free to interject/comment/correct at will.)

Review of the Explanatory Filter

Recall that the Explanatory Filter classifies explanations into just three mutually exclusive kinds: regularity (or law), chance, and design.  It employs a decision tree to determine which explanation is appropriate for a given phenomenon, with the decisions determined at each node in the tree by a probability estimate.  The first decision to be made is whether a given phenomenon is best explained as a “regularity” — an inevitable consequence of natural law(s).  Such are high-probability events, and if the probability of the phenomenon is high, the decision is “regularity” and the decision process terminates.

The second decision is whether the phenomenon is due to chance.  One estimates the probability of occurrence of the phenomenon under three assumptions: the events or parts comprising the phenomenon are mutually independent, the phenomenon is a discrete combinatorial object (in space for structures, in sequence for processes), and the probability distribution over the alternatives is uniform.  If the probability of spontaneous aggregation of the phenomenon is “intermediate,” which apparently means somewhere between 1 in 10^150 and the near vicinity of 1.0 (1.0 being the best-of-all-possible-worlds “regularity” probability and 1 in 10^150 being Dembski’s “Universal Probability Bound”), one attributes the phenomenon to “chance.”
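
To make the flavor of that estimate concrete, here is a minimal sketch (in Python) under exactly those assumptions.  The numbers — a phenomenon of 100 parts, each one of 30 interchangeable alternatives — are invented purely for illustration and come from nowhere in Dembski’s writing.

    from math import log10

    UNIVERSAL_PROBABILITY_BOUND = 1e-150    # Dembski's 1 in 10^150

    def uniform_assembly_probability(n_parts, m_alternatives):
        """Probability of one specific arrangement of n_parts mutually independent
        parts, each drawn uniformly from m_alternatives equally likely options."""
        return m_alternatives ** -n_parts

    p = uniform_assembly_probability(100, 30)    # hypothetical phenomenon
    print(f"p is roughly 10^{log10(p):.0f}")     # about 10^-148
    print(p > UNIVERSAL_PROBABILITY_BOUND)       # True: "intermediate," so attribute to chance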

If the phenomenon is “specified” and if its probability of occurrence (estimated as above) is less than 1 in 10^150, we are to attribute it to design.  “Specified” means (roughly) that the phenomenon displays a pattern that we recognize.  For example, the specification of the eubacterial flagellum is an “outboard rotary motor” (No Free Lunch, p. 289).  I will leave (nearly) unremarked Dembski’s assertion that

Specification is never a problem.  The irreducibly complex systems we consider, particularly those in biology, always satisfy independently given functional requirements (see section 3.7). … This is not to say that for the biological function of a system to constitute a specification humans must have independently invented a system that performs the same function. … At any rate, no biologist I know questions whether the functional systems that arise in biology are specified. (p. 289)

I will here note only that in section 3.7, which is slightly less than 2 pages long, Dembski manages to use “specified” (or “specification”) in at least 7 different senses and settles on none of them.  No wonder no biologist would question it — one doesn’t know what “it” is!  Since for Dembski “specification” reduces to ‘this system has a known function,’ “specified complexity” reduces to ‘this system is very very very VERY unlikely to have spontaneously aggregated.’
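
Putting the three decisions together, the Filter amounts to the little decision procedure sketched below.  This is my own rendering, for illustration only: the 1 in 10^150 bound is Dembski’s, but the cutoff for “high” probability is a stand-in, since the Filter leaves that value vague.

    UNIVERSAL_PROBABILITY_BOUND = 1e-150   # Dembski's bound
    HIGH_PROBABILITY = 0.5                 # stand-in; the Filter never says how high is "high"

    def explanatory_filter(p, is_specified):
        """Classify a phenomenon from its estimated probability p and from whether
        it matches an independently given pattern (a "specification")."""
        if p >= HIGH_PROBABILITY:
            return "regularity"            # inevitable consequence of natural law
        if p >= UNIVERSAL_PROBABILITY_BOUND or not is_specified:
            return "chance"                # "intermediate" probability, or no recognized pattern
        return "design"                    # specified and below the universal probability bound

    print(explanatory_filter(1e-148, True))    # chance: above the bound
    print(explanatory_filter(1e-160, False))   # chance: wildly improbable but unspecified
    print(explanatory_filter(1e-160, True))    # design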

The IDist claim about archaeology

Dembski and other IDists claim that a number of “special” sciences like SETI, forensics, and archaeology employ an informal “pretheoretic” form of the Explanatory Filter to ascertain whether a given phenomenon is due to natural causes (regularity), chance, or (human) design.  For example, in Reason & Revelation we read

Intuitively, we know that design cannot be a concept that is foreign to science because there are disciplines of a scientific nature that seek to tease apart natural causes from intelligent agency. Dembski is not proposing a change in the way that these scientists work. All he has done is to formalize the process that they (and many of us) already use, and to show that the process can detect design in a reliable fashion

One such discipline is forensic science. A typical task of forensic investigators is to determine the cause of death: Was it natural or suspicious? Should they be looking for a person—an agent—who caused this death? Another science is archaeology, which on a regular basis must distinguish genuine artifacts from stones, sticks, and other items that so often clutter excavation sites. One of the best-known examples of design detection is SETI (the Search for Extraterrestrial Intelligence). (Bolding added)

(I won’t remark on the groundlessness of the “reliability” claim here.)  From the Institute for Christian Teaching:

This [Dembski’s The Design Inference] is all very exciting to the “design crowd” because complexity and specification are “testable” and “quantifiable” characteristics, and Dembski has established an objective method for testing structures, processes, etc. to determine if they are the result of law, chance, or design.  No longer are the defenses, “You just need more imagination” or “You believe in the ‘God of the Gaps’”, tenable, because the invocation of design is no longer a response to our ignorance, or our aesthetic judgment, or our biases.  What we have now is a scientifically defensible basis for saying “With current knowledge, we have shown that unintelligent causes are unable to produce this artifact.”  This isn’t really “new” science, either.  It borrows from the same kind of work that goes on constantly in archaeology, forensic science, etc.  (Bolding added)

Finally, from Dembski himself:

There does in fact exist a rigorous criterion for discriminating intelligently caused from unintelligently caused objects.  Many special sciences (e.g., forensic science, artificial intelligence, cryptography, archaeology, and the Search for Extraterrestrial Intelligence) already use this criterion, though in pretheoretic form.  I call it the complexity-specification criterion. (No Free Lunch, p. 6; Bolding added)

and later in the same book

Intelligent design is a scientific research program that examines the role of specified complexity in nature.  Since many special sciences already employ specified complexity as a criterion for detecting design (e.g., SETI and archaeology) there can be no principled objection to teaching intelligent design within a science curriculum, and particularly whenever the origin and history of life comes up in grades K-12.  (NFL, p. 314; Bolding added)

What do archaeologists actually do?

I took an undergraduate degree in anthropology back in the dark ages and was required to take some ‘bones and stones’ in the course of doing that degree.  I deem myself to be an informed layman about such matters: I can read the literature and interpret it with some small effort.  I don’t recall, from those courses or my subsequent reading, any sign that a “pretheoretic” version of Dembski’s Explanatory Filter was in use then or is in use now.  My most vivid memory, in fact, is of a junior faculty member sitting on a table in front of the lecture hall knapping stone tools while lecturing about them.  That is, he demonstrated how the tools were manufactured; he didn’t estimate their probability of occurrence, considered as spontaneously aggregated discrete combinatorial objects, against a uniform probability distribution.

The most interesting thing about his demonstrations was the range of products they yielded.  Since he started from scratch (a nodule of flint), when he finished there was not only a nice projectile point but also characteristic debris — the flint core off which he had hammered several large flakes that later became the points he knapped, some unsuitable flakes, and a slew of chips and pieces of varying sizes, ranging from largish chips produced by percussive blows to get the initial flake into some kind of rough shape, down to the tiny chips produced by the final pressure flaking (using a deer antler) to finish the sharp edges of the point.  (Incidentally, I’m told that chips show a characteristic microscopic pattern on the struck edge created by a percussive blow that’s different from the pattern caused by pressure flaking.)

Now, one can obviously identify the final product, a projectile point, as “designed” by its shape, referring it to any number of instances of known human manufacture.  But since the manufacturing process left characteristic debris, one could also say something about how the point was produced — one is not restricted to the object itself.  A year ago I found a projectile point manufacturing site on a hillside at the edge of a newly plowed field.  I found no points or cores, but I found a concentrated area thick with the kinds of chips produced by the manufacture of points from flakes.  Did I do a probability analysis to infer that they were produced by a human?  Nope.  I compared my observation of the distribution of chips (yes, I picked up as many as I could find to sort and count them!) to the known distribution of debris produced by manufacturing projectile points from good Mercer flint, a straightforward distributional comparison process. 
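
For readers who want to see the flavor of such a distributional comparison, here is a sketch of one way it could be formalized as a goodness-of-fit test.  The size classes, counts, and proportions are invented for illustration, and this particular test is only one option; the point is that the inference rests on comparing observed debris to known manufacturing debris, not on any cosmic probability bound.

    from scipy.stats import chisquare

    # Three debitage size classes (hypothetical): large percussion chips,
    # medium shaping chips, and tiny pressure-flaking chips.
    observed = [18, 45, 140]                  # hypothetical counts picked up at the site
    expected_fraction = [0.10, 0.22, 0.68]    # hypothetical proportions from experimental knapping

    total = sum(observed)
    expected = [f * total for f in expected_fraction]

    stat, p_value = chisquare(observed, expected)
    print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
    # A large p-value means the site debris is consistent with the known
    # distribution of debris from manufacturing points from flakes.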

Consider the most primitive known tools manufactured by hominids, the Oldowan (sometimes Olduwan) pebble tool assemblages from the Olduvai Gorge explored for decades by the Leakeys.  There are hammerstones, cores, and flakes associated with the manufacturing of Oldowan assemblages, as well as choppers and scrapers.  That is, one finds the tools used to manufacture flake tools (hammerstones) and the two main components left by the manufacturing process, the flakes and the cores from which the flakes were struck.  There’s no probability calculation involved; design (here, manufacture by hominids) is inferred from the evidence left by the manufacturing process.

Experimental archaeologists have done a variety of studies that provide information on characteristic distributions of debris types, characteristic usage wear patterns, and the relative utilities of tools for various functions, for example by testing unretouched flakes against bifacial hand axes as large-animal butchering tools.  Again, there’s no probability calculation to eliminate regularity and chance in order to infer design; rather, there is systematic study of the artefacts and their properties and context in order to make inferences about early hominid tool manufacture.  Microscopic examinations of tool edges yield information about use, as well.  There is even a mathematical model that explains the consistency of the Pleistocene Levallois stone tool assemblage across a considerable geographic area in Eurasia and Africa in terms of the efficiency of a manufacturing process utilizing stone raw materials.

So rather than looking at an artefact in isolation and calculating some cockamamie probability, archaeologists look at objects in context: they make strong efforts to ascertain the method of manufacture, they seek evidence left by the manufacturing process (both on the object itself and on scrap material and tools in the vicinity), and they analyze the marks left by functional use.  And not least, they look for direct evidence — fossils — or other indications, independent of the artefacts themselves, of the presence of the humans who made the tools: fire pits, for example, and other traces of dwellings or camp sites.

Nothing I see in that research methodology would be even mildly helped by Dembski’s Explanatory Filter, to say nothing of embodying a primitive version of its alleged formalism.  The Filter might fit what Dembski imagines archaeologists do, but his familiarity with archaeological research methodologies seems even more tenuous than his familiarity with evolutionary biology.  I therefore conclude that claims that archaeology uses a “pretheoretic” version of Dembski’s Explanatory Filter or his “complexity-specification criterion” are false.

RBH