Saturday, July 24, 2010

David Warren is an Ignoramus

What happens when a newspaper hires an ignoramus as a columnist?

You get this kind of drivel, which has been deftly taken apart by media culpa.

David Warren seems to have attended the Denyse O'Leary school of ignorance.

Sunday, July 11, 2010

My Annie Hall Moment

One of the best moments in Annie Hall occurs when Alvy Singer (Woody Allen) is standing in line at the theater with Annie Hall (Diane Keaton), listening to a guy pontificate behind him. When the guy mentions Marshall McLuhan, Singer pulls out McLuhan from behind a poster, who then proceeds to say "You know nothing of my work!"

Singer then says, "Boy, if life were only like this!"

Well, perhaps what just happened to me is not up to that standard, but here goes anyway:

Over at Uncommon Descent, writer "DonaldM" uses the satiric experiment of physicist Alan Sokal to argue against "dogmatic Darwinism" and for "the sunshine of Truth".

I wrote to Alan Sokal and asked him what he thought of DonaldM's ramblings. Here are excerpts from his e-mail to me (ellipses, as usual, denote omissions):

Many thanks for drawing my attention to that strange blog item... I don't really understand the logic of how that ID guy is purporting to use me!

I mean, I looked at Paul Greenberg's article in the Jewish World Review that he cites http://www.jewishworldreview.com/cols/greenberg070610.php3?printer_friendly and it seems to be a straightforward piece supporting my contention that there is such a thing as objective reality (though he didn't get quite correct his purported quote from me). But then the ID guy seems to overlook the obvious irony in the paragraph from Greenberg that he quotes, and takes it literally -- or else he just drops the subject there, says that "All this reminds me" of something else that is vaguely related, and goes on with his own pet story.

Now, that story makes a valid point, namely that how one interprets evidence is affected (though not determined) by the preconceptions one comes with. But if we are having a contest about who is more open to having his or her preconceptions be refuted by inconvenient evidence, then I would have to say that -- though no one is perfect -- scientists win hands down over the devotees of sacred texts. (I know, I know, they will respond in a chorus: ID is not religion, and our support for ID does not arise from any religious commitment but simply from our dispassionate analysis of the scientific evidence. Yeah, right.)

...


Isn't it great that life is like this!

Saturday, July 03, 2010

The Day My Father Got Arrested



Sixty-eight years ago today, my father was arrested in Philadelphia.

He didn't rob a bank or run over a priest. No, his only crime was to take a photograph of one of the US's most enduring symbols of freedom: the Liberty Bell.

Back in 1942, the country was at war. My father hadn't yet enlisted in the Army; he was still a reporter for the Philadelphia Record. He was living only three blocks away from the Liberty Bell, which at the time was in Independence Hall. (Now it's in its own special building across the street.) My father often met tourists who wanted to take a picture of the Liberty Bell, but were prevented from doing so by an arbitrary rule imposed by the Bureau of City Property. My father got indignant when he learned that commercial photographers were able to take pictures of the Liberty Bell, but not the average citizen. That's the way my father was -- he liked to stick up for the little guy.

So he took a photograph -- and promptly got arrested. Maybe it was partly a publicity stunt for the newspaper, but I think he was trying to make a serious point, too. Officials asked if he was a communist, and called him "vindictive". He spent the night in jail. But after the article he wrote about his experience appeared in the Record, he was acquitted of the charge of "breach of the peace" by Magistrate Nathan A. Belfel. Maybe that's because my father was clever enough to bring along some important people, like the president of the Philadelphia Stock Exchange, to witness his arrest and speak on his behalf. Today, I'm happy to say, that old rule about the Liberty Bell is no longer in place.

But some things never change. We're at war again. And ordinary citizens are still being harassed for taking perfectly legal photographs of public buildings.

My father died in 1995. I like to think, however, that if he were still alive, he'd still be sticking up for the little guy -- and for the right to photograph without being arrested by overzealous officials.

Wednesday, June 30, 2010

The Worst Science Books

Over at Uncommon Descent, Denyse O'Leary, the world's worst journalist™, gives us a list of her favorite science books --- in her usual barely literate style. (Note to Denyse: the plural of "coo" is not "coo's".)

No surprise, three of them aren't written by scientists: Darwin on Trial, Signature in the Cell, and Alfred Russel Wallace's Theory of Intelligent Evolution. Of the other two, one was written by a very mediocre scientist who made basic mistakes in previous books, and the other by a man whose bogus claims were repudiated by his own department. In Denyse's topsy-turvy world, actual scientists can be dismissed as "mooches and tax burdens", or "British aristocrats".

The late Martin Gardner studied this kind of crankery and knew how to recognize it. A scientific crank, Gardner said, "has strong compulsions to focus his attacks on the greatest scientists and the best-established theories." It is not possible to reason with this kind of idiocy -- ridicule is the best response.

Actually, Denyse's list would be a good start on a list of the Worst Science Books. Do you have any more nominations? I'll start with Judith Hooper's Of Moths and Men, Arthur Koestler's The Case of the Midwife Toad, and anything by Jeremy Rifkin.

Thursday, June 24, 2010

Mr. Jefferson and the Giant Moose


I recently read a fun little book, Mr. Jefferson and the Giant Moose, by Lee Alan Dugatkin. It's the story of a little-known episode in American history -- how Jefferson tried to combat the bogus claims of French naturalist Georges-Louis Leclerc, Comte de Buffon, that North American biota was "degenerate" compared to European biota.

Jefferson was worried that if such claims became accepted knowledge, then the US's reputation would suffer. Who would want to conduct trade with "degenerate" humans, or buy "degenerate" agricultural products?

In part, Buffon's claims were responsible for Jefferson's magnum opus, Notes on the State of Virginia, the most important American book published before 1800. Writing Notes involved correspondence with luminaries such as James Madison. Dugatkin quotes from a 1786 letter from Madison to Jefferson, where they discuss the fine points of weasel biology, including measuring the "width of the ears horizontally" and the "distance between the anus and the vulva".

But Jefferson had other ideas for convincing Buffon. As the book's title suggests, Jefferson's most concerted effort in terms of hands-on evidence was to procure a very large, dead, stuffed American moose - antlers and all - to hand Buffon personally in Paris, in effect saying "see".

You can't help but love a book that has sentences like "The first pre-moose incident occurred just before Jefferson was to sail off to his ministerial post in France" and "The second pre-moose instance -- wherein Jefferson encountered in Buffon a man who seemed to refuse to budge, even in the face of physical evidence -- revolved around the 'mammoth' discussed in Notes on the State of Virginia."

I recommend it to anyone interested in the crucial role of ungulates in American history.

Wednesday, June 23, 2010

More Lousy Reporting from Mirko Petricevic

Mirko Petricevic, the religion reporter for the Kitchener-Waterloo Record, is at it again.

I previously criticized his coverage of a local creationist group. I pointed out that Petricevic -- unlike a good reporter -- never asks any hard questions of believers. Instead, his "reporting" is mostly just taking dictation.

Now he's got an article about the local Christian Science church, and he's employing exactly the same modus operandi: local believers are allowed to prattle on, and not a single skeptical word in the entire article.

Reading it, you would never know that there is no good evidence that prayer works to heal people of diseases. Nor would you know that Christian Science practitioners have been implicated in dozens of cases of medical neglect, where simple and safe treatment could have saved lives.

This is not just shoddy journalism; it's morally culpable.

I'm Sorry to Have Missed This

Reader Paul C. A. points out that I was too late to attend the International Remote Viewing Association's 2010 conference. Too bad, I would have liked to see so many woomeisters in one room.

Just think, for only $436 I could have heard

* Robert Jahn, formerly of PEAR, a parapsychology lab associated with Princeton that embarrassed the university for years until it was finally disbanded in 2006;

* Noreen Renier, a self-proclaimed psychic whose attempts at solving crimes have been extensively debunked;

* Alexis Champion, who advocates "psychic archaeology";

* Paul Smith, who taught dowsing to participants;

* Courtney Brown, a political scientist at Emory University who, according to Michael Shermer, is not allowed to mention his affiliation with Emory when discussing remote viewing. Did I mention that he was a "yogic flyer"?

Oh, the fun I could have had! For example, in the description of Jahn's talk, he says, "repeated applications proved to diminish the yield, suggesting that disproportionate focus on the analytical components of the perception, rather than on the phenomenal gestalt, can result in obscuring the essence of the phenomenon and that the subjective quality of these experiences is more effectively enhanced when their inherent uncertainties are both acknowledged and emphasized."

Translation: we can't figure out why the tiny psi effects we discover disappear when we do more trials.

Sunday, June 20, 2010

Yet Another Literary Quiz


What American polymath, professor, and science and science-fiction writer lived in this house in Newton, Massachusetts from 1956 to 1970? Hint: he has an asteroid and a crater on Mars named after him.

Sorry about the photo quality - it was pouring at the time.

Saturday, June 19, 2010

Literary Quiz



This house in New England was owned by one of America's most celebrated writers. One of his lesser-known achievements was a long attack on a home-grown American religion. This writer wrote most of his celebrated works in this house.

Who is it?

Wednesday, June 16, 2010

Fermat's Last Theorem Silliness

I am fascinated by cranks and crank mathematics, and there's a lot of them/it out there. Here's a new "proof" of Fermat's last theorem I was sent yesterday. There is only a small amount of entertainment value in this one, with phrases such as "only one of those cofactors is organically entered into the structure of the pair of conjugate variables". Much better is the book The Life-Romance of an Algebraist by George Winslow Pierce.

Some Unimpressive Numerology

The fine-structure constant α is a fundamental constant in physics, and is currently estimated to be approximately 0.0072973525376.

The physicist Arthur Eddington, who became rather eccentric and believed he could compute the number of protons in the universe accurately, thought it was equal to exactly 1/137, but our current estimate gives something closer to 1/137.03599967899.

The mathematician James Gilson seems to think that α is given by the rather complicated formula (29/π)*cos(π/137)*tan(π/(137*29)). But this is just numerology, and not even particularly impressive numerology. The trick is that tan(x) is very close to x when x is small, and cos(x) is very close to 1 when x is small. So Gilson's formula is just (29/π) times something that is very close to π/(137*29), with an additional fudge factor of something that's very close to 1 thrown in. There is no real surprise, then, that one can find small integers to make this close to α.
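
If you want to check the arithmetic yourself, here's a quick Python sketch (the value of α is the estimate quoted above, and the formula is Gilson's exactly as stated; nothing else is assumed beyond the standard math library):

```python
import math

alpha = 0.0072973525376   # fine-structure constant, as quoted above

# Gilson's formula: (29/pi) * cos(pi/137) * tan(pi/(137*29))
gilson = (29 / math.pi) * math.cos(math.pi / 137) * math.tan(math.pi / (137 * 29))

# The "trick": for small x, tan(x) ~ x and cos(x) ~ 1, so the formula is
# essentially (29/pi) * (pi/(137*29)) = 1/137, nudged by factors very close to 1.
print(alpha)       # 0.0072973525376
print(gilson)      # about 0.00729735...
print(1 / 137)     # about 0.00729927...
```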

Heck, it's obvious that the real value of the fine structure constant is actually 250/34259. Or maybe (cos(2 π/57) - sin(4 π/47))/100? I can't decide which.

Monday, June 14, 2010

Great Moments in Reprint Requests

I just received the following letter:

Dear Professor Shallit,

I am a graduate student in XXX University majoring in YYY. I want to cite one of your papers that should be of great use to my current research. The title of the paper is "Randomized Algorithms in Number Theory", published on Communications on Pure and Applied Mathematics 39 (1986), S1.

Because our library does not have access to the article, it should be best that you send me a copy via email.

I really appreciate your worthless help!


Now that's the way to ask for a reprint!

Friday, June 11, 2010

John Alexander

My great-great-great-great-grandfather was John Alexander (1738-1799), a minister in the Church of England and a Loyalist during the Revolutionary War. (Although I'd have preferred a freethinker and a revolutionary, we don't get to choose our ancestors.) Here is a copy of his will, as reproduced in Volume 2, No. 4 (October 1901) of J. R. B. Hathaway's North Carolina Historical and Genealogical Register. The will is dated April 4, 1795 and was probated in the August 1799 term of the court of Bertie County, North Carolina.


"Da Praecepta, Familiae, Tuae nam Tu Crive Morituruses."

"For as much as the last scene of life seems hastening on, and the curtain ready to fall," I think it prudent, before I make my final exit off the stage, whereon I have some time acted, to dispose of the few trifles fortune has bestowed me, in manner following to-wit:

Imprimis. I Give and bequeath to my two Daughters, Martha and Rachel, all and every part of my property whatever, to be equally divided between them, and to their lawful heirs forever. On the demise of either, before impowered to make a will, the surviving sister inherits the whole and should both decease, before the laws capacitate to will, then, my remaining property to be wholly converted to Educating the poor children within the counties of Hertford and Bertie; under such regulations as my Executors shall think fit. My body I bequeath to the earth, whence it originated, My Soul Immortal and unalloyed to dust, I commend to the Father of Mercies.

The manly, masculine Voice of Orthodoxy, is no longer heard in our land. Far, therefore, from my Grave be the senseless Rant of Whining Fanaticism; her hated and successful rival --- Cant and Grimace dishonour the dead, as well as Disgrace the living. Let the monitor within, who never Deceives, alone pronounce my Funeral Oration; while some Friendly hand Deposits my poor remains Close by the ashes of my beloved Daughter Elizabeth, with whom I trust to share a happy Eternity.

And of this my last Will and Testament, I constitute and appoint Capt'n George West, George Outlaw, Esq., and Mr. Edward Outlaw, my Executors, On whose Probity, Honor, and Disinterested Friendship, I entirely rely for the faithful Discharge of the trust I repose in them. Beseeching them, as they would approve themselves to him who is the Father of the Fatherless, to use all possible means of Inspiring my children with a love of Virtue and an abhorrence of Vice, Restraining them from all plans and persons Dangerous to their Virtue or Innocency --- Giving them an Education to their rank in life suitable and becoming. Let their books and their needles be their principal companions and Employ. I could wish the laws enable me to do more for my wretched and unfortunate slaves than that of recommending them to lenity and mild treatment.

Be to their faults a little blind;
Be to their virtues ever kind.

JOHN ALEXANDER.


The "manly, masculine Voice of Orthodoxy" is the Church of England; the "senseless Rant of Whining Fanaticism" is either the Episcopalian Church or the Baptist Church.

I don't know why, if he found his slaves "wretched and unfortunate", he didn't just free them. But perhaps it was just so far beyond the social norm that he didn't feel it possible to do so.

Friday, June 04, 2010

A Famous Wall


I would guess this is the most famous wall in professional sports. What do you think?

Wednesday, June 02, 2010

Great Moments in Lousy Writing

Harlan Coben is a pretty good mystery writer. I don't like his Myron Bolitar novels, but that's because I don't like the main character, a sports agent, at all. But some of his other books are top-notch: Tell No One, which was made into a movie by the French director Guillaume Canet (Ne le dis à personne) is excellent, as are many of his other stand-alone novels. Each stand-alone has a similar theme: something in the distant past of a character's life is eventually revealed, with strong repercussions in the present day, changing what many of the characters thought they knew.

His latest book, Caught, is about the disappearance of a high-school lacrosse player, and, while it starts slowly, you get the trademark Coben reversals in the last 50 pages. It's a good summer read.

However, there was one passage that stood out (p. 209):

Something was niggling at the back of Wendy's brain. It was there, just out of sight, but she couldn't quite get to it.

Is there anything more infuriating in mystery writing than this cliché? The reader learns that something is triggered at the back of the detective's mind, and it's like a great big sign reading, "IF YOU WANT TO FIGURE IT OUT, THIS IS AN IMPORTANT CLUE."

What are some other mystery novel clichés that turn you off?

Saturday, May 29, 2010

No Ghost in the Machine

Back when I was a graduate student at Berkeley, I worked as a computer consultant for UC Berkeley's Computing Services department. One day a woman came in and wanted a tour of our APL graphics lab. So I showed her the machines we had, which included Tektronix 4013 and 4015 terminals, and one 4027, and drew a few things for her. But then the incomprehension set in:

"Who's doing the drawing on the screen?" she asked.

I explained that the program was doing the drawing.

"No, I mean what person is doing the drawing that we see?" she clarified.

I explained that the program was written by me and other people.

"No, I don't mean the program. I mean, who is doing the actual drawing, right now?

I explained that an electron gun inside the machine activated a zinc sulfide phosphor, and that it was directed by the program. I then showed her what a program looked like.

All to no avail. She could not comprehend that all this was taking place with no direct human control. Of course, humans wrote the program and built the machines, but that didn't console her. She was simply unable to wrap her mind around the fact that a machine could draw pictures. For her, pictures were the province of humans, and it was impossible that this province could ever be invaded by machines. I soon realized that nothing I could say could rescue this poor woman from the prison of her preconceptions. Finally, after suggesting some books about computers and science she should read, I told her I could not devote any more time to our discussion, and I sadly went back to my office. It was one of the first experiences I ever had of being unable to explain something so simple to someone.

That's the same kind of feeling I have when I read something like this post over at Telic Thoughts. Bradford, one of the more dense commentators there, quotes a famous passage of Leibniz:

Suppose that there be a machine, the structure of which produces thinking, feeling, and perceiving; imagine this machine enlarged but preserving the same proportions, so that you could enter it as if it were a mill. This being supposed you might visit its inside; but what would you observe there? Nothing but parts which push and move each other, and never anything which could explain perception.

But Leibniz's argument is not much of an argument. He seems to take it for granted that understanding how the parts of a machine work can't give us understanding of how the machine functions as a whole. Even in Leibniz's day this must have seemed silly.

Bradford follows it up with the following from someone named RLC:

The machine, of course, is analogous to the brain. If we were able to walk into the brain as if it were a factory, what would we find there other than electrochemical reactions taking place along the neurons? How do these chemical and electrical phenomena map, or translate, to sensations like red or sweet? Where, exactly, are these sensations? How do chemical reactions generate things like beliefs, doubts, regrets, certainty, or purposes? How do they create understanding of a problem or appreciation of something like beauty? How does a flow of ions or the coupling of molecules impose a meaning on a page of text? How can a chemical process or an electrical potential have content or be about something?

Like my acquaintance in the graphics lab 30 years ago, poor RLC is so trapped by his/her own preconceptions that I don't know what to say. How can anyone, writing a post on a blog which is entirely mediated by things like electrons in wires or magnetic disk storage, nevertheless ask "How can a chemical process or an electrical potential have content or be about something?" The irony is really mind-boggling. Does RLC ever use a phone or watch TV? For that matter, if he/she has trouble with the idea of "electrical potential" being "about something", how come he/she has no trouble with the idea of carbon atoms on a page being "about something"?

We are already beginning to understand how the brain works. We know, for example, how the eye focuses light on the retina, how the retina contains photoreceptors, how these photoreceptors react to different wavelengths of light, and how signals are sent through the optic nerve to the brain. We know that red light is handled differently from green light because different opsins absorb different wavelengths. And the more we understand, the more the brain looks like Leibniz's analogy. There is no ghost in the machine, there are simply systems relying on chemistry and physics. That's it.

To be confused like RLC means that one has to believe that all the chemical and physical apparatus of the brain, which clearly collects data from the outside world and processes it, is just a coincidence. Sure, the apparatus is there, but somehow it's not really necessary, because there is some "mind" or "spirit" not ultimately reducible to the apparatus.

Here's an analogy. Suppose someone gives us a sophisticated robot that can navigate terrain, avoid obstacles, and report information about what it has seen. We can then take this robot apart, piece by piece. We see and study the CCD camera, the chips that process the information, and the LCD screens. Eventually we have a complete picture of how the robot works. What did we fail to understand by our reductionism?

Our understanding of how the brain works, when it is completed, will come from a complete picture of how all its systems function and interact. There's no magic to it - our sensations, feelings, understanding, appreciation of beauty - they are all outcomes of these systems. And there will still be people like RLC who will sit there, uncomprehending, and complain that we haven't explained anything, saying,

"But how can chemistry and physics be about something?"

Friday, May 28, 2010

Casey Luskin: Information Theory Expert

Well, it looks like the Discovery Institute was so unnerved by my pointing out the misunderstandings and misrepresentations in Stephen Meyer's book, Signature in the Cell, that they devoted two whole chapters to attacking me in their new book. The always-repulsive David Klinghoffer called me a "pygmy" and made fun of the name of my university (page 6). Paul Nelson called my critique a "fluffy confection" and alleged I was guilty of "sophistry". Casey Luskin said I indulged in "gratuitous invective".

The DI's responses to my arguments about Signature are about at the level of what you'd expect from them. I already replied to Paul Nelson months ago here, but of course they didn't see fit to reference that.

In their new book, they trot out lawyer Casey Luskin as their new expert on information theory. Luskin's main points are

(1) Shannon and Kolmogorov complexity are not "useful metrics of functional biological information" and
(2) eminent scientists have adopted Dembski and Meyer's notion of "functional information".

Here's my response:

(1) No measure of information is perfect. Both Shannon and Kolmogorov have proved useful in biological contexts; to claim, as Luskin does, that they are "outmoded tools" is ridiculous. An exercise for Luskin, or anyone else: do a search of the scientific literature for "Shannon information" in biology, and count how many hits you get. Now do the same thing for "functional information". See the difference?

Indeed, it is the apparent incompressibility of the genome that suggests, through Kolmogorov complexity, that random mutations played a very significant role in its creation.
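
Here's a toy illustration of the compressibility point (my own sketch, with made-up sequences; gzip-style compression is only a crude, computable stand-in for Kolmogorov complexity, which is uncomputable): a patterned sequence compresses enormously, while a randomly generated one barely compresses at all.

```python
import random
import zlib

random.seed(0)
n = 100_000

repetitive = ("ACGT" * (n // 4)).encode()                               # highly patterned
random_seq = "".join(random.choice("ACGT") for _ in range(n)).encode()  # random

for name, seq in (("repetitive", repetitive), ("random", random_seq)):
    ratio = len(zlib.compress(seq, 9)) / len(seq)
    print(f"{name}: compresses to {ratio:.1%} of its original size")
```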

(2) Luskin cites a 1973 book by Orgel, where Orgel used the term "specified complexity", as evidence that the creationists' notion of information is used by real scientists. However, Orgel did not give a rigorous definition of the term, and no one has since then. The term appeared only in a popular book; Orgel never published a definition in the peer-reviewed scientific literature. Dembski later claimed that Orgel's term was the same as his, and Luskin now repeats this falsehood. A lie can travel around the world, while the truth is just lacing up its sneakers.

Luskin points out that very recently, Szostak has introduced a notion of "functional information". However, Szostak's "functional information" is not a general-purpose measure of information. It certainly does not obey the axioms of information as studied by information theorists, and it does not obey Dembski's "law of conservation of information". Furthermore, it is only defined relative to a set of functions that one specifies. Change the functions, and you might get a completely different measure. So it is clear that Szostak's measure is not the same as Dembski's.
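
To see how function-relative Szostak's measure is, here's a toy calculation in the spirit of the Hazen/Szostak definition as I understand it (the functional information at threshold E is -log2 of the fraction of sequences whose function meets or exceeds E); the two "functions" below are invented purely for illustration:

```python
import itertools
import math

# All 4^6 = 4096 sequences of length 6 over the DNA alphabet.
seqs = ["".join(t) for t in itertools.product("ACGT", repeat=6)]

def functional_information(seqs, f, threshold):
    # -log2 of the fraction of sequences whose "function" meets the threshold
    frac = sum(1 for s in seqs if f(s) >= threshold) / len(seqs)
    return -math.log2(frac)

count_As = lambda s: s.count("A")               # hypothetical function #1
has_motif = lambda s: int(s.startswith("ACG"))  # hypothetical function #2

print(functional_information(seqs, count_As, 4))   # about 4.7 bits
print(functional_information(seqs, has_motif, 1))  # exactly 6 bits
```

Same 4096 sequences, two different numbers; neither one is "the" information content of the set.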

Might Szostak's idea prove useful one day? Perhaps, although the jury is still out. It has yet to receive many citations in the scientific literature; one of the papers cited by Luskin is by creationist Kirk Durston. The last time I looked, Durston's paper had essentially no impact at all, to judge by citation counts.

In any event, my claim was "Information scientists do not speak about 'specified information' or 'functional information.'" Luskin offers Szostak as a counterexample. But Szostak is not an information scientist; he's a biologist. No discussion of "functional information" has yet appeared in the peer-reviewed information theory literature, which was my point. Luskin's trotting out of Szostak's paper does not refute that.

A Much-Too-Credulous Review of Signature in the Cell

John Walker is a pretty bright guy who's done some interesting work, but in this review of Stephen Meyer's Signature in the Cell, he demonstrates insufficient skepticism about Meyer's claims.

He asks where the information to specify the first replicator came from, and then follows with this non sequitur: "The simplest known free living organism (although you may quibble about this, given that it's a parasite) has a genome of 582,970 base pairs, or about one megabit (assuming two bits of information for each nucleotide, of which there are four possibilities)."

Of course, this is silly. Nobody thinks the first replicator was anywhere near this complicated, or even that it necessarily had a "genome" based on DNA. Even the genetic code itself may have evolved. Hypotheses like the RNA World suggest that the first replicator might have consisted of only a few hundred base-pairs.

Oddly enough for someone who has worked in artificial life, Walker shows no sign of having read Koza's 1994 paper, which shows how self-replicators can emerge spontaneously and with high probability in computer simulations.

He then goes on to claim that "you find that in the finite time our universe has existed, you could have produced about 500 bits of structured, functional information by random search." The only problem? The term "structured, functional information" has no definition in the scientific literature - it's just babble invented by creationists like Dembski and Meyer. There's no sign that Walker has read any of the criticism of Dembski's work.

Walker goes on to give a definition of "structured, functional information" as "information which has a meaning expressed in a separate ___domain than its raw components". But then there are lots of examples of such information occurring in nature, such as varves. Varves are layers of sediment which encode yearly information about the environment in which they formed. Another example is Arctic ice cores, which encode essential information about climate that is being mined by climatologists today.

Finally, the notion of "meaning" is incoherent. Disagree? Then tell me which of the following strings have "meaning" and which do not:

#1:
001001001100011011111010010111010010111000100000100000100111

#2:
010100111011001100001111101011100101110011110110010000001101

#3:
101010101010101010101010101010101010101010101010101010101010

#4:
101111101111101110101110111110101111101110101110101110101001

If that's too easy for you, let's try another. List all the binary strings of length 10 that have "meaning", and explain, for each one, what the meaning is.

Bottom line: insufficient skepticism leads to credulous acceptance of bad ideas.

Wednesday, May 26, 2010

Stephen Meyer - More Honesty Problems?

At Christianity Today, Stephen Meyer repeats the falsehood that "We know that information—whether inscribed in hieroglyphics, written in a book, or encoded in a radio signal—always comes from an intelligent source." It's simply not so - for example, in the Kolmogorov theory, any random source produces information. Even in Meyer's own idiosyncratic definition of information, natural systems produce information - such as when you stick your head out the window to see if it will rain that day. Where did you get that information? Not from any intelligent source.

And he adds some new falsehoods: "My recent book on the subject received enthusiastic endorsements from many scientists not previously known as advocates of ID, such as chemist Philip Skell, a National Academy of Sciences member..."

As is well-known to anyone who follows the creation-evolution debate, Philip Skell is a longtime evolution opponent. His anti-evolution activity dates from at least 2000, and he has been quite active since then.

Meyer also claims, "those who reject ID within the scientific community do so not because they have a better explanation of the relevant evidence, but because they affirm a definition of science that requires them to reject explanations involving intelligence—whatever the evidence shows". Scientists don't reject explanations involving "intelligence"; they simply don't find "intelligence" alone to be a useful explanation for most phenomena. No archaeologist finds a potsherd and exclaims, "Intelligence must have been involved in the creation of this pot!" To do so would be regarded as moronic. Rather, archaeologists spend their time figuring out who made an artifact, what they used it for, and how it fits into a larger understanding of the human culture it was a part of. Contrary to Meyer's bogus claim, fields like archaeology have no problem incorporating human agency into their studies. But no scientific field incorporates agency without some evidence of the agent actually existing - something Meyer has yet to provide.

If ID wants to be taken seriously, ID advocates have to distance themselves from spokesmen who are more interested in public relations than scientific truth.

Tuesday, May 25, 2010

How to Test for Syphilis

Yesterday on NPR's "All Things Considered" I heard this segment about "sparsity", which tried to link several different issues about data compression in one story. I don't think it was very successful, and the chosen term "sparsity" wasn't really representative of the content.

Nevertheless, the piece opened up with an interesting puzzle. The Army wants to do comprehensive blood tests for syphilis, a relatively rare disease, but each individual test is expensive. How can they test everyone more cheaply?

The idea is simple: mix the blood of g individuals together, and test that. A positive outcome indicates that at least one person in the group has syphilis, and a negative outcome indicates that no person in the group has it. In the event of a positive outcome, test all the members of the group.

Now let p be the probability that a randomly-chosen person has syphilis, and let N be the population size. What is the optimal size for the group? Choose g too small, and you end up doing lots of tests, because N/g (the number of groups) is large. Choose g too large, and it becomes very likely that testing the group will be positive, so you end up doing lots of tests again. Somewhere in between is the optimal choice of g.

How do we find it? There are N/g groups, and we have to test each of them. A randomly-chosen person fails to have syphilis with probability 1-p, so the probability that everyone in the group fails to have it is (1-p)^g. Hence the probability that a group tests positive is 1-(1-p)^g, and the expected number of positive-testing groups is (N/g)(1-(1-p)^g). We have to test each person in a positive group, so this means g(N/g)(1-(1-p)^g) = N(1-(1-p)^g) additional tests. If the cost of a test is C, then the total cost is CN/g (the cost to test each group), plus CN(1-(1-p)^g) (the cost to test everyone in the positive-testing groups). Factoring out CN, we need to minimize

1/g + 1 - (1-p)^g.       (1)

Notice that this expression only depends on p, the probability.

We can, in fact, find a closed form for this minimum (or at least Maple can), but it is somewhat complicated, and depends on the little-known Lambert W-function. It's easier to compute the minimum through bisection or some other method for any particular p. A recent estimate is that syphilis occurs in about 15.4 per 100,000 people, which corresponds to p = 0.000154. For this value of p, the optimal g is g = 81, which gives (1) the value 0.0247. Thus, for this p, we are able to test everyone by combining tests -- for less than 3% of the cost of testing everyone individually.

By the way, a simple heuristic argument suggests that the g that minimizes (1) will be about p^(-1/2): to minimize (1), choose g so that the two terms in the sum (1), 1/g and 1-(1-p)^g, are equal. Set g = p^(-1/2); then 1/g = p^(1/2). On the other hand, 1-(1-p)^g = 1 - (1-g^(-2))^g. But (1-1/g^2)^(g^2) is about 1/e for large g, so (1-g^(-2))^g is about e^(-1/g), and using the Taylor series for exp(x), we see that 1 - (1-g^(-2))^g is about 1/g, too. So this choice of g will be very close to the one minimizing (1), and the value of (1) is therefore about 2/g = 2 p^(1/2).
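
For anyone who wants to check these numbers, here's a short Python sketch; the cost function is exactly expression (1), and p = 0.000154 is the prevalence estimate quoted above:

```python
# Expected number of tests per person, in units of the cost C of a single test.
p = 0.000154   # about 15.4 cases per 100,000 people

def cost(g):
    return 1.0 / g + 1.0 - (1.0 - p) ** g   # expression (1)

best_g = min(range(2, 1000), key=cost)
print(best_g, round(cost(best_g), 4))   # expect about g = 81 and cost 0.0247
print(round(2 * p ** 0.5, 4))           # heuristic estimate 2*sqrt(p), about 0.0248
```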

Added: with more complicated tests (see the comments), one can do even better. But some of the proposed methods are so complicated that it seems to me the possibility of human error in carrying them out would rule them out.