The art and science of baloney detection

By George A. Ricker ©2006

• A talk before the Space Coast Freethought Association on Oct. 1, 2006

Note to the web edition: A large part of the material reproduced in this talk is from other sources, specifically the late Dr. Carl Sagan’s The Demon-Haunted World: Science As A Candle In The Dark and Michael Shermer’s two columns written for Scientific American in November and December of 2001. The idea of a baloney detection kit appears to have originated with Dr. Sagan. Others have taken the notion and run with it, and I suspect he would have approved of that. We will observe the 10th anniversary of Dr. Sagan’s death in December of this year. Though I never had the pleasure of meeting him except by reading his books and watching his television appearances, he is a man I hold in high esteem. It was Carl Sagan who rekindled my interest in science, an interest that had been dormant for many years. I am profoundly grateful to him for that and for being a voice that spoke always for sanity and reason in a world that seems desperately in need of both. So my use of Dr. Sagan’s words, and Dr. Shermer’s as well, in this speech is not in any way intended to trade on either man’s work or stature. If anything, it is in homage. I think Sagan’s prescription for clearer thinking is one that can help to cure much of what is wrong in American society today. And if I can help distribute that prescription, then I am bound to try.


The wise know that, even if they have been most fortunate, they have glimpsed but a small fragment of the truth. Fools are not bound by such limitations. —GR

We live in an age called “the age of information.”

At times it seems there is too much information, so much that it threatens to overwhelm us. Experts in some fields have difficulty keeping up with all the advances in their own areas of specialization.

Yet, understanding the information is critical to us. It’s critical to our own well-being, critical to the survival of our families, our societies, possibly even our planet.

There may never have been a time in the history of our species when it was more important for us to develop and exercise the critical thinking skills that allow us to recognize truth, to distinguish between fact and fiction, to separate the wheat from the chaff.

In short, how to spot the baloney.

A brief digression:

But before we get too far along, let’s talk about the word “baloney” itself.

According to Partridge’s Concise Dictionary of Slang and Unconventional English (Paul Beale, ed.; Macmillan Publishing Company, NY, 1989), the word actually should be spelled “boloney” instead of the more common “baloney.” It means “nonsense” or “eyewash” and has been in use in the United States since, at least, 1900. The word is generally thought to be a corruption of the word “bologna,” a reference to bologna sausage.

But whatever its origin and however you spell it, the word “baloney,” when applied to ideas, refers to “nonsense,” to false or phony ideas. When I was a youngster we sometimes referred to things, and people, that didn’t ring true as being “phony baloney.”

End digression

So what I want to do in these brief remarks today is to talk about some ways to detect baloney in the claims and ideas with which we are presented.

I’m going to begin with some general comments. Then talk specifically about the “Baloney Detection Kit” of Dr. Carl Sagan, and some additions to it suggested by Michael Shermer of the Skeptics Society.

Part One: Setting the stage

The title of my talk—The art and science of baloney detection—is derivative and was inspired by a book I highly recommend, The Demon-Haunted World: Science as a Candle in the Dark by Dr. Carl Sagan.

Those of you who have read Sagan’s book may recall a chapter in it called “The Fine Art of Baloney Detection.” It was in that chapter that I first encountered his idea of a baloney detection kit, the tools for skeptical thinking.

But before we get to the specific tools in the kit, it’s important to understand why all of this matters, or should matter.

For most of his adult life, Sagan was engaged in promoting scientific literacy. He thought it critical that all human beings have an understanding of, at least, the basics of science and scientific inquiry. He thought it was especially critical for the citizens of a highly sophisticated, technologically advanced society to have that understanding. And he thought it extremely dangerous if they did not.

A couple of years ago, David Baltimore, Nobel laureate and president of the California Institute of Technology, wondered, in an op-ed piece in the Los Angeles Times, whether a nation in which more people believe in the devil than accept evolution can maintain its leadership in the sciences.

But Sagan, who died in December of 1996 (the same year The Demon-Haunted World was published), was concerned about more than whether we would maintain our leadership in the sciences.

Here’s a statement from a speech he gave at the Seattle convention of CSICOP (Committee for the Scientific Investigation of the Claims of the Paranormal) in 1994. He was talking about reasons for popularizing science.

“On a personal note, another reason that popularizing science and its methods is important is a foreboding I have, maybe ill-placed, of an America in my children’s generation or my grandchildren's generation when all the manufacturing industries have slipped away to other countries, when we are a services and information processing economy, when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues.

“When the people in a democracy have lost the ability to set their own agendas, or even to knowledgeably question those who do set the agendas. When there is no practice in questioning those in authority. When clutching our crystals and religiously consulting our horoscopes, our critical faculties in steep decline, unable to distinguish between what’s true and what feels good, we slide, almost without noticing, into superstition and darkness. That Worries Me. And I don’t think we have adequate protections against that and this is just a kind of fantasy. There are reasons to worry.” …

“We have a civilization based on science and technology, and we have cleverly arranged things so that almost nobody understands science and technology. That is as clear a prescription for disaster as you can imagine. It’s a combustible mixture of ignorance and power. And while we might get away with it for a while, that mixture, sooner or later, is going to blow up. The powers of modern technology are so enormous that it is insufficient to say, well those in charge of those powers, I am sure, are doing a good job. This is a democracy, and for us to make sure that the powers of science and technology are used properly and prudently, we ourselves must understand science and technology.”

Twelve years ago, Sagan’s comments may have seemed unduly pessimistic to some. Today, they seem almost prescient.

Today, more than ever, we need to equip ourselves to spot the “baloney.”

We begin with a few basic ideas, a few ground rules, if you will.

First, we must recognize that words are symbols. They are representations. They are not phenomena. They are abstractions from phenomena. Therefore, in order for any discussion to go forward, it is absolutely critical that the participants have a common understanding of the words being used in that discussion. When there is dispute about the terms being used, the first order of business must be for the participants in a discussion to clarify and agree on the meaning of the terms in use. When such clarification does not take place, then effective communication cannot occur and the discussion often degenerates into a shouting match.

Second, it is not necessary to prove ideas false. What is required is to demonstrate that they are true. Whoever presents an idea for our consideration, regardless of what the idea is about, must be prepared to support it. No idea becomes valid just because no one can disprove it. By the same token, no idea becomes invalid just because no one can prove it. (We’ll revisit this, but it’s important to emphasize it. An argument that amounts to little more than the statement “prove me wrong” is inherently bogus.)

Third, everyone is entitled to his or her opinion, but all opinions are not created equal. Saying you have a right to your opinion is just another way of saying you have a right to be wrong.

Fourth, sincerity is not an argument. Things do not become true just because someone really, really, really wants them to be true. (By the same token, volume is not an argument either. Your ideas don’t gain extra validity because you are the biggest loudmouth in the room).

Finally, our perception of reality is always a work in progress. It is never complete, and no one ever gets the last word. All of our understanding is provisional and contingent upon the knowledge we have at any given time. As the knowledge changes, so will our understanding.

What we are really talking about when we talk about the art and science of baloney detection is learning how to think better.

Part Two: Sagan’s “baloney detection kit”

Carl Sagan presented his baloney detection kit as a way to evaluate new ideas.

He introduced it this way:

“If you’re so inclined, if you don’t want to buy baloney even when it’s reassuring to do so, there are precautions that can be taken; there’s a tried-and-true, consumer-tested method.

“What’s in the kit? Tools for skeptical thinking.

“What skeptical thinking boils down to is the means to construct, and to understand, a reasoned argument and—especially important—to recognize a fallacious or fraudulent argument. The question is not whether we like the conclusion that emerges out of a train of reasoning, but whether the conclusion follows from the premise or starting point and whether the premise is true.” (The Demon-Haunted World, p. 210)

Here are some of the tools Sagan suggested.

• Wherever possible there must be independent confirmation of the facts.

• Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.

• Arguments from authority carry little weight (in science there are no “authorities”; at best there may be experts).

• Spin more than one hypothesis—don't simply run with the first idea that caught your fancy. If there’s something to explain, try to think of all the different ways it could be explained, then think of the tests whereby you might disprove each of the alternatives.

• Try not to get overly attached to a hypothesis just because it's yours. Try to think of ways to prove it false. What are the best arguments against it?

• Quantify, wherever possible. Being able to assign numerical values to whatever you are attempting to explain makes it easier to evaluate and to choose among competing hypotheses. In the absence of the ability to make quantifiable measurements, the task becomes much more difficult.

• If there is a chain of argument, every link in the chain must work, including the premise.

• Occam's razor: if there are two hypotheses that explain the data equally well, choose the simpler.

• Ask whether the hypothesis can, at least in principle, be falsified (shown to be false by some unambiguous test). In other words, is it testable? Can others duplicate the experiment and get the same result?

Additional issues are:

• Conduct control experiments—especially “double-blind” experiments, in which the person taking the measurements is unaware of which are the test subjects and which are the controls.

• Check for confounding factors—separate the variables.

Common fallacies of logic and rhetoric

Ad hominem—attacking the arguer and not the argument.

Argument from "authority." Authorities have been wrong in the past and will be wrong in the future.

Argument from adverse consequences (putting pressure on the decision maker by pointing out dire consequences of an "unfavorable" decision).

Appeal to ignorance (absence of evidence is not evidence of absence). This is the claim that whatever has not been proven false must be true, and vice versa.

Special pleading (typically referring to god's will). This is done to rescue a proposition that is in trouble. One of the classic examples is the appeal to divine mysteries to explain how a perfect deity who is good could allow evil to exist.

Begging the question (assuming an answer in the way a statement or a question is phrased). For example: “How did ‘God’ create the universe?”

Observational selection (counting the hits and forgetting the misses). This is a favorite trick of psychics and others who claim paranormal powers. They always remind us of any prediction that is even close to the mark. They never mention those that miss wildly.

Statistics of small numbers (such as drawing conclusions from inadequate sample sizes).

Misunderstanding the nature of statistics (President Eisenhower expressing astonishment and alarm on discovering that fully half of all Americans have below average intelligence!)

Inconsistency (e.g. military expenditures based on worst case scenarios but scientific projections on environmental dangers thriftily ignored because they are not "proved").

Non sequitur—"it does not follow"—the logic falls down. “America has prospered because we are a ‘Christian’ nation.”

Post hoc, ergo propter hoc—"it happened after, so it was caused by"—confusion of cause and effect.

Meaningless question (“What happens when an irresistible force meets an immovable object?”). If there is such a thing as an irresistible force, there cannot be such a thing as an immovable object. The opposite is also true.

Excluded middle—considering only the two extremes in a range of possibilities (making the "other side" look worse than it really is). Example: Either morality comes from ‘God’ or it’s based on individual whims and wishes.

Short-term v. long-term—a subset of excluded middle ("why pursue fundamental science when we have so huge a budget deficit?").

Slippery slope—a subset of excluded middle—unwarranted extrapolation of the effects (give an inch and they will take a mile).

Confusion of correlation and causation.

Straw man—caricaturing (or stereotyping) a position to make it easier to attack. Darwin’s theory of evolution by natural selection must be false because it fails to explain the origins of life. In fact, Darwin never claimed to explain the origins of life and the subject is not part of the theory of evolution at all. Theories about the origin of life are classified as theories of abiogenesis.

Suppressed evidence or half-truths.

Weasel words—for example, use of euphemisms for war such as "police action" to get around limitations on Presidential powers. Talleyrand said, “An important art of politicians is to find new names for institutions which under old names have become odious to the public.”
(From The Demon-Haunted World, pp. 210–216)

There’s another fallacy I’m going to add to the list and that’s the false analogy. The idea of a false analogy is expressed when we say someone is comparing apples to oranges. Analogies or comparisons can be useful rhetorical devices and can help to clarify arguments by providing concrete examples. But when the comparisons are questionable, the argument becomes suspect.

A classic false analogy is the “watchmaker universe” of William Paley. Paley opined that if you were walking on a beach and found a watch, you would instantly conclude it must have been made by a watchmaker. By the same token, he stated, when we observe the design of the universe, we must conclude it had a “maker” as well.

The analogy is false because Paley is comparing a manufactured object to a universe that is a natural phenomenon and may not have been created at all. Indeed, considering the state of things in our universe, Richard Dawkins probably was much closer to the truth when he played off Paley’s analogy and titled one of his books The Blind Watchmaker.

Part Three: Shermer’s “baloney detection” questions

In the November and December 2001 issues of Scientific American, Michael Shermer—founder of the Skeptics Society and founding publisher of Skeptic magazine—proposed a series of questions that could be used to detect baloney and especially to draw the boundaries between science and pseudoscience.

Shermer wrote:

“When lecturing on science and pseudoscience at colleges and universities, I am inevitably asked, after challenging common beliefs held by many students, ‘Why should we believe you?’ My answer: ‘You shouldn't.’

“I then explain that we need to check things out for ourselves and, short of that, at least to ask basic questions that get to the heart of the validity of any claim. This is what I call baloney detection, in deference to Carl Sagan, who coined the phrase Baloney Detection Kit. To detect baloney—that is, to help discriminate between science and pseudoscience—I suggest 10 questions to ask when encountering any claim.”

These were the questions Shermer proposed, along with his explanations of them:

1. How reliable is the source of the claim?

Pseudoscientists often appear quite reliable, but when examined closely, the facts and figures they cite are distorted, taken out of context or occasionally even fabricated. Of course, everyone makes some mistakes. And as historian of science Daniel Kevles showed so effectively in his book The Baltimore Case, it can be hard to detect a fraudulent signal within the background noise of sloppiness that is a normal part of the scientific process. The question is, Do the data and interpretations show signs of intentional distortion? When an independent committee established to investigate potential fraud scrutinized a set of research notes in Nobel laureate David Baltimore's laboratory, it revealed a surprising number of mistakes. Baltimore was exonerated because his lab's mistakes were random and nondirectional.

2. Does this source often make similar claims?

Pseudoscientists have a habit of going well beyond the facts. Flood geologists (creationists who believe that Noah's flood can account for many of the earth's geologic formations) consistently make outrageous claims that bear no relation to geological science. Of course, some great thinkers do frequently go beyond the data in their creative speculations. Thomas Gold of Cornell University is notorious for his radical ideas, but he has been right often enough that other scientists listen to what he has to say. Gold proposes, for example, that oil is not a fossil fuel at all but the by-product of a deep, hot biosphere (microorganisms living at unexpected depths within the crust). Hardly any earth scientists with whom I have spoken think Gold is right, yet they do not consider him a crank. Watch out for a pattern of fringe thinking that consistently ignores or distorts data.

3. Have the claims been verified by another source?

Typically pseudoscientists make statements that are unverified or verified only by a source within their own belief circle. We must ask, Who is checking the claims, and even who is checking the checkers? The biggest problem with the cold fusion debacle, for instance, was not that Stanley Pons and Martin Fleischmann were wrong. It was that they announced their spectacular discovery at a press conference before other laboratories verified it. Worse, when cold fusion was not replicated, they continued to cling to their claim. Outside verification is crucial to good science.

4. How does the claim fit with what we know about how the world works?

An extraordinary claim must be placed into a larger context to see how it fits. When people claim that the Egyptian pyramids and the Sphinx were built more than 10,000 years ago by an unknown, advanced race, they are not presenting any context for that earlier civilization. Where are the rest of the artifacts of those people? Where are their works of art, their weapons, their clothing, their tools, their trash? Archaeology simply does not operate this way.

5. Has anyone gone out of the way to disprove the claim, or has only supportive evidence been sought?

This is the confirmation bias, or the tendency to seek confirmatory evidence and to reject or ignore disconfirmatory evidence. The confirmation bias is powerful, pervasive and almost impossible for any of us to avoid. It is why the methods of science that emphasize checking and rechecking, verification and replication, and especially attempts to falsify a claim, are so critical.

6. Does the preponderance of evidence point to the claimant's conclusion or to a different one?

The theory of evolution, for example, is “proved” through a convergence of evidence from a number of independent lines of inquiry. No one fossil, no one piece of biological or paleontological evidence has “evolution” written on it; instead tens of thousands of evidentiary bits add up to a story of the evolution of life. Creationists conveniently ignore this confluence, focusing instead on trivial anomalies or currently unexplained phenomena in the history of life.

7. Is the claimant employing the accepted rules of reason and tools of research, or have these been abandoned in favor of others that lead to the desired conclusion?

A clear distinction can be made between SETI (Search for Extraterrestrial Intelligence) scientists and UFOlogists. SETI scientists begin with the null hypothesis that ETIs do not exist and that they must provide concrete evidence before making the extraordinary claim that we are not alone in the universe. UFOlogists begin with the positive hypothesis that ETIs exist and have visited us, then employ questionable research techniques to support that belief, such as hypnotic regression (revelations of abduction experiences), anecdotal reasoning (countless stories of UFO sightings), conspiratorial thinking (governmental cover-ups of alien encounters), low-quality visual evidence (blurry photographs and grainy videos), and anomalistic thinking (atmospheric anomalies and visual misperceptions by eyewitnesses).

8. Is the claimant providing an explanation for the observed phenomena or merely denying the existing explanation?

This is a classic debate strategy—criticize your opponent and never affirm what you believe, to avoid criticism. It is next to impossible to get creationists to offer an explanation for life (other than “God did it”). Intelligent Design (ID) creationists have done no better, picking away at weaknesses in scientific explanations for difficult problems and offering in their stead “ID did it.” This stratagem is unacceptable in science.

9. If the claimant proffers a new explanation, does it account for as many phenomena as the old explanation did?

Many HIV/AIDS skeptics argue that lifestyle causes AIDS. Yet their alternative theory does not explain nearly as much of the data as the HIV theory does. To make their argument, they must ignore the diverse evidence in support of HIV as the causal vector in AIDS while ignoring the significant correlation between the rise in AIDS among hemophiliacs shortly after HIV was inadvertently introduced into the blood supply.

10. Do the claimant's personal beliefs and biases drive the conclusions, or vice versa?

All scientists hold social, political and ideological beliefs that could potentially slant their interpretations of the data, but how do those biases and beliefs affect their research in practice? Usually, during the peer-review process, such biases and beliefs are rooted out, or the paper or book is rejected.

Shermer concludes with this:

“Clearly, there are no foolproof methods of detecting baloney or drawing the boundary between science and pseudoscience. Yet there is a solution: science deals in fuzzy fractions of certainties and uncertainties, where evolution and big bang cosmology may be assigned a 0.9 probability of being true, and creationism and UFOs a 0.1 probability of being true. In between are borderland claims: we might assign superstring theory a 0.7 and cryonics a 0.2. In all cases, we remain open-minded and flexible, willing to reconsider our assessments as new evidence arises. This is, undeniably, what makes science so fleeting and frustrating to many people; it is, at the same time, what makes science the most glorious product of the human mind.”

Part four: some examples of baloney

When I was matriculating at the University of Miami, more years ago than I care to remember, I took a course called “Introduction to Reading Instruction” or something like that (I really can’t remember the exact name of the course). I was in my senior year and the course was required for all of us who were pursuing degrees in secondary education with a teaching field of English.

My instructor was the head of the department and had written the textbook for the course. Early in the semester, he made the following statement:

“Reading is thinking. There is no limit on how fast one can think, therefore there is no limit on how fast one can read.”

After I had climbed down from the ceiling, I challenged the professor.

I pointed out that reading and thinking were very different activities—the first involved the visual intake of data; the second could happen entirely within the human brain. I noted there were definite physiological speed limits on how fast the eyes could focus and how fast the brain could process the information obtained. I also noted that even thinking did not operate without speed limits imposed by the rate at which the neural synapses could occur.

How did the professor respond to my challenge?

He didn’t. He simply stared at me and then repeated, a bit more forcefully, “Reading is thinking. There is no limit on how fast one can think, therefore there is no limit on how fast one can read.”

What are some of the kinds of baloney exhibited in the example?

First is a kind of argument from authority. In this case, the authority was the professor and the assumption was that since his authority was unassailable, his argument also was unassailable.

Second, the statement about reading being thinking and so on is a classic non sequitur. It simply doesn’t hold up at all under examination. The logic breaks down.

Here’s another example.

During much of the last century, it was the accepted wisdom that human beings, on average, only used about 10 percent of their brains.

Now, this was a great comfort to many, and it got mentioned in all sorts of places by all sorts of people.

“You know,” someone would say knowingly, “people only use about 10 percent of their brain power.”

So we all ran around thinking each of us had this vast, untapped pool of potential brain power lying there waiting to be used, if we chose to use it. And if we didn’t choose to use it, well, that was our privilege, wasn’t it? Besides, it was always there to call on when needed.

This idea was especially useful to all the psychic hucksters who could almost always be depended upon to remind us that (a) most of us only used 10 percent of our brain power and (b) their extra-sensory abilities might be made possible by the ability to tap into all that unused potential.

Alas, the conventional wisdom—as is often the case—turned out to be wrong.

After decades of neuroscience mapping the brain, recording electrical impulses within the brain, locating the various parts of the brain that control bodily and mental functions, scientists have found no blank areas, no unused portions, no intellectual amp system that could be pumped up to improve one’s grades or job performance or to pursue any other goal.

So where did the 10-percent myth come from?

According to Barry L. Beyerstein of the Brain Behavior Laboratory at Simon Fraser University in British Columbia, there is no clear source of the idea. However, he thinks it may have originated in the works of William James, the pioneering American psychologist. James wrote many articles for popular magazines and often said that most people only realized a small part of their potential. Beyerstein notes he can find no reference to a specific percentage, and James was not referring to gray matter alone but to total human potential. Lots of positive-thinking gurus latched onto the statements, nonetheless, and the 10-percent-of-the-brain idea was born. Apparently the biggest boost for the idea came, Beyerstein writes, when journalist and adventurer Lowell Thomas attributed the idea to James in the preface he wrote, in 1936, to Dale Carnegie’s How to Win Friends and Influence People. (Beyerstein, Barry L., ScientificAmerican.com)

Is this another example of baloney that has made its way into the modern psyche?

There doesn’t appear to be any evidentiary support for the idea. In fact, the evidence of countless brain scans, lab experiments and so on appears to point in the opposite direction. What seems to be the case is that we use every bit of our brains in some fashion or another.

Beyerstein notes, “Why would a neuroscientist immediately doubt that 90 percent of the average brain lies perpetually fallow? First of all, it is obvious that the brain, like all our other organs, has been shaped by natural selection. Brain tissue is metabolically expensive both to grow and to run, and it strains credulity to think evolution would have permitted squandering of resources on a scale necessary to build and maintain such a massively underutilized organ.” (ScientificAmerican.com)

We can provisionally state that the 10-percent-of-the-brain myth is not supported by the evidence. Beyond that, there is no hypothesis that would explain how such a phenomenon would occur.

We may not be able to prove the idea false, but we can state with some conviction that there does not appear to be any good reason to think it is true.

Conclusion: the nature of science

One of the criticisms that gets leveled at people like Sagan and Shermer…and Richard Dawkins…and others is that they “worship” science, that science has become their new religion, and so on.

I put it to you that this is simply an attempt to defend the irrational by trashing the rational.

Sagan, for example, did not worship science. He understood its nature and appreciated its value, but he did not worship it.

He did think—and it’s a view I share—that despite any shortcomings, science was humankind’s supreme accomplishment to date. Science and the emergent technologies arising from it have led to unimaginable improvements in living standards, in health, in communication, in our understanding of some of the most fundamental questions asked by human beings.

But Sagan also understood the dangers that can come when people, including some scientists, misuse the products made possible by scientific research. The same science that gives us the advances mentioned previously also has led to the production of thermonuclear bombs, weaponized chemical and biological agents, various forms of pollution and the potential to destroy life as well as enhance it.

Indeed, that was one of the reasons he was so committed to educating people about science.

In my book, Godless in America, I make the following observation:

“One of the oddest indictments I hear uttered against science is in the statement, ‘Well, you know, science doesn’t have all the answers.’

“Of course it doesn’t. When has modern science ever claimed otherwise?

“Not only does science not have all the answers, I imagine most scientists probably would agree that science has yet to glimpse all the questions. Those who think science is about absolute statements and unchanging information miss the point. Science is a process, a way of looking at things. The information, the theories, the speculations about the nature of reality that make up the body of individual scientific disciplines are the products of that process.

“Science is not just a dry set of statistics or a collection of observations. It is our passport to understanding all of existence. It is a hymn to the capacity of human intelligence. It is a message in time's bottle, revealing our past, informing our present and enhancing our future.

“Stating that science does not have all the answers is no indictment. It is a recognition of the real state of human knowledge. The more we learn, the more we recognize we have more to learn. However, we cannot let our continued quest for understanding obscure the knowledge we have attained. Saying we have much to learn is not the same as saying we have learned nothing.” (Godless in America, p. 36)

The old National Enquirer used to promote the slogan “Inquiring minds want to know.”

I didn’t think much of the publication, but I always liked the slogan. Human beings are naturally inquisitive. Our curiosity is one of our strengths as a species. It is one of the attributes that has led us to scientific pursuits. We want to know. We want to find out. Unfortunately, we are also an impatient species. And in the absence of a well-thought-out answer, we’re apt to latch on to whatever comes along. Because of our impatience, we sometimes buy into bamboozles and intellectual cons that tell us what we want to hear.

Unfortunately, it also appears to be characteristic of the human species that once we have bought into a bamboozle we have a hard time letting it go.

Science does not speak the language of absolute certainty. If you read the works of scientists, you will find that, for the most part, they qualify their claims with all sorts of caveats. And that is as it should be.

After all, the most any scientist can say honestly at any point in time is “This is what I think based on the available evidence.”

And the unspoken assumption implicit in those words is the understanding that “I reserve the right to change my mind based upon what I learn tomorrow.”

Our culture is awash with ideas and information bidding for our attention, some bidding for our allegiance. Having a working baloney detection kit can serve one well in trying to evaluate all those ideas and all that information.

Sagan concludes The Demon-Haunted World with a chapter called “Real Patriots Ask Questions.” He ends with these words:

“Education on the value of free speech and the other freedoms reserved by the Bill of Rights, about what happens when you don’t have them, and about how to exercise and protect them, should be an essential prerequisite for being an American citizen—or indeed a citizen of any nation, the more so to the degree that such rights remain unprotected. If we can't think for ourselves, if we're unwilling to question authority, then we're just putty in the hands of those in power. But if the citizens are educated and form their own opinions, then those in power work for us. In every country, we should be teaching our children the scientific method and the reasons for a Bill of Rights. With it comes a certain decency, humility and community spirit. In the demon-haunted world that we inhabit by virtue of being human, this may be all that stands between us and the enveloping darkness.” —The Demon-Haunted World p. 434

Thank You.


• Beale, Paul, ed. Partridge’s Concise Dictionary of Slang and Unconventional English. Macmillan Publishing Company, NY, 1989

• Beyerstein, Barry L. “Do we really use only 10 percent of our brains?” Scientific American.com

• Ricker, George A. Godless in America. iUniverse; Lincoln, NE, 2006

• Sagan, Carl. The Demon-Haunted World. New York: Ballantine Books, 1997

• Shermer, Michael. Scientific American, November and December, 2001
http://www.sciam.com/article.cfm?articleID=000ADC77-B274-1C6E-84A9809EC588EF21&pageNumber=1&catID=2
