So who are these ‘‘scientists anonymous’’?

Alison Campbell finds the creationists are still trying to get into our schools.

A friend of mine, who happens to be a biology teacher, recently forwarded me an email. Quite apart from the fact that the sender had sent it to what looks like every secondary school in the country and didn’t have the courtesy to bcc the mailing list, there are a number of issues around it that give me some concern.

But first, the email:

TO: Faculty Head of Science / Head of Biology Department

Please find attached a new resource (pp. 12-14) by Dr Jerry Bergman on the left recurrent laryngeal nerve (RLN) for the teaching and learning of Senior Science/Biology (human evolution). [Edit: The original email had a link to the article on RLN, which was on the Institute for Creation Research website.]

• Much evidence exists that the present design results from developmental constraints.

• There are indications that this design serves to fine-tune laryngeal functions.

• The nerve serves to innervate other organs after it branches from the vagus on its way to the larynx.

• The design provides backup innervation to the larynx in case another nerve is damaged.

• No evidence exists that the design causes any disadvantage.

Freely share this resource with the teaching staff in your faculty/department.

Yours sincerely,

Scientists Anonymous (NZ)

PRIVACY ACT/DISCLAIMER Dissemination of extraordinary science resources will be made once or twice a year at the most (opt out).

All replies will be read but not necessarily acknowledged (no-reply policy applies).

The distribution of resources through this mailing system is not by the Publishers.

It’s immediately obvious that this is a thinly disguised attempt by cdesign proponentsists to get ‘intelligent design’ materials into the classroom [Those unfamiliar with the term ‘cdesign proponentsists’ please use Google – ed.]. The use of the word ‘design’ is a dead giveaway there. The arrangement of the laryngeal nerves has been noted by biologists as an example of poor ‘design’ as it doesn’t follow a straightforward path to the organs it innervates (and in fact follows an extremely lengthy detour in giraffes!), leading to the question, why would a ‘designer’ use such poor planning? (There’s a good YouTube clip on the subject.) That the ID proponents now seem to be arguing that poor design is actually purposeful and thus still evidence of a designer smacks of grasping at straws. Furthermore, the article that the email originally linked to is mounted on the Institute for Creation Research website – it’s not published in a peer-reviewed journal. So there’s nothing “extraordinary” about this particular “resource”.

Of more significance, I think, is the identity of the originators of this message (and I note they promise others in future; at least one can opt out!). “Scientists Anonymous”. This is an attempt at an appeal to authority – a bunch of scientists say so, so we should give it some weight.

But we shouldn’t – because we don’t know who they are. No-one’s publicly signed their name to this stuff, so why should we accept their authority in this matter? Are there really any practising scientists there? Are any of them biologists? Who knows… but it adds no weight to their proclaimed position on this issue. The only person mentioned by name, Jerry Bergman, is indeed a biologist by training, for whom the first Google entries are citations by Answers in Genesis and CreationWiki. Google Scholar indicates that his recent publications are not in the area of biological sciences but promote anti-evolution ideas, including the claim that Darwin’s writings influenced Hitler’s attitudes to various racial groups (an idea that’s been thoroughly debunked elsewhere).

A search for ‘scientists anonymous’ brings up a students’ Facebook site and a book of the same name about women scientists. So who, exactly, are these ‘Scientists Anonymous’ who are behind the email to schools, and why aren’t they prepared to put their names to the document?

Yet more reasons why people believe weird things

Research at Victoria University of Wellington is shedding light on the often irrational processes by which people assess new information. This article is based on presentations to the 2010 NZ Skeptics conference.

Jacqui Dean was alarmed. The Otago MP had received an email reporting the deaths of thousands of people – deaths caused by the compound dihydrogen monoxide. Dihydrogen monoxide is commonly used as an industrial solvent and coolant; it is fatal if inhaled, and is a major component of acid rain (see dhmo.org for more facts about dihydrogen monoxide). Only after she declared her plans to ban dihydrogen monoxide did she learn of its more common name: water (NZ Herald, 2007).

Ms Dean’s honest mistake may be amusing, but when large groups of people fail to correctly assess the veracity of information that failure can have tragic consequences. For example, a recent US survey found 25 percent of parents believe that vaccines can cause autism, a belief that may have contributed to the 11.5 percent of parents refusing at least one recommended vaccine for their child (Freed et al, 2010).

Evidence from experimental research also demonstrates the mistakes people can make when evaluating information. Over a number of studies researchers have found that people believe:

  • that brand name medication is more effective than generic medication;
  • that products that cost more are of higher quality;
  • and that currency in a familiar form (eg, the US dollar bill) is more valuable than currency in a less familiar form (eg, a dollar coin) (Alter & Oppenheimer, 2008; for a review, see Rao & Monroe, 1989).

Why is it that people believe these weird things and make mistakes evaluating information?

Usually people can evaluate the veracity of information by relying on general knowledge. But when people have little relevant knowledge they often turn to feelings to inform their decisions (eg Unkelbach, 2007). Consider the following question: Are there more words in the English language that start with the letter K or have K in the third position? When Nobel prize winner Daniel Kahneman and his colleague Amos Tversky (1973) asked this question most people said there are more words that start with the letter K. And they were wrong. People make this error because words that start with the letter K, like kite, come to mind more easily than words that have a K in the third position, like acknowledge, so they judge which case is true based on a feeling – the experience of ease when generating K examples.

Generally speaking, information that is easy to recall, comprehend, visualise, and perceive brings about a feeling of fluent processing – the information feels easy on the mind, just like remembering words such as kite (Alter & Oppenheimer, 2009). We are sensitive to feelings of fluent processing (fluency), and we use it as a cue to evaluate information. For example, repeated information feels easy to bring to mind, and tends to be judged as more true than unrepeated information; trivia statements written in high colour contrast (Osorno is the capital of Chile) are easier to perceive and are judged as more true than statements written in low colour contrast (Osorno is the capital of Chile); and financial stocks with easy to pronounce ticker symbols (eg KAR) outperform those with difficult to pronounce ticker symbols such as RDO (Alter & Oppenheimer, 2006; Hasher et al, 1977; Reber & Schwarz, 1999).

Most of the time, fluently processed information is evaluated more positively – we say it is true, we think it is more valuable. And on the face of it, fluency can be a great mental shortcut: decisions based on fluency are quick and require little cognitive effort. But feelings of fluency can also lead people to make systematic errors. In our research, we examine how feelings of fluency affect beliefs, confidence, and evaluations of others. More specifically, we examine how photos affect people’s judgements about facts; how repeated statements affect mock-jurors’ confidence; and how the complexity of a name affects people’s evaluations of that person.

Can decorative photos influence your beliefs about information?

If we told you that the Barringer Crater is on the northern hemisphere of the moon, would that statement be more believable if we showed you a photo of the Barringer Crater? Because the photo is purely decorative – that is, it doesn’t actually tell you anything about the location of the Barringer Crater (which is in fact in Arizona) – you probably wouldn’t expect it to influence your beliefs about the statement.

Yet, evidence from fluency research suggests that in the absence of relevant knowledge, people rely on feelings to make decisions (eg Unkelbach, 2007). Thus, not knowing what the Barringer Crater is or what it looks like, you might turn to the photo when considering whether the statement is true. The photo might bring about feelings of fluency, and make the statement seem more credible by helping you easily picture the crater and bring to mind related information about craters – even though this would still give you no objective information about where the crater is located. In our research, we ask whether decorative photos can lead people to be more willing to believe information.

How did we answer our research question?

In one experiment, people responded true or false to trivia statements that varied in difficulty; some were easy to answer (eg, Neil Armstrong was the first person to walk on the moon), some were more difficult (eg, Turtles are deaf). Half of the time, statements were paired with a related photo (eg, a turtle). In a second study, people evaluated wine labels and guessed whether each of the wine labels had won a medal. We told people that the wine companies were all based in California. In fact, we created all of the wine names by pairing an adjective (eg, Clever) with a noun (eg, Geese) to produce names like Clever Geese. Some of the wine labels contained familiar nouns (eg, Flower) and some contained unfamiliar nouns (eg, Quills). Half of the wine labels appeared with a photo of the noun.

What did we find?

Overall, when people saw trivia statements or wine names paired with photos, they were more likely to think that statements were true or that the wines had won a medal. However, photos only exerted these effects when information was difficult – that is, for those trivia statements that were difficult to answer and wine names that were relatively unfamiliar. Put more simply, decorative photos can lead you to believe claims about unfamiliar information.

Is one eyewitness repeating themselves as believable as three?

If you were a juror in a criminal case, you would probably be more willing to convict a man based on the testimony of multiple eyewitnesses, rather than the testimony of a single eyewitness. But why would you be more likely to believe multiple eyewitnesses? On the one hand, you might think that the converging evidence of multiple eyewitnesses is more accurate and more convincing than evidence from a single eyewitness, and indeed, multiple eyewitnesses are generally more accurate than a single eyewitness (Clark & Wells, 2008).

On the other hand, as some of the fluency research discussed earlier suggests, you may be more likely to believe multiple eyewitnesses simply because hearing from multiple eyewitnesses means hearing the testimony multiple times (Hasher et al, 1977). Put another way, it may be the repetition of the testimony, rather than the number of independent eyewitnesses, that makes you more likely to believe the testimony. In our research, we wanted to know whether it is the overlap of statements made by multiple eyewitnesses or the repetition of those statements that makes information more believable.

How did we answer our research question?

We asked subjects to read three eyewitness reports about a fictitious crime. We told half of the subjects that each report was written by a different eyewitness, and we told the other half that all three reports were written by the same eyewitness. In addition, half of these subjects read some specific claims about the crime (eg, The thief read a Newsweek magazine) in one of the eyewitness reports, while the other half read those same specific claims in all three reports. Later, we asked subjects to tell us how confident they were that certain claims made in the eyewitness reports really happened during the crime (eg, How confident are you that the thief read a Newsweek magazine?).

What did we find?

This study had two important findings. First, regardless of whether one or three different eyewitnesses ostensibly wrote the reports, subjects who read claims repeated across all three reports were more confident about the accuracy of the claims than subjects who read those claims in only one report. Second, when the claims were repeated, subjects were just as confident about the accuracy of a single eyewitness as the accuracy of multiple eyewitnesses. These findings tell us that repeated claims were relatively more fluent than unrepeated claims – making people more confident simply because the claims were repeated, not because multiple eyewitnesses made them.

Would a name influence your evaluations of a person?

Your immediate response might be that it shouldn’t – people’s names provide no objective information about their character. We hope that we make decisions about others by recalling information from memory and gathering evidence about a person’s attributes. Indeed, research shows that when we have knowledge about a topic, a person or a place, we do just that – use our knowledge to make a judgement – and we can be reasonably accurate in doing so (eg, Unkelbach, 2007).

But when we don’t know a person and we can’t draw on our knowledge, we might be influenced by their name. As we have described, when people cannot draw on memory to make a judgement, they unwittingly turn to tangential information to make their decisions. Therefore, when people evaluate an unfamiliar name, tangential information, like the complexity of that name, might influence their judgements. More specifically, we thought that unknown names that were phonologically simple – easier to pronounce – would be judged more positively on a variety of attributes than names that were difficult to pronounce.

How did we answer our research question?

We showed people 16 names gathered from international newspapers. Half of the names were easy to pronounce (eg, Lubov Ershova), and half were difficult to pronounce (eg, Czeslaw Ratynska). We matched the names on a number of factors to make sure any differences we found were not due to effects of culture or name length. So for example, people saw an easy and difficult name from each region of the world and names were matched on length. Across three experiments, we asked subjects to judge whether each name was familiar (Experiment 1), trustworthy (Experiment 2), or dangerous (Experiment 3).

What did we find?

Although the names were not objectively different from each other on levels of familiarity, trustworthiness, or danger, people systematically judged easy names more positively than difficult names. Put another way, people thought that easy-to-pronounce names were more familiar, more trustworthy, and less dangerous than difficult-to-pronounce names. So although we would like to think we would not evaluate a person based on their name, we may unwittingly use trivial information like the phonological complexity of a name in our judgements.

Conclusions

Why is it that people believe these weird things and make mistakes when evaluating information? Our research suggests that decorative photos, repetition of information, and a person’s name all influence the way people interpret information. More specifically, decorative photos lead people to think information is more credible; repetition leads mock-jurors to be more confident in eyewitness statements – regardless of how many eyewitnesses provided the statements; and an easy-to-pronounce name can lead people to evaluate a person more positively.

Relying on feelings of fluency can result in sensible, accurate decisions when we are evaluating credible facts, accurate eyewitness reports, and trustworthy people. But the same feelings can lead people into error when we are evaluating inaccurate facts, mistaken eyewitnesses, and unreliable people. More specifically, feelings of fluency might lead us to think false facts are true, be more confident in inaccurate eyewitness reports, and more positively evaluate an unreliable person.

A common finding across our studies is that the effect of fluency was specific to situations where people had limited general knowledge to draw on. In the real world, we might see these effects even when people have sufficient knowledge to draw on. That is because we juggle a lot of information at any one time and we do not have the cognitive resources to carefully evaluate every piece of information that reaches us – as a result we may turn to feelings to make some decisions. Therefore it is inevitable that we will make at least some mistakes. We can only hope that our mistakes are comical rather than tragic.

The authors thank Professor Maryanne Garry for her invaluable guidance and her inspiring mentorship on these and other projects.

References

Alter, A; Oppenheimer, D 2006: Proc. Nat. Acad. Sci. 103, 9369-9372.
Alter, A; Oppenheimer, D 2008: Psychonomic Bull. & Rev. 15, 985-990.
Alter, A; Oppenheimer, D 2009: Personality and Soc. Psych. Rev. 13, 219-236.
Clark, SE; Wells, GL 2008: Law & Human Behavior 32, 406-422.
Dihydrogen Monoxide – DHMO Homepage 2010: dhmo.org
Freed, G; Clark, S; Butchart, A; Singer, D; Davis, M 2010: Pediatrics, 125, 653-659.
Hasher, L; Goldstein, D; Toppino, T 1977: J. Verbal Learning & Verbal Behavior 16, 107-112.
NZ Herald 2007: www.nzherald.co.nz/nz/news/article.cfm?c_id=1&objectid=10463579
Rao, A; Monroe, K 1989: J. Marketing Research, 26, 351-357.
Reber, R; Schwarz, N 1999: Consciousness & Cognition 8, 338-342.
Tversky, A; Kahneman, D 1973: Cognitive Psych. 5, 207-232.
Unkelbach, C 2007: J. Exp. Psych.: Learning, Memory, & Cognition 33, 219-230.

Belief and knowledge: a plea about language

Alison Campbell looks at some words that cause scientific misunderstandings.

I suspect that for many of my first-year Biology students, the sheer weight of new terms they come across is perhaps the most daunting thing about the course. In some ways learning biology is rather like learning a new language, with several thousand new words swamping the page (and the brain).

But there’s more than just the new words – there’s the meaning of the words to come to terms with. This is the focus of Helen Quinn’s paper in Physics Today (2007): Belief and knowledge – a plea about language. There are many words whose meaning to a scientist may be quite different from what they mean to a layperson. Quinn feels, and I agree, that some words “are the root of considerable public misunderstanding about science: belief, hypothesis, theory and knowledge.”

‘Belief’ isn’t really a word that sits well with science. As Quinn says, it can be “an article of faith” ie religious belief. Or – conversely – in the phrase “I believe he is coming at 5pm”, you get the meaning “but I’m not really sure.” So how are we to take those news stories that begin “Scientists believe”? A statement like “most biologists believe in evolution” could be used to claim that evolution is as much faith-based as organised religion. (I tell my students that I don’t ‘believe’ in evolution, but accept it as the best available current explanation for life’s diversity. This can engender some interesting discussions…)

But what the statement “most scientists believe” means – to scientists – is that most scientists agree that the weight of evidence favours a particular interpretation. Quinn suggests we should say “scientific evidence supports the conclusion that…” I like this – it leaves open the possibility that this conclusion could change, if sufficient evidence to the contrary comes to light. Which is a much better reflection of the nature of science. Unfortunately there tends to be a perception that scientific ‘facts’ don’t change. (Also unfortunate is the fact that if scientists do change their interpretation of the data, they’re accused of not really knowing what they’re talking about. Sometimes I think we just can’t win!) Like Quinn, I feel that as scientists we shouldn’t be using the ‘b’ word – it gives the appearance that science is “just another belief system.”

‘Theory’ is another word that means different things to different people. “I’ve got a theory about that” really means, ‘I’ve got a hunch or an idea, a guess.’ But to scientists ‘theory’ means a well-established explanation for a large body of data: the theories of relativity, plate tectonics, evolution… These are definitely not guesses (nor are they belief systems!), but comprehensive explanations that have strong predictive power and have been tested time and time again. They are also incomplete, but that again is the nature of science. Scientific theories may well be modified if new evidence comes to hand: Newton’s laws are an example. (Quinn notes that Newton’s laws still hold, under certain well-defined conditions.)

It’s worth repeating Quinn’s description of how scientific theories are developed, because this is a valuable description of how science operates and what sets it apart from ‘other ways of knowing’:

When we seek to extend and revise our hypothetical frameworks, we make hypotheses, build models, and construct untested, alternate, extended theories. These last must incorporate all the well-established elements of prior theories. Experiment not only tests the new hypotheses; any unexplained result both requires and constrains new speculative theory building – new hypotheses. Models … play an important role here. They allow us to investigate and formulate the predictions and tests of our theory in complex situations. Our theories are informed guesses, incorporating much that we know. They may or may not pan out, but they are motivated by some aspects or puzzles in the existing data and theory. We actively look for contradictions.

An alien star-child?

Waikato University biological sciences lecturer Alison Campbell posts a regular blog on matters biological (sci.waikato.ac.nz/bioblog/). Her aim is to encourage critical thinking among secondary students. We think these posts deserve a wider audience.

Last week one of my students wrote to me about something they’d seen on TV:

My friend and I saw this on Breakfast this morning. Although we don’t think it is all true, we are still interested because they talked a lot about the skull’s morphology and how they believe it is the offspring from a female human and an alien. Here’s the website on it: www.starchildproject.com

It would be great to hear your thoughts.

So I went off and had a look at the website, and wrote back. My first thought is that (following what’s called ‘Occam’s razor’) the simplest possible explanation is likely to be correct, ie that this is simply a ‘pathological’ human skull, rather than a mysterious alien-human hybrid. (Read Armand Marie Leroi’s book Mutants to get a feel for just how wide the range of potential variation is in humans.)

Happily there are ways of testing this – the skull is reportedly only 900 years old so it should be possible to look at its DNA.

And indeed this has been done – and the data are presented on the Starchild project’s website. Which surprised me more than a little, given that they don’t support the hybrid idea! The skull in question – which certainly has an interesting shape – was found along with the remains of an adult female. The DNA results show that both woman and child were Native Americans, not related to each other, and also that the child was male. There is absolutely no indication there of any ‘alien’ DNA. Which is what I would have predicted – if we were to be visited by extraterrestrial individuals, why would we expect them to be a) humanoid and b) genetically compatible with us? ie the likelihood of successful interbreeding is vanishingly small. And that’s a big ‘if’ in any case … Carl Sagan had some sensible things to say on that issue in The Demon-Haunted World.

My personal view is that the whole thing should have been examined rather more critically by the programme-makers before it made it to air. But then, I have ceased to be surprised at the uncritical nature of much that’s presented by our broadcast media (with the honourable exception of the National Programme!).

Ominous trends in the schoolroom

Another annual conference has come and gone, with the usual collection of thought-provoking presentations. This issue we present two highlights, from Waikato University biology lecturer and science communicator Alison Campbell, and Greek Honorary Consul Nikos Petousis.

Science as a human endeavour

If students are to pursue careers in science, they need to be able to see themselves in that role. One way to encourage this may be through the telling of stories. This article is based on a presentation to the 2008 NZ Skeptics Conference in Hamilton.

New Zealand’s new science curriculum asks us to develop students’ ability to think critically. As a science educator I think that’s about the most important skill we can give them: the ability to assess the huge amount of information that’s put in front of them from all sorts of sources. We also need to recognise that the ideas and processes students are hearing about have come to us through the activities of people – it’s people who develop science understanding. Science changes over time, as people’s ideas change. It’s fluid, it’s done by people, and it’s a human endeavour.

This puts science in an interesting position. It has its own norms, and its own culture, but it’s embedded in the wider culture as well. Those norms of science include its history. I find it sad that many of my students have no idea of where the big ideas in science came from. They don’t know what the people who were developing those ideas were like.

The new curriculum document recognises that the nature of science is an important strand in the curriculum, because it is what gives science its context, and lets students see science as a human endeavour. They’re going to learn what science is, and how scientists do science. They will become acquainted with the idea that scientists’ ideas change as they’re given new information; that science is valuable for society. And students are going to learn how it’s communicated.

Our future prosperity depends on students continuing to enter careers in the sciences. Richard Meylan, a senior adviser at the Ministry of Research, Science and Technology, said to me recently that somewhere between the end of year 13 and that two-month break before they go to university, we seem to be losing them. The universities are tending to see a drop in the number of students who have picked science as something that they want to continue in. Students don’t seem to see it as a viable career option, and there are many reasons for that.

We need more scientists, we need scientifically-literate politicians, and we need a community that understands science: how science is done, how science is relevant; one that sees science and scientists as being an integral part of the community. But how are we going to get there? What sorts of things can we do that are going to make young people want to carry on in science? Students often don’t choose science – how are we going to change that?

One of the reasons, perhaps, is that they often don’t see themselves as scientists. We did a bit of research on this at Waikato University last year, asking what would encourage our first-year students to continue as scientists. And what they were saying was, “Well, a lot of the time I don’t see myself as a scientist.” We asked, what would make a difference? The response: “Seeing that my lecturers are people.” People first, scientists second.

When I googled ‘scientist’ I had to go through eight or nine pages of results before finding something that looks like my own idea of a scientist. (‘Woman scientist’ is a bit better!) Almost all the guys have moustaches, they’ve all got glasses, all the women are square-shaped. Students don’t see themselves in this. We need them (and the rest of the community!) to see science as something that ordinary people do.

Now, what sorts of things are those ordinary people doing? They’re thinking; they’re speculating, they’re saying ‘what if?’ They’re thinking creatively: science is a creative process and at its best involves imagination and creativity. Scientists make mistakes! Most of the time we’re wrong but that doesn’t make good journal articles; usually no-one publishes negative results. So you just hear about the ‘correct’ stuff. Scientists persist when challenged, when things aren’t always working well.

Science stories

One way of fostering students’ engagement with science, and seeing themselves in it, is to tell them stories, to give them a feeling of how science operates. Brian Greene, a science communicator and physicist in the US, says:

I view science as one of the most dramatic narratives our species can tell. The story of our search to understand the Universe and ourselves. When that search is conveyed using the power of story – the story of discovery – we can all feel part of the journey.

So I’m going to tell you stories. And I’m going to tell stories about old, largely dead, people because one of my passions at the moment is the history of science. A lot of science’s big ideas have a history that stretches back 300-400 years. But they’re just as important today, and I think that an understanding of the scientists who came up with those ideas is also important today.

I think it’s important that kids recognise that a lot of scientists are a bit quirky. But then, everyone’s a bit quirky – we’re all different. One example of someone ‘a bit different’ is Richard Feynman. Famous for his Nobel Prize-winning work in quantum electrodynamics (and for a lecture that anticipated nanotechnology), he was a polymath: a brilliant scientist with interests in a whole range of areas – biology, art, anthropology, lock-picking, bongo-drumming. He was into everything. He also had a very quirky sense of humour. He was a brilliant scientist and a gifted teacher, and he showed that from an early age. His sister Joan has a story about when she was three, and Feynman was nine or so. He’d been reading a bit of psychology and knew about conditioning, so he’d say to Joan: “Here’s a sum: 2 plus 1 more makes what?” And she’s bouncing up and down with excitement. If she got the answer right, he’d give her a treat. The Feynman children weren’t allowed lollies for treats, so he let her pull his hair till it hurt (or, at least, he behaved as if it did!), and that was her reward for getting her sums right.

Making mistakes

We get it wrong a lot of the time. Even the people we hold up as these amazing icons – they get it wrong. Galileo thought the tides were caused by the Earth’s movement. At the time, no-one had developed the concept of gravity. How could something as far away as the Moon possibly affect the Earth? We look back at people in the past and we think, how could they be so thick? But, in the context of their time, what they were doing was perfectly reasonable.

Louis Pasteur, the ‘father of microbiology’, held things up for years by insisting that fermentation was due to some ‘vital process’, not a chemical one. He got it wrong.

And one of my personal heroes, Charles Darwin, got it completely wrong about how inheritance worked. He was convinced that inheritance worked by blending. When Darwin published The Origin of Species in 1859, Mendel’s work on inheritance hadn’t yet been published. It was published in Darwin’s lifetime, and Mendel’s ideas would have made a huge difference to Darwin’s understanding of how inheritance worked – part of the mechanism for evolution that he didn’t have. But he never read Mendel’s paper.

Scientists do come into conflict with various aspects of society. Galileo had huge issues with the Church. He laid out his understanding of what Copernicus had already said: the Universe was not geocentric – the Sun and planets didn’t go round the Earth. The Church’s model was strongly geocentric: everything went round us. Galileo was accused of heresy, and shown the various instruments of torture – devices for pulling out thumbnails and crushing feet. He did recant, and he was kept under house arrest until his death. The Church officially apologised to him in 1992. A long-running conflict indeed.

And there’s conflict with prevailing cultural expectations. Beatrice Tinsley was an absolutely amazing woman; a New Zealander who has been called a world leader in modern cosmology, and one of the most creative and significant theoreticians in modern astronomy. She went to the US to do her PhD in 1964, and finished it in 1966. Beatrice published extensively, and received international awards, but she found the deck stacked against her at the University of Texas, where she worked. She was asked if she’d design and set up a new astronomy department, which she did. The university duly opened applications for the new Head of Department. Beatrice applied. They didn’t even respond to her letter. So she left Texas. (Yale did appreciate her, and appointed her Professor of Astronomy.) A couple of years later she found she had a malignant melanoma, and was dead at the age of 40. The issue for Beatrice was a conflict between societal expectations and the area where she was working: women didn’t do physics.

Science versus societal ‘knowledge’

Raymond Dart was an Australian-born anatomist who worked at the University of the Witwatersrand in South Africa. He was widely known among the locals for his fondness for fossils; you could trundle down to Prof Dart’s house, bring him a lovely bit of bone, and he’d pay you quite well. One day in 1924 the workers at Taung quarry found a beautiful little skull – a face, a lower jaw, and a cast of the brain – in real life it would sit in the palm of your hand. Dart was getting ready for a wedding when the quarry workers arrived, and he was so excited by this find that when his wife came in to drag him off to be best man, he still didn’t have his cuffs and his collar on and there was dust all over his good black clothes. He was absolutely rapt.

Dart looked at this fossil and saw in it something of ourselves. He saw it as an early human ancestor. The jaw is like ours, it has a parabolic shape, and the face is more vertical – relatively speaking – than in an ape. He described it, under the name Australopithecus africanus, as being in our own lineage and went off to a major scientific meeting, expecting a certain amount of interest in what he’d discovered. What he got was a fair bit of doubt, and some ridicule. How could he be so foolish? It was surely an ape.

By 1924 evolution was pretty much an accepted fact in the scientific community. But there was a particular model of what that meant. In some ways this built on the earlier, non-evolutionary concept of the Great Chain of Being. They also had a model that tended to view the epitome of evolutionary progress as white European males. It followed from this that humans had evolved in Europe, because that’s where all the ‘best’ people came from. Black Africans were sometimes placed as a separate species, and were regarded as being lower down the chain.

Yet here was Dart saying he’d found a human ancestor in Africa. This would mean the ancestor must have been black – which didn’t fit that world-view. It’s a racist view, but that reflected the general attitudes of society at the time, and the scientists proposing that view were embedded in that society just as much as we are embedded in ours today.

Another difficulty for Dart had to do with prevailing ideas about how humans had evolved. By the 1920s Neanderthal man was quite well known. Neanderthals have the biggest brains of all the human lineage – a bigger brain, on average, than we have. And the perception was that one of the features that defined humans, apart from tool use, was a big brain. It followed from this that the big brain had evolved quite early. Dart was saying that Australopithecus was a hominin, but Australopithecus as an adult would have had a brain size of around 400cc. We have a brain size of around 1400cc. Australopithecus didn’t fit the prevailing paradigm. The big brain had to come first; everybody knew that.

And belief in that particular paradigm – accepted by scientists and non-scientists alike – helps to explain why something like Piltdown man lasted so long. Over the period 1911-1915 an English solicitor, Charles Dawson, ‘discovered’ the remains of what appeared to be a very early human indeed in a quarry at Piltdown. There were tools (including a bone ‘cricket bat’), a skull cap, and a lower jaw, which looked very old. The bones were quite thick, and heavily stained. This was seized upon with joy by at least some anatomists because the remains fitted in with that prevailing model: old bones of a big-brained human ancestor.

People began to express doubts about this fossil quite early on, and these doubts grew as more hominin remains were confirmed in Africa and Asia. But it wasn’t completely unmasked as a fake until the early 1950s. The skull looked modern because it was a modern (well, mediaeval) skull that had been stained to make it look really old. The jaw was that of an orangutan, with the teeth filed so that they looked more human and the jaw articulation and symphysis (the join between right and left halves) missing. When people saw these remains in the light of new knowledge, they probably thought, how could I have been so thick? But in 1914 Piltdown fitted with the prevailing model; no-one expected it to look otherwise. And I would point out that it was scientists who ultimately exposed the fraud. And scientists who re-wrote the books accordingly.

Thinking creatively

The next story is about Barry Marshall, Robin Warren, and the Nobel Prize they received in 2005. (These guys aren’t dead yet!) Here’s the citation:

[The 2005] Nobel Prize in Physiology or Medicine goes to Barry Marshall and Robin Warren, who with tenacity and a prepared mind challenged prevailing dogmas. By using technologies generally available… they made an irrefutable case that the bacterium Helicobacter pylori is causing disease.

The prevailing dogma had been that if you had a gastric or duodenal ulcer, you were a type A stress-ridden personality. The high degree of stress in your life was linked to the generation of excess gastric juices and these ate a hole in your gut. Marshall and Warren noticed that this bacterium was present in every preparation from patients’ guts that they looked at. They collected more data, and found that in every patient they looked at, H. pylori was present in the diseased tissue. One of them – Marshall – got a test-tube full of H. pylori broth and drank it. He got gastritis: inflammation of the stomach lining and a precursor to a gastric ulcer. He took antibiotics, and was cured. The pair treated their patients with antibiotics and their ulcers cleared up.

Because they were creative, and courageous, they changed the existing paradigm. And this is important – you can overturn prevailing paradigms, you can change things. But in order to do that you have to have evidence, and a mechanism. Enough evidence, a solid explanatory mechanism, and people will accept what you say.

Which was a problem for Ignaz Semmelweis. He had evidence, alright, but he lacked a mechanism. Semmelweis worked in the Vienna General Hospital, where he was in charge of two maternity wards. Women would reputedly beg on their knees not to be admitted to Ward 1, where the mortality rate from puerperal fever was about 20 percent. In Ward 2, mortality was three or four percent. What caused the difference? In Ward 2 the women were looked after exclusively by midwives. In Ward 1, it was the doctors. What else were the doctors doing? They were doing autopsies in the morgue. And they would come from the morgue to the maternity ward, with their blood-spattered ties, and I hate to think what they had on their hands. Then they would do internal examinations on the women. Small wonder so many women died. Semmelweis felt that the doctors’ actions were causing this spread of disease and said he wanted them to wash their hands before touching any of the women on his ward. Despite their affronted reactions he persisted, and he kept data. When those doctors washed their hands before doing their examinations, mortality rates dropped to around three percent.

The trouble was that no-one knew how puerperal fever was being transmitted. They had this idea that disease was spread by miasmas – ‘bad airs’ – and although the germ theory of disease was gaining a bit of traction, the idea that disease could be spread on the doctors’ clothes or hands still didn’t fit the prevailing dogma. Semmelweis wasn’t particularly popular – he’d gone against the hospital hierarchy, and he’d done it in quite an abrasive way – so when he applied for a more senior position, he didn’t get it, and left the hospital soon after. He was in the unfortunate position of having data, but no mechanism, and the change in the prevailing mindset had to wait for the conclusive demonstration by Koch and Pasteur that it was single-celled organisms that actually caused disease.

Collaboration and connectedness

Scientists are part of society. They collaborate with each other, are connected to each other, and are connected to the wider world – although there have been some really weird people who weren’t. Take Henry Cavendish – the Cavendish laboratory in Cambridge is named after him. He was a true eccentric. He did an enormous amount of science but published very little, and was quite reclusive – Cavendish just didn’t like talking with people. If you wanted to find out what he thought, you’d sidle up next to him at a meeting and ask the air, I wonder what Cavendish would think about so-and-so. If you were lucky, a disembodied voice over your shoulder would tell you what Cavendish thought. If you were unlucky, he’d flee the room.

But most scientists collaborate with each other. Even Newton, who was notoriously bad-tempered and unpleasant to people whom he regarded as less than his equal, recognised the importance of that collaboration. He wrote: “If I have seen further than others, it is because I have stood on the shoulders of giants.” Mind you, he may well have been making a veiled insult to Robert Hooke, to whom he was writing: Hooke was rather short.

What about Darwin? Was he an isolated person, or a connected genius? We know that Darwin spent much of the later years of his life in his study at Downe. He had that amazing trip round the world on the Beagle, then after a couple of years in London he retreated to Downe with his wife and growing family, and spent hours in his study every day. He’d go out and pace the ‘sandwalk’ – a path out in the back garden – come back, and write a bit more. Darwin spent eight years of that time producing a definitive work on barnacles, and he didn’t do it alone. He wrote an enormous number of letters to barnacle specialists, and to other scientists asking to use work that they’d done, or to use their specimens to further the work he was doing.

He was also connected to a less high-flying world: he was into pigeons. This grew from his interest in artificial selection and its power to change, over a short period of time, various features in a species. So he wrote to pigeon fanciers. And the pigeon fanciers would write back. These correspondents were often from a lower social class, and Darwin’s family and friends may well have been a bit concerned that he spent so much time speaking to ‘those people’ about pigeons. And Darwin had a deep concern for society as well. He was strongly anti-slavery, and he put a lot of time (and money) into supporting the local working-class people in Downe. He was still going in to London to meet with his colleagues, men like Lyell and Hooker, who advised him when Alfred Wallace wrote to him concerning a new theory of natural selection. Now there’s an example of connectedness for you, and the impact of other people’s thought on your own! It was Wallace who kicked Darwin into action, and led to him publishing the Origin of Species.

That’s enough stories. I’m going to finish with another quote from Brian Greene:

Science is the greatest of all adventure stories, one that’s been unfolding for thousands of years as we have sought to understand ourselves and our surroundings. Science needs to be taught to the young and communicated to the mature in a manner that captures this drama. We must embark on a cultural shift that places science in its rightful place alongside music, art and literature as an indispensable part of what makes life worth living.
Science lets us see the wonder and the beauty of the stars, and inspires us to reach them.

Evolution in the NZ school curriculum

The teaching of evolution in New Zealand schools may seem secure, but it has faced many challenges, and these appear to be on the increase. This article is based on a presentation at the Evolution 2007 Conference, Christchurch.

Many people feel the antagonism between evolution and creationism is an issue only in the United States. However, creationism is becoming more visible around the world. Even in New Zealand, creationism, and its opposition to evolution, has a relatively long history and – as in many other countries – is currently increasing in prominence.

Evolution was first discussed in a New Zealand educational institution in 1871, when Otago University professor Duncan MacGregor pushed for the teaching of evolutionary biology. This led to moves to have him removed from his Chair, though these were ultimately unsuccessful.

New Zealand’s free, secular public education system was born in 1877. By 1881 there was some concern among the Protestant and Catholic churches that schoolteachers were being taught about evolution, thus supposedly losing the religious neutrality required of a secular system. However, school curricula contained no explicit mention of this worrying subject. New Zealanders appear to have viewed themselves as fairly open-minded in this area: Numbers & Stenhouse in 2000 noted that the NZ Herald, reporting on the 1925 Scopes trial in the US, “found it ‘hard to take the anti-evolution movement seriously'”.

However, in 1928 the Education Department published an addition to the national science syllabus that said “in the higher classes the pupils should gain some definite idea of the principle of evolution”. Though fundamentalist Christians were few in number they were extremely vocal, and their immediate and heated response to the amended syllabus was so strong that the department backed down: students should not have to learn about human origins, but rather “discover some part… of the great plan of nature”. This could be regarded as a win for the creationist camp, and was followed by the establishment of anti-evolution societies such as the Evolution Protest Movement.

In 1947 the Department of Education broadcasts to schools included a series of BBC programmes on evolution, How Things Began. Protest was swift and vociferous and included Labour party supporters worried about losing the wavering voter, as well as conservative Christian groups. The Minister of Education first suspended and then cancelled the broadcasts, despite strong opposition to this from teacher unions and other educationalists. Flushed with success, the creationist lobby attempted to get the Ministry to publish creationist articles in the School Journal, but the Minister declined. As public interest waned so too did the creationist movement, so that by the 1970s it seemed to have disappeared completely.

But at the same time, creationism in the US was experiencing a resurgence, with the popular writings and presentations of Young Earth Creationists such as Henry Morris (The Genesis Flood). In 1972 New Zealander Tony Hanne read Morris’ book and invited him on a tour of New Zealand. Visits by other US creationists followed, each generating considerable public interest in this country even though scientists in general rejected their claims. However, Numbers & Stenhouse (2000) also give the example of one university geologist who was so swayed by creationist rhetoric that he included works by Morris and Duane Gish in his own courses!

In 1982 the then Auckland Department of Education issued a creationist textbook for use in senior biology classes, a book which was widely distributed by the then Auckland College of Education’s Science Resource Centre. When questioned about the propriety of science teachers including creationism in their classes, a spokesman for the New Zealand Education Department responded that he found nothing wrong with science teachers including ‘scientific creationism’ in their classes, “as long as they’re presenting it as one possible explanation and not the only explanation”.

Scientists tended to feel that science, and evolution, had little to fear from creationism; it was viewed as a peculiarly American foible. Yet at the same time, the Creation Science Foundation (CSF) in Australia was expanding to become what was, by the 1990s, the world’s second-largest creation science organisation. This found fertile ground among religious conservatives in New Zealand, and also among our Maori and Pasifika communities (eg Peddie, 1995), and in 1994 the CSF opened a New Zealand branch, Creation Science (NZ).

1993 saw the introduction of a new Science curriculum, and the associated ‘specialist’ science curricula, for New Zealand schools. Evolution is mentioned explicitly only at Level 8 (Living World) of this document, which gives as a learning objective “students can investigate and describe the diversity of scientific thought on the origins of humans”. It goes on to say that students could be learning through “holding a debate about evolution and critically evaluating the theories relating to this biological issue” (my italics). This suggestion that there is more than one possible theory explaining evolution has left the door open for teachers and institutions who wish to bring creationism into the science classroom. Thus, in 1995 Peddie could comment, “… in this country some private schools, and some teachers within the state school system and home schooling systems, continue to teach creationism and debunk evolution.”

For example, in 2003 the Masters Institute, together with the organisation Focus on the Family, offered a workshop on intelligent design for teachers and parents, featuring speakers such as the Discovery Institute’s William Dembski. The session was billed as “an excellent learning opportunity that offers both a professional development opportunity and a fresh look at some knotty problems in science and biology” (Education Gazette, 22 August 2003). Focus on the Family has also distributed CD-ROMs based on the creationist tract Icons of Evolution to every secondary school in the country.

Concern from universities and the Royal Society was met by a response from the Ministry of Education stating that “it is not the intention of the science curriculum that the theory of evolution should be taught as the only way of explaining the complexity and diversity of life on Earth” – and that schools are free to decide their own approach to theories of the origins of life, within existing curriculum guidelines. Showing a lack of knowledge of evolution, the Ministry’s representative continued: “The science curriculum does not require evolution to be taught as an uncontested fact at any level. The theory of evolution cannot be replicated in a laboratory and there are some phenomena that aren’t well explained by it.”

We are now developing a new draft Science curriculum. This document, as well as emphasising the importance of students developing an understanding of the nature of science, recognises evolution as one of the organising themes of modern biology following Dobzhansky’s 1973 dictum, “Nothing in biology makes sense except in the light of evolution.” The curriculum document reads: “Students develop an understanding of the diversity of life and life processes. They learn about where and how life has evolved, about evolution as the link between life processes and ecology, and about the impact of humans on all forms of life”. One significant difference from the existing curriculum is that the term evolution is introduced in primary school: students in years 1 and 2 will “recognise that there are lots of different living things in the world and that they can be grouped in different ways,” and “explain how we know that some living things from the past are now extinct.” By year 13 they will be exploring “the evolutionary processes that have resulted in the diversity of life on Earth.”

The document was sent out for public consultation and the Biology component immediately drew the ire of conservative religious groups. Creation Ministries International (formerly the CSF) contacted members and supporters, asking them to lobby strongly for a reversion to the current status quo: “CMI does not suggest evolutionists be forced to teach about creation. What we do suggest is that freedom be retained for the presenting of both evolution-based and Creation-based frameworks of science. We support the teaching of evolution provided it is done accurately, ‘warts and all’, ie with open discussion of its many scientific problems included.”

And a submission from a private school stated that “… there is still no evidence to support the theory, [and]… to base [curriculum content] on an unproven theory is bizarre” (www.tki.org.nz/r/nzcurriculum/long_submissions_e.php). The writers went on to suggest that the curriculum would be better to speak of ‘diversity’, which they viewed as a much more suitable term.

There is also anecdotal evidence that many teachers oppose the new curriculum in its present form – either because they feel uncomfortable or under pressure about it in the face of potential student, parent, and community opposition, or because they themselves have a creationist worldview. At a time when biology in its various forms is set to play an important role in New Zealand’s scientific and economic development, this is something that should concern us all.

Selected references (full references available from editor)

Numbers, R.L. & Stenhouse, J. (2000) Antievolutionism in the Antipodes: from protesting evolution to promoting creationism in New Zealand. British Journal for the History of Science 33: 335-350.

Peddie, W.S. (1995) Alienated by Evolution: the educational implications of creationist and social Darwinist reactions in New Zealand to the Darwinian theory of evolution. Unpublished PhD thesis, University of Auckland.

Steiner Preschools: More taxpayer-funded loopiness

Rudolf Steiner kindergartens look set to cash in on free early childhood education initiatives.

The plan of Education Minister Steve Maharey to provide 20 hours of free early childhood education reminds us that New Zealand has a wide variety of preschools, based on diverse philosophies. Perhaps the weirdest is Waldorf-Steiner schooling, which was founded by the loopy Austrian, Rudolf Steiner (1861-1925).

Hundreds of schools and even more kindergartens in scores of countries follow Steiner’s system. Established in 1950, the Hastings Rudolf Steiner School and Kindergarten was the first Waldorf centre of education in New Zealand. The Federation of Rudolf Steiner Waldorf Schools was formed as an incorporated society in 1988 and lobbies the government. Today, the country has 10 Steiner-Waldorf schools or school initiatives and almost 40 Waldorf kindergarten groups. The Government gave most of them money from you and me even before Maharey’s scheme started.

The education they offer is based on the notions of Rudolf Steiner, the founder of anthroposophy. He believed humanity is living in the post-Atlantis period, which started with the sinking of Atlantis in 7227 BC. After the current European-American epoch ends in the year 3573, humans will regain the psychic powers they had before the time of the ancient Greeks. Steiner claimed to be the earthly ambassador of the world-encompassing spirit of our time, St Michael. According to Steiner, a hierarchy of angels and archangels influence earthly developments. Seven leading archangels take turns to guide the evolution of humanity for 354 years at a time. St Michael’s stint as “time spirit” started in late November 1879, and Steiner declared that he himself had accepted the mission of being the Michaelic Initiate, to help guide the spiritual life of the Western World. This was an event of world historic importance that took place unnoticed. Anthroposophists regard Steiner with awe and reverence. They are as gullible as Mormons.

Anthroposophy can involve bizarre behaviour. For example, some anthroposophists sit alone or in groups to read to departed souls in order to form links from our sense-perceptible world to the “so-called dead”. They claim to receive messages of “Thank you”. Some anthroposophists ask questions of Steiner himself. Occasionally statements are circulated that allegedly came from him; a feeling, as such a message is read, of one’s being loosening from the physical body is taken as a sign that the communication is genuine.

Steiner devoted time to many interests, including education, poetry, architecture, jewellery design, astrology, biodynamic agriculture, reincarnation, karma, medicine and the creation of what he called a new art, eurythmy (mime and movement). All these topics he treated in spiritual terms. Eurythmy, for example, is supposed to manifest spiritual states of being, calling upon influences from past lives and preparing for future lives.

The benefits of anthroposophical medicine are wildly exaggerated. The pricey Helios Therapeutic Retreat in Hawkes Bay sells eurythmy, massage, music and art therapy. Although perhaps nice, these pursuits will not cure any diseases. Patients who need a loan to meet the thousands of dollars in fees are referred to a finance company in Napier. In 1921 Steiner himself started a business called Weleda that has spread internationally, selling useless ‘natural’ medicines with a spiritual approach. Waldorf schools have a reputation for opposing childhood vaccinations.

Waldorf kindergartens are based on the belief that there is a spiritual side to all of life. They focus on free play, art and craft, fairy stories, myths, eurythmy, and circle time for festivals such as Michaelmas. Waldorf teachers use the ancient idea of four temperaments (choleric, phlegmatic, melancholic and sanguine) to categorise children. They might seat pupils in the classroom according to their supposed type. The Steiner approach is sometimes called racist: Steiner believed that souls pass through stages, including racial stages, with African races being lower than Asian races and European races being the highest form. Steiner education stresses fantasy and dreaminess, which anthroposophists associate with spontaneous clairvoyance. Other quirks of the system include teaching reading late and the banning of computers until high school. Television, radio and recorded music are excluded. While this approach can stimulate imaginations, it is also based on false and nutty ideas. I wonder how many New Age people and followers of alternative healers were handicapped in grasping reality because they went to a Steiner school.