The 10 Myths of 1080

Sodium monofluoroacetate (1080) is a proven tool in the New Zealand pest control arsenal, but significant opposition to its use continues, much of it irrational. This article is based on a presentation to the 2011 NZ Skeptics Conference.

There is a brutal battle being waged every night in our forests. It’s our own little horror movie. NZ’s ‘mammal mafia’ of possums, stoats and rats has been accused of devouring more than 26.5 million birds in native forest annually. Landcare Research scientist Dr John Innes, quoted in the Waikato Times, said it was time opponents of 1080 “… got real about the facts. Most endemic forest birds are disappearing because of predators – millions of forest birds are being killed by mammals every year.”

Myth One: It’s all about 1080

No, it’s not. The real issues are around protection of our natural heritage, and while we do win many battles, it’s the war that still needs to be won. We know that where we do intervene we make a positive difference, and we support a wide range of private initiatives, recognising we need all the help we can get.

Quite simply, we have a toolkit approach to pest control and we use the best tool to fit the type of country and the type of pests we are trying to manage. 1080 is a crucial part of this toolkit. It is the only toxin registered for aerial control on the mainland, and it complements a range of other toxins and the widespread use of trapping.

Its main use is on difficult, challenging country where the costs of ground control, whether by toxins or trapping, are double or treble the cost of aerial 1080 use. For example in the case of the Cascade Valley in South Westland, trying to do pest control by ground methods would have cost an extra $1 million and we have the quotes to prove it. It can be applied over 25,000 ha in a single day and is highly effective, often achieving 99 percent kills.

But DOC is not addicted to 1080. DOC does around 550,000 ha annually of mammalian pest control and less than 30 percent is delivered by aerial 1080. In terms of stoat control, over 250,000 ha are controlled by ground trapping.

Myth Two: We need an independent inquiry

Why? We have already had two and both reconfirmed the need for 1080. Indeed the Parliamentary Commissioner for the Environment (PCE) 2011 report went further than the Environmental Risk Management Agency’s (ERMA’s) 2007 review, and said we should be using more of it.

“My underlying concern is the decline in bird populations. In the future, the only place native birds will exist is on protected offshore islands and mainland sanctuaries. Without good pest control we will move to the functional extinctions of populations, where numbers are so low they are not viable,” said Dr Jan Wright, the current PCE.

Myth Three: We are poisoning paradise

Well if we are, we’re doing a terrible job of it. What does the science tell us? The latest studies on 1080 in soil measured degradation at 20, 10, and five degrees Celsius. Even at five degrees, 1080 disappeared in six to eight weeks (Dr Penny Fisher, Landcare Research). So the soil is not being poisoned and there are no lasting impacts from 1080 drops.

In a normal aerial 1080 drop there would be a pellet every 32 square metres and only 0.15 percent of each pellet is poison. A week after an operation it can be hard to find any 1080 pellets.

The pellets all biodegrade, and how long they remain depends on the rainfall and temperature. Importantly, 1080 does not bio-accumulate and does not persist in the soil. Studies show that no 1080 residues will remain, and some of the most productive wildlife areas, for example the East Taupo forests such as Pureora and Tongariro, have had multiple 1080 drops.

The user agencies have also got much better at application. The dose rates have dropped over time from 20 kg to two kg (and some promising research may allow us to drop them further), and the use of GPS systems specifically modified for NZ keeps helicopter overflies to a minimum.

Myth Four: What about the water then?

The dispersal of 1080 in water after operations has been studied for nearly 20 years and there have been over 2400 tests. Over 96 percent of the tests showed no detection at all, and where there were slight traces, these soon dispersed and there have been no impacts on human health.

The most authoritative work on breakdown in water has been done by Alastair Suren of the National Institute for Water and Atmosphere (NIWA). His water trials show that after five hours half the 1080 was lost with the concentration down to 10 percent of original after 24 hours. The baits themselves remained intact for 48 hours and by 72 hours fragmentation was occurring.

Myth Five: It’s impacting on human health

1080 is a poison and must be managed accordingly. All risk is relative and nothing can be guaranteed as totally safe. What the department is saying is that the risks to health from 1080, for a population or an individual, are insignificant in a well managed operation done under strict protocols. Most New Zealanders will never come in contact with 1080. In terms of human health risks, the people who would be most at risk from using 1080 are those who process it into cereal baits and other formulations at the factory in Whanganui.

The workers’ health is being monitored closely. The department also runs a random testing system for aerial 1080 operations to ensure staff using the product are protected. There has only ever been one death from 1080 in NZ and that happened to a possum trapper in the 1960s. It is possible that he mistook the raspberry-based paste for something edible but it is not really known.

Myth Six: You can’t prove it works

Of course we can. There is a wealth of field and working reports done by staff showing the benefits of 1080 (see sidebar). The department currently has an active science research portfolio, focused on the impacts of 1080, partly as a response to recommendations made by ERMA.

These include a forest monitoring project to look at forest recovery in the wake of 1080, a study of the impact of 1080 on kea, and a three-site trial looking at the benefits and risks for a range of native birds when using aerial 1080 for rat and stoat control. These will be formally published. The results so far from the latter trial are very encouraging and confirm what we have been saying.

• 14 kaka nests were monitored through the last 1080 drop in Whakapohai in South Westland, seven in the 1080 zone, seven in nearby areas that had not had 1080 for two or more years.

Four of seven nests fledged in the 1080 area, only three of seven fledged in the non-1080 – not much difference, but two nests in the non-1080 area were taken by possums and one by a stoat. No mammalian predators were identified killing nests in the 1080 zone though a kea got one of them.

• 36 riflemen were monitored through the last 1080 drop in Whakapohai. All survived. This is the first time riflemen have been monitored in 1080 drops.

• Comparing bird counts in the two blocks that get 1080 with the block that gets none, kaka were heard nine times more often in the 1080 area, and bellbirds six times more often. Kakariki and tomtits were also heard more often in the 1080 area, but the difference was not great.

Myth Seven: DOC ignores the native species by-kill

It certainly does not. We have always acknowledged that there may be a small by-kill but argue strongly that the benefits will comprehensively outweigh the losses – a claim which ERMA endorsed in its 2007 report on 1080. Eleven species of native bird have been intensively monitored, and several other bird species monitored using less precise techniques. None of these studies have identified population level mortality which threatens the viability of the species.

Kea are a concern. We know that in low-predator environments kea will have 80-100 percent fledging success, but in high-predator environments this will be well below 40 percent.

We do lose birds, and the real question is whether individual losses can be made up for by fledging successes. We recently lost seven kea in Okarito out of a total of 38 being monitored for the recent aerial 1080 operation, which was aimed at protecting rowi, the country’s rarest kiwi, in their habitat.

The operation itself, over 30,000 ha, has been wonderfully successful, reducing rats by 99 percent and bringing stoat numbers to just above zero, so this should allow for much greater fledging success not just for kea but for kiwi. In two previous 1080 operations where kea have been monitored we lost none at all.

Invertebrate populations have been monitored in nine aerial poisoning operations and none have shown significant population effects on any species studied, nor is there evidence to suggest poisoned invertebrates are a significant factor in secondary poisoning of other animals. Long-term monitoring of native land snails indicates substantial benefits to threatened populations in sites treated with aerial poisoning.

Myth Eight: It is not a humane poison

It can take the best part of a day for a possum to die from 1080, which most authorities rate as a ‘moderately humane’ toxin. The other element is to focus on what the bait is trying to achieve, which in the case of 1080 for conservation is protecting our most vulnerable species. It is certainly more humane than the brodifacoum that goes into the common household Talon bait for rats, which can take four days to work.

Myth Nine: You don’t do any work on alternatives

Just two examples provide a comprehensive rebuttal of this claim. DOC, in conjunction with the private firm Connovation, has just produced a toxin specifically for stoats, known as PAPP. It’s the first stoat toxin ever produced and together we have invested over $1 million. It is humane, quick acting (30-45 minutes) and it works very well.

The government and the Green Party agreed to invest $4 million over three years in our self-resetting trap, which allows a trap to be reset 12 times. Large-scale trials are under way; if successful, this should significantly improve the cost-efficiency of ground control. Overall the government has invested $3-4 million annually in new and improved methods of pest control.

Myth 10: We can do it all by fur trapping and promote an industry as well

The Department is committed to working with the possum fur harvesting industry so long as conservation objectives are not compromised. The reality is that there is often an inherent contradiction between trying to eliminate possums as the undoubted pests they are, and the needs of the possum trappers to have enough animals to make their industry economic. For the 1080 user agencies, the driver for the possum control operations is to slash numbers to as low a level as possible.

The current situation on public conservation land is that generally our possum contractors are able to recover fur if they so wish. Some do but most don’t because of the nature of our performance-based contracts and the need to do the job as promptly as possible.

Beyond this, there are literally millions of hectares of both public conservation land and private land, which are not subject to possum control management, and where possum fur trappers can go right now to get fur if they wish. They can get a permit from their local DOC office or get permission from the individual landowners and away they go. Fur price is the main driver of activity and the recent price lift to $135/kg has seen more trappers chasing the fur.

The Department is also working closely with the industry to extend its balloting system, which allows fur trappers not doing pest control the exclusive right to harvest possums off individual blocks of land for 4-8 months, thus giving them some business certainty.

In conclusion, if we didn’t have 1080 available for pest control we would have to invent it. But it is not a silver bullet and must be respected as the poison that it is. The real tragedy of 1080 is that its impact doesn’t last long enough.

1080 Success stories

• Kiwi populations in Tongariro Forest were boosted following a very successful aerial 1080 pest control operation in 2006.

• Mohua (yellowheads) were under threat from predators at the head of Lake Wakatipu. Ground operations controlled stoats, but 1080 aerial control in 2007 and 2009 was needed to control the threat from rats.

• An aerial 1080 pest control operation in Kahurangi National Park’s Anatoki River area in October 2009 significantly reduced predator numbers, curbing an expected explosion of rats and stoats.

• 1080 has been used once every four years to suppress possums in the Otira forest.

• A study during a rat plague in Fiordland in 2006 showed much reduced levels of rat predation on bats in areas treated with 1080.

For more details on these and other examples see TrakaBat’s channel on

A hoax the size of a mountain?

The Bosnian Pyramids: The Biggest Hoax in History? Directed by Jurgen Deleye. VOF de Grenswetenschap. Watch online: €5.95. DVD: €19.95 (excl. shipping). Reviewed by David Riddell.

While there are people in New Zealand who variously claim this country was settled in prehistoric times by a motley assemblage of Celts, Phoenicians and Chinese, among others, the alternative archaeology scene here is nothing like it is in Bosnia.

Now seeking to shake off the traumas of its recent past, the country has apparently embraced the theories of one Semir ‘Sam’ Osmanagich. Resplendent in his Indiana Jones-style hat, Osmanagich is delivering his compatriots a glorious ancient prehistory in the form of giant pyramids, dwarfing those of Egypt. The largest, which Osmanagich calls the Pyramid of the Sun, towers 220 metres above the town of Visoko. He claims underground tunnels link it to other, almost equally massive pyramids nearby. Single-handedly he has created a substantial tourist industry, much to the delight of the Bosnian government, which has given him support.

The Dutch team making this documentary follow Osmanagich around his sites, and generally give him enough rope to hang himself, bringing in other experts as necessary to add further comment. Those familiar with the Kaimanawa Wall (NZ Skeptic 41) and the Overland Alignment Complex in Northland (NZ Skeptic 72) will recognise how natural features can be reinterpreted in a more dramatic fashion, though the situation in Bosnia has a couple of added layers of complexity. First, there are genuine archaeological sites on and around the ‘pyramids’ and second, Osmanagich has actively reworked the landscape, even following and enlarging fissures in the earth to create his ‘tunnels’.

Bosnia is a country with a remarkable and lengthy human history and, as is very apparent in this film, great natural beauty. It shouldn’t need the dubious enhancement Osmanagich provides to entice tourists from abroad. On the other hand, it’s such a magnificent folly that if I ever found myself in Bosnia I’d probably stop by Visoko to see it all for myself.

Chemistry: an antidote to pseudoscientific thinking?

Having a basic knowledge of the principles of chemistry can help one evade the pitfalls of many pseudosciences – but it’s not infallible. This article is based on a presentation to the 2011 NZ Skeptics Conference.

2011 is the International Year of Chemistry and as such I have been involved in a number of activities to celebrate the many contributions chemistry has made to our world. It has also been a time of reflection, during which I have asked myself, can an understanding of chemistry act as an antidote to pseudoscientific thinking? But first let us start with a definition of what chemistry is.

Chemistry is the study of matter, where matter is the material in our universe which both has mass and occupies space. Matter includes all solids, liquids and gases, and chemistry explores not only the properties and composition of matter but also how it behaves and interacts. Therefore chemists also have to understand how matter and energy interact.

While in theory chemistry can be described as an isolated discipline, in its practice and application it often contributes to, and is supported by, other scientific disciplines including biology (pharmacology, molecular biology) and physics (materials science, astrochemistry).

Core Chemical Concepts

At the heart of chemistry are some central concepts which form the foundation of this discipline. Let us examine some of these.

1) Matter is made up of atoms

The most basic structural unit in chemistry is the atom. The atom itself is made up of a nucleus containing particles called protons and neutrons, around which smaller particles called electrons orbit.

2) Atoms with different numbers of protons give rise to the different elements

Atoms exist with different numbers of protons (and of neutrons and electrons). These different atoms afford the different chemical elements, which are usually represented in the form of the periodic table (see diagram). Each element has different properties and is represented on the periodic table by a one- or two-letter symbol. Ninety of the elements occur naturally and these elements can combine to form the fantastically diverse types of matter that make up our universe.

The atomic number (the number above each element) signifies the number of protons each atom has in its nucleus. You will see that as you read across each row and then down, the number of protons in the nucleus increases.

3) Atoms are really, really small

Atoms are so incredibly small that it can be hard to visualise how very small they are. For example, our lungs hold approximately 1,000,000,000,000,000,000,000,000 gas atoms, while a grain of sand contains approximately 100,000,000,000,000,000,000,000 atoms.

4) Matter cannot be created or destroyed, it can however be rearranged

All of the atoms in existence were created billions of years ago in the heart of stars early in the formation of the universe. I find this an extraordinary concept – that the atoms which make up our bodies have existed for billions of years during which time some of them may have formed part of the last Tyrannosaurus rex, the first flowering plant, or occupied the bodies of various historical figures. Carl Sagan puts this more eloquently and succinctly when he explains that “we are made of star stuff.”

5) Atoms combine to form molecules

The true diversity of the matter in our universe comes from the ability of atoms to combine to form molecules. Molecules can be simple, for example water, which is made up of one oxygen atom and two hydrogen atoms, or complex, such as DNA, which can be made up of billions of atoms of the elements carbon, hydrogen, oxygen, nitrogen and phosphorus.

Molecules are also incredibly small – a single aspirin tablet contains approximately 1,000,000,000,000,000,000,000 molecules of the active ingredient, acetylsalicylic acid.
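The aspirin figure can be checked with a back-of-the-envelope calculation using Avogadro’s number. The tablet dose (300 mg) and the molar mass of acetylsalicylic acid (~180 g/mol) are assumptions not stated in the article:

```python
# Back-of-the-envelope: molecules of acetylsalicylic acid in one tablet.
# Assumed values (not from the article): 300 mg dose, molar mass ~180.16 g/mol.
AVOGADRO = 6.022e23          # molecules per mole
MOLAR_MASS_ASPIRIN = 180.16  # g/mol
tablet_dose_g = 0.300        # grams of active ingredient

moles = tablet_dose_g / MOLAR_MASS_ASPIRIN
molecules = moles * AVOGADRO
print(f"{molecules:.1e}")  # on the order of 1e21, matching the figure above
```

Under these assumptions the answer comes out at roughly 10^21 molecules, the same order of magnitude quoted in the text.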

6) The shape of a molecule is key to its properties

The shapes of molecules have a fundamental effect on their properties. Water molecules, for example, have a V-shape which allows water to exist as a liquid at room temperature and to dissolve many different compounds. Without these fundamental properties, life as we know it would not have been able to evolve on Earth.

The shape of molecules is a key consideration in the development of new drugs. Many drugs work by interacting with specially shaped receptor or active sites in the body. To activate or deactivate these sites, a molecule of complementary shape must be able to fit into the site. And by making subtle changes to the shapes of such molecules it is possible to tune the effect of the drug molecule.

7) Matter moves

It may not be obvious to the naked eye or even under a microscope but all matter moves. In liquids such as water, the individual molecules move relative to each other, only fleetingly and temporarily interacting with other water molecules. This can be observed by adding a drop of food colouring to a still glass of water. The movement of the water molecules alone slowly mixes the colouring throughout the glass without any need for external agitation.

What do these concepts tell us about homeopathy?

Homeopathy was developed just over 200 years ago and is based on three principles:
a) that diseases can be treated by using substances that produce the same symptoms as the disease;
b) that the greater a substance is diluted, the more potent it becomes; and
c) that homeopathic solutions are ‘activated’ by physically striking them against a solid surface.

If one considers these principles against the core chemical concepts discussed so far they make little sense. How can less of a substance be more potent? How could the variable striking of water solutions have any effect on water molecules which are already in motion relative to each other, and which are therefore unable to form any collective memory of an active substance? For homeopathy to work, key chemical concepts which underlie and explain much of what we know about the physical world would have to be turned on their heads. Such a challenge to well-established chemical concepts would require extraordinary evidence.
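A quick dilution calculation makes the point concrete. The sketch below assumes a hypothetical 30C remedy (thirty successive 1:100 dilutions, a common homeopathic potency not specified in the article) starting from a full mole of active substance:

```python
# Expected molecules of active substance surviving a 30C homeopathic dilution.
# Assumptions (not from the article): start with one mole of substance and
# perform 30 successive 1:100 dilutions ("30C").
AVOGADRO = 6.022e23
DILUTION_STEPS = 30
FACTOR_PER_STEP = 100

starting_molecules = 1 * AVOGADRO
expected_remaining = starting_molecules / (FACTOR_PER_STEP ** DILUTION_STEPS)
print(f"{expected_remaining:.1e}")  # ~6e-37: effectively zero molecules left
```

Even starting from a mole of substance, the expected number of surviving molecules is about 10^-37, i.e. the chance that even one molecule remains in the final solution is vanishingly small.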

To date, no such evidence has been provided by homeopaths. Instead, over the past 200 years, repeated attempts to prove that homeopathy works have demonstrated little more than the placebo effect and the human propensity for confirmation bias.

More Chemical Concepts

8) The Earth is a closed system in terms of mass

Apart from the launch of the occasional deep space probe, the loss of helium into space, or the addition of the occasional meteor, the Earth retains a constant mass. Thus, our physical resources are limited.

9) Matter is continuously recycled

Although we have a limited resource in terms of matter, this matter is continuously recycled as these ancient and indestructible atoms are converted from one chemical compound to another. For example, the carbon in coal when burnt is converted to carbon dioxide which may then be converted by plants into sugars. Such recycling occurs for many elements, particularly in the biosphere of our planet.

10) Chemical compounds can store and release energy

Some chemical compounds are rich in energy and this energy can be released to produce energy-poor compounds. For example, when we burn coal or oil we release energy and produce energy-poor carbon dioxide, or when we consume sugars we use the energy released in our bodies and again produce carbon dioxide. This carbon dioxide can be recycled through photosynthesis in plants to produce more sugars and other energy rich compounds for food. The same is not possible for coal or oil, and as such these are limited resources.

11) Systems are in equilibrium

The systems by which matter is continually recycled are very complex and interrelated. Such complex systems are usually in equilibrium – this means that if we change one variable the system will adjust itself to compensate. For example, as the amount of carbon dioxide has increased in our atmosphere, some of it has been removed by dissolving in the oceans.

The idea of system equilibrium is used by some to claim that an increase in carbon dioxide concentrations in our atmosphere is harmless as the system can rebalance itself. This is potentially dangerous thinking. Most systems, particularly complex ones, can only buffer a certain amount of change, beyond which the system may undergo significant change as it attempts to rebalance itself. Such changes would not necessarily be conducive to human life.

What do these concepts tell us about our environment?

Fossil fuels are a non-sustainable source of energy that also release pollutants and increase carbon dioxide levels in the atmosphere. Humanity would be better served developing alternative sources of energy which harness the power of the sun more directly, for example, through solar panels, hydroelectricity, wind turbines or biofuels. More attention needs to be paid to the effects of increasing carbon dioxide levels in our atmosphere, and its effect on the equilibrium of the Earth’s biosphere.

Chemophobia – Causes and Consequences

There are millions of different chemical compounds in existence and chemists use a standardised naming system in order to better catalogue and compare these fascinating compounds. Unfortunately, amongst non-chemists this chemical jargon can create concern and even fear. For example, most people when asked would turn down an offer to eat a mixture containing methylmethoxypyrazine, phenylacetaldehyde and β-tocopherol, at least until it is revealed that the aforementioned mixture is a chocolate bar, and all of the compounds are natural components of chocolate.

This caution or fear of the unknown is a natural instinct which has served human beings well throughout our evolution – allowing us to avoid poisonous foods and dangerous predators. However, in the modern world it can be used against us. Referring to compounds by their chemical names is a ploy used by various interest groups including alternative health gurus and anti-vaxers to try and create fear of mainstream medicines.

Furthermore, it has allowed the development of the myth of the ‘chemical-free’ product. To a chemist, the only thing that is chemical-free is a vacuum.

The term ‘chemical-free’ appears to be an invention of the marketing industry: an attempt to sell products by suggesting that if they contain only natural compounds they must be safe, healthy and/or environmentally friendly. This is, of course, very flawed reasoning. Nature produces a wide range of compounds that are toxic to humans. Tetrodotoxin from poorly prepared puffer fish, ricin from castor beans (used to assassinate a Bulgarian dissident in 1978), digitalis from foxgloves and arsenic in groundwater are all just as capable of knocking us off as any synthetic compound.

Indeed, when it comes to toxicity it is not whether something is natural or synthetic that is important. Rather it is the dose. Any substance is capable of being toxic. Consuming four litres of water in two hours can prove fatal, as can several hours’ exposure to a 100 percent oxygen atmosphere.

The idea that toxicity is dose-dependent is not new. In the 16th century the Swiss physician Paracelsus stated that “all things are poison, and nothing is without poison; only the dose permits something not to be poisonous.” However, it remains a concept that is not well understood today. Special interest groups have used this to create fear around issues such as water fluoridation, vaccines, and environmental issues. For example, when DDT started to be detected in the environment at part-per-million levels, the resulting knee-jerk withdrawal of DDT from the marketplace led to a resurgence of malaria in many vulnerable populations. Following the introduction of DDT in Sri Lanka, the number of malaria cases had dropped to 17 by 1963. A few years after DDT use was banned, the number rose to 2.5 million cases in 1968 and 1969.

Another consequence of chemophobia is that it can encourage people to embrace ‘alternative’ treatments, such as homeopathy. An example of the terrible consequences of such erroneous thinking was the death of Gloria Thomas, aged nine months, in Australia in 2002, when her homeopath father refused to treat her eczema with conventional medicine. Instead, she was given homeopathic remedies until she died of septicaemia and malnutrition.

Absurd Chemical Therapies

One of the incredible hypocrisies of some alternative medicine practitioners is that they may also embrace absurd chemical therapies. Anti-vaxers who claim autistic children are really suffering from mercury poisoning sometimes promote the use of chelation therapy. Chelation therapy involves the intravenous use of chemical agents which bind to heavy metals in the blood. It is an invasive technique which can also strip the blood of important metal ions such as calcium. Indeed, there are examples of patients who have died because too much calcium has been stripped from their blood.

Other alternative treatments have included ‘miracle mineral solution’ as a treatment for everything from Aids to Irritable Bowel Syndrome. Such wide-reaching claims are an immediate warning sign, as is the revelation that ‘miracle mineral solution’ is, in fact, a 28 percent solution of bleach! Dilute solutions of dimethylsulphoxide (DMSO), an industrial solvent, have similarly been promoted as a cure-all, supported by, of course, only vague anecdotal evidence.

When challenged, those peddling these absurd therapies will often cry ‘conspiracy’, and claim they are being victimised by the all-powerful pharmaceutical industry.

Consequences of not understanding chemistry

We live in a world where important public debates are becoming contaminated with non-science and nonsense. Knowledge of chemistry can help us identify and challenge some of the non-science and nonsense when exploring important issues such as climate change, environmental issues, water fluoridation and vaccination.

Is chemistry an antidote to pseudoscientific thinking?

At the beginning of this article I posed the question, “Is chemistry an antidote to pseudoscientific thinking?” And while I hope I have demonstrated that knowledge of chemistry can help identify and challenge pseudoscientific thinking, I cannot claim that it alone is an antidote. I know this because there are those who, despite a background in chemistry, still embrace pseudoscientific beliefs. These include:

  • David Rasnick – after training as a chemist and working in medicinal chemistry for 20 years Dr Rasnick became an Aids denialist and proponent of vitamin ‘therapies’.
  • Kary Mullis, Nobel prize-winning biochemist, is an Aids denialist, a believer in astrology, and claims to have met an extraterrestrial disguised as a fluorescent raccoon.
  • Lionel Milgrom, a research chemist for 30 years, is now a practising homeopath and prominent advocate of homeopathy.

The idea that those who have trained to an advanced level in chemistry (or any other science) can go on to embrace pseudoscience has always intrigued me. I’ve often wondered how such a transition could occur, and would suggest that perhaps one or more of the following factors may be involved:

1) Frustration with science

Progress in science is often slow and frustrating. The temptation to find an easier, albeit fallacy-based, career may be appealing when faced with the many frustrations of laboratory work.

2) External bias

Religious and moral beliefs may introduce bias. For example, a number of Aids denialists are blatantly homophobic.

3) No understanding of the scientific method

While most scientists pick up the principles of the scientific method during their training, few that I am aware of are explicitly taught the scientific method.

4) Need for attention/notoriety

5) Financial motives

The peddling of pseudoscience can be quite lucrative, particularly when academic qualifications can be used to lend an appearance of legitimacy to one’s claims.

I suspect that in most cases, the embracing of pseudoscientific beliefs by scientists is a gradual process, where step by small step, they move away from the scientific method until eventually they find themselves no longer bound by its philosophy and rigour.


While an understanding of chemistry does not necessarily provide an antidote to pseudoscientific thinking, when coupled with the tools of rational thinking, it provides the skills to critically assess many areas where pseudoscientific beliefs persist including water fluoridation, environmental science, climate change, homeopathy and alternative medicines.

“Never let yourself be diverted by what you wish to believe, but look only and solely at what are the facts.” -Bertrand Russell

Earthquake forecasts and earthquake predictions

Earth scientists can forecast the size and frequency of the aftershocks following Canterbury’s September 2010 earthquake. But this is very different from earthquake prediction. This article is based on a presentation to the 2011 NZ Skeptics Conference.

Since the moment of the magnitude 7.1 earthquake near Christchurch on 4 September, GNS scientists have been using models based on aftershock statistics to ‘forecast’ the expected range of aftershocks of given magnitudes. These are not to be confused with earthquake ‘predictions’, which require specific magnitudes, locations, depths, times, and methodological reproducibility to be useful. The forecast model is based on a modified version of the long-established Omori’s Law for aftershocks, which states that the rate of aftershocks is proportional to the inverse of the time since the mainshock. Thus, depending on the values of parameters specific to certain regions, whatever the odds of an aftershock are on the first day, the second day will have approximately half the odds of the first day and the tenth day approximately one tenth. These daily odds can be summed over various time scales: the longer the time scale, the higher the cumulative probability, even though the daily probability decreases with time.
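The decay described by Omori’s Law can be sketched in a few lines of code. This is only an illustrative toy, not the GNS model: the constants K, c and p below are made-up placeholders for the modified Omori form n(t) = K/(c + t)^p, with p near 1.

```python
# Toy illustration of the modified Omori law: aftershock rate n(t) = K / (c + t)**p.
# K, c and p are placeholder values, NOT fitted Canterbury parameters.
K, c, p = 100.0, 0.1, 1.0

def omori_rate(t_days: float) -> float:
    """Expected aftershocks per day, t_days after the mainshock."""
    return K / (c + t_days) ** p

# With p = 1 and a small c, day 2 has roughly half the rate of day 1,
# and day 10 roughly one tenth of it.
print(omori_rate(2) / omori_rate(1))   # ~0.52
print(omori_rate(10) / omori_rate(1))  # ~0.11
```

With p = 1 and a small c, the ratios reproduce the halving and tenth-ing described above; region-specific fits change the numbers but not the shape of the decay.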

At present, these forecasts commonly look something like this:
“The expected number of aftershocks of magnitude 5.0 and above for the next month is 0-2, with an expected average of <1”.

Of course, one could dress this up differently using the same model applied over a full year, taking into account a reducing number of expected aftershocks, and the statement would look something like this:
“The probability of a magnitude 5.0 and above aftershock over the next year is ~82 percent”.
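The monthly expected count and the yearly percentage are two views of the same model. If event occurrence is treated as a Poisson process, an expected count λ converts to a probability of at least one event via P = 1 − e^(−λ). The λ of 1.7 below is an illustrative back-calculation from the ~82 percent figure, not an actual GNS parameter.

```python
import math

# Under a Poisson model, an expected event count lam converts to the
# probability of at least one event: P = 1 - exp(-lam).
def prob_at_least_one(lam: float) -> float:
    return 1.0 - math.exp(-lam)

# A yearly expectation of about 1.7 M>=5.0 aftershocks (an illustrative
# back-calculation, not an actual GNS parameter) gives the quoted ~82 percent.
print(round(prob_at_least_one(1.7), 2))  # 0.82
```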

We have had 31 magnitude ≥ 5.0 events since September, the frequency of which has declined systematically following our large earthquakes in September and February. So to say that there is a near certainty of an event occurring somewhere in this range in the next year is no surprising conclusion, because the unfortunate reality of aftershock sequences is that earthquakes decrease in frequency but not magnitude. Remember also that this takes into account the entire aftershock zone, spanning an area from the eastern foothills of the Southern Alps, to offshore east of Christchurch, to Rangiora and throughout the Banks Peninsula; it doesn’t forecast the likelihood of one of these events occurring beneath your house. Large aftershocks have been recorded as far west as the Porter’s Pass area.

The probability of larger earthquakes (M ≥ 6) is a bit trickier, although the methodology behind the statement:
“There is a 10 per cent chance of a magnitude 6.0 to 6.4 quake in the next year”
is the same.

To generate an earthquake of M ≥ 6, it is helpful to know whether there are faults that are long enough and ‘connected’ enough to be able to do this, and whether these faults have ruptured in big earthquakes in the past. One way to explore this is to image faults in the subsurface using geophysical methods such as reflection seismic, gravity, and aeromagnetics. These can be combined with ‘relocations’ of aftershocks and by analysing the extent to which seismic waves are ‘guided’ by fault networks, which collectively help to refine the internal structure and strength of fault zones.

The Gap

The ‘Gap’ is a term used in reference to the region of intense and continuing aftershock activity between the eastern end of the Greendale Fault that ruptured in the 4 September Darfield earthquake and the western end of the Port Hills Fault that ruptured in the 22 February Christchurch earthquake.

Analysis of earthquake data and geophysical seismic reflection surveys indicates that the Gap is not a simple continuation of either the east-west striking Greendale or ENE-WSW striking Port Hills Faults. Instead, it is a complicated zone of NE-SW to E-W oriented, steeply SE dipping faults with a total length of up to 10-12 km that is defined by an array of aftershock earthquakes that range in depth from 2 km to greater than 10 km.

Preliminary interpretations of seismic surveys indicate that a series of faults in the Gap have ruptured at various times over the past several hundred thousand years. Based on the length of the aftershock zone and the types of deformation we see in the seismic sections, we estimate that this region has probably experienced major earthquakes in the range of Mw 6-6.3 in the geologic past. Such events appear to be very infrequent, ie, recurring only once every 10,000 years or more, because even sediments that are millions of years old are only subtly deformed. We do not see any evidence for a surface rupturing earthquake in the last 5000-10,000 years or so based on interpretations of air photos from this area.

The Gap has been seismically active throughout the Canterbury earthquake sequence, from immediately following the September mainshock to the present. There have been two earthquakes of M > 5 and 23 earthquakes of M > 4 in the gap since 4 September.

The total seismic energy release in this Gap (seismic moment) is less than the total energy released on the adjacent Port Hills and Greendale Faults. In the simplest interpretation, the total seismic energy release from the Gap would eventually fit a ‘smoothed’ profile between the Greendale and Port Hills Faults. This is not strictly required, but it is what would best fit our models for how fault slip accumulates across fault systems through time. ‘Filling the Gap’ could occur via a continuing series of smaller earthquakes, as has been the case so far, or via a larger event, possibly as large as a high-magnitude 5 to low-magnitude 6. From what we understand about the behaviour of earthquakes in this area to date, it seems most likely to us that this region will continue to release seismic energy in the form of smaller earthquakes rather than an isolated large one, although the latter possibility remains.

The processes governing fault rupture are somewhat complicated, but our scientific understanding of these processes continues to improve. One could ask, “Why should the Gap behave one way during one earthquake sequence and a different way in another?” The answer is that the order and the direction in which adjacent faults rupture, the areas of these ruptures, and the processes that go on between large earthquakes, such as fault rock healing and fault closure, all influence the rupture behaviour of an individual fault segment. The overall pattern since September has been an eastward propagation of major earthquakes, starting with the Darfield earthquake in September, then the Port Hills fault rupture in the February earthquake, then the June earthquake even further east. If the sequence had started in the east and propagated west, it is entirely possible that some of these faults may have behaved differently.

Marine surveys by NIWA immediately offshore of Christchurch have revealed additional faults, some of which have had small earthquakes on them during this seismic sequence. The lengths of these faults suggest that some are capable of generating earthquakes as large as or larger than the 22 February event; however, the increased distance from Christchurch would reduce the impact on the city for a similar-sized event. In the face of our seismic realities, the best way forward is to take this opportunity to make Christchurch one of the world’s most earthquake-resilient cities.

Geologic analogies

This is my favourite geologic analogy for the Canterbury earthquake sequence. On April 23, 1992, the Mw 6.1 Joshua Tree earthquake rocked the Californian desert east of the San Andreas Fault. Two months later, on June 28, 1992, the Mw 7.3 Landers earthquake occurred in the same region, with an epicentre located approximately 40 km north of the Joshua Tree epicentre. Three hours after the Landers event, the Mw 6.2 ‘Big Bear’ aftershock occurred some 40 km to the west. On 16 October 1999, seven years after the Landers event, the Mw 7.1 Hector Mine earthquake occurred, with an epicentre some 40 km north of the Landers epicentre.
This area is adjacent to a section of the San Andreas Fault (America’s version of our Alpine Fault) that had not had a major earthquake since 1812 (one segment) and 1680 (another segment), just as our Alpine Fault does not appear to have ruptured in a major earthquake since 1717.

Palaeoseismologic estimates of the recurrence intervals of clusters of earthquakes in the Mojave Desert near the Landers rupture are in the range of 5000 to 15,000 years (Rockwell et al., 2000), similar to the expected range of recurrence intervals of active faults in our Canterbury Plains. So a situation like this is possible, although we would obviously prefer that the region settled down without the occurrence of any more big events.

Where to from here?

We will continue to provide the best scientific information we can. Wait for the information to come from scientists regarding the earthquake history, likely lengths, and ‘connectivity’ of faults in our region. Then decide whether you want to occupy your time with fear of the next big one, which may or may not eventuate in the next few years or more, or get on with your life while learning lessons about being prepared for earthquakes.

Could the magnitude and location have been predicted?

Generally, when considering the maximum magnitude in an aftershock sequence, seismologists refer to Bath’s Law, which states:
“The average difference in magnitude between a mainshock and its largest aftershock is 1.2, regardless of the mainshock magnitude”.

This is a generalisation based on analysis of global earthquake datasets, recognising that each aftershock sequence is different and there are many exceptions to the rule. Let’s look at how Bath’s Law predicts the largest aftershock magnitude for some of New Zealand’s largest earthquakes.

Earthquake      Date            Magnitude   Largest aftershock(s)
Hawke’s Bay     1931            7.8         6.9, 5.9
Pahiatua        1934            7.5         5.7
Wairarapa       June 1942       7.0         4.7
Wairarapa       December 1942   6.0         4.7
Gisborne        1966            6.2         5.0
Inangahua       1968            7.1         6.0
Arthur’s Pass   1994            6.7         6.1

Table 1. A comparison of the magnitudes of some NZ earthquakes and their largest aftershocks.

Table 1 shows mainshock-aftershock comparisons for some large New Zealand earthquakes.

The average difference between mainshock and largest aftershock for this small New Zealand dataset is approximately 1.3, broadly consistent with Bath’s Law. Prior to 22 February 2011, the difference between the 2010 Darfield 7.1 mainshock and its largest aftershock (a 5.6 that occurred only about 20 minutes after the mainshock) was 1.5. There was reason to be optimistic, as differences of this size had been seen in other sequences; however, all scientists working on the Darfield earthquake acknowledged that a larger aftershock was still possible. Unfortunately, our fears were confirmed, with the 22 February magnitude 6.3 aftershock (a 0.8-point difference from the mainshock, a larger aftershock than a simplistic interpretation of Bath’s Law would predict) and the June 13 magnitude 6.0 event.
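The Table 1 arithmetic is easy to check. The sketch below recomputes the mainshock-minus-largest-aftershock differences; for this tiny sample the mean comes out near 1.3, in the vicinity of Bath’s canonical 1.2.

```python
# Mainshock magnitude vs largest-aftershock magnitude, from Table 1.
quakes = {
    "Hawke's Bay 1931":    (7.8, 6.9),
    "Pahiatua 1934":       (7.5, 5.7),
    "Wairarapa June 1942": (7.0, 4.7),
    "Wairarapa Dec 1942":  (6.0, 4.7),
    "Gisborne 1966":       (6.2, 5.0),
    "Inangahua 1968":      (7.1, 6.0),
    "Arthur's Pass 1994":  (6.7, 6.1),
}

diffs = [main - after for main, after in quakes.values()]
mean_diff = sum(diffs) / len(diffs)
print(round(mean_diff, 2))  # ~1.31 for this small sample
```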

This illustrates that, while we can use historical examples to help us predict possible aftershock magnitudes, each sequence can be different, depending on the length (or more accurately, the potential rupture area) of faults throughout the area, the strength of the faults, how close they are to their breaking points, and how things like stress transfer and fluid pressures associated with the mainshock or other aftershocks influence these faults. This illustrates how important it is to know the location and length of other faults in the vicinity of Christchurch and offshore before we even discuss putting billions of dollars into a rebuild. This can be done relatively inexpensively with existing technology. Shouldn’t we know the location and magnitude potential of other faults throughout this region, and model how they may have been stressed or de-stressed following our big earthquakes before buildings are even designed?

To summarise, the magnitude of the 6.3 could not have been exactly predicted, but something within this magnitude range was always possible and all scientists involved in this event recognised this. We were hopeful it would not occur. A glance through some of the largest New Zealand earthquakes of the last century indicates considerable variability in the magnitude of the largest aftershock, but an aftershock this large relative to the mainshock is not unprecedented (eg the 1994 Arthur’s Pass earthquake sequence).

Earthquakes and the moon: should we worry?

  1. No one predicted the recent earthquakes in Canterbury. Vague quotes about dates of ‘increased’ activity plus or minus several days, without magnitudes, locations, and exact times, do not constitute prediction. Consider this: Ken Ring’s probability of getting a prediction correct based on perigee/apogee and new moon/full moon for 2010 was 63 percent. That’s 230 out of 365 days that fall on some day he would argue influences earthquake activity. Even on days that combined several factors (new moon plus perigee, etc), several of his predictions missed, and nothing unusual happened on those days (eg 30 January, 14 February, 27 February, 29 March, 14 June, 12 July, 10 August, and so on, even on his liberal interpretation of the aftershock sequence). This does not constitute ‘prediction’. It is opportunistic and meaningless self-promotion.

  2. Consider your chances of getting a ‘prediction’ correct given this unscientific definition of prediction. On average, New Zealand gets around 330 earthquakes of M4-4.9 every year, 26 of M5-5.9 per year, two of M6-6.9 per year, and one of M7-7.9 every three years (see the stats on Geonet). If you are unspecific about magnitude and location, then your chance of ‘predicting’ an earthquake that is likely to be locally felt and recorded is greater than 90 percent (based on the simplified assumption that each earthquake occurs on a different day, which isn’t the case, but you get the picture). This chance of course goes up immediately following a major earthquake like our 7.1, when the rate of large events is high. We had 203 earthquakes greater than magnitude 4 in the Canterbury region close to the 7.1 rupture in the six months following 4 September. So one’s chances of ‘prediction’ are actually quite high.
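The back-of-envelope hit rate in point 2 can be checked directly, using the annual counts quoted above and the same simplifying assumption that each felt quake falls on a different day.

```python
# Annual NZ earthquake counts quoted in the text (M7s, at roughly one every
# three years, are omitted; including them barely changes the result).
annual_counts = {"M4-4.9": 330, "M5-5.9": 26, "M6-6.9": 2}

# Simplification from the text: assume each quake falls on a different day.
quake_days = min(365, sum(annual_counts.values()))
hit_rate = quake_days / 365
print(f"{hit_rate:.0%}")  # 98% -- comfortably 'greater than 90 percent'
```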

  3. If we had been specifically predicting large earthquakes (M>6) on the faults near Christchurch that ruptured on 4 September and 22 February using the moon over the last several thousand years, we would have been wrong many thousands of times, with a success rate of zero, even applying the broad criteria cast by all of the possible moon scenarios listed above.

  4. There is no clear correlation between the largest aftershocks in the Darfield earthquake aftershock sequence and diurnal tides. Some of our largest earthquakes have occurred near high tide and some near low.

  5. Consider implementation of this ‘predictive’ strategy. Should we evacuate an area every time the moon is on its closest approach, is full or new, is moving rapidly, is at its maximum declination or is crossing the equator? Imagine the fear and frustration of such an approach, particularly given the unspecified times, locations, and magnitudes of the supposed ‘imminent’ events. Without a basic understanding of how faults generate earthquakes, where the faults are, at what stage they are at in the seismic cycle, and how they have been affected by prior activity, where should we evacuate and where should we go to? This would require several evacuations a month of ‘unspecified areas’ to other ‘unspecified areas’.

  6. Since humans first looked into the sky and felt the effects of earthquakes, they have wondered if the moon and planets are in some way responsible for major earthquakes. As early as 1897, scientists began to pose hypotheses about moon-earth earthquake connections and test them in an honest and rigorous way. After all, the moon still gets quakes in the absence of plate tectonics, so perhaps there is some validity to this claim.

While some astrologers may feel isolated from the scientific community, this shows a true lack of appreciation for all of those dedicating significant effort to this issue. Many of these findings from studies comparing earthquake catalogues to tides have been published in high-quality journals such as Science (eg, Cochran et al, 2004) and some scientists have argued based on statistical data from global earthquakes for an influence of tides on earthquake activity under certain circumstances, such as beneath the oceans and within active volcanoes. Some scientists have even argued for a small correlation (perhaps an increased earthquake likelihood of 0.5 to 1 percent) between smaller, shallower continental earthquakes and ‘solid earth tides’ (changes in the shape of our planet due to the gravitational pull of the moon).

This is peer-reviewed but controversial research; peer review does not make it correct, but the work has undergone scrutiny and will continue to do so. This is the scientific process. To this end, I have a postgraduate student conducting high-level geologic and statistical research on the Canterbury aftershock sequence, including spatial, temporal, and mechanistic relationships with lunar parameters. You can bet that any results, regardless of the outcome, will be published for all to see and openly scrutinise.


Top scientist turns to alternative medicine

Prominent physicist and science commentator Sir Paul Callaghan is resorting to vitamin C megadoses and Chinese medicine to treat his terminal cancer (Dominion Post, 22 September).
Diagnosed in 2008 with aggressive bowel cancer, he has been advised by his oncologist to take a break from chemotherapy to establish the full extent of the cancer’s spread. He is using the time to trial “unproven but interesting” therapies, including traditional Chinese medicine, intravenous vitamin C and “Uncle CC’s famous vegetable juice”.

“Let me be clear. I do not deviate one step from my trust in evidence-based medicine,” Sir Paul said in his blog. However, if there was a potentially effective but unproven drug, “Why would I not try it?” he reasoned. “Am I mad? Probably.”

Victoria University’s Professor Shaun Holt said he could understand terminal cancer patients clutching at straws, but there was no evidence to support vitamin C treatment. It could be harmful, causing kidney problems and interfering with effective treatments such as radiation therapy.

He was concerned Sir Paul’s use of the treatment would further increase the already high number of cancer and leukaemia patients asking for the injections.

GG swears by homeopathy

Another high-profile New Zealander expressing interest in alternative therapies recently was new Governor-General Sir Jerry Mateparae. The 56-year-old revealed in an interview (Dominion Post, 2 September) that he and his wife Janine, Lady Mateparae, shared an interest in homeopathy.
He said he had not taken a sick day since 1998. “We’ve practised a certain way of looking after ourselves which has been very good for me.”

Perhaps he feels it’s part of the job, given his position as the Queen’s representative in New Zealand, and the royal family’s well-known interest in the field.

Blogger John Pagani commented: “Placebos get you quite a long way, but only so far. After that you need actual medicine. If a soldier gets shot up on a battlefield in, say, Afghanistan, he doesn’t want Sir Jerry rubbing arnica cream on the sore bit.”

Divine solution to liquefaction

A Sefton water diviner believes he has the solution to Canterbury’s liquefaction problems (Central South Island Farmer, 7 September).

Dave Penney says he can identify underground water flows by running a crystal over a Google map, followed by on-site investigation. While the article quoted one happy customer, Waimakariri utilities manager Gary Boot was unconvinced by Penney’s proposal that “confluences” of underground flows could be located and drilled, to reduce pressure and stabilise the land. Areas with the worst liquefaction had widespread and very consistent groundwater, Mr Boot said.

“Finding the groundwater is not the challenge. The challenge is how best to treat the land in an affordable manner.”

iPhone trumps psychic

Chilean authorities have used a psychic to help find 17 missing bodies after the crash of a plane near Robinson Crusoe Island killed all 21 people on board (NZ Herald, 6 September).
“Not only are we using all of our technological capabilities, but also all the human and superhuman abilities that may exist,” said Defence Minister Andres Allemand.

He did, however, seek to lower expectations of recovering all the bodies.

The plane’s fuselage was located a few days later, in part using information from a passenger’s iPhone, which transmitted its location shortly before the crash (AVweb, 9 September). Now if only psychics were as smart as iPhones.

Spontaneous Human Combustion in Ireland?

An Irish coroner has ruled a pensioner found dead at home was a case of spontaneous human combustion (NZ Herald, 25 September).

Unsurprisingly, given the history of this phenomenon, 76-year-old Michael Faherty’s charred remains were found on the floor near an open fireplace. Forensic experts concluded the open fire was not the cause of the blaze, and that there were no accelerants at the scene. The only damage to the room was a scorched ceiling and floor adjacent to the body.

The case sounds like a classic of its type: an elderly diabetic with presumably limited mobility is found next to an open fire, with his body consumed but his head left unburned. In 1998 scientists on the British TV programme QED (available from the NZ Skeptics video library) showed how this happens, using a pig carcass wrapped in fabric to simulate the victim. An ember spat from the fire catches in clothing and starts to burn; the fire is then fed by fat from the victim (who has already died of a heart attack, or is about to due to the stress of finding himself alight) as it melts and ‘wicks’ into the clothing. The head, lacking a decent supply of fat, remains unscathed, and any sign of heart disease or other pathology is burned away. QED, indeed.

Medium caught cheating

Sally Morgan, who styles herself “Britain’s best-loved psychic”, has been caught receiving outside information during one of her shows (The Guardian, 20 September).

Chris French, editor of UK magazine The Skeptic, relates how an audience member named Sue reported on an Irish radio station how she had been impressed by Morgan’s accuracy during the first half of her performance.

“But then something odd happened. Sue was sitting in the back row on the fourth level of the theatre and there was a small room behind her (‘like a projection room’) with a window open. Sue and her companions became aware of a man’s voice and ‘everything that the man was saying, the psychic was saying it 10 seconds later.'”

Other callers to the radio show confirmed Sue’s account.

Sue said she believed the man was feeding information to Morgan via a microphone. The voice would say something like “David, pain in the back, passed quickly”, and a few seconds later Morgan would have the spirit of a David on stage with just those attributes. When a member of staff realised several people were aware of the voice the window was gently closed.

Sue speculated that information had been gathered in the foyer prior to the show by an accomplice engaging audience members in conversation, a technique French says ‘psychics’ use widely, as their marks naturally discuss among themselves who they are hoping to hear from.

The theatre’s general manager claimed the voice came from two theatre staff members. Sally Morgan Enterprises also denied that the medium was being fed information during the show.

French compared the incident to James Randi’s use of a radio scanner to pick up messages sent to faith healer Peter Popoff’s earpiece in 1986, the subject of an entertaining YouTube video clip. Although the exposure led to Popoff declaring bankruptcy the following year, he is back; his ‘ministry’ received US$23 million in 2005. History suggests, says French, that most of Morgan’s followers will continue to adore her and pay the high prices demanded to see her in action, despite this incident.

Ring again

Just one more small piece on Ken Ring, then no more, I swear. Despite promises to get out of the earthquake prediction business, he was in Upper Hutt recently declaring Wellington could expect a magnitude 7 quake some time between 2013 and 2016 (Upper Hutt Leader, 5 October).

Of course, predicting earthquakes in Wellington is a bit like predicting drought in the Sahara, and a four-year timeframe is a bit vague, to say the least. He says Wellington gets magnitude 7 quakes every 11 to 13 years (really?) and this period is when the next one is due.

I guess any half-way decent shake in the next eight years or so will be put down as a hit, and if nothing comes along before the end of 2016, who’s going to remember what he said in the Upper Hutt Library in October 2011? How can he lose?

Irrationality waxes once again

There are times when the world seems to run along quietly from day to day, with very little happening. Then there are times like these. There are the ongoing aftershocks in Christchurch, many of them big enough in their own right to qualify as major quakes at any other time. There was the far larger earthquake in Japan, with its ensuing slow-motion nuclear disaster. There are wars and revolutions across the Middle East and North Africa which seem set to transform the politics of those regions. Millennial anxieties are on the rise once more.

It’s only to be expected at such times that irrationality should flourish. When natural disasters strike at random, many have a desperate need to seek some kind of pattern, or cause. Hence the attention given to Ken Ring’s claim to have used phases of the moon and solar activity to predict the Christchurch quakes – if the experts can’t say when earthquakes will strike (though the general pattern of aftershocks has actually followed GNS’s forecasts quite well) then there is a niche for those who claim they can. Many skeptical bloggers (eg Peter Griffin, Matthew Dentith, Alison Campbell, Darcy Cowan and particularly the Silly Beliefs team) have dealt with Ring’s claims; we add our five cents’ worth later in this issue.

Meanwhile in the US, many commenters on internet forums are putting the Japan earthquake down to karma for Pearl Harbor. Also in that country, self-proclaimed prophet Harold Camping is raising quite a stir with his calculation that the Rapture will occur on May 21 this year – 19 months before the 2012 buffs’ choice for the Big Day. Camping says of the current upheavals: “There are still people that God has to save, and he uses them to get them to cry out for his mercy.”

There’s not much sign of that happening yet in Christchurch, where the citizens are more intent on helping themselves and each other, rather than seeking divine assistance. Slowly the city is getting back on its feet, despite ongoing tremors; life is returning. A small sign of that is that the NZ Skeptics annual conference will once again be held there, from 26 to 28 August. Register with the form mailed out with this issue, or do it on-line at

Christchurch always seems to have had more than its share of Skeptics, many of whom have been seriously affected by the quakes. It will be good for us to get together once again, to share the strength of our usually far-flung community.

Apocalypse soon: Unwarranted skepticism and the growth fetish

The dire predictions of the Club of Rome’s 1972 report on The Limits to Growth have supposedly been refuted by subsequent studies, but the refutations have serious shortcomings. This article is based on a presentation to the NZ Skeptics 2009 conference in Wellington, 26 September.

We belong to a species that dominates the planet. After millennia of steady growth which have altered regional environments and killed off many species, the human population has exploded during one lifetime. Whereas it took millennia to reach the first billion, the human population tripled in 140 years to three billion by 1960, and is currently trebling again in just 80 years, to nine billion in 2040. We have become a plague.

Many scientists, including myself, have been concerned with this picture. There is considerable evidence describing an overpopulated world, threatened by food and water shortages, a shortage of oil supplies, and huge changes due to global warming. Consider the message in the figure on the right, which adds more recent data to the Limits to Growth forecasts of Meadows et al (1972) for the Club of Rome’s Project on the Predicament of Mankind. World population may hit a peak around 2040-2050 and then rapidly decline. My own research, including work with a number of international forecasting projects, suggests that the peak will happen sooner, around 2030.

This model was based on a considerable body of research and is supported by many other more detailed studies. Here we have a picture of a world in which population may plummet following an overshoot-and-decline pattern when limits are passed.

I looked at this some 35 years ago with the eyes of an applied mathematician. I had seen that a model can capture the essence of a situation and provide realistic guidance, just so long as the model is based on the key aspects of a greater complexity. The thought of possible global collapse within one lifetime impressed me and I set off on a new career. I have found that the picture based on physical science can readily be fleshed out by reference to past historical events. It is easy to foresee the repetition of population collapse, social breakdown and war.

So here am I proclaiming apocalypse in just 20 years. What do you make of it? Either I am mistaken or society is just a little bit crazy. J B Priestley made this point in relation to William Blake:

And no doubt those who believe that the society we have created during the last hundred and fifty years is essentially sound and healthy will continue to believe, if they ever think about him, that Blake was insane. But there is more profit for mind and soul in believing our society to be increasingly insane, and Blake (as the few who knew him well always declared) to be sound and healthy.

I introduce this point as I have been treated as a pariah for taking up an extremely important scientific endeavour. Should you be sceptical of those like me who talk of impending catastrophe? Certainly, but consider the alternative which is to put your faith in those who have dismissed the reality of a finite Earth. The Limits to Growth was the subject of widespread denunciation by the supporters of status quo economic growth. Let’s look at the validity of some of the critics; we at the DSIR considered many and found some bizarre arguments.

One key critique was a 1977 Report to the United Nations, The Future of the World Economy, by a team headed by Nobel Prize-winning economist Wassily Leontief. The Dominion reported that:

Among the most significant aspects of the study are its rejection of predictions by the Club of Rome that the world will run out of resources and choke on its pollution if it continues to expand its economy.

The summary of the report emphasised this theme:

No insurmountable physical barriers exist within the twentieth century to the accelerated development of the developing regions.

Read that carefully. It says “within the twentieth century”. The Limits to Growth‘s authors made a forecast of a possible calamitous population collapse around 2050 – not within the twentieth century. By stopping their model in 2000, 50 years short of that date, the UN team made quite sure that they avoided any possibility of such an event. In fact, as far as they went, their forecasts are very similar to those of The Limits to Growth.

Such sleight of hand is not uncommon. In 1978 I worked for six months with the OECD Interfutures project. While I was able to study an extensive collection of input information, I had no real part in the analysis, which was dominated by a small core group. The 1979 report includes a claim that would be satisfactory to the clients, the wealthy nations of the world:

Economic growth may continue during the next half-century in all the countries of the world without encountering insurmountable long-term physical limits at the global level.

There are two reasons why this statement is misleading. Firstly, all their many computer model calculations stopped in 2000 and did not reach out that far, so this is not in any way based on the work of so many of us in this project. Secondly, they look ahead for just 50 years, thus stopping short of 2050, the forecast time for crisis. It is always easy to dodge a crisis by stopping short of the due date, like the fellow falling off a building who felt that all was well as he sailed down, before he reached the pavement. They knew what The Limits to Growth forecast; they knew what they were doing.

These are examples of the way in which organisations employ expertise to generate desired results and make unjustified claims. Many readers will be sceptical of the warnings of approaching limits. Such scepticism may be better applied to many of the arguments for continuation of growth; here is a New Zealand example.

In 1990 the Planning Council published a report, The fully employed high income society (Rose 1990), which received nationwide publicity for its suggestion that sustainable full employment with full incomes was possible by 1995, given high rates of productivity increase but otherwise continuing current policies.

When I read the document carefully I found some very questionable points:

(a) Estimates of employment requirements commenced in 1988 and ignored the significant loss of jobs between that date and 1990.

(b) Modelling of productivity increases was based on earlier modelling which had proved unrealistic and overly optimistic, and assumed a further doubling of productivity.

(c) The model run commenced in 1984 with these increases in productivity in order to generate an optimistic result in 1995, thus ignoring the negative experiences of 1984-90.

(d) The model was instructed to produce full employment by 1995 – this was not a consequence of the modelling based on policy changes as represented by input parameters.

(e) Full employment was completely generated by additional capital investment.

The model failed the most basic scientific test of forecasting even before it was published. In the four years from 1986, the date one model run commenced, to 1990, the date of the report, the model had suggested an increase in employment of 38,000, whereas the actual experience was of a fall of 90,000. Nor was that followed by a fully employed society – indeed unemployment was 11 percent in 1991.

The main feature of this work was a failure to produce the required result of full employment within realistic model parameters. The correct process would then have been to report that finding, which would have been in line with what actually happened, but they chose to tweak the model by the introduction of massive capital investment. This artificial process forced the model to say what was wanted and the result was then widely publicised.

Whereas the growth merchants have feet of clay, the limits forecasts from the 1970s hold up well when put to the test. When in 2008 the CSIRO (noted above) returned to the 1972 forecasts of The Limits to Growth and considered whether the real world had followed the forecast trends, the results were convincing. They considered measures of population (birth rates, death rates and population growth), food (and food per capita), services (basic education, electricity and suchlike), industrial output per capita, non-renewable resources and global pollution. All were tracking along the forecast paths towards the coming crisis.

The graph of population on page 3 is typical. These further graphs (right) of food and industrial output per capita, non-renewable resources and global persistent pollution show the same correlation between forecast and observation.

Data since 1972 follow the standard run closely, and do not deviate to follow alternative paths. This result echoes a study I carried out in 2000, when I found that my worrying picture built up around 1980 was robust. Trends have been intriguingly following the expected pattern, including more recently the 2008 oil peak and economic collapse, galloping global warming and the appearance of boat people off the Australian coast.

When I studied the futures literature back in 2000 I found two very different dominant themes. Each followed observed trends and each could describe features of the coming decades. Some of the articles suggested the possibility of food shortages, which would exacerbate the considerable inequalities observed today. That negative scenario may be exacerbated by water shortages and climate change. However a much more prevalent picture was of increasing human capabilities, new technologies and wealth.

No choice is needed; both sets of forecasts may prove robust, as existing trends take different regions or different groups along very different paths. There is then the possibility of the coexistence of two very different societies in the future. This is quite likely; after all it was like that in mediaeval times and in eighteenth and nineteenth century Europe, and this is the reality in many parts of the world today.

I have described the application of the scientific method to long-term forecasting. This is the way a scientist operates, in a search for the truth. An opposite process is followed in economics, where false analyses are widely publicised, and the fit of forecast to reality is ignored. New Zealand discourse is dominated by shonky science. The key work on global crisis comes from the Australian CSIRO while the DSIR, where I started my work, is no more. Here science is in a straitjacket of controls, totally gutted. In a recent round of grants eight out of nine applications were turned down, and initiative is killed as scientists waste time writing proposals for guaranteed results rather than asking questions and exploring the world. The human cost has also been enormous with the crushing of the lively, questioning spirit in true science. The fun of science is gone. Sadly the spokesman for the scientific community, the Royal Society (RSNZ) is quiescent.

Even in economics much more can be done. In 1989 I was able to foresee the collapsing system we have now. Sometimes I dream that we can recover the spirit of the 1970s when the debate was well-informed, when an initiative in the DSIR was supported and the Commission for the Future was set up. It is nowhere on the horizon. This is a country that is deep in denial, which can sign up to Kyoto and then do nothing as greenhouse gas emissions from 1990 to 2007 increase 39.2 percent for energy and 35 percent for industrial processes. Where is the madness here?

Ignorance goes nowhere. A people which faces the world with eyes wide open can gain a national spirit and decide to work towards a satisfying and full life for all, even in the face of adversity, rather than put up with the massive inequality introduced in 1984 and still touted as the way forward.

Graphs are reproduced with permission from Graham M Turner 2008: A comparison of The Limits to Growth with 30 years of reality, Global Environmental Change 18(3): 397-411.

Scare Stories Endanger the Environment

Vicki Hyde hands out this year’s Bent Spoon and Bravo Awards

A documentary which highlights the “distress, cruelty, horror, ecocide, cover-ups and contamination” involved in 1080-based pest control has won the Bent Spoon from the NZ Skeptics for 2009.

Poisoning Paradise – Ecocide in New Zealand claims that 1080 kills large numbers of native birds, poisons soils, persists in water and interferes with human hormones. Hunters-cum-documentary makers Clyde and Steve Graf believe that 1080 has “stuffed the venison business”, and have been travelling the country showing their film since March.

The NZ Skeptics, along with other groups, are concerned that wide media coverage and nation-wide screenings of Poisoning Paradise will lead to a political push, rather than a scientifically based one, to drop 1080 as a form of pest control, with nothing effective to replace it. United Future leader Peter Dunne appeared in the film, and described 1080 as “an indiscriminate untargeted killer”.

Emotions run high in the debate, with one anti-1080 campaigner going so far as to hijack a helicopter at gunpoint and, last month, threaten to die on Mount Tongariro unless the documentary received prime-time billing.

Members of the NZ Skeptics are involved in various conservation efforts across the country. They have seen first-hand the effectiveness of 1080 drops and the brutal ineffectiveness of attempts to control pests by trapping and hunting, even in the smaller fenced arks, let alone in more rugged, isolated areas like Hawdon Valley or Kahurangi National Park.

People say that 1080 is cruel – so is a possum when it rips the heads off kokako chicks. Environmental issues aren’t simple; we are forever walking a difficult balancing act. At this stage, 1080 is the best option for helping our threatened species hang on or, even better, thrive. It would be devastating for our wildlife were we to abandon this.

I have a particular interest in this area, having served for eight years on the Possum Biocontrol Bioethics Committee, alongside representatives from Forest & Bird, the SPCA and Ngai Tahu. Over the past 20 years I have seen 1080 use become more effective with the advent of better knowledge and application methods, though I acknowledge there is always room for improvement.

We would dearly love a quick, cheap, humane, highly targeted means of getting rid of possums and other pests but until that day comes, we cannot ignore the clear and present danger to our native wildlife. To do so would be environmentally irresponsible in the extreme. People should be cautious about taking documentaries at face value. A 2007 TV3 documentary, Let Us Spray, has just been cited as unbalanced, inaccurate and unfair by the Broadcasting Standards Authority.

We tend to assume that documentaries are balanced and tell us the whole story, but the increased use of advocacy journalism doesn’t mean this is always the case. After all, remember that psychic charades in programmes like Sensing Murder are marketed as reality programmes!

The NZ Skeptics also applaud the following, with Bravo Awards, for demonstrating critical thinking over the past year:

  • Rebecca Palmer, for her article The Devil’s in the Details (Dominion Post 15 June 2009) pointing out that the makutu case owed more to The Exorcist than to tikanga Maori. Exorcism rituals, regardless of where they come from, have been shown to harm people, psychologically and physically. There are over 1,000 cases of murder, death and injury recorded on the website as a result of exorcisms reported in the Western world over the past 15 years. There are thousands more, for the most part unregarded, in places like Africa, or Papua New Guinea. These are all needless victims, often injured by people who care for them and who tragically just didn’t stop to think about the nature of what they were doing.
  • Closeup for Hannah Ockelford’s piece Filtering the Truth (11 September 2009), regarding the dodgy sales tactics by an Australian organisation which claims that New Zealand’s tap water can cause strokes, heart attacks, cancer and miscarriages. Paul Henry described the Australian promoter as a shyster using scare tactics targeting vulnerable people.
  • Rob Harley and Anna McKessar for their documentary The Worst That Could Happen (Real Crime, TV1, 29 July 2009). They took a hard look at the increasing tendency for accusations of accessing computer porn to be made on flimsy grounds, and at how these accusations can have devastating consequences for people.
  • Colin Peacock and Jeremy Rose of Mediawatch on Radio New Zealand National. Every week Colin and Jeremy cast a critical eye on New Zealand media. That’s something we all should be doing in demanding that we get thoughtful, informed news and analysis from our media.

Skeptics and the environment

When it comes to environmental issues, it’s not always easy for a skeptic to decide where to stand

Over the last few years, there has been a growing community of “environmental skeptics”, who question the validity of global environmental concerns. Bjorn Lomborg’s book The Skeptical Environmentalist is a major contribution to this strand of thought. At the 2004 Skeptics’ Conference in Christchurch, Lance Kennedy presented some of the ideas that he espouses in his book Ecomyth. The final speaker of the conference, Owen McShane, presented his version of environmental skepticism, and an abridged version of his presentation appeared in Issue 74 of this journal.

Writers such as Lomborg, Kennedy and McShane provide interesting food for thought, and illustrate that in the environmental field, as in others, there is a need for careful critical thinking. However, there is a significant difference. In general, we skeptics tend to be skeptical about beliefs that run counter to mainstream scientific thought – astrology, paranormal phenomena, UFOs, creation science and alternative medical practices are examples. In contrast, environmental skeptics often bravely challenge the opinions of scientists who are specialists in the fields concerned. In this respect, environmental skeptics are somewhat equivalent to alternative medical practitioners or creation scientists. This does not mean that they are necessarily wrong, but it does mean that they must produce very good evidence to show that the experts are wrong. Here the adage “extraordinary claims demand extraordinary proof” applies to the environmental skeptics themselves rather than to the objects of their skepticism.

In practice, environmental skeptics are often inconsistent and selective in their attitudes to science and professionals. For example, in his chapter on global warming, Kennedy largely ignores and discounts the work of the 2000+ climate scientists who make up the UN’s Intergovernmental Panel on Climate Change (IPCC). Yet in his chapter on nature conservation, he states “We should question what enables the amateur environmentalists to set themselves up as ‘experts’ and deny the analysis and planning of professionals.” Having made this statement, it is interesting that he feels that he is qualified to state categorically that “Global warming and its consequences are an unproven theory.” This statement is suspiciously similar to those made by “creation scientists” in criticism of evolution theory. In fact, I see parallels between the history of the evolution debate and the current climate change debate. It may be that in 100 years’ time, those that continue to deny the reality of climate change will be seen as the lunatic fringe minority and objects of ridicule for skeptics of the time.

In other ways also, environmental skeptics display some of the characteristics of those whom we as skeptics would normally challenge. For example, I believe that the refusal to accept the reality of global environmental problems is very similar to the refusal of most people to accept that there is no life after death. It seems that humans instinctively reject unpalatable news.

In a similar manner to people such as proponents of quack medicine, environmental skeptics are selective in their use of scientific information. In his talk to the conference, Lance Kennedy stressed the need to employ good science, and that it is essential to “rely on the numbers”. Unfortunately, his book does not provide a good demonstration of this. For example, his chapter on global warming includes the graph in Figure 1. It is virtually meaningless, with no indication of the origin of the data, no data on the vertical axis and in fact no indication at all of what it purports to illustrate.

Environmental skeptics often ignore rather than challenge the mainstream environmental science community. They focus much of their criticism on sometimes admittedly questionable claims by the more visible and extreme environmental lobby groups. Greenpeace, WWF and the World Resources Institute are favourite targets. The skeptics often fail to clarify that, at a less visible level, there is a huge body of rational and responsible scientists world-wide who confirm a high degree of real cause for environmental concern. This is somewhat akin to condemning the whole world of Islam by quoting Muslim philosophies as espoused by Al Qaeda.

In the fields that we are traditionally involved in, we skeptics get frustrated about the willingness of the media to give time and credence to mediums, alternative health practitioners and the like, without seeking an informed balanced viewpoint. In the environmental field, it is the professional practitioners who can be frustrated by the coverage given to the environmental skeptics (and, for that matter, the antics of radical environmental lobbyists).

Environmental Management

I suspect that if the human lifespan was really the 500-800 years claimed for the Old Testament patriarchs, self-interest would assure that we would have quite a different attitude to the future state of the world.

Owen McShane states that “We are rich enough to care about the environment…Truly poor people focus on finding tomorrow’s breakfast.” In fact, the great majority of environmental aid projects in developing countries through UN and other reputable international agencies focus on the impacts of environmental degradation on people. They explicitly address and focus on the need to protect and improve the welfare of those in poverty. None of the many international environmental projects that I encountered in 10 years’ work in around 15 poor countries was based on the ecocentric anti-people philosophy that Owen McShane criticises.

As just one example of the direct impact of environmental mismanagement on human welfare, I mention Muinak, a village in northern Uzbekistan. Up until the 1960s, Muinak was the home for a fleet of fishing trawlers and a fish factory, as part of a fishing industry that took some 40,000 tonnes of fish per year from the Aral Sea. Under the direction of Soviet central planning in Moscow, the waters from the two major rivers feeding the Aral Sea were taken for irrigation of cotton crops. The Aral Sea is now a remnant of its former self, and the fishing industry is gone. When I visited Muinak about four years ago, the trawlers were rusting hulks in the sand. Muinak was over 100 kilometres from the water’s edge, and was fast becoming a ghost town. Most poignantly, the town’s World War II memorial, built on the sea cliffs overlooking the point where local soldiers embarked to cross the Aral Sea to join the war effort, now looks out over desert stretching to the horizon and beyond.

A particular concern that I have is that environmental skeptics (and for that matter some environmental lobbyists) tend to think in time scales that are far too short. A profound influence on my thinking was the marvellous “Time-Line” installation by Bill Taylor that we saw at Victoria University at the 2003 Skeptics Conference (NZ Skeptic 70). In brief, the 4.6 billion year life of Earth was represented by a cord 4.6 km long. On this basis, the 2000 years since the dawn of the Christian era occupied the final two millimetres.

I find it amazing and somewhat sobering to consider that, on this scale, the dawn of the Industrial Revolution occurred only 0.15 millimetres ago. There is no question that in that instant of geological time, humans have wrought major changes to our global environment. For example, it is an accepted fact that recent human activity has caused measurable changes to the composition of the Earth’s atmosphere, and in particular the concentration of the so-called “greenhouse gases”. The debate is not about whether these changes have occurred, but whether they are causing climate change. To me, that is almost academic. The fact that, in such a blink of time, we have caused measurable changes to the atmosphere that sustains all life is adequate cause for concern.

The Resource Management Act requires us to consider the needs of future generations. Owen McShane talked about the difficulty of this concept, because the future generations are walking away in front of us, so that we never get there. To me, this indicated that he sees “future generations” in the very short term, meaning our immediate successors, our children and grandchildren. I see things quite differently and in a longer term. Given the headlong pace of change and impact in just the last hundred years, my concern is for the way the 20th and 21st Century generations might be viewed in say 500 years (0.5 mm), 1000 years (1 mm), or even longer.
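The cord arithmetic above is simple enough to check. Here is a minimal sketch using only the figures quoted in this article (4.6 billion years mapped onto a 4.6 km cord, with the Industrial Revolution taken as roughly 150 years ago):

```python
# Scale check for the "Time-Line" cord: 4.6 billion years of Earth
# history represented by a 4.6 km (4,600,000 mm) cord.

EARTH_AGE_YEARS = 4.6e9
CORD_LENGTH_MM = 4.6e6  # 4.6 km expressed in millimetres

def years_to_mm(years: float) -> float:
    """Length on the cord corresponding to a span of years."""
    return years * CORD_LENGTH_MM / EARTH_AGE_YEARS

print(years_to_mm(2000))   # the Christian era
print(years_to_mm(150))    # since the Industrial Revolution
print(years_to_mm(500))    # 500 years hence
print(years_to_mm(1000))   # 1000 years hence
```

On this scale every millennium occupies a single millimetre, which is precisely what makes the installation so sobering.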

Environmental skeptics tend to airily dismiss energy concerns by saying that we have enough fossil fuels to last part or all of this century. Again, this is short term thinking. While one may argue about the remaining life of reserves of fossil fuels, the inescapable fact is that they are a finite resource. There is no doubt whatsoever that, on a geological or evolutionary time-scale, the period in which humans have been able to develop and maintain a lifestyle that relied on one-off extraction of fossil fuels will be a mere instant of history.

Loss of Forests

On the subject of forest loss, Lance Kennedy states that world forest cover has increased from 1950 to the present — from 40 million to 43 million square kilometres. In fact, the 2000 Global Forest Resources Assessment by the Food and Agriculture Organisation of the UN puts the current figure at 39 million. More importantly, its best assessment is that there was a net global forest loss in the 1990s of about 2.2%. This is equivalent to an area the size of New Zealand every three years. Again, this apparently small percentage figure is very serious if considered in any form of medium or long-term time frame. For tropical forests, environmental skeptics accept that there is a rate of loss of about 0.5% per year, but dismiss this as being of little cause for concern. Again, this is in fact a very high rate of loss both in absolute area and if considered in the context of even a medium time frame of say 100 years.
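The “New Zealand every three years” comparison is easy to verify. A back-of-envelope sketch, assuming the FAO figure of roughly 39 million square kilometres of forest, the 2.2 percent net loss over the decade of the 1990s quoted above, and New Zealand’s land area of about 268,000 square kilometres:

```python
# Back-of-envelope check: does a 2.2% loss per decade of ~39 million km2
# of forest really equal one New Zealand-sized area every three years?

FOREST_AREA_KM2 = 39e6         # FAO 2000 global forest estimate
DECADAL_LOSS_FRACTION = 0.022  # net loss over the 1990s
NZ_AREA_KM2 = 268_000          # New Zealand's land area (approximate)

loss_per_year_km2 = FOREST_AREA_KM2 * DECADAL_LOSS_FRACTION / 10
years_per_new_zealand = NZ_AREA_KM2 / loss_per_year_km2

print(round(loss_per_year_km2))         # annual loss in km2
print(round(years_per_new_zealand, 1))  # years to lose one NZ-sized area
```

The answer comes out at a little over three years, consistent with the comparison in the text.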

Some environmental skeptics dismiss concerns about any future scarcity of fossil fuels and their polluting effect by suggesting that they will be replaced by hydrogen as a source of energy for transport. In reality, hydrogen is not a fundamental energy source, but only a medium to transport energy, somewhat equivalent to electrical cables or batteries. Production of hydrogen itself requires huge energy inputs. Current technologies to produce hydrogen either use fossil fuels as a base (with large energy losses on the way through), or require electrical energy to produce it by electrolysis of water, again with energy losses in the process. For the moment, there seems little prospect of achieving the required dramatic increase in electricity production other than by using fossil fuels or nuclear energy, which of course is no more than a relocation of the same problem.

Both Lomborg and Kennedy ridicule pessimistic writers of previous decades. They paint a rosy picture of the current situation and point out how much better things are than such writers’ forecasts. What seems to escape them is that all such earlier predictions had an underlying message, “If we don’t change our ways, … will happen.” In fact, the improvements that Lomborg and Kennedy are now trumpeting are in nearly every case because governments and society responded to the concerns that grew so rapidly in the 60s and 70s, and did change their ways.

Ironically, having criticised the weaknesses in earlier predictions, Lomborg and Kennedy are willing to make or embrace unsubstantiated predictions that suit their arguments. For example, in discussing oil prices, Lomborg said that “It is also expected that the oil price will once again decline from $27 to the low $20s until 2020.” As I write this, the price is hovering in the mid-$50s.

Kennedy’s optimistic predictions are of a more general nature, apparently based more on a touching faith in science and technology rather than on rational analysis. They read rather like the confident predictions of a clairvoyant or an evangelist. For example, in discussing the predicted world population size in the middle of this century, he states “The world will be able to nourish such numbers by the time growth reaches this point. This ability will come from improvements in biotechnology and in other sciences, and in the increase of prosperity and agricultural efficiency in developing nations. The pessimists will again be wrong.”

This article is not a call to ignore and ridicule the work and beliefs of environmental skeptics. In this as in other fields there is a need for critical thinking. Having worked in the environmental field for some 30 years both in New Zealand and elsewhere, I have my own doubts about certain aspects. I have my own concerns about both the philosophy and application of the Resource Management Act. However, skeptics do need to appreciate that environmental skepticism is of a different character to skepticism as we usually understand it, and needs to be approached with caution. It is easy to criticise mediums, psychics, homeopaths and spoon benders, with little fear of exposing ourselves to credible scientific challenge. If we do join in the environmental skepticism debate, let us be sure that we do so with the same quality of informed critical thinking and respect for all the facts that we espouse in our other activities.

Communicating the nature of science: Evolution as an exemplar

Science as taught at school is often portrayed as a collection of facts, rather than as a process. Taking a historical approach to the teaching of evolution is a useful way to illustrate the way science works.

The need for a scientifically literate population is probably greater now than ever before, given the rapid pace of change in science and technology. Members of such a population have the tools to examine the world around them, and the ability to critically assess claims made in the media. However, there are difficulties with conveying just what science is about and how it is done; in letting people know how the scientific world-view differs from “other ways of knowing”. This is particularly evident when dealing with evolutionary theory, often described as “just a theory”, and probably the only scientific theory to be rejected on the grounds of personal belief. How can we alter such misconceptions and extend scientific understanding?

Part of my role at Waikato University involves liaising with local and regional high school teachers of biology and science. Over the past few years I have received numerous requests from local secondary school teachers to provide a resource they could use in teaching evolution. Discussion with teacher focus groups revealed a number of content areas they would like to have available:

  • links to the New Zealand curriculum and to relevant web-sites,
  • evolutionary process (including the sources of genetic variation and how natural selection operates),
  • human evolution,
  • New Zealand examples,
  • evidence for evolution,
  • ways of dealing with opposition among students (and colleagues),
  • and the historical perspective.

These last two items are particularly significant, since modelling a way of presenting the historical development of evolutionary thought, and by extension the nature of science itself, offers a way of countering opposition to the theory of evolution.

I have deliberately used the example of evolution, because there is good evidence (eg Abd-El-Khalick and Lederman 2000; Passmore and Stewart 2000; Passmore and Stewart 2002) that altering the way in which evolution is traditionally taught offers the opportunity to show people the nature of science – what it is and how it works. For example, rather than taking a confrontational approach to their students’ beliefs, Passmore and Stewart (2000) provided a number of models of evolution and encouraged the students to determine which model best explained a particular phenomenon.

Similarly, William Cobern (1994) has commented:

“Teaching evolution at the secondary level is very much like Darwin presenting the Origin of Species to a public who historically held a very different view of origins. To meet this challenge, teachers [should] preface the conceptual study of evolution with a classroom dialogue… informed with material on the cultural history of Darwinism.”

He goes on (Cobern 1995):

“I do not believe that evolution can be taught effectively by ignoring significant metaphysical (ie essentially religious) questions. One addresses these issues not by teaching a doctrine, but by looking back historically to the cultural and intellectual milieu of Darwin’s day and the great questions over which people struggled.”

Taking such an approach is highly significant in developing an understanding of the nature of science, since an historical narrative will not only place Darwin’s work into its historical and social context, but will also show how he applied the scientific method to solving his “problem” of evolution. This approach is central to the Evolution for Teaching website (see NZ Skeptic 71; the other members of the website team are Dr Penelope Cooke of Earth Sciences, Dr Kathrin Cass, and Kerry Earl from the Centre for Science & Technology Education Research), and is also one I use in my own teaching, where every year I encounter students who have a creationist worldview. Such views may well become more common, given that there appears to be a coordinated effort to make material promoting Intelligent Design Theory (and denigrating evolutionary thought) available in schools.

This teacher-generated list, and the philosophy described above, informed the planning and design of the Evolution for Teaching website, which is hosted by the School of Science and Technology at Waikato University. First we felt it important to make explicit the nature of scientific hypotheses, theories, and laws, to overcome difficulties originating in differing understandings of the word “theory”. (Much of the language of science offers opportunities for such misunderstandings, since it invests many everyday terms with other, very specific meanings eg Cassels and Johnstone 1985; Letsoalo 1996.)

The site offers links to the NCEA matrices for Science and Biology, plus FAQs, book and site reviews, and a glossary.

Feedback has been almost entirely positive, with all the teachers attending its launch in March indicating that they would use it in their teaching and recommend it to their students. Without exception they found it attractive, easy to navigate, and informative, providing information at a level suitable for both themselves and their students. Student comments support this last point. Since the site went “live” in March 2004 it has received around 100,000 hits per month, indicative of a very high level of interest.


It is with sadness that I see that the Skeptic is still accepting articles and letters with political bias. I would like to spend much of this letter countering some of Owen McShane’s arguments from his article “Why are we crying into our beer?”, but I see we are still arguing in the pages of our magazine about science. It would be really nice if Jim Ring or C Morris could explain to me, and I’m sure to others who are puzzled by this whole affair, what legitimate arguments between legitimate scientists have to do with scepticism.
