Digital Photography and the Paranormal

More ‘ghosts’ than ever are appearing in photos – thanks to digital cameras. This article is based on a presentation to the NZ Skeptics 2009 conference in Wellington, 26 September.

Since the beginnings of photography in the mid-nineteenth century people have used the medium to capture images of ghosts, both naïvely and as a hoax for commercial gain. Until the arrival late in the nineteenth century of roll film, which was more light-sensitive than the earlier wet and dry plates, long exposure times sometimes resulted in spectral-looking figures accidentally or intentionally appearing in photographs. Nearly all early photographs showing alleged ghosts can be explained by double exposure, long exposure, or the recording of staged scenes – contrivances such as the cutout fairies at the bottom of the garden in Cottingley.

As cameras became more foolproof, with mechanisms to eliminate double exposure and the like, accidental ghosts in photographs became scarce. During the 1990s I carried a compact 35mm camera (an Olympus Mju-1) and shot more than five thousand photos with it. At the time I was not looking for paranormal effects such as those described below, but a quick review showed only a very few strange occurrences in the photos. This century digital compact cameras have become ubiquitous, and supposed ghost photos are also now common. There is a connection.

Design-wise, the basic layout of a compact digital camera isn’t much different from that of a compact 35mm film camera; both have a lens with a minimum focal length a little shorter than standard[1] and a flash positioned close to the lens. The main differences are the lens focal lengths and the image recording medium.

A typical 35mm film camera has a semi-wide-angle lens (which may also zoom well into the telephoto range, but we’re not much interested in that) in the range of 28mm-38mm. A standard lens for the format is about 45mm. A digital compact camera is more likely to have a lens focal length starting out in the range of 4mm to 7mm. A 5mm lens is typical, and at a maximum aperture of around f/2.8 the maximum working aperture of the lens can be less than 2mm across, and the stopped-down aperture less than 0.5mm. (As a comparison, the maximum aperture of my Mju-1 was 35mm/f3.5 = 10mm.) These tiny apertures allow things very close to the lens to be captured by the recording medium (albeit out of focus) even when the lens is focussed on a medium-to-long distance.
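The arithmetic here is simple: the working aperture diameter is just the focal length divided by the f-number. A minimal sketch in Python, using the focal lengths and f-numbers quoted above (exact values vary from camera to camera), makes the comparison concrete:

```python
def aperture_diameter_mm(focal_length_mm: float, f_number: float) -> float:
    """Physical diameter of the lens opening: focal length divided by f-number."""
    return focal_length_mm / f_number

# Typical compact digital camera: 5mm lens
print(aperture_diameter_mm(5, 2.8))   # ~1.8mm wide open
print(aperture_diameter_mm(5, 11))    # ~0.45mm stopped down

# 35mm film compact (e.g. the Olympus Mju-1): 35mm lens at f/3.5
print(aperture_diameter_mm(35, 3.5))  # 10mm wide open
```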

The most common photographic anomaly that is mistakenly held up as evidence of paranormal activity is the orb. While there are natural objects that are visible to the unaided eye and may photograph as orbs – that is, any small or point source light, either close by such as a lit cigarette or burning marsh gas, or distant such as the planet Venus – there are other types of orbs that only show up in photographs. You don’t see them but the camera does. These are mainly caused by airborne dust, moisture droplets, or tiny insects. In the dark, they are visible only briefly (for a millisecond or so) when illuminated by the camera flash. Dust is the most common cause of orbs in photographs, captured as an out-of-focus glow as it passes within centimetres of the camera lens, in the zone covered by the flash.

The diagram above shows how a compact digital camera, having its flash close to its short focal length lens, is able to photograph dust orbs. Most 35mm cameras won’t do this because the lens is too long in focal length to create a small enough Circle of Confusion[2] image of the dust, and larger Single Lens Reflex (SLR)-type cameras tend to have the flash positioned farther from the lens (above it), and also have larger image sensors and longer focal-length lenses, which makes them behave more like a 35mm camera.

Note: a built-in flash on a digital SLR, while closer to the lens axis, is set some distance back from the front of the lens, so the dust particles it illuminates are also out of view of the lens; they are behind its front element.

Specifically, a dust orb is an image of the electronic flash reflected by a mote, out of focus and appearing at the film plane as a circular image the same shape as the lens opening at full aperture. Most of the time when a compact camera takes a flash photo the aperture blades automatically stay out of the way to allow the widest possible lens opening. If the aperture blades close down at all, they create a diamond-shaped opening and any dust orb then becomes triangular, an effect predicted by this theory of dust orbs.

The diagram above shows a dust mote much closer to the camera lens than the focussed subject, a tree, and how the out-of-focus orb appears over the tree in the processed image, appearing the size of its Circle of Confusion at the film plane (or, in this case, digital imaging plane).
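To see why the short lens matters, a rough thin-lens calculation is enough. The sketch below (Python; the dust distance, focus distance and sensor width are illustrative assumptions, not measurements) estimates the diameter of the Circle of Confusion that a mote close to the lens would produce on each kind of camera:

```python
def blur_circle_mm(focal_mm, f_number, focus_dist_mm, object_dist_mm):
    """Diameter at the imaging plane of the blur circle (Circle of Confusion)
    cast by a point at object_dist_mm when the lens is focused at focus_dist_mm.
    Simple thin-lens model: 1/f = 1/subject_distance + 1/image_distance."""
    aperture = focal_mm / f_number
    v_focus = focal_mm * focus_dist_mm / (focus_dist_mm - focal_mm)    # sensor plane
    v_object = focal_mm * object_dist_mm / (object_dist_mm - focal_mm)
    return aperture * abs(v_object - v_focus) / v_object

# Assumed scenario: dust mote 50mm from the lens, camera focused at 3m.
digital = blur_circle_mm(5, 2.8, 3000, 50)    # compact digital, ~5.8mm-wide sensor
film = blur_circle_mm(35, 3.5, 3000, 50)      # 35mm film compact, 36mm-wide frame

print(f"compact digital: {digital:.2f}mm (~{digital / 5.8:.0%} of frame width)")
print(f"35mm film compact: {film:.2f}mm (~{film / 36:.0%} of frame width)")
```

On these assumed numbers the digital compact renders the mote as a small, fairly well-defined disc only a few percent of the frame wide – an ‘orb’ – while the film compact smears the same mote across roughly a fifth of the frame, diluting its light so much that it normally doesn’t register at all.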

Other common photographic anomalies which are sometimes assumed to be paranormal are caused by lens flare, internal reflections, dirty lenses and objects in front of the lens. These can all occur in any type of camera. What they have in common (and this includes dust orbs) is that the phenomena exist only in the camera: they will not be seen with the unaided eye. Most of the time, photographs that are held up as paranormal were taken when nothing apparently paranormal was suspected: the anomalous effect was only noticed later, on reviewing the images.

Another confusing aspect of photographic anomalies is the loss of sense of scale, caused by the reduction of the 3D world to a 2D photograph. In the photo opposite, it appears the baby is looking at the orb, but actually the dust particle causing the orb is centimetres from the lens and the baby is looking at something else out of frame.

A variation on this is when someone senses the presence of a ghost and responds by taking a photograph. If a dust orb appears in the photo it may be assumed to be a visual representation or manifestation of the spiritual entity. Naïve paranormal investigators and other credulous types get terribly excited when this happens, and it often does during a ghost hunt. And ghost hunting is about the only type of activity that involves wandering around in the dark taking photos of nothing in particular. Now that digital cameras have large displays, photographers using the cameras during a paranormal investigation are able to immediately see dust orbs in their photos. If they believe these orbs to be paranormal, the hysteria of the investigators is fed. I’ve seen it happen. With film cameras and even with older digital cameras having smaller displays or no photo display at all, the orb effect was not usually observed until after the investigation.

Next is an enlarged part of a photo of the Oriental Bay Marina. The ghost lights in the sky are secondary images of light sources elsewhere in the photo, caused by internal reflections in the camera lens.

While operating a camera in the dark it is easy to make a mistake such as letting the camera strap or something else get in front of the lens, or putting a fingerprint on the lens that will later cause flare. Use of the camera in Night Photography mode will cause light trails from any light source, due to the slow shutter speed (usually several seconds) combined with flash. Also, in Night mode a moving person will record as a blur combined with a sharp image from the flash, making it look like a ‘mist’ is around them.

It is important to remember that a compact digital camera processes an image file before displaying it. While a more serious camera can shoot in Raw (unprocessed) mode, most compact cameras record the image in compressed JPEG form. Cellphone cameras usually apply a lot of file compression to save memory and minimise transmission time. Digital compression creates artefacts, and the effect can be seen in the enlarged photo of the dust orb (page 12). Also, digital sharpening is automatically applied, which can turn a vague blur into a more definite shape, or a smear into a human face.
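The compression-plus-sharpening effect is easy to reproduce. Here is a minimal sketch using the Pillow imaging library (the file names are placeholders); it crudely mimics what a cellphone camera’s pipeline does automatically, and the blocky compression artefacts and exaggerated edges are plain to see in the result:

```python
from PIL import Image, ImageFilter

# Placeholder file names – any photo will do.
img = Image.open("original.jpg")

# Heavy JPEG compression, of the sort a cellphone camera applies to save memory.
img.save("compressed.jpg", "JPEG", quality=20)

# Re-open and sharpen, roughly mimicking automatic in-camera sharpening.
# Blocky compression artefacts and hard edges appear where there was only
# a vague blur, which is how a smear can gain a more definite 'shape'.
reprocessed = Image.open("compressed.jpg").filter(ImageFilter.SHARPEN)
reprocessed.save("reprocessed.jpg")
```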

We are all aware of the tendency to want to recognise human faces or figures in random patterns. This is a strong instinct, possibly rooted in infancy, when we pick out a parent’s face from the surrounding incomprehensible shapes. Once people see human features in a photo it is difficult to convince them that they’re looking at a random pattern and merely interpreting it as a face. The effect is called pareidolia (sometimes referred to as matrixing), and the resulting figure is a simulacrum.

The ‘Face in the Middle’ photo, below, is an example of pareidolia. The third face appearing between the boy and girl is the product of several background elements combining to produce the simulacrum. The low resolution and heavy compression of this cellphone photo exacerbate the effect.

While we all know it is easy to fake a ghost photo using in-camera methods such as long or multiple exposure, or in post-production using imaging software such as Photoshop, current camera technology makes this hardly necessary. It is far easier to use a compact digital camera or cellphone camera and allow it to produce the anomalous effects automatically – one reason why ‘ghost hunters’ use them. Then one can claim ignorance and honestly say they didn’t mess with the photo; it is exactly how the camera saw it. Having done a fair amount of ghost hunting myself, I know it is tempting to use a digital compact camera in the knowledge that, while it is highly unlikely an actual ghost will be photographed, a certain number of anomalous photographs will result which will at least spice up the investigation report![3]

In my experience of analysing photographs, I have found that some people are prepared to accept a rational explanation of what they thought may have been a photograph of a paranormal event. Others don’t want to hear anything rational; they’ve made up their mind that there’s a ghost in the photo and that’s the end of it. Having looked at a large number of photographs that allegedly show ghosts, I haven’t yet come across one that doesn’t fall into one of the general categories of photographic anomaly referred to above or isn’t a probable fake.

While I think that people do have ghost-like experiences (an opinion based mainly on the vast accumulation of published anecdotal evidence but also on some personal experiences that remain unexplained), it is probably not possible to photograph a ghost as such using any known method of photography (including pictures using the EM spectrum outside visible light). Photographs are not considered hard evidence of anything much these days anyway, because it is widely known that even a moderately skilled photographer or Photoshop operator can create a realistic looking picture of almost any fantasy. In paranormal matters a photograph can at best be considered circumstantial evidence requiring backup from other types of hard data and witness accounts to lend it evidential weight.

Footnotes:

  1. A standard lens has a focal length close to the diagonal measurement of the film or digital sensor. This lens renders objects in correct proportion according to their distance – a neutral perspective, neither compressed (as by longer focal length, or ‘telephoto’ lenses) nor exaggerated (as by shorter focal length, wide-angle lenses).

  2. Circle of Confusion (COC) is a term in optics for the image of a point of light, in or out of focus, formed at the imaging plane of a lens. Each point of an object forms an image circle with a diameter related to its degree of sharp focus; an in-focus point forms a tiny COC that effectively appears as a point. An infinite number of larger, overlapping COCs form the blurry (unfocussed) areas of an image. This is the basis of Depth of Field in photography.

  3. In Strange Occurrences we use digital photography in much the same way as police photographers, that is, to record details of a location for later reference. Also, long exposures with a digital SLR on a tripod can show things the unaided eye cannot quite make out in low light, such as reflected and/or diffracted light patterns from external light sources that may appear somewhat ghost-like.

Caption: The placing of the flash close to the short focal length lens of a digital camera means that dust motes can be illuminated as ‘orbs’.

Science as a human endeavour

If students are to pursue careers in science, they need to be able to see themselves in that role. One way to encourage this may be through the telling of stories. This article is based on a presentation to the 2008 NZ Skeptics Conference in Hamilton.

New Zealand’s new science curriculum asks us to develop students’ ability to think critically. As a science educator I think that’s about the most important skill we can give them: the ability to assess the huge amount of information that’s put in front of them from all sorts of sources. We also need to recognise that the ideas and processes students are hearing about have come to us through the activities of people – it’s people who develop science understanding. Science changes over time, as people’s ideas change. It’s fluid, it’s done by people, and it’s a human endeavour.

This puts science in an interesting position. It has its own norms, and its own culture, but it’s embedded in the wider culture as well. Those norms of science include its history. I find it sad that many of my students have no idea of where the big ideas in science came from. They don’t know what the people who were developing those ideas were like.

The new curriculum document recognises that the nature of science is an important strand in the curriculum, because it is what gives science its context, and lets students see science as a human endeavour. They’re going to learn what science is, and how scientists do science. They will become acquainted with the idea that scientists’ ideas change as they’re given new information; that science is valuable for society. And students are going to learn how it’s communicated.

Our future prosperity depends on students continuing to enter careers in the sciences. Richard Meylan, a senior adviser at the Ministry of Research, Science and Technology, said to me recently that somewhere between the end of year 13 and that two-month break before they go to university, we seem to be losing them. The universities are tending to see a drop in the number of students who have picked science as something that they want to continue in. Students don’t seem to see it as a viable career option, and there are many reasons for that.

We need more scientists, we need scientifically-literate politicians, and we need a community that understands science: how science is done, how science is relevant; one that sees science and scientists as being an integral part of the community. But how are we going to get there? What sorts of things can we do that are going to make young people want to carry on in science? Students often don’t choose science – how are we going to change that?

One of the reasons, perhaps, is that they often don’t see themselves as scientists. We did a bit of research on this at Waikato University last year, asking what would encourage our first-year students to continue as scientists. And what they were saying was, “Well, a lot of the time I don’t see myself as a scientist.” We asked, what would make a difference? The response: “Seeing that my lecturers are people.” People first, scientists second.

When I googled ‘scientist’ I had to go through eight or nine pages of results before finding something that looks like my own idea of a scientist. (‘Woman scientist’ is a bit better!) Almost all the guys have moustaches, they’ve all got glasses, all the women are square-shaped. Students don’t see themselves in this. We need them (and the rest of the community!) to see science as something that ordinary people do.

Now, what sorts of things are those ordinary people doing? They’re thinking; they’re speculating, they’re saying ‘what if?’ They’re thinking creatively: science is a creative process and at its best involves imagination and creativity. Scientists make mistakes! Most of the time we’re wrong but that doesn’t make good journal articles; usually no-one publishes negative results. So you just hear about the ‘correct’ stuff. Scientists persist when challenged, when things aren’t always working well.

Science stories

One way of fostering students’ engagement with science, and seeing themselves in it, is to tell them stories, to give them a feeling of how science operates. Brian Greene, a science communicator and physicist in the US, says:

I view science as one of the most dramatic narratives our species can tell. The story of our search to understand the Universe and ourselves. When that search is conveyed using the power of story – the story of discovery – we can all feel part of the journey.

So I’m going to tell you stories. And I’m going to tell stories about old, largely dead, people because one of my passions at the moment is the history of science. A lot of science’s big ideas have a history that stretches back 300-400 years. But they’re just as important today, and I think that an understanding of the scientists who came up with those ideas is also important today.

I think it’s important that kids recognise that a lot of scientists are a bit quirky. But then, everyone’s a bit quirky – we’re all different. One example of someone ‘a bit different’ is Richard Feynman. Famous, among much else, for anticipating the field of nanotechnology, he was a polymath: a brilliant scientist with interests in a whole range of areas – biology, art, anthropology, lock-picking, bongo-drumming. He was into everything. He also had a very quirky sense of humour. He was a brilliant scientist and a gifted teacher, and he showed that from an early age. His sister Joan has a story about when she was three, and Feynman was nine or so. He’d been reading a bit of psychology and knew about conditioning, so he’d say to Joan: “Here’s a sum: 2 plus 1 more makes what?” And she’s bouncing up and down with excitement. If she got the answer right, he’d give her a treat. The Feynman children weren’t allowed lollies for treats, so he let her pull his hair till it hurt (or, at least, he behaved as if it did!), and that was her reward for getting her sums right.

Making mistakes

We get it wrong a lot of the time. Even the people we hold up as these amazing icons – they get it wrong. Galileo thought the tides were caused by the Earth’s movement. At the time, no-one had developed the concept of gravity. How could something as far away as the Moon possibly affect the Earth? We look back at people in the past and we think, how could they be so thick? But, in the context of their time, what they were doing was perfectly reasonable.

Louis Pasteur, the ‘father of microbiology’, held things up for years by insisting that fermentation was due to some ‘vital process’, not a chemical one. He got it wrong.

And one of my personal heroes, Charles Darwin, got it completely wrong about how inheritance worked. He was convinced that inheritance worked by blending. When Darwin published The Origin of Species, in 1859, Mendel’s work on inheritance hadn’t yet been published. It was published in Darwin’s lifetime, and Mendel’s ideas would have made a huge difference to Darwin’s understanding of how inheritance worked – part of the mechanism for evolution that he didn’t have. But he never read Mendel’s paper.

Scientists do come into conflict with various aspects of society. Galileo had huge issues with the Church. He laid out his understanding of what Copernicus had already said: the Universe was not geocentric; it didn’t all go round the Earth. The Church model was that the Universe was very strongly geocentric: everything went round us. Galileo was accused of heresy, and shown the various instruments of torture – devices for pulling out thumbnails and crushing feet. He did recant, and he was kept under house arrest until his death. The Church officially apologised to him in 1992. A long-running conflict indeed.

And there’s conflict with prevailing cultural expectations. Beatrice Tinsley was an absolutely amazing woman; a New Zealander who has been called a world leader in modern cosmology, and one of the most creative and significant theoreticians in modern astronomy. She went to the US to do her PhD in 1964, and finished it in 1966. Beatrice published extensively, and received international awards, but she found the deck stacked against her at the University of Texas, where she worked. She was asked if she’d design and set up a new astronomy department, which she did. The university duly opened applications for the new Head of Department. Beatrice applied. They didn’t even respond to her letter. So she left Texas. (Yale did appreciate her, and appointed her Professor of Astronomy.) A few years later she found she had a malignant melanoma, and she died at the age of 40. The issue for Beatrice was a conflict between societal expectations and the area where she was working: women didn’t do physics.

Science versus societal ‘knowledge’

Raymond Dart was an Australian-born anatomist who worked at the University of Witwatersrand in South Africa. He was widely known among the locals for his fondness for fossils; you could trundle down to Prof Dart’s house, bring him a lovely bit of bone, and he’d pay you quite well. One day in 1924 the workers at Taung quarry found a beautiful little skull – a face, a lower jaw, and a cast of the brain; in real life it would sit in the palm of your hand. Dart was getting ready for a wedding when the quarry workers arrived, and he was so excited by this find that when his wife came in to drag him off to be best man, he still didn’t have his cuffs and his collar on and there was dust all over his good black clothes. He was absolutely rapt.

Dart looked at this fossil and saw in it something of ourselves. He saw it as an early human ancestor. The jaw is like ours, with a parabolic shape, and the face is more vertical – relatively speaking – than in an ape. He described it, under the name Australopithecus africanus, as being in our own lineage, and went off to a major scientific meeting expecting a certain amount of interest in what he’d discovered. What he got was a fair bit of doubt, and some ridicule. How could he be so foolish? It was surely an ape.

By 1924 evolution was pretty much an accepted fact in the scientific community. But there was a particular model of what that meant. In some ways this built on the earlier, non-evolutionary concept of the Great Chain of Being. They also had a model that tended to view the epitome of evolutionary progress as white European males. It followed from this that humans had evolved in Europe, because that’s where all the ‘best’ people came from. Black Africans were sometimes placed as a separate species, and were regarded as being lower down the chain.

Yet here was Dart saying he’d found a human ancestor in Africa. This would mean the ancestor must have been black – which didn’t fit that world-view. It’s a racist view, but that reflected the general attitudes of society at the time, and the scientists proposing that view were embedded in that society just as much as we are embedded in ours today.

Another difficulty for Dart had to do with prevailing ideas about how humans had evolved. By the 1920s Neanderthal man was quite well known. Neanderthals have the biggest brains of all the human lineage – a much bigger brain than we have. And the perception was that one of the features that defined humans, apart from tool use, was a big brain. It followed from this that the big brain had evolved quite early. Dart was saying that Australopithecus was a hominin, but Australopithecus as an adult would have had a brain size of around 400cc. We have a brain size of around 1400cc. Australopithecus didn’t fit the prevailing paradigm. The big brain had to come first; everybody knew that.

And belief in that particular paradigm – accepted by scientists and non-scientists alike – helps to explain why something like Piltdown man lasted so long. Over the period 1911-1915 an English solicitor, Charles Dawson, ‘discovered’ the remains of what appeared to be a very early human indeed in a quarry at Piltdown. There were tools (including a bone ‘cricket bat’), a skull cap, and a lower jaw, which looked very old. The bones were quite thick, and heavily stained. This was seized upon with joy by at least some anatomists because the remains fitted in with that prevailing model: old bones of a big-brained human ancestor.

People began to express doubts about this fossil quite early on, and these doubts grew as more hominin remains were confirmed in Africa and Asia. But it wasn’t completely unmasked as a fake until the early 1950s. The skull looked modern because it was a modern (well, mediaeval) skull that had been stained to make it look really old. The jaw was that of an orangutan, with the teeth filed so that they looked more human and the jaw articulation and symphysis (the join between right and left halves) missing. When people saw these remains in the light of new knowledge, they probably thought, how could I have been so thick? But in 1914 Piltdown fitted with the prevailing model; no-one expected it to look otherwise. And I would point out that it was scientists who ultimately exposed the fraud. And scientists who re-wrote the books accordingly.

Thinking creatively

The next story is about Barry Marshall, Robin Warren, and the Nobel Prize they received in 2005. (These guys aren’t dead yet!) Here’s the citation:

[The 2005] Nobel Prize in Physiology or Medicine goes to Barry Marshall and Robin Warren, who with tenacity and a prepared mind challenged prevailing dogmas. By using technologies generally available… they made an irrefutable case that the bacterium Helicobacter pylori is causing disease.

The prevailing dogma had been that if you had a gastric or duodenal ulcer, you were a type-A, stress-ridden personality. The high degree of stress in your life was linked to the generation of excess gastric juices, and these ate a hole in your gut. Marshall and Warren noticed that this bacterium was present in every preparation from patients’ guts that they looked at. They collected more data, and found that in every patient they looked at, H. pylori was present in the diseased tissue. One of them (Marshall) drank a test-tube full of H. pylori broth. He got gastritis: inflammation of the stomach lining and a precursor to a gastric ulcer. He took antibiotics, and was cured. The pair treated their patients with antibiotics and their ulcers cleared up.

Because they were creative, and courageous, they changed the existing paradigm. And this is important – you can overturn prevailing paradigms, you can change things. But in order to do that you have to have evidence, and a mechanism. Enough evidence, a solid explanatory mechanism, and people will accept what you say.

Which was a problem for Ignaz Semmelweis. He had evidence, alright, but he lacked a mechanism. Semmelweis worked in the Vienna General Hospital, where he was in charge of two maternity wards. Women would reputedly beg on their knees not to be admitted to Ward 1, where the mortality rate from puerperal fever was about 20 percent. In Ward 2, mortality was three or four percent. What caused the difference? In Ward 2 the women were looked after exclusively by midwives. In Ward 1, it was the doctors. What else were the doctors doing? They were doing autopsies in the morgue. And they would come from the morgue to the maternity ward, with their blood-spattered ties, and I hate to think what they had on their hands. Then they would do internal examinations on the women. Small wonder so many women died. Semmelweis felt that the doctors’ actions were causing this spread of disease and said he wanted them to wash their hands before touching any of the women on his ward. Despite their affronted reactions he persisted, and he kept data. When those doctors washed their hands before doing their examinations, mortality rates dropped to around three percent.

The trouble was that no-one knew how puerperal fever was being transmitted. The prevailing idea was that disease was spread by miasmas – ‘bad airs’ – and although the germ theory of disease was gaining a bit of traction, the idea that disease could be spread on the doctors’ clothes or hands still didn’t fit the prevailing dogma. Semmelweis wasn’t particularly popular – he’d gone against the hospital hierarchy, and he’d done it in quite an abrasive way – so when he applied for a more senior position he didn’t get it, and he left the hospital soon after. He was in the unfortunate position of having data, but no mechanism, and the change in the prevailing mindset had to wait for the conclusive demonstration by Koch and Pasteur that it was single-celled organisms that actually caused disease.

Collaboration and connectedness

Scientists are part of society. They collaborate with each other, are connected to each other, and are connected to the wider world – although there have been some really weird people who weren’t. Take Henry Cavendish – the Cavendish Laboratory in Cambridge is named after him. He was a true eccentric. He did an enormous amount of science but published very little, and he was quite reclusive – Cavendish just didn’t like talking with people. If you wanted to find out what he thought, you’d sidle up next to him at a meeting and ask the air, ‘I wonder what Cavendish would think about so-and-so.’ If you were lucky, a disembodied voice over your shoulder would tell you what Cavendish thought. If you were unlucky, he’d flee the room.

But most scientists collaborate with each other. Even Newton, who was notoriously bad-tempered and unpleasant to people whom he regarded as less than his equal, recognised the importance of that collaboration. He wrote: “If I have seen further than others, it is because I have stood on the shoulders of giants.” Mind you, he may well have been making a veiled insult to Robert Hooke, to whom he was writing: Hooke was rather short.

What about Darwin? Was he an isolated person, or a connected genius? We know that Darwin spent much of the later years of his life in his study at Downe. He had that amazing trip round the world on the Beagle, then after a couple of years in London he retreated to Downe with his wife and growing family, and spent hours in his study every day. He’d go out and pace the ‘sandwalk’ – a path out in the back garden – come back, and write a bit more. Darwin spent eight years of that time producing a definitive work on barnacles, and he didn’t do it alone. He wrote an enormous number of letters to barnacle specialists, and to other scientists asking to use work that they’d done, or to use their specimens to further the work he was doing.

He was also connected to a less high-flying world: he was into pigeons. This grew from his interest in artificial selection and its power to change, over a short period of time, various features in a species. So he wrote to pigeon fanciers. And the pigeon fanciers would write back. These correspondents were often from a lower social class, and various family and friends may well have been a bit concerned that he spent so much time talking to ‘those people’ about pigeons. And Darwin had a deep concern for society as well. He was strongly anti-slavery, and he put a lot of time (and money) into supporting the local working-class people in Downe. He was still going in to London to meet with his colleagues, men like Lyell and Hooker, who advised him when Alfred Wallace wrote to him concerning a new theory of natural selection. Now there’s an example of connectedness for you, and of the impact of other people’s thought on your own! It was Wallace who kicked Darwin into action, and that led to him publishing The Origin of Species.

That’s enough stories. I’m going to finish with another quote from Brian Greene:

Science is the greatest of all adventure stories, one that’s been unfolding for thousands of years as we have sought to understand ourselves and our surroundings. Science needs to be taught to the young and communicated to the mature in a manner that captures this drama. We must embark on a cultural shift that places science in its rightful place alongside music, art and literature as an indispensable part of what makes life worth living.
Science lets us see the wonder and the beauty of the stars, and inspires us to reach them.

Forum

The leading medical journal The Lancet recently published yet another analysis of trials of homeopathy. After examining 110 such trials, the Swiss researchers concluded that there was no convincing evidence that homeopathy was any more effective than placebo. In the accompanying editorial, the editor, Dr Richard Horton, made a comment which has an uncanny, and no doubt intentional, parallel with the views of the founder of homeopathy over two hundred years ago:
