Archive for the ‘Guardian Articles’ Category

Survival of the Thickest

21/12/2012

Intelligent Design is so intellectually bankrupt, it does not deserve to be taught in school – even in religious education classes.

Originally published by the Guardian 24th January 2007

http://www.guardian.co.uk/profile/simonunderdown

Evolution is a subject that elicits a wide range of responses, from simple denial by the religious fundamentalist to demi-worship by the occasional scientist. However, the most common response, and the one most overlooked in this most crystallised of debates, is confusion. Although everyone has at least heard of Darwin, and probably has the phrase “survival of the fittest” somewhere in the back of their minds (a term, in fact, coined not by Darwin but by Herbert Spencer, in 1864), there does seem to be widespread public misunderstanding about evolution and the mechanisms by which it operates (for example, the oft-repeated question: “If we evolved from monkeys, why are there still monkeys?”). This problem can only be exacerbated by the announcement by the QCA that Intelligent Design (ID or “Creationism Lite”) will be taught in Religious Education lessons in England.

Intelligent Design – the idea that organisms of great complexity cannot have evolved by natural selection and that a creator or God is therefore responsible for all or some life as we know it – is not a science: it cannot be scientifically tested, as evolution continues to be. There is no debate among serious scientists beyond bemused amazement that small groups persist in holding ID up as a genuine alternative to Darwinian evolution.

Yet, even though the debate will take place in the RE classroom, the reverberations will be felt not just in the science class but across the educational sector as a whole. The decision to include ID in school curricula will give the impression that ID is a worthy alternative to evolution. This move by the QCA has the potential to do one of two things, depending on how it is taught: either show Intelligent Design for what it really is (empty waffle based on the creation myth) or muddy the already murky waters of public understanding of Darwinian evolution.

We have come to a fork in the road. ID can be embraced as part of the curriculum (and, surely, that way madness lies) or it can be cast out into the wilderness; an historical footnote comparable with that written on the authorities who confidently opposed universal suffrage on “scientific grounds”. ID is not science and, despite the increasingly vocal objections of a small minority, has yet even to fire a shot across the bows of Darwinian evolution. As a human evolutionary biologist, the thought of having to spend time explaining the glaring errors of ID to undergraduates at the expense of more worthy material fills me with dread.


The Evolution will be Televised

21/12/2012

I disagree with Johnjoe McFadden’s criticisms of Steve Jones, but genetics is not the only factor in our species’ survival

Originally published by the Guardian 24th January 2007

http://www.guardian.co.uk/profile/simonunderdown

The geneticist Steve Jones has announced that human evolution has stopped. This claim is based on a genetic view of the evolutionary process, and while it is almost certainly true from a gene-centric perspective, it addresses only a small part of what it is to be human. While Jones is absolutely right about the slowing of natural selection’s effects on the human gene pool, human evolution is not a purely genetic affair, and the path of our development as a species cannot possibly be understood without examining the role played by cultural adaptations.

The course of human evolution has displayed a complex relationship between biological adaptations (such as bipedalism or brain size) and cultural adaptations (such as tool use) since the development of the first stone tools, around 2.6 million years ago, in East Africa. The appearance of stone tools in the archaeological record was a major cultural adaptation that gave our ancestors the ability to manipulate their environment – a process that led to ever more complex behavioural innovations and one that has continued ever since.

Our earliest hominin ancestors were very different in appearance to us – if still extant, they would look to us very “ape-like”. This was the case for approximately five million years until the appearance, two million years ago, of Homo ergaster – the first human and the first to rely heavily on cultural adaptations rather than biological ones. The process of human evolution from two million years ago onwards was one of relatively small-scale biological changes in tandem with massive and far-reaching cultural development. It was the development of cultural adaptations that provided the basis for our evolutionary success and produced the current genetic pattern that Jones describes. The use of cultural adaptations, such as fire and clothing, removed the need for biological adaptation and meant that the basic body plan of the genus Homo has remained relatively unchanged. Although there are of course differences between species such as Homo ergaster, the Neanderthals and Homo sapiens, the factor that unites us all is the role that cultural innovation played in allowing a wide range of habitats to be exploited without the need for biological adaptation. Massive increases in intelligence, a biological process, provided the raw material for a huge range of cultural adaptation and environmental manipulation.

Johnjoe McFadden argues that Jones’s view is incorrect and that genetic engineering means we will soon be entering a period of evolutionary dynamism – a result of being able to tweak our genome to, for instance, remove cancer and other genetic diseases. It is an interesting idea, but it overlooks the fact that many of these diseases take hold later in life, after reproduction, and as such could be argued to be relatively selectively neutral. Genes are not the only factor; the environment plays a huge role in our makeup. Simply put, genetic predisposition to heart disease is not the same as heart disease!

Ultimately, genetics is only part of the story of human evolution. While the process of genetic evolution is clearly slowing down and we are no longer subject to the widespread effects of natural selection, cultural evolution continues to play a crucial role in the development of the human species as it has done for nearly 2.6 million years. The pace of cultural adaptation is still moving rapidly and producing a greater range of variation than at any other time in our evolutionary history. We may be heading for a homogenous genetic future but the human evolutionary story tells us that our culture will continue to evolve and flourish as long as humans are around.

Dem Bones

21/12/2012

Originally published by the Guardian 7th March 2007

http://www.guardian.co.uk/profile/simonunderdown

One of the saddest but often untold stories of the 18th and 19th centuries was the huge loss of human life, and diversity, as European empire builders spread “civilization”. Tragically, this “civilization” took the form of enforced western modes of behaviour and all too often the extermination of populations that were considered troublesome or occupied regions rich in valuable natural resources. Within a relatively short space of time whole ways of life were wiped out: millennia of rich human diversity were gone forever.

Most of the indigenous populations that suffered had traditions of oral history, and as they died so too did their histories. Pockets of indigenous people remain, but the process of “Mac-Disneyfication” continues unabated. What we are left with, however, are the skeletal remains of indigenous groups that were collected during the same period, albeit in ways abhorrent to modern standards, now residing in museums.

In many cases these skeletons are the last representatives of populations that were utterly destroyed. Analysis of these skeletal remains by scientists, always with extreme reverence and respect, gives us a last chance to better understand the story of humanity. Human skeletons are an invaluable source of information for the understanding of recent human evolution; or how we came to be who we are. The data that can be collected ranges from sex, age at death and disease right through to dietary make-up and DNA profiles.

However, over the last few years there have been increasingly vocal calls from minority groups for the repatriation and reburial (ie destruction) of many of the remains held in British institutions. These bones allow every single human being to better understand our shared history. Genetic analysis of human bones has shown that we are a very closely related species of primate. Surface differences, like skin colour, are insignificant when seen in the light of how recently we evolved in Africa (circa 200,000 years ago), demonstrating how little genetic difference exists between us all. Analysis of these human bones has done so much to show up the flaws in racist arguments.

Institutions that hold collections of human skeletons are generally happy to work with those seeking repatriation of remains and simply ask that they be allowed to carry out tests first. The research that anthropologists carry out is generally non-invasive, and when samples are collected they are usually very small. DNA analysis or radiocarbon dating can be performed with a single tooth. Groups that seek repatriation try to demonise those of us who work with human skeletal remains, recently stating that these experiments were nothing more than “mutilation” and were causing “torment to the souls of the dead”.

The most vocal calls for destructive repatriation generally come from groups who disagree with the story scientific analysis presents, a story that often sharply contrasts with indigenous creation myths (which are so central to land claims). Preventing the analysis of these bones through reburial won’t just stop the current generation from better understanding our global history but will make it impossible forever. Can one generation of people be allowed to rule this out for all that come after them?

An Evolving Tale

21/12/2012

We should not underestimate the importance of climate change – humans are part of the environment, not masters of it.

Originally published by the Guardian 3rd April 2007

http://www.guardian.co.uk/profile/simonunderdown

There really seems to be very little doubt that human activity is responsible for climate change: atmospheric concentration of CO2 (a major cause of global warming) is now significantly higher than at any time over the last 600,000 years. The start of this massive increase coincides very closely with the genesis of the industrial revolution.

We should be worried about the effects of climate change, not just because of the short-term problems it will undoubtedly lead to, but also because of the long-term issues we can only guess at. The media is full of speculation about the effect of relatively short-term climate change, such as rising sea levels and desertification.

Yet it is worth examining just how powerful a hold climate really has on our species from an evolutionary perspective. It would not be going too far to say that climate change has been one of the major factors in human evolution (the other is, of course, technology). A drop in global temperature during the Miocene epoch, approximately 8-10 million years ago, resulted in the fragmentation of the large African forests, which in turn led to the development of savannahs (wide open grassland). It was this incidence of climate change that seems to have kick-started human evolution.

Around 7 million years ago our early ancestors ventured out of the forests and onto the savannah, slowly adapting to this new environment (while the ancestors of chimpanzees stayed within the forests). The key adaptation caused by this shift in habitat was that our ancestors began to walk on two legs (bipedalism), probably to reduce the surface area exposed to the sun. This left the hands free to do other things, aiding the development of stone tools, which could be used to scavenge and butcher meat, which in turn provided energy for bigger brains. Without that change in the global climate, it is fair to suggest that we might not have become the species we are today.

Human evolution continued to be highly influenced by the environment over the next 5 million years, but this changed dramatically around 2 million years ago when our evolutionary ancestor, a species called Homo ergaster, first started to significantly manipulate its environment. Over the last 2 million years we have been gradually lessening the hold that climate has on us, but never removing it. The extinction of the Neanderthals around 30,000 years ago seems to have been closely related to climate change. Our own species, Homo sapiens, has been able to populate almost every area of the planet, using technology to exploit areas our biological make-up would not be able to cope with.

The process has now come full circle: the environment had a massive impact on our evolution, we evolved strategies to reduce that impact, and those technological innovations have now caused the environment to start moving beyond our control once again. The lessons from our evolutionary past are very clear: humans are part of the environment, not masters of it.

Race Against Time

21/12/2012

Evolution isn’t making people in different parts of the world more distinct. There are no human races, just the one species: Homo sapiens

Originally published by the Guardian 12th December 2007

http://www.guardian.co.uk/profile/simonunderdown

Race is one of the most misunderstood terms in modern science, misused by seasoned scientists and laymen alike. Put simply, there are no human races, just the one species: Homo sapiens. The idea of human races is a totally artificial concept, a sloppy form of shorthand that refers to an ill-defined mish-mash of surface differences, such as skin colour (probably controlled by a small number of genes), as well as different cultural practices, especially religious ones. Humans have an innate need to define and categorise, but race is a dangerous and outmoded idea that just can’t keep up with modern science.

The concept of different human races is an old one. From the 19th century onwards, Darwinian ideas of natural selection were misused to justify erroneous concepts of Victorian racial superiority and nationalism. To still talk about separate human races in the 21st century is at best misguided and at worst woefully ignorant of biology.

Our own species is remarkable for our lack of genetic variation. The eruption of the supervolcano Toba approximately 74,000 years ago is thought to have wiped out much of our genetic diversity by causing the extinction of many human groups. All of the differences that we now see in humans are a mixture of small genetic variations, built up over time, and of environmental effects. The Masai and the Inuit have almost identical genes, but the differences in their environment have greatly influenced how those genes are expressed, producing different outward appearances.

Yet a recent study continues to prop up this sick old man of biology, suggesting that “human races” in different parts of the world are becoming genetically more distinct. The fact that we are one species does not mean that we should not expect variation between populations, especially ones separated by large distances. Differences do exist, but the shared similarities are far greater. We all remain Homo sapiens, but the outward and genetic differences we see between populations are retained because of sexual selection and assortative mating, the simple principle that like attracts like. Similarly, the idea that we will all end up looking the same given enough time is just as flawed as the idea of human races.

The study of human evolution has done much to show up the fallacy of separate human races. Indeed, when we examine the work carried out on DNA from Neanderthal fossils (a separate species), huge areas of shared genetic information emerge, not least the FOXP2 or “speech” gene, which is identical in humans and Neanderthals. If so little variation exists between two species that last shared a common ancestor over 500,000 years ago, then how comfortable can we be with the idea of separate human races today? Surely it is at last time to put away the idea of different races, celebrate our cultural differences and warmly embrace what makes us all Homo sapiens.

Evolving a Belief in God

21/12/2012

Originally published by the Guardian 9th June 2008

http://www.guardian.co.uk/profile/simonunderdown

The idea that humans are in some way special or set above all other species is an old one. Creation mythologies frequently see humans given dominion over the whole world as a result of recognising the god figure. The theory of evolution undermines this concept of superiority by demonstrating that humans are subject to the same evolutionary pressures as all other living things, hence the antipathy between evolutionary science and religious believers. However, as discussed from a religious perspective by Joanna Collicutt in her recent article, research in cognitive neuroscience suggests that religious belief is “hardwired” into our brains, through a desire to attach agency and purpose to inanimate objects and the most impersonal forces.

From an evolutionary perspective, the idea that a belief in God might be hardwired into the brain is as intriguing as it is problematic. Humans are a relatively recent evolutionary phenomenon. Our story goes back a mere 7m years and our own species, Homo sapiens, only appeared 200,000 years ago in Africa. We are not even the only species to which the term “human” can be applied. Compelling evidence suggests that 2m years ago Homo ergaster, an African ancestor, was caring for the terminally sick, while the Neanderthals wore clothes, made jewellery and buried their dead (possibly with medicinal plants adorning the corpse) – sound familiar? Although there are some anatomical and behavioural differences that mark Homo sapiens as different from preceding human species, the similarities are far greater. The archaeological record points towards the gradual development of ever more complex human behaviours over millions of years, so when, if ever, did this predisposition to religious belief appear in the evolutionary process? Can we really accept the idea that a belief in God was actively selected for by natural selection in Pleistocene Africa?

The simple answer is that it was not belief in God that was being selected for, but rather intelligence, imagination and empathy: just because for the past few thousand years we have used our brains to do something does not mean that is why the capacity appeared in the first place. Our massive intelligence, and in turn our capacity for creating gods, was most likely the result of needing to manipulate and control our interactions with each other, with natural selection in turn favouring larger and larger brains. To use a computer analogy, our brain has almost limitless spare processing power, which can be put to millions of different uses. The creation of religions is simply one of these tasks, just as music and engineering are others. We are not hardwired to have religious thoughts; to imagine otherwise detracts from the simplicity and beauty of the evolutionary process. We simply have limitless imagination – not so much a gift from God as a realisation born of Darwinian thought.

The Misuse of Darwin

21/12/2012

The idea that Darwin is to blame for high school massacres and far-right politics is a huge intellectual mistake

Originally published by the Guardian 12th November 2009

http://www.guardian.co.uk/profile/simonunderdown

For evolutionary scientists there is no such thing as “Darwinism”. Instead we have a scientific theory that, in combination with Mendel’s work, provides the modern or neo-Darwinian synthesis, which explains the development of life on Earth. Although this is a rather succinct definition, it effectively sets the limits of the usefulness of Darwin’s theory. However, in the last 150 years there have been many attempts to take Darwin’s idea and apply it outside of the context for which it was developed, hence the influence of social “Darwinism” on concepts such as eugenics and a more recent Darwinian nihilism that absolves the individual of any moral or social responsibility. There is an inherent danger in extrapolating science beyond the realm for which it was intended, but ironically this human trait is perhaps best understood as an evolutionary hangover from the development of our massively expanded brainpower. We have an innate need to expand and develop ideas in order to explain our wider existence or justify our behaviours.

This inherent danger of using Darwin’s theory outside of its biological context has led to attempts to portray Darwin as the de facto cause of 20th century genocide: see, for example, Andre Pichot’s book The Pure Society. There is a fallacy at the core of this line of thinking – can scientists really be held responsible for what is done with their ideas when they are misunderstood and corrupted by groups such as the Nazis? I would argue that they cannot: the actions of criminals do not need such highbrow justification, and trying to provide it merely lends a pseudo-scientific veneer to the actions of the Third Reich.

A newer and perhaps more insidious attempt to blame “Darwinism” for human atrocity comes in the form of Dennis Sewell’s book The Political Gene: How Darwin’s Ideas Changed Politics. Sewell cites Darwin’s work as the reason for the development of something that he broadly categorises as a form of moral detachment from societal rules and norms: evolution is random and without purpose, therefore I can do whatever I please. He argues that this moral vacuum can lead to disturbed teenagers perpetrating horrific crimes such as the Columbine school massacre. Sewell does not propose that Darwin’s theory leads inevitably to such actions; he does, however, suggest that some of Darwin’s other writings were racist and not in keeping with modern views. This is hardly a stunning revelation: Darwin was a man of his time and of his society. Sewell is making a common mistake in grafting the faults and flaws of Darwin the man onto Darwinian evolution. Darwin the man has been venerated and condemned during the 2009 celebrations – surely it is now time to move on from either hero worship or iconoclasm to a more nuanced view, just as evolutionary biology has developed since 1859.

An interesting parallel can be seen in how Islamists subvert the essentially peaceful message of Islam into a justification for violence and vitriolic hate. One can no more blame the actions of misguided Islamists on Muhammad than the Nazis or high school shooters can be blamed on Darwin. Humans have a tremendous capacity for selflessness and creativity, but we also have an equally developed ability to cause destruction and misery. Both extremes are a result of our evolutionary heritage. If we blame Darwin for the dark side of human nature, logically we must also credit him with all that is good.

Teach the Bigger Story of Science

21/12/2012

Children have so much curiosity about the natural world, but the current school curriculum drains away their enthusiasm

Originally published by the Guardian 17th February 2010

http://www.guardian.co.uk/profile/simonunderdown

A gulf seems to exist between our natural curiosity about the world around us and the popularity of science at university level in Britain. Scientists have such heated arguments because we are so passionate about our fields. Yet many school students seem to dislike the subject. Why are so many young people apparently bored by science?

Small children frequently develop near obsessions with aspects of science, be they dinosaurs, insects or aeroplanes. So where does this fascination go? No one would deny the need for standards and benchmarks in education, but the process that began with the national curriculum is eroding the preparedness of students to cope with university science education.

The “Google generation” is taught in bite-sized chunks throughout their school lives. When they go to university, this teaching method lets them down. This is not the fault of students or teachers, but the nationally imposed criteria that all schools must fulfil. The way that school science curricula are designed primarily to meet testing benchmarks saps them of flexibility and the time for practical experimentation – the bedrock of any enriching science teaching.

The majority of lecturers in higher education would agree that the unprecedented rise in A-level grades is not the result of an unexplained increase in teenage intelligence: rather the nature of the questions has changed, and expectations seem to have been lowered. That means that universities are increasingly spending time addressing the science basics that 10 years ago were taken as read. This not only wastes time but prevents students from developing the deep analytical skills that employers now bemoan the lack of.

It is important to note that the students themselves are blameless: they can only take the tests they are given. Bored students switch off and find themselves unable to appreciate the material presented to them or to understand the research of academics. This places pressure on universities – should they adapt (in other words, dumb down) or maintain standards and risk losing students to softer subjects?

But it would be a mistake to substitute style for substance when it comes to science teaching. The Conservatives’ policy that only those with the best degrees should be allowed on to PGCE courses, while superficially intellectually satisfying, does not offer a solution. The best teachers are not necessarily those who have amassed the most knowledge or excel in examinations – enthusiasm, creativity and charisma are just as important and cannot be measured in degree classifications. It isn’t teachers that are the problem; it is what they are required to teach.

Take my own specialism, evolution, a fascinating subject that arouses strong opinions – including outright hostility – yet its teaching in schools can lack relevance and engaging examples. That old stalwart the peppered moth, though a fascinating creature, does lack something in the excitement stakes. Far better to use examples that are both relevant and inspiring, such as MRSA’s evolutionary tricks to resist treatment or the role that meat eating played in human brain expansion and intelligence.

Rigid adherence to the same old examples makes for boring lessons and unmotivated students (not to mention teachers). Perhaps if bite-sized subject syllabi were to be replaced with broader subject descriptions that rely on linking well-developed core principles, we could develop a much wider range of illustrations and examples to really motivate students. The downside would be more work for exam boards, and of course teachers (but also the opportunity for greater creativity and flexibility): surely a price that would willingly be paid for the resurrection of science education in the UK?

Of course scientists can always improve the way we present our work to the public, but well-taught, well-designed science curricula that have the freedom to be difficult and exciting will go a long way to harnessing and developing the fascination that children have with science. That can only benefit the next generation of potential scientists and society at large.