
Death: a special report on the inevitable



New Scientist, October 22 2012

The only certain thing in life is that it will one day end. That knowledge is perhaps the defining feature of the human condition. And, as far as we know, we alone are capable of contemplating the prospect of our demise. In these articles we explore the implications: the shifting definition of death, how knowing that we will die gave birth to civilisation, the grim reality of decomposition and whether it makes sense to fear death. But first, when did we become aware of our own mortality?

You may have heard that there are more people alive today than have died in all of human history, but this is not true. According to calculations by demographer Carl Haub, about 107 billion people have been born up until now, and 100 billion of them are dead. You are among the 6.5 per cent of humanity that is still alive. For now, anyway...
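Haub's headline figures can be sanity-checked with a line of arithmetic. A minimal sketch, using the rounded numbers quoted above:

```python
# Rounded figures from Haub's estimate (as quoted in the text):
# ~107 billion humans ever born, ~100 billion of them now dead.
ever_born = 107e9
dead = 100e9

alive = ever_born - dead                # roughly 7 billion still living
share_alive = alive / ever_born * 100   # as a share of everyone ever born

print(f"{share_alive:.1f} per cent of humanity is still alive")
```

The result rounds to the 6.5 per cent cited, so the living are a small minority of everyone who has ever existed.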

Memento mori: it's time we reinvented death


The knowledge that we will die profoundly shapes our lives – but the nature of death itself is elusive and changeable

IT'S said that when a general returned in glory to ancient Rome, he was accompanied in his procession through the streets by a slave whose job it was to remind him that his triumph would not last forever. "Memento mori," the slave whispered into the general's ear: "remember you will die". The story may be apocryphal, but the phrase is now applied to art intended to remind us of our mortality - from the Grim Reaper depicted on a medieval clock to Damien Hirst's bejewelled skull.
As if we needed any reminder. While few of us know exactly when death will come, we all know that eventually it will. It's usual to talk about death overshadowing life, and the passing of loved ones certainly casts a pall over the lives of those who remain behind. But contemplating our own deaths is one of the most powerful forces in our lives for both good and ill (see "Death: Why we should be grateful for it") - driving us to nurture relationships, become entrenched in our beliefs, and construct Ozymandian follies.
In this, we are probably unique. Most animals - elephants are a notable exception - seem to have hardly any conception of mortality: to them, a dead body is just another object, and the transition between life and death unremarkable. We, on the other hand, tend to treat those who have passed away as "beyond human", rather than "non-human" or even "ex-human". We have developed social behaviours around the treatment of the dead whose complexity far exceeds even our closest living relatives' cursory interest in their fallen comrades. Physical separation of the living from the dead may have been one of the earliest manifestations of social culture (see "Death: The evolution of funerals"); today, the world's cultures commemorate and celebrate death in ways ranging from solemn funerals to raucous carnivals.
So you could say that humans invented death - not the fact of it, of course, but its meaning as a life event imbued with cultural and psychological significance. But even after many millennia of cultural development, we don't seem to be sure exactly what it is we've invented. The more we try to pin down the precise nature of death, the more elusive it becomes; and the more elusive it becomes, the more debatable our definitions of it (see "Death: The blurred line between dead and alive").
And those definitions matter, because they are the only way we have of rationalising our otherwise illogical fear of death - a fear that's probably the most widespread phobia on Earth (see "Death: Don't fear the reaper"). Most of us would wish for a peaceful death after a long and well-lived life. Of course, not all of us get our wish. For some, death comes sooner than we would like, and that's one reason to fear it. Only recently has it become commonplace for death to come later than we would like. Death can now be deferred by mechanical and medicinal means for days, weeks, months or years - and that brings with it fears of its own: of impotence, dependency and pain. Nothing in the way our societies are constructed is at all suited to this new situation.
So perhaps it is time for humanity to reinvent death, 3 million years or more after our first intimations of it. Indeed, the job is already underway: the proliferation of new types of death - industrial, vehicular and biochemical - has led to correspondingly complex legal codes. And there are those who seek to redefine death still further, by freezing their heads or replicating their minds outside their bodies - all to reify our long-held notions of passing beyond humanity.
Such projects may seem outlandish. But even for sceptics, the idea of greatly deferring or even defying death outright is worth deep and sincere reflection: in thinking about death, we are also thinking about life.

The evolution of funerals


When did our ancestors become aware of their own mortality? The answer may help us understand the origin of our unique way of life, says Graham Lawton
PANSY died peacefully one winter's afternoon, her daughter Rosie and her friends Blossom and Chippy by her side. As she lay dying her companions stroked and comforted her; after she stopped breathing they moved her limbs and examined her mouth to confirm she was dead. Chippy tried twice to revive her by beating on her chest. That night Rosie kept vigil by her mother's side.
Pansy's death, in December 2008, sounds peaceful and relatively routine, but in fact it was highly unusual. Captive chimpanzees are rarely allowed to die at "home"; they are usually whisked away and euthanised. But the keepers at Blair Drummond Safari and Adventure Park in Stirling, UK, decided to let Pansy stay with her loved ones until the last so that their response to her death could be observed.
It is hard not to wonder what was going on in the minds of Rosie, Blossom and Chippy before and after Pansy's death. Is it possible that they felt grief and loss? Did they ponder their own mortality? Until recently these questions would have been considered dangerously anthropomorphic and off-limits. But not any more.
The demise of Pansy is one of many recent observations of chimpanzee deaths, both in captivity and the wild, that are leading to surprising insights about our closest living relatives' relationship with death. This, in turn, is opening up another, deeper, question: at what point in human evolution did our ancestors develop a modern understanding of death, including awareness of their own mortality? The answer goes much wider than our attitude to death - it may help us to better understand the origin of our unique way of life.
As far as most animals are concerned, a dead body is just an inanimate object. Some species have evolved elaborate-looking behaviours to dispose of bodies - mole rats, for example, drag them into one of their burrow's latrines and seal it up - but these are practical acts with no deeper purpose or meaning.
Some non-human animals, though, clearly have a more complex relationship with death. Elephants are known to be fascinated with the bones of dead elephants, while dolphins have been observed spending long periods of time with corpses.
No animal, though, arouses interest as much as chimps do. Psychologists James Anderson and Louise Lock from the University of Stirling, who recorded Pansy's death, point out that her companions' responses were "strikingly reminiscent of human responses to peaceful death", including respect, care, testing for signs of life, attempts to revive, vigil, grief and mourning.
Similar things have been seen in the rare occasions that death has been observed among wild chimps. Primatologists Alexander Piel of the University of California, San Diego, and Fiona Stewart of the University of Cambridge witnessed just such an event in Gombe national park in Tanzania in January 2010. Early one morning, rangers discovered the body of a female chimp, Malaika, who had apparently fallen out of a tree.
When Piel and Stewart arrived at 9.15 am there was a crowd of chimps around Malaika's body. For the next three and a half hours the pair observed and filmed the scene as a succession of chimps visited the body, while others observed from the trees. Some seemed merely curious, sniffing or grooming the body. Others shook, dragged and beat it as if in frustration and anger. Dominant males performed displays of power around it or even with it; the alpha male threw it into a stream bed. Many made distress calls.
When the body was finally removed by rangers, eight of the chimps rushed to where it had lain and intensively - and excitedly - touched and sniffed the ground. They stayed for 40 minutes, making a chorus of hooting calls before moving off. The last chimp to visit the spot was Malaika's daughter Mambo.
What are we to make of this? According to Piel, the chimps' behaviour can be classified into three categories: morbidity, or intense interest in the body, mourning and "social theatre". And as with Pansy's death, these are very reminiscent of how we behave.
"The danger is to anthropomorphise, but much of this behaviour is still practised by modern humans," says Paul Pettitt, an archaeologist at the University of Sheffield, UK, who studies the origins of human burial. "We see in chimps very simple behaviours that have become elaborated into more formal expressions of mourning. It gives us a feel for what we might expect to have been practised by Miocene apes and early protohumans."
We will never know for sure, of course. But the fossil and archaeological record contains tantalising hints of how this kind of behaviour evolved into modern rituals. And this has become a major question in palaeoanthropology. Our treatment of the dead clearly falls into the category of "symbolic activity", akin to language, art and the other things that make modern humans unique. These were all thought to have emerged around 40,000 years ago, but recent discoveries have tentatively pushed this back to 100,000 years or more.
Anything resembling mortuary practices predating 40,000 years ago used to be dismissed as an artefact. But not any more, says Francesco d'Errico of the University of Bordeaux in France. "Most archaeologists now accept that modern humans, Neanderthals and possibly other archaic hominins were engaged in mortuary practices well before 40,000 years ago."

Hominids on a hillside

The earliest signs are very old indeed. In 1975, on a steep grassy hillside near Hadar, Ethiopia, palaeontologists discovered 13 specimens of our 3.2 million-year-old ancestor Australopithecus afarensis - nine adults, two juveniles and two infants - all within touching distance of one another and apparently deposited around the same time. How they got there is a mystery. There is no evidence of a flash flood or similar catastrophe that could have killed all of them at once. There is no sign that the bones had been chewed by predators. They are, as discoverer Donald Johanson later wrote, "just hominids littering a hillside" (see diagram).
Last year, partly in light of chimp research, Pettitt proposed a new explanation: the bodies were left there deliberately in an act of "structured abandonment". That doesn't mean burial, or anything with symbolic or religious meaning. "It was probably just the need to get rid of a rotting corpse," says Pettitt. Even so, it represents a significant cognitive advance over what is seen in chimpanzees, who leave their dead where they fall - perhaps the first stirring of something human. "It could be recognition that the appropriate place for the corpses is not among the living - a first formal division between the living and the dead," says Pettitt.
Barring new discoveries it will be impossible to confirm that australopithecines deposited their dead in a special place. But by half a million years ago the evidence is much clearer.
Sima de los Huesos - the pit of bones - was discovered in the 1980s at the bottom of a limestone shaft in a cave in the Atapuerca Mountains of northern Spain. It contained the remains of at least 28 archaic humans, most likely Homo heidelbergensis, a probable ancestor of both Homo sapiens and Neanderthals.
How did they get there? An obvious possibility is that they accidentally fell down the shaft, but that seems unlikely from the way the bones fractured. "It doesn't look like a natural accumulation," says Pettitt. Most of the skeletons are adolescent males or young men, and many show signs of bone disease or deformity.
According to Pettitt the best explanation is that they were deliberately placed at the top of the shaft after death and then gradually slumped in. If so, this is the earliest evidence of funerary caching, or the designation of a specific place for the dead - perhaps, in this case, for deformed outcasts - a further advance towards the modern conception of death. Once you have designated places for the dead you are clearly treating them as if they still have some kind of social agency. "Once you've reached that point you're on the road to symbolic activity," says Pettitt.
What did these protohumans understand about death? Did they know that they themselves were mortal? Did they have a concept of an afterlife? "We haven't got a clue," says Pettitt.
What we do know is that funerary caching became increasingly common: bodies are found in places that are hard to account for any other way, tucked into fissures and cracks, in hard-to-reach overhangs or at the back of caves.
From funerary caching it is a short conceptual leap to burial - creating artificial niches and fissures to stash the dead. The earliest evidence we have of this is from two caves in Israel - Skhul and Qafzeh - where the skeletons of 120,000-year-old Homo sapiens were found in what are clearly human-made hollows. There is also evidence of Neanderthal burials from around the same time. All this adds to the evidence that humans were on their way to a symbolic culture much earlier than we thought. "Once you start getting deliberate burials I think it's much more likely that people are thinking in formalised terms, things like life after death," says Pettitt.
Even so, these burials do not represent a point of no return. Only a handful of such sites are known; compared with the number of people who must have died they are incredibly rare. It appears that burial was for special occasions; most dead people were probably still cached or abandoned.
It was not until about 14,000 years ago that most people were buried in what we would recognise as cemeteries. Around the same time people were settling in one place and inventing agriculture and religion - it is probably no coincidence that the world's oldest ceremonial building, Göbekli Tepe in Turkey, was built at that time.
Well before that, however, archaic humans appear to have had a concept of death not unlike ours. Art, language, elaborate funerary practices - they are just expressions of the same thing, says Pettitt. "It's part of what distinguishes us not only from other animals but from every other type of human that's gone before."
Graham Lawton is deputy magazine editor of New Scientist


Death: The blurred line between dead and alive

    by Dick Teresi
    It's now easier than ever to be declared dead – even when you're still moving, sweating, and there's blood pumping around your body
    IT IS now easier to be declared dead than at any time in human history. The standards have fallen so low that your heart can be beating, your brain can be sending out brainwaves, and the doctor can still declare you an ex-person. The good news: only about 1 per cent of the population is subject to minimal death criteria. The bad news: if you fall into this 1 per cent, you may be vivisected.
    But we're getting ahead of our story.
    The question "When is a person dead?" has troubled us for thousands of years. It is not a trivial matter, especially to the person about to be buried or cremated. So we look for what we believe to be foolproof clues. Is there a central organ that when it stops functioning means a human is dead? Is there a set of behaviours that signals with certainty that a human has shuffled off this mortal coil, kicked the bucket, expired?
    In ancient Egypt the buck stopped at the embalmer. The ancient Greeks knew that many conditions mimicked death. Their test was to cut off a finger before cremation.
    Medieval Europeans became increasingly uncertain about who was dead and who was alive as the literature began to fill with accounts of premature burial. The difficulties were underscored by the anatomy theatres that sprang up across Europe in the 1500s to the 1700s, where anatomists would perform public dissections on executed prisoners. The performances sometimes demonstrated that the stars of the show were not quite dead. An anatomist might extract a heart, hold it aloft, and be greeted with gasps because it was still beating. One anatomist, Niccolò Massa, asked to be left unburied for two days "to avoid any mistake".
    The 18th century saw the beginning of two important trends. First was the medicalisation of death. Doctors began to appear at the bedside of the dying to administer opiates, and as the boundary between death and life became more confused, medical technologies were introduced to tell the difference. During the next two centuries, innovations were developed that revealed signs of life in those previously thought to be deceased: artificial respiration, smelling salts, electric shocks, the stethoscope, microphones to amplify chest sounds, radiographic fluoroscopy to detect the motion of vital organs, and the ophthalmoscope to examine the circulation of blood in the retina.
    The second important trend in this era was a shift from what today we would call cardiopulmonary death towards brain death. There was no such term as "brain death" then, but doctors talked about "sensation" and "will" as the measure of a human.
    The concept of brain death played a major role in one of the most extraordinary medical advances of the 20th century. In 1954, surgeon Joseph Murray performed the first successful solid organ transplant, transferring a kidney between living identical twin brothers. It was not long before organs were being transplanted from dead donors into living ones.
    This remarkable technology promised to save lives, but it faced a major problem: stale organs. You can use live donors for kidney transplants because people have two kidneys and can lope along on one. But for other organs, you need a dead donor. When a person dies, however, organs are deprived of oxygen.
    In 1968, a team of 13 men formed the Ad Hoc Committee of the Harvard Medical School to Examine the Definition of Brain Death, and devised a clever plan to solve the problem. Why not declare dead some of the patients on ventilators in intensive care units, and harvest their organs? These patients were in a deep coma but not dead. Their hearts were still beating. If the ventilator was kept in place even after they were declared dead, their organs would continue to be bathed in blood right up to the moment the surgeons needed them. Voilà.
    That's precisely what the Harvard committee did. It defined a second form of death, what one doctor calls "pretty dead." Up to that point, doctors had used the cardiopulmonary standard: when your heart stopped beating and you stopped breathing, you were dead. Now there was "death lite", created for the benefit of the transplant industry.
    The original Harvard criteria were frighteningly simple, requiring a test shorter than an eye exam. The patient must simply be "unreceptive", showing "no movements" and "no reflexes". Rudimentary clinical tests determined this, such as ice water in the ears, a flashlight in the eyes, cotton swabs touched to the eyeball or reflex tests (JAMA, vol 205, p 85).
    Then comes an "apnea test". The ventilator is disconnected and the doctors see whether the patient can breathe unaided. If not, he is brain dead. Here's the scary part. Then the ventilator is reconnected. People talk about "pulling the plug", but the opposite happens. Few people realise this. Nor do they realise that their "do not resuscitate" orders or living wills no longer have legal sway. Once declared brain dead, you are legally dead and your legal rights go down the drain.
    The Harvard criteria ran into trouble almost immediately. The committee did no patient studies, and cited none. In the early 1970s, two studies on actual patients showed that the brains of "brain-dead" people were not always dead. The Harvard tests only indicate whether the brain stem is dead, not the neocortex, the part of the brain where consciousness is most likely seated.
    The Harvard criteria did, however, specify a test to make sure this part of the brain was also nonfunctioning: an EEG. What patient studies showed was that some of these otherwise brain-dead people were producing brainwaves on the EEG. If the brain was dead, what was waving? This problem was easily solved: doctors were told to skip the EEG.
    Then in 1981 came the US Uniform Determination of Death Act. It declared that brain death was legal death. The act stated that the "entire brain" must be dead, but left exam techniques to the doctors, who rarely test the cortex.
    Even these low standards proved not low enough. Doctors noticed that some brain-dead organ donors ("beating-heart cadavers" in the parlance) were moving about slightly, and exhibiting reflexes. They were violating two of the Harvard criteria: no movement and no reflexes. In the US, this was easily solved by changing the standards. In 1995, the American Academy of Neurology stated that you could move about somewhat and display reflexes and still qualify as brain dead.
    In 2000, The Lancet published a study of 38 brain-dead patients, 15 of whom were still moving in the first 24 hours after being declared dead (vol 355, p 206). Another study of 144 beating-heart cadavers found that 79 had retained their reflexes after death (Journal of Neurology, vol 252, p 106). One doctor advises hospitals not to let the families of brain-dead donors see their loved ones after death is declared for fear they'll see these movements.
    Through more than 4000 years of history, we have learned that human life is tenacious, and many signs of death are misleading. Yet today, we dissect for their organs patients who in any era before 1968 would be considered very much alive.
    Keep in mind, though, that only about 1 per cent of the population is declared dead based on brain-death criteria. And if you are not an organ donor, it won't matter. The ventilator will not be reconnected, and you will be allowed to die a normal cardiopulmonary death, because morticians will not embalm or bury a brain-dead body. They are not idiots.

    Death: Why we should be grateful for it

    Congratulations – as a human, you know you're going to die. That's why you've learned some impressive cultural and psychological techniques to cope
    DEATH gets a bad press. Invariably the unwelcome visitor, arriving too soon, he is feared and loathed: "the last enemy" in the words of the Bible.
    But a few poets and philosophers throughout history have argued that without death we would be at a loss. It's the prospect of his coming that gets us out of bed in the morning and drives us to great deeds. Now a growing body of evidence from social psychology suggests that these thinkers are right. People might dream of a deathless civilisation, but without death, there would barely be a civilisation at all.
    The story begins with the awareness of our mortality. Like all living things, we struggle to survive. Yet unlike other creatures - as far as we know, anyway - we live with the knowledge that this is a struggle we are bound to lose. Our mighty brains, so good at inferring and deducing, tell us that the worst thing that can possibly happen surely will, one day. We must each live in the shadow of our own apocalypse.
    That isn't easy. Indeed, it is terrifying and potentially paralysing. So we work very hard to stave off death, to defy it for as long as possible or deny it altogether. All this frantic defiance and denial result in some of our greatest achievements.
    This is perhaps most obvious when considering humanity's material progress: agriculture, for example, was invented to give us the food we need to live. Clothes and buildings keep us warm and give us shelter, weapons allow us to hunt and defend ourselves, and medicine heals our sicknesses. The great majority of the material innovations that make up our civilisation are in essence life-extension technologies that we have been driven to invent by the spectre of oblivion.
    Of all these achievements, perhaps the greatest is science. This, too, has always been motivated by the fear of death. Francis Bacon, the father of empiricism, described indefinite life extension as "the most noble goal". He sacrificed his own life to the cause, dying of pneumonia contracted while attempting an experiment in cryopreservation involving a chicken and some snow. Science is the business of self-aware mortals - the gods would have no need of biochemistry.
    Despite the best efforts of science and technology and the very real improvements in life expectancy that they have achieved, the terrifying prospect of death still hangs over us. That is why humans invented culture as well as material civilisation. Many thinkers, from Georg Hegel to Martin Heidegger, have suggested that its purpose is to reassure us that even though the body will fail, we will still live on. One scholar in this tradition was the anthropologist Ernest Becker, whose 1973 book The Denial of Death won the Pulitzer prize. It was this work that inspired a group of social psychologists to seek empirical evidence to support the speculations of the philosophers.

    Clinging on

    These researchers - Jeff Greenberg at the University of Arizona, Sheldon Solomon of Skidmore College in New York state and Tom Pyszczynski at the University of Colorado - came up with what they called terror management theory: the idea that most of what we do and most of what we believe is motivated by the fear of death. They surmised that if our world views exist to help us cope with mortality, then when reminded of our inevitable demise, we should cling all the more fervently to these beliefs.
    One of their starting points was religion, a set of belief systems that arguably epitomise our attempts to assuage the fear of finitude. If religions really are offering existential solace, Greenberg, Solomon and Pyszczynski's thinking went, then when death looms, there should be a measurable increase in religiosity.
    Which is just what they found. In one study they asked a group of Christian students to assess the personalities of two people. In all relevant respects the two were very similar - except one was Christian and the other Jewish. The students in the control group judged the two people equally favourably. But those students who were first asked to fill in a personality test that included questions about their attitude to death, and were thus subtly reminded of their mortality, were much more positive about their fellow Christian and more negative about the Jewish person.
    This effect is not limited to religion: in over 400 studies, psychologists have shown that almost all aspects of our various world views are motivated by our attempt to come to terms with death. Nationalism, for example, allows us to believe we can live on as part of a greater whole. Sure enough, Greenberg and colleagues found that US students were much more critical of an anti-American writer after being reminded of their mortality. A further study, by Holly McGregor at the University of Arizona, showed that students prompted to think about death were not merely disapproving of those who challenged their world views, but willing to do violence to them in the form of giving them excessively large amounts of hot sauce (Journal of Personality and Social Psychology, vol 74, p 590).
    These initial studies supported Becker's bleak view that the denial of death is the root of all evil. It causes the creation of in-groups and out-groups, fosters prejudice and aggression, and stokes up support for wars and terrorism. For example, people who were exposed to TV images of planes flying into New York skyscrapers were more likely to support the invasion of Iraq. Terror management theorists initially focused on this dark side. But lately they have come to recognise the positives in our struggle with death.
    For example, one of the most powerful forces shaping human culture is the desire to leave a legacy. Some of the greatest achievements of civilisation can be attributed to this urge, from the pyramids of Egypt to Paradise Lost. Now terror management theorists have demonstrated that, at least among undergraduates in the US, thoughts of death continue to stoke our drive to be remembered.
    Socrates saw this more than 2000 years ago, arguing that much of what men do can be understood as a desperate attempt to immortalise themselves; women, he thought, could take the more direct route of having children. Several studies suggest he was right to see founding a family as a terror management strategy: one showed that German volunteers expressed a greater desire to have children when reminded of death; another that Chinese participants were more likely to oppose their country's one child policy when similarly primed.
    A recent review paper by Kenneth Vail at the University of Missouri and colleagues catalogues the many ways that contemplating mortality can be good for us. For example, it can induce us to live more healthily by exercising more or smoking less (Personality and Social Psychology Review, doi.org/jfg).
    The team also identify an important distinction between conscious and non-conscious death reminders. The latter - subtle or subliminal prompts - tend to cause us to cling unthinkingly to the values of our community. This can be positive if those values are positive, but can also be negative if they induce us to aggressively defend those values against others.
    Conscious death reminders, on the other hand, stimulate a more considered response, leading people to re-evaluate what really matters. The more we actively contemplate mortality, the more we reject socially imposed goals such as wealth or fame and focus instead on personal growth or the cultivation of positive relationships.
    Which suggests we do not yet think about death enough.
    Stephen Cave is a writer based in Berlin and author of Immortality: The quest to live forever and how it drives civilization (Biteback)

    Death: The natural history of corpses

      The human body's final journey might not be pretty but at least it is eventful. Have your fill of the gruesome facts of decay and disintegration

      IT'S NOT a nice thing to contemplate. But set aside the thought of any of what follows befalling you or your loved ones, and what happens to our mortal remains when we are no longer using them is pretty fascinating. If nothing else, it proves that nature is ruthlessly efficient at clearing up its messes.
      At least it can be. Very few people in the modern world get to be dead the old-fashioned way - out in the open, exposed to the elements. Of those that do, the speed at which the body turns to dust depends on a mix of factors including temperature, moisture and the animals, insects and microbes that happen to be there. In a relatively warm and moist spot with plenty of insects and scavengers, a human body can be turned to bones within a few weeks and disappear completely in months.
      But what about the majority of bodies, which get refrigerated soon after death, then embalmed and put in a coffin? Again, it depends. Temperature and moisture are still the most important factors, but numerous others play a part, from how well the body was embalmed to the tightness of the seal on the coffin, the acidity of the soil and that of the groundwater which will eventually seep inside. All of this means that it is impossible to predict how long a particular body's final journey might take - it can be anything from months to decades.
      What we can say, though, is that whatever the timescale, the vast majority of bodies will go through the same stages of decomposition.
      First comes the "fresh" stage. Within minutes of death, carbon dioxide starts to accumulate in the blood, making it more acidic. This causes cells to burst open and spill enzymes which start to digest tissues from within.
      The first visible sign of decomposition comes after half an hour or so, as blood pools in the parts of the body closest to the ground. At first this looks like purplish-red blotches; over the next day or so it turns into an almost continuous purplish mark known as livor mortis. The rest of the body turns deathly pale.
      Around the same time, muscles go floppy and then stiffen as rigor mortis sets in. In life, pumps in the membranes of muscle cells control the amount of calcium ions in the cell - high levels stimulate contraction and low levels allow relaxation. The pumps no longer work after death, so calcium ions diffuse into the cells from the higher concentration outside, causing the muscles to contract.
      Rigor mortis passes after two to three days. But what looks like relaxation is actually rot setting in, as enzymes break down the proteins that held the muscles in their contracted state.
      Embalming the body stops the rot in its tracks, at least temporarily. Unlike ancient Egyptian embalmers, who aimed to keep the body intact for all eternity, modern embalming is designed to make a corpse look presentable and keep it in one piece long enough to organise a funeral.
      This is done by disinfecting the body and replacing the blood and other fluids with a mixture of water, dye and preservatives, usually including formaldehyde. The dye is to restore something resembling a healthy skin tone, while the formaldehyde preserves the body in several ways, first by repelling insects and killing bacteria. It also inactivates the body's enzymes and makes the tissues more resistant to decomposition by adding cross links to the chains of amino acids that make up proteins.
      This protective effect only lasts so long, though, leaving the body more or less back where it started.
      The next stage, putrefaction, gets a little ugly - not to mention smelly - as the enzymes, aided and abetted by microbes, get to work. After 48 hours or so, when enough nutrient-rich fluid has spilled from the burst cells, these microbes spread rapidly. The main beneficiaries are among the 100 trillion bacteria that have spent their lives living in harmony with us in our guts. As they break down proteins they churn out two compounds with names as stinky as their smells, putrescine and cadaverine, and these give a corpse its repulsive odour.
      From the outside, putrefaction can be seen as a green hue, slowly spreading from the front of the belly across the chest and down the body. The green colour comes from the action of anaerobic bacteria, which convert haemoglobin in the blood to sulfhaemoglobin.
      All this bacterial action also creates gases, including hydrogen, carbon dioxide, methane, ammonia, sulphur dioxide and hydrogen sulphide. These contribute to the stink and also distort the body, blowing it up like a balloon and eventually, after a month or so, bursting it open. Hydrogen sulphide also combines with the iron in haemoglobin to make the black-coloured iron sulphide, which turns the skin darker.
      This heralds the start of the third stage: active decay. Decomposition now speeds up and what is left of the flesh is rapidly consumed, until all that remains is the skeleton. Sometimes, though, something else happens too. If the body lies in particularly cold soil, a waxy covering called adipocere, or grave wax, might form. Adipocere is a particularly spooky side effect of the work of some anaerobic bacteria, such as Clostridium perfringens, as they digest body fat. It takes around a month to start forming and can leave the corpse with what looks like a wax coating.
      The final stage - breakdown of the skeleton - takes the longest. For the bones to disappear the hard mineral parts need to be broken down. This happens if they come into contact with acidic soil or water, and speeds up if they are mechanically broken up by tree roots or animals. Once the hard stuff is gone, the body's last proteins, including the collagen that once gave the bones flexibility, succumb to bacteria and fungi and disappear.
      There are some cases where this sequence of events doesn't play out at all and the body doesn't get a chance to decay. If the corpse is kept completely dry bacteria can't do their work and the tissues will mummify. The same goes for bodies that fall into natural preservatives such as bogs, salt marshes or snow, where bacteria don't thrive and the body's enzymes don't work.
      Then there are the rare cases when a person dies in the company of hungry scavengers. In such cases the body can be stripped to the bones and chewed into tiny pieces in a matter of days. The same can happen under the sea.
      Of course, without a bog, dog, shark or icy grave to hand, the only way to avoid the harsh realities of decay is cremation. In a chamber heated to 750 °C the coffin and entire corpse can be burned in under 3 hours. After that, the ashes are passed through a grinder called a cremulator to take care of any particularly big or stubborn bones that haven't completely burned and turn the entire remains into fine ash.
      And that, as they say, is that. It may not be pretty but it's one of the few definites in life: ashes to ashes, dust to dust, in the end there's not a lot left.
      Caroline Williams is a science writer based in Surrey

      Death: Don't fear the reaper


        Most of us are afraid of death, but it doesn't make sense, says philosopher Shelly Kagan
        ONE of the commonest reactions to death is fear. Indeed, "fear" may be too weak a term: terror is more like it. But is fear of death a rationally appropriate response?
        The crucial word here is "appropriate". I don't want to deny that many people are afraid of death. What I want to know is whether fear of death is an appropriate response.
        Under what conditions does it make sense to feel fear? Three requirements come to mind. The first is that the thing you are afraid of has to be bad. I imagine that this is fairly uncontroversial.
        The second is that there has to be a non-negligible chance of the bad thing happening. It is not enough that it's a logical possibility. There is, for example, a chance that you will be ripped to pieces by tigers, but it's negligibly small. If you were to tell me that you are afraid that you will die this way, then I would say that such a fear is not appropriate.
        Condition number three is more controversial: you need to have some uncertainty about whether the bad thing will actually happen, or else how bad it will be. To see the point of this condition, we need to imagine a case where a bad thing is certain to happen, and you know how bad it is going to be. In circumstances like that, fear is not an appropriate response, even though the first two conditions have been met.
        Suppose that every day you come to the office with a packed lunch. For dessert you bring a cookie, and every day somebody steals it. Admittedly not the worst thing in the world, but it's a bad thing. Furthermore, there is a more than negligible chance that your cookie will be stolen tomorrow. So the first two conditions are in place.
        But not the third. It is pretty much guaranteed that your cookie will be stolen tomorrow. The bad thing is certain to happen, and you know how bad it is. In this case, I think, fear doesn't make any sense. In contrast, if the thief strikes at random then you might reasonably be afraid.
        One other point is worth mentioning. Even when fear makes sense, there is a proportionality condition to keep in mind: even if some fear is appropriate, the amount of it might still be inappropriate. When the risk is slight, mild concern may be all that is warranted; the amount of fear needs to be proportional to the size of the bad.
        Armed with these ideas, it might seem that we are now in a position to ask whether fear of death is appropriate. However, we first need to clarify something important: what exactly are we afraid of? There are different ways to answer this question, and depending on which we have in mind, fear may, or may not, be appropriate.
        One thing you might worry about is the process of dying. Insofar as there is some chance that you will die a painful death, there seems to be room for some fear. But I imagine this is not what most people have in mind. What most people mean is that they're afraid of death itself - afraid of what it will be like to be dead. In this case, I think, the conditions for appropriate fear are not satisfied. The main point here is that there is nothing that being dead is like. It involves no kind of experience at all, so it is not intrinsically bad. Thus the first condition for appropriate fear isn't satisfied. (Things might look different if you believe in an afterlife.)
        Of course, I am not suggesting that there is nothing bad about death. On the contrary, I accept the "deprivation" account, according to which death is bad by virtue of the fact that you are deprived of the good that you would have if you weren't dead.
        So perhaps we can specify an appropriate object of fear this way. Instead of fearing what death will be like, perhaps we should fear the deprivation of life. If so, perhaps fear of death is appropriate after all.
        But that's not quite right either. First of all, I believe that immortality would not be good for us; to be condemned to live forever would be a punishment, not a blessing. So fear is not appropriate. More precisely, if what we are afraid of is the inevitable loss of life, then the object of our fear is not bad, but good, and so fear is still out of place.
        However, even if immortality would not be bad, it doesn't follow that fear of death is appropriate. Appropriate fear requires a lack of certainty with regard to the coming of the object of our fear. And I know that I am going to die.
        But now a different possibility suggests itself. Fear of death is inappropriate because death is certain. But what is not at all certain is when you are going to die. Perhaps, then, what we should be afraid of is not loss of life per se, but rather the possibility that we will die sooner rather than later.
        Consider an analogy. Suppose you're at a party. It's great, and you wish you could stay, but this is taking place in high school, and your mother is going to call and tell you it's time to go home. Now, there's nothing bad about being at home; it's intrinsically neutral. You just wish you could stay at the party.
        Suppose you know that the call is going to come at midnight, guaranteed. Then, I think, there isn't anything to be afraid of. But if all you know is that your mother is going to call some time between 11 pm and 1 am, the conditions for appropriate fear have been met. There is something bad, there is a non-negligible chance of it happening, and yet there is also a lack of certainty that it will happen. Now some degree of fear makes sense. Perhaps we have something similar with regard to death. Perhaps it makes sense to be afraid given the unpredictability of death.
        Further distinctions might be helpful. Am I afraid that I will die soon, in the sense that, given the range of years I might reasonably hope for, death may come sooner rather than later? Or am I afraid that I will die young, with death coming sooner for me than it does for others? These ways of specifying the object of my potential fear differ in important ways, including how much fear is appropriate, and when.
        Take the fear of dying young. Clearly, if you have reached middle age any fear of dying young is irrational. But even among the young, the chance of this actually happening is extremely small.
        As one grows older, the chance of dying within a given period increases. But even here, fear that one will die soon can easily be out of proportion. Even an 80-year-old has a more than 90 per cent chance of living at least another year.
        Obviously, fear that death may come soon can make sense among the very sick or the very aged. But for the rest of us, I think, it is typically misplaced. If you are reasonably healthy and yet you say to me, "I am terrified of death", then all I can say in response is that I believe you, but terror is not appropriate. It doesn't make sense, given the facts.
        Shelly Kagan is a professor of philosophy at Yale University. This is adapted from his book Death (Yale University Press)
