These are a few of the many articles I have written about the human brain and its oddities. All of them have been published in various newspapers and magazines, but you can read the full versions on this site and may quote or use them as you like, though I would be grateful if you would contact me first by writing to: info@ritacarter.co.uk.

Introduction: does reading matter?

If you Google “Google + stupid”, you get about 24 million hits, the first of which is the online version of a now famous essay by the American writer Nicholas Carr. “Is Google making us stupid?” begins with Carr describing his spooky intuition that "someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory… My mind isn't going - so far as I can tell - but it's changing. I'm not thinking the way I used to think."

View Rita's BBC4 documentary on Reading and the brain

The moral brain

People who don't know anything about science, and are not interested in knowing anything, often justify their position by claiming that it can tell us nothing about the "big" questions that haunt us: what is Good? Is there a God? How should I lead my life? And so on. The work I describe in this piece, by the US psychologist Marc Hauser, gives the lie to this arrogant assumption.

Read more

Imagine this: you are paddling a canoe along a jungle river when you spot a group of crocodiles gliding swiftly through the water with deadly intent. Happily for you, they are not heading in your direction - their sights are set firmly on a family of swimmers at the water's edge. If nothing is done the crocs will soon have all five of them for dinner.

By driving your canoe into the crocodiles you could send them off in a different direction, but this would put the hungry reptiles directly on course to a sixth swimmer.

So what should you do? Allow things to take their course, and allow five to die? Or intervene and reduce the toll to one?

Thought experiment enthusiasts might recognise this scenario as an exotic version of the famous "runaway train" dilemma, in which five pedestrians are about to be mown down unless you, the bystander, throw a switch which will divert the train to a track where just one reckless track-walker will be killed.

Over the last three years some quarter of a million people have responded to the train dilemma via a web-based survey posted by evolutionary psychologist Marc Hauser and his colleagues at the Harvard Cognitive Evolution Laboratory. The vast majority of respondents - some 90% - say they would flip the switch.

The Harvard researchers invented the crocodile-canoe variation to help establish whether this moral judgement is learned or "hard-wired" into our brains. They used it on the Kuna Indians, a tribe whose world is so different from that of the Internet sample that they wouldn't know a train if one…er…hit them. Despite the vast cultural difference between the two groups, however, they both judged the situation in almost exactly the same way, supporting the idea that moral decision-making is more instinctive than conditioned.

Instinctive judgements, by definition, are not arrived at rationally, so this may explain why many of our moral presumptions are nonsense.

Most people, for example, think it is worse to attain a given end by actively harming someone, than to achieve the same end by omitting to prevent the harm occurring. Another thought experiment illustrates the point:

Scenario one is this: Six-month-old Jimmy is all that stands between his Wicked Uncle Sidney and a large inheritance. So Sid has volunteered to baby-sit his nephew with the intention of doing him in. Come bath time he plans to hold the baby under the water, then claim that the drowning happened when he momentarily left the room. Jimmy's Mum places the baby in the bath before she leaves the house, then hands over to Sidney who enters the bathroom and duly murders the child.

In scenario two Sid enters the bathroom full of evil intent but then sees that little Jimmy has tipped himself upside down and is about to drown anyway. Sidney could scoop him out and save his life or he could do nothing. He decides to do nothing.

When people are asked to measure out blame in each case, they invariably rate Sid's active drowning of Jimmy as more morally reprehensible than his omission to save him. Yet in both cases Sidney's intention - to bring about the death of Jimmy - is identical, and so are the consequences. These are the factors that people consciously think matter, yet their judgement does not reflect this.

Using thought experiments like these, Hauser has determined that our moral judgements are based on three largely "hidden" principles:

One - as shown by the Sidney/Jimmy scenario - is that harm caused by positive action is morally worse than equivalent harm caused by omission.

The second is that doing harm in order to achieve a goal is worse than knowingly allowing harm to be done in pursuit of the goal.

To illustrate this Hauser uses another version of the runaway train theme. The train is again on course to run over five people, but you can divert it. Instead of sending it off in another direction, though, this diversion takes the train on to a loop which rejoins the main track before it reaches the five people. There would be no point in sending it this way, except that there happens to be a very fat man walking along the tracks on the loop; if you divert the train it will hit him, and his weight will bring it to a halt. Five people would thus be saved, and one killed.

This is very similar to the "basic" runaway train experiment, but whereas in the basic version the single death was a "side effect" of preventing the others' deaths, in this case the man is the means by which the train is stopped.

It seems like a subtle difference - so subtle, in fact, that most of us don't see it unless it is spelt out. Yet unconsciously it makes a lot of difference: whereas 90% think it is OK to allow the man to be killed as a side effect, only about half of us agree that it is OK to kill him in order to save others.

The third principle is that doing harm by using physical contact is worse than harm done at arm's length. If people are asked whether it is OK to use the fat man to stop the train by actually pushing him into its path, the number who say yes drops right down to 11% - even though, for the fat man, the distinction is really quite academic.

Thought experiments may seem like a game, but these ones have serious implications, because the way that we judge right or wrong determines our whole system of crime and punishment. Speeding a person's demise, for example, is legal if it involves not doing something that would prolong life, but (in most places) illegal if it involves actively doing something to bring about death. Killing civilians in a war is a criminal act, but when the deaths are called "collateral damage" they are treated as merely regrettable. By exposing these intuitive, nonsensical judgements, runaway trains, hungry crocodiles and wicked uncles may yet help to make our world a fairer place.

Take the Harvard Moral Sense Test

Phineas Gage and the flying tamping iron

Phineas Gage is known to every first-year psychology student as the man who had a metal rod go through his brain. And survived. But was never the same afterwards. This odd fame arises because his accident demonstrated something that, even today, people find very hard to swallow: that many of our most cherished characteristics - a sense of decency, morality, conscientiousness, self-discipline, even love - could be extracted from our bodies in much the same way as a speeding cricket ball might extract a tooth.

Read more

The strange story of Phineas Gage opens, like a spaghetti western, on a furnace-hot afternoon in the mid-19th century, with a gang of railroad workers laying tracks across a boulder-strewn desert in Vermont.

Gage's job was to clear the way for the track-layers by blowing away any boulders that were too big to manhandle. To do this he packed explosive beneath the rock, then tamped it down with a thick iron rod before lighting the fuse. Gage had been chosen for this dangerous and important job because, of all the men, he was judged to be the most careful, sober, and responsible.

On this afternoon, however, Gage allowed himself to be distracted at a crucial moment. As he tamped down a packet of explosive, something caused him to glance over his shoulder, and in that second his rod struck the rock and created a spark. The resulting explosion drove the iron bar into Gage's eye socket, then through the front of his head and out of the top of his skull. As it exited it took with it a clean core of brain tissue.

The force of the explosion knocked Gage to the ground, but - astonishingly - he was back on his feet and talking lucidly within minutes. He was taken by cart to the nearest town, where, displaying the gaping hole in his head, he greeted the doctor with a cheery: "See what business I have for you here!"

As it turned out the business was less disastrous, in one way, than anyone might have believed possible. Apart from being blind in one eye, Gage made an almost complete recovery. Or so, at first, it seemed. As the weeks went on, however, it became clear that the sober, conscientious, polite and law-abiding man who led the railway gang had changed into a foul-mouthed, reckless, heavy-drinking renegade. As a close friend remarked, sadly, "Gage is simply no longer Gage".

The case of Phineas Gage and the Tamping Iron has since become famous because the doctor who initially treated him subsequently made a meticulous report of both the personality changes and the precise damage to his brain. And so it was the first case to show clearly that a particular area of brain tissue "contained" the faculties that we would call human sensibility. Gage's tragic accident put the first major landmark on what, just over a century later, has become a detailed map of the human mind.

Today we can tell which bits of the brain do what by looking directly at the activity within it. Various imaging technologies show where our sensations, emotions, thoughts and behaviour are generated.

Apart from furthering our understanding of the brain, brain imaging has proved invaluable for neurosurgeons because it allows them to identify precisely how to wield the scalpel. If a patient suffers from severe epilepsy, for example, surgeons sometimes cut some of the fibres that connect different parts of the brain to prevent the seizures from spreading uncontrollably. Brain scans now ensure that the knife stays clear of areas which are crucial for faculties such as speech or movement. They can also show the precise location of tumours and other forms of damage or injury, allowing better diagnosis and treatment.

Until the development of brain scanning techniques, surgeons and neuroscientists were largely dependent on Gage-type "natural experiments" to reveal the geography of the mind. One drawback to this was that in order to match the location of an injury to a particular mental function, it was usually necessary to wait for the subject to die. The two major speech "modules" of the brain, for example, Broca's and Wernicke's areas, were discovered by the two neurologists who gave their names to them by dint of first detailing the speech deficiencies of certain patients, and then examining the patients' brains at post mortem. Broca, for example, located the bit of brain that allows us to speak by keeping tabs, for years, on a man who was called "Tan" on account of that being the only syllable he was able to utter. When Tan died, Broca found a patch of damage in the left frontal lobe of his patient's brain and concluded - rightly - that this was the part responsible for language articulation.

Other pre-imaging methods of brain-mapping included the famous experiments of the Canadian neurosurgeon Wilder Penfield. The brain has no pain receptors, so Penfield (like other brain surgeons) operated on his patients while they were conscious. He found that many of his patients reported what seemed like memory flashbacks when certain parts of the cortex were touched by an electrode. Penfield concluded that memories were stored in clusters of nerve cells, and he subsequently drew up a "memory map" by stimulating various parts of his patients' brains and asking them what they experienced as a result. We now know that memories are distributed throughout the brain, but the areas Penfield charted as the "storage depot" of experience - mainly in the temporal lobe at the side of the brain - are closely associated with memory recall.

Brain imaging is improving today at a phenomenal rate. The latest technique - known as diffusion tensor imaging - can trace the fibre connections between one brain region and another through the entire breadth of the brain. And another technique, using minute electrodes, can reveal exactly which neurons respond to, say, the raising of an eyebrow.

Scanning technology has even shed more light on poor old Phineas Gage. Hanna Damasio, a neuroscientist at the University of Iowa, recently used computerised imaging to chart the precise areas of brain that were removed by the tamping iron. They turned out to be precisely those which modern brain scans show to be active when people make moral or ethical decisions, or inhibit self-satisfying impulses. The "responsible" Gage, it seems, was quite literally left behind in the desert sand.

"I've been here before…"

No you haven't - you're just experiencing déjà vu. The strange familiarity produced by this quirk of memory has often been interpreted in a supernatural way. I am fascinated by weird experiences, but I have little tolerance for supernatural explanations. In this piece I look at possible scientific explanations for that "been here before" feeling.

Read more

Most people - two out of three according to surveys - have experienced déjà vu. It is that weird sensation of having "been here before" or having "lived this moment already". You may be talking to a stranger, for instance, and suddenly feel that your conversation is a replay, word for word, of a previous one. Or you may be visiting some entirely unfamiliar town and "realise" that you have been right there, in that precise spot, at some other time, even though you know it is impossible. The feeling goes way beyond any vague sense of having seen or done something similar before - it feels identical to a past experience. Yet trying to pin down the memory is like trying to catch a dream - just as you think you are homing in on it, it turns to vapour.

The eeriness of déjà vu has led to all sorts of spooky theories. One popular one is that it is evidence of reincarnation - a bit of a past life "breaking through". Others believe it is the memory of a dream in which the person lived through the current moment in advance. In recent years, however, neuroscientists have discovered enough about perception and memory to piece together a more plausible explanation.

Every conscious experience we have is "constructed" by our brain out of lots of different components, rather as a car might be assembled in a factory. We tend to think of an event as a bundle of sensations - sight, sound and so on - but there is actually much more to it. If you (literally) bump into someone in the street, for example, you will be aware of the sight of them, the touch of them as you bump, the sound each of you makes and so on. But you will also be aware of the meaning, tone and intention of the sound, the pain from the bump, a sense of irritation or embarrassment, and perhaps a thought that you, or the other person, is clumsy. There is much more to experience than simple sensations.

One very important "component" that often gets added is a sense of familiarity. This is generated in the deep part of the brain that creates emotions. The sense of "ah yes! I recognise this!" usually gets attached only to experiences which "match" stored memories. Sometimes, though, the part of the brain which generates the feeling of familiarity becomes "trigger happy" and attaches the feeling to an experience that is actually quite novel. This is what seems to happen in déjà vu. The brain then tries to dig out matching memories, but of course they aren't there - hence the maddening feeling of chasing shadows.

For most people déjà vu is a rare and fleeting phenomenon, intriguing rather than disturbing. And it doesn't seem to be unhealthy - indeed, déjà vu is most commonly reported by people who are young, intelligent, well-educated and wealthy. Given that it is actually a minor brain malfunction, this may seem strange. The explanation may be that young brains are more "recognition sensitive", so they are more easily triggered into familiarity mode. Similar sensitivity may also be a factor in intelligence - bright people "see things" quickly, meaning they get that "ah-ha!" feeling more readily than others. And intelligent people tend to go on to higher education and thus become wealthy. So déjà vu may be a side-effect of having a brain that is quick to recognise things.

For an unfortunate few, though, déjà vu is a constant companion, and a serious blight on their lives. Dr Chris Moulin is a psychologist at Leeds University who is studying this strange disorder. He first came across it when he was working in a memory clinic: "We had a peculiar referral from a man who said there was no point visiting the clinic because he'd already been there, although this would have been impossible. His déjà vu had developed to such an extent that he had stopped watching TV - even the news - because it seemed to be a repeat. He even believed he could hear the same bird singing the same song in the same tree every time he went out."

Apart from the Groundhog Day tedium of chronic déjà vu, the condition can also get people into social difficulties. "Some patients feel that everyone they meet is familiar and this makes them dangerously trusting of strangers," says Moulin. "If they don't constantly remind themselves that the sensation is false they are at risk of being exploited."

So next time you find yourself "re-living" an experience, don't struggle to recall the previous time. Just sit back and relax. And make sure that you don't sign on the dotted line until the moment has passed.

Gorilla blindness: so you think you can see?

Like everyone else I have a sweetly naïve idea that I can see what is going on around me. But change blindness, and its near neighbour, inattentional blindness, make fools of us all. I first saw one of the experiments described in this piece at a large conference of cognitive neuroscientists. The audience of 500-plus were as comprehensively caught out by it as any hapless victim of Derren Brown.

Read more

The notice on the door invited volunteers to take part in a short, simple, psychological experiment. Just go inside, it said, and sign up at the reception desk.

Encouraged by the reward of a book token, a steady stream of students entered the campus building and told the person at the desk they wanted to volunteer. Each one was handed a consent form to fill in. When they had completed it, the receptionist asked them to wait a moment while they filed the form in an area behind a screen. Then the receptionist returned and directed them to another room where they were greeted by a psychologist.

At this point the volunteers discovered an unsettling fact. The experiment was already over. It had taken place in the minute or so during which they signed up. And the results of it gave three out of four of them reason - literally - to doubt their own eyes.

What had happened was this: when the student approached the reception desk they were greeted by a man with blonde hair, wearing a yellow shirt. This person handed out the consent form, waited while the student filled it in, then took it back and disappeared around the back of the screen, ostensibly to file it. In fact, while hidden from view, the blonde man swapped places with a colleague - a man with darker hair and wearing a different coloured shirt. This other person then returned to the student and directed them to the other room. Astonishingly, most of the would-be volunteers failed to notice the switch. Only when the psychologist showed them a video of the entire event did they believe what had happened.

That experiment, carried out at Harvard University in the late 1990s, was one of the first to demonstrate a phenomenon known as "change blindness" - our ability to miss massive changes that occur in front of our eyes, provided the "before" and "after" scenes are separated by a short gap or visual interruption. As well as failing to notice when one person takes over from another, experiments have shown that vast chunks of a visual landscape can be removed between glimpses without most of us noticing. In one series of experiments none of the subjects noticed when a large building, smack in the middle of the picture, shrank by a quarter between glimpses. None saw that a mountain range disappeared. And more than 90% failed to spot the sudden extinction of 30 puffins on an otherwise uninhabited ice floe.

Inattentional and change blindness demonstrate that what we see is not at all what we think we see. We seem to observe the world in all its richness, but in fact we only register consciously a tiny handful of elements - those that catch our attention. We are good at spotting changes if they happen while we are looking because change involves movement and movement grabs attention. But we cannot spot changes if there is a gap between the before and after because to do so we would have to hold a complete memory of the "before" scene to compare with the "after" one. And our memories are only as complete as our perceptions.

Magicians have always exploited change and inattentional blindness. When a sleight-of-hand artist makes a big deal of directing your attention to what is happening in his left hand, you can be sure that the real business is happening in his right. But inattentional blindness is not just an amusing curiosity. Our tendency to neglect things that we are not deliberately attending to has real-life implications. Drivers and pilots, for example, frequently make dangerous errors, not because they are not attending, but because their attention has got locked on to the wrong thing. In one case, for example, a plane crashed because the pilot and co-pilot were focusing so hard on a dodgy instrument that they failed to notice the ground rushing up to meet them. And the attention-grabbing effect of talking on a mobile phone while driving is reckoned to increase a driver's risk of an accident fourfold.

Attention is largely an unconscious faculty - most of the time it is "grabbed" by events rather than deliberately directed, so it is very difficult to avoid zooming in on some things to the detriment of others. Our best defence may be simply to remember that at any time we are only seeing a tiny bit of the picture. The whole thing may look very different indeed.

See examples of change blindness

Fractured minds

This is about Multiple Personality. I became fascinated by the subject when I was researching Consciousness - how can a person switch from one "self" to another? - it seemed impossible. As a result of this article I made contact with some "multiples" who strongly reject the notion of multiplicity as some kind of weird pathology. They insist that their condition is actually beneficial rather than incapacitating. This led to my researching the possibility that multiplicity is actually a normal state - the idea that underpins my new book.

Read more

John, Jay, Decca, Mac and Sanji describe themselves as a happy household. John, speaking for them all, says they have the odd spat, as people living closely together always do. But on the whole they like and support one another, share in the burdens of day-to-day living, allow each other the freedom to pursue their own interests and generally rub along like any other group of house sharers.

John and his friends don't just share a house, though. The arrangement is much more intimate. They share a body. Each of them is a personality - an "alter" - in what they call the "system" known publicly as John. Were they to come to the notice of a psychiatrist, John would almost certainly be diagnosed as having dissociative identity disorder, the strange condition more commonly known as multiple personality.

We all have conflicting thoughts and feelings from time to time. We experience mood shifts, and have ideas and desires that change from moment to moment. Nevertheless, we mostly experience ourselves as a single, solid and continuous "me". Why do we feel this way? Where does our sense of self come from? And why is it different for people who experience multiple selves?

We take the feeling for granted, yet our "normal" sense of being a self anchored in one particular location and time, the concrete "me, here, now", is a creation of our brains and thus more fragile than it may seem. A slight shift in the way the brain processes information may destroy the comfortingly familiar feeling of being a single, continuous being.

The states of mind that most commonly disturb our sense of self are known collectively as dissociation. They range from vague feelings of "spaciness" to bizarre conditions such as multiple personality disorder and "fugue" - the sudden loss of personal memories. Some psychiatrists consider the symptoms of the weirder dissociative states to be fictitious - a form of "acting out" rather than an involuntary response to altered brain function. But recent evidence from brain scanning studies is not only giving these conditions credibility, it is beginning to reveal how our sense of self is generated.

Normally, certain cognitive faculties - memory, self-recognition, consciousness, sensation, intention and action - are bundled together, giving us a sense of singular and continuous identity in a single stream of experience. In multiple personality and other dissociative states, these strands of "self" are experienced separately.

People with multiple personalities have several distinct states of mind, each of which has the habits of thought, emotions and memories of an individual personality. Some, like the John system, share awareness of one another - so-called "co-consciousness" - rather like the common dream experience of being both the observed and the observer. "When one of us is in charge of the body the rest of us are sort of there in the background," John says. "We can't actually do anything, but we see what is going on, we are aware of what the one who is 'out' is thinking, and we see everything through their eyes."

Multiple personality disorder shows how important our personal memories are to our sense of who we are. In people with the condition, memories of events that occurred when one particular alter was in charge of the body are "claimed" by that personality, so when another alter appears these events feel as though they happened to someone else.

In John's case each alter knows the others' memories, but they do not experience them as their own. They can therefore "fake" being a single personality and their strange inner life may go undetected. In a few rare cases, though, like those portrayed in the films The Three Faces of Eve and Sybil, the personalities are so cut off from one another they do not even know of the others' existence. Each alter is able only to retrieve their own memories, so they all have "gaps" in their lives when it seems they did not exist.

Sue, a multiple interviewed for BBC TV's Horizon science series (11 November 1999), described gaps in her life of days at a time: "I know that one of my other personalities was out there...when we get in the car, depending on what personality has driven...the radio station will be different, the seats and mirrors will have to be readjusted. Once I came back and the car I had was gone and this sports car was in its place."

Brain imaging studies of a multiple similar to Sue revealed the changes in brain function that occur as various personalities come and go. Don Condie, a psychiatrist at the Massachusetts General Hospital in Boston, and neurobiologist Guochuan Tsai at McLean Hospital in Belmont, Massachusetts, found that the arrival of an alter coincided with distinct changes in activity in the hippocampus, the part of the brain that lays down and retrieves personal memories. When her dominant personality was replaced by a weaker alter, hippocampal activity died down, only to flare up again when the main character returned (Harvard Review of Psychiatry, vol 7, p 119).

The finding suggests that the weaker alter had access to a smaller "bag" of memories than the stronger one, and that the memories of each were not available to the other (New Scientist, 18 December 1999, p 26). Crucially, there was no change in hippocampal activity when a personality shift was simply "acted out".

Dissociative amnesia, or fugue, is another way that losing access to the precious cache of personal memories can split apart a person's "me-ness". The person suddenly adopts a new life, a new name, and sometimes a whole new way of behaving. Their old self seems to be entirely forgotten, though in fact the memories that formed their old identity are merely "dissociated" from consciousness and thus beyond voluntary recall. Often the only evidence that they are still encoded in the brain is when a person acts on them unconsciously. For example, a girl who claimed to have no memory of anything in her past was invited to dial telephone numbers at random. Although she thought she was hitting the buttons blindly, within a few tries she had dialled her own home.

A sense of control over and ownership of your body is another vital element of the self. The normal feeling of inhabiting your own body is produced by a continuous matching up of physical reality and the mental representation or body "schema" held in the brain. If the mental representation is damaged (by a stroke or dementia) or if part of it is dissociated from the rest, the corresponding body part may be disowned. This may manifest as conversion or "hysterical" paralysis, where a person is unable to move a limb despite the absence of any physical injury.

A brain-imaging study carried out at the Radcliffe Infirmary in Oxford of one patient with conversion paralysis showed that the part of the brain that plans movement (the supplementary motor cortex) was not communicating its messages to the part that actually instructs the body to move. So however much the patient "willed" the affected area to move, it just couldn't (Cognition, vol 64, p B1).

Ownership may break down in an even stranger way in a phenomenon known as alien hand. Here a limb may act independently of, or against the conscious intentions of, its owner. Alien hands have been known to hit people at the very moment the other hand reaches out to make an affectionate gesture, and to undo their owner's buttons and zips just after the other hand has done them up.

Having a bit of your body beyond conscious control is deeply disturbing because it makes it feel as though that part no longer "belongs" to you. People who are mentally disconnected from part of their body may neglect to use it even when it is not paralysed. Despite incontrovertible evidence, some people have even been known to deny that the disowned limb is attached to them, and others have begged for surgical amputation of body parts that do not fit into their internal sense of self. Some have even managed to persuade surgeons to carry out their wishes.

Similar to feelings of ownership is the sense of agency - the feeling that our actions are dictated by our intentions. People who lose this have a form of dissociation called depersonalisation in which conscious intentions and motor activity are partly decoupled. This creates a feeling of automatism or being "outside" oneself.

Beside herself

Jackie suffered her first bout of depersonalisation when she was walking home from the hospital where her mother had just died. She describes the feeling:

"Of course I felt sad and all that. But it was as though the sadness was 'out there' somewhere, not inside me. And I saw myself walking along and even smiling at a neighbour I met, but I wasn't in there - I was somewhere else. The person walking along was like a puppet or something - empty inside. All the time I was thinking...but the thoughts were like some sort of subtitles or something - they weren't my thoughts."

Other people lose their normal perception of the outside world, a condition known as derealisation. Joe, 24, started suffering bouts of derealisation in his teens. His condition is partly controlled by a drug that reduces anxiety. But he still has periods when the outside world seems strange and unreal:

"It happens most in the evenings, especially if I go somewhere where there are bright lights or lots of noise. Suddenly the whole scene seems to whoosh away from me.

"If I'm talking to someone they may seem as though they are standing on the moon, and I hear them as though their voice is coming down some long tunnel. At other times objects just look weird. I look down and see something like a mug in my hand and it looks like something I've never seen before. And the ends of my arms seem to be miles away. When I'm in this state it feels as though the whole world is some sort of film...I hate it...sometimes I hit my head like you hit a TV set that isn't working properly...but it doesn't clear the picture. It's terrifying."

Most people who experience these states find them disturbing, and they are often associated with odd behaviour - which is why dissociation is usually considered "bad". However, most recreational drugs cause dissociation of one sort or another and those who take them do so precisely to achieve that effect. And dissociation is not itself abnormal - it is simply a reflection of the brain's ability to process information along parallel pathways and at different levels of consciousness. You are dissociating when you get "lost" in a book or find that you've carried out some routine task (driving is the usual example) without being able to remember it. Far from being dysfunctional, this everyday type of dissociation can give the imagination free rein or leave the body to carry out routine tasks while the conscious mind roams elsewhere.

Some tasks may even be carried out more competently as a result of dissociation. A doctor who constantly has to deal with horrifying injuries may need to dissociate emotionally in order to function and not be overwhelmed by pity. Stay in that state after work, however, and it is likely to be disastrous.

But even at the extreme end of the dissociative spectrum, multiple personality, there are those who find the state preferable to "normal" cognition. John and his friends, for example, have no intention of seeking psychiatric help because they like being a group. "We keep each other company and each of us has our own special strengths and weaknesses which we can call on when needed," he says. "We don't want to be 'integrated' - group living makes us stronger, more adaptable...and we are never lonely."

And although most of us take our sense of self for granted, there is probably a great deal more variation out there than we care to consider. Multiple personality is generally assumed to be very rare, at least outside the US. Indeed, some psychiatrists in Europe still refuse to acknowledge that it is a genuine splitting of identity. But two surveys from the early 1990s, one in Winnipeg and the other in the Netherlands and Belgium, of people who were not being treated for a psychiatric disorder found that 3 per cent of the North Americans and 0.5 per cent of the Europeans fulfilled the current diagnostic criteria for dissociative identity disorder. Yet the number of reported cases worldwide was only a little over 6000.

The prevalence of "milder" forms of dissociative disorder is probably underestimated to an even greater extent. Both the American and European population surveys, using questions such as "Are you able to ignore pain?" or "Are you ever approached by people you don't know who know you?" indicated that around 12 per cent of people dissociated to a degree that most psychiatrists would diagnose as pathological.

Hidden epidemic

Marlene Steinberg of the University of Massachusetts Medical School in Worcester, a therapist who specialises in dissociative disorders, says her own clinical research suggests that 30 million people in North America are afflicted. "Were it not for the fact that so many people who actually have a dissociative disorder are misdiagnosed and mistreated for something else, the reported numbers would skyrocket to reflect their true epidemic proportions," she says.

The reason for this presumed "epidemic" of dissociative disorders is unclear. Steinberg believes that pathological adult dissociation is nearly always linked to childhood abuse. According to this theory, the abused children learn to dissociate as a way of shutting out or distancing themselves from the horrific things that are happening to or around them. Once learned, dissociation becomes a brain "habit" that persists into adult life.

Anthony David, who runs a research clinic specialising in depersonalisation at King's College London's Institute of Psychiatry, explains: "I think that these mechanisms exist in all of us and could be triggered by circumstances." He suggests that people with the conditions probably have an innate physiological vulnerability to react in this way.

Childhood abuse seems to be only one factor. An ongoing Institute of Psychiatry survey has found that only 1 in 3 patients with symptoms of dissociation believed they had suffered abuse as children - a much larger proportion than in the general population, but still a minority. All the respondents, however, claimed to have suffered some sort of stress as children: for example overly strict parenting; a parent who was ill; or losing a family member at a young age. Sadness, loneliness and boredom may also cause a child to shut out the real world and take their conscious mind off to a more pleasing imaginary land. Even overzealous therapy may encourage escape to alternative identities.

In addition, stress of any kind may damage the hippocampus, making it less capable of weaving memories into a coherent whole. A number of studies have found that people who have been in a chronic state of anxiety - war veterans, for example - have a smaller hippocampus than others, and these people are also more likely to report symptoms of dissociation.

Another explanation for the "epidemic" is our increasing use of recreational drugs. Most of these produce some sort of dissociation, and once the brain has been primed by them it may be more likely to dissociate spontaneously. Some 15 per cent of drug users in treatment report spontaneous bouts of derealisation or depersonalisation, or even experience it continuously. Although they may find the experience pleasurable when they are in control, when it happens out of the blue they tend to find it as frightening as anyone else.

The now familiar story about our living a modern life with an ancient brain may help explain our vulnerability to dissociation. The integration of cognitive functions, such as those that create the sense of self, is likely to be quite a new adaptation and thus still fragile. The emotional stressors that people encounter today - during childhood or later in life - may be triggering sensitive brains to revert to a more primitive mode of cognition. John and his friends may not be a bizarre new phenomenon after all, but a throwback to a time when everyone made their own friends - literally.

This article first appeared in issue 2412 of the New Scientist, 13 September 2003.

Tune in and turn off: autistic savants

The brain is modular - that is, different and fairly discrete bits do different things. One bit, for example, works out what's needed to make a physical action while another counts beans. Many of these areas have a see-saw relationship: if you inhibit one, the other becomes more active and vice versa. This piece looks at some fascinating studies which suggest that turning off the "cleverest" parts of our brain might allow other parts to function in such a way as to give the appearance of genius.

Read more

James can tell you the precise time—to the second—without looking at a clock. Jennifer can measure anything to within a fraction of an inch just by glancing at it. And Christopher can speak 24 languages—including a couple of his own devising. Amazing? Definitely. But unusual? Not necessarily. According to a controversial new theory you too can do these things. Or at least you could—if only you could just stop being so clever for a moment.

Christopher, James and Jennifer are autistic savants—people who score low on IQ tests and have severe difficulties in communicating and interacting with others but who nevertheless have seemingly superhuman competence in a specific area like music, art or maths. About one in ten autistic people have notable talents, but truly prodigious savants like Stephen Wiltshire, who can draw spectacularly detailed and accurate representations of buildings, or the lightning card-counting calculator played by Dustin Hoffman in the film Rain Man are very rare. There have probably only been about 100 people described as savants since the phenomenon was first identified a century ago and only about 25 are alive at the moment.

Such is our fascination with these people that nearly all of them are publicly known and celebrated, and many of their skills have been studied exhaustively. Yet there is still no generally accepted understanding of how savants do whatever it is that they do.

Theories range from enlargements of certain specialised brain regions to the simple "practice makes perfect"—but none of them alone satisfactorily explains all the weird anomalies.

The latest contribution to the puzzle is startling because it proposes that savant skills—far from being unique—are possessed by everyone, and might even be unleashed with quite simple, existing technology.

The idea comes from psychologists Allan Snyder and D. John Mitchell from the Centre for the Mind at The Australian National University in Canberra. Essentially they think that savant skills are the manifestation of brain processes that happen within us all, all the time, but are usually speedily swamped by more sophisticated conceptual cognition. While this high-level stuff fills our consciousness, the savant-style information-crunching that the researchers suggest precedes it is relegated to the unconscious back rooms of the brain.

"It's not that savants are cleverer than the rest of us," says Snyder, "it's just that most of us go one step further in our brain processing—from detailed facts to meaningful concepts—and once we've done that we can't go back."

Snyder and Mitchell formulated their theory from analysis of many existing studies of savants—mainly mathematically gifted ones. Among the findings they rely on are brain-imaging experiments, which reveal the extent of unconscious processing that goes on before we ever become aware of perceptions, thoughts and feelings.

A visual image falling on the retina, for example, takes about a quarter of a second to pop up in a person's mind as a conscious perception. Before that moment, each element of the image—including its colour, shape, movement and location—is identified separately by various specialised regions in the brain. These components are then assembled into a pattern which is shunted onwards to regions that attach meaning to it. Normally we have no idea that all this is happening—we only become conscious of it after the detailed processing is complete and we have a fully constructed perception.

"What matters for survival is that we have a concept we can work on—it's a face and it's friendly, say—not a mass of detail about how we arrived at that conclusion," says Snyder. "So in normal people the brain takes in every tiny detail, processes it, then edits out most of the information leaving a single useful idea which becomes conscious." Taking these ideas a step further, he asserts: "In savants the suppression doesn't happen so they see the picture in fantastically detailed components, like individual pixels in a photograph."

Using the same reasoning, Snyder believes that if, for example, you were asked to calculate the day of the week on which any particular date falls (an obsession peculiar to savants) or to discern the precise pitch, length and sequence of notes in a musical score, you would do it, more or less instantly, in your unconscious mind. But because knowing what day of the week 1 September 2056 would be is of no practical use, he thinks the information would be edited out before it passed into consciousness. Equally, because notes in isolation usually carry little meaning you would tend to hear the music as a melody rather than as separate sounds.
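To see how mechanical this date-calculation task actually is, here is a short sketch in Python of the arithmetic involved, using Zeller's congruence, a standard calendrical formula. It is offered purely as an illustration of the kind of computation a calendar savant appears to perform, not as a claim about how any brain does it.

    # Zeller's congruence gives the weekday of any Gregorian date.
    # An illustration of "calendar savant" arithmetic, not a brain model.
    def day_of_week(year, month, day):
        if month < 3:          # Zeller treats January and February as
            month += 12        # months 13 and 14 of the previous year
            year -= 1
        k, j = year % 100, year // 100
        h = (day + (13 * (month + 1)) // 5 + k + k // 4 + j // 4 + 5 * j) % 7
        return ["Saturday", "Sunday", "Monday", "Tuesday",
                "Wednesday", "Thursday", "Friday"][h]

    print(day_of_week(2056, 9, 1))  # prints "Friday"

Python's built-in datetime module returns the same answer for that date, which makes a handy cross-check; the formula itself is nothing more than modular arithmetic.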

If Snyder and Mitchell are correct in supposing that savant cognition is happening in us all, is it possible that we could learn to shift our consciousness back a gear and become aware of it? Niels Birbaumer of the Institute of Behavioural Neurobiology at the University of Tübingen, in Germany, an enthusiastic supporter of Snyder and Mitchell's theory, believes we could. Birbaumer recently led a team that fitted paralysed patients with scalp electrodes that picked up signals from the brain and translated them into movement of a computer cursor. The patients first had to learn to control brain activity that was normally unconscious (New Scientist, 16 January, p 4). Birbaumer thinks it would be possible to access pre-conscious savant cognitive processes in much the same way—and that some people have already learnt to do so, without even realising what they were doing.

Accessing the subconscious

He cites, for example, a non-autistic student whose calculating skills rival those of the best mathematical savants. Electrical monitoring of the student's brain waves while he was doing a calculation showed that his brain was more active than usual at the start but less active just before he answered (Psychophysiology, vol 33, p 522). "Later cognition involves more cortical activity and is associated with conceptual thinking," says Birbaumer. "This student seems to be able to prevent this activity from occurring when he is calculating—leaving him free to access the earlier low-level processes."

Other researchers in the field—though expressing polite interest in Snyder and Mitchell's theory—remain sceptical that we all have latent savant skills. The most commonly favoured explanation for savant talents is that they are "islands" of highly developed ability, probably linked to physically enlarged specialist brain regions. In most people the development of such skills is held back because the brain's resources are focused from an early age on conceptual thinking and what is known as "global processing"—pulling together various thoughts and perceptions and extracting meaning from the overall picture rather than concentrating on the concrete details of each perception.

Autistic people seem to be unable to process things in this way. The result is a detailed but incoherent cognitive style described by autism experts Uta Frith from the Institute of Cognitive Neuroscience at University College London, and Francesca Happé, senior scientist at the Institute of Psychiatry, also in London, as "weak central coherence". Their idea is different from Snyder and Mitchell's because they assume that savant processing never happens in non-autistic people—consciously or unconsciously. They believe the drive towards central coherence is so strong that it sweeps perceptions and thoughts into meaningful concepts before every tiny detail of them is registered, so we wouldn't be able to access this information.

Happé explains: "If you were able to look inside the brain of an autistic savant I think you would find that their talent arises from very specific and circumscribed brain areas which are neurologically isolated from the areas which bind things together to make concepts. This allows the areas dedicated to savant abilities to develop without interference from parts of the brain which deal with concepts. As a result they may turn into large specialised brain areas like those that normal people have for speech."

The idea that unusually enlarged brain regions may create exceptional artistic, mathematical or musical skills in the people who possess them took an interesting turn recently. An anatomical study of Einstein's carefully preserved brain showed that the area associated with maths was bigger than normal and not divided by the usual groove. Grooves often mark the boundaries of functional brain areas, so it's fascinating to toy with the notion that the mathematical "module" in his brain had annexed neurons from an area next door that would normally do something else.

The trouble with the big brain hypothesis is that anyone's brain will enlarge or get denser in an area that is constantly active, so it is hard to know if an enlarged module is the cause or result of a particular skill. Vilayanur Ramachandran, Director of the Center for Brain and Cognition at the University of California, San Diego, has charted neuronal hijacking in cases of "phantom limbs"—when amputees continue to feel their lost body parts because the brain regions that once gathered sensory signals from the limb are drawn into the regions monitoring neighbouring body parts. He thinks something similar might explain the astounding quality of savant cognition. "Maybe when the brain, or a bit of it, reaches a critical mass new and unforeseen properties emerge," he speculates. "So a doubling of neurons wouldn't produce a doubling of talent but a hundred-fold increase."

A simpler explanation comes from Michael Howe, a psychologist at Exeter University who has studied both autistic and non-autistic people with exceptional skills and believes that constant practice is generally enough to account for both types of talent. "Savants seem to just 'see' things effortlessly," he says, "but I think if a non-autistic chess player who has been immersed in the game for thirty or forty years looks at a game in progress they just 'see' the position and the best moves in a similar way." He adds: "The main difference between experts and savants is that savants do things which most of us couldn't be bothered to get good at."

Not just practice

Howe admits, though, that mere practice cannot account for the abilities shown by very young savants, simply because they have not had time to hone their skills. One celebrated artistic savant, named Nadia, drew stunningly animated pictures of prancing horses in perfect proportion and perspective from the age of three. She did not seem to learn the skill. Unlike normal children, who go through very specific stages as they develop drawing ability, such as putting huge heads on people and showing limbs as sticks, Nadia was drawing brilliantly from the moment she could grasp a pencil. And there are children who can do the amazing day-of-the-week calculations even though they have not yet learnt to divide, and who have developed the skill without adult help.

It may be that all very young children perceive the world in a savant-like way. One incredible skill shown by children is language acquisition. Eight-month-old babies seem to carry out fantastic calculations in order to work out where word boundaries fall in a stream of speech. They do not consciously work it out. They simply learn to "know" when a word begins and ends, just as a mathematical savant may say they just "know" the square root of a six-figure number. Adults, by contrast, have to labour over learning these patterns in a new language; simply immersing themselves in it is usually not enough.
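One mechanism often proposed for this feat is statistical learning: infants appear to track how predictably one syllable follows another, and word boundaries tend to fall wherever that predictability dips. The Python sketch below is a toy version of that idea; the three-syllable "words", the stream length and the 0.5 threshold are all invented for illustration and make no claim about real infant speech.

    # Toy word segmentation by transitional probability: guess a boundary
    # wherever one syllable poorly predicts the next. Purely illustrative.
    import random
    from collections import Counter

    random.seed(1)
    words = ["golabu", "tupiro", "bidaku"]        # made-up trisyllabic words
    syllables = lambda w: [w[i:i + 2] for i in range(0, len(w), 2)]
    stream = [s for _ in range(300) for s in syllables(random.choice(words))]

    pairs = list(zip(stream, stream[1:]))
    pair_n = Counter(pairs)
    first_n = Counter(a for a, _ in pairs)
    tp = {p: n / first_n[p[0]] for p, n in pair_n.items()}  # P(next | current)

    found, current = set(), [stream[0]]
    for a, b in pairs:
        if tp[(a, b)] < 0.5:     # low predictability, so guess a boundary
            found.add("".join(current))
            current = []
        current.append(b)
    found.add("".join(current))
    print(sorted(found))         # recovers ['bidaku', 'golabu', 'tupiro']

Within a "word" each syllable predicts the next perfectly, while across a boundary the following syllable is close to a lottery, which is why even a crude threshold recovers the full vocabulary.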

Similarly some researchers believe that perfect pitch—a skill common in musical savants—is easily acquired by children but rarely develops in adulthood. And eidetic memory—the automatic perception, storage and retrieval of visual images in photographic detail—is far more common in children than in adults.

Savant-like skills may be lost—or hidden, according to Snyder and Mitchell's theory—in non-autistic people as they grow up because of a shift in the way we process information. Imaging studies show that brain activity in newborn babies is limited to regions we are unconscious of in adults but which register incoming sensory information and respond to it by generating urges, emotions and automatic behaviour. The cerebral cortex—the area associated with conscious thought and perception—becomes active within a few months, however, and as the child grows up an increasing proportion of information processing is done cortically. This shift accelerates in non-autistic children around the age of eighteen months, when language takes off, and language acquisition may help to "kick-start" activity in the frontal cortex where conceptual processing is mainly carried out.

In autistic children this shift appears to be slowed or incomplete and so their savant-like processing style may be preserved. Autistic savants who do seem to make the change, albeit belatedly, may thus lose their abilities. Nadia, for example, lost much of her prodigious talent when she finally mastered language around the age of 12.

Language development also seems to bring about the dominance of one hemisphere of the brain. In right-handers this is nearly always the left hemisphere, where the main language regions develop, but in left-handers language may occupy the right brain. Many researchers argue that savant skills tend to be those which are associated more with the right hemisphere: music, identifying mathematical patterns and art, for example, rather than skills that are predominantly associated with the left hemisphere. Even the rare savants who have amazing word power, like Christopher, tend to be less interested in reading or the meaning of words, and more interested in skills like translation. Because of this, many have suggested that savant skills are produced by a dominant right hemisphere which has flourished in the absence of effective communication with, or inhibition by, the left.

Held back

"Autistic people often show both structural and functional dysfunction in the left hemisphere," says Wisconsin psychiatrist Darold Treffert, author of a book called Extraordinary People: Understanding Savant Syndrome, back in 1989. "Most cases are probably due to some prenatal interference with brain development which prevents normal development of the cortex and left hemisphere," he says. "Testosterone, for example, is known to inhibit left-hemisphere development and in male fetuses temporary slowing of the left hemisphere may be a normal developmental stage. In autism that slowing may be protracted beyond normal, resulting in an overdeveloped right hemisphere and stunted growth on the left. This could explain why autism, and savant skills, are about six times more common in males that in females."

His theory seems to be supported by a number of extraordinary cases in which normal people have suddenly developed savant-like abilities after left-sided brain injuries. One 9-year-old boy, for example, was transformed from an ordinary school-kid to a genius mechanic after part of his left hemisphere was destroyed by a bullet.

And Bruce Miller and co-workers at the University of California Los Angeles School of Medicine recently reported five patients who developed amazing drawing skills after dementia destroyed part of the left side of their brains (Neurology, vol 51, p 978). "One of our patients had spent his life changing car stereos and had never shown any interest at all in art," says Miller. "Then he developed dementia which destroyed neurons in the left frontotemporal cortex—an area which gives meaning to things—and suddenly he started to produce sensational images recalled from early childhood. It was as though the destruction of those brain cells took the brakes off some innate ability that had been suppressed all his life, and opened access to an amazing personal memory store he never knew he had."

As yet it isn't clear whose interpretation of these cases is correct, if indeed anyone's is, but Snyder thinks there might be a way to test it. He is planning an experiment in which, he hopes, the unconscious savant will be unleashed at the flick of a switch. Magnetic pulses can interfere with normal brain activity. If you time and position the surge just right, it can temporarily turn off activity in a particular region. Snyder's plan is to "switch off" the conceptualising area. If his theory is correct, and if he can find the area, this should cause the normally pre-conscious savant skills to burst into consciousness.

"I'm thinking of trying it on myself first," says Snyder. "If I start to get crystal clear pictures of my childhood or a sudden knowledge of prime numbers I'll really know I'm onto something."

James and Jennifer are not the real names of the people described.

This article first appeared in New Scientist, issue 2207, 9 October 1999.

Chronicling the Future

In 2000 I was invited to contribute to the Sunday Times "Chronicle of the Future" - a series of news stories of the future. These are some of the events I thought might happen.

Read more

ESP bridges the gap

30.03.23: THE FINALS of the World Bridge Tournament were halted dramatically today amid claims that players had been cheating.

The pan-European contestants, George and Georgia Blackwood - who astounded bridge-watchers earlier this week by achieving a grand slam with only two aces between them - are now having tests to check their brains for so-called telepathy bugs.

These minute implants can transmit and receive information from brain to brain, producing a crude form of telepathy, depending on their positioning. If you have a transmitter fitted in Broca's area (the part of the brain responsible for language articulation) and someone else has one fitted in Wernicke's area (responsible for language comprehension), you can exchange thoughts.

"You wouldn't want to conduct diplomatic negotiations using these bugs," says surveillance expert Jessica Scholl, "but they're fine if it's just a few specific words you want to convey ­ 'jack of diamonds', for example."

Telepathy bugs were developed by the Anglo army for use in undercover anti-terrorist operations, and the commercial arm of the military now markets them worldwide as an aid for the deaf. The sale and fitting of such devices is strictly controlled: only one military neurologist in Britain is licensed to perform the operation.

Last year, however, a Harley Street surgeon was struck off after admitting that he had fitted transmitters in the brains of a magician and his assistant.

The bugs were discovered when the entertainers were tested following claims by the Magic Circle association of performers that their tricks could not be done any other way. The performers have denied the charges. RC

Cyber ghost must die...

12.02.49: THE SUPREME COURT yesterday ordered computer scientist Nat Baker to switch off the quantum supercomputer in which he claims to have cloned his consciousness in a bid for electronic immortality.

The machine, programmed with the information content of the scientist's brain, appears to be conscious and identical in personality to Baker.

The court order was issued after the Council for the Protection of All Sentient Entities (Cepase) argued that prolonging the computer-based consciousness was potentially cruel.

Many machines are assumed to be self-aware, but the subjective quality of machine consciousness is distinct from that of humans because they function in a different physical context. Earlier hearings determined that this ruled out any possibility of suffering. Baker's case is unique because he claims the computer's consciousness is identical to his own.

"To have human consciousness you need a human body," explains Baker. "I have got around that by putting my consciousness in a virtual reality which gives him the illusion that he is still physically inside my head."

To do that, Baker implanted a matrix of transmitters in his own brain that relay incoming sensory experiences to the computer, where they create the illusion of walking, talking, seeing, hearing and feeling.

"The consciousness created inside the computer cannot distinguish itself in any way from the consciousness I am experiencing here and now," says Baker.

"In fact, it might be that I am in the computer and that another Nat Baker is out there. Whichever way, I know for certain that we are both happy, and I don't want to pull the plug on either of us."

A Cepase spokesman said: "They may be happy now, but what happens when Baker dies? His other half will experience the agony of dying, then will be left without any sensory input, a disembodied mind with no link to the world - a fate worse than death." RC

Mindless meat

29.10.16: THE FIRST meat produced from non-sentient animals went on sale in New York today.

The "mindless meat" ­ sausages and cold cuts sold under the brand name Veggiemeat ­ comes from pigs genetically modified so that they will not develop consciousness.

Non-sentient pigs are produced by removing from the parent animals' germ-line cells (eggs and sperm) the genes that code for the neurotransmitters essential for awareness.

The resulting offspring are matured in a vegetative coma. Their muscles are then exercised by external electrical stimulation to give the meat normal texture.

"Mindless meat" was developed by a UN-funded team of neuroscientists to pave the way for global animal rights legislation that will prohibit the slaughter of unmodified animals for any purpose other than to relieve their pain.

The meat-eating lobby - thought to be about 15% of the world's population - threatened to make the legislation unworkable, but UN leaders hope the opposition will abate now that they are able to keep meat on the menu. RC

A California man has regained his sight after part of his brain was replaced with silicon-based neurons.

Tom Keirney, 35, had been blinded by a bullet that passed through the back of his head, damaging the visual cortex. Three months ago, scientists inserted a dense matrix of artificial neurons, which has now connected to the remaining tissue in the cortex.

At first Keirney could only make out shadows, but he can now identify shapes and colours and is gaining more clarity every day. RC

Mind where you go

11.03.18: SCIENTISTS now know exactly which parts of the brain make us feel good, bad or indifferent.

The 20-year project to map the human brain is finally complete, and the resulting "brain manual" shows how the brain's 400,000 "modules" interact to produce the human mind.

"We now know, in theory, precisely which areas of the brain need to be manipulated to produce any illusion, any mood, any type of recollection ­ virtual reality, if you like," says the project's leader, Professor Sean Fowler.

The Brain Mapping Project was started in the 1990s, when functional brain imaging techniques first made it possible to watch patterns of activity in a working brain. The sensory modules - for example, the areas that deal with elements of visual experience, such as colour, depth and form - were relatively rapidly charted. However, it was not until helmet-style scanners were invented in 2005 that it became possible to watch what happens in a person's brain when it carries out normal day-to-day activities.

Critics of the project have claimed the technique is no better than the Victorian "science" of phrenology.

The interactions between brain modules revealed by the new machines were so complex that the brain-mapping project was nearly abandoned.

"People said that the brain would always be mysterious. Luckily, some of us never believed that," Fowler said. "Now we have finally produced the full instruction manual ­ we just have to decide what to do with it." RC

And the language of love

30.06.34: Colette, the talking ape, and her human speech facilitator Claude Lemann have announced that they intend to marry. If the ceremony - planned to take place in a lunar hotel beyond terrestrial jurisdiction - goes ahead it will mark the first inter-species marriage.

Colette, a 17-year-old great ape, is the only primate to undergo "artificially enhanced evolution". She was separated from her mother at birth and brought up by human foster parents.

To encourage Colette's development, electrical currents were used to stimulate the frontal lobes of her brain - the parts responsible for imagination and thought - and the left hemisphere areas which, in humans, are responsible for language. Her jaw and vocal cords were surgically altered to make it possible for her to talk, and she was subjected to continuous intensive speech training. Lemann, 52, took over her education last year and in May he announced on WebWorld that their relationship had "blossomed into romance".

Their wedding plans have sent shock waves through the scientific community and Lemann has been accused of exploitation, perversion and blasphemy. "This is just prejudice," he said. "We are in love and nothing will stand in our way."

Dressed in a grey suit, with her facial hair closely shaved, Colette smiled shyly at photographers and held hands with Lemann throughout their press statement. Her verbal skills - said by Lemann to be those of a 17-year-old human - were not greatly in evidence.

However, when asked how she felt about Lemann, she turned to him and said, softly but clearly: "Je t'aime". RC

Going bananas over Florida fruit

27.03.48: HOSPITALS IN FLORIDA are being swamped by patients experiencing intense flashbacks of past horrors.

'The symptoms are like post-traumatic stress disorder,' says health chief Dan Roche. '"Body memories" stored in the brain's fear centre, the amygdala, repeatedly flood back. We've seen people reliving car crashes and air disasters. The ER rooms are like disaster-movie sets.'

Brain scans reveal amygdala abnormalities, probably caused by a prion, a slow-acting infectious protein. One theory is that the prion was created by a faulty switch in early crops of GM fruit. Florida was first to produce GM crops and some fruit was sold illegally on the home market. RC

This article first appeared in The Sunday Times Chronicle of the Future in the year 2000.

Shadow syndromes

There used to be a very clear distinction between illness and health. But increasingly it is recognised that there is no such dividing line. This is especially true of mental conditions. Things like shyness may be seen as a cute sensibility in one culture, and a socially crippling disorder in another. A "clumsy" child is now likely to be labelled "dyspraxic", a dreamy or naughty child may be diagnosed with Attention Deficit Disorder, a literal-minded one with Asperger's. Is this a sign of creeping medicalisation (bad) or enlightened recognition of individual differences (good)?

Read more

James was a terrific boyfriend - but he turned out to be a curiously unsatisfactory husband. "There was something wrong from the start," explains Rose. "When we were dating he used to turn up to collect me in an immaculate suit with a bunch of flowers and a plan of action - cocktails, theatre, restaurant - like something out of a '50s etiquette book. And when he proposed he went down on one knee and produced a diamond ring. But we never became intimate in the way married couples usually do - it was as though there was some invisible barrier between James and everyone else - including me. The worst thing was that he would miss the most obvious emotional signals. Like he never seemed to know instinctively if I fancied sex. In bed he would always ask, really politely: 'Would you like to make love?'. Can you imagine what a turn-off that was, even when I did want it?"

The "something wrong" with James turned out to be a very mild form of autism. Autism is best known as a devastating form of mental retardation that locks its victims into a fragmented, nonsensical world and renders them utterly incapable of social interaction. But the idea of it being a single, severe condition is giving way to a more subtle view in which autism is a spectrum. Those at the milder end of this spectrum seem on first acquaintance to be - at worst - just a little odd. It is only those who live or work closely with them who know just how peculiar their thought processes are. As Rose says of James: "He was much cleverer than me on the surface, and you could have a perfectly sensible, intelligent conversation with him about something like the economy. But when it came to people he just didn't know what they were about. He was hopeless to watch TV or films with - never got jokes, and if I snivelled at the end of soppy film he'd ask me if I had a cold"

It is not just the autism-related "shadow" conditions that are being identified more frequently. Many types of behaviour which were once thought of as mere personality deviations are now attracting medical labels. Moody people are now frequently diagnosed as having "mild cyclical bipolar illness"; antisocial types have "borderline personality disorder"; worriers are tagged obsessive/compulsives and those who are excessively shy have social phobia. Serial adulterers, when they run out of ordinary excuses, opt for the "sex addict" label and a spell in a mental health farm.

It is easy to scoff at all this - but in fact the proliferation of medical labels marks a fundamental shift in the way that illness - especially mental illness - is coming to be seen. The line between those who are "ill" and those who are just a bit "odd" is essentially arbitrary and right now it is shifting fast. The main reason for this is rising expectations. Ten years ago there was very little that could be done for a person with some mild form of mental disorder - you had to be practically suicidal to get help for depression, and the idea of giving drugs for shyness was simply absurd. But now we have new techniques which can identify subtle mental disease, and powerful drugs which hold the promise of doing away with it. So for the first time there is some point in slapping a medical label on conditions which would once have passed for normal.

Take shyness, for example. It may seem ludicrous to turn such a common and "normal" personality trait into an illness. But a person who is severely afflicted with shyness can be just as disabled as one with a recognised disorder like severe anxiety. "I didn't even try to get to University when I was 18 because I was too scared of having to meet all those new people," recalls Mary, a bright and pretty thirty-year-old I met recently on an Open University weekend. "I got a job in the local library but I even found that unbearable - I blushed every time I had to ask someone to pay a fine. In the end I got a job working from home and that's more or less where I stayed until last year." What happened to change things for Mary was that she discovered a drug which, together with counselling, has changed her from a terrified recluse to a normal outgoing person, anxious to make up for lost time.

Severe moodiness, like extreme shyness, is also starting to be seen as something that can - and often should - be treated rather than shrugged off as just part of a person's nature. Even if the person who has the moods is prepared to put up with them, those who have to live with them may not be so tolerant. A couple of years ago Jonathan and Serena were on the point of divorce due to Jon's black moods and filthy temper. "One day I came back from the supermarket and he flew into a rage because I had forgotten to buy his favourite beer," recalls Serena. "He wanted me to turn round and go straight back to the shop. Instead I walked out and went straight to a solicitor." The shock of a divorce threat tipped Jon into severe depression, for which he was treated, successfully, with drugs and cognitive therapy. The change in him after treatment made him, and Serena, recognise that the behaviour both of them had always dismissed as "just Jon" was actually a sign of a milder form of depression from which he had been suffering, on and off, for years. Now he takes a low maintenance dose of antidepressants, his rages have disappeared and the marriage remains intact.

Whenever a group of middle-class parents start chatting about their children, shadow syndromes soon crop up. Kids who would once have been called clumsy now have "mild cerebellar syndrome" or "developmental dyspraxia." Bad spellers are "borderline dyslexic". Those who would once have been thought of as a bit slow now have "frontal lobe syndrome"; naughty kids have "impulse control impairment"; dreamers and those with butterfly minds have Attention Deficit Disorder and the ones who wreck supermarket displays have Attention Deficit Hyperactivity Disorder. Such medical labels are useful because they encourage early treatment. Childhood anxiety and shyness, for example, may respond to psychotherapy or certain types of antidepressants. Physiotherapy and training exercises may improve dyspraxia, and low-dose Ritalin or similar amphetamine drugs may help the dreamy or butterfly-minded child to focus.

This is not the case with Rose and James. Asperger's syndrome is not easy to treat because it seems to be caused by a physical abnormality in the brain which probably occurs during very early development and is irreversible by the time it becomes evident. If a person with Asperger's is very intelligent, they can often mask their inability to "know" what other people are about by "working out" what is going on, using logic rather than intuition. This often works quite well in the workplace and in casual social interactions, but it is rarely sufficient to make for a happy marriage. Rose gave up trying to teach James to read her mind and the couple were amicably divorced a year ago. She is now dating a man who invariably turns up late, dishevelled and breathing lager fumes. But he never has to ask her if she fancies going to bed.

The Limits of Imagination

Is human perception limited, or can our awesome powers of imagination allow us to see the world in any way we wish? This article - derived from a talk I gave during an ICA seminar on Human Nature - argues that our brains force us to see and interpret the world in a way that is peculiar to our species.

Read more

Human imagination, we humans imagine, lifts us above other creatures because it allows us to spring free of the here and now. A mental excursion to Narnia or Brobdingnag, Middle Earth, Discworld or Lyra's Oxford suggests that it is practically unbounded; that in our mind's eye we can travel beyond the range of memory, defy the laws of nature and slip free from the limits of biology.

If this were so - if our minds really could float free of the material world - there would be an aspect of each of us that is transcendent. It would justify human claims to be ‘special' among animals.

When human imagination is scrutinised, however, its limitations become apparent. Our flights of fancy are slotted into existing conceptual templates - notions of time, space and embodiment - which are physically encoded in our bodies. These force us to see both the ‘real' world, and the worlds we dream up, in a particular way. If our bodies (and particularly our brains) are structured normally we will never imagine anything that we could not, in theory at least, experience in reality.

We cannot, for example, imagine an eight dimensional world. Mathematicians tell us that we probably exist in one, and we may believe them, but we will never imagine it because our bodies extend into only four dimensions. Nor can we imagine a true abstraction - infinity, or idealised justice - because our symbols are grounded in sensation. And the only sensations we can imagine are familiar ones. Try as you might, you will never be able to imagine ‘seeing' as a bat ‘sees', or ‘hearing' like a whale.

Imagination seems at first to be quite distinct from perception of the external world - the sort of here and now awareness we assume we share with all sentient beings. Whereas I feel that I can make anything I like of my fantasy world, my perception of ‘reality' is non-negotiable. I see a blue mug on the desk in front of me because it is there, and it has intrinsic qualities which make it look blue. I am inclined to assume my cat sees it too, in much the same way.

People's visual perceptions are usually so alike that they are for most intents and purposes identical. If a group of people were to be presented with the light waves that are bouncing off my blue mug, for example, it's unlikely that one of them would exclaim ‘There's a hippopotamus!' while another protests: ‘Rubbish! It's clearly a carrot'. All of them would probably say that they saw a blue mug. Close interrogation might reveal that one of them sees a slightly greener-blue mug than another, but the differences would generally be unimportant. Not only that, but all the people would see the blue mug in the same way: which would not be an x-ray view, or a view of it as seen by a heat-detecting camera, or a view of it as seen through an electron microscope.

Yet perception is itself largely imagination. Our close consensus about what's ‘out there' obscures the fact that what each of us sees is not ‘given' but individually constructed. Perception is the end result of a creative brain process which can be likened (up to a point) to product assembly in a factory. At one end raw materials - light rays, sound waves, molecules and vibrations - come in via our sensory organs, and at the other end there emerges the finished products - thoughts, emotions and sensations. The reason that the external world appears similar to us all is not because there is only one way to see it, but because the assembly lines in our brains are so alike that we all manufacture it in a similar way.

Our human consensus encompasses not just our perceptions of concrete objects, but also the way we see things in a more abstract sense, right up to sophisticated issues of social conduct. Despite the fact that human beings grow up in vastly different environments, practically all of them agree that food is good, a roaring tiger is frightening, a smile is more inviting than a frown; that pain is nasty and murder wrong. This common view is an evolved way of seeing things. It is the one that best equips us to survive. If our survival needs were different, our view would be different too. The mug on my desk looks blue to me because my visual apparatus has evolved to distinguish a wide range of colours, presumably because this gave my ancestors some advantage in foraging for food. The light waves from the mug are those that generally give me a ‘blue' experience, but they do not do that necessarily. Despite my casual assumption that my cat sees things much as I do, it is actually very unlikely that the mug is giving him a ‘blue' experience comparable to mine, because his eyes and brain do not process light waves in the same way as mine. In this, as in many other things (the fun potential of a mouse on the bed, for example) the cat and I do not share a common view of reality. We see things differently because we do not make the same thing of it. Human physiology dictates that we see things as we do, just as a cat's physical form and function dictate that it sees mice as delicious playthings.

In order to make anything of the stimuli from which we construct experience we need to interpret them, using pre-existing concepts about the world. By concepts I mean any sort of knowledge, prejudice or disposition: memories, beliefs, ideas, even the species-specific distribution of cones in the retina that cause me to see blue where my cat probably sees grey.

Concepts, in this sense, are both mental processes and physical states. Recalling a personal memory, for example, involves the activation of a distinct (though constantly changing) neural firing pattern within a distributed system - a process. If you call up the last sight you had of your mother, the brain areas which will be activated include the hippocampus, temporal cortex and parts of the visual cortex. This combined activity is the neural correlate of the image you see in your mind's eye. But the memory also has a physical existence, of a sort, even when it is not being recalled. This is because the neural firing pattern correlating with a memory is largely preserved from recall to recall by physical linkages between the relevant cells. Each time a particular neuronal firing pattern occurs the cells involved form stronger bonds between their axons and dendrites. In the case of long-term memories (the ones you ‘relive' in recollection, or hold as known facts) the neurons involved are located in the association areas in the temporal lobes. Although it requires systemic activation - that is, other areas, such as the hippocampus or prefrontal cortex, need to be active in order for these patterns to fire up - long-term memories can be triggered just by stimulating a cortical ‘storage' area with an electrode. If you had a sensitive enough microscope and knew what to look for, you might even be able to discern the shape of a memory, woven like a cobweb in the dense tissue of the cortex.

Before a concept can inform perception, it has to be switched on. That is, the neurons in the pattern which encodes it need to be firing. If they are firing rapidly (more than 40 times a second) they become conscious, but even when they are firing at a lower rate, and are not therefore actually ‘in mind', they can still influence behaviour. Certain concepts are more or less permanently ticking over at this subconscious level throughout our waking life. If we lost them we would lose our ability to ‘make' anything of the world at all.

To see a visual image, for example, we need an operative concept of space. You might think that such an idea is unnecessary because space is simply there - you don't have to invent it in your head before you can be aware of it. But this is not so. Our brains are primed to be aware of space - the parietal lobes contain a sort of spatial template closely associated with the body maps which grant us awareness of our bodies. Because the idea of space is thus physically encoded in our brains, it is vulnerable to physical injury. Certain types of brain damage produce a condition known as neglect, in which the individual loses awareness of one or another ‘chunk' of space and, with it, awareness of any objects within that space. The most common type of neglect involves the loss of one half of the visual field, but sometimes it is ‘near' space (the area immediately surrounding the person's body) that is lost, or ‘reaching space' (the area within the stretch of their limbs).

A person's ignorance of a ‘neglected' area is more profound than if they were blinded to it - it is not just that they can't see it, they don't realise it is there to be seen. Neglect is probably not caused by erasure of the concept of the neglected area of space, but by damage to the attention system which prevents people from activating the neurons in which the notion is encoded. Several studies of affected persons have shown that - even in their mind's eye - the lost area cannot be accessed. For example, when a patient with left-side neglect was asked to describe an imagined walk from the south coast of England to the Scottish Highlands, she named only the towns in the east on the way up, and only those on the west on the way down.

Our concept of space is only useful so long as we can tell one bit from another. We have to know that ‘here' has a particular relationship to ‘there', rather than just being different. For that we need to have a mental concept of our bodies which we can place within our mental space. Only when we have placed ourselves firmly within it do we, literally, know where we are. If we did not have this idealised body it would be rather like having a map of a strange town - useless without an arrow saying ‘you are here'.

Our body concept has to be kept ‘switched on', at least at a low level, in order for us to use our (actual) body appropriately. It remains ‘on' by being constantly stimulated by incoming sensory information. Some of this comes from outside. When we walk, for example, the pressure on our feet tells us how our bodies are interacting with the floor. But most of the information is proprioception - a constant stream of messages coming from our joints, muscles, and the movement detectors in our middle ear. This information impacts on the conceptual body and changes it from moment to moment, keeping our inner sense of our body lined up with what is happening to the real thing. The concept, though, is always just a little ahead of the reality. It takes the information coming in ‘now' and uses it to construct a model of how our body will be a split second later. This is just the time it takes for the concept to become conscious, should it do so. Therefore, when we become conscious of our body it seems that we are conscious of it in ‘real time'. As we feel our weight shifting from the pushing foot to the stepping one in our walk, for example, our body concept is altered to match what we expect when the stepping foot hits the ground. Most of the time the prediction is so good that the real experience is more or less indistinguishable from our idea of it. Indeed, we don't have to take account of the external information at all, because we have already incorporated its effect into our concept, which ticks along, getting us about the place, without becoming active enough to be conscious.

However, if we put our foot down a hole and produce a sensory experience that clashes with the predicted concept of how our body is supposed to be at that moment, the concept has momentarily failed and needs very quickly to be re-arranged. To do that the brain has to extract as much ‘real' information as possible, in order to construct a more realistic internal model. At such times the neural representation of our body - caught in the act of incorporating new information into itself - becomes excessively active and flares into consciousness. Once the model is happily re-harmonised with the outside world it slips back into tickover mode.
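
For readers who like to see such ideas made concrete, the predict-then-correct cycle described above can be caricatured in a few lines of Python. This is only a toy sketch of my own devising - the one-dimensional 'body', the numbers and the tolerance are all invented for illustration, not drawn from any study:

    # Toy sketch: a body model that predicts a split second ahead and
    # only 'flares into consciousness' when the prediction fails.

    def predict(state, dt):
        """Extrapolate the modelled position a moment into the future."""
        position, velocity = state
        return position + velocity * dt

    def step(state, sensed_position, dt, tolerance=0.05):
        """Return the updated model and whether it reached consciousness."""
        position, velocity = state
        expected = predict(state, dt)
        if abs(sensed_position - expected) <= tolerance:
            # Prediction holds: the concept ticks over unconsciously.
            return (expected, velocity), False
        # Prediction fails (the foot-down-a-hole case): re-anchor the
        # model on real sensory input and flag it for conscious attention.
        new_velocity = (sensed_position - position) / dt
        return (sensed_position, new_velocity), True

    state = (0.0, 1.0)                            # walking at 1 metre per second
    state, conscious = step(state, 0.1, dt=0.1)   # matches prediction: False
    state, conscious = step(state, 0.02, dt=0.1)  # mismatch: True

Most steps return False: the model absorbs the incoming information silently. Only the mismatched step forces a conscious update, which is the pattern the paragraph above describes.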

The distinction between our conceptual body and the real thing is manifest most clearly in sleeping dreams or waking ‘out of the body' experiences. In sleep, the signals from the body that normally keep the two things - concept and reality - closely yoked are blocked off, so our dream body is just our concept, uninformed by proprioception or external stimuli. Hence our experience of it can float free of the flesh.

The most basic feature of our body concept is its boundary - where it begins and ends. Although the conceptual boundary is plastic in that, as in dreams, it can detach itself from the physical boundary, it is not easily breachable. We have an extremely strong concept of our bodies as whole, integral, undamaged entities, and it doesn't adapt easily to change. When someone loses a body part through amputation, therefore, it is common for them to continue to feel it, in the phenomenon known as phantom limb. When people with this condition say that they can ‘feel' their lost arm, what they are actually conscious of is the concept of that arm - which is still securely lodged in their brain.

Conversely, if the concept itself is partly lost, consciousness of the matching body part will be lost too. Stroke patients who suffer damage to the body map in their brain become partially paralysed, either because the signal pathways between their real body and their conceptual one are broken, or because part of the body concept is itself wiped out. In the latter case, patients seem to lose not only feeling and movement in the affected ‘real' body area, but also the sense of owning it. It moves beyond their body consciousness, becoming an object ‘out there' rather than an integral part of their selves.

The idea of our body - and the related concept of space - might seem to be the most deeply ‘plumbed-in' concept we have. But in fact there is another concept that is even more taken for granted by humans: that of time. Like space, it seems absurd to think of time as an idea. It seems just to be. But if that were the case - if time proceeded at its stately pace without any conceptual input - it would pass at the same pace for each of us, whatever our circumstances and whatever the condition of our brains. And that is not the case. Even in common experience we find that time does not proceed smoothly. Anyone who has ever been physically involved in or witnessed a traumatic event will know the sensation of time slowing down. Conversely, when we are tired and struggling to do the things that need to be done in the day, time seems to fly past us, leaving us constantly in its wake.

The notion of flowing time is encoded in a neural circuit in the brain fuelled by the neurotransmitter dopamine. Each ‘loop' of activity takes, on average, one tenth of a second to complete, and events registered by the brain within the duration of a single loop are experienced as a single occurrence. If the activity in the brain's time loop slows down, therefore, events get compressed in subjective time, so everything seems to go faster.
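
The arithmetic behind that claim can be illustrated with a crude sketch. The tenth-of-a-second packet is the figure given above; the event timings and the code itself are purely my own illustration, not a model of real neural circuitry:

    # Group externally timed events into subjective 'moments': every
    # event falling inside one packet is experienced as simultaneous.

    def subjective_moments(event_times, packet=0.1):
        moments = {}
        for t in event_times:
            moments.setdefault(int(t // packet), []).append(t)
        return list(moments.values())

    taps = [0.00, 0.08, 0.15, 0.22, 0.35]             # five external events
    print(len(subjective_moments(taps, packet=0.1)))  # 4 distinct moments
    print(len(subjective_moments(taps, packet=0.4)))  # a slowed loop: 1 moment

When the loop slows and each packet lengthens, the same five events collapse into a single subjective moment: more of the world is crammed into each 'now', so everything seems to move faster.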

Such variations in subjective time are not usually great enough to affect our ability to function, but if the timing mechanisms in the brain are severely disrupted by illness the effect can be disastrous. People with Parkinson's disease commonly have a completely different idea of time to everyone else, because their internal clock is slowed down by their insufficiency of dopamine. If you ask most people to say - starting at a certain moment - when they think a minute has passed, their answer, typically, will be to say ‘now' after about 35-40 seconds. Parkinson's patients (without medication) are likely to opt for a far longer duration.

The concept of time can be disrupted by damage to any part of the neural loop that comprises the brain's internal ‘clock'. One 66-year-old man, for example, found one day as he drove to work that the other traffic seemed to be rushing towards him at terrific speed. At the same time his own car seemed to be going unusually quickly. Even when he slowed down to walking pace it seemed to be hurtling along too fast for him to control it. He found that he couldn't watch TV because things happened too quickly for him to keep up with them, and he seemed to be perpetually tired. When doctors tried the ‘60-second' test on him, he waited nearly five minutes before saying ‘time up'. A medical examination revealed that the cause of his problems was a growth in his prefrontal cortex.

Subjective time may even stop altogether. Damage to the basal ganglia and/or frontal lobes sometimes produces a state known as catatonia, in which people may become ‘frozen', like living statues. Some such affected persons have been paralysed in mid-action, their hand outstretched as though to reach for something, or contorted into strange postures which they may hold - despite what would normally be severe discomfort - for days at a time. Although they do not appear to be conscious during this time, some patients have later reported that they had memories of it, but that their recollections lacked any sense of passing time and that their consciousness was utterly still and devoid of possibilities. A sense of timelessness - though starkly different to that of catatonia in that it seems full of possibility rather than empty - is also reported by people in meditation or trance.

At the other end of the scale, people whose brains are suddenly thrust into overdrive experience an acceleration in subjective time, with a corresponding deceleration of events in the outside world. This is what happens when excitatory chemicals flood the brain during terrifying experiences like accidents, or thrilling ones like a first parachute jump. Suddenly consciousness becomes very clear, with each tiny change in the environment noted and considered. Even when the experience is awful, it delivers an overwhelming sense of being alive.

The sharpening of consciousness experienced when the brain is excited gives a hint of how our concept of time dictates what we are aware of. Our normal idea of the present moment is equivalent to one of the ‘temporal packets' or ‘ticks' of the internal clock - around one tenth to one fifth of a second. Each tick is the time it takes for the current to run around a loop of dopamine-producing cells in the brain. All the information we process during that time-window is experienced as happening simultaneously. This is probably, at our human scale, the optimum ‘size' of the time packet which is available for us to make sense of things. It means that when a cup falls off the table next to us we see the object hit the floor at the same time as we hear the crash, even though - as light travels faster than sound - there is actually a minuscule gap between the visual stimulus entering our brain and the auditory one. It also allows us to ‘smear' time, fleshing out the subjective moment by squashing into it all the events that fall into a particular time packet.
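
The falling-cup example is easy to check with back-of-envelope arithmetic. In the sketch below the two speeds are standard physical constants and the tenth-of-a-second window comes from the text; the two-metre distance is just an assumption for illustration:

    # How far apart in time do sight and sound really arrive?

    SPEED_OF_LIGHT = 3.0e8    # metres per second
    SPEED_OF_SOUND = 343.0    # metres per second, in air

    def arrival_gap(distance_m):
        """Seconds by which the sound lags the sight."""
        return distance_m / SPEED_OF_SOUND - distance_m / SPEED_OF_LIGHT

    print(f"{arrival_gap(2.0) * 1000:.1f} ms")   # ~5.8 ms for a cup 2 m away

A gap of roughly six milliseconds is far smaller than a 100-millisecond time packet, so the two stimuli are fused into a single moment; the cup would have to fall more than thirty metres away before the sound lagged by a whole packet.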

The drawback is that each of our moments is slightly blurred. When we watch the beating of a fly's wings, we cannot see each individual flap because several of them happen in each of our time windows. The result is that we see a fuzzy haze rather than a clear outline of a moving wing. If our subjective concept of time was more fine-grained, allowing us to split each moment into many more parts, we would see things more clearly. That we have not evolved to do so is probably because such clear-sightedness would burden us with more information than we need. After all, what advantage is there to seeing the individual beats of a fly's wing? The things we need to discriminate most clearly are those that happen in seconds (animals moving or, today, cars bearing down on us) - not milliseconds. Just as there is no need for us to experience all the visual details that our brains detect unconsciously, time experience is most usefully cast in relatively broad brushstrokes. Only when we are faced with a life-threatening situation, or one which is wildly exciting, can we afford to ignore everything in the past and future and concentrate on the present moment. And when that happens our brains oblige by breaking the moment into more parts so each one can be separately scrutinised and dealt with.

In addition to the fundamental concepts that mould our experience, we each have a huge database of individual memories which give shape and colour to everything we perceive and imagine. Each one begins as a tiny ‘seed' of sensory experience. The concept of a dog to a baby, for instance, is probably no more than a particular sensation - the waft of hot doggy breath or a big moving shape. But over time it becomes more and more complex. The doggy sensation becomes ‘furry thing that goes woof!' and then grows to be hugely elaborate, incorporating ideas about different breeds of dog, the history of dogs, the biological characteristics of the family Canidae and so on. It may also be linked to other concepts: dog-days and dog-fights, doggie-bags and dogged people, dog-lilies and dog-ends, so that when any one of these ideas crops up it drags with it a whole host of associations.

Most of the things that happen to us get forgotten almost as soon as they happen. But some of them stick in the mind as memories, or act on existing concepts to alter them. The ugh experience of eating a bitter fruit may not itself be remembered, but it might leave a permanent mark by changing or elaborating an existing idea about apples.

The concepts formed by memorising certain experiences and conjoining them with related ones provide a massive database of knowledge that can be brought to bear on new experiences, and therefore affect behaviour. Take, say, a memory of being bitten by a dog. It will be bound in with existing memories of dogs - of hairy bodies, wet noses, Rover, and so on. Any experience that occurs thereafter which ‘hooks' onto these peripheral memories will therefore also bring to mind - consciously or not - the memory of the bite.

So a new experience of a dog will be attended by a certain degree of caution. There is, however, a limit to the usefulness of such knowledge so long as it can only be accessed by a reminder that is purely sensory (the experience of a wet nose). The concept that ‘dogs can bite' remains locked away until it is needed - right here, right now. It cannot be used to predict what might happen in the future, or what might be happening to someone else, someplace else. In order to make that concept of a dog biting readily available, on tap, it has to be encoded in some way that makes it, so to speak, ‘portable'. The meaning derived from the real event (ouch!) has to be extracted from the memory and put into a mental vehicle that caters for all events which contain that meaning. In other words, it needs to be symbolised.

The symbols used by humans to transport such experiences are words. Language - the structure in which words are embedded - can itself be considered as a concept. Rather like the body maps which need only to be ‘filled in' by physical exploration, the structure of language seems to be mapped in. You can actually see the parts of the brain where this language ‘instinct' is lodged. Wernicke's and Broca's areas make a discernible bulge (in right-handers) along the side of the left hemisphere. When these areas become active, around the age of two, children start to use language to communicate but - perhaps more importantly - they also start to use it to structure their inner world. Language provides a scaffold for thoughts which, without it, would be amorphous and fleeting. It allows us to crystallise ideas, to link them to other notions, to encode them in a way that makes them retrievable on demand, to project into the future, and to string thoughts together in a rational and communicable train.

Once an experience is attached to and encapsulated in a word, therefore, much of the sensory, ‘re-liveable' nature of that experience falls away, because we now have a way of conveying information (dogs can bite) and thus making it ‘useable', without having to recall the experience itself. We may even seem to forget the experience and be left with just an idea.

If ideas could be totally divorced from bodily experience we would be capable of imagining pure abstractions. When we think about things like dogs biting, without invoking a mental image of such a thing, it seems that this is what we are doing. This is not the case, however. Rather, it seems that everything we can imagine has some physical ‘presence' for us. Every word and every thought is connected to a bodily experience.

Bodily elements are easy to grasp when we think in images. After all, an image is a sensation. And even the faintest of imagined images is created, in part at least, by a replay of some previous experience, or a juxtaposition of several such experiences. Similarly, the meaning that we discern in music is conveyed through its sensuality - a musical score means nothing if we cannot translate it into imagined sound.

But what about, say, a chair? We don't visualise a chair every time we say the word - sentences are not like those dumbed-down TV documentaries where every word of the script has to be accompanied by the matching image. So it is easy to think that the word ‘chair' is used instead of a sensation, that the ‘felt' meaning of the object has been transferred into a symbol. That is not the case.

We learn new concepts by linking them to ones we already have. These conglomerations form categories - living things, for example, may be one category, man-made things another. And categories are organised like Russian dolls. ‘Furniture' for example, may be nested within the larger category of ‘man-made things', and ‘chairs' may be nested inside the ‘furniture' category. Types of chairs - a throne, say - will in turn be nested within the ‘chairs' category, and ‘the Bishop's throne' within the ‘throne' module, which is itself inside the ‘chairs' category.
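
Since the nesting described here is essentially a tree, it can be sketched as a toy data structure. The category names below are the article's own examples; the code is merely an illustration of the 'Russian doll' idea, not a claim about how the brain stores anything:

    # A nested category tree, with a helper that recovers the chain
    # of 'dolls' enclosing any given concept.

    TAXONOMY = {
        "man-made things": {
            "furniture": {
                "chairs": {
                    "throne": {"the Bishop's throne": {}},
                },
            },
        },
    }

    def path_to(concept, tree, trail=()):
        """Return the chain of categories enclosing a concept, if any."""
        for name, children in tree.items():
            here = trail + (name,)
            if name == concept:
                return here
            found = path_to(concept, children, here)
            if found:
                return found
        return None

    print(path_to("the Bishop's throne", TAXONOMY))
    # ('man-made things', 'furniture', 'chairs', 'throne', "the Bishop's throne")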

If each category is as ‘real' as any other, a child might first learn about their grandmother's rocking chair and then learn to ‘nest it' within the larger category of chair, working from the bottom up. Or in different circumstances they might first learn that there are objects called ‘furniture' and then learn to discriminate chairs. If the brain was working as a detached learning machine, it really wouldn't matter which concept came first.

But it does matter. The ‘chair' level categories (other examples are ‘tree', as opposed to ‘plant' or ‘oak tree', or ‘horse' as opposed to ‘animal' or ‘carthorse') are, in crucial ways, more ‘basic' to the brain. They are learnt first. They enter language before the others, and are identified faster by nearly everyone. They seem to comprise our ‘default' picture of the world.

One reason for this seems to be that this is the highest categorical level at which you, and I, and everyone else who uses ‘chair', find common meaning through our bodies. I can't imagine sitting in ‘grandmother's rocking-chair' if my grandmother never had one. Or rather, to do so I would first have to create an imaginary rocking chair, imagine it belonging to Grandma, then imagine myself sitting in it, which is quite a conceptual effort. And I can't intuitively know what to do with ‘furniture', because ‘furniture' could be anything from a bed to a bookcase. But say ‘chair' to anyone and they know how their body would interact with it, because you put your backside on the seat and bend your knees to interact (normally) with every sort of chair. Each member of the furniture category, in contrast, requires a different sort of motor action: lying down (if it's a bed), opening a door (if it's a cupboard), placing things on it (if it's a table), and so on. It therefore seems that the perfect example of any conceptual category is not the one that best encompasses all the others, as you might suppose, but rather the one that best exemplifies the way that everything in the category is physically experienced.

The way that we store and retrieve concepts also reveals their links to physical movement. Word knowledge is ‘stored' in the language areas of the brain. But when a person is asked to think of a particular word, it does not just ‘pop up' from the word bank. Brain imaging studies show that the word's meaning is ‘gathered' from wide-spread brain areas, including those that process sensations and plan movements in response to the object represented by that word.

Strange as it may seem, it is impossible even to think of a word without moving. Language-based thought (and most thought is contained in language) is accompanied by the beginnings of the motor actions required to articulate the words aloud. The area of the brain most closely concerned with speech production, Broca's area, is essentially a movement area - it triggers activity in the muscles that allow the lips, tongue and throat to produce sounds. When people read, even silently and alone, this area produces tiny contractions of those muscles, even if we long ago learned to stop our lips moving. And the amount of muscular activity is related not so much to the complexity of the words being uttered as to the amount of movement implied by their meaning. Reading a list of verbs produces more motor activity than does reading a list of passive words.

Furthermore, the movement is not limited to Broca's area. One study found that when people saw words relating to tools - things that they would expect to pick up and use - the part of the brain which would normally plan the body movements required to deploy the screwdriver or the hammer became active, as though the tool was right there in front of the person, just begging to be put to use. Symbols, then, may be partially abstracted - that is, taken away from the bodily sensations associated with them - but they are never cut off entirely from physical experience. The brain keeps them connected through its elaborate feedback system, by which concepts track back to the sensations and actions associated with them, and actions and sensations constantly form and update their symbols.

The brain's creation of ‘action plans' with regard to objects may be what it takes to make the objects meaningful to us, or perhaps even to make them visible, or imaginable. From this it would follow that if an object has no potential for physical interaction we simply could not form an idea of it. We may very well be blind to many aspects of the world in which we live, simply because we do not create an intention to interact with them.

It is impossible, of course, to point to things we cannot sense. We can think of such things, as we can think about the possibility of other dimensions, but even in imagination we cannot experience them. There may be surfaces we fail to see because we cannot stand on, move across or place things on them. The surface of a bank of hot air in the sky, for example, may be clearly visible to a gull, looking to hitch itself a ride on a rising current. The fact that it is not clearly visible to humans is not necessarily just a matter of not having evolved the right sensory equipment - experienced glider pilots can ‘read' the sky in a way that others cannot, just as a fisherman sees subtle changes in the sea which escape landlubbers. Rather we do not have the concept required to recognise these things, in the way that a person with neglect does not have the idea of the part of space to which they are blind. The only sort of awareness we can have of these ‘hidden' things is what we can derive from existing, ‘near-to' concepts, much as a blind man might get an idea of ‘red' by thinking of the rich tone of a bassoon. It is only an approximation, though. To be conscious of the real thing you would have to construct an action schema that involved preparing your wings to glide on the surface. And that is beyond normal human imagination.

What, though, of abnormal imagination? When people diverge from the consensual view of reality we regard them as either mentally deranged or gifted visionaries, depending on whether their behaviour is socially acceptable or disruptive. The hallucinations, delusions and bizarre behaviour associated with psychoses were once deemed to be supernatural in origin - the work of God or the devil - and in some cultures they still are.

Eccentric ways of seeing the world occur when a person's brain puts together the raw materials of perception in an unusual way, due to some physical abnormality. The physical difference may be gross - a (literal) hole in the head, like poor Phineas Gage, or microscopic, a matter of molecular changes in a handful of brain cells. Such abnormalities occur for all sorts of reasons. The person's brain may have been damaged by injury or disease or affected by some abnormality in another part of the body. It may have developed in an unusual way due to a rare genetic constitution, been changed by some extraordinary event, moulded by weird cultural conditioning, or temporarily altered by a drug. Whatever the historical reason, the immediate cause of a person ‘losing touch with reality' is an abnormality in the structure or functioning of the nervous system.

Given that the normal way of seeing things evolved because it best equipped us to survive, one would expect any departure from the norm to be dysfunctional. But this is not always the case; sometimes abnormal biology creates a view that enriches us all. Einstein, for example, ‘saw' the space-time manifold in an act of imagination that preceded the laborious process of working out the mathematics that confirmed the intuition. One reason for his genius was probably the missing groove in his brain which allowed information to flow between neurons that would otherwise be separated. Several great artists did their best work during periods of mania, a condition associated with changes in the neurochemistry of the brain, and deliberate alteration of brain function through drugs inspired the Romantic poets to some of their most celebrated work. So, although human imagination may be limited by biology - biology is pliable. One day, perhaps, we will learn how to alter it in such a way as to create any experience we desire.

This article first appeared in Human Nature - Fact and Fiction - Continuum Books 2006.

Architecture and the Brain

In 2002 I was thrilled to learn that "Mapping the Mind" had been adapted as a course in neuroscience for students at the New School of Architecture in San Diego by John Eberhard, Director of Research Planning for the American Institute of Architects. John is one of the first professionals to recognise the huge practical applications of brain science, and he has now founded the Academy of Neuroscience for Architecture to help realise them within his own field. He has also written a book dedicated to the topic, for which I wrote this foreword:

Read more

Scientific revolutions - from the first inklings of a discovery through to its practical applications - take time. Major changes in our understanding of the natural world typically begin like a late-night drunk tottering home by the light of a match. There are false starts and time-consuming detours, misread signs and lengthy periods of inaction.

Neuroscience shuffled along for centuries before it took off. The scientific study of the human brain dates back to the ancient Greeks and a few philosophers of that time recognised that the brain was the organ of thought. But the notion that this unprepossessing lump of flesh could generate our entire subjective universe was slow to catch on. Unlike the heart's vigorous pumping and the regular inflation and collapse of lungs, the subtle mechanisms of the brain were invisible and, until quite recently, unimaginable. Even the most powerful microscopes gave no clue as to how the brain actually worked, so it was difficult to know how to study it. Anyone interested in human cognition and behaviour did better to study those subjects directly without worrying about their physical substrates, or even acknowledging that there were any.

In the 19th century, however, various heroic attempts were made to identify the physical location of mental phenomena. Franz Gall, for example, tried matching up bumps on a person's skull (thought to reflect underlying "organs" of the brain) with their personalities. Phrenology, as his method was known, was nonsense, but the idea was not, in principle, so crazy - people with particular skills or exaggerated character traits really do have more tissue in corresponding brain areas. You wouldn't know that, though, just by feeling their head. The researchers who discovered the first real correlations between brain areas and functions such as speech, understanding, memory and movement, did so by far more difficult means.

One was by opening up a person's skull, prodding the underlying brain and quizzing the patient about the effect. The other involved finding the victim of a "natural experiment" - someone who had suffered brain injury resulting in a clearly manifested behavioural change - then waiting until they expired and examining the dead brain. The first avenue was open only to a few surgeons and the second only to those who were exceptionally patient. One of our longest and arguably most pointless intellectual detours - Freudian psychoanalysis - probably came about because its originator found his original career as a neurologist such slow going.

Recently, though, neuroscience has found its feet and has been sprinting onward and upward at a breath-taking pace. The sudden acceleration and expansion of the field was sparked, in the early 1970s, by the invention of functional brain imaging technology. This made it possible, for the first time, to observe a living, working brain and to match its processes with the sensations, thoughts and behaviour of its owner.

The first smudgy pictures of working brains gave little hint of how far and fast this line of exploration would take us. Imaging machines got better, and their use proliferated like wildfire throughout the ‘80s and ‘90s. The corresponding growth of the Internet, and increasingly open access to others' findings, allowed anyone with an interest to learn more about the brain in a few months than our forebears could have learned in a lifetime. Yet, right up to the Millennium it was common for commentators - even those who were themselves working at the cutting edge of neuroscience - to play down its potential. They conceded that looking at how brains work might tell us something about the more mechanical aspects of human behaviour - what happens when we move a finger, say, or how we distinguish the sight of a face from the sight of a house. But it would never reveal the secrets of our more complex, essentially "human" faculties. According to the received wisdom, such things as the nature of human love, our sense of morality or appreciation of beauty are essentially mysterious, or at least too complex and subtle to relinquish their secrets through scientific investigation.

Increasingly, this view is fading. In the last few years neuroscience has provided intriguing clues to (some would say explanations of) experiences as diverse as empathy, religiosity, sexual attraction and aesthetic judgement. Very few people, in or out of the field, would now confidently declare that there is any area of human experience that will not eventually be correlated with the physical processes that underlie it.

Observing our brains in action has helped clear away some of the myths and mysteries which previously shrouded the subject of human consciousness. In this respect the neuroscientific revolution has dovetailed with the cultural transformation begun by Darwin: where once we saw ourselves as something apart from the natural world, now we are forced to recognise ourselves as part of it: less divine spark, more awesomely complicated biological machines.

Science is not just about enlightenment, however. It also has the potential to bear fruit. No revolution can truly be said to have occurred until the knowledge it has delivered is put to use in some practical way.

John Eberhard was one of the first to envisage how neuroscientific findings could inform and enrich his own profession - architecture. In 1998 my book "Mapping the Mind" - a non-technical overview of brain research - was published in the US and I was delighted, shortly after, to learn that Eberhard had adapted it into a neuroscience course that he was teaching to students at the New College of Architecture in San Diego. I subsequently met some of these students when they were midway through the course, and had the satisfying experience of seeing how the relatively "raw" information about neuroscience that I had helped provide was being transformed, in their fresh and flexible minds, into a subtly new approach to design.

Essential to this approach is the notion of interaction between the built environment and the people who use it. This is not itself a new idea - the effects of the environment on behaviour have been studied formally by architects for a quarter of a century, and informally, one imagines, since our ancestors first fought for the cave with the best view. But until now environment-behaviour studies have depended on observing how people react to the built environment after it is built. In other words, architects have had to intuit the effect of their design, then find out later if they got it right.

Neuroscience allows that process to be reversed. It has revealed our subjective responses to the material world at a much more profound level than that of conscious likes and dislikes. Knowing about these largely unconscious reactions gives architects the wherewithal to make better predictions about the effect of their designs, and to assess those effects in a much more detailed way.

Take, for example, the visual effect of a building. As Eberhard points out in this book, buildings have a very direct effect on our emotions. They can be depressing or uplifting, soothing or surprising, welcoming or forbidding. Architects have always known this, and the best of them have had a pretty good idea about which particular features of a building produce these effects. What they have not known - what no-one until now has known - is why certain visual effects produce particular subjective experiences. What is that shape, that colour, that curve, that angle doing to our brains? What is the process that turns the sight of one façade into a feeling of pleasure and the sight of another into anxiety? These questions are not purely academic, because knowing why a particular stimulus produces a certain effect often points to how that effect can be enhanced or diminished.

Consider the way that we respond to the sight of other people. If we see someone dancing, our own toes start to twitch, and if we see someone smiling at us we feel a sense of relaxation and the urge to approach them. The reason for these responses might at first seem obvious: we have learned what it feels like to dance, so the sight of someone else doing it "jogs our memory" and prompts us to do it ourselves. Similarly, we have learned to associate smiling faces with pleasant social encounters, and therefore the sight of one leads us to anticipate another such encounter. In fact, brain research has shown that none of this elaborate cognition has to take place in order for those effects to occur. Our reaction to a dancer or a smiling face is "hard-wired" into the brain and takes place before we are even conscious of the sight, let alone have had time to think about it.

The physical mechanism that brings about this swift social symmetry is called the mirror neuron system. Mirror neurons are dual-effect brain cells: they fire in response to the sight of someone else doing something, and they also produce - or start to produce - that very same action in the person who is making the observation. Hence, when you see someone smiling, the mirror neurons which respond to the sight of the expression also start to contract the little muscles around your mouth and eyes which put a smile on your own face. This, in turn, feeds information back to other areas of your brain which generate the state of mind appropriate to the expression. So, through an entirely mechanical process, two people's states of mind come to mirror one another. Neither has to have a single conscious thought.

The fact that all this happens unconsciously suggests that our brains can produce quite significant changes of mood by processing just one limited part of a stimulus rather than the whole context-rich scene. The effect of a smile, for example, does not depend on our recognising and liking the person who is smiling - we respond to the expression (albeit very briefly) even if it is on the face of Hitler. Similarly, we might feel a few dance steps coming on when looking at an entirely abstract piece of art. Marcel Duchamp's famous Nude Descending a Staircase (1912), for instance, has been found to stimulate, in some people, mirror neurons in the area of the brain concerned with motion. As far as the brain is concerned, the angular lines that make up the painting are interpreted - as the artist obviously intended - as a moving human.

As Duchamp and many other artists have demonstrated, you don't have to know about mirror neurons to know that certain patterns and shapes produce empathetic reactions in an observer. But now that we do know it, it is easier than before to see what further research needs to be done in order to use the mirror neuron mechanism to reliable effect in building design. One avenue of research, for instance, might be to discover precisely which features of a motion-inducing image (such as Duchamp's nude) stimulate mirror neurons, and then to design architectural features derived from them. Something similar could be done with an expression such as a smile: is it the angle of the mouth, or its width in relation to the rest of the face, or some other feature that - as far as the brain is concerned - makes a smile a smile? Having extracted the salient information, architects might then incorporate these features into their designs in order to achieve the effect of a "smiling building".

I doubt very much that neuroscientific findings will ever usurp intuition and inspiration as guiding principles within architecture, and that may be just as well. But this book, and its author's continuing dedication to the cause of introducing brain science to his profession, will, I am sure, soon establish it as an important tool. After centuries of alienation, art and science are starting to converge in all sorts of fields, and John Eberhard's initiative within the architectural world reflects the new spirit of openness and curiosity that has brought this about. It also marks the beginning of the second era of the neuroscientific revolution - the one in which brain research emerges from the laboratory to become a practical tool for the enhancement of our day-to-day lives.

This article first appeared as the foreword to Architecture and the Brain: A New Knowledge Base from Neuroscience by John P Eberhard - Greenway Communications

The Curious Case of the Alien Hand

"Alien hand" or "anarchic limb" is sometimes called "The Dr Strangelove Syndrome" after the Peter Sellers character whose right arm seemed to have a life of its own. Real life Dr Strangeloves have fascinated psychologists and neurologists (and practically everyone else who comes across them) for more than 100 years.

Read more

"In living skills M.P. was making good progress with an omelette when the left hand "helped out" by throwing in, first, a couple of additional, uncracked eggs, then an unpeeled onion and a salt cellar into the frying pan. There were also times when the left hand deliberately stopped the right hand carrying out a task. In one instance I asked her to put her right hand through a small hole. "I can't - the other one's holding it" she said. I looked over and saw her left hand firmly gripping the right at the wrist" - Patient study, Alan Parkin Explorations in Neurospychiatry Blackwell 1996 .

Can you imagine how it would feel to be out of control of one of your hands? To watch, helplessly, as it undid your shirt buttons seconds after your other hand had done them up, or plucked goods you didn't want from supermarket shelves and placed them in your pocket? Worse - think of reaching out with one hand to give your lover a gentle caress, only to see the other hand come up and deliver a right hook instead. All of these things have happened to sane and otherwise quite normal-seeming people. Such events are described, in the dry jargon of medical reportage, as "inter-manual conflict". Between themselves, researchers call it the "alien hand".

Alien hands arise in people who have suffered injury to one or both of two brain areas. One is the supplementary motor area, a region of cortex on top of the brain towards the front. The other is the corpus callosum. Some people with alien hands - like the woman with the culinary problems above - have had a haemorrhage or stroke. Most cases, however, involve people who have undergone split-brain surgery.

Each hemisphere has local control over its own physical realm - mainly the longitudinal half of the body on the opposite side to itself. So if the right leg is to be extended, it is the left hemisphere which instigates the movement, and vice versa. Overall control, however, is vested in the dominant (usually left) hemisphere - it is here that the decision to extend the leg is made in the first place. The left brain exercises control by sending commands, mainly inhibitory ones, to the right hemisphere via the corpus callosum. The system makes for smooth running - there is only room for one boss in a single skull.

Sever the connection between the hemispheres, however, and in certain circumstances the command system breaks down. In split-brain patients the inhibitory messages cannot pass from hemisphere to hemisphere, but most of the time this doesn't matter because the two hemispheres are so well versed in their respective roles that things carry on as normal. Occasionally, though, the non-dominant hemisphere seems to decide that it should be involved in something which is already being handled quite satisfactorily by the left hemisphere, and without its usual line of communication the left brain has no way of stopping it from acting. The two sides can therefore find themselves fighting - literally - for control.

One woman whose brain had been surgically split found, for example, that it often took her hours to get dressed in the morning because her alien hand kept trying to dictate what she should wear. Time and again she would reach out with her right hand and select an item from the closet, only to see her left hand whip up and grab something else. Once her left hand had got hold of something it would not let go, and she, of course, had no way of making it obey her conscious will. Either she had to put on the clothes she was clutching or call someone to help her prise her fingers open. Interestingly, the clothes selected by this woman's alien hand were usually rather more colourful and flamboyant than those she had intended to wear.

Another patient had a hand that insisted on pulling down his pants immediately after his other hand had pulled them up. A third found that his alien hand unbuttoned his shirt as fast as his other hand could fasten it. M.P., the woman whose hand chucked uncracked eggs into her omelette, had to put half a day aside to pack when she went away, because her alien hand would systematically remove each item from her suitcase just after she had put it in.

Most alien hands are merely irritating or comical. As one patient put it: "It feels like having two naughty children inside my head who are always arguing". Occasionally, though, alien hands seem to be intent on more than mischief. One man reported reaching out with his right hand to give his wife a hug, only to see his left hand fly up and punch her instead. M.P., too, sometimes found her alien hand would prevent her other hand from making affectionate gestures - her husband was often subjected to a tug of war as one hand reached out to embrace him while the other pushed him away.

Despite this, alien hands rarely do anything seriously violent, and the world still awaits the first case of murder to be defended on the grounds that 'it wasn't me who did it - it was my hand'. Some people with alien hands have nevertheless become terrified that they might unwittingly do something catastrophic. One poor man, for instance, was frightened of going to sleep in case his hand strangled him in the night.

It is very tempting to think of the antics of an alien hand as the expression of an anarchic unconscious, let loose by the surgeon's knife. This notion chimes in beautifully with popular conceptions of neurosis - the idea that beneath our rational surface lies a naughty, child-like other self which is held in control only by some kind of cerebral police force.

However, florid psychological explanations may not be required to explain why alien hands seem to act in direct opposition to their sensible twins. It could be that, in a (literally) maladroit way, these wayward limbs are really just trying to be helpful.

The supplementary motor area (SMA) - the region which, in addition to the corpus callosum, is implicated in alien hand - springs into action when the brain prepares to execute complex volitional bodily action. It does not actually trigger the action itself - instead it acts rather as a motor executive, sending "move it" signals to the neighbouring motor cortex, which in turn sends the "get moving" message to the appropriate muscles. As with all other brain parts, the SMA is cross-wired - the left side controls the right side of the body and vice versa. Brain scans show that, in a normal brain, the SMA on both sides is activated even when action is consciously planned for only one side of the body. The activation on the side which is not actually going to move is pretty weak, but it may be enough to cause movement unless it is stopped. Normally this inhibition comes from the SMA on the side which is actually meant to move - it sends a message to its opposite number which reads, effectively: "Do not carry through .... leave this one to me."

This message passes through the corpus callosum, so in split-brain patients it does not get through. As a result both SMAs send "move it" messages to their respective limbs, even though the conscious brain planned to move only one.

Why, then, do alien hands always seem to work against a person's conscious will, rather than in service of it? One possibility is that the seemingly mischievous antics of alien hands are not designed that way at all - rather, they doggedly undo everything the other hand does because that is all that is left for them to do.

Say there is some simple task to be done, like opening a door. The dominant hand duly does the deed. Then the alien hand - dragging along behind, as it always will - arrives on the scene. The task it came to help with has been done. But the hand "knows" it was sent to do something in the area and - without the leadership of a conscious, thinking mind - it does the closest thing there is to the open-door manoeuvre it came to perform: it closes the door.

If this is correct, it raises the question of why split-brain patients do not find both hands - and legs, for that matter - taking part in everything they do. In fact even the most active aliens are happy, most of the time, to cooperate with their neighbouring body parts. The author of the case study of M.P., the frustrated omelette-maker, noted:

"I was struck by how relatively infrequent her alien hand incidents were. I viewed with some trepidation her offer to pick me up at the station but was surprised by a smooth driving performance. Similarly during my first meal with her I half expected her aberrant hand to start flicking peas at unsuspecting patients. This never happened."

In fact M.P.'s hand, like most aliens, usually only started its tricks when the task in hand was a one-handed on/off job: opening/closing, pulling up/pulling down and so on. Nor did it happen when the action was a well-practised routine - using a knife and fork presented no problems, for example, and neither did swimming. This might be because very familiar procedural actions become embedded in the subcortical areas of the brain, and knowledge encoded in this way can pass from one side of the brain to the other via connections other than the corpus callosum.

Neat as this explanation is, it remains somehow rather unsatisfying. The idea of the alien hand as an emissary from some deeper level of the psyche has a compelling attraction which no amount of scientific illumination is likely to dispel entirely.

This article is adapted from Mapping the Mind.

Things that go bump in your brain

(Some of) the science behind ghoulies and ghosties, doppelgangers, vampires, werewolves, out-of-the-body and near-death experiences, succubi and incubi, and all the other spooks that haunt us.

Read more

Tales of the supernatural have changed little down the ages - they always feature the same clutch of spine-chilling phenomena: vampires and ghosts; werewolves and zombies; astral bodies, doppelgangers, alien visitors and demonic visions.

Sceptics may deride such things as superstitious nonsense, but the fact that they endure, even in today's material age, suggests that they may be rooted in real experiences. But are they really glimpses into a world beyond? Or can they be explained by science?

Take ghosts. A Lancet survey showed that 12% of a group of perfectly sane and sensible people admitted, on questioning, that they regularly had ghost-like hallucinations. These particular people had eyesight problems, but other studies have suggested that seeing spectres may be nearly as common among people with perfectly good eyes.

The condition is known as Charles Bonnet syndrome (CBS), after the Swiss philosopher who first described it. People with CBS know perfectly well that what they see is not real - but most of them keep quiet about their visions because they don't want to be thought dotty.

The Lancet study found that 80% of the subjects saw visions of other people - some of them familiar, others complete strangers. Some of the spectres seemed to sit, firmly, in real chairs; others floated, ghost-like, above floors and through walls, and a few seemed to be getting on with their normal day-to-day business. Pets and inanimate objects featured regularly too, and some subjects saw entire scenes. One woman, for example, reported watching cows grazing in a field opposite her house. It was the middle of winter, and she remarked to a friend who was with her that the farmer was cruel to leave the beasts out in such cold weather. It was only when her companion pointed out that the field was in fact empty that she realised the cows were phantoms!

One explanation for these sightings is a phenomenon called eidetic, or photographic, memory. It is very common in children - about 50% of under-10s have the ability to remember things they have seen so well that if, for example, they are shown a picture of a zebra, they can later reconstruct it in their mind's eye and actually count the stripes along its back. The imaginary playmates which people the lives of so many young children are probably created in this way.

The ability to create crystal-clear visual memories tends to be lost as we grow up, but some people seem to hang on to it. One woman with a spectacularly good eidetic memory complained that she was haunted by the ghost of her dead father. Experiments showed that when she looked at the "ghost" her brain registered the image precisely as though it was really there. If it appeared in front of a (real) light, for example, her brain did not register the light rays because her "ghost" blocked them out.

Doppelgangers

Perhaps the most terrifying type of spectre is the one that looks like you. Doppelgangers - spectral images of yourself - are seen in nearly all cultures as evil, or as a portent of death. In fact they are produced by a trick of the brain called autoscopy.

People meet their doppelgangers in normal waking consciousness, and they usually appear as a perfect mirror image. One woman first encountered her other self when she came home from her husband's funeral. As she opened her front door she was confronted by herself doing the same thing in reverse. Her spectral twin dissolved after a few minutes, but the next day it returned, and it soon became a regular - though unwelcome - companion. "Whenever it appeared I felt myself grow shaky and cold," she says. "It was as though all the life was draining out of me. As soon as it disappeared I felt the warmth come back. Eventually I learnt that seeing it was a signal that I was tired and stressed and needed to lie down and sleep. When I did that it would go away."

Another man was so irritated by the continual company of his other self that he took to making faces at it. "People thought I was a nutter, grimacing at nothing," he says. "If I had told them what I was really doing, I dread to think what they would have thought!"

Some people who are constantly haunted by their other selves get so angry that they try to attack them. Suicide is closely associated with doppelganger "hauntings", and it may be that some of these suicides were, in fact, due to people trying to murder their other half - a tragic situation echoed in the classic horror film "The Man Who Haunted Himself".

Neurological research suggests that doppelganger sightings may be a rare side-effect of certain types of epilepsy. They are also seen quite often in people who have suffered damage to the back part of the brain, near the area which processes visual stimuli.

Out-of-the-body experiences

Out-of-the-body experiences (OBEs) also seem to give you a glimpse of yourself - usually from above. About one in four people say they have floated out of their bodies at some time and, unlike those who meet doppelgangers, most find the experience quite pleasant.

Melvyn Bragg had OBEs as a teenager, and he later wrote them into his novel "The Maid of Buttermere", imbuing them, typically, with spiritual significance: "It is as if a distinct part of me - is it my soul? - leaves my body entirely and hovers above it looking back on this vacated thing of flesh, blood, bone, water....."

Other people have described a sudden wrenching upwards... or have simply found themselves gazing down on themselves. Jennifer, a professional musician, once had an OBE while she was performing in a concert: "I suddenly realised I was up above the stage, watching myself play. It went on for nearly ten minutes and throughout it all I noticed that, technically, I played better than I had ever done before..."

Sometimes OBEs happen during normal consciousness, but at other times they come on during sleep. The sleeper feels as though they have woken up... but their body remains pinned to the bed while their ghostly self floats up from it. "The first time it happened to me I was terrified," recalls Alison, now a practised astral projectionist. "I tried desperately to get back into my body and to wake up... but I couldn't move. Eventually I seemed to be sucked back in and I woke up with a start."

In folk myth that frightening feeling of being pinned down is blamed on an evil demon which sits on people's chests. The female version is called a succubus, said to suck men dry; the male version is an incubus - a demon which, as in "Rosemary's Baby", rapes and impregnates sleeping women.

What actually happens is that when we are asleep the brain paralyses our muscles in order to stop us acting out our dreams and injuring ourselves. Sleep paralysis usually switches off the moment our conscious minds switch on, but sometimes mind and body get out of synch and the paralysis lingers after we have woken up.

Flying dreams are similar to OBEs, and both are probably related to near-death experiences (NDEs), in which people report leaving their body behind and floating down a dark tunnel towards a bright light. NDEs are often reported by people who temporarily "died" on the operating table - and this may be because they are triggered by certain types of anaesthetic. When it is starved of oxygen the brain produces chemicals which protect its cells from the effects of asphyxiation. A side-effect of these chemicals is to produce the depersonalised, euphoric state of mind reported in NDEs, and some anaesthetics contain substances which mimic these chemicals.

The bright light which is so often reported may be the result of accompanying changes to the retina. As the cells in the eye start to die they fire at random, and each firing sends a light stimulus to the brain. The area with most cells - and which is therefore likely to produce most light - is right in the centre. So the effect may be very similar to seeing a bright light at the end of a dark tunnel.

Sci-fi classics like "Invasion of the Body Snatchers" and "The Stepford Wives" may be inspired by a curious condition called Capgras syndrome, in which people believe their nearest and dearest have somehow been "taken over".

It is among the most terrifying and destructive of all psychiatric disorders.

One woman woke up one day convinced that the man lying beside her was a stranger who had somehow installed himself in the body of her husband. Another patient was so convinced that his father had been replaced by a robot that he tried to slash the imposter's throat open, looking for the wires.

Capgras syndrome is thought to stem from a disturbance in the pathway which shunts stimuli to the part of the brain which registers familiarity. When we see someone we know, we register it both intellectually - by sending messages to the cerebral cortex, where we do our thinking - and emotionally - by sending messages to the deeply buried limbic region of the brain, which is the seat of our feelings. The French acknowledge these two ways of knowing in their language: savoir means to know intellectually, whereas connaitre means to know in the emotional sense. People with Capgras syndrome are fully capable of knowing in the "savoir" sense, and they can tell that a person looks the same as ever. The person just doesn't feel the same - and in an effort to square the conflict the brain comes up with the "imposter" explanation.

Vampires have generated an entire horror industry of their own. At the last count there were more than 300 novels and 200 films about them, plus countless journals, newsletters, comics - even cookbooks. But vampires are not entirely fictional - more than a thousand people worldwide claim to be the real thing, and clinical vampirism is recognised as a definite psychological disorder. The symptoms include an incomplete sense of identity, a morbid fascination with death and a sexual craving for blood. Some vampirists (as they are known) injure themselves in order to get their "fix" of blood. One psychiatrist reported the case of a man who did this so often that he needed repeated blood transfusions.

The origin of the vampire myth is lost in time, but one theory is that it grew up around people who suffered from porphyria - the disease which caused the madness of King George III. As well as causing peculiar behaviour, the symptoms of porphyria may include anaemia and emaciation (vampires are traditionally thin and white-faced), extreme sensitivity to light (vampires have to be back in their coffins by daybreak), bloodshot eyes and - in extreme cases - pink-tinged teeth. The disease disrupts the body's production of haem, the iron-containing pigment in blood, and some researchers have even suggested that sufferers may have tried to cure themselves by taking iron-rich blood from other people...

Lycanthropy is a rare psychiatric condition in which people believe they have been turned into werewolves. Researchers at Harvard University recently collected 12 cases of people who thought they had been transformed into animals and found that, unlike vampirism, the condition only afflicted people who were severely psychotic. Along with the wolves there were two people who thought they were cats, one who thought he was a bird - and a gerbil.

This article first appeared in The Evening Standard.
