Wednesday, March 21, 2007

Humanizing the Brain

Humankind has persistently probed the mysteries of our residence on this planet. One of the confounding puzzles is where the centre of our existence lies: do we operate from something inside us or from something outside ourselves? Though our history is rife with ideas that something other than the self controls and commands us, mythologists point out that even the Other is a psychological projection of our own demands for a better life. In fact, for something to exist, it requires human consciousness to contemplate it and mould it into an imperfect pattern for others to grasp, “because humankind cannot bear very much reality.” So there emerged an inner mind, which was to reward us with a swelling understanding of the outside world.

But where is the physiological seat of the human mind? Aristotle (384-322 BC) thought it was in the heart. But the Greek father of medicine, Hippocrates (c. 460-370 BC), was the first to proclaim – in his On the Sacred Disease – that a disease like epilepsy was not some divine punishment but something that had its origins in the human brain. Thus a struggle was sparked over the source of a disease: is it the physiological brain or the psychological mind? In a sense, mental illnesses solved this paradox.

The cure of disturbed mental states like derangement and lunacy spurred the emergence of numerous theories over the centuries. Some theories located their source in Satan; others thought they stemmed from a maladjustment of the mind to the world. By the 21st century, medical practitioners have adopted a careful course of treatment that mixes mind and brain. Every mental illness can be traced back to physiological damage to the brain, to genetic anomalies, or to psychological disorientation caused by the patient’s surrounding environment.

Yet most scientists today tend to think that we are nothing but our brains. Because it is the cognitive demand of science to measure everything objectively, scientists concentrate their expertise on reducing everything down to a single source of operation: the brain. Neuroscience especially has played a major role in isolating the human brain and explaining everything from motion and emotion to consciousness through it. It is undoubtedly true that without the intervention of the brain, none of our bodily processes is fully functional. But equally, the brain requires a body to go about its business.

The discipline of the humanities has blossomed with an acute awareness of human selfhood and the ways that shape it. Neuroscience – by giving a physiological explanation to our inner responses to the outside world, like emotion and memory – tries to remodel and reorganize the rich understanding that the humanities have gifted us. In the process, neuroscience virtually tries to explain the humanities away (though this may not be its intention). I don’t mean to belittle either of these critical disciplines, because each has its rightful place in our levels of perception of reality.

What I wish to do is use two major neuroscientific preoccupations – memory and behavior (especially pain) – to understand how this cutting-edge research may affect the way we do the humanities. I will try to raise pertinent questions that this research may directly or obliquely pose to the humanities.


Neuroscientists try to clamp down on every perception and net out its neural pathways. Does this diminish the importance of pain? Pain doesn’t vanish because we have successfully reduced its mechanism to neuronal activities. By explaining pain mechanisms, we don’t diminish the pain of “pain.” The problem arises when neurobiologists rabidly reduce every known manner of pain to a matter of neuronal firings. Yet, as the University of Iowa neurologist Antonio Damasio suggests, a fair degree of reduction is essential to alleviate the suffering of others – to the extent that we don’t fail to see the human as a complete organic being with an embodied brain.

Genes, as we now know, cannot be counted on to be single stretches of DNA, but may involve distinct segments of DNA. Additionally, a given gene may participate in different macro properties as a function of prior conditions. Nevertheless, many molecular biologists see their explanatory framework as essentially reductive in character, mainly because a causal route from base-pair sequences to traits in debilitating diseases such as Huntington's can be traced or at the very least, sketched in outline.

Given the messiness and incompleteness of reductionistic techniques, a warning must be served not to explain away everything that such techniques sweep under their carpet. The brain has an unimaginable neurobiological complexity that can scarcely be explained, let alone understood. Ultimately, it is up to the discerning wisdom of neuroscientists to gauge the reach of their research and to speak sanely and humbly about their degree of reductionism.

Goethe, the German poet-philosopher who helped inspire the movement called Naturphilosophie, suggested that there is an organic, dynamic, non-mechanistic force (natura naturans) driving us. This energy-force cannot be substituted by some mechanical functional force in the human brain. By positing dynamism as the energy of the human soul, the movement couldn’t accept any reductionism. Essentially, the mind has to be viewed as a special and spontaneously proactive system, as opposed to a passive product of electrochemical forces. Brain cells work as autonomously functioning units that relate and connect to the complex wholes that emerge out of the coming together of cells.

Though there is no such movement nowadays, there have been interesting debates between neuroscientists and philosophers. One famous exchange took place between the French neuroscientist Jean-Pierre Changeux (b. 1936) and the quintessential philosopher Paul Ricoeur (1913-2005), published as What Makes Us Think? Changeux, a self-declared experimenter-reductionist, nonetheless acknowledges that outside the lab, reductionism is inadequate. He exposes the findings of modern neuroscience – at all levels, from the molecular to systems theory – to the scrutiny of his opponent.

Ricoeur replies that, however valid these neuroscientific stories may be within their own context, they have nothing to say about the experience or relevance of conscious thought, of human agency, of ethics and moral responsibility. This has been the broad attitude on either side, with consensus an accidental outcome. But debates must take place not to prove the worth of the parties but for the audience to enfold such patterns of thinking into a new, more intense pattern of understanding.


Memories evoke such strong responses that Marcel Proust’s (1871-1922) enchantment with a cake empowered him to write the seven-volume Remembrance of Things Past. The struggle to remember is evident in literature, where narrative techniques like analepsis (flashback) and prolepsis (flash-forward) are impressively invoked to grip the reader’s mind.

Memory is so important to us, and yet one cannot localize a single seat of memory in the brain. A world-famous neuroscientific case was that of a young man, known only as HM, who was operated on for epileptic fits, during which large portions on both sides of his brain were removed. As a result, HM was terribly stripped of his capability to form new memories. Though this loss could be attributed to the loss of key brain regions, it was later found that memory is organized in different ways, with various regions of the brain assigned as the storehouses of different kinds of memory. Memory is usually of two kinds: short-term and long-term. Short-term memory deals with the immediate recollection of recent experience, like holding a phone number in mind while jotting it down.

Long-term memory serves our overarching need to retain the past and important ideas. It is divided into episodic and factual memory. Episodic memory recalls events that happened in a certain location at a specific time. Factual memory recalls facts and ideas relating to persons and places, words and things.

The locations of these varieties of memory can vary greatly, but one can tentatively place their stores. The human brain is broadly and regionally differentiated, and brain imaging studies certainly show specific areas of the brain that light up when specific mental functions are in process. However, the relation between the areas and their designated functions is complex. There is, for example, no “memory bank” corresponding to computer memory. Damage to the hippocampus interferes with the ability to retain new long-term memories, but those memories are not encoded by specific neuronal connections; over time, other brain regions and connections become involved in specific memories.

There is no process of build-up and slow loss of fixed wiring connections among neurons in the brain that corresponds to mental states. Certainly, new neurons and synaptic connections are being produced and dying throughout life, with, alas, a preponderance of losses as we grow older; but these births and deaths are not in some one-to-one correspondence with events remembered or with individual mental processes. There is no fixed physical module corresponding to the ability to do long division or to remember pi to six decimal places.

Just as new neurons become physically involved in old functions and memories, neurons already present can increase their connections to other neurons over time and thus become involved in a new multiplicity of pathways. We have no neuron-by-neuron catalog relating particular cells to particular mental states, so it may well be that such multiple pathways fall under no categorization that makes logical sense. Nobody really knows how neurons manage their astonishing complexity of connections.

Consider two simple case studies. The Oxford zoologist John Krebs (b. 1945) worked on marsh tits, birds that store food in special hiding places. Storing food is an event, so episodic memory is involved here. One group of birds was allowed to carry their food back to their hideouts, whereas another group of marsh tits was given food that couldn’t be carried back. After some time, the hippocampi of the first group were found to be enlarged.
Our brain adapts to its needs (just as the birds’ did to theirs), and this we acquire through direct confrontation with outside reality, which is mapped with inner tools to encode what is grasped. Without experience that has emotional significance enfolded into it, brains remain in their primeval stages, resistant to growth.

Even the hippocampus stores memory only for a short duration (say, a year or two) and then transfers it to the cortex for permanent storage. Yet to think that the hippocampus and cortex are elite centers of memory storage is a mistake. An accidental penetration of a foil through the nostrils into the brain of a radio technician damaged a portion above his brain stem called the thalamus. Though he lost his memory of the previous two years, he could subsequently recollect the lost memories, which suggests that the thalamus is involved in memory as well as in learning to memorize.

The eminent neurobiologist Steven Rose found that when long-term memories are recalled, it is not the original memory of the event that is referenced but the most recent recall of it. So we have “memories of memories.” This idea aligns well with the French philosopher Paul Ricoeur’s notion of memory and forgetting: one has to forget to make way for new memories to be formed and to transform the brain. Heidegger’s saying that “memory is the foregathering of thought” is used by Ricoeur – while discussing historical knowledge – to understand the past in the present through the accessible “traces of the past.”

The past has two facets: the individual’s memory that animates the present (the part), and the collective memory that provides a backdrop for individual memories to take shape. This collective memory is endorsed by the “testimonies” gathered from individual memories. Ricoeur doesn’t wish to prioritize one memory over the other; rather, each “feeds” into the other. Historical accounts are thus “interpretative” in nature, having “a character of likelihood or credibility rather than certainty.” Further, memories are bound to be forgotten: everything of the past cannot be completely recalled in the present. So testimony gives “credence to the historical representation of the past.”

Memory is closely connected with consciousness: when we lose memory, we lose a good part of consciousness with it. Though most neuroscientists fear consciousness as a CLM (career-limiting move), a fair understanding of it would be as an awareness that requires the logical continuity of mind. Memory loss significantly alters this continuity and can cancel it at any moment. Some have tried to view this problem as solvable through the tempting metaphor of the computer. But memory deficits cannot be mimicked by computers to evolve roads to recovery. Computers have a stored memory space that can be used and reused whenever required; there is no evidence of specific memory traces stored in the brain in a similar manner.

Steven Rose has been at the forefront of memory research, using chicks to understand how a memory’s associations spread through the brain. Chicks peck at objects that fall within their field of vision. If one dips a chrome bead in a bitter-tasting liquid, the chicks thereafter avoid such beads. This is not because they remember the particular bead, but because various parts of the chick’s brain process the size, shape and color of the bead, indicating in an integrated manner its distastefulness. Memories are so deeply embedded and extended in the brain that there is no single place where the complete association is achieved.

The great Canadian neurosurgeon Wilder Penfield (1891-1976), who performed numerous surgeries on epileptic patients, had a lifelong pursuit: how to account for the mental states that brain states cause. Is the mind in the brain, or are the two inelegantly separated? Penfield operated on epileptics by identifying the damaged portion of the brain’s temporal lobe (beside the ears), touching electrodes to various parts. What makes these surgeries interesting is that while locating and excising the tissue causing the epilepsy, Penfield discovered that patients develop a “double consciousness”: as the epilepsy-causing tissue is explored, they are conscious of two realms, one immediate and the other past, and both could be vividly and elegantly described.

The immediate environment is the operating theatre – the surgeons and their scalpels, their toolkit and tables, and so on; the past was also vividly recollected and described exactly as it happened. There was negligible difference between the patient’s recollection of the event and the event’s actual occurrence. In neuroscientific terms, the patient holds an exact blueprint of an event’s episodic memory in his working memory.

Double consciousness points to another aspect of the brain. A seductive metaphor holds that the brain functions like a computer, both being mechanically programmed machines whose code can be swiftly deciphered and encoded. Yet double consciousness is an improbable feat for such a machine, one that enlarges the possibilities of the brain. The brain is not a rigid 1.3 kg jelly whose majestic functions can be permanently replicated. It is so dynamic that it adapts itself to the needs of the body; its fluidity and density are not easily grasped and reformatted into a computer.

A computer can nevertheless have a vast memory available for massive retrieval. But a computer is not so dynamic as to adapt itself to changing conditions of memory storage, and hence smart retrieval doesn’t translate into sane recovery.
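The contrast drawn above can be made concrete with a toy sketch (purely illustrative: the names, stored patterns, and matching rule are all invented here, and nothing in it models real neural storage). A computer fetches by exact address, so a wrong address yields nothing; associative recall, a rough stand-in for how brains complete memories, finds the stored pattern closest to a partial or degraded cue.

```python
# Toy contrast between computer-style and brain-style recall.

computer_memory = {0x01: "phone number", 0x02: "grocery list"}

def computer_recall(address):
    # Exact lookup: a wrong or damaged address retrieves nothing.
    return computer_memory.get(address)

stored_patterns = ["remembrance", "hippocampus", "marmalade"]

def associative_recall(cue):
    # Content-addressable lookup: return the stored pattern that
    # best matches the (possibly noisy, partial) cue.
    def distance(a, b):
        length = max(len(a), len(b))
        a, b = a.ljust(length), b.ljust(length)
        return sum(x != y for x, y in zip(a, b))
    return min(stored_patterns, key=lambda p: distance(cue, p))

print(computer_recall(0x03))              # None: no partial matching
print(associative_recall("hipposampus"))  # "hippocampus"
```

The point of the sketch is only the asymmetry: the dictionary fails on anything but a perfect key, while the associative lookup degrades gracefully, which is closer to the fluid recall the essay describes.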

Memory teasers are common among neurological patients. The great Soviet neuropsychologist Alexander Luria (1902-1977) describes, in The Mind of a Mnemonist, the extraordinarily vast memory of his journalist-patient, Mr. S. He couldn’t forget anything; everything he perceived had a retrievable place in his memory. He remembered people not by their faces but, astonishingly, by voices and sometimes even color: a person might be a sharp red individual for him.

Places were visually mirrored in his memory, and he could remember the objects in, say, a room by navigating through the visualized, mirrored spaces. Though this is a rare kind of memorizing, the mnemonic techniques S used were quite impressive.

Rituals are literally acted out as mnemonic activities reconnecting us with the past. Consider the initiation ritual, where a child is reminded of stepping into adolescence: it carries the mnemonic key of mentoring the child about his awakening into adolescence as well as his future tribulations. In most such rituals, the myths of past heroes are summarily acted out for the child to glimpse and taste life.

What memories reward us with are rich narratives about our own and others’ lives – what Lyotard called the “quintessential form of customary knowledge.” Without this knowledge and understanding of ourselves, we sink down into a personal “I” with nothing uniquely human about it. Memories are our cables for connecting with the world; if we fail to connect, we miss out on the aliveness of experience.

For Jews, the narrative of the Chosen People in the Old Testament was essential. As Hannah Arendt (1906-1975) points out, remembering is a religious imperative. Judaism stresses the sacredness of human life and the duty to memorize the holy book, the Torah (Hebrew for “direction”). The Hebrew word for remembering, zakhor, extends to the meaning that to remember is to act. Memory is not just a vague recollection of the past; action is embedded in every recall. That has been a principal reason why the Jews were a religion that became a people rather than a people who became a religion.


As with so many other issues within the modern life sciences, the study of emotions and their significance goes back to Charles Darwin (1809-1882), whose book The Expression of the Emotions in Man and Animals identified what he regarded as a palette of distinctive emotions – anger, disgust, and so on – shared by humans with our evolutionary neighbors. Darwin speculated on their evolutionary importance and went further, suggesting that the facial expressions the emotions evoke are cross-cultural, wired-in aspects of our biology. In recent years a psychologist at UCSF, Paul Ekman (b. 1934), has updated Darwin, developing computer-aided techniques for analyzing the facial cues to these basic emotions, even when an individual tries to mask them. As for the brain mechanisms and mental processes underlying such expressions, it seems that at last the neurosciences are catching up.

Antonio Damasio opened that door with his book Descartes’ Error. Descartes should, he argued, have insisted that it was emotio, not cogito, ergo sum; and he went on to distinguish between emotions, which shape every animal’s behavior, and feelings, which are our conscious understandings of our own emotions. Philosophers constantly investigated feelings in order to load them with rational significance, when feelings should never have been divorced from emotional significance for the sake of objectivity. In most daily activities we have no definite rulebook by which to feel ourselves out; rather, a fluid manual helps us react and respond to circumstances. This obvious attitude, long shunned, has now been acknowledged by neuroscience as well as in the humanities (after the Existentialists reinstated feelings to their proper place).

Each species lives in its own sensory world, of which other species may be partially or totally unaware. Snakes – which have senses humans lack – maintain highly sensitive infrared imaging systems that help them hunt prey in the absence of visual information. Every species thus perceives a partial richness of the sensory world out there. Brains function not by recording an exact image but a select picture of the sensory world. Our perceptions are created by innate rules: colors and tones are “active constructs” created by our own brain out of sensory experience.

Existentialists from Søren Kierkegaard (1813-1855) to Sartre and Paul Tillich (1886-1965) have richly contributed humanistic insights into feelings: feelings are not just isolated, inward subjective states; they co-exist with the body. The body together with the brain experiences feelings, and the existentialists urge the individual not to wallow in raw emotions but to refine them for fulfilled living. To exist is not merely to assert one’s physiological place in the world but to grow out psychologically – to be one with the world as well as to stand out as a unique individual.

When neuroscientists willfully dissect feelings through neurobiology, they are in fact least worried about the philosophical effects of such feelings on our wellbeing. This realm has been left open for philosopher-psychologists to step in, analyze the dissected feelings, and contribute to the total picture of decoding them. Simply put, if a neurobiologist decomposes feelings, a philosopher-psychologist can recompose them to give an existential understanding of the whole process. Because we participate in the world through feelings, emotions and feelings are, as it were, registers of our being-in-the-world.

As Heidegger pointed out, we are “beings-in-the-world” who are not volitionally brought into it; our birth is unwilled and our death inevitable. During our stay in this world there are circumstances over which we have little control, a phenomenon he called “thrownness.” Yet this thrownness provides us with ever-increasing opportunities to realize our limitless potentials within our limited existence.

For example, consider how neurobiologists and philosopher-psychologists treat an almost daily feeling like anxiety. Included in our existence is a basic anxiety that keeps us constant company: death. The being is threatened by non-being, and to overcome that, we reject our being’s conquest by death by erecting false facades of illusion, such as immortality or life-enhancing techniques. Where we should have enriched our existence with ever-growing self-understanding, we are trapped in a funhouse, seeing reality – and in turn ourselves – in a distorted mode. Heidegger pointed out that anxiety is the basic way by which one finds oneself. To acknowledge anxiety, arrange a place for it in our life, and deal with it by evolving private strategies is what one can do. Nor has this been exclusive to Western thought; there have been Indian philosopher-musicians who melodiously sang these insights.

The great vaggeyakar Tyagaraja (1767-1847) rapturously sang the kriti Duduku Gala Nanne, in which he parades his errant adolescent thoughts – unavoidable and universal, yet essential for his growth. This comparative reference is not to offset Indian thought against Western philosophical advances, but to suggest that since we are rooted in our cultural thought, we can comfortably be “at home” with our nearest philosophies; expanding to include other thought systems only helps us move ahead with firsthand understandings. Feelings, then, seem to lie somewhere between the mere life-processes of the body, of some of which we are barely conscious or even unconscious, and the conscious exercise of rational thought.

Neurobiologists try to reduce such anxiety to something like pain, which helps them locate the exact part of the brain where it manifests in observable form. There are usually two types of pain: chronic and acute. Chronic pain has to be endured by the sufferer for long durations and usually emanates from the body (for example, a backache that grips the lower back and spreads to the left leg and foot). Often dismissed as a general unease of the body, it is left neglected and undiagnosed, which may result in the sufferer’s inability to stand or sit. Those who suffer the devastating effects of chronic pain may fantasize about a life that is completely pain-free.

In fact, such a life is far from idyllic. People who are born with congenital insensitivity to pain, a rare genetic disorder, chew their tongues and lips to pieces, burn their flesh, and fracture their bones without realizing the harm they are doing to their bodies. Lacking a warning system to protect themselves from dangers in the environment, they tend to die young, often in their twenties. Nociceptive or somatic pain—a normal response to noxious stimuli—is essential for life. It tells you to pull your hand away from a flame or withdraw your mouth from a cup of hot coffee. If you break an ankle, the pain keeps you from walking around on it, so the bone can heal.

Nociceptors are sensory receptors, or nerve-endings, that react to mechanical, thermal, and chemical stimuli that may damage tissues. They relay nerve impulses—electrical messages from the site of injury in peripheral tissues such as skin, muscles, and joints—to the dorsal horn, an area in the spinal cord that acts as a switchboard. There, different chemicals determine whether these electrical messages reach your brain, where you actually perceive pain.
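The switchboard image above can be caricatured in code. The sketch below is a deliberately crude toy, loosely inspired by gate-control ideas: whether a nociceptor signal reaches the brain depends not only on its strength but on inhibitory modulation at the spinal cord. The function name, numbers, and threshold are all invented for illustration; real dorsal-horn chemistry is vastly more complex.

```python
# Toy "switchboard": the dorsal horn forwards a pain signal to the
# brain only when excitatory nociceptor input outweighs inhibitory
# modulation by more than some threshold. All values are invented.

def dorsal_horn_gate(nociceptor_input, inhibition, threshold=0.5):
    """Return the signal forwarded to the brain, or 0.0 if gated out."""
    net = nociceptor_input - inhibition
    return net if net > threshold else 0.0

# A strong stimulus under light inhibition gets through...
print(dorsal_horn_gate(nociceptor_input=1.0, inhibition=0.25))  # 0.75
# ...while the same stimulus under heavy inhibition is gated out.
print(dorsal_horn_gate(nociceptor_input=1.0, inhibition=0.75))  # 0.0
```

The single subtraction stands in for the different chemicals at the switchboard that determine whether the electrical message reaches the brain, where pain is actually perceived.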

Pain that occurs suddenly and has a real, definable source is considered acute pain. Rapid in onset and relatively short in duration, it generally follows a traumatic event such as a bone fracture or a surgical procedure, but can occur in other situations, such as when a hemophiliac suffers internal bleeding. Doctors often treat acute pain with strong drugs, knowing that it will fade as the healing process takes over. Sometimes, however, the pain message system goes awry and people perceive pain for much longer than it’s useful.

Chronic pain involves the mind and emotions more than acute pain does, and has more to do with chemical disturbances in the spinal cord and brain. Although experts say that chronic pain and depression often coexist, they are still unsure which condition comes first. What they do know is that serotonin and norepinephrine—chemical messengers that help to regulate mood—also modulate pain signals. Doctors treat chronic pain differently from acute pain. Recognizing that people may be living with discomfort for long periods of time, they may prescribe drugs, injections, stimulation techniques, surgery, physical therapy, or psychological interventions.

Most pain is actually a mixture of types. For example, if a disk—a shock-absorbing cushion between vertebrae—slips out of place and presses on a spinal nerve, the muscles in your back may go into spasm in an effort to protect your spinal cord. This is somatic pain; it’s short-lived and well-localized. In addition to this somatic pain, however, you may experience the persistent tingling, numbness, or burning of nerve pain, which travels down your leg. Acute pain is considered “good” pain because it is an alarm system that warns of injury. Chronic pain is like a faulty alarm. It serves no beneficial purpose and, if ignored, becomes an illness unto itself, causing changes in the nervous system that only result in more pain.

The English novelist Virginia Woolf (1882-1941), in her essay On Being Ill, lamented the neglect of bodily pain by literature. She found literature lacking words for the “shiver and the headache….The merest schoolgirl, when she falls in love, has Shakespeare or Keats to speak her mind for her; but let a sufferer try to describe a pain in his head to a doctor and language at once runs dry.” By creating bodily narratives, literature helps patients, physicians and the public at large to recognize the tribulations involved, to avoid falling into traps and illusions, and to stay alert to holistic approaches to recovery.


The 2004 Nobel Laureate in Medicine, Richard Axel (b. 1946), warned that future generations must be prepared to be mired in conflict between science and its various allies on one side and the remaining disciplines on the other. It will be a battlefield of ideas, with a radical transformation of our understandings of our origins, of the interrelation between genes and (say) the environment, and of emotions. In fact, he suggests – in a brusque, Bush-like tone – that we must “choose either to have science or not have it.” Moreover, on this view, we shouldn’t attempt to hamper scientific progress, because we have no control over the outcomes of scientific research. Good or evil, both should be welcomed, even if we are swallowed by the monsters of science. To drown in the ethical depths is not a preferred experiment of surface sciences.

For those who revel in such untethered scientific progress, Daniel Sarewitz, Director of the Consortium for Science, Policy and Outcomes at Arizona State University, has a plausible answer. He advocates a socially responsible way of doing science and technological innovation. In particular, he has nudged the discipline of nanotechnology, whose practitioners hold that they cannot anticipate or strive for beneficial outcomes of their work because such results are serendipitous.

Mr. Sarewitz argues that serendipity is fine, but seeking specific outcomes is better: it prods scientists in socially productive directions. He has questioned the logic underlying one of the cardinal beliefs of many basic researchers – that the path from fundamental discovery to practical application is unpredictable, so that it is counterproductive to prod them to work toward particular outcomes. One oft-cited example is that of Alexander Fleming (1881-1955), who accidentally discovered the antibiotic penicillin in 1928, when he noticed that mould on a lab dish killed staphylococcus bacteria. Such groundbreaking discoveries, the argument goes, would be neglected in directed research.

Nanotechnology, for example, can implant sensors in the human body to monitor health, and the monitored reports could be used to practice discrimination. But if the inventors think about the security of the data being collected early on, it won’t be too late to alter the products and correct their problems. Mr. Sarewitz says: “The idea is to create a system where we at least think about what we’re going to do…So, it’s not about control but exercising awareness and choice and in turn, prediction.” In all, we must somehow keep a check on what scientists are up to, yet in a democratic and widely agreeable way. Technological progress is inevitable; all we have to do is probe this progress from various perspectives, so that it doesn’t imperil human growth and demolish human selfhood (already plummeting to uglier levels).

Few scientists own up to the flaws in their research. If they wish to tag themselves as technologists, the whole team must be alert to misdirected research and prevent it before it eats up the team itself. A recent classic example is the South Korean stem-cell researcher Hwang Woo Suk, who faked cloning human embryos. So think through your research before it gulps down your selfhood and spits out a selfless, shameless, spiteful researcher.


Damasio, Antonio. Descartes’ Error. London: Picador, 1994.
Macquarrie, John. Martin Heidegger. London: Lutterworth Press, 1968.
Rose, Steven ed. From Brains to Consciousness. London: Penguin, 1998.
---. The Making of Memory. London: Bantam Books, 1993.
Scott Jr., Nathan. Mirrors of Man in Existentialism. Nashville: Abingdon, 1978.
Tillich, Paul. The Courage to Be. New Haven: Yale University Press, 1977.
Various debates on
