Proust Was A Neuroscientist
I am part of several organizations and committees that support ‘evidence-based’ care, although I much prefer the term ‘evidence-informed’ care. We could probably spend several hours or more discussing what these terms mean, or how they’re often misused to suit a person’s or company’s agenda. Instead, I will simply acknowledge that unadulterated ‘evidence-informed’ care should be the goal we all strive for, and this would work best if we could remove bias, exclude financial gain (or loss), and explore why and how what we do works so we can better care for our patients. However, even in its purest form there is an opportunity for the non-scientific to influence our thoughts and actions in a positive way (perhaps why I prefer the softer language of ‘evidence-informed’ over ‘evidence-based’). So let’s take a deeper look into the world and see what the arts can offer science. To do this I’ll review some historical perspective, much of which was given to me by Proust Was a Neuroscientist by Jonah Lehrer.
A touch of background: I traveled to Phoenix, AZ a few months back to assist Craig Liebenson, DC with a course he was teaching. During his lecture, one of the things Dr. Liebenson touched on was ‘iatrogenic ideas’, a term I love and think is similar to the nocebo effect. He related this idea through the following quote:
“For each ailment that doctors cure, they introduce ten others in healthy individuals by inoculating them with a pathogenic agent, thousands of times more virulent than all microbes: the idea that they are ill.” – Marcel Proust
At that time I was reading a book called “How We Decide” by Jonah Lehrer, which I had picked up on a whim out of curiosity about the title and, admittedly, because it had an ice cream cone on the front cover. I finished reading “How We Decide” on the flight home and, in reading the last few pages, noticed that Lehrer’s first book was entitled “Proust Was a Neuroscientist”. Considering the recent discussion of Proust and how his thoughts fit well with modern ideas, my interest was piqued and without any deliberation I ordered a copy. Once again my lack of thorough research into my reading material was rewarded. [If you connect the dots, you can also tell how often my reading is interrupted and how long it takes me to get through a book. I should also acknowledge that I am not a literary or artistic scholar, so if any of the below is less than perfectly accurate, please realize these are likely my misinterpretations and feel free to offer corrections.]
Now that the stage is set, let’s get back to the journey we began.
Science commonly applies a reductionist approach. To form a testable hypothesis that can also be accurately retested, artificial constructs are created. This is done to allow clean experiments and data of statistically significant accuracy. The challenge, however, is that we must always take what we’ve learned and then see if it fits in with the larger picture of how things work. Our society often mimics this reductionist approach as it becomes increasingly disjointed, with emphasis on the need to specialize. In general, most scientists do not fully appreciate, understand, or value the arts, and most artists do not fully appreciate, understand, or value the rigors of the scientific method. [I will admit that on this continuum, I would fit more under the scientific mindset struggling to appreciate art.] What I was exposed to in reading Proust Was a Neuroscientist is how often art has predated science. With careful exploration into the world of the arts there is great insight into human nature, our senses, and the way our world works. Masterful artists intuited what the scientists of the day did not believe or could not yet prove. With time, science has caught up with what art already knew, but I wonder how often this interplay between the two worlds was appreciated, and how much faster we could have gotten here if science had understood art.
To follow are some examples taken from Lehrer’s writings on how an artist (painter, author, chef) seemed to know what science did not. Some of these things may seem commonplace, but please recall that at one time they were not. What I hope you can take from this is an appreciation for the way these ideas have developed, and perhaps begin looking out into the present day artistic world for the next scientific advance.
Author George Eliot believed that our mind was a source of freedom and that adaptation of thought was possible. In Middlemarch, she wrote of her character Dorothea that her mind “is not cut in marble – it is not something solid and unalterable. It is something living and changing”. Neuroplasticity, myelination, and the concept of an ever-changing nervous system were not prevalent in science and biology until much more recently. Even looking at our rehabilitation paradigms, many are focused on teaching accommodation rather than on training adaptation. We take the short, easy way instead of the more challenging but more productive method. In Eliot’s day, ideas of positivism and determinism confined human beings. Thomas Huxley stated that “we are conscious automata”, and scientists thought humans were born with a complete set of neurons and that brain cells did not divide after birth. As recently as 1980, Pasko Rakic realized this had never been tested in primates and conducted an experiment on rhesus monkeys. He wrote in “Limits of Neurogenesis in Primates” (1985) that the social and cognitive structure of primates required the inhibition of neurogenesis. Later, in 1989, Elizabeth Gould was working on documenting brain degeneration and stumbled upon the unimaginable: brain regeneration. While researching her experimental anomaly, Gould found the 1962 work of Joseph Altman, who had used techniques similar to Rakic’s and found regeneration in rats, cats, and guinea pigs. However, his work flew in the face of socially accepted ideas, and it was ridiculed and ignored. Michael Kaplan used an electron microscope to show neurons dividing, and neuroscientists continue discovering and exploring this previously unfathomable frontier. Perhaps we all should have paid more attention to Eliot’s writing. Would this have made us more open to Altman’s research? Where would we be if we had not ignored this potentially potent field of study for decades?
By 1910, Paul Cezanne and the postimpressionist painters had changed the way we think about what we see. The convention of the day was that the eye was like a camera (or better yet a camcorder, or maybe the iPhone that can take pictures while shooting video) and that our senses were perfect interpretations of the outside world. The eye simply collected pixels and sent them on to the brain. Cezanne said that “the eye is not enough”, “one needs to think as well”, and then he showed us that this was true. The picture below gives you Cezanne’s account of how we see. Take a look: what do you see?
You should see a landscape with a mountain view. Now look closer. Is there really a mountain? Look at just the mountain and realize that there’s almost nothing there, yet you saw it just the same. In the 1950s, Hubel and Wiesel showed the role of the visual cortex in sight, refuting the ‘camera’ theory of vision. They noted in a 1959 paper that these neurons preferred contrast over brightness and edges over curves. Look back at Cezanne’s mountain and see if you believe he already knew this. Neurologist Oliver Sacks demonstrated this in a patient who had an intact eye but a cortical lesion. This patient’s vision of the world around him looked like Cezanne’s painting. When asked what something was, he described it in terms of color and lines: “it looks about six inches in length. A convoluted red form with a linear green attachment.” After being allowed to smell it, he called it a rose. Could there be something today that we see in the world of Photoshop, computer graphics, digitally altered images, or the digital arts that gives us even more insight into the way we see the world around us?
Science in the fourth century BCE, and for a long time after that, thought it had the tongue and how we taste figured out. Democritus hypothesized that taste depended on the size and shape of food’s molecules, an idea that Plato believed and passed on to Aristotle. Aristotle described our four primary tastes as sweet, sour, salty, and bitter. If this sounds familiar, it is likely because you learned the same thing I did in school: that we still operated on these four basic tastes and various combinations of them. In 1903, the French chef Auguste Escoffier scorned the current trends in French cooking, where presentation was everything and taste mattered little. (I strongly agree; while I enjoy a good presentation, it means nothing if the food is as tasteless and terrible as Escoffier’s descriptions of the French cooking of his day.) Escoffier wrote that his recipes were based ‘on the modern science of gastronomy’, when in fact they ignored the science of the day and, more correctly, simply followed the tongue. He made recipes and food that simply tasted good. His two main staples were making stock and deglazing, methods espoused repeatedly in his writing and his work. His food was simply delicious, and it has had a lasting impact on the culinary world. However, the deliciousness of his creations could not be accounted for by the salty, sweet, bitter, and sour explanation. The reality is that Escoffier’s methods aimed to put as much L-glutamate, which is released from proteins during cooking by proteolysis, on the plate as possible. A Japanese scientist, Kikunae Ikeda, set out to understand why dashi (a Japanese broth) tasted good, and what exactly it tasted like. It did not taste of any of the four known tastes and was simply described as ‘umami’ (delicious). What he was eventually able to distill was glutamic acid, the source of L-glutamate. He published his findings in the Journal of the Chemical Society of Tokyo and later made a stable form of the compound, which we know as MSG.
Despite this finding, Western scientists continued to feel there was no room on the palate for deliciousness, or umami, as all the available tongue receptors had supposedly been described. Some 90 years later, in 2000, scientists noticed that the tongue contains modified forms of glutamate receptors and a second set of receptors able to detect L-amino acids. In honor of Ikeda, these were named ‘umami’ receptors. In just under a century, what Escoffier knew in his kitchen and his diners knew at their table was elucidated by science. Research continues into the role of smell in taste and the influence of expectation on taste. What does your favorite restaurant or food offer that science hasn’t yet made clear?
The idea that pain and perception are brain phenomena (yes, it’s in your head) is frequently explained through phantom limb pain: the presence of perception and pain in an amputated limb that is no longer present. It was first described during the American Civil War by Weir Mitchell, who wrote in his medical notes of ‘sensory ghosts’: “a sense of existence of their lost limb was more vivid, definite and intrusive than that of its truly living fellow member.” Twelve years earlier, this had been recorded by Herman Melville in Moby Dick, where Captain Ahab explains to the carpenter fashioning his new leg that he still feels his lost leg “invisibly and uninterpenetratingly”, and says, “so now, here is only one distinct leg to the eye, yet two to the soul. Where thou feelest tingling life; there, exactly there, there to a hair, do I. Is’t a riddle?” Could we be further along in our understanding of pain generation and perception if we had read Moby Dick more carefully?
Walt Whitman wrote in an era when the body and mind were treated as separate and distinct entities. This was a time when feelings and emotions were thought to come from the brain alone, while the body was simply a mass of matter that carried us along our journey. His concepts were revolutionary, as he wrote in Leaves of Grass,
Was somebody asking to see the soul?
See, your own shape and countenance….
Behold, the body includes and is the meaning, the main
Concern, and includes and is the soul
William James, a follower of Whitman’s writing, later wrote about this topic, as did Carl Lange a year later. What we’re left with is the James-Lange hypothesis. This is also described in modern neuroscience as the ‘body loop’ by Antonio Damasio, who has documented how patients with certain brain injuries lose this connection to the body and become unable to translate bodily feelings into proper emotions. Damasio then devised some clever experiments to show how the body and emotions interact, at times unknowingly, during rational tasks. In Wired magazine, Lehrer gives an overview of Damasio’s work along with a brief interview.
Composer Igor Stravinsky started a riot with a piece you know and would likely allow your children to listen to. Existing in a period where music was stagnant, Stravinsky sought something new. The period was full of soothing notes, chords, and familiar arrangements; the great composers of the day put them in various combinations and pleased crowds of music enthusiasts. Stravinsky wanted more. He felt that music was not simply to soothe but to help express or draw out emotion. He created a composition called The Rite of Spring, which had a brazenly new section he arranged and called “The Augurs of Spring”. The piece was an advertisement of his bold, modern style. The Rite was played for the first time at the ballet after an opening dance to a classic Chopin piece, which was familiar to the crowd. With no interruption the program continued with “The Augurs”. It begins with an easy, shrill bassoon which quickly turns into “an epileptic rhythm, the horns colliding asymmetrically against the ostinato… tension builds and builds, but there is no vent. The irregular momentum is merciless, like the soundtrack to an apocalypse, the beat building to a fatal fortissimo.” The interplay and rhythm of the music was completely unfamiliar to the audience. Stravinsky was in a tuxedo, sitting in the fourth row admiring the first performance of his brilliant creation, when the crowd began to scream. Screaming led to brawling in the aisles, the performance broke down, and the riot spilled into the streets. The musical dissonance and unsettling newness of the arrangement had struck at the core of the audience’s emotions, irritating the ballet patrons. Unashamed, though frustrated by the audience’s reaction to his creation, Stravinsky and the ballet continued to offer the piece. The Rite is a highly regarded composition, is still played today, and was included in the Walt Disney film Fantasia.
What was once new, shocking and irritating to the nervous system is now commonly enjoyed and lauded. What have you seen or heard that you closed your mind to or walked away from simply because it was irritatingly refreshing? Is familiarity important to you? Or should you look deeper to figure out why it seems ‘wrong’?
There are many things that we can understand at a deep level without contributions from reductionist science. Even so, we should strive to better understand and solidify our thoughts about the world. Hopefully this post will encourage continued exploration of the world beyond one’s field of expertise, and perhaps serve as a reminder that science is truly about answering questions and that the arts can be a great place to look for a testable hypothesis. Could evidence-informed care come from art-influenced science?
As this post expanded beyond the length I originally intended, you may notice that there’s little discussion of Proust’s own work. Should you wish to know more, attend one of Craig Liebenson’s seminars or read Jonah Lehrer’s book.