Blog


Manslaughter or Murder? Reinventing the Human with the ‘New Unconscious’


Philosophy Major, UCSB

When I consider what I learned from my time with the Unconscious Memory project, my mind goes to just about everything I’ve studied, from philosophy to neuroscience to literature to logic. By forcing me to step out of my conceptual comfort zone, interdisciplinary research has influenced how I frame any question. I learned many specifics about the brain and human behavior, but more importantly, I left with a reinvigorated image of the humanities and the role they can play in reinventing the human in light of scientific development.

My graduate mentor, Daniel Martini, and I focused primarily on questions surrounding moral agency and intention. Intention is generally understood as the causal link between thoughts and actions, and it is heavily intertwined with our understanding of moral responsibility. For example, intention is the deciding factor the United States criminal justice system uses to distinguish murder from manslaughter, with murder carrying much higher penalties. Looking at recent cognitive studies, Daniel and I attempted to locate intention in the picture of the mind that has been emerging over the last 35 years, namely the cognitive unconscious.

The ‘new unconscious’ reveals that much more of what we do escapes our awareness than was previously thought. Many complex psychological functions do not rely on conscious awareness, and many of the conscious processes that we thought were largely free of unconscious influence have been shown to depend on unconscious processing. For example, the feeling that our thoughts have caused an action can be manipulated to create the illusion of free will (see Daniel M. Wegner in The New Unconscious, Oxford UP, 2005). Additionally, people can be primed to perform actions they did not intend, and they are “often unaware of the reasons and causes of their own behavior” (John A. Bargh in The New Unconscious, p. 37).

These findings trouble a clean distinction between intentional and unintentional, and they thus spell problems for the dominant conception of moral agency. The current understanding of this distinction will have to become much more nuanced, or we will have to develop a conception of moral responsibility that makes no reference to intention, lest we risk unjust decisions predicated on faulty distinctions. 

The deflation of the conscious self associated with the ‘new unconscious’ demands a conceptual rethinking of humanity itself. As disciplines that emphasize concept production and distinction drawing, the humanities have their work cut out for them. Just as Enlightenment philosophy emerged in response to early modern science, the humanities will have to reconfigure some of our shared fundamental assumptions about the mind in response to advances in neurocognitive science. Working in tandem with the sciences, the humanities are in a unique position to make significant contributions to the project of reinventing the human.


Published January 29, 2021

 



Kestrel Moments


English, UCSB

In 1966, philosopher and novelist Iris Murdoch identified “the fat relentless ego” as the biggest enemy to moral life (“On ‘God’ and ‘Good’” 51). Drawing on the thinking of Western philosophy’s most famous pessimist—Arthur Schopenhauer—Murdoch saw humans as naturally and tragically (and sometimes comically) self-centered. She also saw a way out of this narrow selfishness, however.

For Murdoch, the way out was to be found in love and in beauty. Articulating how natural beauty can move the beholder away from their self, she describes an encounter with a bird:

I am looking out of my window in an anxious and resentful state of mind, oblivious of my surroundings, brooding perhaps on some damage done to my prestige. Then suddenly I observe a hovering kestrel. In a moment everything is altered. The brooding self with its hurt vanity has disappeared. There is nothing now but kestrel. (“The Sovereignty of Good Over Other Concepts” 82)

What is important here is what happens after this appreciation of natural beauty: suddenly the anxiety diminishes. The resentment shrinks. Attention has shifted so that the damaged prestige seems less important next to the hovering kestrel. Such movements away from the self are always of ethical value because, as Murdoch put it, “anything which alters consciousness in the direction of unselfishness, objectivity and realism is to be connected with virtue” (“The Sovereignty of Good Over Other Concepts” 82).

To alter our consciousness in this way, we first have to become aware of what we pay attention to, a feat that turns out to be quite a bit more difficult than it sounds. One famous experiment that makes blatantly clear how little we know about where our attention goes is the so-called “invisible gorilla experiment.” In this landmark experiment from the late 1990s, psychologists Christopher Chabris and Daniel Simons asked participants to count the number of times a basketball is passed between people in a video. In the middle of this video, a person wearing a full-body gorilla costume walks through the group of ball players, looks into the camera, and thumps their chest. To the great surprise of common sense, half of the participants watching this video do not notice the gorilla. This, as Chabris and Simons explain, tells us two things about attention: we miss much of what goes on around us, and we are utterly oblivious to how much we are missing.

Since this influential study, neuroscience has both confirmed and complicated our understanding of attention, helping us appreciate, for example, why it is that we like to attend to what feels good (beauty, for example, or our phones). The short answer here is spelled dopamine. Commonly known as the “feel-good neurotransmitter,” dopamine plays a crucial role in allowing attention to function well. It helps you pay attention and focus more selectively. While much is still unknown about how dopamine aids attention, it is clear that dopamine’s rewarding function is crucial for habituating attention: for making you do whatever feels good again and again and again.

Bringing together neuroscientific understandings of attention with Iris Murdoch’s concept of unselfing—the idea that encounters with beauty can make an individual escape that “fat relentless ego”—can help us appreciate how literature offers readers a unique opportunity to attend to something other than their selves. This is what I try to do in my dissertation. By looking for such “kestrel moments” in 20th- and 21st-century Anglo-American fiction, I investigate the relationship between attention, ethics, and aesthetics, and I ask why it is so difficult to attend to anything except our individual experiences. In other words, I am investigating how reading literature can help us resist the self, and why, in a world of selfies and self-care, it is of ethical value to focus completely on the hovering kestrel outside your window, or on the book in front of you.


Published December 20, 2020

 



Unconscious Bias and Ancient Astrology


Classics, UCSB

At the moment, I am a TA for a course about ethnicity in the ancient world. In light of current events in the US and abroad, I’ve been thinking actively about the existence of conscious and unconscious bias in the ancient world.

The scholarship on Critical Race Theory doesn’t provide easy answers to how our brains perceive and categorize differences among people, but the more I read ancient sources discussing these ideas, the clearer it is that discrimination based on skin color is not the purely modern phenomenon many Classicists have argued it to be. I first noticed this when reading the Latin didactic epic Astronomica by Manilius, composed in the early days of the Roman Empire (20-40 CE). It is the earliest surviving treatise on ancient astrology in the Greco-Roman tradition, and in one section it uses the 12-sign zodiac to classify the various regions of the world, citing the influence of the zodiac signs on specific personality and body types. This builds on earlier Hippocratic ideas of Environmental Determinism but reorganizes them around the then newly adopted zodiac signs. By assigning specific signs to individual geographical regions within the Roman Empire, the poet engages with earlier writers to produce a renewed discussion of ethnic categorization, one that moves a step beyond older ideas of Environmental Determinism, where the environment was thought to define an individual’s physiology and psychology, toward a Zodiacal or Cosmic Determinism. So what do the signs say?

The Romans are said to have great intellect and physique since they are ruled by Libra, which endows them with everything in balance (pun intended). The Ethiopians, on the other hand, are said to “defile the Earth” with their dark and shadowy appearance, since their skin is burned by the fiery sign of Cancer. The Indians, who were considered a different species of Ethiopian, are described as “minus tostos,” or “less burnt.”

There is still much debate over whether ancient civilizations understood ‘race’ and ‘ethnicity’ in a way similar to the European slave-traders of the 17th century. But finding racialized zodiacs in an ancient text calls into question whether seeing human difference in physical attributes is a learned behavior that results from imperial ambitions. The desire to seek unity with those who resemble us and to discriminate against those who are different may be an animal instinct that is part of our unconscious biases. It is the impulse to make these subconscious tendencies explicit through ideas of empire and citizenship, however, that brews trouble and continues to plague our world today.

The issue of unconscious bias was a topic of our seminars more than a year ago. It is more relevant than ever to re-analyze the ancient sources which have been read with a far too forgiving lens for far too long.


Published November 20, 2020

 



Teaching Literature and Neuroscience With the Help of William Faulkner and Digital Humanities


English, UCSB

In the summer of 2019, I taught a course called “Story and the Brain” at the University of California, Santa Barbara. Designed to examine the power of storytelling with reference to brain science, the course focused on 20th- and 21st-century U.S. literature alongside scientific and philosophical theories and themes concerning the human mind.

One of these themes was memory. Thinking about how to introduce the science of memory studies while also exploring what literature can teach us about the human ability--and inability--to remember, I decided to approach the topic through William Faulkner's 1929 modernist novel The Sound and the Fury. Pairing Faulkner's famously challenging text with the science of memory might at first seem like a doubly daunting task but, as it turned out, Faulkner's retelling of the same story from different perspectives proved to be a gold mine for introducing, exploring, and discussing the relationship between storytelling, memory, and identity.

You can read about how I used Faulkner's experimental novel together with Digital Yoknapatawpha--a website mapping and visualizing Faulkner's fictional worlds--to discuss topics such as false memories, unconscious memory, and memory loss in a recent essay titled "Seeing Memory: Using Digital Yoknapatawpha to Teach Cognitive Literary Studies," published in the Fall 2019 issue of the Teaching Faulkner Newsletter by The Center for Faulkner Studies at Southeast Missouri State University.


Published March 15, 2020

 



Rewiring My Brain: A Take on Interdisciplinary Collaboration


Fourth Year, Comparative Literature, UCSB

Concepts. As scholars and students we make ample use of them. But how do they come about? Cognitive neuroscience pushes us to think of concepts as situated. Their meaning is part of their use. When I say lyrical, a poet or a literary theorist will have a pretty good grasp of what I mean. But when I turn to my collaborator Madeleine Gross in Psychology, she and I may not see eye to eye. Why is that? One reason is that semantic memory is tethered to a particular perspective or shared cognitive environment. This also summarizes my personal experience of interdisciplinary research. A lot of re-conceptualization and re-situation was needed.

Over the course of six months in 2019, Madeleine and I developed a study of avant-garde film and creativity. We discovered that we shared an interest in the avant-garde and in meaning-making (or sense, for those of us in the humanities). I offered access to great French cinema made by devout anti-conformists seeking to create new modes of language and thought: the Lettrists. Madeleine had all the tools to measure and research the efficacy of this art form on real-life people. Our shared goal was to examine whether film can actually shape the way we understand concepts. The short answer is yes. The study paid off, and after controls and replication we presented the results at the Cognitive Futures conference. A paper is now in the works.

But the project demanded a degree of semantic gymnastics that I had not tapped into since my undergraduate education. Collaboration does take longer because of scheduling hoops, disagreements, and the process of reaching mutual understanding. But interdisciplinary research took this to an entirely new level, for what I also had to do was learn.

Every coffee meeting, every email exchange, and each write-up was a re-wiring of my brain. For instance, when Madeleine would call with exciting news and point to the statistical analysis of mediator and moderator, I was initially clueless. As a literary scholar, I hear mediator as medium, as text or form. The word moderator takes me to my last conference. A psychologist’s use of those concepts is situated in a totally different space: the mediator explains the relationship between the independent and dependent variables, whereas the moderator affects the strength of that relation. That is: why and how did the avant-garde film lead to an expansion of conceptual boundaries?
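
For readers who, like me, are more at home with media than with mediators, here is a minimal sketch in Python of the distinction as I came to understand it. The variable names (film, openness, meaning, concepts) and the numbers are hypothetical stand-ins rather than the measures from our actual study: a moderator shows up as an interaction term, while a mediator shows up as an indirect path through a third variable.

    # Hypothetical illustration of moderation vs. mediation (not our study's data).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    film = rng.integers(0, 2, n).astype(float)  # 0 = control clip, 1 = avant-garde clip
    openness = rng.normal(size=n)               # hypothetical moderator (a trait measure)
    meaning = 0.5 * film + rng.normal(size=n)   # hypothetical mediator (meaning-making)
    concepts = 0.4 * meaning + 0.3 * film * openness + rng.normal(size=n)  # outcome

    # Moderation: does openness change the strength of the film -> concepts link?
    # It shows up as the coefficient on the interaction term film * openness.
    X_mod = sm.add_constant(np.column_stack([film, openness, film * openness]))
    print(sm.OLS(concepts, X_mod).fit().params)

    # Mediation: does the film work through meaning-making?
    # Path a: film -> meaning.  Path b: meaning -> concepts, controlling for film.
    a = sm.OLS(meaning, sm.add_constant(film)).fit().params[1]
    b = sm.OLS(concepts, sm.add_constant(np.column_stack([film, meaning]))).fit().params[2]
    print("indirect (mediated) effect a * b:", a * b)

The psychologist reads the interaction coefficient to judge moderation and the product a * b to judge mediation; keeping the two apart was, for me, exactly the kind of re-wiring the project demanded.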

In the end, there was an uncanny similarity between me and the research subjects. Our interdisciplinary collaboration gave me the (painstaking!) chance to unpack and reformulate the meaning of familiar words that I hold so constant that I forget that they, like our disciplinary groundings, are always situated.

 


Published January 8, 2020



The Ancient and Modern Unconscious


Classics, UCSB

How do we know that the actions we perform are actually conscious choices that we are making and not merely some unconscious urge coming to the surface? Freud made the unconscious famous, but he obviously was not the first to suggest that such a thing existed. We humans believe that we are unique organisms because we have the ability to behave according to “reason.” In some ways this definition hasn’t really changed since the ancient period.

While neuroscientists are now exploring consciousness and our actions by likening our minds to computers, ancient scientists believed the mind was a product of all its sensory inputs along with a divine inspirational force. Ancient Greeks believed they were acting according to their own rationality, but in reality, were acting in accordance with the dictates of fate. A Homeric hero’s intended actions would go awry, leaving him wondering who was actually controlling his behavior and decisions. Neuroscience has begun to systematically examine the specific function of different parts of the brain and to document which regions are activated during certain stimulatory events. fMRI and PET scans allow researchers to do so without having to perform live dissections on patients to poke and prod the sensory cortex itself like the ancient doctor Galen is said to have done in the 2nd century CE. Until his time, the Greeks believed the chest and midriff (phren) to be the region where the heart-mind was located. It was where reasoning, perception, emotion, and contemplation occurred.

The distinction between the conscious and the unconscious mind was perhaps as fuzzy a concept for the Greeks as it is for us, but they did note a difference between the two, one we find in Greek tragedy. Characters in these plays sometimes act outside the norm, only to snap out of it, see the destruction they’ve caused, and consider themselves mad. The viewer is left wondering whether the character is acting according to fate’s dictates or merely manifesting their true nature (phusis), i.e., how they truly are without any conscious suppression of their behavior. Could the character have stopped himself from doing what he did? Usually these questions are left unanswered at the end of each tragedy, but we can see that the Greeks believed there were different levels of conscious and unconscious control.

The Greeks also placed great cultural value on poets who were divinely inspired to compose their verses. In describing the Trojan War, Homer calls upon the Muses, goddesses who presided over the fine arts, to sing through him. Does this make Homer’s poetry a product of his conscious mind, or is it coming from some deeper realm of his mind that he can access only by invoking the Muses first? Was his invocation of the Muses a type of mental causation that then allowed the physical act of reciting poetry to take place? And what about Hesiod, who came soon after Homer and wrote his poetry down for the first time? What sorts of neurobiological processes are involved in his subconsciously controlled writing?

I think there is value in trying to understand the connections between the mind and the physical process of writing. By studying the ancient Greeks, whose written literary tradition arose from unwritten, divinely inspired oral poetry, what can we learn about the mind from the process of encoding unwritten thoughts into sentences or poetry? I think the Greeks were on the scent of the concepts of conscious and unconscious and, interestingly, their literary output is what allowed them to explore these ideas.


Published January 5, 2019



Computational Approaches to Personal Memory


Computer Science, UCSB

Human visual and auditory perception can be mimicked by ever-improving software programs for electronic eyes and ears. Natural language processing provides computers with basic skills in comprehension. Moreover, our superb powers of information recall are slowly being replaced by queries to stored data. The emergence of statistical models of learning allows machines to solve problems in prediction and planning that once required guidance by human intuition. In games of chess or in financial investing, for instance, the sheer number of possible outcomes once prevented automation; now, machine learning is strong enough to beat humans. These developments mirror one goal of technology: to replicate our cognitive abilities.

Less common than the work described above are efforts to reproduce features of the mind like personality, identity, and memory. We have generalized computational models of language understanding and visual perception, but how can we instantiate them to reflect notions of individuality? I believe issues of personality and identity can be partially rooted in personal memory, the internal representations of past events and our past selves, both those accessible through introspection and those that remain beneath conscious awareness. The neural structures controlling, for example, language and vision in an individual’s brain develop through the same specific series of settings and sensations, that is, over a lifetime. Even if the functions of my visual cortex and my neural language centers are relatively independent, they have been subject to similar patterns of input because they reside in the same brain. But if I were building a computer model of a human, I might piece it together from different software programs: one for language, one for vision, one for motion, and so on. These pieces would come from different, relatively unrelated sources. Structures that could be treated as modular in a computer are thus blended together and interconnected in a brain. Nor would the computer reflect the many biases, unconscious associations, and idiosyncrasies that exist in an individual human being. Yet how accurately do we remember the progression of these interconnections in ourselves over time?
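
To make that contrast concrete, here is a minimal, purely hypothetical sketch in Python; the module classes and the bolted-on memory log are illustrative inventions of mine, not a description of any existing system. The point is only that independently sourced modules share no developmental history unless a common record is explicitly added.

    # A hypothetical sketch: an agent assembled from unrelated, interchangeable
    # modules, with a single bolted-on log standing in for "personal memory."
    from dataclasses import dataclass, field

    class VisionModule:
        def perceive(self, scene: str) -> str:
            return f"an image of {scene}"

    class LanguageModule:
        def describe(self, percept: str) -> str:
            return f"a sentence about {percept}"

    @dataclass
    class AssembledAgent:
        # Each module could be swapped for another implementation; neither carries
        # the other's history, and neither was shaped by a shared lifetime of input.
        vision: VisionModule = field(default_factory=VisionModule)
        language: LanguageModule = field(default_factory=LanguageModule)
        personal_memory: list = field(default_factory=list)  # the only shared history

        def experience(self, scene: str) -> str:
            percept = self.vision.perceive(scene)
            utterance = self.language.describe(percept)
            self.personal_memory.append(utterance)  # accumulate an autobiographical trace
            return utterance

    agent = AssembledAgent()
    agent.experience("a red bicycle leaning against a wall")
    print(agent.personal_memory)

In a brain, by contrast, there is no clean seam between the “modules” and the lifetime of shared input that shaped them.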

This leads me to wonder: how can a computer model of the mind be imbued with self-memory? How can advancing computational models of the mind reflect human-like introspection, especially with access to memories of an individual? My interpretation of memory draws inspiration from John Locke’s model of personal identity, which centers our sense of self around our continuous, subjective memory of our past selves.

The Unconscious Memory project provides an opportunity for me and my colleagues to study the formation of idiosyncratic associations in the depths of the mind and the ways in which these associations are encapsulated by memory. I hope to use the project to evaluate computational approaches to autobiographical memory and to write computer programs that extract biographical memory from relevant datasets.


Published December 29, 2018

 




Second Year, Psychological and Brain Sciences, UCSB

My conscious awareness seems paramount to my existence. Indeed, as many philosophers have noted, it is the only thing that I can ever be sure really exists. However, an ever-growing body of neuroscientific and psychological research reveals that unconscious processes play a necessary—and, in some cases, sufficient—role in many of the mental and behavioral actions I had long assumed to result from consciousness. One surprising example involves the generation and evaluation of creative works.

The human drive to create seems to arise, at least in part, from conscious thought—humans appear to create with agency and intentionality. However, many discerning creatives have noticed that this is often not the phenomenology of their creative process. Indeed, numerous innovators, scientists, and painters have noted that it can feel as though they are channeling some creative “spirit”; they feel, in some ways, a mere vessel for their creative productivity. At times, I have experienced this myself when creating. If someone were to ask me what inspires the content of my work, I would feel somehow disingenuous saying that “I”—my conscious self—came up with it at all!

Recent psychological and neuroscientific research supports these observations and reveals that creative ideas may precipitate from the unconscious. This was elegantly shown in a series of experiments led by Dutch social psychologist Albert Dijksterhuis. In these studies, individuals are presented with a creativity problem and then asked either to reflect consciously and deliberately on possible creative solutions or, alternatively, to engage in a non-creative task that requires their complete conscious attention elsewhere. In the latter condition, individuals cannot consciously think about creative solutions, but unconscious processes will of course proceed regardless. After a period of time, the individuals are asked to write down their ideas for the creative task they had been presented with earlier. These experiments show that individuals in the “unconscious” condition end up generating more creative ideas. Perhaps even more surprising, follow-up work revealed that the unconscious group was also better at evaluating and selecting their best creative idea.

My work has examined unconscious factors that influence creativity and insight, including our mindsets, personality, and even our beliefs. It is quite likely that creativity arises from a specific interplay between regions of the brain associated with self-monitoring and attention (so-called executive functions) and regions responsible for broadening the space of possible solutions by tracking context-relevance, abstract reasoning, and memory (regions of the salience network). I have often heard it said that creativity may result from a balance of order and chaos; now I am learning that, in neurological terms, this may translate into an interplay between the conscious and the unconscious.


Published December 20, 2018




English, UCSB

In his brilliantly educational and entertaining course on human behavioral biology (available on YouTube), Stanford professor Robert Sapolsky frequently discusses how our neurobiological make-up structures our everyday lives. To someone interested in the function of narratives in human life, two topics stand out: the neurobiology of metaphor and the fragility of human memory.

Because the human brain evolved during a time when human existence looked quite different and symbolic communication was far less sophisticated than it is today, we are doomed to use these same old brain circuits to make sense of our brave new world. Evolution is slow, and we can only work with the tools we have. The effects of this recycling of brain capacity are striking when you consider that the same area of the brain (the insula) is activated when you are presented with disgusting food as when you are confronted with something morally disgusting. The fact that gustatory disgust and moral disgust are so closely linked in the brain hints at how figures of speech are shaped by our evolutionary history and how they continue to shape our experiences.

Neuroscientific research has also shown human memories to be fragile and flexible constructs. Not only can existing memories be altered by the stories we tell others and ourselves, but false memories can be created by the telling and retelling of narratives, and—perhaps most astoundingly—substances such as ethanol can hinder the formation of the synaptic connections that are today thought to be the material basis of memories. In other words, that last beer the other night might quite literally have prevented memories from forming in your brain!

These are but a few of the facts that open the door to astonishing ideas about the profound power of the symbols we use to understand ourselves, each other, and our surroundings. They gesture toward how much there is yet to discover about what goes on in the unconscious parts of our mental worlds, some tiny corner of which I hope to explore with the support of my colleagues in the Unconscious Memory project.


Published December 14, 2018




Third Year, Comparative Literature, UCSB

I join Unconscious Memory expecting to leave a changed man. The opportunity to study the mind with eminent colleagues from neuroscience, AI and psychology represents not only the chance to develop a deeper insight into human nature but also an augmentation of literary theory worthy of the 21st century. For how do we make sense of our narratives in the context of new insights into unconscious processes?

My present circumstances are an indication of past causes, just as my actions today point toward what tomorrow will look like. Yet when I evaluate former beliefs, even extended inferences are often generalized or affectively charged without my conscious control. Reflecting on my move from Denmark to France and later the UK, I instantly relive a certain optimism, inspired by the light of those cosmopolises and the positive experiences I had there. Yet I also know that these sentiments contradict my declarative memories, such as how, for years, I felt alone, lost in the crowds and averse to the pollution. The cognitive situation resembles a false memory over which I have limited control. My awareness of the past feels right, yet also misguided.

The implication is that what shapes my actions today may not be accessible to me in the future. I may constantly produce new narratives, many of which involve unconscious inferences. The person I see myself as today – based on an evaluation of who I was ten years ago – is used to project a new future, say that of a professor. The scene is vague: a house by the ocean, large enough to host students. I have an inkling of what I want, yet I don’t recall actively having painted the picture. Whose brushstrokes are they? Are they based on a description from a book, or a home I once visited? How did this desire take shape?

I don’t know what our project will uncover. Certainly, I don’t expect to permanently solve the mystery of the unconscious. The relationship between the mind and our cultural products, like scripts and tools, points to the complexity of our cognitive capacities. Affects and shared symbols – such as what a house on the water represents – suggest that we are never the sole scriptwriters of our individual lives. To perceive ourselves is to perceive the world, and hence there is no reason why analyses of the unconscious should yield a stable, insular phenomenon.

This is precisely why we must advance critically and analytically across disciplines. When our approaches to research match the manifold nature of the unconscious, we increase our chances of asking the right questions.


Published November 16, 2018