Life-shifts

This week marks one of those “big birthdays”–my mother turns 90. The birthday feels bittersweet; for, in many ways, I have been in the process of “losing” my mother since her diagnosis of vascular dementia in 2017.

Or is it she who is losing? Losing cognition, a sense of time, the words to say…anything at all. She has not yet lost a sense of emotional self, though I know that if her body doesn’t give out first, that loss will eventually occur. I’ve been through this before, with my mother-in-law. Helping someone navigate dementia is a challenging task.

Therefore, as I celebrate her birthday, I also celebrate the goodness of the people (nurses, CNAs) who assist her daily at her skilled-nursing apartment, the social worker who visits with her and brings her mail, the acquaintances who smile and greet her even though they know there can be no meaningful conversation, and the doctors and nurse practitioners who find ways to communicate with her about how she feels physically. It cannot be easy, even with someone as even-tempered and pleasant as my mother continues to be.

A funny thing about my mom. When she was my age, we used to tease her and my dad about “getting old.” She’d toss our teasing aside by insisting, “You’re not old ’til you’re 90!” Even with a few health issues, she and my father continued to be curious about the world and the people in it, traveling, going to parties, trying new things (cross-country skiing, Thai cuisine, activities with grandchildren). About 8 years ago, when my dad was ill with cancer and meningitis and going in and out of the ICU, Mom said she felt old. We retorted, “But Mom! You’re not old ’til you’re 90!”

Now she really is 90. Bless her good kind heart. ❤

~

February 26th is just another day, another year–and at this point, my mother has very little sense of time. It is likely that my mother’s life-shifts are in the past, and the next shift (there’s no escaping it) will be death; but who can tell? My mother’s ninetieth feels like a huge shift in my life as her daughter, as an adult, as a mother to grown people, and as a writer in the world. Why this is so, I can’t say. It’s certainly something I’ll be reflecting on often in the coming years, and the reflections are already emerging in my writing. As I work on revising the poems I’ve drafted in the past 5 years, the topics of aging, mortality, aphasia, and memory keep showing up. Things I can consider myself fortunate, perhaps, to be preoccupied with, rather than being forced into confronting a natural disaster (Pakistan, Turkey, Syria, and others) or war (Ukraine, Syria, and other regions).

Here’s part of a poem I’ve been wrestling with lately.

                       ... --I would untangle
my mother's mind if I could be let access to its
recesses, but those stay hidden like the life in hedge
and meadow, in the woody undergrowth,
unknowns twisted together, impenetrable.
...

How fortunate for me that my mother is not far away, is well-housed and safely cared-for, and has had a long, creative, fruitful life to celebrate this weekend. Nonetheless, the grief inheres. The hardest shift? I miss the person she has been all my life until recently. And yet: here she is. Herself, more impenetrable than ever. And loved.

~

Mom at about my current age–ca. 1998 or ’99.

Getting through somehow

My mother has vascular dementia, which renders her more and more aphasic, though in her case–so far–her “emotional tone” (as philosopher Arne Naess calls it) has remained intact. I visited my mother on a recent occasion when I wasn’t feeling my best and had had a week of less-than-good health. It was not a matter of duty. The time I spend with my mother is beautiful. But it had been a tough week. Let’s leave it at that.

We sat in her apartment in the assisted-living wing and arranged the flowers I’d brought. Then we spent 20 minutes in a kind of conversation, to which I’ve become accustomed, during which she tries to convey information about something she needs to have done. In this case, after much of the usual (really, rather humorous at times) confusion, I deciphered that she wanted some sweaters taken to the dry cleaner.

Such minutiae. And yet, so difficult to get across, across that divide of language and cognition. The incredible concentration and effort it takes her just to dial a phone number to call her ailing sister. To tell the nurse aide that she needs more yogurt. Anything.

Then she surprised me. She pointed to my forehead and then to her own. “This,” she said. “Is wrong. For you. What?”

Was she reading a crease in my brow? I told her I had not been feeling great. She wanted to know, so I told her details, the way one tells one’s mother. Even though I am never sure quite how much gets through.

“Lie down. Take off the peaks.” By which she meant shoes. Why not comply? We both took off our shoes and spent the visit relaxing. We even indulged in a glass of wine because she loves to offer wine to her guests. Never mind it was 11 am. My mother has lost that rigid cognitive sense of time that the rest of us spend our lives obsessing over. There’s something valuable in that loss, though it is a loss.

She’s still teaching me things. Other ways to live with loss (my dad, her “normal” brain, mobility, words…).

~

The next evening, she called me. She wanted to know how I was feeling. I’m 63 years old and my mother is 88, and she’s still worried about me.

I’m feeling loved.

Love is all you need

Ways of reading

Conversing about books with a colleague recently, I began to reflect on how readers of literature read. The topic had come up earlier in the day when several students came in for tutoring on literary-analysis papers. In addition, a student in the Education program was devising a curriculum for third-graders; the lesson focus was “different ways of reading.” I have always loved to read, but I never spent much time considering how I go about it. It just seemed natural to me…and then I encountered academia’s approach to reading and had to reconsider the way I devoured fiction.

My coworker consumes novels the way a literature professor does. He savors passages, re-reads earlier chapters in a novel to find connections with later parts of the book, and looks up references and allusions to be sure he understands the deep context of a literary text. He asks himself questions about what he’s reading. The questions keep him reading and engaged with the words on the page.

That method is how I read poetry. But it is not how I read novels or non-fiction books; those I read at a clip, almost inhaling them, seldom stopping. I read them for pleasure, for fun–I even absorb sad novels and memoirs this way, in a mad whirl of reading enjoyment, caught up in the events and characters and setting of the book in my hands. This is not to say I never look up words, places, references, but generally I do so after I have finished the book. I guess I examine such things in retrospect.

The downside of reading fiction using my own “natural” method is that I tend to come away from a book with a strong sense of whether it was wonderfully written or moving or amazing, but I cannot explain why it has that effect–how the author managed to get me to believe in the characters or the world she created with her words alone. When I am reading for fun or information, however, there isn’t any need for analytical levels of cognition. If I forget a detail, I can go back and look for it later. Or forget about it. No great loss.

There are other methods of reading–certainly more than two ways to “get into” a book! The conversation about reading strategies (what feels natural to a literature-consumer, how readers savor a good book, questioning not just the text but also the self reading the words) piques my interest. I suspect some connection with consciousness and cognition, aspects of human-ness I have mulled over in previous posts.

Well, enough for now. I am signing off–to read a good book!

Language acquisition & its opposite

When my children were learning to talk, I developed a fascination with language acquisition. The process of learning to communicate with other human beings in the lingua franca of the culture (speaking US English to adults) was taking place in front of me. I felt awed by the intelligence required to decipher language and delighted by the myriad ways the process and behavior unfolded. For about a year, I seriously considered enrolling in university to pursue a Master’s degree in some sort of language/linguistics-related discipline.

But I had two toddlers and lacked the energy, time, and money to devote to diligent scholarship of that sort. Instead, I took my usual autodidactic approach: reading and observing. One thing of vivid interest to me at the time was how differently my children each approached “learning to talk.” In retrospect, I recognize that their differences in personality and their differing cognitive strengths made significant impacts upon language acquisition, implementation, expression, and use.

~

At present, my interests in language revolve around the other end of the lifespan of human communication–the loss of language abilities as people age. The elderly Beloveds in my life are displaying markedly differing changes in how they experience, and express, cognitive gaps. Often the expression of such gaps appears in the way they speak.

This would be the opposite of language acquisition. Memory losses, or slower memory retrieval functions, are common to most adults over age 70; but those issues do not necessarily affect sentence structure, vocabulary, pronunciation, descriptive abilities, and emotive communication through language. Strokes, neurovascular constriction, and Alzheimer’s disease, among other physiological alterations, can exert marked effects on verbal and written communication, however. Hearing loss and diminished vision exacerbate these problems.

All too often, the human being seems “lost” beneath the symptoms, or becomes isolated by the immense challenges to relationships that we have taken for granted during decades of being relatively “non-impaired.”

The loss of language skills intrigues me as much as the acquisition; my readings in neuropsychology and neurobiology have taught me that there is so much yet to learn about the brain and how it processes–well, almost everything (but my special interest is communication).

And my experience with people who are aging, or in some cases–my hospice volunteer work–dying, demonstrates on a personal or anecdotal level how uniquely individual each one of us is. How we communicate, how we express ourselves, our neurological processes, our physiology, temperament, environment, genetic makeup…so gloriously complex, random, fascinating.

The late Edna Smith Michael in 1990. Her language skills stayed quite intact until her last hospitalization.

~

Some recent reading–

Into the Silent Land: Travels in Neuropsychology (Paul Broks); Cure: A Journey into the Science of Mind over Body (Jo Marchant); The Language Instinct (Steven Pinker)

A post I put up a while back contains my poem “Age as a Foreign Language.” Apropos here, I think.

~

And no, I am not tempted to enroll in further formal study on this topic. But reading suggestions will be gratefully accepted!

Do we change? Can we?

I have blogged about the Myers-Briggs personality inventory–a tool that may or may not be useful to psychologists, depending on whom you talk to. Because my father used the inventory in his studies of people in groups, he “experimented” with his family, administering the inventory to the five of us. I was 17 years old the first time I took the survey; my type was INFP (introvert, intuitive, feeling, perceiving), heavy on the I and the F. Has that “type” changed over the years? The “brief” version of the test now shows me shifting in the last category, still P but slightly more toward J (judging). That makes sense, as I have had to learn how to keep myself more organized and ready for difficult decisions. After all, I am a grownup now.

The personality type does not indicate, however, what sort of thinker a person is. Certain types may tend to be more “logical” in their approach to problem-solving, while others tend toward the organized or the intuitive–but what do we mean by those terms? For starters, logical. Does that mean one employs rhetoric? That one thinks through every possibility, checking for fallacies or potential outcomes? Or does it mean a person simply has enough metacognition to wait half a second before making a decision?

Furthermore, if personality type can change over time (I’m not sure the evidence convinces me that it can), can a person’s thinking style change over time? Barring, I suppose, drastic challenges to the mind and brain such as stroke, multiple concussion damage, PTSD, chemical substance abuse, or dementia, are we so hard-wired or acculturated in our thinking that we cannot develop new patterns?

There are many studies on such hypotheses; the evidence, interpretations, and conclusions often conflict. Finally, we resort to anecdote. Our stories illustrate our thinking and describe which questions we feel the need to ask.

~ A Story ~

This year, I did the previously unthinkable: I attended a high school reunion.

We were the Class of 1976, and because our city was directly across the Delaware River from Philadelphia–the Cradle of Liberty! The home of the Liberty Bell and Independence Hall!–the bicentennial year made us somehow special.

Not much else made us special. Our town was a blue-collar suburb of Philadelphia, a place people drove through to get to the real city across the river, a place people drove through to get from Pennsylvania to the shore towns. Our athletics were strong, our school was integrated (about 10% African-American), people had large families and few scholastic ambitions. Drug use was common among the student population, mostly pills and pot. There were almost 600 students in the class I graduated with, although I was not in attendance for the senior year–that is a different story.

But, my friend Sandy says, “We were scrappy.” She left town for college and medical school, became a doctor, loves her work in an urban area. “No one expected much of us, so we had to do for ourselves,” she adds. “And look where we are! The people here at the reunion made lives for themselves because they didn’t give up.”

It is true that our town did not offer us much in the way of privilege or entitlement, and yet many of us developed a philosophy that kept us at work in the world and alive to its challenges. The majority of the graduates stayed in the Delaware Valley region, but a large minority ventured farther. Many of these folks did not head to college immediately, but pursued higher education later in their lives; many entered military service and received college-level education or specialized training through the armed forces.

Does this young woman look logical to you?

I wandered far from the area mentally, emotionally, and physically; but then, I was always an outlier. One friend at the reunion told me that she considered me “a rebel,” a label that astonishes me. I thought of myself as a daydreamer and shy nonconformist, not as a rebel! Another friend thanked me for “always being the logical one” who kept her out of serious trouble. It surprises me to think of my teenage self as philosophical and logical. When one considers the challenges of being an adolescent girl in the USA, however, maybe I was more logical than most.

I find that difficult to believe, but I am willing to ponder it for awhile, adjusting my memories to what my long-ago friends recall and endeavoring a kind of synthesis between the two.

~

The story is inevitably partial, incomplete, possibly ambiguous. Has my thinking changed during the past 40 years? Have my values been challenged so deeply they have morphed significantly? Have I developed a different personality profile type? Are such radical changes even possible among human beings, despite the many transformation stories we read about and hear in our media and promote through our mythologies?

How would I evaluate such alterations even if they had occurred; and who else besides me could do a reasonable assessment of such intimate aspects of my personal, shall we say, consciousness? Friends who have not seen me in 40 years? A psychiatrist? My parents? A philosopher? It seems one would have to create one’s own personal mythology, which–no doubt–many of us do just to get by.

I have so many questions about the human experience. But now I am back in the classroom, visiting among the young for a semester…and who can tell where they will find themselves forty years from now? I hope they will make lives for themselves, and not give up.

Slightly less difficult books

I recently read Paul Bloom’s book Descartes’ Baby while simultaneously reading Daniel Dennett’s Content & Consciousness. Of these two, the latter falls a bit under the “difficult books” category, but it is not too hard to follow as philosophy goes. Dennett’s book is his first–the ideas that evolved as his PhD thesis–and in these arguments it is easy to see his trademark humor and his deep interest in the ways neurology and psychology have aspects useful to philosophy. Bloom’s book, a somewhat easier read, suggests that the mind-body problem evolved naturally from human development: young children are “essentialists” for whom dualism is innate; Descartes simply managed to write particularly well about the evolutionary project (with which, I should note, Bloom disagrees; as a cognitive psychologist, he maintains a more materialist stance).

It turns out that because I have read widely if shallowly in the areas of philosophy, cognitive psychology, evolution, art, aesthetics, and story-making, I find myself able to recognize the sources and allusions in texts such as these. Quine, Popper, Darwin, Pinker, and Wittgenstein; Schubert, Kant, Keats, Dostoevsky, Rilke…years of learning what to read next based on what I am currently reading have prepared me for potentially difficult books. [Next up, Gilbert Ryle and possibly Berkeley.] I don’t know why I feel so surprised and happy about this. It’s as though I finally realized I am a grownup!

And I am glad to discover I am not yet too old to learn new things, young enough to remember things I know, and intellectually flexible enough to apply the information to other topic areas. Synthesis! Building upon previously laid foundations! Maslow’s theory of humanistic education! Bloom’s taxonomy! The autodidact at work in her solitary effort at a personal pedagogy.

If I ever really discover what consciousness is, I’ll let you know.

Dozens of views

No one has ever found the traces of memory in a brain cell. Nor are your imagination, your desires, your intentions in a brain cell. Nothing that makes us human is there.

Deepak Chopra

Chopra is not my favorite writer on consciousness, but he does an adequate job of explaining complicated concepts to people who are just getting accustomed to questioning experience and who are beginning to be open-minded about the mind, the body, and beyond. So often, we have been raised not to doubt, told what God is and is not, and trained into beliefs about the truth. This, in spite of the common human trait of curiosity that asks: who and where are we in the world? What makes me me? What happens when I die? Chopra, with his medical background and his experience spanning several major cultures, can both offer a great deal of information and pose provocative questions to his readers.

In our technologically obsessed culture, it is easy to turn to science as the foremost authority; I happen to be fascinated by neurology and neuropsychology when it comes to consciousness, for example, but I never rule out so-called spiritual insights. Chopra’s writing often falls into the fallacy of stating “there are two views,” when in fact there are dozens of views, even among scientists. My guess (it is but a guess) is that the either/or form of presenting perspective is simpler for the “average reader”–as defined by his editor–to understand. Yet it seems to me a slight to the average reader to narrow these big questions down to “two views.”

Here’s an example, just one of many in his writing:

There are two views about consciousness in science today. One is that consciousness is an emergent property of the brain and, therefore, also an emergent property of evolution. That’s the materialist, reductionist view. There’s another view…[that] holds that consciousness is not an emergent property but inherent in the universe.

Now, I genuinely prefer what Chopra calls the “mind first” argument in which consciousness is a kind of field effect. I would not, however, suggest that matter first and mind first are the only two views today’s scientists hold; and neither would anyone else who has read a number of the elegantly argued, well-researched, thoughtful, passionate blogs of today’s science researchers. The majority of them are atheists, but some are agnostics and some are inclined toward non-theist teachings such as Zen. Even among the ranks of non-believers (in terms of an anchoring eternal presence or god), the question of consciousness leads to intriguing inquiries.

The philosophers of today cannot ignore scientific advances any more than Maimonides could in the 12th century. Physics is a thing! as my students might put it. For the ways in which this relates to the science of neurology, I return to the framework on consciousness proposed by Douglas Hofstadter in his book I Am a Strange Loop.

What is the world but consciousness? Or illusion, in Hindu and Buddhist teachings (Maya) and, in a slightly different but related way, in Plato. And how many perspectives are there on that consciousness?

Chopra would probably say that each of us has to experience a state of awareness and interaction with whatever deep potential “god” or the creating principle offers for us. Which basically admits of not merely dozens but billions of unique interactions or perspectives…if we even agree to the schema.

goldenrod (solidago) going to seed

Noesis

1. Cognition; perception.
2. The exercise of reason.

Interesting that definition number two is dependent upon definition number one. Lately I have been thinking about the difference between consciousness and conscience; the latter seems to me to be specifically human, I guess, because isn’t conscience a sort of cultural or judgmental entity based upon rules? Yes, I am talking about morality, a term I tend not to use much when I consider cognition, consciousness, narrative, being.

I recently perused Patricia Churchland’s Braintrust and found myself intrigued about where and in what ways morality and consciousness or sentience mesh. Churchland is a moral philosopher, but this book relies largely on arguments premised on neurology, biology, evolution, and animal studies. Her critics pose interesting rebuttals, too. I found her book readable and often convincing–and it’s the kind of book that leads me to other writers and scientists; I love that in a book!

The phenomenology of consciousness–the carbon-body, brain-based, “real world” sense of the word–involves intentionality, sentience, qualia, and first-person perspective. We can identify qualities based upon our first-person consciousness and respond to them. This process has led Western thinkers toward the concept of reason or rational thinking. The exercise of reason derives from perception.

This does not mean that phenomenology is the sole form of consciousness or even that it is necessarily human-only, but it seems to me to be the easiest one for human beings to wrap their minds around. Yet the earlier philosophers were not phenomenologists. Their speculations about what consciousness originated in and what morality inhered in were quite abstract.

For a good sum-up of how contemporary scholars define and discuss consciousness, go to Stanford’s site here.

~

Being cognizant or conscious does not necessarily lead to moral behavior or reason…or does it? Here we have an idea that has been debated for centuries. In her book, Churchland often returns to Hume, who wrote about morality from what eventually became known as the utilitarian stance (though I would argue Hume is not really utilitarian). Stanford offers an overview of morality as defined by philosophers over the years; The Internet Encyclopedia of Philosophy says this of Hume:

In epistemology, he questioned common notions of personal identity, and argued that there is no permanent “self” that continues over time. He dismissed standard accounts of causality and argued that our conceptions of cause-effect relations are grounded in habits of thinking, rather than in the perception of causal forces in the external world itself. He defended the skeptical position that human reason is inherently contradictory, and it is only through naturally-instilled beliefs that we can navigate our way through common life.

These concepts should feel modern to most of us thanks to cultural anthropology, sociology, and psychology, among other disciplines. Hume’s position conflicts with much religious dogma, but his ideas were not out of line with many of his fellow Enlightenment-Era thinkers. During the Enlightenment, intellectuals were enamored of the exercise of reason (noesis).

~

So: consciousness and conscience. First we have the one–however it arises within us*–and the other develops (or evolves?) thanks to the need for social beings to navigate common life. And thanks, perhaps, to brain evolution adapting to social common life (see Churchland for more on this).

Much to mull over during my brief summer break.

~

Jiminy Cricket copyright Walt Disney Co.

*See my numerous previous posts on consciousness!

“And always let your conscience be your guide!”

Awareness, openness, & … magic mushrooms?

In a recent New Yorker article about potential medical uses for psilocybin (“The Trip Treatment”), science, culture, and food writer Michael Pollan interviewed researchers in neuroscience, medicine, and psychology. The medical potential of psychedelic drugs is not something I can comment on from reading just one article; what intrigued me most about this piece is how these drug studies overlap with studies on cognition, metacognition, consciousness, and spirituality. The medically controlled “tripping” that volunteers have undergone overwhelmingly resulted in some form of what we term mystical or spiritual (for lack of a scientific term) feeling.

It’s almost impossible to consider these realms of experience without questioning concepts such as “soul” or “self-awareness.” Pollan writes:

Roland Griffiths is willing to consider the challenge that the mystical experience poses to the prevailing scientific paradigm. He conceded that “authenticity is a scientific question not yet answered” and that all that scientists have to go by is what people tell them about their experiences. But he pointed out that the same is true for much more familiar mental phenomena.

“What about the miracle that we are conscious? Just think about that for a second, that we are aware we’re aware!”

A man after my own heart. It is amazing, a kind of miracle. And we get consciousness and metacognition without any drug intervention at all. It just springs into our beings at some point, as we create ourselves from lived events and construct speculative worlds and an understanding (though often flawed) of other minds.

Here is another fascinating result from the psilocybin studies that may make us revise our ideas of interpersonal relationships, personhood, and creating a self. Pollan writes:

A follow-up study by Katherine MacLean, a psychologist in Griffiths’s lab, found that the psilocybin experience also had a positive and lasting effect on the personality of most participants. This is a striking result, since the conventional wisdom in psychology holds that personality is usually fixed by age thirty and thereafter is unlikely to substantially change. But more than a year after their psilocybin sessions volunteers who had had the most complete mystical experiences showed significant increases in their “openness,” one of the five domains that psychologists look at in assessing personality traits. (The others are conscientiousness, extroversion, agreeableness, and neuroticism.) Openness, which encompasses aesthetic appreciation, imagination, and tolerance of others’ viewpoints, is a good predictor of creativity.   [italics mine]

Openness, aesthetic appreciation, imagination, tolerance, creativity…and one researcher in neuropsychopharmacology suggests that this sort of un-boundaried openness signifies a temporary regression to an infantile state, very much as Freud hypothesized. I thought instead of Bachelard and the childhood reverie state.

Intriguing, that pharmacological work of this sort makes neuroscientists resort to citing William James and Sigmund Freud on mystical experience and the subconscious!

[Robin] Carhart-Harris believes that people suffering from other mental disorders characterized by excessively rigid patterns of thinking, such as addiction and obsessive-compulsive disorder, could benefit from psychedelics, which “disrupt stereotyped patterns of thought and behavior.” In his view, all these disorders are, in a sense, ailments of the ego. He also thinks that this disruption could promote more creative thinking. It may be that some brains could benefit from a little less order.    [italics mine]

I wonder if the aesthetic experience itself, when wholly engaged, can sometimes act like a drug on the art-viewer’s or art-maker’s being. When one reads the poem that rearranges one’s world, doesn’t it disrupt stereotyped, familiar, habitual patterns of thought? That’s what happens for me when I encounter great art of any kind. It is close to mystical.

~

Blame & fear

Amazing, the human brain: consciousness layered over instinct, habits of thought, the ways we feel, rationalize, justify, seek for why. In the wake of tragedies, we tend to react with fear and blaming; it is as if, could we only discern who or what to blame, we might learn how to prevent the tragedy. So we “reason.”

But all too often, what we are doing is not using reason. Instead, people tend to blame whoever or whatever best suits their own, already-decided view of the world and use “reason” to justify their feelings–a psychological phenomenon called “confirmation bias,” on which Daniel Kahneman has much to say. Cognitive biases inherently interfere with objective analysis; bias may be a lovely and rich part of the human experience, but it also leads to terrible misuses of analysis. We usually act based on biases rather than on logic (see this page for a long list of biases). So many ways to justify our often-mistaken and uninformed beliefs or responses.

Anthropologist and philosopher René Girard offers insights into the desire to blame–a sociocultural desire, deeply rooted in the way humans behave when in groups and, he believes, one of the foundations for the development of religious rituals, among other things. As we endeavor to “make sense of” impossible events, to “discover why” they occur, we seem naturally to turn to blaming. Apparently, designating a scapegoat consoles us somehow, allows us to believe we might have some control over what is terrible, not unlike sacrificing a calf to propitiate an angry god.

~

I lived just outside of Newtown, CT for a few years in the 1980s. I still have friends there and I know the area well. It was a safe town, and it is still a safe town; only now, it is a safe town in which a terrible and statistically rare occurrence happened. That sounds rather dry and heartless: “a statistically rare occurrence.” Yet from the logic standpoint–if we are being reasonable–it is simple to discover that by any measure, U.S. schools are the safest place a school-age child can be. Fewer than 2% of deaths and injuries among children ages 5-18 occur on school grounds. I got these numbers from the U.S. Centers for Disease Control and Prevention. Keeping an armed policeman at every U.S. school (as recently proposed by the president of the NRA) might possibly make an incrementally small difference in that tiny number. Might. Possibly. Rationally, would it not make more sense for us to address the other 98% and decrease that number? Though I am all in favor of hiring more people to safeguard our cities, the only real value of such a move would be to reduce a mistaken sense of public fear.

Because we are afraid, and fear is keeping us from rational and compassionate behavior. Fear can be useful–it probably helped us survive in the wild, and it continues to serve good purpose occasionally–but human beings ought to recognize that the value of fear is limited in a civilized, community-based, theoretically rational society. Rational, compassionate behavior on the part of our nation would be to remove the lens of public scrutiny from the people of Newtown and allow them to deal with grieving in the privacy of their families and community. We cannot come to terms with private loss, nor ever understand it truly, through network news, tweets, photographs on our internet feeds, or obsessive updates on ongoing police investigations.

Fear also keeps us from finding resources of our own. It blocks us from our inner strengths. The families and friends of the victims and the killer need that inner strength more than they will ever require public notice, no matter how well-intentioned the outpourings are.

~

Blame. Whose fault is it? Children and teachers and a confused and angry young man and his mother have died violently, and I’ve been listening to the outcry all week–even though I have tried to limit my exposure to “media sources.” Here are the scapegoats I have identified so far: the mental health system; semi-automatic weapons; violent computer games; the 2nd Amendment; the media; autism; school security; the killer’s father and mother (herself a victim); anti-psychotic drugs and the pharmaceutical industry; divorce; god; U.S. legislation concerning weapons and education and mental health; bullies in schools; the NRA; the victims themselves, for participating in a godless society; poor parenting; narcissism; the Supreme Court; President Obama; the CIA. I’m sure I have missed a few. (Andrew Solomon’s recent piece in the New York Times also touches on our default blame mode; his list coincides pretty closely with mine; see this article.)

Scapegoats serve several purposes. They allow us to say we, ourselves, no matter how guilty we feel, are not at fault. They give us an excuse for disaster, something to punish or something to attempt to change through controls we can think through and develop (“logically”). And in fact some good may eventually come of the changes and the control we exert, but such change is likely to be small and long in arriving. Mostly what scapegoating achieves turns out to be bad for us, however, because what it does well is give us something to fear.

Fear motivates us to read obsessively every so-called update on the killer’s presumed (and, ultimately, unknowable) motives, to argue over the best way to address the complex and intertwined issues that each of us perceives to be the root cause of any particular tragic event. Our fears make us consumers of media, and our information sources respond to our need to know why and our desire to blame. Our fears drive us to purchase guns to protect ourselves even though statistics continually prove that more U.S. citizens are killed accidentally or intentionally by someone they know intimately (including themselves, especially in the case of suicides–which Solomon also addresses in the essay I’ve cited) than by strangers or during acts of robbery, terrorism or massacres. “News,” as we have come to know it, is predicated on reporting things that are dramatic and therefore statistically unlikely. Suppose our information sources kept an accurate hourly update on weapons-related or motor vehicle-related deaths…would we become immune to the numbers? Would we say “That’s not news”? Would we be less avid consumers of such “news sources”? Would it comfort us to know we are more likely to be struck by lightning twice than to die in a terrorist act on U.S. soil or be killed by a deranged gunman in a mall or school?

Can we delve into our inner resources of rationality in order to fight our fears?

~

I think not. Fear is not easily swayed by facts. Instinct trumps reason psychologically and cognitively in this case. Fear is so emotional that perhaps it requires a deeply spiritual, soul-searching response instead of a reasoned one. Perhaps that is why so many of the “great religions” include stories of human encounters with a god, godhead, or cosmic intelligence that humans “fear” (though the term is used to signify awe and recognition of human insignificance rather than the fear of, say, a lunging tiger). In these stories–the Bhagavad Gita and Book of Job among them–a human confronted with the godhead feels such fear/awe that he can never afterwards fear anything this world has to offer. In the face of what is beyond all human understanding, there is no reasoning, and no human “feelings” that psychology can explain.

Roosevelt said we have nothing to fear but fear itself. Words well worth recalling in times like these.

~

Finally, this:

And the angel said unto them, Fear not: for, behold, I bring you good tidings of great joy, which shall be to all people.

Namaste, Shalom, Peace, Al-Salam. May you find the strength within yourself to make your way compassionately through this world.